5.0 years
0 Lacs
India
On-site
Job Summary:
Job Title: Senior Data Engineer – Machine Learning & Data Engineering
Location: Gurgaon (IND)
Department: Data Engineering / Data Science
Employment Type: Full-Time
Years of Experience: 5-10

About the Role:
We are looking for a Senior Data Engineer with a strong background in machine learning infrastructure, data pipeline development, and collaboration with data scientists to drive the deployment and scalability of advanced analytics and AI solutions. You will play a pivotal role in building and optimizing the data systems that power ML models, dashboards, and strategic insights across the company.

Key Responsibilities:
- Design, develop, and optimize scalable data pipelines and ETL/ELT processes to support ML workflows and analytics.
- Collaborate with data scientists to operationalize machine learning models in production environments (batch and real-time).
- Build and maintain data lakes, data warehouses, and feature stores using modern cloud technologies (e.g., AWS/GCP/Azure, Snowflake, Databricks).
- Implement and maintain ML infrastructure, including model versioning, CI/CD for ML, and monitoring tools (MLflow, Airflow, Kubeflow, etc.).
- Develop and enforce data quality, governance, and security standards.
- Troubleshoot data issues and support the model lifecycle from development to deployment.
- Partner with software engineers and DevOps teams to ensure data systems are robust, scalable, and secure.
- Mentor junior engineers and provide technical leadership on data and ML infrastructure.

Qualifications:
Required:
- 5+ years of experience in data engineering, ML infrastructure, or a related field.
- Proficiency in Python, SQL, and big data processing frameworks (Spark, Flink, or similar).
- Experience with orchestration tools such as Apache Airflow, Prefect, or Luigi.
- Hands-on experience deploying and managing machine learning models in production.
- Deep knowledge of cloud platforms (AWS, GCP, or Azure) and containerization (Docker, Kubernetes).
- Familiarity with CI/CD tools for data and ML pipelines.
- Experience with version control, testing, and reproducibility in data workflows.

Preferred:
- Experience with feature stores (e.g., Feast), ML experiment tracking (e.g., MLflow), and monitoring solutions.
- Background in supporting NLP, computer vision, or time-series ML models.
- Strong communication skills and the ability to work cross-functionally with data scientists, analysts, and engineers.
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
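To make the orchestration side of this role concrete, here is a minimal sketch of a batch ML pipeline expressed as an Airflow DAG, the kind of workflow the posting's Airflow/MLflow stack implies. The DAG id, task names, paths, and the placeholder logic are hypothetical illustrations, not any employer's actual pipeline.

```python
# A minimal sketch of a daily feature-extraction and model-scoring pipeline
# in Airflow. All names and paths below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_features(**context):
    # Pull raw events and materialize model features (placeholder logic).
    print("extracting features to s3://example-bucket/features/")


def score_model(**context):
    # Load the latest registered model and score the new feature batch.
    print("scoring batch with the latest model version from the registry")


with DAG(
    dag_id="daily_feature_and_scoring_pipeline",
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_features",
                             python_callable=extract_features)
    score = PythonOperator(task_id="score_model",
                           python_callable=score_model)
    extract >> score  # scoring only runs after features land
```

Keeping extraction and scoring as separate tasks lets each step retry independently, which is the usual reason to reach for an orchestrator rather than a single script.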
Posted 8 hours ago
0 years
0 Lacs
India
Remote
Job Summary:

WHO ARE YOU?
Passionate and motivated. Driven, with an entrepreneurial spirit. Resourceful, innovative, forward-thinking, and committed. At Live Nation Entertainment, our people embrace these qualities, so if this sounds like you then please read on!

THE ROLE
As the Abuse Operations Engineering Lead, you'll be part of a mission-critical team protecting the Ticketmaster platforms from abusive entities and those who deploy abusive digital behaviours designed to circumvent the controls that protect fair access to tickets. Abuse Operations is a centrally managed command-and-control centre for abuse investigations, escalations, policies, and tooling for all Ticketmaster properties and systems. Abuse Operations Engineers must be able to work independently across a broad tech stack, multi-task concurrent problems, and perform triage and prioritization as necessary with discretion and pragmatic judgment. They provide expert coordination and perform analysis and remediation of abuse for supported products and services, maintaining a high standard of diagnostics and communication while driving issues to complete resolution. They actively reduce operational effort by creating and improving automation, and by working with Software Engineering teams to improve self-healing and self-service tooling, documentation, and processes.

WHAT THIS ROLE WILL DO
- Provide first-line support for all Ticketmaster abuse queries.
- Perform on-call duty as part of a global team monitoring the availability and performance of the ticketing systems and APIs used by third-party services, as well as the various internal services and systems on which these interfaces depend.
- Resolve advanced issues and provide advanced troubleshooting for escalations.
- Provide subject matter expertise to cross-functional teams on abuse issues, including strategy, issue troubleshooting, and product and tool requirements.
- Drive continuous improvements to our products, tools, configurations, APIs, and processes by sharing learnings, constructive feedback, and design input with internal technical teams and integrators.
- Independently learn new technologies and master Ticketmaster ticketing platforms, products, and services to provide "full stack" diagnostics that help determine the root cause of issues, and where appropriate help our integrators through their issues.
- Ensure runbooks, resolution responses, internal processes, and integration documentation are up to date and of a high standard suitable for internal stakeholder use.
- Work on automation to reduce toil.

WHAT THIS PERSON WILL BRING
- BA/BS degree in computer science or a related field, or relevant work experience in lieu of a degree.
- Experience with bot detection and blocking systems.
- Troubleshooting skills ranging from diagnosing low-level request issues to large-scale issues requiring correlation of data between various third-party partners and in-house systems.
- Proficiency in Bash, Python, Go, or similar for operations scripts and text processing.
- Working knowledge of the HTTP protocol and basic web systems, analysis tools such as Splunk and the Kibana/ELK stack, and database products (Oracle, MySQL, Databricks, Snowflake, etc.).
- Experience working in a 24/7 shift-based team.
- Experience in a global, fast-paced environment, resolving multiple interrupt-driven priorities simultaneously.
- Passionate and motivated; resourceful, innovative, and forward-thinking.
- Strong English-language communication skills and the ability to collaborate closely with remote team members.
- Ability to work with autonomy while ensuring that new knowledge is shared with technology teams.
- Committed and able to adapt quickly.
- Embraces continuous learning and continuous improvement.
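As a flavour of the text-processing side of this role, here is a minimal sketch of request-log triage in Python: counting requests per client IP from a web access log and flagging rates that might indicate bot-driven abuse. The log path, format, and threshold are hypothetical, not Ticketmaster's actual tooling.

```python
# A minimal sketch of access-log triage: flag IPs whose request volume in a
# log file exceeds a threshold. Path, log format, and threshold are
# hypothetical placeholders to be tuned per traffic profile.
import re
from collections import Counter

LOG_LINE = re.compile(r'^(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+)')
THRESHOLD = 1000  # requests per file scanned


def flag_heavy_hitters(path: str) -> list[tuple[str, int]]:
    hits: Counter[str] = Counter()
    with open(path) as fh:
        for line in fh:
            m = LOG_LINE.match(line)
            if m:
                hits[m.group(1)] += 1  # group 1 is the client IP
    return [(ip, n) for ip, n in hits.most_common() if n > THRESHOLD]


if __name__ == "__main__":
    for ip, count in flag_heavy_hitters("access.log"):
        print(f"suspect {ip}: {count} requests")
```

In practice this kind of aggregation is usually run in Splunk or the ELK stack, as the posting notes; a standalone script like this is the ad-hoc fallback.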
Posted 8 hours ago
2.0 years
7 - 8 Lacs
Hyderābād
On-site
India - Hyderabad JOB ID: R-219080 ADDITIONAL LOCATIONS: India - Hyderabad WORK LOCATION TYPE: On Site DATE POSTED: Jun. 27, 2025 CATEGORY: Information Systems

Join Amgen's Mission of Serving Patients
At Amgen, if you feel like you're part of something bigger, it's because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we've helped pioneer the world of biotech in our fight against the world's toughest diseases. With our focus on four therapeutic areas –Oncology, Inflammation, General Medicine, and Rare Disease– we reach millions of patients each year. As a member of the Amgen team, you'll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you'll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

What you will do
In this vital role you will be responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. The role involves working with large datasets, developing reports, supporting and executing data initiatives, and visualizing data to ensure it is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.

- Design, build, and support data ingestion, transformation, and delivery pipelines across structured and unstructured sources within the enterprise data engineering environment.
- Manage and monitor day-to-day operations of the data engineering environment, ensuring high availability, performance, and data integrity.
- Collaborate with data architects, data governance, platform engineering, and business teams to support data integration use cases across R&D, Clinical, Regulatory, and Commercial functions.
- Integrate data from laboratory systems, clinical platforms, regulatory systems, and third-party data sources into enterprise data repositories.
- Implement and maintain metadata capture, data lineage, and data quality checks across pipelines to meet governance and compliance requirements.
- Support real-time and batch data flows using technologies such as Databricks, Kafka, Delta Lake, or similar.
- Work within GxP-aligned environments, ensuring compliance with data privacy, audit, and quality control standards.
- Partner with data stewards and business analysts to support self-service data access, reporting, and analytics enablement.
- Maintain operational documentation, runbooks, and process automation scripts for continuous improvement of data fabric operations.
- Participate in incident resolution and root cause analysis, ensuring timely and effective remediation of data pipeline issues.
- Create documentation, playbooks, and best practices for metadata ingestion, data lineage, and catalog usage.
- Work in an Agile and Scaled Agile (SAFe) environment, collaborating with cross-functional teams, product owners, and Scrum Masters to deliver incremental value.
- Use JIRA, Confluence, and Agile DevOps tools to manage sprints, backlogs, and user stories.
- Support continuous improvement, test automation, and DevOps practices in the data engineering lifecycle.
- Collaborate and communicate effectively with product teams and cross-functional teams to understand business requirements and translate them into technical solutions.

Must-Have Skills:
- Experience building and maintaining data pipelines that ingest and update metadata into enterprise data catalog platforms in biotech, life sciences, or pharma.
- Hands-on experience with data engineering technologies such as Databricks, PySpark, SparkSQL, Apache Spark, AWS, Python, SQL, and Scaled Agile methodologies.
- Proficiency in workflow orchestration and performance tuning for big data processing.
- Experience in data engineering, data operations, or related roles, with at least 2 years in life sciences, biotech, or pharmaceutical environments.
- Experience with cloud platforms (e.g., AWS, Azure, or GCP) for data pipeline and storage solutions.
- Understanding of data governance frameworks, metadata management, and data lineage tracking.
- Strong problem-solving and analytical skills, attention to detail, and the ability to manage multiple priorities in a dynamic environment.
- Effective communication and collaboration skills to work across technical and business stakeholders.
- Experience with the Scaled Agile Framework (SAFe), Agile delivery practices, and DevOps practices.

Preferred Qualifications:
- Data engineering experience in the biotechnology or pharma industry.
- Experience writing APIs to make data available to consumers.
- Experience with SQL/NoSQL databases and vector databases for large language models.
- Experience with data modeling and performance tuning for both OLAP and OLTP databases.
- Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps.

Basic Qualifications:
- Master's degree and 3 to 4+ years of Computer Science, IT, or related field experience; OR
- Bachelor's degree and 5 to 8+ years of Computer Science, IT, or related field experience; OR
- Diploma and 7 to 9 years of Computer Science, IT, or related field experience.

Professional Certifications:
- AWS Certified Data Engineer preferred
- Databricks certification preferred
- Scaled Agile SAFe certification preferred

Soft Skills:
- Excellent verbal and written communication skills.
- High degree of professionalism and interpersonal skills.
- Excellent critical-thinking and problem-solving skills.
- Demonstrated awareness of how to function in a team setting.
- Demonstrated presentation skills.

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

For a career that defies imagination: objects in your future are closer than they appear. Join us. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed, and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease.
EQUAL OPPORTUNITY STATEMENT Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
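As an illustration of the pipeline-level data-quality checks this listing describes, here is a minimal PySpark sketch that computes row counts and null rates for required columns. The table name, columns, and structure are hypothetical placeholders, not Amgen's actual implementation.

```python
# A minimal sketch of basic data-quality checks on a pipeline output table.
# Table and column names are hypothetical; real pipelines would alert or
# fail a run when these metrics breach agreed thresholds.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()


def run_basic_checks(table: str, required_cols: list[str]) -> dict:
    df = spark.table(table)
    results = {"table": table, "row_count": df.count()}
    for col in required_cols:
        # Null rate per required column, one of the simplest lineage-adjacent
        # quality signals to capture alongside pipeline metadata.
        nulls = df.filter(F.col(col).isNull()).count()
        results[f"{col}_null_fraction"] = nulls / max(results["row_count"], 1)
    return results


print(run_basic_checks("clinical.lab_results", ["subject_id", "visit_date"]))
```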
Posted 8 hours ago
8.0 - 13.0 years
6 - 8 Lacs
Hyderābād
On-site
India - Hyderabad JOB ID: R-219115 ADDITIONAL LOCATIONS: India - Hyderabad WORK LOCATION TYPE: On Site DATE POSTED: Jun. 27, 2025 CATEGORY: Information Systems

Join Amgen's Mission of Serving Patients
At Amgen, if you feel like you're part of something bigger, it's because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we've helped pioneer the world of biotech in our fight against the world's toughest diseases. With our focus on four therapeutic areas –Oncology, Inflammation, General Medicine, and Rare Disease– we reach millions of patients each year. As a member of the Amgen team, you'll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you'll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

Specialist IS Bus Sys Analyst, Neural Nexus

What you will do
Let's do this. Let's change the world. In this vital role you will support the delivery of emerging AI/ML capabilities within Amgen's Neural Nexus program. As part of the Commercial Technology Data & Analytics team, you will collaborate with product owners and cross-functional partners to help design, implement, and iterate on a layered ecosystem built around DIAL (Data, Insights, Action, and Learning).

- Collaborate with the Commercial Data & Analytics (CD&A) team to help realize business value through the application of commercial data and emerging AI/ML technologies.
- Support delivery activities within the Scaled Agile Framework (SAFe), partnering with Engineering and Product Management to shape roadmaps, prioritize releases, and maintain a refined product backlog.
- Contribute to backlog management by helping break down Epics into Features and Sprint-ready User Stories, ensuring clear articulation of requirements and well-defined Acceptance Criteria and Definitions of Done.
- Ensure non-functional requirements are represented and prioritized within the backlog to maintain performance, scalability, and compliance standards.
- Collaborate with UX to align technical requirements, business processes, and scenarios with user-centered design.
- Assist in the development and delivery of engaging product demonstrations for internal and external partners.
- Support documentation efforts to maintain accurate records of system configurations, processes, and enhancements.
- Contribute to the launch and growth of Neural Nexus product teams focused on data connectivity, predictive modeling, and fast-cycle value delivery for commercial teams.
- Provide input for governance discussions and help prepare materials to support executive alignment on technology strategy and investment.

What we expect of you
We are all different, yet we all use our unique contributions to serve patients. We are seeking a highly skilled and experienced Specialist IS Business Analyst with a passion for innovation and a collaborative working style that partners effectively with business and technology leaders, with these qualifications.

Basic Qualifications:
- Doctorate, Master's, or Bachelor's degree with 8 to 13 years of experience in Information Systems
- Experience with writing user requirements and acceptance criteria
- Affinity for working in a DevOps environment and an Agile mindset
- Ability to work in a team environment, effectively interacting with others
- Ability to meet deadlines and schedules and be accountable

Must-Have Skills:
- Excellent problem-solving skills and a passion for solving complex challenges in AI-driven technologies
- Experience with Agile software development methodologies (Scrum)
- Superb communication skills and the ability to work with senior leadership with confidence and clarity
- Experience writing user requirements and acceptance criteria in agile project management systems such as JIRA
- Experience managing product features for PI planning and developing product roadmaps and user journeys

Good-to-Have Skills:
- Demonstrated expertise in data and analytics and related technology concepts
- Understanding of data and analytics software systems strategy, governance, and infrastructure
- Familiarity with low-code/no-code test automation software
- Technical thought leadership; able to communicate technical or complex subject matter in business terms
- Jira Align experience
- Experience with DevOps, Continuous Integration, and Continuous Delivery methodology

Soft Skills:
- Able to work under minimal supervision
- Excellent analytical and gap/fit assessment skills
- Strong verbal and written communication skills
- Team-oriented, with a focus on achieving team goals
- Strong presentation and public speaking skills

Technical Skills:
- Experience with cloud-based data technologies (e.g., Databricks, Redshift, S3 buckets) on AWS or similar cloud platforms
- Experience with design patterns, data structures, and test-driven development
- Knowledge of NLP techniques for text analysis and sentiment analysis

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Join us and make a lasting impact with the Amgen team. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed, and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease.

Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
Posted 8 hours ago
8.0 years
4 - 10 Lacs
Hyderābād
On-site
India - Hyderabad JOB ID: R-217915 ADDITIONAL LOCATIONS: India - Hyderabad WORK LOCATION TYPE: On Site DATE POSTED: Jun. 27, 2025 CATEGORY: Information Systems

Join Amgen's Mission of Serving Patients
At Amgen, if you feel like you're part of something bigger, it's because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we've helped pioneer the world of biotech in our fight against the world's toughest diseases. With our focus on four therapeutic areas –Oncology, Inflammation, General Medicine, and Rare Disease– we reach millions of patients each year. As a member of the Amgen team, you'll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you'll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

Sr Mgr Software Development Engineering

What you will do
Let's do this. Let's change the world. In this vital role you will provide technical leadership to enhance the culture of innovation, automation, and solving difficult scientific and business challenges. Technical leadership includes providing vision and direction to develop scalable, reliable solutions.

- Provide leadership to select right-sized and appropriate tools and architectures based on requirements, data source formats, and current technologies.
- Develop, refactor, research, and improve Weave cloud platform capabilities.
- Understand business drivers and technical needs so our cloud services seamlessly, automatically, and securely provide the best service.
- Develop data flow pipelines to extract, transform, and load data from various data sources in various forms, including custom ETL pipelines that enable model and product development.
- Build strong partnerships with partner teams.
- Build data products and service processes that perform data transformation, metadata extraction, workload management, and error processing management to ensure high-quality data.
- Provide clear documentation for delivered solutions and processes.
- Collaborate with business partners to understand user stories and ensure the technical solution/build can deliver on those needs.
- Work with multi-functional teams to design and document effective and efficient solutions.
- Develop organisational change strategies and assist in their implementation.
- Mentor junior data engineers on standard processes in the industry and in the Amgen data landscape.

What we expect of you
We are all different, yet we all use our unique contributions to serve patients. The professional we seek is someone with these qualifications.

Basic Qualifications:
- Doctorate, Master's, or Bachelor's degree and 8 to 13 years of relevant experience

Must-Have Skills:
- Superb communication and interpersonal skills, with the ability to work closely with multi-functional GTM, product, and engineering teams.
- Minimum of 10+ years of overall Software Engineer or Cloud Architect experience.
- Minimum of 3+ years in an architecture role using public cloud solutions such as AWS.
- Experience with the AWS technology stack.

Good-to-Have Skills:
- Familiarity with big data technologies, AI platforms, and cloud-based data solutions.
- Ability to work effectively across matrixed organizations and lead collaboration between data and AI teams.
- Passion for technology and customer success, particularly in driving innovative AI and data solutions.
- Experience working with teams of data scientists, software engineers, and business experts to drive insights.
- Experience with AWS services such as EC2, S3, Redshift/Spectrum, Glue, Athena, RDS, Lambda, and API Gateway.
- Experience with big data technologies (Hadoop, Hive, HBase, Pig, Spark, etc.).
- Solid understanding of relevant data standards and industry trends.
- Ability to understand new business requirements and prioritize them for delivery.
- Experience working in the biopharma/life sciences industry.
- Proficiency in one of the coding languages (Python, Java, Scala).
- Hands-on experience writing SQL using any RDBMS (Redshift, Postgres, MySQL, Teradata, Oracle, etc.).
- Experience with schema design and dimensional data modeling.
- Experience with software DevOps CI/CD tools, such as Git, Jenkins, Linux, and shell scripting.
- Hands-on experience using Databricks/Jupyter or a similar notebook environment.
- Experience working with GxP systems.
- Experience working in an agile environment (i.e., user stories, iterative development, etc.).
- Experience working with test-driven development and software test automation.
- Experience working in a product environment.
- Good overall understanding of business, manufacturing, and laboratory systems common in the pharmaceutical industry, as well as the integration of these systems through applicable standards.

Soft Skills:
- Excellent analytical and problem-solving skills.
- Ability to work effectively with global, virtual teams.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Join us and make a lasting impact with the Amgen team. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed, and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease.

Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
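As a small illustration of the custom ETL pipelines this role leads, here is a minimal PySpark sketch that reads raw events from S3, standardizes them, and writes partitioned Parquet. Bucket names and columns are hypothetical placeholders, not a description of Amgen's systems.

```python
# A minimal sketch of a batch extract-transform-load flow on AWS:
# raw JSON events in, curated partitioned Parquet out.
# All paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("events-etl").getOrCreate()

raw = spark.read.json("s3://example-raw-bucket/events/")

curated = (
    raw.withColumn("event_date", F.to_date("event_ts"))
       .dropDuplicates(["event_id"])            # keeps re-runs idempotent
       .filter(F.col("event_type").isNotNull())
)

(curated.write
        .mode("overwrite")
        .partitionBy("event_date")              # enables partition pruning
        .parquet("s3://example-curated-bucket/events/"))
```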
Posted 8 hours ago
1.0 - 3.0 years
3 - 10 Lacs
Hyderābād
On-site
Job Description

Associate Manager, Scientific Data Engineering

The Opportunity
Based in Hyderabad, join a global healthcare biopharma company and be part of a 130-year legacy of success backed by ethical integrity, forward momentum, and an inspiring mission to achieve new milestones in global healthcare. Be part of an organisation driven by digital technology and data-backed approaches that support a diversified portfolio of prescription medicines, vaccines, and animal health products. Drive innovation and execution excellence. Join a team that is passionate about using data, analytics, and insights to drive decision-making and create custom software, allowing us to tackle some of the world's greatest health threats.

Our Technology Centers focus on creating a space where teams can come together to deliver business solutions that save and improve lives. An integral part of our company's IT operating model, Tech Centers are globally distributed locations where each IT division has employees to enable our digital transformation journey and drive business outcomes. These locations, in addition to the other sites, are essential to supporting our business and strategy. A focused group of leaders in each Tech Center helps ensure we can manage and improve each location, from investing in the growth, success, and well-being of our people, to making sure colleagues from each IT division feel a sense of belonging, to managing critical emergencies. Together, we leverage the strength of our team to collaborate globally, optimize connections, and share best practices across the Tech Centers.

Role Overview
- Design, develop, and maintain data pipelines to extract data from various sources and populate a data lake and data warehouse.
- Work closely with data scientists, analysts, and business teams to understand data requirements and deliver solutions aligned with business goals.
- Build and maintain platforms that support data ingestion, transformation, and orchestration across various data sources, both internal and external.
- Use data orchestration, logging, and monitoring tools to build resilient pipelines.
- Automate data flows and pipeline monitoring to ensure scalability, performance, and resilience of the platform.
- Monitor, troubleshoot, and resolve issues related to the data integration platform, ensuring uptime and reliability.
- Maintain thorough documentation for integration processes, configurations, and code to ensure easy onboarding for new team members and future scalability.
- Develop pipelines to ingest data into cloud data warehouses.
- Establish, modify, and maintain data structures and associated components.
- Create and deliver standard reports in accordance with stakeholder needs and conforming to agreed standards.
- Work within a matrix organizational structure, reporting to both the functional manager and the project manager.
- Participate in project planning, execution, and delivery, ensuring alignment with both functional and project goals.

What should you have
- Bachelor's degree in Information Technology, Computer Science, or any technology stream.
- 1 to 3 years of experience developing data pipelines and data infrastructure, ideally within a drug development or life sciences context.
- Demonstrated expertise in delivering large-scale information management technology solutions encompassing data integration and self-service analytics enablement.
- Experience in software/data engineering practices (including versioning, release management, deployment of datasets, agile and related software tools).
- Ability to design, build, and unit test applications on the Spark framework in Python; able to build PySpark-based applications for both batch and streaming requirements, which requires in-depth knowledge of Databricks/Hadoop.
- Experience working with storage frameworks like Delta Lake/Iceberg.
- Experience working with MPP data warehouses like Redshift.
- Cloud-native experience, ideally AWS certified.
- Strong working knowledge of at least one reporting/insight-generation technology.
- Good interpersonal and communication skills (verbal and written).
- Proven record of delivering high-quality results.
- Product- and customer-centric approach.
- Innovative thinking, experimental mindset.

Mandatory Skills (by skill category):
- Foundational Data Concepts: SQL (Intermediate/Advanced); Python (Intermediate)
- Cloud Fundamentals (AWS Focus): AWS Console, IAM roles, regions, concepts of cloud computing; AWS S3
- Data Processing & Transformation: Apache Spark (concepts and usage); Databricks (platform usage), Unity Catalog, Delta Lake
- ETL & Orchestration: AWS Glue (ETL, Catalog), Lambda; Apache Airflow (DAGs and orchestration) or another orchestration tool; dbt (Data Build Tool); Matillion (or similar ETL tool)
- Data Storage & Querying: Amazon Redshift / Azure Synapse; Trino / equivalent; AWS Athena / query federation
- Data Quality & Governance: data quality concepts and implementation; data observability concepts; Collibra or an equivalent tool
- Real-time / Streaming: Apache Kafka (concepts and usage)
- DevOps & Automation: CI/CD concepts and pipelines (GitHub Actions / Jenkins / Azure DevOps)

Our technology teams operate as business partners, proposing ideas and innovative solutions that enable new organizational capabilities. We collaborate internationally to deliver services and solutions that help everyone be more productive and enable innovation.

Who we are:
We are known as Merck & Co., Inc., Rahway, New Jersey, USA in the United States and Canada, and MSD everywhere else. For more than a century, we have been inventing for life, bringing forward medicines and vaccines for many of the world's most challenging diseases. Today, our company continues to be at the forefront of research to deliver innovative health solutions and advance the prevention and treatment of diseases that threaten people and animals around the world.

What we look for:
Imagine getting up in the morning for a job as important as helping to save and improve lives around the world. Here, you have that opportunity. You can put your empathy, creativity, digital mastery, or scientific genius to work in collaboration with a diverse group of colleagues who pursue and bring hope to countless people who are battling some of the most challenging diseases of our time. Our team is constantly evolving, so if you are among the intellectually curious, join us—and start making your impact today.

#HYDIT2025

Current Employees apply HERE
Current Contingent Workers apply HERE

Search Firm Representatives Please Read Carefully
Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs/resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific.
Please, no phone calls or emails.

Employee Status: Regular
Relocation:
VISA Sponsorship:
Travel Requirements:
Flexible Work Arrangements: Hybrid
Shift:
Valid Driving License:
Hazardous Material(s):

Required Skills: Business Intelligence (BI), Database Administration, Data Engineering, Data Management, Data Modeling, Data Visualization, Design Applications, Information Management, Software Development, Software Development Life Cycle (SDLC), System Designs
Preferred Skills:
Job Posting End Date: 08/26/2025
A job posting is effective until 11:59:59 PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day BEFORE the job posting end date.
Requisition ID: R353468
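Since this listing explicitly calls for PySpark with Delta Lake, here is a minimal sketch of a Delta Lake MERGE (upsert), the standard incremental-load pattern on Databricks. The paths, join key, and Spark session setup are hypothetical; it assumes the delta-spark package is enabled on the cluster.

```python
# A minimal sketch of an incremental upsert into a Delta table.
# Paths and the join key are hypothetical placeholders.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("delta-upsert").getOrCreate()

updates = spark.read.parquet("s3://example-landing/samples/")
target = DeltaTable.forPath(spark, "s3://example-lake/curated/samples/")

(target.alias("t")
       .merge(updates.alias("s"), "t.sample_id = s.sample_id")
       .whenMatchedUpdateAll()     # refresh records that changed upstream
       .whenNotMatchedInsertAll()  # append records seen for the first time
       .execute())
```

MERGE keeps reloads idempotent: re-running the same batch neither duplicates rows nor loses late corrections, which is why it tends to replace plain appends in regulated (e.g., GxP-adjacent) pipelines.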
Posted 8 hours ago
8.0 years
0 Lacs
Hyderābād
On-site
India
Information Technology (IT)
Group Functions
Job Reference # 316907BR
City: Hyderabad
Job Type: Full Time

Your role
Are you passionate about data engineering? Are you keen to manage data that can materially help teams improve the way they work? You have the opportunity to join the team working on the new DevLens initiative, an ambitious data initiative to support Developer Productivity and Engineering Excellence for over 40,000 IT employees. Together with your team, you will contribute to building and maintaining innovative data products spanning all engineering activities across UBS IT lines of business.

We're looking for a passionate Data Engineer to:
- develop, support, and improve data pipelines and data products with attention to governance, including sourcing, lineage, modelling, security, quality, distribution, and efficiency
- analyze and organize raw data, and combine multiple datasets of varying quality
- take ownership and drive deliveries within a supportive team environment
- follow engineering best practices, and ensure bank and regulatory compliance across the lifecycle
- ensure the quality, security, reliability, and compliance of our solutions, and promote re-use where possible
- automate testing and deployment where possible, and build observability to monitor and resolve production issues
- help manage the department's data inventory and data products
- provide data support to internal and external stakeholders

Your team
You will be part of the Development Practices & Standards (DP&S) Engineering global team within the Group CTO – Core Platform Engineering area. The team is responsible for delivering DevLens, the new engineering data solution, and for helping improve the efficiency of the end-to-end Software Development Lifecycle for the Bank. The team has a strong continuous growth and improvement mindset at the personal, team, and department level. We are a global organization that values diversity, collaboration, engineering and digital culture, and innovation. You will be able to join one or more UBS Certified programs for Engineers or Data Specialists, which offer many learning opportunities. This is one of many strategic initiatives to drive engineering excellence.

Your expertise
- ideally 8+ years of experience in designing/developing data analytics and data warehouse solutions
- strong experience with the Azure stack, in particular Azure Databricks, working with large Data/Delta Lake solutions with multi-format data
- good understanding of Azure Data Lake Storage Gen2
- proficient Python and T-SQL coding experience, in particular developing Spark jobs
- data modelling experience, ideally creating and maintaining data products
- good understanding of engineering practices and the software development lifecycle
- experience working in an enterprise software engineering environment, including Git (GitLab)
- nice to have: knowledge of Kafka data streaming on Azure; prior ETL experience using an industry tool, e.g. PowerCenter/Alteryx/SSIS

About us
UBS is the world's largest and the only truly global wealth manager. We operate through four business divisions: Global Wealth Management, Personal & Corporate Banking, Asset Management and the Investment Bank. Our global reach and the breadth of our expertise set us apart from our competitors. We have a presence in all major financial centers in more than 50 countries.

How we hire
We may request you to complete one or more assessments during the application process.
Join us
At UBS, we know that it's our people, with their diverse skills, experiences and backgrounds, who drive our ongoing success. We're dedicated to our craft and passionate about putting our people first, with new challenges, a supportive team, opportunities to grow and flexible working options when possible. Our inclusive culture brings out the best in our employees, wherever they are on their career journey. We also recognize that great work is never done alone. That's why collaboration is at the heart of everything we do. Because together, we're more than ourselves.

We're committed to disability inclusion, and if you need reasonable accommodation/adjustments throughout our recruitment process, you can always contact us.

Disclaimer / Policy statements
UBS is an Equal Opportunity Employer. We respect and seek to empower each individual and support the diverse cultures, perspectives, skills and experiences within our workforce.
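To ground the UBS role above, here is a minimal PySpark sketch of the kind of engineering-metrics data product it describes: aggregating raw commit events on Azure Data Lake Storage Gen2 into a daily Delta summary. Storage account, container, and column names are hypothetical placeholders, not the actual DevLens design.

```python
# A minimal sketch of a daily engineering-activity rollup on ADLS Gen2.
# All paths and schema names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("devlens-metrics").getOrCreate()

commits = spark.read.format("delta").load(
    "abfss://raw@exampleaccount.dfs.core.windows.net/git/commits"
)

daily = (
    commits.groupBy(F.to_date("committed_at").alias("day"), "repo")
           .agg(F.count("*").alias("commit_count"),
                F.countDistinct("author").alias("active_authors"))
)

(daily.write.format("delta").mode("overwrite")
      .save("abfss://curated@exampleaccount.dfs.core.windows.net/devlens/daily_commits"))
```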
Posted 8 hours ago
1.0 - 3.0 years
7 - 8 Lacs
Hyderābād
On-site
India - Hyderabad JOB ID: R-219085 ADDITIONAL LOCATIONS: India - Hyderabad WORK LOCATION TYPE: On Site DATE POSTED: Jun. 27, 2025 CATEGORY: Information Systems

Join Amgen's Mission of Serving Patients
At Amgen, if you feel like you're part of something bigger, it's because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we've helped pioneer the world of biotech in our fight against the world's toughest diseases. With our focus on four therapeutic areas –Oncology, Inflammation, General Medicine, and Rare Disease– we reach millions of patients each year. As a member of the Amgen team, you'll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you'll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

What you will do
Let's do this. Let's change the world. In this vital role you will be responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. The role involves working with large datasets, developing reports, supporting and executing data initiatives, and visualizing data to ensure it is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.

- Design, develop, and maintain data solutions for data generation, collection, and processing.
- Be a key team member assisting in the design and development of the data pipeline.
- Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems.
- Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions.
- Take ownership of data pipeline projects from inception to deployment, managing scope, timelines, and risks.
- Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency.
- Implement data security and privacy measures to protect sensitive data.
- Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions.
- Collaborate and communicate effectively with product teams.
- Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines that meet fast-paced business needs across geographic regions.
- Identify and resolve complex data-related challenges.
- Adhere to best practices for coding, testing, and designing reusable code/components.
- Explore new tools and technologies that will help improve ETL platform performance.
- Participate in sprint planning meetings and provide estimates for technical implementation.

Basic Qualifications:
- Master's degree and 1 to 3 years of Computer Science, IT, or related field experience; OR
- Bachelor's degree and 3 to 5 years of Computer Science, IT, or related field experience; OR
- Diploma and 7 to 9 years of Computer Science, IT, or related field experience.

Must-Have Skills:
- Hands-on experience with big data technologies and platforms, such as Databricks and Apache Spark (PySpark, SparkSQL), including workflow orchestration and performance tuning for big data processing.
- Proficiency in data analysis tools (e.g., SQL); proficient in SQL for extracting, transforming, and analyzing complex datasets from relational data stores.
- Experience with ETL tools such as Apache Spark, and various Python packages related to data processing and machine learning model development.
- Strong understanding of data modeling, data warehousing, and data integration concepts.
- Proven ability to optimize query performance on big data platforms.

Preferred Qualifications:
- Experience with software engineering best practices, including but not limited to version control, infrastructure-as-code, CI/CD, and automated testing.
- Knowledge of Python/R, Databricks, and cloud data platforms.
- Strong understanding of data governance frameworks, tools, and best practices.
- Knowledge of data protection regulations and compliance requirements (e.g., GDPR, CCPA).

Professional Certifications:
- AWS Certified Data Engineer preferred
- Databricks certification preferred

Soft Skills:
- Excellent critical-thinking and problem-solving skills
- Strong communication and collaboration skills
- Demonstrated awareness of how to function in a team setting
- Demonstrated presentation skills

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

For a career that defies imagination: objects in your future are closer than they appear. Join us. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed, and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease.

Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
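Since this listing stresses SQL proficiency and query optimization on big data platforms, here is a minimal sketch of SQL run through Spark where the filter on the partition column lets the engine prune partitions instead of scanning the full table. Table and column names are hypothetical placeholders.

```python
# A minimal sketch of a partition-pruned aggregation via Spark SQL.
# Schema, table, and column names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sql-analysis").getOrCreate()

summary = spark.sql("""
    SELECT site_id,
           COUNT(*)            AS record_count,
           AVG(turnaround_hrs) AS avg_turnaround
    FROM   lab.results
    WHERE  load_date >= '2025-01-01'   -- partition filter enables pruning
    GROUP  BY site_id
""")
summary.show()
```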
Posted 8 hours ago
0 years
4 - 9 Lacs
Hyderābād
On-site
About the Role:
Grade Level (for internal use): 09

The Role: S&P Global is seeking a Data Specialist who has a strong interest in data and would like to establish a career utilising the latest technologies, working with industry experts in the Energy markets. As a Data Analyst within our European Data Team, you will work closely with our energy market analysts and off-shore IT team and be responsible for day-to-day data processes. In this role you will look to understand data and apply the applicable business logic and metadata to turn data into information, information into insight, and insight into business decisions. You will also work closely with Business Intelligence and IT teams to implement data changes to databases, products, and processes.

The Impact: This role has a core impact on our Energy analytics team by adding value to our data through data quality and enhancements. It serves as the backbone of our European data operations team, and you will contribute to turning our data into information and insights.

The Career Opportunity: Data is at the core of all our products and strategy; understanding the data from an analytical and technical point of view can open opportunities across the business. An individual who excels at this role will be known throughout numerous leadership teams, allowing for many opportunities for growth within the organisation.

The Team / The Business: A dynamic data team tasked with ensuring the integrity of our data and the automation and visualization of processes and data flows. The team is supportive and collaborative, with exposure to many business units, from Data Modelling to IT. They work well at moving forward with the group's mission and goals. A new member of this team will hit the ground running and collaborate with their team members immediately.

What We're Looking For: We are looking for a process-orientated, methodological thinker with a keen eye for detail; someone who enjoys dealing with large sets of data and problem solving. This role would be ideal for someone proactive and eager to learn about energy markets, with a strong working knowledge of large datasets. The role is a great place to hone your skills and grow within a strong and collaborative team.

Responsibilities:
- Analyse data workflows, understanding how they link to our platforms.
- Contribute to and manage daily processes, ensuring you find, investigate, resolve, and report issues to internal and external platform users to maintain the integrity of S&P Global's European energy data.
- Maintain and create data documentation, including mappings and quality thresholds.
- Assist in answering data-related client questions, both internal and external, to ensure platform user issues are investigated and status updates are provided to client-facing team members.
- Collaborate with cross-functional teams to understand their data requirements and issues, and generate innovative ideas and solutions.
- Work closely with clients and analysts to provide data subject matter expertise.
- Communicate data changes to data stakeholders (internal/external).
- Maintain the new and existing data change request backlog to ensure all requests are tracked and completed, and prioritise the backlog with analysts to ensure our dataset is accurate and up to date.
- Assist in analytical support of data sets by working with the wider data team, analysts, and the Business Intelligence team to ensure we have a complete and accurate dataset for European Gas, Power, LNG, and other commodities.
- Work with the Data Collections group and internal market analysts to ensure new data collection requests are processed and prioritized.
- Ensure new and edited data points are quality assured before release and integrated into products, business logic, and quality assurance processes to ensure consistency between database and products.

Requirements/Skills:
- Enthusiastic about problem solving.
- Excellent oral and written communication skills.
- STEM (Science, Technology, Economics, or Maths) related university-level education or equivalent.
- Intermediate knowledge of and exposure to databases (PostgreSQL, SQL Server).
- Intermediate Python coding skills.
- Intermediate Spark (PySpark) experience desirable.
- Experience working with Databricks and Apache Airflow.
- Knowledge of Git (GitHub) desirable but not essential.
- Understanding of cloud computing (AWS preferred).
- Ability to retrieve, interrogate, manipulate, and analyse data, and to present findings.
- Advanced troubleshooting and multi-tasking skills with a meticulous eye for detail.
- Experience of processes in agile work streams driving data quality and improvement; experience of Azure DevOps desirable but not essential.
- Knowledge of business intelligence tools for analysing datasets and visualizing data.
- Interest in technology to visualize, interrogate, and report data.
- Strong communication skills over email, instant chat, and face-to-face, and experience building constructive working relationships.
- A strong track record and demonstrable interest in analysing data.
- Knowledge of or interest in energy markets or energy data desirable but not essential.

What's In It For You?

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability, to analyzing energy transition across the supply chain, to building workflow solutions that make it easy to tap into insight and apply it, we are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.

Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead.
We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include:
- Health & Wellness: Health care coverage designed for the mind and body.
- Flexible Downtime: Generous time off helps keep you energized for your time on.
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference.
For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domain, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, "pre-employment training" or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here.

Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law Poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

20 - Professional (EEO-2 Job Categories-United States of America), ANLYTC202.1 - Middle Professional Tier I (EEO Job Group)
Job ID: 316042
Posted On: 2025-06-28
Location: Gurgaon, India
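As an illustration of the daily data-quality gate this Data Specialist role describes, here is a minimal Python sketch that loads an incoming delivery and checks it against simple completeness thresholds before release. The file name, columns, and thresholds are hypothetical placeholders, not S&P Global's actual checks.

```python
# A minimal sketch of a delivery-level data-quality gate with pandas.
# File name, column names, and thresholds are hypothetical.
import pandas as pd

THRESHOLDS = {"min_rows": 500, "max_null_fraction": 0.02}


def check_delivery(path: str) -> list[str]:
    df = pd.read_csv(path, parse_dates=["delivery_date"])
    issues = []
    if len(df) < THRESHOLDS["min_rows"]:
        issues.append(f"only {len(df)} rows (expected >= {THRESHOLDS['min_rows']})")
    null_frac = df["price_eur_mwh"].isna().mean()
    if null_frac > THRESHOLDS["max_null_fraction"]:
        issues.append(f"price nulls at {null_frac:.1%}")
    return issues


for issue in check_delivery("gas_hub_prices.csv"):
    print("DQ issue:", issue)
```

Checks like these are what lets a data operations team catch a short or partially null delivery before it propagates into downstream products.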
Posted 8 hours ago
0 years
0 Lacs
Hyderābād
On-site
Who are we? CDK Global is the largest technical soltuions provider for the automotive retail industry that is setting the the landscape for automotive dealers, original equipment manufacturers (OEMs) and the customers they serve. As a technology company, we have a significant focus moving our applications to the public cloud and in the process working multiple transformation/modernization Be Part of Something Bigger Each year, more than three percent of the U.S. gross domestic product (GDP) is attributed to the auto industry, which flows through our customer, the auto dealer. It’s time you joined an evolving marketplace where research and development investment is measured in the tens of billions. It’s time you were a part of something bigger. We’re expanding our workforce – engineers, architects, developers and more – onboarding early adopters who can optimize, pivot and keep pace with ever-evolving development roadmaps and applications. Lead the way – Do you have a passion for doing public cloud right? Your technical thought leadership and deep skills will set the path for CDK’s use of public cloud computing. You will influence the organization at all levels. Automate, Automate, Automate - you’ll lead initiatives to automate everything from assisting application development teams in writing deployment code to developing Infrastructure-as-Code to drive security, operational process automation and governance. There’s got to be a better way - Can you look at an architecture, process, or application flow and see a more stable, standard, or optimized way to accomplish the same thing? Do you ask “why” when a request doesn’t sound right? Do you see problems and immediately want to fix them yourself? Can you find ways to quantify and prioritize these improvements? Join Our Team Growth potential, flexibility and material impact on the success and quality of a next-gen, enterprise software product make CDK an excellent choice for those who thrive in challenging, fast-paced engineering environments. The possibilities for impact are endless. We have exceptional opportunities to evolve our industry by driving change through new technology. If you’re ready for high-impact, you’re ready for CDK. Role: Define/Maintain/Implement CDK’s Public Clould standards including secrets management, storage, compute, networking, account management, database and operations. Leverage tools like AWS Trusted Advisor, 3rd party Cloud Cost Management tools and scripting to identify and drive cost optimization. This will include working with Application owners to achieve the cost savings. Design and implement Cloud Security Controls that creates guard rails for application teams to work within ensuring proper platform security for applications deployed within the CDK cloud environments. Design/Develop/Implement cloud solutions. Leveraging cloud native services, wrap the appropriate security, automation and service levels to support CDK business needs. Examples of solutions this role will be responsible for developing and supporting are Business Continuity/Backup and Recovery, Identity and Access Management, data services including long term archival, DNS, etc. Develop/maintain/implement cloud platform standards (User Access & Roles, tagging, security/compliance controls, operations management, performance management and configuration management) Responsible for writing and eventual automation of operational run-books for operations. 
Assist application teams with automating their production support run-books (automate everywhere).
Assist application teams when they have issues using AWS services where they are not yet fully up to speed in their use.
Hands-on development of automation solutions to support application teams.
Define and maintain minimum application deployment standards (governance, cost management and tech debt).
Optimize and tune designs based on performance and root cause analysis.
Analyze existing solutions’ alignment to infrastructure standards and provide feedback to both evolve and mature the product solutions and CDK public cloud standards.
Essential Duties & Skills: This is a hands-on role where the candidate will take on technical tasks that require in-depth knowledge of public cloud usage and best practices. Some of the areas within AWS where you will be working include:
Compute: EC2, EKS, RDS, Lambda
Networking: Load Balancing (ALB/ELB), VPN, Transit Gateways, VPCs, Availability Zones/Regions
Storage: EBS, S3, Archive Services, AWS Backup
Security: AWS Config, CloudWatch, CloudTrail, Route53, GuardDuty, Detective, Inspector, Security Hub, Secrets Server, KMS, AWS Shield, Security Groups, AWS Identity and Access Management, etc.
Cloud Cost Optimization: Cost Optimizer, Trusted Advisor, Cost Explorer, Harness Cloud Cost Management or equivalent cost management tools (a small scripting sketch follows this listing).
Preferred:
Experience with 3rd-party SaaS solutions like Databricks, Snowflake, Confluent Kafka
Broad understanding/experience across full-stack infrastructure technologies
Site Reliability Engineering practices
GitHub/Artifactory/Bamboo/Terraform
Database solutions (SQL/NoSQL)
Containerization solutions (Docker, Kubernetes)
DevOps processes and tooling
Message queuing, data streaming and caching solutions
Networking principles and concepts
Scripting and development; Python and Java preferred
Server-based operating systems (Windows/Linux) and web services (IIS, Apache)
Experience designing, optimizing and troubleshooting public cloud platforms associated with large, complex application stacks
Clear and concise communication, and comfort working with stakeholders at all levels of the organization
Ability to manage and prioritize multiple projects with competing resource requirements and timelines
Years of Experience: 4-5+ years working in the AWS public cloud environment; AWS Solutions Architect Professional certification preferred; experience with Infrastructure as Code (CloudFormation, Terraform).
At CDK, we believe inclusion and diversity are essential in inspiring meaningful connections to our people, customers and communities. We are open, curious and encourage different views, so that everyone can be their best selves and make an impact. CDK is an Equal Opportunity Employer committed to creating an inclusive workforce where everyone is valued. Qualified applicants will receive consideration for employment without regard to race, color, creed, ancestry, national origin, gender, sexual orientation, gender identity, gender expression, marital status, religion, age, disability (including pregnancy), results of genetic testing, service in the military, veteran status or any other category protected by law. Applicants for employment in the US must be authorized to work in the US. CDK may offer employer visa sponsorship to applicants.
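For a flavor of the scripting-driven cost optimization described above, here is a minimal sketch, assuming boto3 with configured AWS credentials; the region is illustrative, and any volumes flagged would still need owner review before action:

    import boto3

    def find_unattached_volumes(region="us-east-1"):
        """Flag unattached EBS volumes, a common source of avoidable spend."""
        ec2 = boto3.client("ec2", region_name=region)
        volumes = ec2.describe_volumes(
            Filters=[{"Name": "status", "Values": ["available"]}]
        )["Volumes"]
        for vol in volumes:
            print(f"{vol['VolumeId']}: {vol['Size']} GiB, created {vol['CreateTime']}")
        return volumes

    if __name__ == "__main__":
        find_unattached_volumes()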
Posted 8 hours ago
15.0 years
4 - 8 Lacs
Hyderābād
On-site
Purpose: Over 15 years, we have become a premier global provider of multi-cloud management, cloud-native application development solutions, and strategic end-to-end digital transformation services. Headquartered in Canada and with regional headquarters in the U.S. and the United Kingdom, Centrilogic delivers smart, streamlined solutions to clients worldwide. We are looking for a passionate and experienced Data Engineer to work with our other 70 Software, Data and DevOps engineers to guide and assist our clients’ data modernization journey. Our team works with companies with ambitious missions - clients who are creating new, innovative products, often in uncharted markets. We work as embedded members and leaders of our clients' development and data teams. We bring experienced senior engineers, leading-edge technologies and mindsets, and creative thinking. We show our clients how to move to the modern frameworks of data infrastructures and processing, and we help them reach their full potential with the power of data. In this role, you'll be the day-to-day primary point of contact with our clients to modernize their data infrastructures, architecture, and pipelines.
Principal Responsibilities:
Consulting clients on cloud-first strategies for core, bet-the-company data initiatives
Providing thought leadership on both process and technical matters
Becoming a real champion and trusted advisor to our clients on all facets of Data Engineering
Designing, developing, deploying, and supporting the modernization and transformation of our clients’ end-to-end data strategy, including infrastructure, collection, transmission, processing, and analytics
Mentoring and educating clients’ teams to keep them up to speed with the latest approaches, tools and skills, and setting them up for continued success post-delivery
Required Experience and Skills:
Must have either Microsoft Certified Azure Data Engineer Associate or Fabric Data Engineer Associate certification.
Must have experience working in a consulting or contracting capacity on large data management and modernization programs.
Experience with SQL Server and data engineering on platforms such as Azure Data Factory, Databricks, Data Lake, and Synapse.
Strong knowledge and demonstrated experience with Delta Lake and Lakehouse Architecture.
Strong knowledge of securing Azure environments, such as RBAC, Key Vault, and Azure Security Center.
Strong knowledge of Kafka and Spark and extensive experience using them in a production environment.
Strong and demonstrable experience as a DBA in large-scale MS SQL environments deployed in Azure.
Strong problem-solving skills, with the ability to get to the root of an issue quickly.
Strong knowledge of Scala or Python.
Strong knowledge of Linux administration and networking.
Scripting skills and Infrastructure as Code (IaC) experience using PowerShell, Bash, and ARM templates.
Understanding of security and corporate governance issues related to cloud-first data architecture, as well as accepted industry solutions.
Experience in enabling continuous delivery for development teams using scripted cloud provisioning and automated tooling.
Experience working with an Agile development methodology that is fit for purpose.
Sound business judgment and demonstrated leadership.
Posted 8 hours ago
3.0 - 5.0 years
9 - 12 Lacs
Hyderābād
On-site
Position: Python Developer
Location: Hyderabad (immediate joiners)
Experience: 3-5 years
Salary budget: ₹9-12 LPA
Responsibilities:
· Develop, maintain, and optimize data pipelines using Python and Azure Data Services.
· Work with Azure Data Factory, Azure Databricks, Synapse Analytics, and SQL Server to process and analyze large datasets.
· Implement ETL processes and data transformations for structured and unstructured data (a short sketch follows this listing).
· Collaborate with data engineers, analysts, and business teams to ensure data integrity and availability.
· Design and implement data models and storage solutions on Azure.
· Optimize data workflows for performance and scalability.
· Ensure security and compliance with Azure best practices.
Required Skills & Qualifications:
· 3-5 years of experience in Python development.
· Strong knowledge of the Azure Data Platform (Azure Data Factory, Databricks, Synapse, SQL Server).
· Experience with big data processing using PySpark or Apache Spark.
· Proficiency in SQL for querying and managing databases.
· Familiarity with data warehousing and cloud-based data solutions.
· Knowledge of CI/CD pipelines and DevOps practices for data engineering.
· Strong problem-solving and analytical skills.
Preferred Qualifications:
· Azure certifications would be an added advantage.
· Experience with machine learning and AI-driven data processing.
· Knowledge of Snowflake or other cloud-based data platforms.
Job Type: Full-time
Pay: ₹900,000.00 - ₹1,200,000.00 per year
Schedule: Day shift
Work Location: In person
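For a sense of the PySpark-based ETL work such a role involves, here is a minimal, self-contained sketch; the file paths, schema, and column names are hypothetical:

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("etl-example").getOrCreate()

    # Extract: read raw CSV data (path and columns are illustrative).
    orders = spark.read.option("header", True).csv("/data/raw/orders.csv")

    # Transform: cast types, drop bad rows, aggregate to daily revenue.
    daily_revenue = (
        orders
        .withColumn("amount", F.col("amount").cast("double"))
        .filter(F.col("amount").isNotNull())
        .groupBy("order_date")
        .agg(F.sum("amount").alias("revenue"))
    )

    # Load: write the curated result as Parquet for downstream analytics.
    daily_revenue.write.mode("overwrite").parquet("/data/curated/daily_revenue")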
Posted 8 hours ago
5.0 years
0 Lacs
Bengaluru
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Manager
Job Description & Summary: At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.
Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Job Description & Summary: A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.
Responsibilities: About the Role: We are hiring sharp, hands-on Data Engineers to build scalable data solutions and drive performance across modern data platforms. If you love writing clean code, solving tough data problems, and automating workflows, this role is for you.
What you will do:
· Build and manage high-performance data pipelines for batch and near real-time use cases
· Write optimized, complex SQL queries and stored procedures for analytics and reporting
· Develop modular Python scripts for automation, file processing, and data transformation using Pandas/NumPy
· Optimize queries and scripts over large-scale datasets (TBs) with a focus on speed and efficiency
· Build versioned, testable data models using DBT
· Orchestrate multi-step workflows with Apache Airflow (a minimal sketch follows this listing)
· Collaborate across teams to convert data needs into robust technical solutions
Mandatory skill sets: ‘Must have’ knowledge, skills and experiences
· 5+ years of hands-on experience in Data Engineering
· Strong command over SQL and Python, especially for transformation and automation
· Deep experience with DBT and Airflow in production environments
· Solid understanding of ETL/ELT, data modeling, and pipeline performance tuning
· Strong analytical thinking and debugging skills
Preferred skill sets: ‘Good to have’ knowledge, skills and experiences
· Experience with Teradata and Starburst (Presto/Trino)
· Familiarity with cloud platforms (Azure/GCP/Snowflake)
· Exposure to on-prem to cloud data migrations
· Knowledge of Git-based workflows and CI/CD pipelines
Years of experience required: 5-8 years
Education qualification: BE, B.Tech, ME, M.Tech, MBA, MCA (60% and above)
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor of Technology, Master of Engineering, Master of Business Administration, Bachelor of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Data Engineering, Python (Programming Language), Structured Query Language (SQL)
Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Coaching and Feedback, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling {+ 32 more}
Desired Languages (If blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date
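A minimal sketch of the DBT-plus-Airflow orchestration pattern this listing calls for, assuming Airflow 2.x with the dbt CLI available on the worker; the DAG id and commands are illustrative:

    from datetime import datetime
    from airflow import DAG
    from airflow.operators.bash import BashOperator

    # A two-step daily pipeline: build dbt models, then run dbt tests.
    with DAG(
        dag_id="daily_dbt_pipeline",
        start_date=datetime(2025, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        run_models = BashOperator(task_id="dbt_run", bash_command="dbt run")
        test_models = BashOperator(task_id="dbt_test", bash_command="dbt test")
        run_models >> test_models  # tests execute only after a successful build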
Posted 8 hours ago
10.0 years
3 - 7 Lacs
Bengaluru
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Manager
Job Description & Summary: At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.
Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Job Description & Summary: A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.
Responsibilities: We are currently seeking an experienced Agile Coach (FS domain) for our client for their agile transformation journey. Ideally, the candidate should hold the ICP-ACC certification and possess a strong background in coaching agile teams, facilitating agile practices, and fostering a culture of continuous improvement.
Mandatory skill sets: ‘Must have’ knowledge, skills and experiences
· ICP-ACC (ICAgile Certified Professional in Agile Coaching)
· Proven experience in agile coaching across multiple teams or departments
Preferred skill sets: ‘Good to have’ knowledge, skills and experiences
· Strong facilitation, mentoring, and training skills
· Excellent communication and interpersonal abilities
Experience and Qualifications:
· Experience: 10+ years
· Notice period: immediate to 30 days
· Location: Bangalore
· 3 days/week work from client office
Education qualification: BE, B.Tech, ME, M.Tech, MBA, MCA (60% and above)
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor of Engineering, Master of Business Administration, Master of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Agile Coaching
Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Coaching and Feedback, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling {+ 32 more}
Desired Languages (If blank, desired languages not specified)
Travel Requirements
Available for Work Visa Sponsorship?
Government Clearance Required?
Job Posting End Date
Posted 8 hours ago
0 years
2 - 10 Lacs
Bengaluru
On-site
Role: AWS Data Specialist
8+ years of experience with Managed Apache Kafka and Databricks in AWS, DynamoDB, AWS Glue, and AWS pipeline development. FHIR skills required – all of the above are must-have skills.
Qualifications: BE
Range of Year Experience: Min 5 years – Max 8 years
Posted 8 hours ago
3.0 years
5 - 8 Lacs
Bengaluru
Remote
Location: Remote - India
Job Type: Full-Time
Role Overview: We are looking for a QA Engineer with strong experience in test automation and CI/CD practices, particularly using Azure DevOps. The ideal candidate will have solid Python programming skills and a strong understanding of quality assurance processes in cloud-native environments. This role will focus on integrating testing into build and release pipelines, ensuring product quality through automated and manual testing, and working closely with developers and data engineers. Additionally, the role requires experience in testing Azure Data Factory pipelines and Databricks notebooks and workflows.
Key Responsibilities:
Design, configure, and maintain test automation in Azure DevOps pipelines.
Implement quality gates, automated test triggers, and release-stage validation steps.
Collaborate with DevOps and engineering teams to maintain reliable, scalable build and deployment workflows.
Develop and maintain automated test scripts and frameworks using Python.
Automate functional, API, integration, and regression test cases.
Perform load testing and data quality & validation testing.
Use Python to create utilities for test data management, reporting, and environment setup.
Design and execute test cases for new features and production support issues.
Conduct manual testing for UI, exploratory, and user acceptance validation.
Work with developers to identify bugs early through shift-left testing practices.
Review application logs and diagnostics to analyze test failures and report defects.
Validate end-to-end data workflows and ETL pipelines in Azure Data Factory.
Perform schema validation, data quality checks, and lineage validation across data layers.
Identify and troubleshoot issues in data ingestion, processing, and storage.
Required Qualifications:
3+ years of experience in QA engineering or test automation roles.
Hands-on experience with Azure DevOps CI/CD pipelines and testing stages.
Strong programming knowledge in Python.
Experience with API testing and automation tools (e.g., Postman, REST Assured, Pytest); a small example follows this listing.
Familiarity with Git, branching strategies, and code integration workflows.
Solid understanding of SDLC, QA methodologies, and agile development.
Preferred Qualifications:
Experience working with containerized environments (Docker, Kubernetes).
Exposure to performance or load testing tools (e.g., JMeter, k6).
Familiarity with cloud-based environments and test environment provisioning.
Knowledge of SQL or data validation testing techniques.
Soft Skills:
Strong attention to detail and analytical thinking.
Effective communication and documentation skills.
Self-starter with the ability to work independently and collaboratively.
Comfortable working in fast-paced, agile development environments.
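A minimal pytest-style API test sketch of the sort such pipelines gate on; the base URL and endpoints are hypothetical:

    import pytest
    import requests

    BASE_URL = "https://api.example.com"  # hypothetical service under test

    def test_health_endpoint_returns_ok():
        """Smoke test suitable for a CI/CD quality gate."""
        response = requests.get(f"{BASE_URL}/health", timeout=10)
        assert response.status_code == 200

    @pytest.mark.parametrize("user_id, expected_status", [(1, 200), (999999, 404)])
    def test_get_user(user_id, expected_status):
        """Cover both the happy path and a missing-resource case."""
        response = requests.get(f"{BASE_URL}/users/{user_id}", timeout=10)
        assert response.status_code == expected_status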
Posted 8 hours ago
2.0 years
3 - 5 Lacs
Bengaluru
On-site
Description ISP Data Science - Analyst Role Profile
Location: Bangalore, India
Purpose of Role: We are seeking a highly skilled and data-driven Data Science - Analyst to join our team. The ideal candidate will leverage advanced data analytics and AI techniques along with business heuristics to analyse student enrolment and retention data, identify trends, and provide actionable insights to support ISP and its schools’ enrolment goals. This role is critical for improving student experiences, optimising resource allocation, and enhancing overall enrolment and retention performance. The successful candidate will bring strong expertise in Python or equivalent-based statistical modelling (including propensity modelling), experience with Azure Databricks for scalable data workflows, and advanced skills in Power BI to build high-impact visualisations and dashboards. The role requires both technical depth and the ability to translate complex insights into strategic recommendations.
ISP Principles:
Begin with our children and students. Our children and students are at the heart of what we do. Simply, their success is our success. Wellbeing and safety are both essential for learners and learning. Therefore, we are consistent in identifying potential safeguarding and Health & Safety issues and acting and following up on all concerns appropriately.
Treat everyone with care and respect. We look after one another, embrace similarities and differences and promote the well-being of self and others.
Operate effectively. We focus relentlessly on the things that are most important and will make the most difference. We apply school policies and procedures and embody the shared ideas of our community.
Are financially responsible. We make financial choices carefully based on the needs of the children, students and our schools.
Learn continuously. Getting better is what drives us. We positively engage with personal and professional development and school improvement.
ISP Data Science - Analyst Key Responsibilities:
Data Analysis: Collect, clean, and preprocess enrolment, retention, and customer satisfaction data from multiple sources. Analyse data to uncover trends, patterns, and factors influencing enrolment, retention, and customer satisfaction.
AI and Machine Learning Implementation: Expertise in developing and deploying propensity models to support customer acquisition and retention activities and strategy. Experience with Azure, Databricks (and other equivalent platforms) for scalable data engineering and machine learning workflows. Develop and implement AI models, such as predictive analytics and propensity models, to forecast enrolment patterns and retention risks. Use machine learning algorithms to identify high-risk student populations and recommend intervention strategies. Support lead scoring model development on HubSpot CRM. Collaborate with key colleagues to understand and define the most impactful use cases for AI and Machine Learning. Analyse cost/benefit of deploying systems and provide recommendations.
Reporting and Visualisation: Create relevant dashboards on MS Power BI, reports, and visualisations to communicate key insights to stakeholders. Present findings in a clear and actionable manner to support decision-making.
Collaboration: Work closely with key Group and Regional colleagues to understand challenges and opportunities related to enrolment and retention. Partner with IT and data teams to ensure data integrity and accessibility.
Continuous Improvement: Monitor the performance of AI models and analytics tools, making necessary adjustments to improve accuracy and relevance. Stay updated with the latest advancements in AI, data analytics, and education trends.
Skills, Qualifications and Experience:
Education: Bachelor’s degree in Data Science, Computer Science, Statistics, or a related field (Master’s preferred).
Experience:
At least 2 years’ experience in data analytics, preferably in education or a related field.
Experience in implementing predictive models - propensity models - and interpreting their results (a brief sketch follows this listing).
Strong Python skills for statistical modelling, including logistic regression, clustering, and decision trees.
Hands-on experience with Azure Databricks is highly preferred.
Strong working knowledge of Power BI for building automated and interactive dashboards.
Hands-on experience with AI/ML tools and frameworks, and current employment in an AI/ML role.
Proficiency in SQL, Python, R, or other data analytics languages.
Skills and preferred attributes:
Strong understanding of statistical methods and predictive analytics.
Proficiency in data visualization tools (e.g., Tableau, Power BI, or similar).
Excellent problem-solving, critical thinking, and communication skills.
Ability to work collaboratively with diverse teams.
Experience in education technology or student success initiatives.
Familiarity with CRM or student information systems.
Knowledge of ethical considerations in AI and data privacy laws.
ISP Commitment to Safeguarding Principles: ISP is committed to safeguarding and promoting the welfare of children and young people and expects all staff and volunteers to share this commitment. All post holders are subject to appropriate vetting procedures, including an online due diligence search, references and satisfactory Criminal Background Checks or equivalent covering the previous 10 years’ employment history.
ISP Commitment to Diversity, Equity, Inclusion, and Belonging: ISP is committed to strengthening our inclusive culture by identifying, hiring, developing, and retaining high-performing teammates regardless of gender, ethnicity, sexual orientation and gender expression, age, disability status, neurodivergence, socio-economic background or other demographic characteristics. Candidates who share our vision and principles and are interested in contributing to the success of ISP through this role are strongly encouraged to apply.
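For illustration, a minimal propensity-model sketch of the kind referenced above, using scikit-learn on synthetic data; the features and labels are hypothetical stand-ins for real enrolment signals:

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import roc_auc_score

    # Synthetic stand-ins for features such as engagement, tenure, support contacts.
    rng = np.random.default_rng(42)
    X = rng.normal(size=(1000, 3))
    # Labels loosely driven by the first two features, plus noise.
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=1000) > 0).astype(int)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    model = LogisticRegression().fit(X_train, y_train)
    propensity = model.predict_proba(X_test)[:, 1]  # estimated P(retained) per student
    print("AUC:", roc_auc_score(y_test, propensity))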
Posted 8 hours ago
3.0 years
9 - 9 Lacs
Bengaluru
On-site
JOB DESCRIPTION Be part of a dynamic team where your distinctive skills will contribute to a winning culture and team. As a Data Engineer III at JPMorgan Chase within Asset & Wealth Management, you serve as a seasoned member of an agile team to design and deliver trusted data collection, storage, access, and analytics solutions in a secure, stable, and scalable way. You are responsible for developing, testing, and maintaining critical data pipelines and architectures across multiple technical areas within various business functions in support of the firm’s business objectives.
Job responsibilities:
Support review of controls to ensure sufficient protection of enterprise data.
Design and develop data solutions leveraging the Databricks platform, ensuring efficiency, scalability, resiliency and performance.
Advise and make custom configuration changes in one to two tools to generate a product at the business or customer request.
Stay updated on evolving capabilities in the Data Lakehouse space, and evaluate and implement the ones that meet our requirements.
Update logical or physical data models based on new use cases.
Frequently use SQL and understand NoSQL databases and their niche in the marketplace.
Add to a team culture of diversity, equity, inclusion, and respect.
Required qualifications, capabilities, and skills:
Formal training or certification on data engineering concepts and 3+ years applied experience
Experience across the data lifecycle
Experience developing applications leveraging Python
Expertise in SQL (e.g., joins and aggregations)
Expertise in Spark and related technologies
Significant experience with statistical data analysis and ability to determine appropriate tools and data patterns to perform analysis
Experience working in an agile environment
Preferred qualifications, capabilities, and skills:
Experience on the AWS cloud platform
Knowledge of Data Mesh architecture
ABOUT US: JPMorganChase, one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses and many of the world’s most prominent corporate, institutional and government clients under the J.P. Morgan and Chase brands. Our history spans over 200 years and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing and asset management. We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants’ and employees’ religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation.
ABOUT THE TEAM: J.P. Morgan Asset & Wealth Management delivers industry-leading investment management and private banking solutions.
Asset Management provides individuals, advisors and institutions with strategies and expertise that span the full spectrum of asset classes through our global network of investment professionals. Wealth Management helps individuals, families and foundations take a more intentional approach to their wealth or finances to better define, focus and realize their goals.
Posted 8 hours ago
5.0 years
0 Lacs
Bengaluru
On-site
Company Description: Bosch Global Software Technologies Private Limited is a 100% owned subsidiary of Robert Bosch GmbH, one of the world's leading global suppliers of technology and services, offering end-to-end Engineering, IT and Business Solutions. With over 28,200 associates, it is the largest software development center of Bosch outside Germany, making it the Technology Powerhouse of Bosch in India, with a global footprint and presence in the US, Europe and the Asia Pacific region.
Job Description:
Hands-on experience with containerized workloads (e.g., Docker, Kubernetes).
Proficiency in networking concepts and their application in cloud environments.
Familiarity with Infrastructure as Code (IaC) tools like Terraform.
Solid scripting and automation skills (e.g., Python, Bash, PowerShell).
Understanding of CI/CD pipelines and DevOps practices.
Knowledge of monitoring and logging tools (Prometheus, Grafana, etc.) is advantageous.
Support cloud infrastructure projects and advise on Databricks-Azure integrations.
Qualifications:
Educational qualification: Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field. Relevant Azure certifications (e.g., Azure Administrator Associate, Azure Solutions Architect) are preferred.
Experience: 5+ years of experience in cloud (Azure) infrastructure management/administration. Experience with on-prem infrastructure and hybrid cloud environments is a plus.
Posted 9 hours ago
5.0 years
7 - 9 Lacs
Chennai
On-site
5 - 12 Years | 20 Openings
Bangalore, Chennai, Gurgaon-IND-GGN-MobileComm, Hyderabad, Kochi, Noida, Pune, Trivandrum
Role description: This role requires 5+ years of experience and the ability to work closely with project leads and clients. As a Sr. Data Engineer, you'll join our team to design and develop scalable data processing systems. You'll gather requirements, create end-to-end solutions, and oversee projects from conception to production. Key responsibilities include mentoring team members and translating complex business needs into robust technical solutions using cutting-edge technologies.
Technical Skills:
Must have strong expertise in Python, PySpark, Spark architecture and performance tuning (a brief sketch follows this listing).
Must have proficiency in writing optimized SQL, with attention to performance tuning.
Must have strong expertise in AWS data engineering services (Glue, Lambda, API Gateway, S3).
Ability to translate complex business requirements into technical solutions.
Good to have: working knowledge of Unix.
Good to have: hands-on experience with CloudFormation.
Good to have: experience with CI/CD pipelines.
Skills: AWS, Python, Databricks, PySpark
About UST: UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world’s best companies to make a real impact through transformation. Powered by technology, inspired by people and led by purpose, UST partners with their clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into their clients’ organizations. With over 30,000 employees in 30 countries, UST builds for boundless impact—touching billions of lives in the process.
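As one small example of the Spark performance-tuning work mentioned above, a sketch of a broadcast join, which avoids shuffling a large fact table against a small dimension table; the paths and column names are hypothetical:

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("tuning-demo").getOrCreate()

    fact = spark.read.parquet("/data/fact_orders")    # large table
    dim = spark.read.parquet("/data/dim_customers")   # small lookup table

    # Broadcasting the small side keeps the join map-side and avoids a shuffle.
    joined = fact.join(F.broadcast(dim), on="customer_id", how="left")

    # Repartition before writing to control the number and size of output files.
    joined.repartition(64).write.mode("overwrite").parquet("/data/mart/orders_enriched")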
Posted 9 hours ago
5.0 - 7.0 years
4 - 5 Lacs
Chennai
On-site
Position: Data Engineer – Azure Databricks
Purpose of the Position: To design, build, and optimize scalable data pipelines and solutions using Azure Databricks and related technologies, enabling Zodiac Maritime to make faster, data-driven decisions as part of its data transformation journey. Requires proficiency in data integration techniques, ETL processes and data pipeline architectures, and a strong grounding in data quality rules, principles and implementation.
Location: Nagpur/Pune/Chennai/Bangalore
Type of Employment: FTE
Key Result Areas and Activities:
Data Pipeline Development: Design and implement robust batch and streaming data pipelines using Azure Databricks and Spark.
Data Architecture Implementation: Apply Medallion Architecture to structure data layers (raw, enriched, curated); a small sketch follows this listing.
Data Quality & Governance: Ensure data accuracy, consistency, and governance using tools like Azure Purview and Unity Catalog.
Performance Optimization: Optimize Spark jobs, Delta Lake tables, and SQL queries for efficiency and cost-effectiveness.
Collaboration & Delivery: Work closely with analysts, architects, and business teams to deliver end-to-end data solutions.
Technical Experience:
Must Have: Hands-on experience with Azure Databricks, Delta Lake, Data Factory. Proficiency in Python, PySpark, and SQL with strong query optimization skills. Deep understanding of Lakehouse architecture and Medallion design patterns. Experience building scalable ETL/ELT pipelines and data transformations. Familiarity with Git, CI/CD pipelines, and Agile methodologies.
Good To Have: Knowledge of data quality frameworks and monitoring practices. Experience with Power BI or other data visualization tools. Understanding of IoT data pipelines and streaming technologies like Kafka/Event Hubs. Awareness of emerging technologies such as Knowledge Graphs.
Qualifications:
Education: Likely a degree in Computer Science, Data Engineering, Information Systems, or a related field.
Experience: Proven hands-on experience with the Azure data stack (Databricks, Data Factory, Delta Lake). Experience in building scalable ETL/ELT pipelines. Familiarity with data governance and DevOps practices.
Qualities: Strong problem-solving and analytical skills. Attention to detail and commitment to data quality. Collaborative mindset and effective communication. Proactive and self-driven. Passion for learning and staying updated with emerging data technologies.
Location: India
Years of Experience: 5 to 7 years
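A minimal sketch of the bronze-to-silver step in a Medallion-style pipeline, assuming a Databricks runtime with Delta Lake available; the mount paths, schema, and column names are hypothetical:

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("medallion-demo").getOrCreate()

    # Bronze: land raw events as-is, preserving the source for replayability.
    raw = spark.read.json("/mnt/raw/vessel_events/")
    raw.write.format("delta").mode("append").save("/mnt/bronze/vessel_events")

    # Silver: enforce types, drop malformed rows, deduplicate.
    bronze = spark.read.format("delta").load("/mnt/bronze/vessel_events")
    silver = (
        bronze
        .withColumn("event_ts", F.to_timestamp("event_ts"))
        .dropna(subset=["vessel_id", "event_ts"])
        .dropDuplicates(["vessel_id", "event_ts"])
    )
    silver.write.format("delta").mode("overwrite").save("/mnt/silver/vessel_events")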
Posted 9 hours ago
5.0 - 7.0 years
5 - 9 Lacs
Noida
On-site
Posted On: 27 Jun 2025
Location: Noida, UP, India
Company: Iris Software
Why Join Us? Are you inspired to grow your career at one of India’s Top 25 Best Workplaces in the IT industry? Do you want to do the best work of your life at one of the fastest growing IT services companies? Do you aspire to thrive in an award-winning work culture that values your talent and career aspirations? It’s happening right here at Iris Software.
About Iris Software: At Iris Software, our vision is to be our client’s most trusted technology partner, and the first choice for the industry’s top professionals to realize their full potential. With over 4,300 associates across India, U.S.A, and Canada, we help our enterprise clients thrive with technology-enabled transformation across financial services, healthcare, transportation & logistics, and professional services. Our work covers complex, mission-critical applications with the latest technologies, such as high-value complex Application & Product Engineering, Data & Analytics, Cloud, DevOps, Data & MLOps, Quality Engineering, and Business Automation.
Working at Iris: Be valued, be inspired, be your best. At Iris Software, we invest in and create a culture where colleagues feel valued, can explore their potential, and have opportunities to grow. Our employee value proposition (EVP) is about “Being Your Best” – as a professional and person. It is about being challenged by work that inspires us, being empowered to excel and grow in your career, and being part of a culture where talent is valued. We’re a place where everyone can discover and be their best version.
Job Description: We are seeking a skilled AWS Databricks Platform Administrator to manage and optimize our Databricks environment. The ideal candidate will have strong expertise in user access management, user persona development, and the ability to collaborate with architects to implement configuration changes. This role involves ensuring the security, performance, and reliability of the Databricks platform while supporting users and maintaining compliance with organizational policies.
Good experience with the SDLC; Databricks platform administration is a must. Must have security and access control experience, including user provisioning (a brief sketch follows this listing). Services integration experience. Should be able to work with enterprise architects. Good to have: API experience.
Required Skills & Qualifications:
5-7 years of experience as a Databricks Administrator or similar role.
Strong experience with AWS services (IAM, S3, EC2, Lambda, Glue, etc.).
Expertise in Databricks administration, workspace management, and security configurations.
Hands-on experience with AD groups, user access management, RBAC, and IAM policies.
Experience in developing and managing user personas within enterprise environments.
Strong understanding of network security, authentication, and data governance.
Proficiency in Python, SQL, and Spark for troubleshooting and automation.
Familiarity with Terraform, CloudFormation, or Infrastructure as Code (IaC) is a plus.
Knowledge of CI/CD pipelines and DevOps best practices is desirable.
Excellent communication and documentation skills.
Preferred Certifications:
AWS Certified Solutions Architect – Associate / Professional
Databricks Certified Data Engineer / Administrator
Certified Information Systems Security Professional (CISSP) – nice to have
Mandatory Competencies:
Data Science - Databricks
Cloud - AWS
Cloud - Azure
Cloud - AWS Lambda
Data on Cloud - AWS S3
Python - Python
Database - SQL
Big Data - SPARK
Beh - Communication and collaboration
Perks and Benefits for Irisians: At Iris Software, we offer world-class benefits designed to support the financial, health and well-being needs of our associates to help achieve harmony between their professional and personal growth. From comprehensive health insurance and competitive salaries to flexible work arrangements and ongoing learning opportunities, we're committed to providing a supportive and rewarding work environment. Join us and experience the difference of working at a company that values its employees' success and happiness.
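To illustrate the user-provisioning side of such a role, a minimal sketch that lists workspace users via the Databricks SCIM API; the workspace URL and token are placeholders, and the endpoint path reflects the API as commonly documented, so verify it against your workspace version:

    import requests

    # Placeholders: in practice these would come from a secrets manager.
    HOST = "https://my-workspace.cloud.databricks.com"
    TOKEN = "dapi-..."

    def list_workspace_users():
        """Fetch workspace users for an access review via the SCIM API."""
        resp = requests.get(
            f"{HOST}/api/2.0/preview/scim/v2/Users",
            headers={"Authorization": f"Bearer {TOKEN}"},
            timeout=30,
        )
        resp.raise_for_status()
        return [user["userName"] for user in resp.json().get("Resources", [])]

    if __name__ == "__main__":
        for name in list_workspace_users():
            print(name)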
Posted 9 hours ago
8.0 years
0 Lacs
Noida
On-site
Country India
Working Schedule Full-Time
Work Arrangement Hybrid
Relocation Assistance Available No
Posted Date 27-Jun-2025
Job ID 10255
Job Description and Requirements
Position Summary: The MetLife Corporate Technology (CT) organization is evolving to enable MetLife’s New Frontier strategy. With a strong vision in place, we are a global function focused on driving digital technology strategies for key corporate functions within MetLife including Finance, Actuarial, Reinsurance, Legal, Human Resources, Employee Experience, Risk, Treasury, Audit and Compliance. In partnership with our business leaders, we develop and deliver seamless technology experiences to our employees across the entire employee lifecycle. Our vision and mission is to create innovative, transformative and contemporary technology solutions to empower our leaders and employees so they can focus on what matters most, our customers. We are technologists with strong business acumen focused on developing our talent to continually transform and innovate. As part of the Tech Talent Transformation (T3) agenda, MetLife is establishing a Technology Center in India. This technology center will perform as an integrated organization between onshore, offshore, and strategic vendor partners in an Agile delivery model. We are seeking a highly skilled hands-on delivery engineer who is responsible for partnering with Internal Audit leaders, third-party vendors and IT executives to lead global transformation projects with the goal of attracting, developing and retaining talent across the organization. This position will be a part of a fast-paced IT team leveraging technology expertise that spans across Java, REACT, Python, Azure and AI. He/she should be a strategic thinker, an effective communicator, and an expert in technological development.
Key Relationships: Internal stakeholders – Corporate Technology Control Functions Leader, Control Functions Leadership team, India Corporate Technology AVP, and business process owners for Internal Audit.
Key Responsibilities:
Stakeholder Management - Managing key business stakeholders to deliver required technology capabilities to support the digital transformation agenda. Driving prioritization of the product backlog. This includes managing key vendors providing the resources, SaaS & other capabilities.
Technology Implementation - Implement and support projects on Internal Audit technology platforms, specifically Azure Cloud.
Ways of Working - Adoption of the Agile ways of working in the software delivery lifecycle.
E2E Software Lifecycle Management (Architecture, Design, Development, Testing & Production).
Evaluate/implement technical solutions supporting Internal Audit and SaaS-based solutions, talent development, performance management, and workforce analytics.
Work with functional experts to translate user requirements into technical specifications.
Partner with internal business process owners, technical team members, and senior management throughout the project life cycle.
Act as the intermediary to facilitate a clear understanding among all parties about business assumptions and requirements, design, technical, testing, and production migration requirements.
Drive the resolution and troubleshooting of issues during development and post-production support.
Responsible for supporting day-to-day business enhancements.
Knowledge, Skills, and Abilities
Education: A Bachelor's/Master's degree in computer science or an equivalent engineering degree.
Candidate Qualifications:
Education: Bachelor's degree in computer science, Information Systems or a related field.
Experience:
Required:
8+ years of experience in Controls Technology (Compliance, Audit, Legal, Risk) implementation & support, preferably cloud-based solutions.
Global SaaS-based Internal Audit or other control functions technology implementation experience.
Familiarity with the technology landscape supporting integration solutions such as Azure, Databricks, API Management.
Prior lead role or project management experience.
Experience in both front-end (e.g. REACT) and back-end technologies (e.g. Node.js, Python, Java), including RESTful API design and microservices architecture.
Experience with MS Project, Visio, Excel, PowerPoint and related project delivery utilities.
Preferred:
Azure Cloud certifications.
OTBI and BI Reports development.
Ability to manage systems testing including unit, QA, end-to-end and user acceptance testing.
Experience managing vendors to SLAs.
Proven experience collaborating with peers to establish best practices to achieve high service levels.
Skills and Competencies:
Communication: Ability to influence and help communicate the organization’s direction and ensure results are achieved.
Collaboration: Proven track record of building collaborative partnerships and ability to operate effectively in a global environment.
People Management: Inspiring, motivating and leading diverse and distributed teams.
Diverse environment: Can-do attitude and ability to work in a high-paced environment.
Tech Stack:
Development & Delivery Methods: Agile (Scaled Agile Framework), DevOps and CI/CD (Azure DevOps, JFrog)
Development Frameworks and Languages: Java, REACT, SQL, Python
Azure: Functional knowledge of cloud-based solutions
Development Tools & Platforms: Test Automation
Security and Monitoring: Authentication/Authorization (CA SiteMinder, MS Entra, PingOne)
About MetLife: Recognized on Fortune magazine's list of the 2025 "World's Most Admired Companies" and Fortune World’s 25 Best Workplaces™ for 2024, MetLife, through its subsidiaries and affiliates, is one of the world’s leading financial services companies; providing insurance, annuities, employee benefits and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East. Our purpose is simple - to help our colleagues, customers, communities, and the world at large create a more confident future. United by purpose and guided by empathy, we’re inspired to transform the next century in financial services. At MetLife, it’s #AllTogetherPossible. Join us!
Posted 9 hours ago
5.0 years
2 - 7 Lacs
Noida
On-site
Country India
Working Schedule Full-Time
Work Arrangement Hybrid
Relocation Assistance Available No
Posted Date 27-Jun-2025
Job ID 10257
Position: Software/Platform Engineer
Job Location: Noida
Work Arrangement: Hybrid
Assignment Category: Full-time
Grade: 11M
Job Description and Requirements
Position Summary: The MetLife Corporate Technology (CT) organization is evolving to enable MetLife’s New Frontier strategy. With a strong vision in place, we are a global function focused on driving digital technology strategies for key corporate functions within MetLife including Finance, Actuarial, Reinsurance, Legal, Human Resources, Employee Experience, Risk, Treasury, Audit and Compliance. In partnership with our business leaders, we develop and deliver seamless technology experiences to our employees across the entire employee lifecycle. Our vision and mission is to create innovative, transformative and contemporary technology solutions to empower our leaders and employees so they can focus on what matters most, our customers. We are technologists with strong business acumen focused on developing our talent to continually transform and innovate. As part of the Tech Talent Transformation (T3) agenda, MetLife is establishing a Technology Center in India. This technology center will perform as an integrated organization between onshore, offshore, and strategic vendor partners in an Agile delivery model. We are seeking a highly skilled hands-on delivery engineer who is responsible for partnering with Internal Audit leaders, third-party vendors and IT executives to lead global transformation projects with the goal of attracting, developing and retaining talent across the organization. This position will be a part of a fast-paced IT team leveraging technology expertise that spans across Java, REACT, Python, Azure and AI. He/she should be a strategic thinker, an effective communicator, and an expert in technological development.
Key Relationships: Internal stakeholders – Corporate Technology Control Functions Leader, Control Functions Leadership team, India Corporate Technology AVP, and business process owners for Internal Audit.
Key Responsibilities:
Stakeholder Management - Managing key business stakeholders to deliver required technology capabilities to support the digital transformation agenda. Driving prioritization of the product backlog. This includes managing key vendors providing the resources, SaaS & other capabilities.
Technology Implementation - Implement and support projects on Internal Audit technology platforms, specifically Azure Cloud.
Ways of Working - Adoption of the Agile ways of working in the software delivery lifecycle.
E2E Software Lifecycle Management (Architecture, Design, Development, Testing & Production).
Evaluate/implement technical solutions supporting Internal Audit and SaaS-based solutions, talent development, performance management, and workforce analytics.
Work with functional experts to translate user requirements into technical specifications.
Partner with internal business process owners, technical team members, and senior management throughout the project life cycle.
Act as the intermediary to facilitate a clear understanding among all parties about business assumptions and requirements, design, technical, testing, and production migration requirements.
Drive the resolution and troubleshooting of issues during development and post-production support.
Responsible for supporting day-to-day business enhancements.
Knowledge, Skills, and Abilities
Education: A Bachelor's/Master's degree in computer science or an equivalent engineering degree.
Candidate Qualifications: Bachelor's degree in computer science, Information Systems or a related field.
Experience:
Required:
5+ years of experience in Controls Technology (Compliance, Audit, Legal, Risk) implementation & support, preferably cloud-based solutions.
Global SaaS-based Internal Audit or other control functions technology implementation experience.
Familiarity with the technology landscape supporting integration solutions such as Azure, Databricks, API Management.
Prior lead role or project management experience.
Experience in both front-end (e.g. REACT) and back-end technologies (e.g. Node.js, Python, Java), including RESTful API design and microservices architecture.
Experience with MS Project, Visio, Excel, PowerPoint and related project delivery utilities.
Preferred:
Azure Cloud certifications.
OTBI and BI Reports development.
Ability to manage systems testing including unit, QA, end-to-end and user acceptance testing.
Experience managing vendors to SLAs.
Proven experience collaborating with peers to establish best practices to achieve high service levels.
Skills and Competencies:
Communication: Ability to influence and help communicate the organization’s direction and ensure results are achieved.
Collaboration: Proven track record of building collaborative partnerships and ability to operate effectively in a global environment.
People Management: Inspiring, motivating and leading diverse and distributed teams.
Diverse environment: Can-do attitude and ability to work in a high-paced environment.
Tech Stack:
Development & Delivery Methods: Agile (Scaled Agile Framework), DevOps and CI/CD (Azure DevOps, JFrog)
Development Frameworks and Languages: Java, REACT, SQL, Python
Azure: Functional knowledge of cloud-based solutions
Development Tools & Platforms: Test Automation
Security and Monitoring: Authentication/Authorization (CA SiteMinder, MS Entra, PingOne)
About MetLife: Recognized on Fortune magazine's list of the 2025 "World's Most Admired Companies" and Fortune World’s 25 Best Workplaces™ for 2024, MetLife, through its subsidiaries and affiliates, is one of the world’s leading financial services companies; providing insurance, annuities, employee benefits and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East. Our purpose is simple - to help our colleagues, customers, communities, and the world at large create a more confident future. United by purpose and guided by empathy, we’re inspired to transform the next century in financial services. At MetLife, it’s #AllTogetherPossible. Join us!
Posted 9 hours ago
8.0 years
2 - 7 Lacs
Noida
On-site
Country India
Working Schedule Full-Time
Work Arrangement Hybrid
Relocation Assistance Available No
Posted Date 27-Jun-2025
Job ID 10256
Position: Sr. Software/Platform Engineer
Job Location: Noida
Work Arrangement: Hybrid
Department: GOSC Operations 41165
Assignment Category: Full-time
Grade: 11M
Job Description and Requirements
Position Summary: The MetLife Corporate Technology (CT) organization is evolving to enable MetLife’s New Frontier strategy. With a strong vision in place, we are a global function focused on driving digital technology strategies for key corporate functions within MetLife including Finance, Actuarial, Reinsurance, Legal, Human Resources, Employee Experience, Risk, Treasury, Audit and Compliance. In partnership with our business leaders, we develop and deliver seamless technology experiences to our employees across the entire employee lifecycle. Our vision and mission is to create innovative, transformative and contemporary technology solutions to empower our leaders and employees so they can focus on what matters most, our customers. We are technologists with strong business acumen focused on developing our talent to continually transform and innovate. As part of the Tech Talent Transformation (T3) agenda, MetLife is establishing a Technology Center in India. This technology center will perform as an integrated organization between onshore, offshore, and strategic vendor partners in an Agile delivery model. We are seeking a highly skilled hands-on delivery engineer who is responsible for partnering with Internal Audit leaders, third-party vendors and IT executives to lead global transformation projects with the goal of attracting, developing and retaining talent across the organization. This position will be a part of a fast-paced IT team leveraging technology expertise that spans across Java, REACT, Python, Azure and AI. He/she should be a strategic thinker, an effective communicator, and an expert in technological development.
Key Relationships: Internal stakeholders – Corporate Technology Control Functions Leader, Control Functions Leadership team, India Corporate Technology AVP, and business process owners for Internal Audit.
Key Responsibilities:
Stakeholder Management - Managing key business stakeholders to deliver required technology capabilities to support the digital transformation agenda. Driving prioritization of the product backlog. This includes managing key vendors providing the resources, SaaS & other capabilities.
Technology Implementation - Implement and support projects on Internal Audit technology platforms, specifically Azure Cloud.
Ways of Working - Adoption of the Agile ways of working in the software delivery lifecycle.
E2E Software Lifecycle Management (Architecture, Design, Development, Testing & Production).
Evaluate/implement technical solutions supporting Internal Audit and SaaS-based solutions, talent development, performance management, and workforce analytics.
Work with functional experts to translate user requirements into technical specifications.
Partner with internal business process owners, technical team members, and senior management throughout the project life cycle.
Act as the intermediary to facilitate a clear understanding among all parties about business assumptions and requirements, design, technical, testing, and production migration requirements.
Drive the resolution and troubleshooting of issues during development and post-production support.
Responsible for supporting day-to-day business enhancements.

Knowledge, Skills, and Abilities

Education
A Bachelor's or Master's degree in Computer Science or an equivalent engineering degree.

Candidate Qualifications:
Education: Bachelor's degree in Computer Science, Information Systems, or a related field

Experience:
Required:
8+ years of experience in Controls Technology (Compliance, Audit, Legal, Risk) implementation and support, preferably with cloud-based solutions
Global SaaS-based Internal Audit or other control-functions technology implementation experience
Familiarity with the technology landscape supporting integration solutions such as Azure, Databricks, and API Management
Prior lead role or project management experience
Experience in both front-end (e.g., React) and back-end technologies (e.g., Node.js, Python, Java), including RESTful API design and microservices architecture
Experience with MS Project, Visio, Excel, PowerPoint, and related project delivery utilities

Preferred:
Azure Cloud certifications
OTBI and BI report development
Ability to manage systems testing, including unit, QA, end-to-end, and user acceptance testing
Experience managing vendors to SLAs
Proven experience collaborating with peers to establish best practices that achieve high service levels

Skills and Competencies:
Communication: Ability to influence, help communicate the organization's direction, and ensure results are achieved
Collaboration: Proven track record of building collaborative partnerships and ability to operate effectively in a global environment
People Management: Inspiring, motivating, and leading diverse and distributed teams
Diverse environment: Can-do attitude and ability to work in a fast-paced environment

Tech Stack
Development & Delivery Methods: Agile (Scaled Agile Framework); DevOps and CI/CD (Azure DevOps, JFrog)
Development Frameworks and Languages: Java, React, SQL, Python
Azure: Functional knowledge of cloud-based solutions
Development Tools & Platforms: Test Automation
Security and Monitoring: Authentication/Authorization (CA SiteMinder, MS Entra, PingOne)

About MetLife
Recognized on Fortune magazine's list of the 2025 "World's Most Admired Companies" and Fortune World's 25 Best Workplaces™ for 2024, MetLife, through its subsidiaries and affiliates, is one of the world's leading financial services companies, providing insurance, annuities, employee benefits, and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East. Our purpose is simple: to help our colleagues, customers, communities, and the world at large create a more confident future. United by purpose and guided by empathy, we're inspired to transform the next century in financial services. At MetLife, it's #AllTogetherPossible. Join us!
Posted 9 hours ago
Databricks is a popular technology in the field of big data and analytics, and the job market for Databricks professionals in India is growing rapidly. Companies across various industries are actively looking for skilled individuals with expertise in Databricks to help them harness the power of data. If you are considering a career in Databricks, here is a detailed guide to help you navigate the job market in India.
The average salary range for Databricks professionals in India varies based on experience level:
Entry-level: INR 4-6 lakhs per annum
Mid-level: INR 8-12 lakhs per annum
Experienced: INR 15-25 lakhs per annum
In the field of Databricks, a typical career path may include:
Junior Developer
Senior Developer
Tech Lead
Architect
In addition to Databricks expertise, the following skills are often expected or helpful:
Apache Spark
Python/Scala programming
Data modeling
SQL
Data visualization tools
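To make the skills list above concrete, here is a minimal PySpark sketch of the kind of DataFrame and SQL work a Databricks role typically involves. It is an illustrative sketch only: the sample data, column names, and explicit session setup are assumptions made for the example, not part of any particular employer's stack.

```python
# Minimal, self-contained PySpark sketch; data and names are illustrative only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# In a Databricks notebook a `spark` session already exists; creating one
# explicitly keeps this sketch runnable locally as well.
spark = SparkSession.builder.appName("databricks-skills-sketch").getOrCreate()

# A small in-memory DataFrame standing in for a real table or mounted file.
df = spark.createDataFrame(
    [("Entry-level", 4, 6), ("Mid-level", 8, 12), ("Experienced", 15, 25)],
    ["level", "min_lpa", "max_lpa"],
)

# Typical DataFrame work: derive a column, then sort by it.
summary = (
    df.withColumn("mid_lpa", (F.col("min_lpa") + F.col("max_lpa")) / 2)
      .orderBy("mid_lpa")
)
summary.show()

# The same data is queryable with plain SQL, which is just as common on Databricks.
summary.createOrReplaceTempView("salary_bands")
spark.sql("SELECT level, mid_lpa FROM salary_bands WHERE mid_lpa >= 10").show()
```

The same pattern, applied to real Delta tables rather than an in-memory list, covers much of the day-to-day Spark, data modeling, and SQL work the list above refers to.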
As you prepare for Databricks job interviews, make sure to brush up on your technical skills, stay updated with the latest trends in the field, and showcase your problem-solving abilities. With the right preparation and confidence, you can land your dream job in the exciting world of Databricks in India. Good luck!