
40 Data Wrangling Jobs - Page 2

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

2.0 - 7.0 years

8 - 13 Lacs

Bengaluru

Work from Office

Source: Naukri

We are looking for a highly skilled and experienced Senior Associate, Enterprise Integration Engineer to join our team in Bengaluru. The ideal candidate will have 2-7 years of experience in data migration, application development, or system integration.

Roles and Responsibilities: Support the design and implementation of data migration and system integration projects. Collaborate with ERP, CRM, and HCM teams to gather and review business and technical requirements. Develop, test, and deploy integrations and data pipelines using modern integration platforms. Conduct unit testing and assist with QA to ensure technical solutions meet client needs and follow best practices. Take ownership of individual tasks and workstreams, delivering high-quality results within established timelines. Assist with preparing documentation related to design, testing, and implementation for client-facing and internal use.

Job Requirements: Bachelor's degree in Computer Science, Information Technology, Systems Engineering, or a related field. Minimum 2 years of experience in data migration, application development, or system integration. Experience with integration tools such as Boomi, Azure Integration Services (Azure Data Factory, Logic Apps, etc.), MuleSoft, SSIS, or Celigo. Strong analytical, problem-solving, and data wrangling skills, including the ability to clean, transform, and prepare data for integration, migration, or reporting. Familiarity with ERP, CRM, HCM, or CPM systems and their data structures. Solid understanding of software development principles and good documentation habits. Strong communication skills and ability to work collaboratively with cross-functional teams.
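For context, the data-wrangling step this listing describes (cleaning, transforming, and preparing source data before a migration or integration load) might look like the minimal pandas sketch below; the file names, columns, and status codes are hypothetical.

```python
import pandas as pd

# Hypothetical source extract from a legacy CRM; the file, columns, and codes are illustrative only.
src = pd.read_csv("legacy_customers.csv")

# Basic cleaning: trim whitespace, normalise casing, drop duplicate customer records.
src["email"] = src["email"].str.strip().str.lower()
src["country"] = src["country"].str.strip().str.title()
src = src.drop_duplicates(subset=["customer_id"])

# Map legacy status codes to the vocabulary the target system expects.
status_map = {"A": "active", "I": "inactive", "P": "pending"}
src["status"] = src["status"].map(status_map).fillna("unknown")

# Keep only the columns the integration expects, in the expected order, and write the load file.
target = src[["customer_id", "email", "country", "status"]]
target.to_csv("customers_clean.csv", index=False)
```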

Posted 4 weeks ago

Apply

5.0 - 10.0 years

1 - 4 Lacs

Chennai

Work from Office

Source: Naukri

Job Title: Data Analyst
Experience: 5-10 Years
Location: Chennai

Responsibilities: Perform end-to-end analysis on healthcare data in tabular or natural-language form, ensuring strict compliance with HIPAA and data privacy regulations. Independently identify problems, QA data, architect solutions, and conduct analysis to support data-driven decision-making. Collaborate with product managers, data scientists, and data engineers in an Agile, technology-driven environment. Support senior data analysts and data scientists with data processing. Generate internal data reports, ensuring clarity and alignment with organizational goals while adhering to HIPAA and data protection regulations. Present to stakeholders, including senior leadership, ensuring transparency and actionable recommendations.

ABOUT YOU: At least 3 years of work experience in analytics required, ideally in the healthcare or health plan sector, with a strong understanding of data privacy and governance frameworks such as HIPAA. Strong analytical skills and critical thinking. Strong knowledge of Excel calculations and data visualizations. Strong SQL skills and experience querying large datasets. Experience with data visualization software (e.g., Power BI and Looker). Experience generating automated Power BI reports. Strong statistical ability to analyze large amounts of data. Experience with data wrangling and data visualization. Desire and ability to build trust with business stakeholders, manage the relationship, and socialize insights effectively. Able to communicate analytical insights to a variety of business stakeholders across different technical levels (including senior leaders). Self-motivated and able to work independently. Able to work to deadlines.

NICE TO HAVE: Experience in Python. Experience with cloud platforms (AWS preferred). Experience in Quality or Risk Adjustment.
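As an illustration of the QA and analysis work described above (data checks plus aggregates that would feed Power BI reporting), here is a minimal pandas sketch; the de-identified claims file and its columns are hypothetical.

```python
import pandas as pd

# Hypothetical de-identified claims extract; the file and columns are illustrative only.
claims = pd.read_csv("claims_sample.csv", parse_dates=["service_date"])

# Simple QA checks of the kind described above: nulls, out-of-range values, duplicates.
qa_report = {
    "rows": len(claims),
    "null_member_ids": int(claims["member_id"].isna().sum()),
    "negative_paid_amounts": int((claims["paid_amount"] < 0).sum()),
    "duplicate_claim_ids": int(claims.duplicated(subset=["claim_id"]).sum()),
}

# Monthly paid-amount trend, the sort of aggregate that would feed a Power BI report.
monthly = (
    claims.groupby(claims["service_date"].dt.to_period("M"))["paid_amount"]
    .agg(["count", "sum", "mean"])
)

print(qa_report)
print(monthly.head())
```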

Posted 1 month ago

Apply

5.0 - 10.0 years

15 - 22 Lacs

Chennai

Work from Office

Source: Naukri

Key skills: RAG pipeline architectures; fine-tuning, prompt tuning, and instruction tuning of LLMs; machine learning frameworks (such as Keras or PyTorch) and libraries (such as scikit-learn); data wrangling, data cleaning, data preprocessing, and data …
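Since this listing names scikit-learn alongside data cleaning and preprocessing, a minimal preprocessing-plus-model sketch is shown below; the training file, feature columns, and target are hypothetical.

```python
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.impute import SimpleImputer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

# Hypothetical training frame; the file, feature columns, and target are illustrative only.
df = pd.read_csv("train.csv")
X, y = df.drop(columns=["label"]), df["label"]

numeric = ["age", "income"]
categorical = ["segment"]

# Preprocessing of the kind grouped above under data cleaning/preprocessing:
# impute and scale numeric features, one-hot encode categoricals, then fit a simple baseline model.
preprocess = ColumnTransformer([
    ("num", Pipeline([("impute", SimpleImputer()), ("scale", StandardScaler())]), numeric),
    ("cat", OneHotEncoder(handle_unknown="ignore"), categorical),
])
model = Pipeline([("prep", preprocess), ("clf", LogisticRegression(max_iter=1000))])
model.fit(X, y)
print(model.score(X, y))
```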

Posted 1 month ago

Apply

1.0 - 3.0 years

5 - 10 Lacs

Bengaluru

Work from Office

Source: Naukri

Job Title: Data Analyst
Location: Bangalore
Experience: 1-3 years
Industry: Insurtech / Employee Benefits
Company: Pazcare
Job Type: Full-Time

Job Summary: Pazcare is seeking a Data Analyst to help us unlock actionable insights from our growing datasets across insurance, wellness, and employee benefits. You will work closely with our business, product, and growth teams to drive data-informed decision-making.

Responsibilities: Collect, clean, and analyze structured and unstructured data across various channels. Design and maintain dashboards to track key metrics and business KPIs. Perform root-cause analyses and identify actionable insights for product and growth teams. Collaborate with stakeholders to define data requirements and reporting goals. Present findings clearly with visualizations, summaries, and recommendations.

Requirements: 1-3 years of experience in a data analyst or business intelligence role. Proficiency in SQL and spreadsheet tools like Excel/Google Sheets. Experience with data visualization tools (Tableau, Power BI, Looker, or similar). Familiarity with scripting in Python or R for data wrangling and analysis. Strong analytical thinking and attention to detail. Excellent communication and presentation skills.

Nice to Have: Experience working with insurance or employee benefits data. Exposure to data warehouses (BigQuery, Snowflake, Redshift). Understanding of basic statistical methods and/or A/B testing.

Posted 1 month ago

Apply

10 - 12 years

13 - 20 Lacs

Ahmedabad

Work from Office

Source: Naukri

Key Responsibilities: Understand the factories, manufacturing processes, data availability, and avenues for improvement. Brainstorm with engineering, manufacturing, and quality teams on problems that can be solved using the data acquired in the data lake platform. Define what data is required to create a solution and work with connectivity engineers and users to collect it. Create and maintain optimal data pipeline architecture. Assemble large, complex data sets that meet functional and non-functional business requirements. Identify, design, and implement internal process improvements: automating manual processes and optimizing data delivery for greater scalability. Work on data preparation and data deep dives, helping engineering, process, and quality teams understand process and machine behavior more closely using the available data. Deploy and monitor the solution. Work with data and analytics experts to strive for greater functionality in our data systems. Work together with Data Architects and data modeling teams.

Skills/Competencies: Good knowledge of the business vertical, with prior experience solving different use cases in manufacturing or a similar industry. Ability to bring cross-industry learning to benefit use cases aimed at improving the manufacturing process.

Problem Scoping/Definition Skills: Experience in problem scoping, solving, and quantification. Strong analytical skills related to working with unstructured datasets. Build processes supporting data transformation, data structures, metadata, dependency, and workload management. Working knowledge of message queuing, stream processing, and highly scalable 'big data' stores. Ability to foresee and identify all the data required to solve the problem.

Data Wrangling Skills: Strong skills in data mining and data wrangling techniques for creating the required analytical datasets. Experience building and optimizing 'big data' pipelines, architectures, and data sets. Adaptive mindset to improvise on data challenges and employ techniques that drive the desired outcomes.

Programming Skills: Experience with big data tools: Spark, Delta, CDC, NiFi, Kafka, etc. Experience with relational SQL and NoSQL databases and query languages, including Oracle, Hive, and Spark SQL. Experience with object-oriented languages: Scala, Java, C++, etc.

Visualization Skills: Know-how of visualization tools such as Power BI and Tableau. Good storytelling skills to present data in a simple and meaningful manner.

Data Engineering Skills: Strong skills in data analysis techniques to generate findings and insights through exploratory data analysis. Good understanding of how to transform and connect data of various types and forms. Great numerical and analytical skills. Identify opportunities for data acquisition. Explore ways to enhance data quality and reliability. Build algorithms and prototypes. Reformulate existing frameworks to optimize their functioning. Good understanding of optimization techniques to make the system meet performance requirements.
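To make the data preparation concrete, here is a minimal PySpark sketch of the kind of pipeline this role describes (reading a raw sensor feed from the data lake, filtering bad readings, and aggregating per machine); the paths, columns, and storage layout are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("machine-sensor-prep").getOrCreate()

# Hypothetical raw machine-sensor feed landed in the data lake; paths and columns are illustrative only.
raw = spark.read.parquet("s3://plant-data-lake/raw/sensors/")

# Typical preparation: drop bad readings, derive a date column, aggregate per machine per day.
clean = (
    raw.filter(F.col("temperature").isNotNull() & (F.col("temperature") > 0))
       .withColumn("event_date", F.to_date("event_ts"))
)
per_machine_daily = (
    clean.groupBy("machine_id", "event_date")
         .agg(
             F.avg("temperature").alias("avg_temp"),
             F.max("vibration").alias("max_vibration"),
             F.count("*").alias("readings"),
         )
)

# Write a curated table for engineering and quality teams to analyse.
per_machine_daily.write.mode("overwrite").parquet("s3://plant-data-lake/curated/machine_daily/")
```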

Posted 1 month ago

Apply

10 - 12 years

13 - 20 Lacs

Kolkata

Work from Office

Source: Naukri

Key Responsibilities: Understand the factories, manufacturing processes, data availability, and avenues for improvement. Brainstorm with engineering, manufacturing, and quality teams on problems that can be solved using the data acquired in the data lake platform. Define what data is required to create a solution and work with connectivity engineers and users to collect it. Create and maintain optimal data pipeline architecture. Assemble large, complex data sets that meet functional and non-functional business requirements. Identify, design, and implement internal process improvements: automating manual processes and optimizing data delivery for greater scalability. Work on data preparation and data deep dives, helping engineering, process, and quality teams understand process and machine behavior more closely using the available data. Deploy and monitor the solution. Work with data and analytics experts to strive for greater functionality in our data systems. Work together with Data Architects and data modeling teams.

Skills/Competencies: Good knowledge of the business vertical, with prior experience solving different use cases in manufacturing or a similar industry. Ability to bring cross-industry learning to benefit use cases aimed at improving the manufacturing process.

Problem Scoping/Definition Skills: Experience in problem scoping, solving, and quantification. Strong analytical skills related to working with unstructured datasets. Build processes supporting data transformation, data structures, metadata, dependency, and workload management. Working knowledge of message queuing, stream processing, and highly scalable 'big data' stores. Ability to foresee and identify all the data required to solve the problem.

Data Wrangling Skills: Strong skills in data mining and data wrangling techniques for creating the required analytical datasets. Experience building and optimizing 'big data' pipelines, architectures, and data sets. Adaptive mindset to improvise on data challenges and employ techniques that drive the desired outcomes.

Programming Skills: Experience with big data tools: Spark, Delta, CDC, NiFi, Kafka, etc. Experience with relational SQL and NoSQL databases and query languages, including Oracle, Hive, and Spark SQL. Experience with object-oriented languages: Scala, Java, C++, etc.

Visualization Skills: Know-how of visualization tools such as Power BI and Tableau. Good storytelling skills to present data in a simple and meaningful manner.

Data Engineering Skills: Strong skills in data analysis techniques to generate findings and insights through exploratory data analysis. Good understanding of how to transform and connect data of various types and forms. Great numerical and analytical skills. Identify opportunities for data acquisition. Explore ways to enhance data quality and reliability. Build algorithms and prototypes. Reformulate existing frameworks to optimize their functioning. Good understanding of optimization techniques to make the system meet performance requirements.

Posted 1 month ago

Apply

10 - 12 years

13 - 20 Lacs

Chennai

Work from Office

Source: Naukri

Key Responsibilities: Understand the factories, manufacturing processes, data availability, and avenues for improvement. Brainstorm with engineering, manufacturing, and quality teams on problems that can be solved using the data acquired in the data lake platform. Define what data is required to create a solution and work with connectivity engineers and users to collect it. Create and maintain optimal data pipeline architecture. Assemble large, complex data sets that meet functional and non-functional business requirements. Identify, design, and implement internal process improvements: automating manual processes and optimizing data delivery for greater scalability. Work on data preparation and data deep dives, helping engineering, process, and quality teams understand process and machine behavior more closely using the available data. Deploy and monitor the solution. Work with data and analytics experts to strive for greater functionality in our data systems. Work together with Data Architects and data modeling teams.

Skills/Competencies: Good knowledge of the business vertical, with prior experience solving different use cases in manufacturing or a similar industry. Ability to bring cross-industry learning to benefit use cases aimed at improving the manufacturing process.

Problem Scoping/Definition Skills: Experience in problem scoping, solving, and quantification. Strong analytical skills related to working with unstructured datasets. Build processes supporting data transformation, data structures, metadata, dependency, and workload management. Working knowledge of message queuing, stream processing, and highly scalable 'big data' stores. Ability to foresee and identify all the data required to solve the problem.

Data Wrangling Skills: Strong skills in data mining and data wrangling techniques for creating the required analytical datasets. Experience building and optimizing 'big data' pipelines, architectures, and data sets. Adaptive mindset to improvise on data challenges and employ techniques that drive the desired outcomes.

Programming Skills: Experience with big data tools: Spark, Delta, CDC, NiFi, Kafka, etc. Experience with relational SQL and NoSQL databases and query languages, including Oracle, Hive, and Spark SQL. Experience with object-oriented languages: Scala, Java, C++, etc.

Visualization Skills: Know-how of visualization tools such as Power BI and Tableau. Good storytelling skills to present data in a simple and meaningful manner.

Data Engineering Skills: Strong skills in data analysis techniques to generate findings and insights through exploratory data analysis. Good understanding of how to transform and connect data of various types and forms. Great numerical and analytical skills. Identify opportunities for data acquisition. Explore ways to enhance data quality and reliability. Build algorithms and prototypes. Reformulate existing frameworks to optimize their functioning. Good understanding of optimization techniques to make the system meet performance requirements.

Posted 1 month ago

Apply

5 - 8 years

20 - 25 Lacs

Gurugram

Work from Office

Source: Naukri

Job Title: Data Analyst
Experience: 5-8 Years
Location: Bangalore (BLR)
Notice Period: Immediate to 15 Days

Job Summary: We are looking for a seasoned Data Analyst with 5-8 years of professional experience to join our analytics team in Bangalore. The ideal candidate will have strong technical expertise in SQL, Python, data visualization tools, and statistical analysis. You will play a crucial role in uncovering insights that drive strategic business decisions.

Key Responsibilities: Analyze complex datasets to extract actionable insights aligned with business objectives. Build, optimize, and maintain dashboards and reports using BI tools. Write complex SQL queries for data extraction, transformation, and reporting. Apply statistical techniques to identify trends, patterns, and anomalies. Collaborate with product managers, engineers, and other stakeholders to define metrics and KPIs. Automate data workflows and streamline reporting processes. Present findings and recommendations to senior leadership in a clear and concise manner.

Mandatory Technical Skills:
- SQL: advanced query writing, joins, aggregations, window functions
- Python: data manipulation with Pandas and NumPy; basic scripting; data visualization
- Data Visualization: hands-on with Power BI, Tableau, or Looker
- Excel: advanced functions (VLOOKUP, pivot tables, macros)
- Statistics: hypothesis testing, regression, forecasting
- ETL/Data Pipelines: understanding of data ingestion and transformation processes
- Cloud (preferred): basic knowledge of AWS/GCP/Azure data services

Preferred/Bonus Skills: R for statistical computing. Experience with big data tools (Spark, Hive). Exposure to A/B testing and predictive modeling. Familiarity with data warehousing concepts and tools like Snowflake or Redshift. Experience working with version control (Git) and Agile methodologies.
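The SQL requirement above calls out joins, aggregations, and window functions; the minimal sketch below runs one such query against an in-memory SQLite table from Python (the table and figures are made up for illustration).

```python
import sqlite3

import pandas as pd

# Illustrative only: an in-memory SQLite table standing in for a real warehouse table.
# Window functions require SQLite 3.25 or newer.
con = sqlite3.connect(":memory:")
pd.DataFrame({
    "region": ["North", "North", "South", "South"],
    "month": ["2024-01", "2024-02", "2024-01", "2024-02"],
    "revenue": [100, 120, 80, 95],
}).to_sql("sales", con, index=False)

# A window-function query of the kind listed under the SQL requirement:
# month-over-month change and each month's share of the regional total.
query = """
SELECT region, month, revenue,
       revenue - LAG(revenue) OVER (PARTITION BY region ORDER BY month) AS mom_change,
       ROUND(1.0 * revenue / SUM(revenue) OVER (PARTITION BY region), 2) AS share_of_region
FROM sales
ORDER BY region, month;
"""
print(pd.read_sql_query(query, con))
```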

Posted 1 month ago

Apply

3 - 8 years

4 - 7 Lacs

Hyderabad

Work from Office

Source: Naukri

ABOUT AMGEN: Amgen harnesses the best of biology and technology to fight the world's toughest diseases and make people's lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what's known today.

ABOUT THE ROLE: We are seeking an MDM Associate Data Engineer with 2-5 years of experience to support and enhance our enterprise MDM (Master Data Management) platforms using Informatica/Reltio. This role is critical in delivering high-quality master data solutions across the organization, utilizing modern tools like Databricks and AWS to drive insights and ensure data reliability. The ideal candidate will have strong SQL and data profiling skills and experience working with cross-functional teams in a pharma environment. To succeed in this role, the candidate must have strong data engineering experience along with MDM knowledge; candidates with only MDM experience are not eligible. The candidate must have data engineering experience with technologies such as SQL, Python, PySpark, Databricks, and AWS, along with knowledge of MDM (Master Data Management).

Roles & Responsibilities: Analyze and manage customer master data using Reltio or Informatica MDM solutions. Perform advanced SQL queries and data analysis to validate and ensure master data integrity. Leverage Python, PySpark, and Databricks for scalable data processing and automation. Collaborate with business and data engineering teams for continuous improvement in MDM solutions. Implement data stewardship processes and workflows, including approval and DCR mechanisms. Utilize AWS cloud services for data storage and compute processes related to MDM. Contribute to metadata and data modeling activities. Track and manage data issues using tools such as JIRA and document processes in Confluence. Apply Life Sciences/Pharma industry context to ensure data standards and compliance.

Basic Qualifications and Experience: Master's degree with 1-3 years of experience in Business, Engineering, IT or a related field; OR Bachelor's degree with 2-5 years of experience in Business, Engineering, IT or a related field; OR Diploma with 6-8 years of experience in Business, Engineering, IT or a related field.

Functional Skills:
Must-Have Skills: Advanced SQL expertise and data wrangling. Strong experience in Python and PySpark for data transformation workflows. Strong experience with Databricks and AWS architecture. Knowledge of MDM, data governance, stewardship, and profiling practices. In addition, candidates with experience on Informatica or Reltio MDM platforms will be preferred.
Good-to-Have Skills: Experience with IDQ, data modeling and approval workflow/DCR. Background in the Life Sciences/Pharma industries. Familiarity with project tools like JIRA and Confluence. Strong grip on data engineering concepts.

Professional Certifications: Any ETL certification (e.g. Informatica). Any data analysis certification (SQL, Python, Databricks). Any cloud certification (AWS or Azure).

Soft Skills: Strong analytical abilities to assess and improve master data processes and solutions. Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders. Effective problem-solving skills to address data-related issues and implement scalable solutions. Ability to work effectively with global, virtual teams.

EQUAL OPPORTUNITY STATEMENT: Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
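As a concrete illustration of the data-profiling and validation work this role describes, here is a minimal pandas sketch; the master-data file, columns, and match key are hypothetical.

```python
import pandas as pd

# Hypothetical customer master extract; the file, columns, and match key are illustrative only.
master = pd.read_parquet("customer_master.parquet")

# Lightweight profiling of the kind run before an MDM load:
# per-column null rate and distinct counts, plus a duplicate check on the match key.
profile = pd.DataFrame({
    "null_rate": master.isna().mean().round(3),
    "distinct_values": master.nunique(),
})
duplicates = master[master.duplicated(subset=["source_system", "customer_id"], keep=False)]

print(profile)
print(f"Records sharing a match key (merge candidates): {len(duplicates)}")
```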

Posted 1 month ago

Apply

5 - 10 years

25 - 30 Lacs

Bengaluru

Remote

Source: Naukri

Seeking a Data Scientist to develop AI-driven data conversion applications, automate code generation, ensure data quality, and collaborate across teams. Requires 5+ years of experience, Python, AI/ML skills, and strong problem-solving. Fintech experience is a plus.

Posted 1 month ago

Apply

3 - 7 years

14 - 24 Lacs

Bengaluru

Hybrid

Source: Naukri

Role Summary: Construction of modelling and monitoring bases and data quality management for regulatory credit risk (PD, LGD and EAD) and provision (IFRS9) models.

Role Description: Apply data wrangling and credit risk domain expertise to the creation of modelling and monitoring databases and evaluate their quality. Manage intermediate-level case studies and challenges around data collection, wrangling, and data quality with minimal supervision. Perform root-cause analysis and resolve any team hurdles around data quality issues. Identify, develop, and implement process and project enhancements. Know the upstream and downstream processes of modelling data creation. Lead process governance meetings with stakeholders (modelling and IT teams). Represent and contribute to internal forums and innovation.

Profile Required: Understanding of design and development of data analytics. Good experience with SQL queries and sound programming knowledge in analytical tools like SAS, Python, and PySpark. Good to have: awareness of big data, cloud-based BI stacks, and emerging BI trends. Understanding of designing, defining, and documenting solution architecture. Ability to gather client requirements, work through specifications, and develop solutions where appropriate in line with project documentation. Ability to work within time guidelines. Knowledge of credit risk modelling such as PD, LGD, CCF and IFRS9.

Specific Context: Client focus, team spirit, commitment, responsibility, ownership, and innovation.

Environment: At Société Générale, we are convinced that people are drivers of change, and that the world of tomorrow will be shaped by all their initiatives, from the smallest to the most ambitious. Whether you're joining us for a period of months, years or your entire career, together we can have a positive impact on the future. Creating, daring, innovating and taking action are part of our DNA. If you too want to be directly involved, grow in a stimulating and caring environment, feel useful on a daily basis and develop or strengthen your expertise, you will feel right at home with us! Still hesitating? You should know that our employees can dedicate several days per year to solidarity actions during their working hours, including sponsoring people struggling with their orientation or professional integration, participating in the financial education of young apprentices and sharing their skills with charities. There are many ways to get involved. We are committed to supporting and accelerating our Group's ESG strategy by implementing ESG principles in all our activities and policies. These are reflected in our business activity (ESG assessment, reporting, project management and IT activities), our work environment and our responsible practices for environmental protection.

Posted 1 month ago

Apply

2 - 3 years

4 - 5 Lacs

Bengaluru

Work from Office

Source: Naukri

As a skilled Developer, you will be responsible for building tools and applications that utilize the data held within company databases. The primary responsibility will be to design and develop these layers of our applications and to coordinate with the rest of the team working on different layers of the IT infrastructure. A commitment to collaborative problem solving, sophisticated design, and quality products is essential.

Python Developer - Necessary Skills: Experience in data wrangling and manipulation with Python/Pandas. Experience with Docker containers. Knowledge of data structures, algorithms, and data modeling. Experience with versioning (Git, Azure DevOps). Design and implementation of ETL/ELT pipelines. Good knowledge of and experience with web scraping (Scrapy, BeautifulSoup, Selenium). Expertise in at least one popular Python framework (such as Django, Flask, or Pyramid). Design, build, and maintain efficient, reusable, and reliable Python code (SOLID design principles). Experience with SQL databases (views, stored procedures, etc.).

Responsibilities and Activities: Aside from the core development role, this position includes auxiliary duties that are not related to development. The role includes, but is not limited to: Support and maintenance of custom and previously developed tools, as well as excellence in performance and responsiveness of new applications. Deliver high-quality and reliable applications, including development and front-end work; in addition, maintain code quality, prioritize organization, and drive automation. Participate in the peer review of plans, technical solutions, and related documentation (map/document technical procedures). Identify security issues, bottlenecks, and bugs, implementing solutions to mitigate and address issues of service data security and data breaches. Work with SQL/Postgres databases: installing and maintaining database systems and supporting server management, including backups, in addition to troubleshooting issues raised by the Data Processing team.
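The scraping-plus-wrangling skill set listed above might look, in its simplest form, like the sketch below using requests, BeautifulSoup, and pandas; the URL and CSS selectors are hypothetical.

```python
import pandas as pd
import requests
from bs4 import BeautifulSoup

# Hypothetical target page; the URL and CSS selectors are illustrative only.
resp = requests.get("https://example.com/products", timeout=10)
resp.raise_for_status()
soup = BeautifulSoup(resp.text, "html.parser")

# Scrape a simple listing into rows, then tidy the result with pandas.
rows = []
for card in soup.select("div.product"):
    name = card.select_one("h2")
    price = card.select_one("span.price")
    if name and price:
        rows.append({
            "name": name.get_text(strip=True),
            "price": price.get_text(strip=True),
        })

df = pd.DataFrame(rows)
if not df.empty:
    # Strip currency symbols and separators before converting prices to numbers.
    df["price"] = pd.to_numeric(df["price"].str.replace(r"[^\d.]", "", regex=True), errors="coerce")
print(df.head())
```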

Posted 1 month ago

Apply

1 - 4 years

4 - 7 Lacs

Hyderabad

Work from Office

Source: Naukri

ABOUT AMGEN: Amgen harnesses the best of biology and technology to fight the world's toughest diseases and make people's lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what's known today.

ABOUT THE ROLE: We are seeking an MDM Associate Data Engineer with 2-5 years of experience to support and enhance our enterprise MDM (Master Data Management) platforms using Informatica/Reltio. This role is critical in delivering high-quality master data solutions across the organization, utilizing modern tools like Databricks and AWS to drive insights and ensure data reliability. The ideal candidate will have strong SQL and data profiling skills and experience working with cross-functional teams in a pharma environment. To succeed in this role, the candidate must have strong data engineering experience along with MDM knowledge; candidates with only MDM experience are not eligible. The candidate must have data engineering experience with technologies such as SQL, Python, PySpark, Databricks, and AWS, along with knowledge of MDM (Master Data Management).

Roles & Responsibilities: Analyze and manage customer master data using Reltio or Informatica MDM solutions. Perform advanced SQL queries and data analysis to validate and ensure master data integrity. Leverage Python, PySpark, and Databricks for scalable data processing and automation. Collaborate with business and data engineering teams for continuous improvement in MDM solutions. Implement data stewardship processes and workflows, including approval and DCR mechanisms. Utilize AWS cloud services for data storage and compute processes related to MDM. Contribute to metadata and data modeling activities. Track and manage data issues using tools such as JIRA and document processes in Confluence. Apply Life Sciences/Pharma industry context to ensure data standards and compliance.

Basic Qualifications and Experience: Master's degree with 1-3 years of experience in Business, Engineering, IT or a related field; OR Bachelor's degree with 2-5 years of experience in Business, Engineering, IT or a related field; OR Diploma with 6-8 years of experience in Business, Engineering, IT or a related field.

Functional Skills:
Must-Have Skills: Advanced SQL expertise and data wrangling. Strong experience in Python and PySpark for data transformation workflows. Strong experience with Databricks and AWS architecture. Knowledge of MDM, data governance, stewardship, and profiling practices. In addition, candidates with experience on Informatica or Reltio MDM platforms will be preferred.
Good-to-Have Skills: Experience with IDQ, data modeling and approval workflow/DCR. Background in the Life Sciences/Pharma industries. Familiarity with project tools like JIRA and Confluence. Strong grip on data engineering concepts.

Professional Certifications: Any ETL certification (e.g. Informatica). Any data analysis certification (SQL, Python, Databricks). Any cloud certification (AWS or Azure).

Soft Skills: Strong analytical abilities to assess and improve master data processes and solutions. Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders. Effective problem-solving skills to address data-related issues and implement scalable solutions. Ability to work effectively with global, virtual teams.

EQUAL OPPORTUNITY STATEMENT: Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 1 month ago

Apply

- 2 years

3 - 5 Lacs

Hyderabad

Work from Office

Source: Naukri

ABOUT AMGEN: Amgen harnesses the best of biology and technology to fight the world's toughest diseases and make people's lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what's known today.

ABOUT THE ROLE: We are seeking an MDM Associate Data Engineer with 2-5 years of experience to support and enhance our enterprise MDM (Master Data Management) platforms using Informatica/Reltio. This role is critical in delivering high-quality master data solutions across the organization, utilizing modern tools like Databricks and AWS to drive insights and ensure data reliability. The ideal candidate will have strong SQL and data profiling skills and experience working with cross-functional teams in a pharma environment. To succeed in this role, the candidate must have strong data engineering experience along with MDM knowledge; candidates with only MDM experience are not eligible. The candidate must have data engineering experience with technologies such as SQL, Python, PySpark, Databricks, and AWS, along with knowledge of MDM (Master Data Management).

Roles & Responsibilities: Analyze and manage customer master data using Reltio or Informatica MDM solutions. Perform advanced SQL queries and data analysis to validate and ensure master data integrity. Leverage Python, PySpark, and Databricks for scalable data processing and automation. Collaborate with business and data engineering teams for continuous improvement in MDM solutions. Implement data stewardship processes and workflows, including approval and DCR mechanisms. Utilize AWS cloud services for data storage and compute processes related to MDM. Contribute to metadata and data modeling activities. Track and manage data issues using tools such as JIRA and document processes in Confluence. Apply Life Sciences/Pharma industry context to ensure data standards and compliance.

Basic Qualifications and Experience: Master's degree with 1-3 years of experience in Business, Engineering, IT or a related field; OR Bachelor's degree with 2-5 years of experience in Business, Engineering, IT or a related field; OR Diploma with 6-8 years of experience in Business, Engineering, IT or a related field.

Functional Skills:
Must-Have Skills: Advanced SQL expertise and data wrangling. Strong experience in Python and PySpark for data transformation workflows. Strong experience with Databricks and AWS architecture. Knowledge of MDM, data governance, stewardship, and profiling practices. In addition, candidates with experience on Informatica or Reltio MDM platforms will be preferred.
Good-to-Have Skills: Experience with IDQ, data modeling and approval workflow/DCR. Background in the Life Sciences/Pharma industries. Familiarity with project tools like JIRA and Confluence. Strong grip on data engineering concepts.

Professional Certifications: Any ETL certification (e.g. Informatica). Any data analysis certification (SQL, Python, Databricks). Any cloud certification (AWS or Azure).

Soft Skills: Strong analytical abilities to assess and improve master data processes and solutions. Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders. Effective problem-solving skills to address data-related issues and implement scalable solutions. Ability to work effectively with global, virtual teams.

EQUAL OPPORTUNITY STATEMENT: Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 1 month ago

Apply

2 - 4 years

12 - 17 Lacs

Chennai, Pune

Work from Office

Source: Naukri

Data Quality/Governance Analyst - Data & AI

KEY ACCOUNTABILITIES: Investigate, troubleshoot, and resolve data-related production issues. Provide timely reporting on data quality metrics and trends. Document and maintain support procedures for data quality processes. Collaborate with IT and business teams to implement data quality improvements. Ensure data validation and reconciliation processes are followed. Engage with stakeholders to establish procedures for data validation and quality metrics. Track data issues using incident tickets and ensure timely resolution, escalating issues for immediate attention if not resolved. Maintain and update production support dashboards (Microsoft Power BI) to ensure accuracy and meet monitoring requirements. Develop data quality health reports for stakeholders to monitor and observe data reliability across the platform. Create and maintain documentation, procedures, and best practices for data governance and related processes. Provide training to users on tools to promote awareness and adherence. Collaborate with data owners and data stewards to ensure data governance is implemented and followed. Work with vendors on technical platform issues that require coordination and resolution. Deliver consistent, accurate and high-quality work while communicating findings and insights clearly.

EXPERIENCE / QUALIFICATIONS: At least 4 years of hands-on experience with a data quality tool (Collibra is preferred), Databricks, and Microsoft Power BI. Strong technical skills in data and database management, with proficiency in data wrangling, analytics, and transformation using Python and SQL. Asset management experience will be beneficial for understanding and recommending the required data quality rules and remediation plans to stakeholders.

Other Attributes: Curious, analytical, and able to think critically to solve problems. Detail-oriented and comfortable dealing with complex structured and unstructured datasets. Customer-centric, striving to deliver value by effectively and proactively engaging stakeholders. Clear and effective communication skills, with an ability to communicate complex ideas and manage stakeholder expectations. Strong organisational and prioritisation skills; adaptable and able to work independently as required.
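A minimal pandas sketch of the validation and reconciliation checks this role describes is shown below; the file names, key column, and checked measures are hypothetical.

```python
import pandas as pd

# Hypothetical source and warehouse extracts; file names and the key column are illustrative only.
source = pd.read_csv("source_positions.csv")
target = pd.read_csv("warehouse_positions.csv")

# Reconciliation checks of the kind described above: row counts, key coverage, and value drift.
checks = {
    "source_rows": len(source),
    "target_rows": len(target),
    "keys_missing_in_target": int((~source["position_id"].isin(target["position_id"])).sum()),
    "market_value_difference": round(source["market_value"].sum() - target["market_value"].sum(), 2),
}
failed = [name for name in ("keys_missing_in_target", "market_value_difference") if checks[name] != 0]

print(checks)
print("Failed checks:", failed or "none")
```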

Posted 1 month ago

Apply