
2095 Data Quality Jobs

JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

3.0 - 7.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Naukri logo

Specialism: Data, Analytics & AI
Management Level: Associate

Summary: In data analysis at PwC, you will focus on utilising advanced analytical techniques to extract insights from large datasets and drive data-driven decision-making. You will leverage skills in data manipulation, visualisation, and statistical modelling to support clients in solving complex business problems.

Responsibilities:
• Adobe Analytics implementation experience across the lifecycle.
• Good understanding of data collection using Adobe Experience Platform (AEP).
• Very well versed with Adobe Launch or other tag management solutions such as GTM.
• Excellent JavaScript experience in coding and troubleshooting analytics implementations on large-scale websites.
• Create and maintain Adobe Analytics JavaScript tags and plugins.
• Convert scope and requirements into a Solution Design Reference (SDR) document.
• Able to convert the SDR into the exact JavaScript code to be added on page load or on user actions on the page.
• Thorough knowledge of Analytics Workspace and the Admin Console.
• Collaborate with business stakeholders in scoping analytics implementations for new initiatives.
• Hands-on experience with one or more debugging tools such as Charles or the Chrome Debugger.
• Hands-on experience implementing third-party or marketing pixels such as Google gtag, Facebook Pixel, LinkedIn Insight Tag, etc.
• Good understanding of reporting and analytics concepts.

Mandatory Skill Sets: 1. Adobe Analytics, 2. Adobe Experience Platform, 3. Adobe Launch, 4. Developer Tools (Chrome, Firefox), 5. JavaScript (Advanced), 6. Adobe Target

Preferred Skill Sets: HTML + CSS + JavaScript

Education Qualification: Graduate Engineer or Management Graduate (BE / BTECH / MCA / MTECH / MSC / MBA)
Degrees/Field of Study required: Master of Business Administration, Bachelor of Engineering, Bachelor of Technology

Required Skills: Adobe Acrobat, Accepting Feedback, Active Listening, Algorithm Development, Alteryx (Automation Platform), Analytic Research, Big Data, Business Data Analytics, Communication, Complex Data Analysis, Conducting Research, Customer Analysis, Customer Needs Analysis, Dashboard Creation, Data Analysis, Data Analysis Software, Data Collection, Data-Driven Insights, Data Integration, Data Integrity, Data Mining, Data Modeling, Data Pipeline, Data Preprocessing, Data Quality {+ 33 more}

Posted 2 hours ago

Apply

3.0 - 4.0 years

3 - 6 Lacs

Mumbai

Work from Office


Spanbix Technologies is looking for a Power BI professional to join our dynamic team and embark on a rewarding career journey.

Responsibilities:
• Design, develop, and implement business intelligence solutions using Power BI, Microsoft's data visualization and reporting tool.
• Connect to and integrate data from various sources, including databases, spreadsheets, and cloud services.
• Design and create data models, dashboards, reports, and other data visualizations.
• Enhance existing Power BI solutions to meet evolving business requirements.
• Collaborate with stakeholders to understand their data needs and requirements.
• Build and maintain data pipelines and ETL processes to ensure data quality and accuracy.
• Develop and implement security and access-control measures to ensure the protection of sensitive data.
• Troubleshoot and resolve issues with Power BI solutions.
• Document and communicate solutions to stakeholders.

Requirements: Excellent communication, analytical, and problem-solving skills.

Posted 2 hours ago

Apply

5.0 - 8.0 years

25 - 30 Lacs

Pune, Gurugram, Bengaluru

Work from Office


Title: Senior Data Developer with Strong MS/Oracle SQL, Python Skills and Critical Thinking

Description: The EDA team seeks a dedicated and detail-oriented Senior Developer I to join our dynamic team. The successful candidate will handle repetitive technical tasks, such as Healthy Planet MS SQL file loads into a data warehouse, monitor Airflow DAGs, manage alerts, and rerun failed processes. The role also requires monitoring various daily and weekly jobs, which may include generating revenue cycle reports and delivering data to external vendors. The ideal candidate will have robust experience with MS/Oracle SQL, Python, Epic Health Systems, and other relevant technologies.

Overview: As a Senior Developer I on the NYU EDA team, you will play a vital role in improving the operation of our data load and management processes. Your primary responsibilities will be to ensure the accuracy and timeliness of data loads, maintain the health of data pipelines, and verify that all scheduled jobs complete successfully. You will collaborate with cross-functional teams to identify and resolve issues, improve processes, and maintain a high standard of data integrity.

Responsibilities:
• Manage and perform Healthy Planet file loads into a data warehouse.
• Monitor Airflow DAGs for successful completion, manage alerts, and rerun failed tasks as necessary.
• Monitor and oversee other daily and weekly jobs, including FGP cash reports and external reports.
• Collaborate with the data engineering team to streamline data processing workflows.
• Develop automation scripts in SQL and Python to reduce manual intervention in repetitive tasks.
• Ensure all data-related tasks are performed accurately and on time.
• Investigate and resolve data discrepancies and processing issues.
• Prepare and maintain documentation for processes and workflows.
• Conduct periodic data audits to ensure data integrity and compliance with defined standards.

Skillset Requirements:
• MS/Oracle SQL
• Python
• Data warehousing and ETL processes
• Monitoring tools such as Apache Airflow
• Data quality and integrity assurance
• Strong analytical and problem-solving abilities
• Excellent written and verbal communication

Additional Skillset: Familiarity with monitoring and managing Apache Airflow DAGs.

Experience: Minimum of 5 years' experience in a similar role, with a focus on data management and process automation. Proven track record of successfully managing complex data processes and meeting deadlines.

Education: Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field.

Certifications: Epic Cogito, MS/Oracle SQL, Python, or data management certifications are a plus.
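The monitor-and-rerun pattern this role describes (whether driven by Airflow or a simpler scheduler) can be sketched in pure Python. The job names, retry limit, and "flaky load" below are hypothetical, purely to illustrate the rerun-on-failure logic:

```python
import time

def run_with_retries(jobs, max_retries=2, delay=0):
    """Run each named job; rerun failures up to max_retries times,
    analogous to rerunning failed DAG tasks after an alert."""
    status = {}
    for name, job in jobs.items():
        for attempt in range(1 + max_retries):
            try:
                job()
                status[name] = "success"
                break
            except Exception as exc:
                status[name] = f"failed: {exc}"
                time.sleep(delay)  # back off before the rerun
    return status

# Example: one job that always works, one that fails on its first attempt.
calls = {"n": 0}
def flaky_load():
    calls["n"] += 1
    if calls["n"] < 2:
        raise RuntimeError("transient load error")

result = run_with_retries({"healthy_planet_load": flaky_load,
                           "weekly_report": lambda: None})
```

In a real deployment the retry policy would live in the scheduler's configuration (e.g. task-level retries in Airflow) rather than in application code.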

Posted 4 hours ago

Apply

10.0 - 15.0 years

17 - 20 Lacs

Hyderabad, Bengaluru, Delhi / NCR

Work from Office


Key Responsibilities
• Design end-to-end data workflow architecture using Alteryx Designer, Server, and Connect.
• Translate HLS use cases into technical workflows, including patient journey analytics, claims automation, and real-world evidence processing.
• Integrate Alteryx with cloud platforms, EMR systems, SQL-based data warehouses, and visualization tools.
• Ensure compliance with HIPAA, GxP, FDA, and other regulatory standards.
• Guide the data governance, security, and quality framework across solutions.
• Collaborate with clinical analysts, data scientists, and IT to deliver analytics solutions aligned with business goals.
• Provide thought leadership, mentor junior Alteryx resources, and contribute to CoEs.

Required Skills & Qualifications
• Deep expertise in Alteryx Designer, Alteryx Server, and related tools.
• Proficient in SQL, Python, and integrating workflows with cloud ecosystems (AWS, Azure, GCP).
• Strong understanding of healthcare data models (claims, EMR/EHR, HL7/FHIR).
• Mandatory: minimum 3-5 years of experience working with healthcare or life sciences datasets.
• Familiarity with regulatory frameworks (HIPAA, FDA, GxP).
• Strong communication and stakeholder management skills.

Posted 6 hours ago

Apply

1.0 - 5.0 years

35 - 100 Lacs

Bengaluru

Work from Office


Data Governance & Quality - Data Analyst

Req number: R4899
Employment type: Full time
Worksite flexibility: Hybrid

Who we are: CAI is a global technology services firm with over 8,500 associates worldwide and a yearly revenue of $1 billion+. We have over 40 years of excellence in uniting talent and technology to power the possible for our clients, colleagues, and communities. As a privately held company, we have the freedom and focus to do what is right, whatever it takes. Our tailor-made solutions create lasting results across the public and commercial sectors, and we are trailblazers in bringing neurodiversity to the enterprise.

Job Summary: We are looking for a motivated Data Analyst ready to take us to the next level! If you have experience with SQL, Excel, Collibra, and other data management tools and are looking for your next career move, apply now.

Job Description: We are looking for a Data Analyst. This position will be full-time and hybrid (Bangalore).

What You'll Do:
• Conduct thorough data audits to identify any discrepancies or inconsistencies in data quality tools like Collibra.
• Collaborate with internal teams to understand data requirements and provide solutions to enhance data quality.
• Collaborate with Corporate Data Quality teams.
• Develop and implement data quality standards and best practices.
• Analyze complex datasets to identify patterns, trends, and insights.
• Ensure data integrity and accuracy by performing regular data validation checks.
• Collaborate with stakeholders to understand their data needs and provide recommendations for data quality improvement.
• Participate in the design and implementation of data quality control processes.
• Communicate data quality issues and solutions effectively to both technical and non-technical stakeholders.
• Stay up to date with the latest industry trends and advancements in data quality practices.

What You'll Need:
• Bachelor's degree in Computer Science, Information Systems, or a related field.
• Proven experience in data analysis and quality assurance.
• Proficiency in SQL, Excel, and data management tools like SAP MDG, Collibra, or Informatica.
• Strong attention to detail and problem-solving skills.
• Excellent communication and interpersonal skills.
• Ability to work independently and collaborate effectively in a team environment.
• Strong organizational skills and ability to manage multiple priorities.
• Knowledge of data quality frameworks and methodologies.

Physical Demands: Sedentary work that involves sitting or remaining stationary most of the time, with an occasional need to move around the office to attend meetings, etc. Ability to conduct repetitive tasks on a computer, utilizing a mouse, keyboard, and monitor.

Reasonable accommodation statement: If you require a reasonable accommodation in completing this application, interviewing, completing any pre-employment testing, or otherwise participating in the employment selection process, please direct your inquiries to application.accommodations@cai.io or (888) 824-8111.
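The "regular data validation checks" this kind of role performs can be sketched as a small routine; the column names, sample records, and rules below are illustrative only, not any specific tool's API:

```python
def audit(rows, required=("id", "email")):
    """Return simple data-quality findings: missing required fields and duplicate ids."""
    findings = []
    seen = set()
    for i, row in enumerate(rows):
        for col in required:
            if not row.get(col):  # absent or empty counts as missing
                findings.append(f"row {i}: missing {col}")
        key = row.get("id")
        if key in seen:
            findings.append(f"row {i}: duplicate id {key}")
        seen.add(key)
    return findings

records = [
    {"id": 1, "email": "a@example.com"},
    {"id": 1, "email": ""},  # duplicate id and empty email
]
issues = audit(records)
```

Tools like Collibra or Informatica express the same idea as declarative rules against a catalog rather than hand-written loops.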

Posted 7 hours ago

Apply

7.0 - 12.0 years

9 - 14 Lacs

Mumbai

Work from Office


The Senior Spark Tech Lead will be responsible for integrating and maintaining the Quantexa platform, a Spark-based software product from a UK fintech, into our existing systems to enhance our anti-money laundering capabilities. This role requires deep expertise in Spark development, as well as the ability to analyze and understand the underlying data. Additionally, the candidate should have an interest in exploring open-source applications such as those distributed by Apache, Kubernetes, OpenSearch, and Oracle, and should be able to work as a Scrum Master.

Direct Responsibilities:
• Integrate and upgrade the Quantexa tool with our existing systems for enhanced anti-money laundering measures.
• Develop and maintain Spark-based applications deployed on Kubernetes clusters.
• Conduct data analysis to understand and interpret underlying data structures.
• Collaborate with cross-functional teams to ensure seamless integration and functionality.
• Stay updated with the latest trends and best practices in Spark development and Kubernetes.

Contributing Responsibilities:
• Take complete ownership of project activities and understand each task in detail.
• Ensure that the team delivers on time and that deliveries meet high quality standards.
• Handle estimation, planning, and scheduling of the project; ensure all internal timelines are respected and the project stays on track.
• Work with the team to develop robust software that adheres to the timelines and follows all standard guidelines.
• Act proactively to ensure smooth team operations and effective collaboration.
• Make sure the team adheres to all compliance processes and intervene if required.
• Assign tasks to the team and track them through completion.
• Report status proactively to management; identify project risks, highlight them to the manager, and create contingency, backup, and mitigation plans as necessary.
• Make decisions independently based on the situation.
• Mentor and coach team members as required to meet target goals.
• Gain functional knowledge of the applications worked on; create knowledge repositories for future reference and arrange knowledge-sharing sessions to enhance the team's functional capability.
• Evaluate new tools and develop POCs.
• Provide timely feedback on the team to upper management.

Required Qualifications:
• 7+ years of experience in development.
• Extensive experience in Hadoop, Spark, and Scala development (5 years minimum).
• Strong analytical skills and experience in data analysis (SQL), data processing (such as ETL), parsing, data mapping, and handling real-life data quality issues.
• Excellent problem-solving abilities and attention to detail.
• Strong communication and collaboration skills.
• Experience in Agile development.
• High-quality coding skills, including code control, unit testing, design, and documentation (code, test). Experience with tools such as Sonar.
• Experience with Git and Jenkins.

Specific Qualifications:
• Experience with development and deployment of Spark applications on Kubernetes clusters.
• Hands-on development experience (Java, Scala, etc.) via system integration projects; Python; Elastic (optional).

Skills Referential
Behavioural skills: Ability to collaborate / teamwork; adaptability; creativity & innovation / problem solving; attention to detail / rigor.
Transversal skills: Analytical ability; ability to develop and adapt a process; ability to develop and leverage networks.
Education level: Bachelor's degree or equivalent.
Experience level: At least 7 years.
Other: Fluent in English; team player; strong analytical skills; quality oriented and well organized; willing to work under pressure and mission oriented; excellent oral and written communication skills; motivational skills; results-oriented.

Posted 8 hours ago

Apply

2.0 - 4.0 years

4 - 6 Lacs

Hubli, Mangaluru, Mysuru

Work from Office


Required Skills: Advanced SQL, ETL, Query Optimization

Responsibilities:
• Manage end-to-end data pipelines, ensuring seamless flow and integrity of data from diverse sources to analytical systems.
• Collaborate with data scientists, analysts, and business teams to understand data needs and develop efficient solutions.
• Implement robust data governance practices to maintain data quality standards and facilitate reliable analysis and reporting.
• Conduct thorough data validation procedures to ensure accuracy and reliability of analytical outputs.
• Monitor data systems and pipelines, troubleshoot issues, and ensure the continuous availability of data.
• Ensure data quality, integrity, and consistency across different data sources and storage systems.
• Optimize data flow and storage processes for performance and scalability.

Requirements

Must have:
• At least 2-4 years of experience in analytics, reporting out metrics, and deep-dive analysis.
• Strong proficiency with advanced SQL (window functions, DML commands, DDL commands, CTEs, subqueries, etc.).
• Expertise in building end-to-end data pipelines and ETL frameworks & tools.
• Ability to write complex queries and understanding of database concepts.
• Strong understanding of data modelling, schema design, and database optimization techniques.
• Knowledge of version control (e.g., Git) and collaborative development practices.
• Exceptional communication and collaboration skills.

Nice to have:
• Exposure to the broader analytics ecosystem.
• Experience with data lake architectures and big data technologies.

Education:
• Bachelor's degree in Computer Science, Engineering, or a related field.
• At least 2-4 years of relevant experience in analytics organizations of large corporates or in consulting companies in analytics roles.
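As a small illustration of the advanced SQL features named above, the sketch below runs a CTE plus a window function through Python's built-in sqlite3 module (SQLite supports window functions since 3.25); the table and column names are invented for the example:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("north", 100), ("north", 300), ("south", 200)])

# A CTE plus a window function: rank each sale within its region by amount,
# then keep only the top sale per region.
rows = conn.execute("""
    WITH ranked AS (
        SELECT region, amount,
               RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
        FROM sales
    )
    SELECT region, amount FROM ranked WHERE rnk = 1 ORDER BY region
""").fetchall()
```

The same PARTITION BY / ORDER BY pattern carries over unchanged to warehouse engines such as Redshift, Snowflake, or BigQuery.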

Posted 2 days ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Hubli, Mangaluru, Mysuru

Work from Office


Job Overview: We are looking for highly skilled Senior and Mid-to-Junior Technical Consultants with expertise in SQL, PySpark, Python, Airflow, and API development. The ideal candidates will have hands-on experience with data warehousing concepts (fact & dimension) and a strong understanding of supply chain domain processes. You will work closely with cross-functional teams to develop, optimize, and implement scalable data solutions.

Senior Technical Consultant: 4-6 years

Key Responsibilities:
- Design, develop, and optimize data pipelines using PySpark, SQL, and Python.
- Implement and manage Airflow DAGs for workflow automation.
- Work with APIs to integrate data sources and ensure seamless data exchange.
- Develop and maintain data models based on fact and dimension tables for efficient reporting and analytics.
- Optimize query performance and data processing for large datasets.
- Collaborate with business analysts, data engineers, and stakeholders to understand business requirements and translate them into technical solutions.
- Ensure data quality, reliability, and scalability of solutions.
- Provide mentorship to junior team members (for the Senior Technical Consultant role).

Requirements

Required Skills & Qualifications:
- Strong proficiency in SQL, PySpark, and Python.
- Hands-on experience with Airflow for scheduling and orchestrating workflows.
- Expertise in working with APIs (development and integration).
- Solid understanding of data warehousing concepts (fact & dimension modeling).
- Experience in the supply chain domain is highly preferred.
- Knowledge of cloud platforms (AWS, Azure, or GCP) is a plus, not mandatory.
- Excellent problem-solving skills and ability to work in an agile environment.
- Strong communication skills to collaborate effectively with cross-functional teams.
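The fact & dimension modeling mentioned in this listing can be illustrated with a minimal star schema in an in-memory SQLite database; the table and column names are invented for the sketch:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Dimension table: descriptive attributes keyed by a surrogate key.
conn.execute("CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT)")
# Fact table: additive measures plus foreign keys into the dimensions.
conn.execute("CREATE TABLE fact_sales (product_key INTEGER, qty INTEGER)")
conn.executemany("INSERT INTO dim_product VALUES (?, ?)",
                 [(1, "widget"), (2, "gadget")])
conn.executemany("INSERT INTO fact_sales VALUES (?, ?)",
                 [(1, 5), (1, 3), (2, 7)])

# A typical star-schema query: aggregate the fact table,
# described and grouped by dimension attributes.
totals = conn.execute("""
    SELECT d.name, SUM(f.qty)
    FROM fact_sales f JOIN dim_product d USING (product_key)
    GROUP BY d.name ORDER BY d.name
""").fetchall()
```

In PySpark the equivalent would be a DataFrame join on the surrogate key followed by a groupBy/agg; the schema design is the same either way.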

Posted 2 days ago

Apply

5.0 - 7.0 years

7 - 9 Lacs

Lucknow

Work from Office


Team Leader / Senior Specialist - Facility Strengthening (Internal Staff Only) - India Health Action Trust (IHAT)

Reports to: Deputy Director/Regional Director or anyone else designated by them
Travel Requirements: 40%

Job Summary: The Team Leader / Senior Specialist will provide overall programmatic and operational support to the RD/DD for the division allotted to him/her. He/she will coordinate and supervise the field team to ensure the program objectives and deliverables are met.

Key Responsibilities:
• Activate designated FRUs; ensure functionality of maternity OTs, blood storage units, and equipment availability; conduct gap assessments; and strengthen obstetric OPDs, triage, labour rooms, and PNC wards, among others.
• Activate NBSUs, conduct gap assessments, and ensure functionality of NBSUs with adequate HR and equipment.
• Activate, strengthen, and drive quality improvements of priority Primary Health Centres across RMNCHN, inclusive of certification.
• Undertake activation of delivery points and facility strengthening interventions, including training and mentoring of FRUs and other high-load delivery points.
• Strengthen interventions on sick newborn care, inclusive of NBSUs and SNCUs.
• Strengthen oversight and ensure efficient service delivery across the entire RMNCH+N and Routine Immunization program spectrum.
• Ensure accurate and timely reporting on HMIS, the PMSMA portal, MaNTra, the FBNC portal, and other data collection platforms for informed decision-making, while strengthening data quality through regular review and addressing UPKSK exception reports.
• Support EDL availability and management through DVDMS across all relevant facilities and CiVHSND, including escalating any issues related to DVDMS and EDL stock as identified.
• Establish and strengthen digital health initiatives, namely E-Kavach and e-Sushrut, among others.
• Support the rollout of all trainings across cadres, including LSAS/EmOC/USG doctors, SBA/NSSK/DAKSH/DAKSHATA/NBSU/CPAP, and Staff Nurses (Induction).
• Be responsible for the administrative and functional reporting of the reporting teams and provide technical handholding to their respective teams.

Essential experience: 5-7 years of relevant experience at the division, district, and/or state level working on RMNCH or related programs.
Preferred experience: Understanding of the government health system.
Required Qualification: MBBS/BDS/BAMS/BHMS with a Master's degree in public health or a related field is required. A PhD in public health or a related field, and knowledge in the use of digital applications, will be preferred.

Key Competencies:
• In-depth understanding of RMNCH concepts and the Indian public healthcare system
• Strong programmatic, coordination, and communication skills
• Ability to work effectively with government officials and other stakeholders
• Analytical skills and proficiency in data interpretation
• Experience in capacity building and providing technical assistance
• Proficiency in using MS Office and various health program-related IT applications

How To Apply: Interested candidates should submit their applications by clicking the Apply Now button on this page. Only complete applications submitted through the online portal before the closing date will be considered. IHAT provides a safe working environment for all its employees, follows the principle of equal opportunity, and encourages women applicants. Physically challenged candidates with the required skills/knowledge who are willing to travel are also encouraged to apply. We will follow a systematic selection process to fill this position based on experience, competency, and suitability. Shortlisting will take place soon after the closing date, and only shortlisted candidates will be invited for an interview.

Unfortunately, we can only contact applicants who have been shortlisted for the interview. If you have not heard from us within 6 weeks of the closing date, please assume that the current IHAT positions are unable to accommodate you at the moment. It is also not possible for us to provide specific feedback because of the volume of applications we receive. IHAT does not charge any application, processing, training, interviewing, testing, or other fees in connection with the application or recruitment process. Should you receive a solicitation for the payment of a fee, please disregard it.

Posted 2 days ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Gurugram

Work from Office


Data Analyst II

Syneos Health is a leading fully integrated biopharmaceutical solutions organization built to accelerate customer success. We translate unique clinical, medical affairs, and commercial insights into outcomes to address modern market realities. Our Clinical Development model brings the customer and the patient to the center of everything that we do. We are continuously looking for ways to simplify and streamline our work to not only make Syneos Health easier to work with, but to make us easier to work for. Whether you join us in a Functional Service Provider partnership or a Full-Service environment, you'll collaborate with passionate problem solvers, innovating as a team to help our customers achieve their goals. We are agile and driven to accelerate the delivery of therapies, because we are passionate to change lives. Discover what our 29,000 employees across 110 countries already know: WORK HERE MATTERS EVERYWHERE

Why Syneos Health? We are passionate about developing our people through career development and progression; supportive and engaged line management; technical and therapeutic area training; peer recognition; and a total rewards program. We are committed to our Total Self culture, where you can authentically be yourself. Our Total Self culture is what unites us globally, and we are dedicated to taking care of our people. We are continuously building the company we all want to work for and our customers want to work with. Why? Because when we bring together diversity of thoughts, backgrounds, cultures, and perspectives, we're able to create a place where everyone feels like they belong.

Job Responsibilities:
• Work independently to solve open-ended questions.
• Design and analyze tests and experiments.
• Maintain documentation of analytical processes and projects.
• Build, maintain, and improve performance dashboards, leveraging customer feedback for use and accessibility.
• Advise clients on relevant best practices and ensure the data is easily retrievable for their review.
• Support data quality and understand customer needs as they evolve.
• Mentor and coach junior team members.
• Support site advocacy group meetings by inviting PIs, discussing blinded protocols, collecting feedback, and managing scheduling, hosting, and meeting minutes.
• Develop and manage capabilities decks twice annually, along with bespoke slides and marketing information sheets using Power BI data.
• Track and analyze business development outcomes through opportunity trackers, monitoring RFP success rates, regulatory approvals, and win rates.
• Monitor customer satisfaction by reviewing feedback from the EM team and facilitating monthly cross-time-zone communications.
• Oversee product approval tracking, ensuring visibility into product lifecycle status and final approval outcomes.

Get to know Syneos Health: Over the past 5 years, we have worked with 94% of all novel FDA-approved drugs, 95% of EMA-authorized products, and over 200 studies across 73,000 sites and 675,000+ trial patients. No matter what your role is, you'll take the initiative and challenge the status quo with us in a highly competitive and ever-changing environment. Learn more about Syneos Health: http://www.syneoshealth.com

Additional Information: Tasks, duties, and responsibilities as listed in this job description are not exhaustive. The Company, at its sole discretion and with no prior notice, may assign other tasks, duties, and job responsibilities. Equivalent experience, skills, and/or education will also be considered, so qualifications of incumbents may differ from those listed in the job description. The Company, at its sole discretion, will determine what constitutes equivalence to the qualifications described above. Further, nothing contained herein should be construed to create an employment contract. Occasionally, required skills/experiences for jobs are expressed in brief terms.

Any language contained herein is intended to fully comply with all obligations imposed by the legislation of each country in which it operates, including the implementation of the EU Equality Directive, in relation to the recruitment and employment of its employees. The Company is committed to compliance with the Americans with Disabilities Act, including the provision of reasonable accommodations, when appropriate, to assist employees or applicants to perform the essential functions of the job.

Posted 2 days ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Ahmedabad

Work from Office


Job Description

Department: SOC - Excellence

Responsibilities:
• Design and develop data solutions: Architect, implement, and optimize data solutions using Elasticsearch/OpenSearch, integrating with various data sources and systems.
• Machine Learning Integration: Apply your expertise in machine learning to develop models, algorithms, and pipelines for data analysis, prediction, and anomaly detection within Elasticsearch/OpenSearch environments.
• Data Ingestion and Transformation: Design and implement data ingestion pipelines to collect, cleanse, and transform data from diverse sources, ensuring data quality and integrity.
• Elasticsearch/OpenSearch Administration: Manage and administer Elasticsearch/OpenSearch clusters, including configuration, performance tuning, index optimization, and monitoring.
• Query Optimization: Optimize complex queries and search operations in Elasticsearch/OpenSearch to ensure efficient and accurate retrieval of data.
• Troubleshooting and Performance Tuning: Identify and resolve issues related to Elasticsearch/OpenSearch performance, scalability, and reliability, working closely with DevOps and Infrastructure teams.
• Collaboration and Communication: Collaborate with cross-functional teams, including data scientists, software engineers, and business stakeholders, to understand requirements and deliver effective data solutions.
• Documentation and Best Practices: Document technical designs, processes, and best practices related to Elasticsearch/OpenSearch and machine learning integration. Provide guidance and mentorship to junior team members.

Requirements:
• Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
• Strong experience in designing, implementing, and managing large-scale Elasticsearch/OpenSearch clusters, including experience with indexing, search queries, performance tuning, and troubleshooting.
• Expertise in machine learning techniques and frameworks, such as TensorFlow, PyTorch, or scikit-learn, with hands-on experience in developing ML models and integrating them into data pipelines.
• Proficiency in programming languages like Python, Java, or Scala, and experience with data processing frameworks (e.g., Apache Spark) and distributed computing.
• Solid understanding of data engineering concepts, including data modeling, ETL processes, data warehousing, and data integration.
• Experience with cloud platforms like AWS, Azure, or GCP, and knowledge of containerization technologies (e.g., Docker, Kubernetes) is highly desirable.
• Strong analytical and problem-solving skills, with the ability to work effectively in a fast-paced, collaborative environment.
• Excellent communication skills, with the ability to translate complex technical concepts into clear and concise explanations for both technical and non-technical stakeholders.
• Proven track record of successfully delivering data engineering projects on time and within budget.

Location: Ahmedabad. Key requirement areas: Data Ingestion and Transformation, Elasticsearch/OpenSearch Administration, Machine Learning Integration, etc.

Posted 2 days ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Kochi

Work from Office


The selected candidate will primarily work on Databricks and Reltio projects, focusing on data integration and transformation tasks. This role requires a deep understanding of Databricks, ETL tools, and data warehousing / data lake concepts. Experience in the life sciences domain is preferred, as is Databricks certification.

Key Responsibilities:
• Design, develop, and maintain data integration solutions using Databricks.
• Collaborate with cross-functional teams to understand data requirements and deliver efficient data solutions.
• Implement ETL processes to extract, transform, and load data from various sources into data warehouses / data lakes.
• Optimize and troubleshoot Databricks workflows and performance issues.
• Ensure data quality and integrity throughout the data lifecycle.
• Provide technical guidance and mentorship to junior developers.
• Stay updated with the latest industry trends and best practices in data integration and Databricks.

Required Qualifications:
• Bachelor's degree in computer science or equivalent.
• Minimum of 5 years of hands-on experience with Databricks.
• Strong knowledge of any ETL tool (e.g., Informatica, Talend, SSIS).
• Well versed in data warehousing and data lake concepts.
• Proficient in SQL and Python for data manipulation and analysis.
• Experience with cloud platforms (e.g., AWS, Azure, GCP) and their data services.
• Excellent problem-solving skills.
• Strong communication and collaboration skills.

Preferred Qualifications:
• Certified Databricks Engineer.
• Experience in the life sciences domain.
• Familiarity with Reltio or similar MDM (Master Data Management) tools.
• Experience with data governance and data security best practices.

IQVIA is a leading global provider of clinical research services, commercial insights, and healthcare intelligence to the life sciences and healthcare industries. We create intelligent connections to accelerate the development and commercialization of innovative medical treatments to help improve patient outcomes and population health worldwide.

Posted 2 days ago

Apply

6.0 - 11.0 years

8 - 13 Lacs

Pune

Work from Office


We are looking for a Senior Data Engineer to join our data engineering team and lead the design and implementation of scalable, high-performance data solutions. The ideal candidate will have extensive experience with Python, Databricks, and AWS platforms, and strong hands-on skills in Docker. You will play a key role in integrating data with the client's vendors, currently via flat-file exchange, with new vendors to be onboarded through API integrations.

Key Responsibilities: Design and implement scalable ETL/ELT pipelines using Databricks and Apache Spark. Develop, maintain, and optimise SQL and PL/SQL scripts for data extraction, transformation, and loading from Oracle and other relational databases. Write robust, reusable, and optimised Python and Shell scripts for data automation and workflow orchestration. Manage data ingestion and integration processes across structured and unstructured sources. Deploy and manage code using GitHub for version control and collaboration. Monitor, troubleshoot, and improve the performance of data pipelines and jobs.

Required Skills & Experience: 6+ years of experience in data engineering with large-scale data systems. Strong proficiency in Databricks, Python, AWS S3 buckets, and Docker. Solid experience with SQL and PL/SQL. Advanced programming in Python for data transformation and automation. Hands-on experience with GitHub for code versioning, branching, and collaboration. Familiarity with data quality frameworks and best practices in data architecture. Lakehouse architecture implementation experience.

Posted 2 days ago

Apply

9.0 - 10.0 years

30 - 35 Lacs

Kagal

Work from Office


Reporting to the RISE Chief of Party/Country Director, the Senior Technical Advisor will serve as the lead technical expert for the RISE Marburg Project, which focuses on strengthening national and subnational capacities to prevent, detect, and respond to infectious disease threats. This role entails providing strategic technical leadership and oversight in key areas such as Infection Prevention and Control (IPC), disease surveillance, case management of Marburg Virus Disease (MVD) and other Viral Haemorrhagic Fevers (VHFs), as well as broader epidemic preparedness and response efforts. The Senior Technical Advisor will guide and coordinate the efforts of multidisciplinary teams, including those responsible for Monitoring and Evaluation. The position requires strong collaboration with the Ministry of Health, Rwanda Biomedical Centre, donors, and other stakeholders involved in Global Health Security (GHS).

Responsibilities

Technical Leadership and Coordination: Provide overall technical leadership in the design and implementation of IPC, VHF case management, surveillance, and epidemic preparedness and response interventions. Lead the development and adaptation of technical strategies, tools, and protocols in alignment with national standards. Coordinate technical input from subject matter experts to ensure an integrated, high-impact program approach. Lead technical support at national and subnational levels in outbreak preparedness, risk assessments, simulation exercises, and rapid response planning.

Viral Hemorrhagic Fevers (VHFs) & IPC: Oversee technical guidance for safe and effective case management of VHFs. Provide leadership in the implementation and scale-up of IPC programs at health facility and community levels. Ensure readiness for VHF outbreaks through technical training, stockpiling, referral systems, and workforce readiness.
Surveillance and Data Systems: Provide technical oversight on strengthening integrated disease surveillance and response (IDSR), event-based surveillance (EBS), and community-based surveillance. Guide the integration of real-time data platforms and ensure data use for decision-making.

Monitoring and Evaluation: Supervise the M&E team to ensure data quality, effective monitoring, and evidence-based reporting. Ensure that project data informs program adaptation and continuous quality improvement.

Team Leadership and Management: Lead and manage a diverse technical team, ensuring collaboration, mentorship, and high performance. Foster a culture of learning, innovation, and accountability across the project technical teams. Contribute to annual work plans, donor reporting, and knowledge sharing.

Stakeholder Engagement: Represent the project in technical working groups and coordination forums with the Ministry of Health, WHO, CDC, and other GHS partners. Build strong partnerships with implementing partners and regional health bodies. Contribute to advocacy efforts for sustainable epidemic preparedness and resilient health systems.

Required Qualifications: Medical degree (MD, MBBS) with a Master's in Public Health, Epidemiology, Infectious Diseases, or a related field. A minimum of 9-10 years of progressive experience in global health security, outbreak response, or related technical areas. A proven record of expertise in IPC, VHF outbreak preparedness, rapid response, and case management (Marburg, Ebola, COVID-19, etc.). Experience with disease surveillance systems (IDSR, EBS, CBS) and emergency preparedness and response planning. Proven experience managing technical and M&E teams in complex projects. Strong leadership, communication, and stakeholder coordination skills. Familiarity with donor-funded projects, particularly USAID, CDC, or other bilateral and multilateral donors.
Experience working in resource-limited settings or emergency contexts.

Preferred Attributes: Experience supporting MOH-led emergency operations centers (EOC) or public health emergency response. Fluency in English is required; proficiency in French or a local language is a plus.

NB: Please note that we will be reviewing applications on a rolling basis. This means we may proceed with interviews and make hiring decisions before the stated application deadline. We therefore encourage interested candidates to apply as soon as possible to ensure full consideration. Application submission deadline: 24th June 2025.

Note: The position is on a national contract and only applicants holding a permit to work in Rwanda can apply. Only shortlisted candidates will receive an invitation for an interview. For further information about Jhpiego, visit our website at www.jhpiego.org. The successful candidate selected for this position will be subject to a pre-employment background investigation.

Jhpiego is an Affirmative Action/Equal Opportunity Employer: Jhpiego, a Johns Hopkins University affiliate, is an equal opportunity employer and does not discriminate based on gender, marital status, pregnancy, race, color, ethnicity, national origin, age, disability, religion, sexual orientation, gender identity or expression, veteran status, other legally protected characteristics or any other occupationally irrelevant criteria. Jhpiego promotes Affirmative Action for minorities, women, individuals who are disabled, and veterans.

Recruitment scams & fraud warning: Jhpiego has become aware of scams involving false job offers. Please be advised: Recruiters will never ask for a fee during any stage of the recruitment process. All active jobs are advertised directly on our careers page. Official Jhpiego emails will always arrive from a @Jhpiego.org email address. Please report any suspicious communications to Info@jhpiego.org

Posted 2 days ago

Apply

10.0 - 14.0 years

30 - 40 Lacs

Ahmedabad

Work from Office


Leading a team, ensuring SLA compliance, managing escalations, and driving data quality, process improvements, and business value as an SME within global data operations. Manage, execute, and deliver the continuous improvement program for the DMO.

Posted 2 days ago

Apply

6.0 - 11.0 years

6 - 16 Lacs

Pune, Gurugram, Bengaluru

Work from Office


Role & Responsibilities Job Description: As a Data Governance Architect, you must be able to manage organization-wide data governance activities and will be responsible for improving the quality and managing the protection of sensitive data and information assets. You will be responsible for preparing a Data Catalog strategy to build out a catalog with data and BI objects and onboard a user base to support the curation of metadata, lineage, and documentation for enabling seamless data discovery at an enterprise level, thereby streamlining data intake and reducing data duplication throughout the organization. You must be result-oriented, self-motivated, and able to thrive in a fast-paced environment. This role requires you to serve as a point of escalation for governance, data quality, and protection issues, and you will work closely with Business and Functional area leadership to improve the quality and value of core data assets, respond to regulatory protection requirements, and support the strategic requirements of the department.

Primary Roles and Responsibilities:
• Looking for a Data Governance expert for the development of a metadata management system solution. Should be able to streamline the curation of metadata with custom scripts to upload available metadata to the API to achieve a deeper understanding of catalog content and the user base, using custom dashboards to track adoption.
• Responsible for the implementation and oversight of the Company's data management goals, standards, practices, processes, and technologies.
• Experience in establishing data connections for relevant schemas and defining data stewards' roles & responsibilities for the scope of the data catalog.
• Define roles and responsibilities related to data governance and ensure clear accountability for stewardship of the company's principal information assets.
• To properly onboard the data catalog, you should be able to conduct a data domain team assessment, discover the availability & completeness of each team's metadata, and develop a process for working with and onboarding data domain teams.
• Be the point of contact for Data Governance queries, including acting as an escalation point for client concerns.
• Coordinate the resolution of data integrity gaps by working with the business owners and IT.
• Ability to work in an agile environment with an iterative approach to development.

Skills and Qualifications: Bachelor's and/or Master's degree in computer science or equivalent experience. Must have 6+ years of total IT experience and 4+ years' experience in Data Cataloging & Data Governance projects. Programming skills (sufficient to write SQL queries to validate test results in a DW database). Proficient in SQL and Python, with a strong understanding of databases and data structures. Experience with Application Programming Interface (API) development, essential in developing & running API scripts across multiple devices, databases, and servers, working with REST and open API technologies. Proficient in working with Enterprise Data Catalog software like Alation, Collibra, etc. Experience with dashboarding/reporting tools (Power BI, Tableau, etc.) is a plus. Excellent analytical, problem-solving, communication, and interpersonal skills. Ability to set priorities and multi-task in a fast-paced environment. Experience in metadata management - Business Glossary, Lineage, data dictionaries, ETL - is essential. Ability to work independently and productively under pressure. Strong organizational skills and decision-making ability.
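The SQL-validation skill this listing calls for (queries that check test results in a DW database) can be sketched briefly. This is a hedged example: the `customer` table, its columns, and the expected counts are all invented for illustration, and sqlite3 stands in for the actual warehouse engine.

```python
import sqlite3

# Hypothetical staging table standing in for a DW table (all names illustrative).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER, email TEXT, country TEXT)")
conn.executemany(
    "INSERT INTO customer VALUES (?, ?, ?)",
    [(1, "a@x.com", "IN"), (2, None, "IN"), (2, None, "IN"), (3, "c@x.com", None)],
)

# Validation queries of the kind the role describes: null rates and duplicate keys.
null_emails = conn.execute(
    "SELECT COUNT(*) FROM customer WHERE email IS NULL"
).fetchone()[0]
dup_ids = conn.execute(
    "SELECT COUNT(*) FROM (SELECT id FROM customer GROUP BY id HAVING COUNT(*) > 1)"
).fetchone()[0]

print(null_emails, dup_ids)  # 2 null emails, 1 duplicated key
```

In a real engagement the same queries would run against the warehouse itself and feed into a data quality report.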

Posted 2 days ago

Apply

4.0 - 6.0 years

10 - 14 Lacs

Kolkata, Pune, Chennai

Work from Office


Location: Remote / Pan India - Delhi/NCR, Bangalore/Bengaluru, Hyderabad/Secunderabad, Chennai, Pune, Kolkata, Ahmedabad, Mumbai. Job Responsibilities: Transition legacy rules from Python using the Polars library to SparkSQL. Create new rules using SparkSQL based on written requirements. Must Have Skills: Understanding of the Polars library. Understanding of SparkSQL (this is more important than Polars). Good English communication. Ability to work in a collaborative environment. Experience with US healthcare data preferred.
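The migration this listing describes — re-expressing legacy Python/Polars rules as SparkSQL — can be sketched as follows. Everything here is hypothetical: the claim fields and the rule are invented, the Polars form appears only in a comment, and sqlite3 stands in for a Spark SQL engine (the query sticks to ANSI constructs that Spark SQL also accepts).

```python
import sqlite3

# Invented claim records (US-healthcare-flavoured, purely illustrative).
claims = [
    {"claim_id": "C1", "amount": 120.0, "status": "PAID"},
    {"claim_id": "C2", "amount": -5.0, "status": "PAID"},
    {"claim_id": "C3", "amount": 300.0, "status": "DENIED"},
]

# Legacy rule semantics as a plain predicate.  In Polars this might read:
#   df.filter((pl.col("amount") > 0) & (pl.col("status") == "PAID"))
legacy_hits = [c["claim_id"] for c in claims
               if c["amount"] > 0 and c["status"] == "PAID"]

# The same rule re-expressed as SQL, the target form for SparkSQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (claim_id TEXT, amount REAL, status TEXT)")
conn.executemany(
    "INSERT INTO claims VALUES (?, ?, ?)",
    [(c["claim_id"], c["amount"], c["status"]) for c in claims],
)
sql_hits = [row[0] for row in conn.execute(
    "SELECT claim_id FROM claims WHERE amount > 0 AND status = 'PAID'"
)]

print(legacy_hits == sql_hits)  # True: both forms flag only C1
```

Comparing the two outputs row-for-row like this is one way to regression-test each rule during such a transition.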

Posted 2 days ago

Apply

4.0 - 9.0 years

8 - 15 Lacs

Mumbai

Hybrid


Advanced experience in data analysis, business requirement gathering, dashboarding, and reporting, preferably with Microsoft Azure, SQL, Python, Power BI, and Tableau. Experience with SAP S/4HANA. Contact: roshitha(at)talentcorner.in / 9840092605. Required Candidate Profile: At least 4 years of experience working in data quality and governance. Subject matter expertise in data management policy, Issue Management & Resolution, and knowledge of data-related policies.

Posted 2 days ago

Apply

4.0 - 8.0 years

8 - 16 Lacs

Pune

Remote


Rudder Analytics is looking for a Senior BI Developer (Tableau / Power BI) at Pune, with 4-8 yrs of experience. Please see details at https://shorturl.at/7O9fa for job code BI-SA-01. Required Candidate Profile: A knack for professional design layouts and visual storytelling. Precision and attention to detail. Ability to lead a team and manage projects independently.

Posted 2 days ago

Apply

7.0 - 12.0 years

20 - 25 Lacs

Gurugram

Work from Office


Not Applicable Specialism Risk & Summary A career within Internal Audit services will provide you with an opportunity to gain an understanding of an organisation's objectives, regulatory and risk management environment, and the diverse needs of their critical stakeholders. We focus on helping organisations look deeper and see further, considering areas like culture and behaviours, to help improve and embed controls. In short, we seek to address the right risks and ultimately add value to their organisation. Why PwC & Summary PricewaterhouseCoopers is a multinational professional services network of firms, operating as partnerships under the PwC brand. PwC ranks as the second-largest professional services network in the world and is considered one of the Big Four accounting firms, along with Deloitte, EY and KPMG. PwC Careers PwC offers a diverse and exciting approach to development which puts you in the driver's seat. Driving your development and growth means that you have the opportunity to learn from your colleagues and clients around you through on-the-job experiences. A brief note on the requirement is given below. Risk Assurance Services (RAS) is one of PwC's high-growth verticals. It supports clients in defining their strategy, formulating business objectives and managing performance while achieving a balance between risk and opportunity or return. Our services within the Risk Assurance practice cover the entire risk & controls spectrum across Internal Audit, Governance, Risk & Controls, Contract & Compliance, Data Analytics, etc. Technical Skills: Experience in Internal Audit/Process Audit concepts & methodology; Processes, Subprocesses, and Activities as well as their relationship; Must be proficient in MS Office; Sarbanes-Oxley Act (SOX)/IFC Reviews, SOPs; Internal control concepts (e.g., Preventive Controls; Detective Controls; Risk Assessment; Antifraud Controls; etc.)
Soft Skills: Clarity of thought, articulation, and expression. Takes ownership, sincere and focused on execution. Confident and good verbal communication skills. Ability to organize, prioritize, and meet deadlines.
Mandatory skill sets Internal Audit Preferred skill sets Internal Audit Years of experience required 7 to 12 Years Education qualification MBA/M.Com/MCA/CA Education Degrees/Field of Study required Chartered Accountant Diploma, Master of Business Administration Degrees/Field of Study preferred Required Skills Internal Auditing Accepting Feedback, Accounting and Financial Reporting Standards, Active Listening, Analytical Thinking, Artificial Intelligence (AI) Platform, Auditing, Auditing Methodologies, Business Process Improvement, Coaching and Feedback, Communication, Compliance Auditing, Corporate Governance, Creativity, Data Analysis and Interpretation, Data Ingestion, Data Modeling, Data Quality, Data Security, Data Transformation, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Financial Accounting {+ 29 more} Travel Requirements: Up to 60%. Government Clearance Required? No

Posted 3 days ago

Apply

5.0 - 10.0 years

6 - 10 Lacs

Hyderabad

Work from Office


Job Description: We are looking for an experienced Senior Data Engineer with a strong foundation in Python, SQL, and Spark, and hands-on expertise in AWS and Databricks. In this role, you will build and maintain scalable data pipelines and architecture to support analytics, data science, and business intelligence initiatives. You'll work closely with cross-functional teams to drive data reliability, quality, and performance.

Responsibilities: Design, develop, and optimize scalable data pipelines using Databricks and AWS services such as Glue, S3, Lambda, and EMR, along with Databricks notebooks, workflows, and jobs. Build a data lake in AWS Databricks. Build and maintain robust ETL/ELT workflows using Python and SQL to handle structured and semi-structured data. Develop distributed data processing solutions using Apache Spark or PySpark. Partner with data scientists and analysts to provide high-quality, accessible, and well-structured data. Ensure data quality, governance, security, and compliance across pipelines and data stores. Monitor, troubleshoot, and improve the performance of data systems and pipelines. Participate in code reviews and help establish engineering best practices. Mentor junior data engineers and support their technical development.

Qualifications: Bachelor's or master's degree in computer science, engineering, or a related field. 5+ years of hands-on experience in data engineering, with at least 2 years working with AWS Databricks. Strong programming skills in Python for data processing and automation. Advanced proficiency in SQL for querying and transforming large datasets. Deep experience with Apache Spark/PySpark in a distributed computing environment. Solid understanding of data modelling, warehousing, and performance optimization techniques. Proficiency with AWS services such as Glue, S3, Lambda, and EMR. Experience with version control using Git or CodeCommit. Experience with a workflow orchestration tool such as Airflow or AWS Step Functions is a plus.

Posted 3 days ago

Apply

6.0 - 9.0 years

22 - 27 Lacs

Hyderabad

Work from Office


Job Description: We are seeking a highly skilled and motivated Senior Snowflake Developer to join our growing data engineering team. In this role, you will be responsible for building scalable and secure data pipelines and Snowflake-based architectures that power data analytics across the organization. You'll collaborate with business and technical stakeholders to design robust solutions in an AWS environment and play a key role in driving our data strategy forward.

Responsibilities: Design, develop, and maintain efficient and scalable Snowflake data warehouse solutions on AWS. Build robust ETL/ELT pipelines using SQL, Python, and AWS services (e.g., Glue, Lambda, S3). Collaborate with data analysts, engineers, and business teams to gather requirements and design data models aligned with business needs. Optimize Snowflake performance through best practices in clustering, partitioning, caching, and query tuning. Ensure data quality, accuracy, and completeness across data pipelines and warehouse processes. Maintain documentation and enforce best practices for data architecture, governance, and security. Continuously evaluate tools, technologies, and processes to improve system reliability, scalability, and performance. Ensure compliance with relevant data privacy and security regulations (e.g., GDPR, CCPA).

Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. Minimum 6 years of experience in data engineering, with at least 3 years of hands-on experience with Snowflake. Strong experience working with AWS services such as S3, Glue, Lambda, Redshift, and IAM. Proficient in SQL and Python for data transformation and scripting. Solid understanding of data modeling principles (Star/Snowflake schema, normalization/denormalization). Experience in performance tuning and Snowflake optimization techniques. Excellent problem-solving skills and ability to work independently or as part of a team.
Strong communication skills, both written and verbal.

Posted 3 days ago

Apply

6.0 - 10.0 years

8 - 12 Lacs

Gurugram

Work from Office


Key Responsibilities:

ETL Development and Maintenance: Design, develop, and implement ETL processes using SSIS to support data integration and warehousing requirements. Maintain and enhance existing ETL workflows to ensure data accuracy and integrity. Collaborate with data analysts, data architects, and other stakeholders to understand data requirements and translate them into technical specifications. Extract, transform, and load data from various source systems into the data warehouse. Perform data profiling, validation, and cleansing to ensure high data quality. Monitor ETL processes to ensure timely and accurate data loads. Write and optimize complex SQL queries to extract and manipulate data. Work with SQL Server to manage database objects, indexes, and performance tuning. Ensure data security and compliance with industry standards and regulations.

Business Intelligence and Reporting: Develop and maintain interactive dashboards and reports using Power BI or SSRS. Collaborate with business users to gather requirements and create visualizations that provide actionable insights. Integrate Power BI with other data sources and platforms for comprehensive reporting.

Scripting and Automation: Utilize Python for data manipulation, automation, and integration tasks. Develop scripts to automate repetitive tasks and improve efficiency.

Insurance Domain Expertise: Leverage knowledge of insurance industry processes and terminology to effectively manage and interpret insurance data. Work closely with business users and stakeholders within the insurance domain to understand their data needs and provide solutions.

Required Skills and Qualifications:

Technical Skills: Proficient in SQL and experience with SQL Server. Strong experience with SSIS for ETL development and data integration. Proficiency in Python for data manipulation and scripting. Experience with Power BI/SSRS for developing interactive dashboards and reports. Knowledge of data warehousing concepts and best practices.

Domain Knowledge: Solid understanding of insurance industry processes, terminology, and data structures. Experience working with insurance-related data, such as policies, claims, underwriting, and actuarial data.

Additional Skills: Strong problem-solving and analytical skills. Excellent communication and collaboration abilities.

Posted 3 days ago

Apply

3.0 - 7.0 years

3 - 7 Lacs

Pune

Work from Office


It's fun to work at a company where people truly believe in what they are doing!

Job Description: Job Summary: The Operations Analyst role provides technical support for the full lifecycle of the electronic discovery reference model (EDRM), including ingestion of data, quality control, document production, and document review projects. The position requires attention to detail, multi-tasking, and analytical skills, as well as someone who works well in a team. The candidate must be able to work under the pressure of strict deadlines on multiple projects in a fast-paced environment.

Essential Job Responsibilities: Utilize proprietary and 3rd-party eDiscovery software applications for electronic discovery and data recovery processes. Load, process, and search client data in many different file formats. Conduct relevant searches of electronic data using proprietary tools. Work closely with team members to troubleshoot data issues (prior to escalation to operations senior management and/or IT/Development), research software and/or techniques to solve problems, and carry out complex data analysis tasks. Provide end-user and technical documentation and training for applications supported. Communicate and collaborate with other company departments. Generate reports from various database platforms for senior management. Generate written status reports to clients, managers, and project managers. Work closely with internal departments on streamlining processes and development of proprietary tools.

Qualifications & Certifications: Solid understanding of Windows and all MS Office applications is required. Basic UNIX skills and an understanding of hardware, networking, and delimited files would be an advantage. Experience with database applications and knowledge of litigation support software is desirable. Strong analytical and problem-solving skills are essential for this role. Demonstrated ability to work in a team environment, follow detailed instructions, and meet established deadlines. A self-starter with the ability to visualize data and software behavior and coordinate the two. Fluency in English (verbal and written) is required. Bachelor's degree or final-year student, preferably in a computer/technical or legal field, or an equivalent combination of education and/or experience required.

If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!

Posted 3 days ago

Apply

3.0 - 7.0 years

15 - 17 Lacs

Hyderabad, Bengaluru

Work from Office


Job Summary: We are looking for a detail-oriented and analytical BI Engineer to join our data team. The ideal candidate will have a strong background in SQL, data visualization tools like Looker, Superset, Tableau, and MicroStrategy, and experience working with cloud platforms such as GCP and AWS. You will be responsible for transforming raw data into actionable insights and building scalable BI solutions to support data-driven decision-making across the organization.

Key Responsibilities: Design, develop, and maintain interactive dashboards and reports using Looker, Superset, Tableau, and MicroStrategy. Write complex and optimized SQL queries to extract, transform, and analyze data from various sources. Collaborate with stakeholders to gather requirements and translate business needs into technical specifications. Build and maintain data models and semantic layers to support self-service BI. Ensure data accuracy, consistency, and performance across all BI platforms. Work with data engineers and analysts to integrate data from multiple sources into cloud data warehouses (e.g., BigQuery, Redshift, Snowflake). Implement best practices for data visualization, dashboard design, and user experience. Monitor BI tools and troubleshoot issues related to data quality, performance, and access.

Required Skills: Proficiency in SQL and experience with large-scale data sets. Hands-on experience with at least two of the following BI tools: Looker, Superset, Tableau, MicroStrategy (MicroStrategy is a must). Strong understanding of data modeling, ETL processes, and data warehousing. Experience working with cloud platforms, especially Google Cloud Platform (GCP) and Amazon Web Services (AWS). Familiarity with version control systems (e.g., Git) and agile development practices. Excellent problem-solving skills and attention to detail.

Preferred Qualifications: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field. Experience with scripting languages like Python or R for data analysis. Knowledge of data governance, security, and compliance in BI environments. Exposure to CI/CD pipelines and DevOps practices for BI deployments.

Impact You'll Make: NA. This is a hybrid position and involves regular performance of job responsibilities virtually as well as in-person at an assigned TU office location for a minimum of two days a week. TransUnion Job Title: Developer, Applications Development

Posted 3 days ago

Apply

Exploring Data Quality Jobs in India

The data quality job market in India is thriving, with an increasing demand for professionals who can ensure the accuracy, completeness, and consistency of data across various industries. As companies rely more heavily on data-driven decision-making, the need for skilled data quality experts continues to grow.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Mumbai
  5. Delhi

Average Salary Range

The average salary range for data quality professionals in India varies based on experience level:

  • Entry-level: INR 3-6 lakhs per annum
  • Mid-level: INR 6-12 lakhs per annum
  • Experienced: INR 12-20 lakhs per annum

Career Path

A typical career progression in data quality may include roles such as Data Quality Analyst, Data Quality Engineer, Data Quality Manager, and Data Quality Architect. Professionals can advance from entry-level positions to senior roles by gaining experience, acquiring relevant certifications, and demonstrating expertise in data quality management.

Related Skills

In addition to proficiency in data quality processes and tools, professionals in this field are often expected to have knowledge of data governance, data analysis, data modeling, and SQL. Strong communication skills, attention to detail, and problem-solving abilities are also valuable for success in data quality roles.

Interview Questions

  • What is data quality and why is it important? (basic)
  • How do you measure data quality? (basic)
  • Explain the difference between data profiling and data cleansing. (medium)
  • What are some common data quality issues you have encountered in your previous projects? (medium)
  • How would you handle missing values in a dataset? (medium)
  • Describe the steps you would take to establish a data quality framework. (advanced)
  • How do you ensure data quality in real-time data processing? (advanced)
  • What tools or software have you used for data quality management? (medium)
  • Can you explain the concept of data deduplication? (basic)
  • How do you handle data anomalies in a dataset? (medium)
  • What is the role of metadata in data quality management? (advanced)
  • How do you assess data quality requirements for a new project? (medium)
  • What are the key components of a data quality report? (basic)
  • Explain the difference between data accuracy and data integrity. (medium)
  • How do you ensure data quality compliance with regulatory standards? (advanced)
  • Have you implemented any data quality improvement initiatives in your previous roles? (medium)
  • How do you prioritize data quality issues in a large dataset? (advanced)
  • Can you describe a challenging data quality problem you solved in your career? (medium)
  • What strategies do you use to maintain data quality over time? (advanced)
  • How do you collaborate with other teams to improve data quality across an organization? (medium)
  • What role does data profiling play in data quality assessment? (basic)
  • How do you handle data quality issues caused by human error? (medium)
  • Can you explain the concept of data lineage and its importance in data quality management? (advanced)
  • How do you validate data quality metrics and KPIs? (medium)
  • What are some best practices for ensuring data quality in a data migration project? (advanced)
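Two of the questions above (handling missing values and data deduplication) are often answered with a short hands-on demonstration. The following pure-Python sketch shows one common approach to each: dropping later duplicates by a chosen key, and mean-imputing missing numeric values. The records and values are made up for illustration; interviewers usually expect you to also discuss when these simple strategies are inappropriate.

```python
# Deduplication by key, keeping the first occurrence of each id.
records = [
    {"id": 1, "name": "Asha"},
    {"id": 2, "name": "Ravi"},
    {"id": 1, "name": "Asha K."},  # duplicate id: this copy is dropped
]

seen = set()
deduped = []
for rec in records:
    if rec["id"] not in seen:
        seen.add(rec["id"])
        deduped.append(rec)

# Missing-value handling: impute with the mean of the observed values.
values = [10.0, None, 30.0]
observed = [v for v in values if v is not None]
mean = sum(observed) / len(observed)
filled = [mean if v is None else v for v in values]

print(len(deduped))  # 2
print(filled)        # [10.0, 20.0, 30.0]
```

A stronger answer would note the trade-offs: keeping the first record may discard fresher data, and mean imputation can distort distributions, so the right strategy depends on the downstream use of the data.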

Conclusion

As you explore opportunities in the data quality job market in India, remember to continuously enhance your skills, stay updated on industry trends, and showcase your expertise during interviews. By preparing thoroughly and approaching each opportunity with confidence, you can position yourself for a successful career in data quality. Good luck!
