
3717 Data Quality Jobs - Page 21

JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

3.0 - 5.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Role Purpose
The purpose of this role is to prepare test cases and perform testing of the product/platform/solution to be deployed at a client end, ensuring it meets 100% of the quality assurance parameters.

Do
- Understand the test requirements and test case design of the product
- Author test plans with appropriate knowledge of business requirements and the corresponding testable requirements
- Implement Wipro's way of testing using model-based testing to achieve efficient test generation
- Ensure test cases are peer reviewed to minimize rework
- Work with the development team to identify and capture test cases and ensure versioning
- Set the criteria, parameters, and scope/out-of-scope of testing, and participate in UAT (User Acceptance Testing)
- Automate the test life cycle process at the appropriate stages through VB macros, scheduling, GUI automation, etc.
- Design and execute the automation framework and reporting
- Develop and automate tests for software validation by setting up test environments, designing test plans, and developing and executing test cases/scenarios/usage cases
- Ensure test defects are raised as per the norms defined for the project/program/account, with clear descriptions and replication patterns
- Detect bugs, prepare and file defect reports, and report test progress
- Allow no instances of rejection or slippage of delivered work items, keeping them within Wipro/customer SLAs and norms
- Design and release the test status dashboard to stakeholders on time at the end of every test execution cycle
- Provide feedback on usability and serviceability, trace results to quality risks, and report them to concerned stakeholders
- Report status and maintain customer focus on an ongoing basis with respect to testing and its execution
- Ensure good quality of interaction with the customer w.r.t. e-mail content, fault report tracking, voice calls, business etiquette, etc.
- Deliver on time: WSRs, test execution reports, and relevant dashboard updates in the test management repository
- Update accurate efforts in eCube, TMS, and other project-related trackers
- Respond to customer requests in a timely manner, with no instances of complaints either internally or externally

Mandatory Skills: Data Quality Engineering. Experience: 3-5 Years.
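The model-based test generation mentioned above can be sketched as a short walk over a state-machine model. The login-flow model, state names, and depth limit below are invented for illustration and are not Wipro's actual framework:

```python
from collections import deque

# Hypothetical login-flow model: states map events to next states (illustrative only).
MODEL = {
    "logged_out": {"login_ok": "logged_in", "login_bad": "locked_out_check"},
    "locked_out_check": {"retry": "logged_out"},
    "logged_in": {"logout": "logged_out"},
}

def generate_test_paths(model, start, max_depth=3):
    """Enumerate event sequences up to max_depth via a breadth-first walk.
    Each path is one generated test case covering a distinct transition chain."""
    paths, queue = [], deque([(start, [])])
    while queue:
        state, path = queue.popleft()
        if path:
            paths.append(path)
        if len(path) < max_depth:
            for event, nxt in model.get(state, {}).items():
                queue.append((nxt, path + [event]))
    return paths

cases = generate_test_paths(MODEL, "logged_out")
```

Generating cases from the model rather than hand-writing them is what makes the test suite cheap to regenerate when requirements change.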

Posted 1 week ago

Apply

3.0 - 7.0 years

9 - 14 Lacs

Pune

Work from Office

Job Title: Senior Engineer - Data SQL Engineer
Corporate Title: AVP
Location: Pune, India

Role Description
As a SQL Engineer, you will be responsible for the design, development, and optimization of complex database systems. You will write efficient SQL queries and stored procedures, and bring expertise in data modeling, performance optimization, and working with large-scale relational databases.

What we'll offer you
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your key responsibilities
- Design, develop, and optimize complex SQL queries, stored procedures, views, and functions
- Work with large datasets to perform data extraction, transformation, and loading (ETL)
- Develop and maintain scalable database schemas and models
- Troubleshoot and resolve database-related issues, including performance bottlenecks and data quality concerns
- Maintain data security and compliance with data governance policy

Your skills and experience
- 10+ years of hands-on experience with SQL in relational databases: SQL Server, Oracle, MySQL, PostgreSQL
- Strong working experience with PL/SQL and T-SQL
- Strong understanding of data modeling, normalization, and relational DB design

Desirable skills that will help you excel
- Ability to write highly performant, resilient queries in Oracle/PostgreSQL/MSSQL
- Working knowledge of database modeling techniques such as star schema, fact-dimension models, and Data Vault
- Awareness of database tuning methods: AWR reports, indexing, partitioning of data sets, defining tablespace sizes and user roles, etc.
- Hands-on experience with ETL tools: Pentaho, Informatica, StreamSets
- Good experience in performance tuning, query optimization, and indexing
- Hands-on experience with object storage and scheduling tools
- Experience with cloud-based data services such as data lakes, data pipelines, and machine learning platforms
- Experience in GCP and cloud database migration, hands-on with Postgres

About us and our teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We at DWS are committed to creating a diverse and inclusive workplace, one that embraces dialogue and diverse views, and treats everyone fairly to drive a high-performance culture. The value we create for our clients and investors is based on our ability to bring together various perspectives from all over the world and from different backgrounds. It is our experience that teams perform better and deliver improved outcomes when they are able to incorporate a wide range of perspectives. We call this #ConnectingTheDots.
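As a rough illustration of the indexing and query-plan tuning this role calls for, the snippet below uses SQLite purely as a stand-in for Oracle/PostgreSQL/MSSQL, showing how adding an index turns a full table scan into an index search. The table and column names are invented:

```python
import sqlite3

# Illustrative sketch: how an index changes a query plan (SQLite as a stand-in).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (id INTEGER PRIMARY KEY, account TEXT, amount REAL)")
conn.executemany("INSERT INTO trades (account, amount) VALUES (?, ?)",
                 [(f"ACC{i % 100}", i * 1.5) for i in range(1000)])

def plan(sql):
    # EXPLAIN QUERY PLAN reports whether SQLite scans the table or uses an index.
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

before = plan("SELECT * FROM trades WHERE account = 'ACC7'")   # full table scan
conn.execute("CREATE INDEX idx_trades_account ON trades(account)")
after = plan("SELECT * FROM trades WHERE account = 'ACC7'")    # index search
```

The same diagnostic habit carries over to `EXPLAIN PLAN` in Oracle and `EXPLAIN ANALYZE` in PostgreSQL, where partitioning and tablespace choices also feed into the plan.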

Posted 1 week ago

Apply

5.0 - 8.0 years

8 - 12 Lacs

Pune

Work from Office

Role
The purpose of this role is to provide strategic guidance and recommendations on pricing of contracts being executed in the assigned SBU, while maintaining competitive advantage and profit margins. Responsible for ensuring SoW adherence to internal guidelines for all contracts in the SBU.

Do
Contract pricing review and advice - pricing strategy deployment
- Drive the deployment of the pricing strategy for the SBU/Vertical/Account in line with the overall pricing strategy for Wipro
- Partner with and educate the business leaders about adherence to the pricing strategy, internal guidelines, and SoW

Business partnering for advice on contract commercials
- Work closely with pre-sales and BU leadership to review contracts about to be finalized and provide inputs on their structuring, payment milestones, and terms & conditions
- Review the Resource Loading Sheet (RLS) submitted by the pre-sales/delivery team and work on the contract pricing
- Collaborate with the business leaders to propose competitive pricing based on the effort estimate, considering the cost of resources, skills availability, and identified premium skills

Review adherence of the contract's commercial terms and conditions
- Review the commercial terms and conditions proposed in the SoW
- Ensure they are aligned with internal guidelines for the credit period and the existing MSAs, and recommend payment milestones

Ensure accurate revenue recognition and provide forecasts
- Implement and drive adherence to revenue recognition guidelines
- Ensure revenue recognition by the BFMs/Service Line Finance Managers is done as per IFRS standards
- Partner with Finance Managers and educate them on revenue recognition standards and Wipro's internal guidelines
- Provide accurate and timely forecasts of revenue for the assigned SBU/Vertical/Cluster/Accounts

Validation of order booking
- Adhere to order booking guidelines
- Oversee and ensure all documents, approvals, and guidelines are adhered to before the order is confirmed in the books of accounts
- Highlight any deviations from the internal guidelines/standards and work with the concerned teams to address them

Team Management
- Clearly define the expectations for the team
- Assign goals for the team, conduct timely performance reviews, and provide constructive feedback to your direct reports
- Guide team members in acquiring relevant knowledge and developing their professional competence
- Educate and build awareness in the team of Wipro guidelines on revenue recognition, pricing strategy, contract terms, and the MSA
- Ensure that Performance Nxt is followed for the entire team

Employee Satisfaction and Engagement
- Lead and drive engagement initiatives for the team
- Track team satisfaction scores and identify initiatives to build engagement within the team

1. Financials: Monetizing Wipro's efforts and value additions; comprehensiveness of pricing recommendations; accurate inputs in forecasting of revenue as per revenue recognition guidelines
2. Internal Customer: Completeness of the contracts checklist before order booking
3. Team Management: Team attrition %, employee satisfaction score, localization %, gender diversity %; training and skill building of the team on pricing operations

Mandatory Skills: Data Governance. Experience: 5-8 Years.

Posted 1 week ago

Apply

5.0 - 8.0 years

9 - 14 Lacs

Chennai

Work from Office

The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do
- Oversee and support the process by reviewing daily transactions on performance parameters
- Review the performance dashboard and the scores for the team
- Support the team in improving performance parameters by providing technical support and process guidance
- Record, track, and document all queries received, the problem-solving steps taken, and the total successful and unsuccessful resolutions
- Ensure standard processes and procedures are followed to resolve all client queries
- Resolve client queries as per the SLAs defined in the contract
- Develop an understanding of the process/product for the team members to facilitate better client interaction and troubleshooting
- Document and analyze call logs to spot the most frequently occurring trends and prevent future problems
- Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution
- Ensure all product information and disclosures are given to clients before and after the call/email requests
- Avoid legal challenges by monitoring compliance with service agreements
- Handle technical escalations through effective diagnosis and troubleshooting of client queries
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements
- If unable to resolve an issue, escalate it to TA & SES in a timely manner
- Provide product support and resolution to clients by performing question diagnosis while guiding users through step-by-step solutions
- Troubleshoot all client queries in a user-friendly, courteous, and professional manner
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations
- Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs
- Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client
- Mentor and guide Production Specialists on improving technical knowledge
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
- Develop and conduct trainings (triages) within products for Production Specialists as per target
- Inform the client about the triages being conducted
- Undertake product trainings to stay current with product features, changes, and updates
- Enroll in product-specific and any other trainings per client requirements/recommendations
- Identify and document the most common problems and recommend appropriate resolutions to the team
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Mandatory Skills: Informatica MDM. Experience: 5-8 Years.

Posted 1 week ago

Apply

8.0 - 13.0 years

16 - 31 Lacs

Mumbai

Work from Office

Job Name: (Digital Banking) Associate Data Analyst
Location: Mumbai
Grade: Senior Manager / AVP

Looking for a Business Analyst working in an RBI-regulated sector (bank, lending NBFC, or consulting firms working on banking data), with experience in business credit risk: a functional-techno resource who understands the data lifecycle and actively does stakeholder management.

Skills: Stakeholder Management, Data Quality, Data Analytics, Data Management, Reporting, Data Transformation

Experience: Graduate candidates - up to 12 years of experience; PG candidates - up to 10 years preferred

Predominant Skills:
- Data quality and remediation processes (databases, SQL, and Python)
- Data visualisation skills (dashboards, Tableau, Power BI)
- Informatica Data Quality
- Basic understanding of data lakes and cloud environments

Job Purpose
HDFC Bank has a huge volume of data, both structured and unstructured, and we are focused on creating assets out of data and deriving the best value from the data for the Bank. The Data Remediation and DaaS specialist will be responsible for improving customer data quality through various internal data remediation methodologies. This role will also focus on designing, implementing, and maintaining global and local data marts on the Bank's Data Lake to support business, marketing, analytics, regulatory, and other functional use cases. This role is crucial in ensuring high-quality customer data while enabling business functions with reliable and well-structured data marts. The ideal candidate will be someone with a passion for data quality, strong technical skills, and a strategic mindset to drive data-driven decision-making across the Bank.

Role & responsibilities
Customer Data Quality Management
- Analyze and assess data quality issues in customer records
- Implement data cleansing, standardization, and deduplication strategies
- Monitor and improve the accuracy, completeness, and consistency of customer data

Formulate Data Remediation Strategies
- Conduct root cause analysis to identify sources of poor data quality
- Coordinate with internal stakeholders to drive data improvement initiatives

Data Mart Development & Maintenance
- Engage with multiple stakeholders (business, product, credit, risk, analytics, marketing, finance, BIU, etc.) to discover data mart requirements along with the current challenges faced
- Provide inputs and recommendations on continuous improvement of policies, procedures, processes, standards, and controls pertaining to data marts
- Quantify the impact in business-value terms (revenue/cost/loss) of launching global and local data marts

Experience Required
- 5-7 years of total work experience in data quality / data product creation
- 5+ years of experience in banking and financial services
- Experience working in a large, multi-functional, matrix organization
- Strong technical and functional understanding of data remediation and data products, including staging, mapping, cleanse functions, match rules, validation, trust scores, remediation techniques, mart creation methodologies and best practices, etc.
- Experience with industry-leading master data/metadata/data quality suites, such as Informatica Data Quality
- Exposure to working in a cloud environment will be an added advantage
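The cleansing and deduplication strategies described above might look like the following minimal sketch. The field names, normalization rules, and sample records are invented; a production remediation suite such as Informatica Data Quality would apply far richer match rules and trust scores:

```python
import re

def standardize(rec):
    """Normalize name and phone so formatting noise doesn't hide duplicates."""
    return {
        "name": " ".join(rec["name"].strip().lower().split()),
        "phone": re.sub(r"\D", "", rec["phone"])[-10:],  # keep last 10 digits
    }

def deduplicate(records):
    """Keep the first record per (name, phone) key after standardization."""
    seen, unique = set(), []
    for rec in records:
        key = tuple(standardize(rec).values())
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

customers = [
    {"name": "Asha  Rao", "phone": "+91-98765-43210"},
    {"name": "asha rao", "phone": "9876543210"},   # duplicate after normalization
    {"name": "R. Mehta", "phone": "9812345678"},
]
clean = deduplicate(customers)
```

The key design choice is standardizing before matching, so that "Asha  Rao / +91-98765-43210" and "asha rao / 9876543210" collapse to one customer.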

Posted 1 week ago

Apply

3.0 - 5.0 years

13 - 17 Lacs

Gurugram

Work from Office

Senior Analyst - GCP Data Engineer: Elevate Your Impact Through Innovation and Learning

Evalueserve is a global leader in delivering innovative and sustainable solutions to a diverse range of clients, including over 30% of Fortune 500 companies. With a presence in more than 45 countries across five continents, we excel in leveraging state-of-the-art technology, artificial intelligence, and unparalleled subject matter expertise to elevate our clients' business impact and strategic decision-making. Our team of over 4,500 talented professionals operates in countries such as India, China, Chile, Romania, the US, and Canada. Our global network also extends to emerging markets like Colombia, the Middle East, and the rest of Asia-Pacific. Recognized by Great Place to Work in India, Chile, Romania, the US, and the UK in 2022, we offer a dynamic, growth-oriented, and meritocracy-based culture that prioritizes continuous learning, skill development, and work-life balance.

About Data Analytics (DA)
Data Analytics is one of the highest-growth practices within Evalueserve, providing rewarding career opportunities. Established in 2014, the global DA team has grown to 1,000+ (and growing) data science professionals across data engineering, business intelligence, digital marketing, advanced analytics, technology, and product engineering. Our more tenured teammates, some of whom have been with Evalueserve since it started more than 20 years ago, have enjoyed leadership opportunities in different regions of the world across our seven business lines.

What you will be doing at Evalueserve
- Data Pipeline Development: Design and implement scalable ETL (Extract, Transform, Load) pipelines using tools like Cloud Dataflow, Apache Beam or Spark, and BigQuery.
- Data Integration: Integrate various data sources into unified data warehouses or lakes, ensuring seamless data flow.
- Data Transformation: Transform raw data into analyzable formats using tools like dbt (data build tool) and Dataflow.
- Performance Optimization: Continuously monitor and optimize data pipelines for speed, scalability, and cost-efficiency.
- Data Governance: Implement data quality standards, validation checks, and anomaly detection mechanisms.
- Collaboration: Work closely with data scientists, analysts, and business stakeholders to align data solutions with organizational goals.
- Documentation: Maintain detailed documentation of workflows and adhere to coding standards.

What we're looking for
- Proficiency in Python/PySpark and SQL for data processing and querying.
- Expertise in GCP services like BigQuery, Cloud Storage, Pub/Sub, Cloud Composer, and Dataflow.
- Familiarity with data warehouse and lakehouse principles and distributed data architectures.
- Strong problem-solving skills and the ability to handle complex projects under tight deadlines.
- Knowledge of data security and compliance best practices.
- Certification: GCP Professional Data Engineer.

Follow us on https://www.linkedin.com/company/evalueserve/

Learn more about what our leaders are saying on our achievements: the AI-powered supply chain optimization solution built on Google Cloud; how Evalueserve is now leveraging NVIDIA NIM to enhance our AI and digital transformation solutions and accelerate AI capabilities; and how Evalueserve has climbed 16 places on the 50 Best Firms for Data Scientists in 2024!

Want to learn more about our culture and what it's like to work with us? Write to us at: careers@evalueserve.com

Disclaimer: The following job description serves as an informative reference for the tasks you may be required to perform. However, it does not constitute an integral component of your employment agreement and is subject to periodic modifications to align with evolving circumstances.
Please Note: We appreciate the accuracy and authenticity of the information you provide, as it plays a key role in your candidacy. As part of the Background Verification Process, we verify your employment, education, and personal details. Please ensure all information is factual and submitted on time. For any assistance, your TA SPOC is available to support you.
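The extract-transform-load flow described in the responsibilities above can be sketched in plain Python. In a real GCP pipeline these stages would map to Dataflow/Apache Beam transforms with BigQuery as the sink; the sample data and row schema here are invented:

```python
import csv
import io

# Invented sample input with one incomplete row (order 2 has no amount).
RAW = "order_id,amount\n1,100\n2,\n3,250\n"

def extract(text):
    """Extract: parse the raw source into dict rows."""
    yield from csv.DictReader(io.StringIO(text))

def transform(rows):
    """Transform: validate and type-cast, dropping incomplete rows."""
    for row in rows:
        if row["amount"]:
            yield {"order_id": int(row["order_id"]),
                   "amount": float(row["amount"])}

def load(rows):
    """Load: append to a sink (a list stands in for a BigQuery table)."""
    sink = []
    sink.extend(rows)
    return sink

loaded = load(transform(extract(RAW)))
```

Chaining generators keeps each stage independently testable, which is the same composability that Beam's `PTransform` stages provide at scale.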

Posted 1 week ago

Apply

5.0 - 10.0 years

20 - 25 Lacs

Hyderabad

Hybrid

Job Summary: We are looking for a Quality Engineer - Data to ensure the reliability, accuracy, and performance of data pipelines and AI/ML models in our SmartFM platform. This role is essential for delivering trustworthy data and actionable insights to optimize smart building operations.

Roles and Responsibilities:
- Design and implement QA strategies for data pipelines and ML models.
- Test data ingestion and streaming systems (StreamSets, Kafka) for accuracy and completeness.
- Validate data stored in MongoDB, ensuring schema and data integrity.
- Collaborate with data engineers to proactively address data quality issues.
- Work with data scientists to test and validate ML/DL/LLM/agentic-workflow models.
- Automate data validation and model testing using tools like Pytest, Great Expectations, and Deepchecks.
- Monitor production pipelines for data drift, model degradation, and performance issues.
- Participate in code reviews and create detailed QA documentation.
- Continuously improve QA processes based on industry best practices.

Required Technical Skills:
- 5-10 years of experience in QA, with a focus on data and ML testing.
- Proficiency in SQL for complex data validation.
- Hands-on experience with StreamSets, Kafka, and MongoDB.
- Python scripting for test automation.
- Familiarity with ML model testing, metrics, and bias detection.
- Experience with cloud platforms (Azure, AWS, or GCP).
- Understanding of Node.js and React-based systems is a plus.
- Experience with QA tools like Pytest, Great Expectations, and Deepchecks.

Additional Qualifications:
- Excellent communication and documentation skills.
- Strong analytical mindset and attention to detail.
- Experience working cross-functionally with engineers, scientists, and product teams.
- Passion for learning new technologies and QA frameworks.
- Domain knowledge in facility management, IoT, or building automation is a plus.
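The automated data-validation step could be sketched with plain assertions in the spirit of Great Expectations (which wraps the same idea in declarative expectation suites). The column names, thresholds, and sample sensor rows below are invented:

```python
def expect_no_nulls(rows, column):
    """Expectation: every row has a non-empty value in the given column."""
    missing = [i for i, r in enumerate(rows) if r.get(column) in (None, "")]
    return {"success": not missing, "failed_rows": missing}

def expect_values_between(rows, column, lo, hi):
    """Expectation: every value in the column lies in [lo, hi]."""
    bad = [i for i, r in enumerate(rows) if not (lo <= r[column] <= hi)]
    return {"success": not bad, "failed_rows": bad}

# Invented sample batch from an ingestion pipeline; row 2 fails both checks.
readings = [
    {"sensor_id": "t1", "temp_c": 21.5},
    {"sensor_id": "t2", "temp_c": 19.0},
    {"sensor_id": "", "temp_c": 240.0},
]
null_check = expect_no_nulls(readings, "sensor_id")
range_check = expect_values_between(readings, "temp_c", -40, 85)
```

Returning failed row indices rather than a bare pass/fail is what makes such checks useful in a defect report: the replication pattern is in the result.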

Posted 1 week ago

Apply

0.0 - 2.0 years

2 - 4 Lacs

Hyderabad

Work from Office

Career Category: Supply Chain

Job Description
Role Description: The Master Data Associate - Material & Production at Amgen will support the accuracy and consistency of master data across the organization. This role will perform data validation, entry, cleansing, and enrichment while collaborating with teams to resolve issues and ensure data integrity. The associate will support key performance monitoring, data governance, and compliance efforts, as well as assist in data migration and integration projects. Candidates should have a basic understanding of enterprise applications like SAP or Oracle, familiarity with data quality and compliance standards, and strong analytical skills.

Roles & Responsibilities:
- Perform data operations tasks, mainly maintenance and validation, to ensure the accuracy and integrity of master data
- Support process optimization initiatives to improve data management workflows and enhance efficiency
- Learn and support data analysis to identify trends, discrepancies, and opportunities for improvement
- Provide support to partners, customers, and end users on master data processes, tools, and best practices
- Maintain data quality reports to monitor performance metrics and ensure data compliance
- Collaborate cross-functionally with business, IT, and operations teams to resolve data-related issues and ensure alignment with organizational goals

Basic Qualifications and Experience:
- Bachelor's degree in a STEM discipline and 0-2 years of experience in SAP ECC, master data management, data governance, or data operations, preferably in healthcare or biotech supply chains
- Technical Proficiency: Experience in SAP/Oracle, Microsoft Office (Excel, PowerPoint), and other data management tools (e.g., MDG, Informatica)
- Analytical Skills: Ability to analyze large datasets and deliver actionable insights
- Intellectual Curiosity: Driven to learn by asking questions, proactively learning the business, and developing new skills
- Attention to Detail: High accuracy and attention to detail, with a strong focus on data quality
- Communication: Excellent written and verbal communication skills, with the ability to present findings to both technical and non-technical stakeholders

Functional Skills:
Must-Have Skills:
- Working knowledge of SAP/Oracle
- Understanding of general master data management processes, frameworks, and governance
- Proficiency in Excel and the MS Office Suite, with experience in data analysis
- Basic understanding of data governance frameworks and ensuring data accuracy and quality
- Strong communication skills for presenting data insights to both technical and non-technical audiences

Good-to-Have Skills:
- SAP S/4, SAP MDG, SAP TM

Posted 1 week ago

Apply

2.0 - 5.0 years

2 - 6 Lacs

Gurugram

Work from Office

About Us
At SBI Card, the motto "Make Life Simple" inspires every initiative, ensuring that customer convenience is at the forefront of all that we do. We are committed to building an environment where people can thrive and create a better future for everyone. SBI Card is proud to be an equal opportunity and inclusive employer and welcomes employees without any discrimination on the grounds of race, color, gender, religion, creed, disability, sexual orientation, gender identity, marital status, caste, etc. SBI Card is committed to fostering an inclusive and diverse workplace where all employees are treated equally with dignity and respect, which makes it a promising place to work.

What's in it for YOU
- SBI Card truly lives by the work-life balance philosophy. We offer a robust wellness and wellbeing program to support the mental and physical health of our employees
- Admirable work deserves to be rewarded. We have a well-curated bouquet of rewards and recognition programs for employees
- Dynamic, inclusive, and diverse team culture
- Gender-neutral policy
- Inclusive health benefits for all: medical insurance, personal accident, group term life insurance, annual health checkup, dental and OPD benefits
- Commitment to the overall development of an employee through a comprehensive learning & development framework

Role Purpose
This role is responsible for the day-to-day operations of the function, including catering to data service requests, data observability, analytics, insight generation, and dashboard/MIS requirements, while owning the business data within the SBI Card Data Lake.

Role Accountability
This role involves the development and implementation of BI solutions that support data-driven decision-making for our credit card business. We are looking for a candidate with a strong background in data analytics and a sound understanding of the financial services industry.
Key Responsibilities
Data Analysis & Reporting:
- Collect, analyze, and interpret large datasets to generate actionable insights related to credit card products, customer behavior, and financial performance
- Develop and maintain regular reports, dashboards, and key performance indicators (KPIs) that track and analyze business performance
- Present data insights to management and relevant teams, ensuring clear and effective communication of findings

Data Visualization & Insights Generation:
- Design and build interactive dashboards and visualizations using BI tools (e.g., Power BI, Tableau) to enhance data accessibility for stakeholders
- Perform ad-hoc analysis to identify business opportunities, trends, and potential risks within the credit card portfolio
- Work with the manager and other stakeholders to identify areas for improvement and provide data-driven recommendations

Data Management & Integrity:
- Ensure data quality and accuracy by following best practices in data governance, validation, and cleansing
- Assist in maintaining and enhancing the company's data warehouse and reporting structures, ensuring that all relevant data is up to date and accessible
- Identify and troubleshoot data discrepancies or issues and provide solutions to ensure data integrity

Collaboration with Cross-Functional Teams:
- Collaborate with various departments to understand business needs and provide actionable insights
- Support the implementation of new BI tools, processes, and technologies across the organization
- Partner with business teams to define key metrics and develop reporting structures for ongoing performance monitoring

Continuous Improvement & Learning:
- Stay updated on the latest trends, tools, and technologies in business intelligence and analytics
- Identify opportunities to automate or streamline data collection, analysis, and reporting processes
- Contribute to ongoing process improvements and enhance the BI team's capabilities

Measures of Success
- Deliver data projects (MIS, reports, and dashboards) on time and accurately to drive business decision-making
- Deliver actionable insights to the business for decision-making
- Deliver on data extraction and other service tickets within SLA

Technical Skills / Experience / Certifications
- Proficiency in BI tools (e.g., Power BI, Tableau, QlikView) and data analysis tools (e.g., SQL, Python, R)
- Experience in data visualization and storytelling to communicate insights effectively
- Strong knowledge of data warehousing, ETL processes, and database management is a plus
- Strong understanding of financial metrics and KPIs related to credit card businesses (e.g., delinquency rates, card utilization, profitability)
- Proficiency with statistical and data software languages

Competencies critical to the role
- Ability to multi-task and work with cross-functional teams
- Communication and presentation skills
- Analytical skills and an eye for detail
- Strategic and lateral thinking, and the capability to come up with new ideas
- Commitment to continuous learning
- Strong communication skills, with the ability to present findings to both technical and non-technical stakeholders

Qualification
Graduate or Postgraduate in Computer Science, Data Science, Statistics, Data Analytics, or related fields

Preferred Industry: BFSI
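Two of the credit-card KPIs named above, card utilization and delinquency rate, reduce to simple ratios over portfolio data. The account figures below are made up purely for illustration:

```python
# Invented sample portfolio snapshot (balances and limits in rupees).
accounts = [
    {"balance": 45000, "limit": 100000, "days_past_due": 0},
    {"balance": 80000, "limit": 100000, "days_past_due": 45},
    {"balance": 10000, "limit": 50000,  "days_past_due": 0},
]

# Card utilization: share of total available credit currently drawn.
utilization = sum(a["balance"] for a in accounts) / sum(a["limit"] for a in accounts)

# Delinquency rate: share of accounts 30 or more days past due.
delinquency_rate = sum(a["days_past_due"] >= 30 for a in accounts) / len(accounts)
```

In a BI dashboard these ratios would be computed per segment and per month in SQL and trended over time rather than over a single snapshot.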

Posted 1 week ago

Apply

2.0 - 6.0 years

8 - 12 Lacs

Hyderabad

Work from Office

We are seeking a highly skilled Gen AI/ML Engineer with expertise in embeddings, vector databases, and application development to join our team. The ideal candidate will have a strong background in machine learning, natural language processing, and software development. The Gen AI/ML Engineer will be responsible for designing, implementing, and optimizing AI/ML models and applications, leveraging embeddings and vector databases to build innovative applications.

Responsibilities
- Design, develop, and deploy Gen AI/ML applications with a focus on embeddings and vector databases
- Implement and optimize machine learning algorithms for various applications, including natural language processing and computer vision
- Develop and maintain scalable applications that leverage AI/ML models and vector databases
- Collaborate with data scientists, software engineers, and other stakeholders to integrate AI/ML solutions into existing systems
- Conduct research and stay current with the latest advancements in AI/ML, embeddings, and vector databases
- Optimize the performance and scalability of AI/ML models and applications
- Ensure data quality, integrity, and security throughout the AI/ML lifecycle
- Provide technical guidance and support to team members and stakeholders

Qualifications
Required Skills:
- Strong understanding of machine learning principles, algorithms, and frameworks
- Proficiency in programming languages such as Python, Java, or C++
- Experience with AI/ML frameworks and libraries (e.g., TensorFlow, PyTorch, scikit-learn)
- Hands-on experience with vector databases (e.g., Faiss, Milvus, Pinecone)
- Familiarity with natural language processing techniques and tools
- Experience in developing and deploying scalable applications
- Excellent problem-solving skills and attention to detail
- Strong communication and collaboration skills

Required Experience & Education:
- 11-13 years of overall experience in technology
- Bachelor's degree in Computer Science, Data Science, Machine Learning, or a related field; Master's degree preferred
- Proven experience as an AI/ML Engineer or similar role, with a focus on embeddings and vector databases

Desired Experience:
- Knowledge of data security and compliance requirements

About Evernorth Health Services
Evernorth Health Services, a division of The Cigna Group, creates pharmacy, care, and benefit solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention, and treatment of illness and disease more accessible to millions of people. Join us in driving growth and improving lives.
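At its core, the vector-database lookup described above is nearest-neighbour search over embeddings. The brute-force sketch below shows the idea in pure Python; a library like Faiss or Milvus does the same at scale with approximate indexes, and the three-dimensional document vectors here are invented:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy "vector index": document -> invented embedding.
index = {
    "refund policy":  [0.9, 0.1, 0.0],
    "shipping times": [0.1, 0.9, 0.2],
    "card rewards":   [0.0, 0.2, 0.9],
}

def search(query_vec, k=1):
    """Return the k documents most similar to the query embedding."""
    ranked = sorted(index, key=lambda doc: cosine(query_vec, index[doc]),
                    reverse=True)
    return ranked[:k]

best = search([0.85, 0.15, 0.05])
```

Real embeddings have hundreds or thousands of dimensions, which is why production systems trade exactness for speed with structures like IVF or HNSW rather than scanning every vector.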

Posted 1 week ago

Apply

0.0 - 5.0 years

7 - 12 Lacs

Mumbai

Work from Office

Join our dynamic Integrated Data Platform Operations team and be at the forefront of data innovation. Collaborate with clients and technology partners to ensure data excellence. Elevate your career by driving data quality and governance in a strategic environment.

Job Summary : As an Associate in the Integrated Data Platform Operations team, you will work with clients and technology partners to implement data quality and governance practices. You will define data standards and ensure data meets the highest quality. You will play a crucial role in enhancing data management across the Securities Services business.

Job Responsibilities :
- Define data quality standards
- Investigate data quality issues
- Collaborate with technology partners
- Establish dashboards and metrics
- Support data view and lineage tools
- Embed data quality in UAT cycles
- Assist Operations users with data access
- Work with project teams on implementations
- Implement data ownership processes
- Deliver tools and training for data owners
- Champion improvements to data quality

Required Qualifications, Capabilities, and Skills :
- Engage effectively across teams
- Understand data components for IBOR
- Comprehend trade lifecycle and cash management
- Possess technical data management skills
- Solve operational and technical issues
- Deliver with limited supervision
- Partner in a virtual team environment

Preferred Qualifications, Capabilities, and Skills :
- Demonstrate strong communication skills
- Exhibit leadership in data governance
- Adapt to changing project requirements
- Analyze complex data sets
- Implement innovative data solutions
- Foster collaboration across departments
- Drive continuous improvement initiatives

Posted 1 week ago

Apply

1.0 - 7.0 years

20 - 27 Lacs

Mumbai

Work from Office

We have an opportunity to impact your career and provide an adventure where you can push the limits of what's possible. As a Lead Software Engineer at JPMorgan Chase within the Commercial and Investment Bank's Risk Central Team, you are an integral part of an agile team that works to enhance, build, and deliver trusted market-leading technology products in a secure, stable, and scalable way. As a core technical contributor, you are responsible for conducting critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.

Job Responsibilities :
- Integrate data from various firm sources into the big data warehouse
- Investigate data issues and provide support on data issues
- Develop automation for data extraction
- Design and tune schema for data landed on the platform
- Partner with information modelling teams on firm-wide logical data models
- Serve as the primary subject matter expert (SME) for data in the analytics platform
- Develop data quality rules and controls for data
- Analyze and solve query performance bottlenecks in cloud-based warehouses like Redshift and AWS Glue

Required Qualifications, Capabilities, and Skills :
- Formal training or certification on software engineering concepts and 5+ years applied experience
- Experience in big data technologies - Apache Spark, Hadoop, and analytics
- Hands-on coding experience in Java/Python
- Experience in designing and developing using Redshift
- Strong CS fundamentals, data structures, and algorithms with a good understanding of big data
- Experience with AWS application development, including services such as Lambda, Glue, ECS/EKS
- Excellent communication skills are a must for this position
- Experience with Unix/Linux and shell scripting, Redshift, Hive

Preferred Qualifications, Capabilities, and Skills :
- Good understanding of data modelling challenges with big data
- Good understanding of financial data, especially in front office investment banking, is a major plus
- Ability to code in Apache Spark using Scala is an added advantage

Posted 1 week ago

Apply

1.0 - 6.0 years

11 - 15 Lacs

Bengaluru

Work from Office

You are a strategic thinker passionate about driving solutions in External Reporting. You have found the right team. As an External Reporting Associate in our Finance team, you will spend each day defining, refining, and delivering set goals for our firm. Our external reporting function is responsible for overseeing the financial statements and external reporting. We ensure a robust control environment, apply US GAAP/IFRS in compliance with corporate and regulatory requirements, and understand the uses and reporting of financial statements.

Job Responsibilities :
- Apply up-to-date product, industry, and market knowledge in specialty areas of reporting.
- Consolidate, review, and analyze financial data for accuracy and completeness, performing period-over-period variance analytics.
- Coordinate data collection and business results with various lines of business, Regulatory Controllers, and SEC reporting teams.
- Assist in thoroughly assessing issues, outcomes, and resolutions.
- Communicate financial information clearly to the lines of business and flag potential issues.
- Participate in the production, review, and filing of monthly, quarterly, semi-annual, and annual reports for various regulatory agencies.
- Adhere to proof and control procedures to ensure accurate reconciliation between regulatory filings, SEC filings, and other published financial reports.
- Follow various control procedures and edit checks to ensure the integrity of reported financial results.
- Ensure accurate and complete data submission to the Regulators.
- Interpret and define regulatory and/or SEC requirements and coordinate internal and external policies.
- Establish and manage relationships with the line of business and external regulatory agency constituents through ongoing partnership and dialogue.
- Participate in continuous improvement efforts around data quality review and external reporting improvement projects.

Required Qualifications, Capabilities, and Skills :
- 3+ years in a Finance organization with exposure to accounting, financial statements, and/or regulatory reporting
- Experience in Product Control, Financial Control, or knowledge of SEC reporting/Reg Reporting
- Strong skills in time management, problem solving, and written and oral communication
- Team player, with the ability to work effectively across diverse functions, locations, and businesses
- Strong analytical skills

Preferred Qualifications, Capabilities, and Skills :
- Chartered Accountant / Master's degree in Accounting or Finance preferred
- Project management experience/skills
- Proficient in MS Excel and business intelligence solutions like Alteryx, Tableau, or Python
- Prior experience with US regulatory filings (TIC/FFIEC009/FR2510)

Posted 1 week ago

Apply

7.0 - 8.0 years

5 - 8 Lacs

Hyderabad

Remote

Job Summary : We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques. Key Responsibilities : - Design and implement scalable data models using Snowflake and Erwin Data Modeler. - Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow). - Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models. - Develop efficient SQL queries and stored procedures for data transformation, quality, and validation. - Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models. - Ensure performance tuning, security, and optimization of the Snowflake data warehouse. - Document metadata, data lineage, and business logic behind data structures and flows. - Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance. Must-Have Skills : - Snowflake architecture, schema design, and data warehouse experience. - DBT (Data Build Tool) for data transformation and pipeline development. - Strong expertise in SQL (query optimization, complex joins, window functions, etc.) - Hands-on experience with Erwin Data Modeler (logical and physical modeling). - Experience with GCP (BigQuery, Cloud Composer, Cloud Storage). - Experience in reverse engineering legacy systems like Sailfish or DDMS using DBeaver. Good To Have : - Experience with CI/CD tools and DevOps for data environments. - Familiarity with data governance, security, and privacy practices. - Exposure to Agile methodologies and working in distributed teams. 
- Knowledge of Python for data engineering tasks and orchestration scripts. Soft Skills : - Excellent problem-solving and analytical skills. - Strong communication and stakeholder management. - Self-driven with the ability to work independently in a remote setup.
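The SQL expertise the role calls for (window functions, complex joins, query optimization) can be sketched with Python's built-in sqlite3 module standing in for Snowflake or BigQuery; the orders table below is hypothetical:

```python
import sqlite3

# In-memory database with a hypothetical orders table (illustrative data)
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
con.executemany("INSERT INTO orders VALUES (?, ?)", [
    ("acme", 100.0), ("acme", 250.0), ("globex", 75.0), ("globex", 300.0),
])

# Window function: rank each order within its customer by amount
rows = con.execute("""
    SELECT customer, amount,
           RANK() OVER (PARTITION BY customer ORDER BY amount DESC) AS rnk
    FROM orders
    ORDER BY customer, rnk
""").fetchall()

for customer, amount, rnk in rows:
    print(customer, amount, rnk)
```

The same PARTITION BY / ORDER BY pattern carries over to Snowflake and BigQuery; only the connection layer and dialect details differ. (SQLite supports window functions from version 3.25 onward.)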

Posted 1 week ago

Apply

5.0 - 10.0 years

11 - 16 Lacs

Hyderabad

Work from Office

Responsibilities :

Data Exploration and Insights :
- Conduct continuous data exploration and analysis to identify opportunities for enhancing data matching logic, including fuzzy logic, and improving overall data quality within the SCI solution.
- This includes working with large datasets from various sources, including Excel files and databases.

Data Quality Improvement :
- Perform various analyses specifically aimed at improving data quality within the SCI system.
- This will involve identifying data quality issues, proposing solutions, and implementing improvements.

Weekly Playback and Collaboration :
- Participate in weekly playback sessions, using Jupyter Notebook to demonstrate data insights and analysis.
- Incorporate new explorations and analyses based on feedback from the working group and prioritized tasks.

Project Scaling and Support :
- Contribute to the scaling of the SCI project by supporting data acquisition, cleansing, and validation processes for new markets.
- This includes pre-requisites for batch ingestion and post-batch ingestion analysis and validation of SCI records.

Data Analysis and Validation :
- Perform thorough data analysis and validation of SCI records after batch ingestion.
- Proactively identify insights and implement solutions to improve data quality.

Stakeholder Collaboration :
- Coordinate with business stakeholders to facilitate the manual validation of records flagged for manual intervention.
- Communicate findings and recommendations clearly and effectively.

Technical Requirements :
- 5+ years of experience as a Data Scientist.
- Strong proficiency in Python and SQL.
- Extensive experience using Jupyter Notebook for data analysis and visualization.
- Working knowledge of data matching techniques, including fuzzy logic.
- Experience working with large datasets from various sources (Excel, databases, etc.).
- Solid understanding of data quality principles and methodologies.

Skills :
- SQL
- Machine Learning
- Data Analysis
- Jupyter Notebook
- Data Cleansing
- Fuzzy Logic
- Python
- Data Quality Improvement
- Data Validation
- Data Acquisition
- Communication and Collaboration
- Problem-solving and Analytical skills

Preferred Qualifications :
- Experience with specific data quality tools and techniques.
- Familiarity with cloud computing platforms (e.g., AWS, Azure, GCP).
- Experience with data visualization tools (e.g., Tableau, Power BI).
- Knowledge of statistical modeling and machine learning algorithms.
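Fuzzy matching, as required above, can be sketched with the standard library's difflib.SequenceMatcher; dedicated libraries (e.g., RapidFuzz) offer faster and richer scorers. The customer names below are made up:

```python
from difflib import SequenceMatcher

def fuzzy_match(name, candidates, threshold=0.8):
    """Return candidates whose similarity ratio to `name` meets the threshold."""
    scored = [(c, SequenceMatcher(None, name.lower(), c.lower()).ratio())
              for c in candidates]
    return [(c, round(r, 2)) for c, r in scored if r >= threshold]

# Hypothetical master records, matched against an incoming name with a typo
masters = ["Acme Industries Ltd", "Globex Corporation", "Initech LLC"]
print(fuzzy_match("Acme Industreis Ltd", masters))  # the close variant matches
```

The threshold is the key tuning knob: too low produces false merges of distinct entities, too high leaves true duplicates unmatched, which is why the role pairs automated matching with manual validation of flagged records.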

Posted 1 week ago

Apply

5.0 - 7.0 years

7 - 9 Lacs

Hyderabad

Work from Office

We are seeking a highly skilled Data Engineer with expertise in ontology development and knowledge graph implementation. This role will be pivotal in shaping our data infrastructure and ensuring the accurate representation and integration of complex data sets. You will leverage industry best practices, including the Basic Formal Ontology (BFO) and Common Core Ontologies (CCO), to design, develop, and maintain ontologies, semantic and syntactic data models, and knowledge graphs on the Databricks Data Intelligence Platform that drive data-driven decision-making and innovation within the company. Responsibilities : Ontology Development : - Design and implement ontologies based on BFO and CCO principles, ensuring alignment with business requirements and industry standards. - Collaborate with domain experts to capture and formalize domain knowledge into ontological structures. - Develop and maintain comprehensive ontologies to model various business entities, relationships, and processes. Data Modeling : - Design and implement semantic and syntactic data models that adhere to ontological principles. - Create data models that are scalable, flexible, and adaptable to changing business needs. - Integrate data models with existing data infrastructure and applications. Knowledge Graph Implementation : - Design and build knowledge graphs based on ontologies and data models. - Develop algorithms and tools for knowledge graph population, enrichment, and maintenance. - Utilize knowledge graphs to enable advanced analytics, search, and recommendation systems. Data Quality And Governance : - Ensure the quality, accuracy, and consistency of ontologies, data models, and knowledge graphs. - Define and implement data governance processes and standards for ontology development and maintenance. Collaboration And Communication : - Work closely with data scientists, software engineers, and business stakeholders to understand their data requirements and provide tailored solutions. 
- Communicate complex technical concepts clearly and effectively to diverse audiences. Qualifications : Education : - Bachelor's or Master's degree in Computer Science, Data Science, or a related field. Experience : - 5+ years of experience in data engineering or a related role. - Proven experience in ontology development using BFO and CCO or similar ontological frameworks. - Strong knowledge of semantic web technologies, including RDF, OWL, SPARQL, and SHACL. - Proficiency in Python, SQL, and other programming languages used for data engineering. - Experience with graph databases (e.g., TigerGraph, JanusGraph) and triple stores (e.g., GraphDB, Stardog) is a plus. Desired Skills : - Familiarity with machine learning and natural language processing techniques. - Experience with cloud-based data platforms (e.g., AWS, Azure, GCP). - Experience with Databricks technologies including Spark, Delta Lake, Iceberg, Unity Catalog, UniForm, and Photon. - Strong problem-solving and analytical skills. - Excellent communication and interpersonal skills.
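A knowledge graph ultimately stores subject-predicate-object triples and answers pattern queries over them. Below is a toy in-memory sketch of that idea; a production system would use an RDF triple store queried with SPARQL, and the entities here are hypothetical:

```python
# Minimal in-memory triple store illustrating knowledge-graph pattern matching.
triples = {
    ("acme", "is_a", "supplier"),
    ("acme", "located_in", "mumbai"),
    ("globex", "is_a", "supplier"),
    ("globex", "located_in", "pune"),
}

def match(s=None, p=None, o=None):
    """Return sorted triples matching a pattern; None acts as a wildcard."""
    return sorted(t for t in triples
                  if (s is None or t[0] == s)
                  and (p is None or t[1] == p)
                  and (o is None or t[2] == o))

# All suppliers (analogous to SPARQL: SELECT ?s WHERE { ?s :is_a :supplier })
suppliers = [s for s, _, _ in match(p="is_a", o="supplier")]
print(suppliers)  # ['acme', 'globex']
```

Ontologies such as BFO/CCO constrain which predicates and classes are valid, so a real pipeline would validate triples against the ontology (e.g., with SHACL shapes) before loading them into the graph.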

Posted 1 week ago

Apply

7.0 - 10.0 years

9 - 12 Lacs

Ahmedabad

Work from Office

Location : Remote (India). Employment Type : Contract (Remote). Experience Required : 7+ Years. Job Summary : We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques. Key Responsibilities : - Design and implement scalable data models using Snowflake and Erwin Data Modeler. - Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow). - Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models. - Develop efficient SQL queries and stored procedures for data transformation, quality, and validation. - Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models. - Ensure performance tuning, security, and optimization of the Snowflake data warehouse. - Document metadata, data lineage, and business logic behind data structures and flows. - Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance. Must-Have Skills : - Snowflake architecture, schema design, and data warehouse experience. - DBT (Data Build Tool) for data transformation and pipeline development. - Strong expertise in SQL (query optimization, complex joins, window functions, etc.) - Hands-on experience with Erwin Data Modeler (logical and physical modeling). - Experience with GCP (BigQuery, Cloud Composer, Cloud Storage). - Experience in reverse engineering legacy systems like Sailfish or DDMS using DBeaver. Good To Have : - Experience with CI/CD tools and DevOps for data environments. - Familiarity with data governance, security, and privacy practices. 
- Exposure to Agile methodologies and working in distributed teams. - Knowledge of Python for data engineering tasks and orchestration scripts. Soft Skills : - Excellent problem-solving and analytical skills. - Strong communication and stakeholder management. - Self-driven with the ability to work independently in a remote setup.

Posted 1 week ago

Apply

7.0 - 10.0 years

10 - 14 Lacs

Hyderabad

Work from Office

About the Job : We are seeking a highly skilled and experienced Senior Data Engineer to join our dynamic team. In this pivotal role, you will be instrumental in driving our data engineering initiatives, with a strong emphasis on leveraging Dataiku's capabilities to enhance data processing and analytics. You will be responsible for designing, developing, and optimizing robust data pipelines, ensuring seamless integration of diverse data sources, and maintaining high data quality and accessibility to support our business intelligence and advanced analytics projects. This role requires a unique blend of expertise in traditional data engineering principles, advanced data modeling, and a forward-thinking approach to integrating cutting-edge AI technologies, particularly LLM Mesh for Generative AI applications. If you are passionate about building scalable data solutions and are eager to explore the cutting edge of AI, we encourage you to apply. Key Responsibilities : - Dataiku Leadership : Drive data engineering initiatives with a strong emphasis on leveraging Dataiku capabilities for data preparation, analysis, visualization, and the deployment of data solutions. - Data Pipeline Development : Design, develop, and optimize robust and scalable data pipelines to support various business intelligence and advanced analytics projects. This includes developing and maintaining ETL/ELT processes to automate data extraction, transformation, and loading from diverse sources. - Data Modeling & Architecture : Apply expertise in data modeling techniques to design efficient and scalable database structures, ensuring data integrity and optimal performance. - ETL/ELT Expertise : Implement and manage ETL processes and tools to ensure efficient and reliable data flow, maintaining high data quality and accessibility. - Gen AI Integration : Explore and implement solutions leveraging LLM Mesh for Generative AI applications, contributing to the development of innovative AI-powered features.
- Programming & Scripting : Utilize programming languages such as Python and SQL for data manipulation, analysis, automation, and the development of custom data solutions. - Cloud Platform Deployment : Deploy and manage scalable data solutions on cloud platforms such as AWS or Azure, leveraging their respective services for optimal performance and cost-efficiency. - Data Quality & Governance : Ensure seamless integration of data sources, maintaining high data quality, consistency, and accessibility across all data assets. Implement data governance best practices. - Collaboration & Mentorship : Collaborate closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver impactful solutions. Potentially mentor junior team members. - Performance Optimization : Continuously monitor and optimize the performance of data pipelines and data systems. Required Skills & Experience : - Proficiency in Dataiku : Demonstrable expertise in Dataiku for data preparation, analysis, visualization, and building end-to-end data pipelines and applications. - Expertise in Data Modeling : Strong understanding and practical experience in various data modeling techniques (e.g., dimensional modeling, Kimball, Inmon) to design efficient and scalable database structures. - ETL/ELT Processes & Tools : Extensive experience with ETL/ELT processes and a proven track record of using various ETL tools (e.g., Dataiku's built-in capabilities, Apache Airflow, Talend, SSIS, etc.). - Familiarity with LLM Mesh : Familiarity with LLM Mesh or similar frameworks for Gen AI applications, understanding its concepts and potential for integration. - Programming Languages : Strong proficiency in Python for data manipulation, scripting, and developing data solutions. Solid command of SQL for complex querying, data analysis, and database interactions. 
- Cloud Platforms : Knowledge and hands-on experience with at least one major cloud platform (AWS or Azure) for deploying and managing scalable data solutions (e.g., S3, EC2, Azure Data Lake, Azure Synapse, etc.). - Gen AI Concepts : Basic understanding of Generative AI concepts and their potential applications in data engineering. - Problem-Solving : Excellent analytical and problem-solving skills with a keen eye for detail. - Communication : Strong communication and interpersonal skills to collaborate effectively with cross-functional teams. Bonus Points (Nice to Have) : - Experience with other big data technologies (e.g., Spark, Hadoop, Snowflake). - Familiarity with data governance and data security best practices. - Experience with MLOps principles and tools. - Contributions to open-source projects related to data engineering or AI. Education : Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related quantitative field.
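The extract-transform-load flow described above can be sketched end to end in standard-library Python, with an in-memory CSV standing in for a source system and sqlite3 standing in for the warehouse (a real pipeline would run in Dataiku or an orchestrator like Airflow); the data is illustrative:

```python
import csv, io, sqlite3

# Extract: read raw records (an in-memory CSV stands in for a source system)
raw = io.StringIO("id,amount,currency\n1, 100 ,inr\n2,,inr\n3, 250 ,INR\n")
records = list(csv.DictReader(raw))

# Transform: trim whitespace, normalize currency, drop rows failing a quality rule
cleaned = [
    {"id": int(r["id"]), "amount": float(r["amount"].strip()),
     "currency": r["currency"].upper()}
    for r in records
    if r["amount"].strip()  # quality rule: amount must be present
]

# Load: write the conformed rows into a warehouse table (sqlite3 stands in)
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE fact_orders (id INTEGER, amount REAL, currency TEXT)")
con.executemany("INSERT INTO fact_orders VALUES (:id, :amount, :currency)", cleaned)

print(con.execute("SELECT COUNT(*), SUM(amount) FROM fact_orders").fetchone())  # (2, 350.0)
```

In an ELT variant the raw rows would be loaded first and the cleanup expressed as SQL transformations inside the warehouse instead.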

Posted 1 week ago

Apply

4.0 - 7.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Senior Data Engineer with a deep focus on data quality, validation frameworks, and reliability engineering. This role will be instrumental in ensuring the accuracy, integrity, and trustworthiness of data assets across our cloud-native infrastructure. The ideal candidate combines expert-level Python programming with practical experience in data pipeline engineering, API integration, and managing cloud-native workloads on AWS and Kubernetes.

Roles and Responsibilities :
- Design, develop, and deploy automated data validation and quality frameworks using Python.
- Build scalable and fault-tolerant data pipelines that support quality checks across data ingestion, transformation, and delivery.
- Integrate with REST APIs to validate and enrich datasets across distributed systems.
- Deploy and manage validation workflows using AWS services (EKS, EMR, EC2) and Kubernetes clusters.
- Collaborate with data engineers, analysts, and DevOps to embed quality checks into CI/CD and ETL pipelines.
- Develop monitoring and alerting systems for real-time detection of data anomalies and inconsistencies.
- Write clean, modular, and reusable Python code for automated testing, validation, and reporting.
- Lead root cause analysis for data quality incidents and design long-term solutions.
- Maintain detailed technical documentation of data validation strategies, test cases, and architecture.
- Promote data quality best practices and evangelize a culture of data reliability within the engineering teams.

Required Skills :
- Experience with data quality platforms such as Great Expectations, Collibra Data Quality, or similar tools.
- Proficiency in Docker and container lifecycle management.
- Familiarity with serverless compute environments (e.g., AWS Lambda, Azure Functions), Python, PySpark.
- Relevant certifications in AWS, Kubernetes, or data quality technologies.
- Prior experience working in big data ecosystems and real-time data environments.
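Expectation-based validation, as popularized by tools like Great Expectations, amounts to running declarative checks over a batch of records and reporting failures. A minimal pure-Python sketch of the pattern (this is not the Great Expectations API, and the records are hypothetical):

```python
# Minimal data-validation sketch in the spirit of expectation-based tools.

def expect_not_null(rows, column):
    """Fail if any row has a null or empty value in the column."""
    failed = [r for r in rows if r.get(column) in (None, "")]
    return {"expectation": f"{column} not null",
            "success": not failed, "failed_rows": failed}

def expect_between(rows, column, low, high):
    """Fail if any row's value falls outside [low, high]."""
    failed = [r for r in rows if not (low <= r[column] <= high)]
    return {"expectation": f"{column} in [{low}, {high}]",
            "success": not failed, "failed_rows": failed}

# Hypothetical batch of records to validate
rows = [
    {"id": 1, "age": 34},
    {"id": 2, "age": 171},   # out of range: should fail the range check
    {"id": 3, "age": 28},
]

results = [expect_not_null(rows, "id"), expect_between(rows, "age", 0, 120)]
for r in results:
    print(r["expectation"], "PASS" if r["success"] else "FAIL")
```

Returning failed rows alongside a pass/fail flag is what makes the pattern useful for the alerting and root-cause-analysis duties above: the report pinpoints which records broke which rule.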

Posted 1 week ago

Apply

3.0 - 5.0 years

3 - 6 Lacs

Vapi

Work from Office

The BA/SBA_MDM_Central Master_SSC Officer is a key role within the organization, responsible for managing and maintaining the master data set, including creation, updates, and deletion. This role provides quality assurance of imported data, working with quality assurance analysts if necessary. The Officer will also commission and decommission data sets, manage and resolve data quality issues, and work to improve data reliability, efficiency, and quality.

Posted 1 week ago

Apply

0.0 - 3.0 years

10 - 20 Lacs

Bengaluru

Work from Office

Hi, greetings from ExxonMobil! We are excited to share an opportunity with you. ExxonMobil is organizing scheduled in-person interviews at Gurugram on 2nd and 3rd Aug 2025. Work Location: Bengaluru (last date to apply is 25th July 2025). Copy and paste the link below into your browser to submit your application for the open position: https://career4.successfactors.com/sfcareer/jobreqcareerpvt?jobId=81091&company=exxonmobilP&st=5BB041BCF26E9345E9049F1709275B10CD656553 Note: shortlisted candidates will receive an interview invitation letter from the recruiting team.

What role you will play in our team
Based in Bangalore, KA, you will coordinate the collection of affiliate Environmental Performance Indicators (EPI), understand internal reporting requirements, and complete reports (EPI, monthly and quarterly greenhouse gas (GHG) stewardships, monthly flaring report).

What you will do
- Coordinate collection of affiliate Environmental Performance Indicators (EPI)
- QA/QC affiliate data
- Understand internal reporting requirements and complete reports (EPI, monthly and quarterly GHG stewardships, monthly flaring report)
- Calculate key performance indicators of environmental data (flaring, CEMS, GHG, etc.) for assets
- Interface with data providers to collect emissions data and perform necessary analysis; respond to data provider queries
- Interface with ExxonMobil Information Technology (EMIT): initiate tickets with EMIT for technical issues related to the data management infrastructure (database, servers, integration, analytics), validate technical solutions to the system, and collect and communicate user feedback
- Support Business Line DAG analysts with stewardship processes
- Deep dive into affiliate EPI trends and outliers
- Respond to ad hoc requests from affiliates and Global Operations & Sustainability contacts

About you - Skills and Qualifications
- B.Tech or M.Tech in Energy/Environmental/Petroleum Engineering, or MBA, or Master's degree in Sustainability with a minimum 6 CGPA
- Liaising with various business units, relevant stakeholders, and external suppliers to gather key information
- Proven track record within Sustainability Reporting frameworks and ESG indices is a plus
- Experience in implementing sustainability reporting requirements and data reporting in a large company
- Minimum 6 months of experience working on environmental metrics, data management, and analysis
- Working experience with IT systems (advanced MS Excel, SSAS data cubes, and statistical applications like Python/R, SAS, MS-SQL)
- Experience with data visualization applications, e.g., Tableau, Power BI, Spotfire
- Working experience with United States state and federal regulatory requirements to complete various emissions reports

Preferred Qualifications/Experience
Sustainability Reporting frameworks; ESG indices knowledge; implementing sustainability reporting requirements and data reporting; experience with North America provincial and federal regulatory requirements; knowledge related to GHG and other air emissions, water, waste, climate change, and other key environmental issues faced by the business
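The KPI calculations this role mentions (flaring, GHG, etc.) can be sketched with a simple aggregation. The dataset, column names, and the specific intensity metric below are hypothetical, purely to illustrate the shape of such an analysis:

```python
import pandas as pd

# Hypothetical monthly site-level data; column names are illustrative only.
data = pd.DataFrame({
    "site": ["A", "A", "B", "B"],
    "flared_gas_mscf": [120.0, 80.0, 200.0, 150.0],
    "production_boe": [10_000, 9_500, 15_000, 14_000],
})

# Flaring intensity per site: total flared gas per unit of production.
kpi = (
    data.groupby("site")
        .agg(flared=("flared_gas_mscf", "sum"),
             prod=("production_boe", "sum"))
)
kpi["flaring_intensity"] = kpi["flared"] / kpi["prod"]
print(kpi["flaring_intensity"].round(5))
```

Outlier detection on such per-site series (the "deep dive into EPI trends" duty) would then compare each month's value against the site's historical distribution.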

Posted 1 week ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Your Impact:
- Good logical and analytical skills.
- Clear verbal and written communication, with the ability to present and explain content to the team, users, and stakeholders.
- Experience in the pre-sales and sales domain would be an advantage.

What the role offers:
- Excellent knowledge of the Enterprise Content Management and Governance domain (data discovery/analysis/classification and management).
- Experience in sensitive information management or PII discovery on unstructured data (GDPR).
- Strong experience in TRIM/HPRM/HP Records Manager/OpenText Content Manager deployment, customization, and upgrade.
- Experience in data governance, metadata management, and data quality frameworks using TRIM/HPRM/Records Manager/OpenText Content Manager.
- Strong experience leading the end-to-end design, architecture, and implementation of ECM (Enterprise Content Management) solutions.
- Strong experience defining and implementing master data governance processes, including data quality, metadata management, and business ownership.
- Experience managing extremely large record sets and hundreds of users across global sites.
- Experience with Enterprise Content Management cloud solutions and migration.
- Experience with TRIM/Records Manager/Content Manager integration with third-party applications like SharePoint, Iron Mountain, O365, etc.
- Strong knowledge of Content Manager security, audit configurations, and workflow.
- Hands-on experience with SQL database queries.
- Microsoft C#.NET (programming language C#): web services development, web app development, Windows Scheduler development; Content Manager/Records Manager with the .NET SDK (used in the above-mentioned web services and Windows Scheduler); custom add-ins development.
- Troubleshoot problems as necessary.

What you need to succeed:
- Experience with capture, imaging, and scanning products and technologies.
- Hands-on experience with ECM products like OpenText xECM and Content Server.
- Cloud certification.
- Knowledge of operating systems (Windows/Unix) and basic networking.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As a member of the Contact Optimization team, you will play a crucial role in enhancing and acquiring marketing contact data, ensuring data governance, and finding practical solutions to data quality and flow issues. Your strong communication skills and collaborative nature, combined with a data-driven mindset, will enable you to continuously improve contact data quality. By working with internal and external stakeholders, you will define, document, optimize, and operationalize data quality processes. As the primary escalation point for data issues, your ability to analyze and interpret data will provide valuable insights into AGM marketing priorities, allowing you to develop strategies that ensure optimal results.

Responsibilities:
- Drive and manage end-to-end marketing data enrichment activities, while adhering to and advocating for legal regulations
- Identify, analyze, select, recommend, onboard, and manage 3rd-party WWM data vendors
- Conduct regular check-ins and monitor vendor performance to ensure data processing practices are followed and the quality of work meets company standards
- Collaborate cross-functionally to resolve system and/or vendor discrepancies and ensure data accuracy, consistency, and availability across all systems
- Analyze data to identify volume gaps, expansion opportunities, access risk, and growth opportunities
- Evaluate the health of system data, standardizing or enhancing it as necessary to ensure marketability of contact data
- Track, measure, and communicate the impact of completed and ongoing data enrichment initiatives
- Drive new enrichment initiatives to increase data quality and net-new prospects
- Support initiatives to further define, align, and recommend optimized, globally standardized enrichment and acquisition policies and processes
- Drive awareness of and alignment to global/regional processes by conducting regular trainings and assisting with the maintenance of Data Enrichment documentation/wiki

Minimum Qualifications:
- 5+ years of experience in a marketing operations role involving databases, data manipulation, and data mapping
- Excellent written and verbal communication, critical thinking abilities, and effective cross-functional collaboration skills
- Strong understanding of database systems, technologies, integrations, and mapping logic
- Self-starter with creativity, flexibility, adaptability, and resilience in a dynamic environment
- Results-oriented individual with strong interpersonal skills, capable of taking ownership and driving deliverables
- Experience in managing projects internally and externally
- Strong problem-solving skills with attention to detail

Preferred Qualifications:
- Proficiency in Microsoft Excel & PowerPoint
- Project management skills
- Ability to deliver on KPIs and measurable success criteria
- Experience managing data enrichment and acquisition vendors
- Proficiency in problem-solving and communicating with individual stakeholders and teams
- Business proficiency in English
- Salesforce & Marketo knowledge a plus

About Autodesk: Autodesk is dedicated to helping innovators turn their ideas into reality, transforming how things are made and what can be made. The company values a culture of belonging and equity in the workplace, where everyone can thrive and contribute to building a better future. Autodeskers are encouraged to be their authentic selves and engage in meaningful work that shapes the world and the future. Join Autodesk to be part of a team that creates amazing things every day.

Posted 1 week ago

Apply

15.0 - 19.0 years

0 Lacs

Karnataka

On-site

Job Description: Data Management Lead
Experience Required: 15+ Years
Location: Bangalore
Industry: Financial Services & Others

Role Overview
We are seeking a seasoned Data Management Lead with 15+ years of experience to define, implement, and optimize the organization's data management strategy. This leadership role focuses on hands-on data usage controls implementation, data maturity assessments, architecture design, and managing technical vs. business data elements. The ideal candidate will possess a blend of hands-on technical expertise, strategic vision, and stakeholder management skills, enabling the organization to maximize the value of its data assets while ensuring compliance, governance, and quality.

Key Responsibilities

1. Data Strategy, Maturity Assessment, and Architecture
- Develop and execute a comprehensive data management strategy aligned with organizational objectives.
- Conduct data maturity assessments to evaluate current capabilities and design a roadmap to the target state.
- Design and maintain current vs. future state data architectures to ensure scalability, efficiency, and alignment with business goals.
- Manage and differentiate technical vs. business data elements, ensuring alignment across teams and systems.

2. Data Usage Controls Implementation (Hands-On)
- Implement and manage data usage control frameworks to monitor, protect, and govern sensitive data.
- Lead the hands-on configuration and operationalization of tools for data classification, access control, and retention policies.
- Define and enforce data usage policies for internal and external stakeholders, ensuring compliance with organizational and regulatory standards.
- Collaborate with security teams to integrate data usage controls into the broader data security framework.

3. Data Governance and Quality
- Establish and enforce data governance policies, frameworks, and standards to ensure accuracy, consistency, and security.
- Lead the definition and management of Critical Data Elements (CDEs), including ownership and lifecycle management.
- Develop and track data quality metrics, conducting regular audits to ensure continuous improvement.
- Monitor data lineage and establish robust documentation for auditability and compliance.

4. System Integration and Tool Implementation
- Oversee the integration and management of Master Data Management (MDM) tools, ensuring seamless data consistency across systems.
- Lead data migration and transformation initiatives, ensuring alignment with business requirements.
- Configure and optimize data governance tools (e.g., Microsoft Purview, Collibra, Informatica) for metadata management, lineage tracking, and quality control.
- Work closely with IT teams to ensure the implementation of scalable and secure data infrastructure.

5. Stakeholder Engagement and Leadership
- Act as a trusted advisor to senior leadership, providing insights and recommendations on data strategy, governance, and usage control implementations.
- Foster strong relationships with internal stakeholders (e.g., business units, IT teams) and external vendors.
- Drive organizational alignment on data-related priorities and foster a culture of data-driven decision-making.

6. Team Leadership and Mentorship
- Lead, mentor, and inspire a team of data professionals to deliver high-impact outcomes.
- Identify skill gaps and provide training opportunities to ensure the team remains ahead of industry trends and challenges.
- Promote collaboration and knowledge sharing across teams to enhance overall data management capabilities.

7. Risk, Compliance, and Continuous Improvement
- Ensure compliance with data privacy and security regulations and other applicable laws.
- Implement risk mitigation strategies to address potential data-related issues and vulnerabilities.
- Drive continuous improvement initiatives to refine data management processes and adapt to evolving business needs.
- Identify opportunities to leverage emerging technologies (e.g., AI/ML) for data governance, quality improvement, and efficiency.

Qualifications & Experience
- 15+ years of experience in data management, governance, and strategy.
- Proven expertise in implementing data usage control programs and tools with a hands-on approach.
- Strong knowledge of data governance frameworks, tools, and technologies (e.g., Microsoft Purview, Informatica, Collibra).
- Hands-on experience with data classification, lineage tracking, and retention policy enforcement.
- Expertise in conducting data maturity assessments and developing roadmaps for future state architectures.
- Familiarity with cloud-based platforms (e.g., Azure, AWS; GCP preferred) and data management tools.
- Experience managing large-scale global data programs, including data migration and transformation.

Key Skills
- Strong analytical and problem-solving abilities.
- Advanced technical proficiency in data usage controls, governance, and quality management.
- Strategic thinking coupled with a hands-on approach to execution.
- Exceptional communication and stakeholder management skills.
- Knowledge of programming and scripting languages like SQL, Python, or R (good to have).

Preferred Certifications
- Certified Data Management Professional (CDMP).
- Cloud certifications (e.g., AWS, Azure, GCP).
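The "develop and track data quality metrics" duty above can be made concrete with a small sketch: scoring a dataset's Critical Data Elements for completeness and key uniqueness. The customer table and column names are hypothetical; real programs would feed such scores into a governance tool's dashboard:

```python
import pandas as pd

# Illustrative customer master data; CDE names are hypothetical.
df = pd.DataFrame({
    "customer_id": [101, 102, 102, 104],
    "email": ["a@x.com", None, "c@x.com", "d@x.com"],
    "country": ["IN", "IN", None, "US"],
})

def quality_metrics(frame: pd.DataFrame, key: str) -> dict[str, float]:
    """Per-dataset scores in [0, 1]; higher is better."""
    # Completeness: share of non-null cells across all columns
    completeness = 1 - frame.isna().mean().mean()
    # Uniqueness: distinct key values per row (1.0 means no duplicates)
    uniqueness = frame[key].nunique() / len(frame)
    return {"completeness": round(completeness, 3),
            "uniqueness": round(uniqueness, 3)}

print(quality_metrics(df, "customer_id"))
```

Tracking these scores per audit cycle turns "continuous improvement" into a measurable trend rather than a slogan.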

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Coimbatore, Tamil Nadu

On-site

Team: Technology
Work Location: Coimbatore

Who are we?
Value Health Inc. is one of the fastest-growing Life Sciences Consulting, Digital Transformation, and Productized Services companies, working with Fortune 500 companies across the globe. Our offshore center is at Coimbatore, and we are rapidly expanding to other parts of India. Value Health Inc. has rich domain and consulting experience in the Healthcare and Life Sciences space globally, working across state-of-the-art technologies.

What does the Talend Admin do?
As a Talend Administrator, you will oversee the daily operations and management of the Talend environment. Your role will involve ensuring system stability, optimizing performance, and maintaining security. Key responsibilities include installing, configuring, monitoring, troubleshooting, and managing user access to ensure seamless operation of the Talend platform.

What does your day look like as a Talend Admin at Value Health Inc?
- Working knowledge of the Talend ETL tool, including Talend Administration Center (TAC)
- Setting up user roles/rights in the Administration Center
- Managing project authorizations
- Managing licenses and notifications, and checking for updates
- Executing jobs, routes, and services
- Managing services and policies
- Handling migration and addressing related issues (Git, etc.)
- Recent project implementation experience with Talend in a Big Data environment

Qualifications
- Bachelor's or master's degree in computer science, information technology, or a related field.
- 3 years of hands-on experience in Talend administration, with strong expertise in Talend Administration Center (TAC).
- Hands-on experience managing Talend environments for data integration, ETL, and data quality projects.
- Preferred: exceptional problem-solving and analytical skills.
- Regular day or afternoon shifts (as per business requirements).
- Ability to commute/relocate to Coimbatore. This position involves working from the office only. It may require occasional extension of work or weekend work to accommodate client requests.

Why is Value Health Inc. the right place for you?
- Dynamic growth environment: thrive in a rapidly expanding company where every day presents new opportunities for professional development and personal growth.
- Unleash creativity: enjoy the freedom to think innovatively and experiment with tasks, liberated from rigid guidelines, fostering a culture that values and rewards outside-the-box thinking.
- Impactful decision-making: take charge of pivotal decisions that directly shape the success of some of the world's most crucial companies, contributing to their growth and prosperity.
- Continuous learning hub: immerse yourself in a wealth of learning opportunities, growing your skill set, exploring diverse topics of interest, and actively sharing your knowledge within a collaborative and supportive community.

Do you like being part of such a team? Joining Value Health Inc. means embracing a team where growth is not just encouraged, but inevitable. We foster a community where your contributions matter, and impactful growth is a shared journey. If you envision a workplace where your potential meets purpose, we invite you to be a part of our customer- and employee-friendly culture. Your journey towards professional fulfilment begins here! Know more about us: https://valuehealthai.com/

Posted 1 week ago

Apply