
3681 Data Quality Jobs - Page 30

JobPe aggregates listings for easy access; you apply directly on the original job portal.

0.0 - 4.0 years

5 - 9 Lacs

Chennai

Work from Office

The primary expectation for this role as a Linguist on the linguistics team is proficiency in Portuguese, enabling you to effectively manage, develop, and optimize linguistic resources. You will foster this language and develop it for a multitude of products delivered to customers, building and maintaining it to Lightcast standards and helping develop further features. To fill this role we are looking for a dynamic, multilingual person who will quickly learn the ins and outs of the role and become an active part of a multicultural team.

Major Responsibilities: Analyze and improve the data quality of multilingual text classifiers. Translate taxonomies such as Skills, Titles, and Occupations. Annotate data used for model training and validation.

Education and Experience: Bachelor's degree in Linguistics, Data Analytics, Engineering, Computer Science, Statistics, Artificial Intelligence, NLP, or similar. Strong linguistics knowledge.

Skills/Abilities: Understanding of syntax and structural analysis of languages. Microsoft Excel experience (including VLOOKUPs, data cleanup, and functions). Experience with data analysis using tools such as Excel. Knowledge of RegEx is preferred.

Lightcast is a global leader in labor market insights, headquartered in Moscow, Idaho, with offices in the United Kingdom, Europe, and India. We work with partners across six continents to help drive economic prosperity and mobility by providing the insights needed to build and develop our people, our institutions and companies, and our communities.
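The posting lists RegEx alongside multilingual data cleanup. A minimal sketch of that kind of work in Python; the Portuguese sample labels and the normalization rules are invented for illustration:

```python
import re

def normalize_label(label: str) -> str:
    """Normalize a raw taxonomy label: trim, collapse whitespace,
    and strip stray leading bullets or dashes left over from scraping."""
    label = label.strip()
    label = re.sub(r"\s+", " ", label)             # collapse runs of whitespace
    label = re.sub(r"^[\-\u2022]+\s*", "", label)  # drop leading bullets/dashes
    return label

raw = ["  Engenheiro de   Dados ", "- Analista de Dados"]
print([normalize_label(s) for s in raw])
# ['Engenheiro de Dados', 'Analista de Dados']
```

A real taxonomy pipeline would layer many more rules (casing, diacritics, deduplication), but the pattern of small composable regex passes stays the same.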

Posted 1 week ago

Apply

1.0 - 7.0 years

12 - 16 Lacs

Hyderabad

Work from Office

A Day in the Life. Careers that Change Lives: At Medtronic, we push the limits of technology to make tomorrow better than today, which makes it an exciting and rewarding place to work. We value what makes you unique. Be a part of a company that thinks differently to solve problems, make progress, and deliver meaningful innovations. As a Data Engineer II, you will be part of our data engineering team responsible for developing, deploying, monitoring, and supporting the data mart platform. In addition, you will be responsible for creating tools and automating operational tasks to integrate the data platform with external systems. Your entrepreneurial mindset and technical skills will be used to create solutions that meet business needs and optimize the customer experience, directly impacting the organization and affecting the lives of millions. We believe that when people from diverse cultures, genders, and points of view come together, innovation is the result and everyone wins. Medtronic walks the walk, creating an inclusive culture where you can thrive.

Responsibilities for the Data Engineer II role include, but are not limited to, the following: Work effectively within geographically dispersed, cross-functional teams during all phases of the product development process. Be responsive, flexible, and self-motivated, able to succeed within an open, collaborative peer environment. Participate in reviews and code inspections, and support the development of required documentation. Be agile and effectively navigate changing project priorities. Work independently under limited supervision. Set up proactive monitoring and alerting. Troubleshoot production issues.

Qualifications. Must have (minimum requirements; please be sure these are evident on your resume): Overall 4-7 years of IT experience with a Bachelor's degree in Computer Engineering, Software Engineering, Computer Science, Electrical Engineering, or a related technical field. Minimum 3 years of relevant experience in data engineering. Minimum 2 years of working experience in PySpark and other data processing tools such as Hive and Sqoop. Minimum 1 year of experience in AWS and AWS-native tools (S3, Glue, Lambda, EMR, Athena). Minimum 1 year of hands-on experience with programming languages such as Python. Strong expertise in writing SQL queries. Experience with source control systems (Git/GitHub). Strong problem-solving skills. Experience writing unit tests and developing data quality frameworks. Strong written and verbal communication and presentation skills.

Nice to have: Previous healthcare industry experience. Experience working with CI/CD tools, preferably Azure Pipelines and Terraform. AWS certifications (AWS Developer / AWS Data Engineer). Working experience in a reporting tool such as Power BI.

Benefits & Compensation: Medtronic offers a competitive salary and flexible benefits package. A commitment to our employees' lives is at the core of our values. We recognize their contributions, and they share in the success they help to create. We offer a wide range of benefits, resources, and competitive compensation plans designed to support you at every career and life stage.

We lead global healthcare technology and boldly attack the most challenging health problems facing humanity by searching out and finding solutions. Our Mission, to alleviate pain, restore health, and extend life, unites a global team of 95,000+ passionate people. We are engineers at heart, putting ambitious ideas to work to generate real solutions for real people. From the R&D lab to the factory floor to the conference room, every one of us experiments, creates, builds, improves, and solves. We have the talent, diverse perspectives, and guts to engineer the extraordinary. Learn more about our business, mission, and our commitment to diversity here.
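The posting asks for experience writing unit tests and building data quality frameworks. A minimal sketch of one such check in plain Python; the null-rate rule and the 5% threshold are illustrative assumptions, not Medtronic's actual framework:

```python
def null_rate(rows, column):
    """Fraction of rows where `column` is missing or None."""
    if not rows:
        return 0.0
    missing = sum(1 for r in rows if r.get(column) is None)
    return missing / len(rows)

def check_null_rate(rows, column, threshold=0.05):
    """Data-quality check: pass only if the null rate stays at or below the threshold."""
    rate = null_rate(rows, column)
    return rate <= threshold, rate

rows = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": None}]
ok, rate = check_null_rate(rows, "amount", threshold=0.05)
print(ok, rate)  # False 0.5
```

A unit test for this check would simply assert the pass/fail outcome on small fixture datasets, which is the shape most data quality frameworks build on.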

Posted 1 week ago

Apply

6.0 - 11.0 years

25 - 27 Lacs

Noida

Work from Office

We are seeking a Tech Operations Lead for our Technology - Business Management Office group, intended to provide decision support and analytics primarily focused on IT Asset Management. This position supports business decisions by providing accurate information on the hardware and software assets deployed and used by the organization, ensuring all assets are tracked to manage financial, legal, and compliance risks.

Responsibilities: Perform end-to-end lifecycle management of hardware and software assets, ensuring compliance with licensing terms and internal policies. Plan, monitor, and record software licenses and/or hardware assets in asset management tools to make sure they comply with vendor contracts. Develop and enforce asset tagging, tracking, and data reconciliation procedures while maintaining an accurate inventory of all hardware assets in the asset management tool. Design and deliver periodic and ad-hoc reports on asset utilization. Generate weekly non-compliance reconciliation reports, publish monthly AMC and SLA reports, and manage daily machine and material records. Ensure proper hardware provisioning, deployment, maintenance, relocation, and disposal aligned with company standards and lifecycle policies. Interface with other support organizations to ensure effective use of the CMDB and Configuration Management System. Maintain and recommend improvements to facilitate effective use and integrity of the CMDB. Make sure all changes to the CIs and the CMS are controlled, audited, and reported, and that the CMS is up to date. Ensure that CI owners maintain adequate Configuration Management process disciplines and systems for the CIs they own. Define and enhance the scheme for identifying hardware- and software-related assets and CIs, including versioning and dependencies, across the asset management tools, attributes, the contract management library, and the CMDB. Drive cost optimization strategies and identify opportunities for savings through effective license reuse, consolidation, and vendor negotiations. Onboard new software vendors for BAU governance by collaborating with Procurement and Line of Business Operations teams to create a baseline inventory of entitlements and deployments. Manage the lifecycle of hardware and software models in the DML, from introduction to retirement. Ensure data quality, audits of data, and interfaces between the tools, and provide reporting on asset management configuration items. Gather data and report the effectiveness of IT asset management processes using pre-defined KPIs/metrics. Assist stakeholders with solutions to business needs for hardware and software cascades and technology charge-backs. Create process guidelines, documentation, and procedures to mature the Ameriprise IT asset management area.

Experience: 7+ years of experience in Hardware Asset Management and Software Asset Management (SAM), including standards, purchasing, and lifecycle practices. Experience with license management tools such as Flexera FNMS and ServiceNow SAM and HAM Pro is highly desirable. Configuration Management experience with document control, source code management, and defect management tools. Experience working in a multi-site environment.

Preferred Knowledge: Knowledge of IT asset management tools such as ServiceNow, Flexera, Aspera, iTunes (discovery agents), etc. SAM tool operational knowledge and certification is preferred. Strong knowledge of Excel, Access, and reporting tools is required. Strong written and verbal communication skills with attention to detail. Independent problem-solving ability and the capacity to handle complex analysis. Ability to manage multiple tasks and projects. Sound business knowledge (preferably of the technology business) and the ability to apply it in analysis.

Location: Gurugram/Noida. Timings: 2:00 PM - 10:30 PM. Cab facility provided: Yes.

Ameriprise India LLP has been providing client-based financial solutions to help clients plan and achieve their financial objectives for 125 years. We are a U.S.-based financial planning company headquartered in Minneapolis with a global presence. The firm's focus areas include Asset Management and Advice, Retirement Planning, and Insurance Protection. Be part of an inclusive, collaborative culture that rewards you for your contributions, and work with other talented individuals who share your passion for doing great work. You'll also have plenty of opportunities to make your mark at the office and a difference in your community. So if you're talented, driven, and want to work for a strong, ethical company that cares, take the next step and create a career at Ameriprise India LLP.

Full-Time/Part-Time: Timings 2:00 PM - 10:30 PM. India Business Unit: AWMPO AWMP&S President's Office. Job Family Group: Technology
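The role centres on comparing license entitlements against deployments. A toy sketch of an effective-license-position check in Python; the product names and counts are invented, and real SAM tools such as Flexera track far more (versions, downgrade rights, license metrics):

```python
def license_position(entitlements, deployments):
    """Compare purchased entitlements against deployed installs per product.
    Returns products whose install count exceeds what was purchased."""
    shortfall = {}
    for product, installed in deployments.items():
        owned = entitlements.get(product, 0)
        if installed > owned:
            shortfall[product] = installed - owned  # licenses short
    return shortfall

entitlements = {"Visio": 100, "Acrobat": 50}
deployments = {"Visio": 120, "Acrobat": 40}
print(license_position(entitlements, deployments))  # {'Visio': 20}
```

The weekly non-compliance reconciliation report the posting describes is essentially this comparison run at scale against the CMDB inventory.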

Posted 1 week ago

Apply

3.0 - 7.0 years

9 - 14 Lacs

Pune

Work from Office

Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Senior Consultant Specialist.

In this role, the Senior Data Engineer will be responsible for designing, building, and managing the data infrastructure and data pipeline processes for the bank. This role involves leading a team of data engineers and working closely with data scientists, analysts, and IT professionals to ensure data is accessible, reliable, and secure. The ideal candidate will have a strong background in data engineering, excellent leadership skills, and a thorough understanding of the banking industry's data requirements.

Leadership and Team Management: Lead, mentor, and develop a team of data engineers. Establish best practices for data engineering and ensure team adherence. Coordinate with other IT teams, business units, and stakeholders.

Data Pipeline Integration and Management: Design and implement scalable data architectures to support the bank's data needs. Develop and maintain ETL (Extract, Transform, Load) processes. Ensure the data infrastructure is reliable, scalable, and secure. Oversee the integration of diverse data sources into a cohesive data platform. Ensure data quality, data governance, and compliance with regulatory requirements. Develop and enforce data security policies and procedures. Monitor and optimize data pipeline performance. Troubleshoot and resolve data-related issues promptly. Implement monitoring and alerting systems for data processes.

Requirements. To be successful in this role, you should meet the following (must-have) requirements: 8+ years of experience in data engineering or a related field. Strong experience with database technologies (SQL, NoSQL), data warehousing solutions, and big data technologies (Hadoop, Spark). Proficiency in programming languages such as Python, Java, or Scala. Experience with cloud platforms (AWS, Azure, Google Cloud) and their data services. Deep understanding of ETL processes and data pipeline orchestration tools (Airflow, Apache NiFi). Knowledge of data modeling, data warehousing concepts, and data integration techniques. Strong problem-solving skills and the ability to work under pressure. Excellent communication and interpersonal skills. Experience in the banking or financial services industry. Familiarity with regulatory requirements related to data security and privacy in the banking sector. Certifications in cloud platforms (AWS Certified Data Analytics, Google Professional Data Engineer, etc.). Experience with machine learning and data science frameworks.

Location: Pune and Bangalore
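The role's core loop is ETL. A self-contained sketch of the extract/transform/load stages in plain Python, with in-memory lists standing in for the real sources and warehouse; the data quality rule shown (drop rows with a missing amount) is an illustrative assumption:

```python
def extract(source):
    """Extract: read raw records (here, an in-memory list standing in for a feed)."""
    return list(source)

def transform(records):
    """Transform: keep valid rows and normalise fields to clean types."""
    out = []
    for r in records:
        if r.get("amount") is None:
            continue  # drop rows failing the data-quality rule
        out.append({"account": r["account"].strip(), "amount": float(r["amount"])})
    return out

def load(records, target):
    """Load: append into the target store (a list standing in for a warehouse table)."""
    target.extend(records)
    return len(records)

warehouse = []
raw = [{"account": " ACC1 ", "amount": "10.5"}, {"account": "ACC2", "amount": None}]
loaded = load(transform(extract(raw)), warehouse)
print(loaded, warehouse)  # 1 [{'account': 'ACC1', 'amount': 10.5}]
```

Orchestration tools such as Airflow schedule and monitor exactly this kind of staged pipeline, with each stage as a task in a DAG.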

Posted 1 week ago

Apply

8.0 - 9.0 years

10 - 11 Lacs

Mumbai

Work from Office

You are a strategic thinker, passionate about driving solutions with an innovation mindset. You have found the right team. As a Data Engineer in our STO team, you will promote solutions using data. You will mine, interpret, and clean our data, asking questions, connecting the dots, and uncovering hidden opportunities for realizing the data's full potential. As part of a team of specialists, you will slice and dice data using various methods and create new visions for the future. Our STO team is focused on collaborating and partnering with the business to deliver efficiency and enhance controls via technology adoption and infrastructure support for Global Finance & Business Management India.

Job Responsibilities: Write efficient Python and SQL code to extract, transform, and load (ETL) data from various sources into Databricks. Perform data analysis and computation to derive actionable insights from the data. Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions. Ensure data quality, integrity, and security across all data processes. Develop solutions optimized for performance and scalability. Monitor and troubleshoot data workflows to ensure reliability and efficiency. Document data engineering processes, methodologies, and workflows. Communicate analytical findings to senior leaders through data visualization and storytelling.

Required qualifications, capabilities and skills: Minimum 3+ years of hands-on experience developing, implementing, and maintaining Python automation solutions, including the use of LLMs. Develop, implement, and maintain new and existing solutions. Write efficient Python and SQL code to extract, transform, and load (ETL) data from various sources. Ability to use LLMs to build AI solutions. Perform data analysis and computation to derive actionable insights from the data.
Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions. Ensure data quality, integrity, and security across all data processes. Monitor and troubleshoot data workflows to ensure reliability and efficiency. Document data engineering processes, methodologies, and workflows.

Preferred qualifications, capabilities and skills: Hands-on experience in Python desktop solution development. Knowledge of machine learning and data science concepts will be a plus. Experience with the data visualization tool Tableau will be a plus.
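The responsibilities combine Python and SQL for ETL into Databricks. A minimal stand-in using Python's built-in sqlite3 in place of Databricks, purely to illustrate the extract-cast-insert-aggregate shape; the trade data is invented:

```python
import sqlite3

# In-memory SQLite stands in for the real target (the posting names Databricks).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (id INTEGER PRIMARY KEY, notional REAL)")

raw = [("1", "1000.50"), ("2", "250.00")]        # extract: strings from a source feed
cleaned = [(int(i), float(n)) for i, n in raw]   # transform: cast to proper types

conn.executemany("INSERT INTO trades (id, notional) VALUES (?, ?)", cleaned)  # load
total, = conn.execute("SELECT SUM(notional) FROM trades").fetchone()
print(total)  # 1250.5
```

On the actual platform the same pattern would run through Spark DataFrames and Databricks SQL, but the Python-drives-SQL division of labour is identical.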

Posted 1 week ago

Apply

2.0 - 9.0 years

7 - 8 Lacs

Mumbai

Work from Office

Step into a transformative role as an Operations Analyst in Collateral Operations, where you'll be instrumental in driving portfolio reconciliation and ensuring regulatory adherence across all regulations. Your expertise will span cross-LOB metrics and projects, fostering a culture of continuous improvement that supports business functions across Back Office, Middle Office, and Global teams.

Job Summary: As an Operations Analyst in Collateral Operations, you will be responsible for portfolio reconciliation, regulatory adherence for all regulations, and cross-LOB metrics and projects. Additionally, you will build a culture of continuous improvement supporting the business across Back Office and Middle Office as well as Global teams. You will interact with multiple Operations & Technology teams within the organization to provide business support.

Job Responsibilities: Perform portfolio reconciliation and collateral dispute management. Understand MTM breaks, including data quality, strategic projects, etc. Focus on deep dives and fixing upstream issues to keep breaks to a minimum. Resolve breaks with Middle Offices, Credit Risk, VCG, etc. Check regulatory compliance (CFTC, EMIR, NCMR, etc.). Perform UAT testing. Implement strategic automation projects.

Required qualifications, capabilities and skills: Graduate or Post-Graduate with 2 years' experience in operations. Familiarity with Capital Markets & OTC Derivatives (i.e., Investment Banking), including OTC product, process, and system knowledge. Ability to drive results through a "hands-on" approach. Excellent verbal and written communication skills, adept at communicating with all levels of the business and technical parts of the organization. Skilled in MS Office applications including Outlook, PowerPoint, Excel, Word, and Access. Able to operate effectively in a dynamic environment with tight deadlines, and to prioritize one's own and the team's work to achieve goals. Flexibility to work global hours.
Preferred qualifications, capabilities and skills: Knowledge of CFTC, EMIR, and NCMR regulations preferable. Experience with OTC Confirmations, Collateral Management, and Reconciliation platforms will be an advantage.
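Portfolio reconciliation here means finding MTM breaks between two books. A toy sketch in Python; the trade IDs, values, and the one-unit dispute tolerance are invented for illustration:

```python
def find_breaks(ours, theirs, tolerance=1.0):
    """Flag trades whose mark-to-market differs across the two books
    by more than the dispute tolerance, or that are missing entirely."""
    breaks = {}
    for trade_id, our_mtm in ours.items():
        their_mtm = theirs.get(trade_id)
        if their_mtm is None or abs(our_mtm - their_mtm) > tolerance:
            breaks[trade_id] = (our_mtm, their_mtm)
    return breaks

ours = {"T1": 100.0, "T2": 55.0, "T3": -20.0}
theirs = {"T1": 100.5, "T2": 70.0}
print(find_breaks(ours, theirs))
# {'T2': (55.0, 70.0), 'T3': (-20.0, None)}
```

T1 stays within tolerance; T2 is a valuation break and T3 a population break, the two cases the dispute management process then chases with Middle Office and Credit Risk.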

Posted 1 week ago

Apply

9.0 - 14.0 years

15 - 19 Lacs

Bengaluru

Work from Office

About the Role: We are looking for an Associate Architect with at least 9 years of experience to help scale and modernize Myntra's data platform. The ideal candidate will have a strong background in building scalable data platforms using a combination of open-source technologies and enterprise solutions. The role demands deep technical expertise in data ingestion, processing, serving, and governance, with a strategic mindset to scale the platform 10x to meet the ever-growing data needs across the organization. This is a high-impact role requiring innovation, engineering excellence, and system stability, with an opportunity to contribute to OSS projects and build data products leveraging available data assets.

Key Responsibilities: Design and scale Myntra's data platform to support growing data needs across analytics, ML, and reporting. Architect and optimize streaming data ingestion pipelines using Debezium, Kafka (Confluent), Databricks Spark, and Flink. Lead improvements in data processing and serving layers, leveraging Databricks Spark, Trino, and Superset. Good understanding of open table formats like Delta and Iceberg. Scale data quality frameworks to ensure data accuracy and reliability. Build data lineage tracking solutions for governance, access control, and compliance. Collaborate with engineering, analytics, and business teams to identify opportunities and build/enhance self-serve data platforms. Improve system stability, monitoring, and observability to ensure high availability of the platform. Work with open-source communities and contribute to OSS projects aligned with Myntra's tech stack. Implement cost-efficient, scalable architectures for handling 10B+ daily events in a cloud environment.

Qualifications. Education: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field. Experience: 9+ years of experience building large-scale data platforms. Expertise in big data architectures using Databricks, Trino, and Debezium. Strong experience with streaming platforms, including Confluent Kafka. Experience in data ingestion, storage, processing, and serving in a cloud-based environment. Hands-on experience implementing data quality checks using Great Expectations. Deep understanding of data lineage, metadata management, and governance practices. Strong knowledge of query optimization, cost efficiency, and scaling architectures. Familiarity with OSS contributions and keeping up with industry trends in data engineering.

Soft Skills: Strong analytical and problem-solving skills with a pragmatic approach to technical challenges. Excellent communication and collaboration skills to work effectively with cross-functional teams. Ability to lead large-scale projects in a fast-paced, dynamic environment. Passion for continuous learning, open-source collaboration, and building best-in-class data products.
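The qualifications mention data quality checks with Great Expectations. Since that library's API varies by version and may not be installed here, this sketch expresses the same expectation-style check in plain Python; the event data and latency bounds are invented:

```python
def expect_values_between(rows, column, low, high):
    """A Great-Expectations-style check, sketched in plain Python:
    every value in `column` must fall within [low, high]."""
    failures = [r[column] for r in rows if not (low <= r[column] <= high)]
    return {"success": not failures, "unexpected_values": failures}

events = [{"latency_ms": 12}, {"latency_ms": 48}, {"latency_ms": 5000}]
print(expect_values_between(events, "latency_ms", 0, 1000))
# {'success': False, 'unexpected_values': [5000]}
```

Great Expectations wraps checks like this in suites that run against each batch of data, so a failing expectation can block a pipeline before bad data reaches the serving layer.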

Posted 1 week ago

Apply

9.0 - 14.0 years

11 - 16 Lacs

Bengaluru

Work from Office

About the Role: We are looking for an Associate Architect with at least 9 years of experience to help scale and modernize Myntra's data platform. The ideal candidate will have a strong background in building scalable data platforms using a combination of open-source technologies and enterprise solutions. The role demands deep technical expertise in data ingestion, processing, serving, and governance, with a strategic mindset to scale the platform 10x to meet the ever-growing data needs across the organization. This is a high-impact role requiring innovation, engineering excellence, and system stability, with an opportunity to contribute to OSS projects and build data products leveraging available data assets.

Key Responsibilities: Design and scale Myntra's data platform to support growing data needs across analytics, ML, and reporting. Architect and optimize streaming data ingestion pipelines using Debezium, Kafka (Confluent), Databricks Spark, and Flink. Lead improvements in data processing and serving layers, leveraging Databricks Spark, Trino, and Superset. Good understanding of open table formats like Delta and Iceberg. Scale data quality frameworks to ensure data accuracy and reliability. Build data lineage tracking solutions for governance, access control, and compliance. Collaborate with engineering, analytics, and business teams to identify opportunities and build/enhance self-serve data platforms. Improve system stability, monitoring, and observability to ensure high availability of the platform. Work with open-source communities and contribute to OSS projects aligned with Myntra's tech stack. Implement cost-efficient, scalable architectures for handling 10B+ daily events in a cloud environment.

Qualifications. Education: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field. Experience: 9+ years of experience building large-scale data platforms. Expertise in big data architectures using Databricks, Trino, and Debezium. Strong experience with streaming platforms, including Confluent Kafka. Experience in data ingestion, storage, processing, and serving in a cloud-based environment. Hands-on experience implementing data quality checks using Great Expectations. Deep understanding of data lineage, metadata management, and governance practices. Strong knowledge of query optimization, cost efficiency, and scaling architectures. Familiarity with OSS contributions and keeping up with industry trends in data engineering.

Soft Skills: Strong analytical and problem-solving skills with a pragmatic approach to technical challenges. Excellent communication and collaboration skills to work effectively with cross-functional teams. Ability to lead large-scale projects in a fast-paced, dynamic environment. Passion for continuous learning, open-source collaboration, and building best-in-class data products.

Posted 1 week ago

Apply

9.0 - 14.0 years

30 - 35 Lacs

Bengaluru

Work from Office

About the Role: We are looking for an Associate Architect with at least 9 years of experience to help scale and modernize Myntra's data platform. The ideal candidate will have a strong background in building scalable data platforms using a combination of open-source technologies and enterprise solutions. The role demands deep technical expertise in data ingestion, processing, serving, and governance, with a strategic mindset to scale the platform 10x to meet the ever-growing data needs across the organization. This is a high-impact role requiring innovation, engineering excellence, and system stability, with an opportunity to contribute to OSS projects and build data products leveraging available data assets.

Key Responsibilities: Design and scale Myntra's data platform to support growing data needs across analytics, ML, and reporting. Architect and optimize streaming data ingestion pipelines using Debezium, Kafka (Confluent), Databricks Spark, and Flink. Lead improvements in data processing and serving layers, leveraging Databricks Spark, Trino, and Superset. Good understanding of open table formats like Delta and Iceberg. Scale data quality frameworks to ensure data accuracy and reliability. Build data lineage tracking solutions for governance, access control, and compliance. Collaborate with engineering, analytics, and business teams to identify opportunities and build/enhance self-serve data platforms. Improve system stability, monitoring, and observability to ensure high availability of the platform. Work with open-source communities and contribute to OSS projects aligned with Myntra's tech stack. Implement cost-efficient, scalable architectures for handling 10B+ daily events in a cloud environment.

Education: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field. Experience: 9+ years of experience building large-scale data platforms. Expertise in big data architectures using Databricks, Trino, and Debezium. Strong experience with streaming platforms, including Confluent Kafka. Experience in data ingestion, storage, processing, and serving in a cloud-based environment. Hands-on experience implementing data quality checks using Great Expectations. Deep understanding of data lineage, metadata management, and governance practices. Strong knowledge of query optimization, cost efficiency, and scaling architectures. Familiarity with OSS contributions and keeping up with industry trends in data engineering.

Soft Skills: Strong analytical and problem-solving skills with a pragmatic approach to technical challenges. Excellent communication and collaboration skills to work effectively with cross-functional teams. Ability to lead large-scale projects in a fast-paced, dynamic environment. Passion for continuous learning, open-source collaboration, and building best-in-class data products.

Posted 1 week ago

Apply

4.0 - 9.0 years

9 - 13 Lacs

Hyderabad

Work from Office

Job title: Business Analyst Responsibilities : Analytical Support : Gather all operational and financial data across all centers to provide inputs into the weekly MIS as well as a Monthly Review Meeting. Drive meaningful weekly/monthly reports that will help the regional managers take decisions on their centers' health. Analyse financial data (budgets, income statements, etc.) to understand Oasis Fertility's financial health. Coordinate all operational issues captured at center level and program-manage the closure through cross-functional collaboration. Evaluate operational expenditures (OPEX) and capital expenditures (Capex) against the budget to identify variances. Analyse operational data to identify trends and areas for improvement. Conduct ad-hoc analytics towards a hypothesis and derive insights that will impact business performance. Operational support : Coordinate assimilation of data for calculating doctor payouts and facilitate the final file to finance. Coordinate and assimilate data to calculate incentives for the eligible operations team members. Use key metrics like yearly growth, return on assets (ROA), return on equity (ROE), and earnings per share (EPS) to assess operational performance. Collaborate with the operations and finance teams to ensure alignment between operational and financial goals. Strategic Support : Conduct business studies to understand past, present, and potential future performance. Conduct market research to stay updated on financial trends in the fertility industry. Evaluate the effectiveness of current processes and recommend changes for better efficiency. Develop data-driven recommendations to improve operational efficiency. Prepare financial models to assess the profitability of different business units and potential investment opportunities. Participate in process improvement initiatives and policy development to optimize business functions.
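The OPEX/Capex variance analysis described above reduces to a per-line comparison of actuals against budget. A small sketch; the cost heads, figures, and 5% flag threshold are invented for illustration:

```python
# Budget-variance check per cost head: actual vs budget, flagging overruns.
# Figures, cost heads, and the flag threshold are made-up examples.

def variance_report(budget, actual, threshold_pct=5.0):
    """Return {cost_head: (variance, variance_pct, flagged)} for each budget line."""
    report = {}
    for head, planned in budget.items():
        spent = actual.get(head, 0.0)
        variance = spent - planned
        variance_pct = 100.0 * variance / planned if planned else 0.0
        report[head] = (variance, round(variance_pct, 1), variance_pct > threshold_pct)
    return report

budget = {"opex": 100000.0, "capex": 50000.0}
actual = {"opex": 112000.0, "capex": 48000.0}
print(variance_report(budget, actual))
# opex overruns by 12% and is flagged; capex is under budget and is not
```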

Posted 2 weeks ago

Apply

4.0 - 8.0 years

27 - 42 Lacs

Bengaluru

Work from Office

Job Summary: A BIE responsible for identifying data objects, developing strategies, and analysing data to provide insights and recommendations for product data migrations. SAP MM module, with 7+ years of experience. Responsibilities: As a BIE, must be hands-on with ERP SAP, CRM, SFDC, LSMW and Winshuttle. Should have hands-on experience with material master data, purchase info records, condition records, source list creation, and stock migration from legacy to new systems. Gather business requirements and compliance needs, and document the requirements as per process documents in information models, ensuring end-to-end data model consistency across processes and IT applications. Accountable for the implementation of Data Requirement Specifications based on inputs from various stakeholders such as Business Process Owners, Business Process Experts, Business Information Owners, Markets, Business Group representatives, and Program/Project Managers. Support the implementation of the Services domain roadmap, including implementation of data-related solutions and processes, keeping data quality and compliance controls in mind.

Posted 2 weeks ago

Apply

3.0 - 8.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Project Role : Data Platform Engineer Project Role Description : Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Must have skills : Databricks Unified Data Analytics Platform Good to have skills : NA. Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the data platform's capabilities. You will be actively involved in problem-solving and contributing innovative ideas to improve the overall data architecture, ensuring that the platform meets the evolving needs of the organization and its stakeholders. Roles & Responsibilities:- Expected to perform independently and become an SME.- Required active participation/contribution in team discussions.- Contribute in providing solutions to work related problems.- Engage in continuous learning to stay updated with the latest trends and technologies in data platforms.- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications. Professional & Technical Skills: - Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform.- Strong understanding of data integration techniques and best practices.- Experience with cloud-based data solutions and architectures.- Familiarity with data governance and management frameworks.- Ability to work with large datasets and perform data analysis.
Additional Information:- The candidate should have minimum 3 years of experience in Databricks Unified Data Analytics Platform.- This position is based at our Bengaluru office.- A 15 years full time education is required. Qualification 15 years full time education

Posted 2 weeks ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Navi Mumbai

Work from Office

Project Role : Application Lead Project Role Description : Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills : Microsoft Purview Good to have skills : Collibra Data Governance. Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application development aligns with business objectives, overseeing project timelines, and facilitating communication among stakeholders to drive successful outcomes. You will also engage in problem-solving activities, ensuring that the applications meet the required standards and specifications while fostering a collaborative environment for your team. Roles & Responsibilities:- Expected to be an SME.- Collaborate and manage the team to perform.- Responsible for team decisions.- Engage with multiple teams and contribute on key decisions.- Provide solutions to problems for their immediate team and across multiple teams.- Mentor junior team members to enhance their skills and knowledge.- Continuously assess and improve application development processes to increase efficiency. Professional & Technical Skills: - Must To Have Skills: Proficiency in Microsoft Purview.- Good To Have Skills: Experience with Collibra Data Governance.- Strong understanding of data governance principles and practices.- Experience in application design and architecture.- Familiarity with cloud-based solutions and integration techniques. Additional Information:- The candidate should have minimum 5 years of experience in Microsoft Purview.- This position is based in Mumbai.- A 15 years full time education is required. Qualification 15 years full time education

Posted 2 weeks ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role : Data Modeler Project Role Description : Work with key business representatives, data owners, end users, application designers and data architects to model current and new data. Must have skills : Snowflake Data Warehouse Good to have skills : NA. Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As a Data Modeler, you will engage with key business representatives, data owners, end users, application designers, and data architects to model both current and new data. Your typical day will involve collaborating with various stakeholders to understand their data needs, analyzing existing data structures, and designing effective data models that support business objectives. You will also be responsible for ensuring that the data models are aligned with best practices and meet the requirements of the organization, facilitating smooth data integration and accessibility for users across the company. Roles & Responsibilities:- Expected to be an SME.- Collaborate and manage the team to perform.- Responsible for team decisions.- Engage with multiple teams and contribute on key decisions.- Provide solutions to problems for their immediate team and across multiple teams.- Facilitate training sessions for junior team members to enhance their understanding of data modeling.- Continuously evaluate and improve data modeling processes to increase efficiency and effectiveness. Professional & Technical Skills: - Must To Have Skills: Proficiency in Snowflake Data Warehouse.- Strong understanding of data modeling concepts and techniques.- Experience with ETL processes and data integration.- Familiarity with data governance and data quality principles.- Ability to work with various data visualization tools to present data models effectively.
Additional Information:- The candidate should have minimum 5 years of experience in Snowflake Data Warehouse.- This position is based at our Bengaluru office.- A 15 years full time education is required. Qualification 15 years full time education

Posted 2 weeks ago

Apply

7.0 - 12.0 years

10 - 14 Lacs

Nagpur

Work from Office

Project Role : Application Lead Project Role Description : Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills : SAP FSCM Treasury and Risk Management (TRM) Good to have skills : NA. Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for managing the team and ensuring successful project delivery. Your typical day will involve collaborating with multiple teams, making key decisions, and providing solutions to problems for your immediate team and across multiple teams. Roles & Responsibilities:- Expected to be an SME- Collaborate and manage the team to perform- Responsible for team decisions- Engage with multiple teams and contribute on key decisions- Provide solutions to problems for their immediate team and across multiple teams- Lead the effort to design, build, and configure applications- Act as the primary point of contact for the project- Manage the team and ensure successful project delivery- Collaborate with multiple teams to make key decisions- Provide solutions to problems for the immediate team and across multiple teams Professional & Technical Skills: - Must To Have Skills: Proficiency in SAP FSCM Treasury and Risk Management (TRM)- Strong understanding of statistical analysis and machine learning algorithms- Experience with data visualization tools such as Tableau or Power BI- Hands-on implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity Additional Information:- The candidate should have a minimum of 7.5 years of experience in SAP FSCM Treasury and Risk Management (TRM)- This position is based at our Bengaluru office- A 15 years full-time education is required Qualification 15 years full time education

Posted 2 weeks ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Kolkata

Work from Office

Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Informatica Data Quality Good to have skills : NA. Minimum 3 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop innovative solutions and ensure seamless application functionality. Roles & Responsibilities:- Expected to perform independently and become an SME.- Required active participation/contribution in team discussions.- Contribute in providing solutions to work-related problems.- Develop and implement Informatica Data Quality solutions.- Collaborate with cross-functional teams to analyze and address data quality issues.- Create and maintain documentation for data quality processes.- Participate in data quality improvement initiatives.- Assist in training junior professionals in data quality best practices. Professional & Technical Skills: - Must To Have Skills: Proficiency in Informatica Data Quality.- Strong understanding of data quality principles.- Experience with ETL processes and data integration.- Knowledge of data profiling and cleansing techniques.- Familiarity with data governance and metadata management. Additional Information:- The candidate should have a minimum of 3 years of experience in Informatica Data Quality.- This position is based at our Kolkata office.- A 15 years full-time education is required. Qualification 15 years full time education

Posted 2 weeks ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Project Role : Software Development Engineer Project Role Description : Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work. Must have skills : Informatica MDM Good to have skills : Informatica Administration. Minimum 3 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As a Software Development Engineer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code for multiple clients. Your day will involve collaborating with team members to ensure the successful implementation of enhancements and maintenance tasks, while also focusing on the development of new features to meet client needs. You will be responsible for delivering high-quality code and participating in discussions that drive project success. Roles & Responsibilities:- Expected to perform independently and become an SME.- Required active participation/contribution in team discussions.- Contribute in providing solutions to work related problems.- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.- Conduct code reviews to ensure adherence to best practices and coding standards. Professional & Technical Skills: - Must To Have Skills: Proficiency in Informatica MDM.- Good To Have Skills: Experience with Informatica Administration.- Strong understanding of data integration and data quality processes.- Experience with ETL processes and data warehousing concepts.- Familiarity with database management systems and SQL. Additional Information:- The candidate should have minimum 3 years of experience in Informatica MDM.- This position is based at our Bengaluru office.- A 15 years full time education is required. Qualification 15 years full time education

Posted 2 weeks ago

Apply

5.0 - 15.0 years

7 - 17 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

We are looking for a data scientist who will help us discover the information hidden in vast amounts of data, and help us make smarter decisions to deliver AI/ML based Enterprise Software Products. Develop solutions related to machine learning, natural language processing, deep learning and Generative AI to address business needs. Your primary focus will be on applying language/vision techniques, developing LLM-based applications and building high quality prediction systems. Analyze Data: Collaborate with cross-functional teams to understand data requirements and identify relevant data sources. Analyze and preprocess data to extract valuable insights and ensure data quality. Evaluation and Optimization: Evaluate model performance using appropriate metrics and iterate on solutions to enhance performance and accuracy. Continuously optimize algorithms and models to adapt to evolving business requirements. Documentation and Reporting: Document methodologies, findings, and outcomes in clear and concise reports. Communicate results effectively to technical and non-technical stakeholders. Work experience/background required: Experience building software from the ground up in a corporate or startup environment. Essential skillsets required: 5-15 years' experience in software development. Educational background: strong computer science and math/statistics. Experience with open-source LLMs and the LangChain framework, and designing efficient prompts for LLMs. Proven ability with NLP and text-based extraction techniques. Experience in Generative AI technologies, such as diffusion and/or language models. Excellent understanding of machine learning techniques and algorithms, such as k-NN, Naive Bayes, SVM, Decision Forests, etc. Familiarity with cloud computing platforms such as GCP or AWS. Experience deploying and monitoring models in a cloud environment.
Experience with common data science toolkits, such as NumPy, Pandas, etc. Proficiency in using query languages such as SQL. Good applied statistics skills, such as distributions, statistical testing, regression, etc. Experience working with large data sets along with data modeling, language development, and database technologies. Knowledge of Machine Learning and Deep Learning frameworks (e.g., TensorFlow, Keras, Scikit-Learn, CNTK, or PyTorch), NLP, recommender systems, personalization, segmentation, microservices architecture and API development. Ability to adapt to a fast-paced, dynamic work environment and learn new technologies quickly. Excellent verbal and written communication skills.
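As a small illustration of one of the algorithms listed above, here is a from-scratch k-nearest-neighbours classifier; the points and labels are toy data, and production work would use scikit-learn rather than this sketch:

```python
# Toy k-nearest-neighbours classifier, implemented from scratch for illustration.
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """train: list of (features, label); returns the majority label of the k nearest points."""
    # Sort all training points by Euclidean distance to the query.
    dists = sorted((math.dist(x, query), label) for x, label in train)
    # Vote among the k closest labels.
    top_labels = [label for _, label in dists[:k]]
    return Counter(top_labels).most_common(1)[0][0]

train = [((0.0, 0.0), "a"), ((0.1, 0.2), "a"), ((1.0, 1.0), "b"), ((0.9, 1.1), "b")]
print(knn_predict(train, (0.2, 0.1)))  # → "a": the query sits in the "a" cluster
```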

Posted 2 weeks ago

Apply

3.0 - 4.0 years

5 - 6 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Job Description Experience : 3-4 years (3 yrs relevant) ETL : ADF, Databricks, GitHub, DevOps, SQL, Python Scope Responsible for designing, developing and deploying ETL processes, working with petabytes of data through our big data platform and enabling BI and data science initiatives. Responsibilities Develop ETL processes autonomously using CI/CD and help other team members develop their jobs; Create reusable structures or processes that can be used in different projects, developing code to automate jobs using real-time services to remove dependencies; Create process and dataflow documentation; Act as level 3 in the support and quality of processes, identifying root causes for the L3 tickets; Work on data service transformations and data quality using SQL: analyzing redundant and inconsistent data; Work in Agile methodology, participating in ceremonies during the sprint; Align the deliverables and deadlines with the teams involved in the project, analyzing the impacts on team and project; Participate in the Data Engineering OKR process, define actions and keep them up to date; Share knowledge within the BrewDat team (India and Zones); Work with reusable structures or processes that can be used in different projects; Develop code to automate jobs using real-time services and parameterized coding to remove dependencies. Required Knowledge Concepts : Data Warehouse, ETL/ELT, Cloud, Data Flow, Lakehouse, Agile, Big Data, Spark, Semantic Layer, Security, DataViz Technical : ADF, Databricks, GitHub, DevOps, SQL, Python, PBI Soft Skills : Curiosity; logical/critical thinking; communication; resilience; collaboration
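The "data quality using SQL: analyzing redundant and inconsistent data" duty above can be illustrated with a duplicate-detection query; the table and rows are invented, and an in-memory SQLite database stands in for the real warehouse:

```python
# Finding redundant rows with SQL: group on the candidate key and keep
# groups with more than one row. Data and schema are made-up examples.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, email TEXT)")
conn.executemany(
    "INSERT INTO customers VALUES (?, ?)",
    [(1, "a@x.com"), (2, "b@x.com"), (3, "a@x.com"), (4, "c@x.com")],
)

# Emails that appear more than once are candidates for de-duplication.
dupes = conn.execute(
    "SELECT email, COUNT(*) AS n FROM customers "
    "GROUP BY email HAVING n > 1"
).fetchall()
print(dupes)  # → [('a@x.com', 2)]
```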

Posted 2 weeks ago

Apply

7.0 - 12.0 years

9 - 14 Lacs

Mumbai

Work from Office

MSCI ESG Data collection team is involved in acquisition of ESG data at scale and QA of the collected data, and is responsible for establishing and maintaining the highest level of data quality and standards across all datasets feeding our ESG products. As a Data Collection Transformation Senior Associate, you will be responsible for leading and delivering on several initiatives as part of the ESG transformation agenda, to support the rapidly evolving ESG landscape and its adoption in the financial market. Your Key Responsibilities As a member of MSCI Data Acquisition and Collection team, you are expected to have a strong interest in general Environment, Social, Governance, Climate and policy frameworks around these domains as well as regulatory trends Take active part in projects dealing with electronification of ESG & Climate frameworks and principles into data definitions which can be operationalized for collection Collaborate with Research teams on building data collection templates and with technology teams to translate these into implementable data models Do hands-on research with new data sets by studying company disclosures to help connect research proposals with implementable solutions which are scalable Independently run analysis on data sets (either collected or from third party) to detect trends/patterns (EDA) and propose ways to build anomaly detection on new and existing content Analyze & research the historical data corrections across all ESG & Climate data and propose & implement contextual/thematic QA to detect cases that potentially may not be captured in the current QA framework Codify data definitions with an intent to build NLP-driven data extraction models (leveraging traditional approaches/LLMs) to automate detection and extraction of facts from company disclosures Help design and set up new data collection processes and help with integration of these processes with ongoing data operations Deliver top quality data aligned with MSCI methodology, service level
agreements, and regulatory requirements; Steer improvements to methodology and SOP documents leveraging data and content expertise; Drive process improvements to ensure consistent data quality and efficiency, such as automation of data quality diagnostics by developing a new system/tool that will enable quality assessment of data without manual intervention; Work with internal stakeholders and downstream teams on understanding data requirements, data QC scope and data delivery; Create reports/dashboards which provide quantitative data assessment metrics that justify recommendations: visualization, outlier detection/analysis, data summaries, etc. Share plans, recommendations and summaries with management through conference calls, meetings and presentations with internal/external teams, Research and product. Your skills and experience that will help you excel Analytical skills and strong attention to detail - should have a keen interest in analyzing data and process flows, and be quality focused Exposure to tools such as Python/SQL - demonstrated experience in improving processes/automation through applications of Python/ML/RPA Work exposure with any of the visualization tools such as Power BI would be preferable. Should have very good hands-on skills working with advanced Excel features. Self-starter and self-motivated; should be solutions focused and have the ability to work in unstructured environments Comfortable working in a team environment across hierarchies, functions and geographies Should have experience of working in the financial/technology/business analysis domain Knowledge about equities or financial markets in general.
Exposure to ESG data would be an added advantage Desired Experience 7+ years of full-time professional experience in: data quality and automation related roles; business analysis, analyzing existing processes and reengineering to achieve efficiency and improved quality; exposure to tools such as Pandas/SQL, Power BI etc. would be preferable Financial services experience; good to have exposure to ESG About MSCI What we offer you Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing. Flexible working arrangements, advanced technology, and collaborative workspaces. A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results. A global network of talented colleagues, who inspire, support, and share their expertise to innovate and deliver for our clients. Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro and tailored learning opportunities for ongoing skills development. Multi-directional career paths that offer professional growth and development through new challenges, internal mobility and expanded roles. We actively nurture an environment that builds a sense of inclusion, belonging and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum. At MSCI we are passionate about what we do, and we are inspired by our purpose to power better investment decisions. You'll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry. MSCI Inc.
is an equal opportunity employer committed to diversifying its workforce. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries.
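The outlier/anomaly detection mentioned in the responsibilities (EDA over collected data sets) often begins with a simple z-score screen. A minimal stdlib sketch; the emissions figures and threshold are illustrative, not MSCI's methodology:

```python
# Z-score outlier screen: flag values far from the sample mean.
# The reported figures below are invented for illustration.
import statistics

def zscore_outliers(values, threshold=2.0):
    """Return values lying more than `threshold` standard deviations from the mean."""
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    return [v for v in values if abs(v - mean) / stdev > threshold]

reported_emissions = [10.2, 9.8, 10.5, 9.9, 10.1, 55.0]  # last value looks suspect
print(zscore_outliers(reported_emissions))  # → [55.0]
```

A real QA pipeline would combine screens like this with contextual rules (peer comparisons, year-over-year deltas) before routing cases to analysts.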

Posted 2 weeks ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Hyderabad

Work from Office

YOUR IMPACT Are you passionate about developing mission-critical, high quality software solutions, using cutting-edge technology, in a dynamic environment? We are Compliance Engineering, a global team of more than 300 engineers and scientists who work on the most complex, mission-critical problems. We: build and operate a suite of platforms and applications that prevent, detect, and mitigate regulatory and reputational risk across the firm. have access to the latest technology and to massive amounts of structured and unstructured data. leverage modern frameworks to build responsive and intuitive UX/UI and Big Data applications. Compliance Engineering is looking to fill several big data software engineering roles. Your first deliverable and success criteria will be the deployment, in 2025, of new complex data pipelines and surveillance models to detect inappropriate trading activity. HOW YOU WILL FULFILL YOUR POTENTIAL As a member of our team, you will: partner globally with sponsors, users and engineering colleagues across multiple divisions to create end-to-end solutions, learn from experts, leverage various technologies including: Java, Spark, Hadoop, Flink, MapReduce, HBase, JSON, Protobuf, Presto, Elastic Search, Kafka, Kubernetes, be able to innovate and incubate new ideas, have an opportunity to work on a broad range of problems, including negotiating data contracts, capturing data quality metrics, processing large scale data, building surveillance detection models, be involved in the full life cycle: defining, designing, implementing, testing, deploying, and maintaining software systems across our products. QUALIFICATIONS A successful candidate will possess the following attributes: A Bachelor's or Master's degree in Computer Science, Computer Engineering, or a similar field of study. Expertise in Java, as well as proficiency with databases and data manipulation. Experience in end-to-end solutions, automated testing and SDLC concepts.
The ability (and tenacity) to clearly express ideas and arguments in meetings and on paper. Experience in some of the following is desired and can set you apart from other candidates: developing in large-scale systems, such as MapReduce on Hadoop/HBase, data analysis using tools such as SQL, Spark SQL, Zeppelin/Jupyter, API design, such as to create interconnected services, knowledge of the financial industry and compliance or risk functions, ability to influence stakeholders.
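The MapReduce processing named in the stack above can be sketched in-process: map each record to key/value pairs, shuffle by key, then reduce each group. The trade records here are invented, and real jobs run distributed on Hadoop or Spark:

```python
# In-process sketch of the MapReduce model: map → shuffle → reduce.
# The trade records are made-up examples; this is not a surveillance model.
from collections import defaultdict

trades = ["BUY AAPL", "SELL AAPL", "BUY MSFT", "BUY AAPL"]

# Map: emit a (symbol, 1) pair for each trade record.
mapped = [(line.split()[1], 1) for line in trades]

# Shuffle: group values by key.
groups = defaultdict(list)
for key, value in mapped:
    groups[key].append(value)

# Reduce: sum each group to get per-symbol trade counts.
counts = {key: sum(values) for key, values in groups.items()}
print(counts)  # → {'AAPL': 3, 'MSFT': 1}
```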

Posted 2 weeks ago

Apply

3.0 - 7.0 years

5 - 9 Lacs

Warangal, Hyderabad, Nizamabad

Work from Office

Proficiency in data modeling tools such as ER/Studio, ERwin or similar. Deep understanding of relational database design, normalization/denormalization, and data warehousing principles. Experience with SQL and working knowledge of database platforms like Oracle, SQL Server, PostgreSQL, or Snowflake. Strong knowledge of metadata management, data lineage, and data governance practices. Understanding of data integration, ETL processes, and data quality frameworks. Ability to interpret and translate complex business requirements into scalable data models. Excellent communication and documentation skills to collaborate with cross-functional teams.
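The normalization principle listed above can be shown in miniature by splitting a denormalized orders table into separate customers and orders tables; the schema and rows are invented, with an in-memory SQLite database standing in for a real platform:

```python
# Normalization in miniature: the customer name is stored once in `customers`,
# and a join reassembles the denormalized view. Schema and data are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT NOT NULL
    );
    CREATE TABLE orders (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customers(customer_id),
        amount      REAL
    );
""")
conn.execute("INSERT INTO customers VALUES (1, 'Acme')")
conn.executemany("INSERT INTO orders VALUES (?, 1, ?)", [(10, 99.0), (11, 45.0)])

rows = conn.execute(
    "SELECT o.order_id, c.name, o.amount "
    "FROM orders o JOIN customers c USING (customer_id)"
).fetchall()
print(rows)  # → [(10, 'Acme', 99.0), (11, 'Acme', 45.0)]
```

Denormalizing (folding the name back into each order row) trades this single point of update for faster reads, which is the judgment call the role description refers to.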

Posted 2 weeks ago

Apply

5.0 - 9.0 years

7 - 11 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Job Title: ETL Support Specialist - Informatica - EFR

Job Summary: We are seeking a proactive and detail-oriented ETL Support Specialist with expertise in Informatica to join our Application Management Services (AMS) team. The ideal candidate will monitor ETL processes, provide business-as-usual (BAU) support, and ensure prompt responses to incidents. The role requires strong troubleshooting skills, effective communication, and the ability to learn new technologies quickly.

Key Responsibilities:
- Monitor ETL workflows and processes in Informatica to ensure timely data extraction, transformation, and loading.
- Provide BAU support for ETL operations, responding promptly to incidents and service requests.
- Troubleshoot and resolve issues related to ETL processes, databases, and data integrity.
- Collaborate with cross-functional teams to understand business requirements and translate them into technical specifications.
- Perform regular health checks and performance tuning of ETL jobs and databases.
- Maintain documentation for ETL processes, workflows, and troubleshooting procedures.
- Assist in developing and executing test cases for ETL processes to ensure quality and reliability.
- Participate in knowledge-sharing and training sessions to enhance team capabilities and promote best practices.
- Stay current with trends and technologies in data integration and ETL.

Technical Skills:
- Proficiency in Informatica PowerCenter.
- Strong knowledge of SQL, with experience querying and manipulating data in Oracle and SQL Server databases.
- Familiarity with Autosys for job scheduling and monitoring.
- Understanding of data warehousing concepts and architecture.
- Experience with data quality and data governance practices is a plus.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Relevant experience in ETL development and support, preferably in a data warehousing environment.
- Proven troubleshooting skills with a focus on analysis and resolution of complex issues.
- Excellent communication and interpersonal skills, with the ability to interact effectively with technical and non-technical stakeholders.
- A quick learner with a proactive approach to problem-solving, able to work independently and as part of a team.

Mandatory: 3 days working from the client office, Bangalore (hybrid); rotating shifts (morning, afternoon, night).
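The "regular health checks" this kind of role performs often begin with simple source-to-target reconciliation queries. The sketch below is illustrative only: the table names and the in-memory SQLite database are stand-ins for the real Oracle/SQL Server environment, not anything specified in the posting.

```python
import sqlite3

def row_count_reconciliation(conn, source_table, target_table):
    """Compare row counts between a source (staging) and target table.
    Returns (source_count, target_count, delta)."""
    cur = conn.cursor()
    src = cur.execute(f"SELECT COUNT(*) FROM {source_table}").fetchone()[0]
    tgt = cur.execute(f"SELECT COUNT(*) FROM {target_table}").fetchone()[0]
    return src, tgt, src - tgt

# Demo with an in-memory database standing in for the real warehouse.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_orders (id INTEGER);
    CREATE TABLE dw_orders  (id INTEGER);
    INSERT INTO stg_orders VALUES (1), (2), (3);
    INSERT INTO dw_orders  VALUES (1), (2);
""")
src, tgt, delta = row_count_reconciliation(conn, "stg_orders", "dw_orders")
print(f"source={src} target={tgt} delta={delta}")  # a nonzero delta flags a failed load
```

In practice a check like this would be scheduled (e.g. via Autosys) after each load window, with a nonzero delta raising an incident.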

Posted 2 weeks ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Scala
Good-to-have skills: NoSQL
Minimum experience: 2 year(s)
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and contribute to the overall data architecture, ensuring that the solutions you develop are efficient and scalable. You will also monitor and optimize existing data processes to enhance performance and reliability, making data accessible and actionable for stakeholders across the organization.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Participate actively and contribute in team discussions.
- Contribute solutions to work-related problems.
- Collaborate with data scientists and analysts to understand data needs and deliver appropriate solutions.
- Design and implement robust data pipelines that ensure the integrity and availability of data.

Professional & Technical Skills:
- Must-have: Proficiency in Scala.
- Good to have: Experience with NoSQL.
- Strong understanding of data modeling and database design principles.
- Experience with ETL tools and frameworks.
- Familiarity with cloud-based data solutions and services.

Additional Information:
- The candidate should have a minimum of 2 years of experience in Scala.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
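The extract-transform-load cycle this posting describes can be sketched in a few lines. The example below is a hypothetical Python illustration (the records, field names, and target table are invented for the example; the actual role would use Scala and production data stores):

```python
import sqlite3

# Extract: raw records as they might arrive from an upstream system.
raw = [
    {"id": 1, "name": " Alice ", "amount": "100.5"},
    {"id": 2, "name": "Bob",     "amount": "200"},
    {"id": 2, "name": "Bob",     "amount": "200"},   # duplicate to be dropped
]

# Transform: trim strings, cast types, and de-duplicate on the key.
seen, clean = set(), []
for rec in raw:
    if rec["id"] in seen:
        continue
    seen.add(rec["id"])
    clean.append((rec["id"], rec["name"].strip(), float(rec["amount"])))

# Load: write the cleaned rows to the target table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER PRIMARY KEY, name TEXT, amount REAL)")
conn.executemany("INSERT INTO target VALUES (?, ?, ?)", clean)
loaded = conn.execute("SELECT COUNT(*) FROM target").fetchone()[0]
print(f"{loaded} rows loaded")
```

The same three-stage shape scales up: in a production pipeline the extract step reads from source systems, the transform step enforces the data-quality rules, and the load step writes to the warehouse.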

Posted 2 weeks ago

Apply

7.0 - 12.0 years

4 - 8 Lacs

Gurugram

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Data Modeling Techniques and Methodologies
Good-to-have skills: Data Engineering, Cloud Data Migration
Minimum experience: 7.5 year(s)
Educational Qualification: BE or BTech

Summary: As a Data Engineer, you will be responsible for designing, developing, and maintaining data solutions for data generation, collection, and processing. Your role involves creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across systems.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Develop and maintain data solutions for data generation, collection, and processing.
- Create data pipelines to ensure efficient data flow.
- Implement ETL processes for data migration and deployment.

Professional & Technical Skills:
- Must-have: Proficiency in Data Modeling Techniques and Methodologies.
- Good to have: Experience with Data Engineering.
- Strong understanding of data modeling techniques and methodologies.
- Experience in cloud data migration.
- Knowledge of data engineering principles.
- Proficient in ETL processes.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Data Modeling Techniques and Methodologies.
- This position is based at our Gurugram office.
- A BE or BTech degree is required.
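One of the dimensional-modeling techniques a role like this typically draws on is the star schema: a central fact table joined to denormalized dimension tables via surrogate keys. A minimal sketch follows (table and column names are invented for illustration; SQLite stands in for a real warehouse):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# A fact table keyed into two dimension tables: the "star" shape.
conn.executescript("""
    CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, full_date TEXT);
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales (
        date_key    INTEGER REFERENCES dim_date(date_key),
        product_key INTEGER REFERENCES dim_product(product_key),
        quantity    INTEGER,
        revenue     REAL
    );
    INSERT INTO dim_date    VALUES (20240101, '2024-01-01');
    INSERT INTO dim_product VALUES (1, 'Widget');
    INSERT INTO fact_sales  VALUES (20240101, 1, 3, 29.97);
""")
# A typical analytic query joins the fact table to its dimensions.
row = conn.execute("""
    SELECT d.full_date, p.name, f.revenue
    FROM fact_sales f
    JOIN dim_date d    ON f.date_key = d.date_key
    JOIN dim_product p ON f.product_key = p.product_key
""").fetchone()
print(row)
```

Keeping measures in the fact table and descriptive attributes in the dimensions is what makes such schemas easy to query and to load incrementally.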

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies