15.0 - 20.0 years
20 - 25 Lacs
Noida
Work from Office
Experience: 15+ years. Requires knowledge of Talend, plus familiarity with other data tools such as Databricks or Snowflake.

The Senior Talend Developer/Architect leads the design and development of the INSEAD data infrastructure for the CRM ecosystem, develops Talend jobs and flows, and mentors the other 3-4 Talend developers. The role drives data pipeline architecture and ensures data integrity, performance, and scalability on the Talend platform. It is a key part of the HARMONIA project team while that engagement is active, and also belongs to the Harmonia Data Quality project and Data Operations scrum teams, contributing to activities such as data modelling & design, architecture, integration, and technology strategy. The position holder must organize and plan his/her work to ensure a frictionless flow of information between Digital Solutions and the relevant business department, and will collaborate closely with cross-functional teams to deliver high-quality data solutions that support strategic business objectives.

Job requirements: Design, develop, and deploy scalable ETL/ELT solutions using Talend (e.g. Data Stewardship, Management Console, Studio). Architect end-to-end data integration workflows. Establish development best practices, reusable components, and job templates to optimize performance and maintainability. Deliver robust data architecture and tested, validated, deployable jobs/flows to production environments, following Talend best practices and the JIRA development framework. Translate business requirements into efficient and scalable Talend solutions, providing developer input and feedback on those requirements where necessary by actively leading brainstorming sessions arranged by the project manager.
Work closely with the Manager of Data Operations and Quality, the project manager, business analysts, data analysts, developers, and other subject matter experts to align technical solutions with operational needs. Ensure alignment with data governance, security, and compliance standards. Ensure new developments follow the standard styles already used in the current Talend jobs & flows developed by the current integrator and INSEAD teams. Actively participate in project activities and ensure the SDLC process is followed. Participate in the implementation and execution of data cleansing, normalization, deduplication, and transformation projects. Conduct performance tuning, error handling, monitoring, and troubleshooting of Talend jobs and environments. Contribute to sprint planning and agile ceremonies with the Harmonia project team and the Data Operations team. Document technical solutions, data flows, and design decisions to support operational transparency. Stay current with Talend product enhancements and industry trends, recommending upgrades or changes where appropriate. No budget responsibility.
Posted 2 days ago
4.0 - 9.0 years
6 - 11 Lacs
Bengaluru
Work from Office
PL/SQL Developer. Total experience: 4+ years; relevant experience: 4+ years.

Roles and responsibilities: Database management systems: strong experience with PL/SQL and SQL on platforms such as Redshift, PostgreSQL, Oracle, and SQL Server. Data warehousing: experience with data warehousing concepts, architecture, and best practices. Data quality and data profiling: good knowledge of data profiling and data cleansing techniques. Communication: excellent communication skills to work effectively with team members and the business team.

Mandatory skills: PL/SQL, SQL. Desired/secondary skills: Informatica. Domain: Energy & Utilities. Work location: Hyderabad. Work mode: WFO/Hybrid. Years of experience: 5.
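The data profiling this posting asks for can be illustrated in a few lines. The sketch below is a generic, hypothetical Python stand-in (column names and sample rows are invented); in practice a PL/SQL developer would usually push such completeness checks down into SQL.

```python
# Hypothetical sketch: column-level completeness profiling, the kind of
# data-quality check typically implemented in SQL or PL/SQL.

def profile_completeness(rows, columns):
    """Return the fraction of non-null values per column."""
    totals = {c: 0 for c in columns}
    for row in rows:
        for c in columns:
            if row.get(c) is not None:
                totals[c] += 1
    n = len(rows)
    return {c: (totals[c] / n if n else 0.0) for c in columns}

# Invented sample data for illustration only.
sample = [
    {"id": 1, "email": "a@x.com"},
    {"id": 2, "email": None},
    {"id": 3, "email": "c@x.com"},
]
print(profile_completeness(sample, ["id", "email"]))
```

A profiling pass like this is usually the first step before cleansing: columns with low completeness become candidates for validation or enrichment rules.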
Posted 2 days ago
4.0 - 9.0 years
12 - 13 Lacs
Hyderabad
Work from Office
Challenging. Meaningful. Life-changing. Those aren't words that are usually associated with a job. But working at Bristol Myers Squibb is anything but usual. Here, uniquely interesting work happens every day, in every department. From optimizing a production line to the latest breakthroughs in cell therapy, this is work that transforms the lives of patients, and the careers of those who do it. You'll get the chance to grow and thrive through opportunities uncommon in scale and scope, alongside high-achieving teams. Take your career farther than you thought possible.

Responsibilities will include, but are not limited to:

Data review: Understand and review the protocol along with relevant study-specific data review documents. Provide input to data review documents such as the Protocol Data Review Plan and Data Quality Management Plan. Review clinical data listings and prioritize critical data review. Manage and facilitate resolution of data discrepancies. Perform data cleaning as per the defined Clean Patient Group. Freeze and lock CRFs/fields. Coordinate with the Data Management Lead on study deliverables.

External data: Track data loads and address discrepancies. Complete review of loaded external data and prioritize complex external data review (e.g. Blinded Independent Committee Review, biomarker, SAE). Coordinate with external data vendors for resolution of data discrepancies as applicable.

Documentation: File appropriate documents in the eTMF as per the eTMF master plan.

Training and mentorship: Provide training and mentoring to junior CDM staff.

Bachelor's degree required; life sciences, pharmacy, or relevant fields preferred. 4 years of experience in clinical data review tasks. Able to work on clinical data review tasks and to work collaboratively on multi-disciplinary project teams.
Strong knowledge of the clinical drug development process, FDA/ICH guidelines, and industry-standard data management practices. Strong knowledge and experience of EDC systems (Medidata RAVE preferred); demonstrated Microsoft Office skills. Strong oral and written communication skills. Travel: yes, 5-10% (industry conferences, investigator meetings, regulatory inspections as needed). With a single vision as inspiring as "Transforming patients' lives through science", every BMS employee plays an integral role in work that goes far beyond ordinary. Each of us is empowered to apply our individual talents and unique perspectives in a supportive culture, promoting global participation in clinical trials, while our shared values of passion, innovation, urgency, accountability, inclusion, and integrity bring out the highest potential of each of our colleagues. BMS has an occupancy structure that determines where an employee is required to conduct their work. This structure includes site-essential, site-by-design, field-based, and remote-by-design jobs. The occupancy type you are assigned is determined by the nature and responsibilities of your role.
Posted 2 days ago
5.0 - 10.0 years
11 - 16 Lacs
Pune
Work from Office
for the company's business-critical systems and processes, utilizing cloud-first platforms and services. Our engineering culture will empower you to make effective decisions, work collaboratively, and take accountability for engineering projects at the core of the company and at the leading edge of the latest industry technology trends. The team is seeking a Cloud DevOps Engineer who will play a critical role in how we design, build, and deliver reusable managed cloud stacks.

Responsibilities:
1) Build and extend full-stack infrastructure automations for enterprise and business applications in the enterprise cloud environment, leveraging Python, NodeJS, CI/CD, and GitOps methodologies.
2) Configure and implement the public cloud environment for enterprise and business applications; refactor and migrate business applications to adopt the Platform strategy and increase engineering productivity.
3) Troubleshoot and resolve issues in the AWS environment (24x7 on-call rotation).
4) Design and build automation solutions to reduce manual effort and increase team efficiency.

Desired skills & experience:
1) 5+ years of hands-on experience with AWS & DevOps.
2) 3+ years of hands-on experience in software development and automation using Python, NodeJS, TypeScript, REST APIs, GitOps, etc.
3) Working knowledge of setting up cloud infrastructure using Terraform, CloudFormation templates, and/or CDK.
4) Expert knowledge of CI/CD tools such as CodeBuild, CodeDeploy, and Jenkins.
5) Expert knowledge of observability and logging tools/services (e.g. Splunk, Catchpoint, Dynatrace).
6) Good knowledge of developing automations using Python, PowerShell, TypeScript, etc.
7) Hands-on experience with AWS cloud services (e.g. EC2, RDS, DynamoDB, S3, API Gateway, Lambda, CloudWatch).
8) Hands-on experience with Linux is a plus.
9) API development experience with Java and/or NodeJS is a plus.
10) Attention to detail and dedication to quality and automation.
11) An SRE mindset.
12) Excellent communication and documentation skills.

Education: Bachelor's degree or college diploma in Computer Science, Information Systems, or equivalent experience. AWS certification is good to have. Business title: DevOps Engineer III. Primary skills: AWS & DevOps; JavaScript/TypeScript/Python. Secondary skills: automation using Python, NodeJS, TypeScript, REST APIs, GitOps, etc.
Posted 2 days ago
10.0 - 15.0 years
3 - 4 Lacs
Bengaluru
Work from Office
About Us: Capco, a Wipro company, is a global technology and management consulting firm. Awarded Consultancy of the Year at the British Bank Awards, Capco has also been ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence across 32 cities globally, we support 100+ clients across the banking, financial services, and energy sectors, and we are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO? You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers, and other key players in the industry. These projects will transform the financial services industry.

MAKE AN IMPACT: Innovative thinking, delivery excellence, and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.

#BEYOURSELFATWORK: Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.

CAREER ADVANCEMENT: With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION: We believe that diversity of people and perspective gives us a competitive advantage.

Job Title: Manager - AWS QA Engineer. Location: Bangalore/Pune. Experience: 10+ years.

We are seeking a Senior QA Data Engineer with expertise in Python, PySpark, SQL, and data testing automation; experience with AWS Cloud, Jenkins, AWS Glue, Lambda, and S3 is good to have. The ideal candidate will ensure the quality and reliability of data pipelines through robust testing and collaboration with DevOps teams on CI/CD pipelines.

Key Responsibilities: Develop and maintain automated test cases for ETL pipelines and data workflows using Python/PySpark. Use SQL and NoSQL databases proficiently for data validation and analysis. Validate Jenkins CI/CD pipelines for automated testing and deployment.
Willingness to scale up on testing AWS services, including Glue, Lambda, RDS, S3, and Iceberg, ensuring seamless integration and scalability. Collaborate closely with DevOps to enhance deployment processes and pipeline efficiency. Design data validation frameworks and monitor pipeline performance for data quality assurance. Validate NoSQL and relational database integrations in data workflows. Document test strategies, results, and best practices for cross-team alignment. Preferred: AWS certifications (e.g., AWS Developer Associate). Experience with tools like Airflow or Spark. If you are keen to join us, you will be part of an organization that values your contributions, recognizes your potential, and provides ample opportunities for growth. For more information, visit www.capco.com. Follow us on Twitter, Facebook, LinkedIn, and YouTube.
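The data-validation framework work described above can be sketched simply. The following is a hedged, generic Python illustration (record schema, rule names, and sample batch are invented, not Capco's or any client's actual pipeline):

```python
# Hypothetical sketch: validating an ETL output batch before promotion.
# The rules (non-null unique id, non-negative amount) are illustrative.

def validate_batch(records):
    """Return a list of (row_index, error) tuples for failing records."""
    errors = []
    seen_ids = set()
    for i, rec in enumerate(records):
        if rec.get("id") is None:
            errors.append((i, "missing id"))
        elif rec["id"] in seen_ids:
            errors.append((i, "duplicate id"))
        else:
            seen_ids.add(rec["id"])
        if not (0 <= rec.get("amount", -1)):
            errors.append((i, "negative or missing amount"))
    return errors

# Invented sample batch for illustration.
batch = [
    {"id": 1, "amount": 10.0},
    {"id": 1, "amount": 5.0},   # duplicate id
    {"id": 2, "amount": -3.0},  # negative amount
]
print(validate_batch(batch))
```

In a CI/CD setup of the kind the posting mentions, checks like these would typically run as automated test cases (e.g. under pytest) gating the deployment stage.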
Posted 2 days ago
7.0 - 12.0 years
15 - 17 Lacs
Hyderabad, Pune, Chennai
Work from Office
Job Title: Senior Data Analyst - Data Governance

About Us: Capco, a Wipro company, is a global technology and management consulting firm. Awarded Consultancy of the Year at the British Bank Awards, Capco has also been ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence across 32 cities globally, we support 100+ clients across the banking, financial services, and energy sectors, and we are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO? You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers, and other key players in the industry. These projects will transform the financial services industry.

MAKE AN IMPACT: Innovative thinking, delivery excellence, and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.

#BEYOURSELFATWORK: Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.

CAREER ADVANCEMENT: With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION: We believe that diversity of people and perspective gives us a competitive advantage.
Job Description: Mandatory skills: Data Analysis + Data Governance, Collibra, Python. Location: Bangalore, Pune, Chennai, Hyderabad. Notice: immediate to 30 days. Level: M3/M4 (7+ years).

1. Design, document, and advise on implementing Data Discovery and Data Control Fix for a premier global bank in the wealth and personal banking segment, extensively using Collibra.
2. Update and maintain process metadata, along with critical data elements, the preferred business glossary, and the respective technical metadata for critical global services from various regions in the DG.
3. Understand the functions of the various enterprise information management applications; map the data lineage of data elements along with flow types and consumption status.
4. Work with the data quality team to establish proactive data quality controls by implementing a strong and scalable governance process.
5. Create and promote the use of common data assets, such as business glossaries, reference data, data inventories, data models, and data catalogs, within the organization, thereby improving awareness of data governance.
6. Monitor adherence to data policies and standards, governing potential policy deviations and escalating where necessary.
7. Establish data quality standards, procedures, and protocols to ensure the accuracy, completeness, and consistency of data across the organization.
8. Assist in the implementation of data classification processes to protect sensitive information appropriately.
Posted 2 days ago
6.0 - 11.0 years
9 - 14 Lacs
Bengaluru
Work from Office
Founded in 1976, CGI is among the world's largest independent IT and business consulting services firms. With 94,000 consultants and professionals globally, CGI delivers an end-to-end portfolio of capabilities, from strategic IT and business consulting to systems integration, managed IT and business process services, and intellectual property solutions. CGI works with clients through a local relationship model complemented by a global delivery network that helps clients digitally transform their organizations and accelerate results. CGI's Fiscal 2024 reported revenue is CA$14.68 billion, and CGI shares are listed on the TSX (GIB.A) and the NYSE (GIB). Learn more at cgi.com.

Job Title: Senior Software Engineer / LA. Position: SAP BODS (BusinessObjects Data Services) Consultant. Experience: 6+ years. Category: Software Development. Job location: Bangalore, Chennai, Hyderabad, and Pune. Position ID: J0325-1751. Work mode: Hybrid. Employment type: Full Time / Permanent. Qualification: Bachelor's or Master's degree in Computer Science, Engineering, or a related field.

As an SAP BODS consultant, you will be responsible for designing, developing, and optimizing ETL processes using SAP BusinessObjects Data Services (BODS). Your role includes data extraction, transformation, and loading (ETL); ensuring data quality; integrating BODS with SAP and non-SAP systems; troubleshooting performance issues; and supporting data migration projects while collaborating with business and functional teams.

Responsibilities and must-have skills: Design, develop, and implement ETL solutions using SAP BusinessObjects Data Services (BODS). Extract, transform, and load (ETL) data from various source systems into SAP HANA, SAP BW, or other databases. Optimize and troubleshoot ETL jobs to improve performance and data accuracy. Develop and maintain data mappings, workflows, and transformations within BODS.
Ensure data quality and consistency by implementing cleansing, validation, and enrichment processes. Integrate BODS with SAP and non-SAP systems, including ECC, S/4HANA, BW, and third-party databases. Monitor and maintain job scheduling and automation using the SAP Data Services Management Console. Support data migration projects, ensuring a smooth transition from legacy systems to SAP. Collaborate with business and functional teams to understand data requirements and provide solutions. Document technical designs, processes, and best practices for knowledge sharing and future enhancements.

Good-to-have skills: Data migration: hands-on experience migrating data from legacy systems to SAP. SAP Information Steward: familiarity with data governance and metadata management. Job scheduling & automation: experience with BODS job scheduling and monitoring using the Data Services Management Console. Performance optimization: skills in improving ETL job performance and troubleshooting failures.

CGI is an equal opportunity employer. In addition, CGI is committed to providing accommodation for people with disabilities in accordance with provincial legislation. Please let us know if you require reasonable accommodation due to a disability during any aspect of the recruitment process and we will work with you to address your needs. #LI-GB9

Skills: English, Client Management, Engineer, ETL, SAP Business Objects
Posted 2 days ago
7.0 - 12.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Company Profile: Job Title: SAP Materials Management. Position: SSE. Experience: 7+ years. Category: Software Development / Engineering. Main location: Hyderabad. Position ID: J0425-1808. Employment type: Full Time.

Job description (SAP MM, SAP Materials Management): Lead and manage a team of SAP specialists focused on material, structure, and data operations. Oversee the creation and maintenance of material master data and structures in SAP. Ensure data integrity and accuracy across all SAP material and structural configurations. Monitor and analyze data quality metrics, identifying and addressing any issues. Provide technical guidance and support to the team on SAP material management and data operations. Develop and maintain documentation for material, structure, and data operations processes and procedures.

Behavioural competencies: Proven experience of delivering process efficiencies and improvements. Clear and fluent English (both verbal and written). Ability to build and maintain efficient working relationships with remote teams. Demonstrated ability to take ownership of and accountability for relevant products and services. Ability to plan, prioritise, and complete your own work, whilst remaining a team player. Willingness to engage with and work in other technologies.

Note: This job description is a general outline of the responsibilities and qualifications typically associated with the SAP Materials Management role. Actual duties and qualifications may vary based on the specific needs of the organization.

Skills: Analytical Thinking.
Posted 2 days ago
5.0 - 10.0 years
25 - 30 Lacs
Hyderabad
Work from Office
5+ years of professional experience in Data Quality. 5+ years of experience in Trillium. Must have professional experience in Python coding. Must be strong in SQL. Trillium migration experience is desired. Skills: Data Engineering, Data Quality Management, Python, SQL.
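Trillium-style data-quality work centres on rule-based standardization and cleansing. The sketch below is a minimal, hypothetical Python stand-in for that kind of rule engine (the abbreviation rules and sample address are invented; Trillium itself is a proprietary platform and works quite differently):

```python
# Hypothetical sketch: rule-based address standardization, loosely the
# kind of cleansing logic a Trillium-to-Python migration might rebuild.
import re

def standardize_address(addr):
    """Uppercase, collapse whitespace, expand a few common abbreviations."""
    rules = {r"\bST\b": "STREET", r"\bRD\b": "ROAD", r"\bAPT\b": "APARTMENT"}
    out = re.sub(r"\s+", " ", addr.strip().upper())
    for pat, repl in rules.items():
        out = re.sub(pat, repl, out)
    return out

print(standardize_address("  12  main st,  apt 4"))
```

Standardizing records into a canonical form like this is usually a prerequisite for the matching and deduplication stages of a data-quality pipeline.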
Posted 2 days ago
2.0 - 4.0 years
8 - 12 Lacs
Bengaluru
Work from Office
OPENTEXT - THE INFORMATION COMPANY. OpenText is a global leader in information management, where innovation, creativity, and collaboration are the key components of our corporate culture. As a member of our team, you will have the opportunity to partner with the most highly regarded companies in the world, tackle complex issues, and contribute to projects that shape the future of digital transformation.

AI-First. Future-Driven. Human-Centered. At OpenText, AI is at the heart of everything we do, powering innovation, transforming work, and empowering digital knowledge workers. We're hiring talent that AI can't replace to help us shape the future of information management. Join us.

Your impact: As a Python Developer in the Debricked data science team, you will work on enhancing data intake processes and optimizing data pipelines. You will apply many different approaches, depending on the needs of the product and the challenges you encounter. In some cases we use AI/LLM techniques, and we expect the number of such cases to increase. Your contributions will directly impact Debricked's scope and quality and will help ensure future commercial growth of the product.

What the role offers: As a Python Developer, you will: Innovative data solutions: develop and optimize data pipelines that improve the efficiency, accuracy, and automation of the Debricked SCA tool's data intake processes. Collaborative environment: work closely with engineers and product managers from Sweden and India to create impactful, data-driven solutions. Continuous improvement: play an essential role in maintaining and improving the data quality that powers Debricked's analysis, improving the product's competitiveness. Skill development: collaborate across teams and leverage OpenText's resources (including an educational budget) to develop your expertise in software engineering, data science, and AI, expanding your skill set in both traditional and cutting-edge technologies.
What you need to succeed: 2-4 years of experience in Python development, with a focus on optimizing data processes and improving data quality. Proficiency in Python and related tools and libraries such as Jupyter, Pandas, and NumPy. A degree in Computer Science or a related discipline. An interest in application security. Skills in Go, Java, LLMs (specifically Gemini), GCP, Kubernetes, MySQL, Elastic, and Neo4j are an asset. A strong understanding of how to manage and improve data quality in automated systems and pipelines. Ability to address complex data challenges and develop solutions to optimize systems. Comfortable working in a distributed team, collaborating across different time zones.

One last thing: OpenText is more than just a corporation; it's a global community where trust is foundational, the bar is raised, and outcomes are owned. Join us on our mission to drive positive change through privacy, technology, and collaboration. At OpenText, we don't just have a culture; we have character. Choose us because you want to be part of a company that embraces innovation and empowers its employees to make a difference. OpenText's efforts to build an inclusive work environment go beyond simply complying with applicable laws. Our Employment Equity and Diversity Policy provides direction on maintaining a working environment that is inclusive of everyone, regardless of culture, national origin, race, color, gender, gender identification, sexual orientation, family status, age, veteran status, disability, religion, or other basis protected by applicable laws. Our proactive approach fosters collaboration, innovation, and personal growth, enriching OpenText's vibrant workplace.
Posted 2 days ago
0.0 - 1.0 years
0 Lacs
Tiruchirapalli
Work from Office
About the Company: We are the manufacturing brains behind successful companies. Frigate is an on-demand cloud manufacturing startup that helps OEMs/ODMs and product/device companies identify the right manufacturing vendors and leverage their existing capacities to get their products manufactured.

Responsibilities: Collect data from primary and secondary sources. Maintain and update databases and data systems. Clean, validate, and correct data to ensure accuracy and completeness. Analyze large data sets to identify trends, patterns, and insights. Use statistical techniques to interpret data and generate reports. Create data visualizations, dashboards, and presentations for stakeholders. Collaborate with business teams to understand data needs and requirements. Provide actionable insights to support decision-making processes. Automate data collection and reporting processes where possible. Use tools such as SQL, Excel, Python, R, Tableau, Power BI, etc., to improve data quality and support decision-making. Perform additional tasks as required.
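The clean-validate-summarize loop in the responsibilities above can be sketched in a few lines. This is an illustrative, stdlib-only Python example (the record fields and sample data are invented; real work of this kind would typically use pandas or SQL):

```python
# Illustrative sketch: drop incomplete rows and exact duplicates, then
# summarize, mirroring the clean/validate/analyze steps listed above.
from collections import Counter

def clean(records):
    """Drop rows with missing values and exact duplicates, keeping order."""
    seen = set()
    out = []
    for r in records:
        key = tuple(sorted(r.items()))
        if None in r.values() or key in seen:
            continue
        seen.add(key)
        out.append(r)
    return out

# Invented sample data for illustration.
raw = [
    {"region": "South", "sales": 120},
    {"region": "South", "sales": 120},   # exact duplicate
    {"region": "North", "sales": None},  # incomplete row
    {"region": "North", "sales": 95},
]
cleaned = clean(raw)
print(Counter(r["region"] for r in cleaned))
```

Only after a pass like this do aggregate trends and dashboard figures become trustworthy, which is why cleaning sits first in the responsibility list.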
Posted 2 days ago
4.0 - 8.0 years
25 - 30 Lacs
Kolkata
Work from Office
Join our Team. About this opportunity: We are looking for a Senior Data Scientist to be responsible for developing AI/ML methods, processes, and systems to extract knowledge or insights and drive the future of artificial intelligence. You will perform data science tasks and advanced statistical analysis, turning data into actionable insights for the business, identifying trends, and measuring performance to address business problems. You will collaborate with business and process owners to understand business issues, and with engineers to implement and deploy scalable solutions where applicable.

What you will do: Apply and/or develop statistical modeling techniques (such as deep neural networks, Bayesian models, generative AI, forecasting), optimization methods, and other ML techniques. Synthesize problems into data questions. Convert data into practical insights. Analyze and investigate data quality for identified data and communicate findings to the Product Owner, Business Analyst, and other relevant stakeholders. Collect data, explore it, and perform analysis to extract information suitable to the business need. Identify gaps in the data and aggregate data as per the business need. Design and perform data analysis, data validation, data transformation, and feature extraction. Decide the approach for addressing business needs with data & analytics. Understand end-user needs and work accordingly, identifying new features in the data. Develop data science and engineering infrastructure & tools. Derive key metrics suitable for the use case and present the analysis to key stakeholders.

You will bring: 4-8 years of relevant industry experience. A Bachelor's or higher degree in Computer Science, Statistics, Mathematics, or related disciplines. Ability to analyse data, communicate outcomes to key stakeholders, and explore new data sources. Excellent coding skills in Python, R, SQL, etc. Understanding of cloud services. Evidence of academic training in statistics.
Deep/broad knowledge of machine learning, statistics, optimization, or a related field. A genuine curiosity about new and applied technology and software engineering, coupled with a high degree of business understanding. Experience in large-scale product development is a plus. Experience in generative AI and large language models. Why join Ericsson? What happens once you apply? Primary country and city: India (IN) || Kolkata. Req ID: 768998
Posted 2 days ago
5.0 - 10.0 years
25 - 30 Lacs
Bengaluru
Work from Office
Join our Team. About this opportunity: We are looking for a Senior Data Scientist to be responsible for developing AI/ML methods, processes, and systems to extract knowledge or insights and drive the future of artificial intelligence. You will perform data science tasks and advanced statistical analysis, turning data into actionable insights for the business, identifying trends, and measuring performance to address business problems. You will collaborate with business and process owners to understand business issues, and with engineers to implement and deploy scalable solutions where applicable.

What you will do: Apply and/or develop statistical modeling techniques (such as deep neural networks, Bayesian models, generative AI, forecasting), optimization methods, and other ML techniques. Synthesize problems into data questions. Convert data into practical insights. Analyze and investigate data quality for identified data and communicate findings to the Product Owner, Business Analyst, and other relevant stakeholders. Collect data, explore it, and perform analysis to extract information suitable to the business need. Identify gaps in the data and aggregate data as per the business need. Design and perform data analysis, data validation, data transformation, and feature extraction. Decide the approach for addressing business needs with data & analytics. Understand end-user needs and work accordingly, identifying new features in the data. Develop data science and engineering infrastructure & tools. Derive key metrics suitable for the use case and present the analysis to key stakeholders.

You will bring: 5-10 years of relevant industry experience. A Bachelor's or higher degree in Computer Science, Statistics, Mathematics, or related disciplines. Ability to analyse data, communicate outcomes to key stakeholders, and explore new data sources. Excellent coding skills in Python, R, SQL, etc. Understanding of cloud services. Evidence of academic training in statistics.
Deep/broad knowledge of machine learning, statistics, optimization, or a related field. A genuine curiosity about new and applied technology and software engineering, coupled with a high degree of business understanding. Experience in large-scale product development is a plus. Experience in generative AI and large language models.
Posted 2 days ago
5.0 - 10.0 years
3 - 4 Lacs
Bengaluru
Work from Office
About Us: Capco, a Wipro company, is a global technology and management consulting firm. Awarded Consultancy of the Year at the British Bank Awards, Capco has also been ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence across 32 cities globally, we support 100+ clients across the banking, financial services, and energy sectors, and we are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO? You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers, and other key players in the industry. These projects will transform the financial services industry.

MAKE AN IMPACT: Innovative thinking, delivery excellence, and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.

#BEYOURSELFATWORK: Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.

CAREER ADVANCEMENT: With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION: We believe that diversity of people and perspective gives us a competitive advantage.

Job Title: AWS QA Engineer. Location: Bangalore/Pune. Experience: 5-10 years.

We are seeking a Senior QA Data Engineer with expertise in Python, PySpark, SQL, and data testing automation; experience with AWS Cloud, Jenkins, AWS Glue, Lambda, and S3 is good to have. The ideal candidate will ensure the quality and reliability of data pipelines through robust testing and collaboration with DevOps teams on CI/CD pipelines.

Key Responsibilities: Develop and maintain automated test cases for ETL pipelines and data workflows using Python/PySpark. Use SQL and NoSQL databases proficiently for data validation and analysis. Validate Jenkins CI/CD pipelines for automated testing and deployment.
Willingness to scale up on testing AWS services, including Glue, Lambda, RDS, S3, and Iceberg, ensuring seamless integration and scalability. Collaborate closely with DevOps to enhance deployment processes and pipeline efficiency. Design data validation frameworks and monitor pipeline performance for data quality assurance. Validate NoSQL and relational database integrations in data workflows. Document test strategies, results, and best practices for cross-team alignment. Preferred: AWS certifications (e.g., AWS Developer Associate). Experience with tools like Airflow or Spark. If you are keen to join us, you will be part of an organization that values your contributions, recognizes your potential, and provides ample opportunities for growth. For more information, visit www.capco.com. Follow us on Twitter, Facebook, LinkedIn, and YouTube.
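As a rough illustration of the automated data-validation checks this role describes, the sketch below shows a minimal pure-Python row validator of the kind that might back a PyTest suite. The field names (`order_id`, `amount`) and thresholds are invented for the example; a real pipeline would run such checks against Glue or S3 outputs rather than in-memory rows.

```python
def validate_rows(rows, required_fields, non_null_fields):
    """Return (row_index, error) tuples for rows that fail basic
    data-quality checks: missing required fields, or nulls in key columns."""
    errors = []
    for i, row in enumerate(rows):
        for field in required_fields:
            if field not in row:
                errors.append((i, f"missing field: {field}"))
        for field in non_null_fields:
            if row.get(field) in (None, ""):
                errors.append((i, f"null value in: {field}"))
    return errors

# Hypothetical sample of pipeline output rows.
rows = [
    {"order_id": "A1", "amount": 100.0},   # clean row
    {"order_id": "", "amount": 250.0},     # empty key column
    {"amount": None},                      # missing field and null value
]
errors = validate_rows(rows,
                       required_fields=["order_id", "amount"],
                       non_null_fields=["order_id", "amount"])
```

In a test suite, each returned error would typically become a failing assertion or a logged data-quality metric.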
Posted 2 days ago
4.0 - 7.0 years
8 - 12 Lacs
Kolkata
Work from Office
In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage. At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. Summary & Responsibilities: Development experience on OAS and OAC (DVCS and BICS); OBIA or FAW knowledge will be an added advantage. Experience with lift-and-shift migration of OBIEE to OAC. Should have excellent debugging and troubleshooting skills. Should have experience in metadata management (RPD) and analytics. Should have good knowledge of OAC/OBIEE security. Experience in customization and configuration of OBIA (preferably with Fusion SaaS Cloud), OBIEE, dashboards, and administration. Experience in interacting with business users to analyze business processes and gather requirements. Experience in sourcing data from Oracle EBS. Experience in basic admin activities of OAC and OAS in Unix and Windows environments, such as server restarts. Experience in configuration, troubleshooting, and tuning of OAC reports. Mandatory skill sets: Metadata management (RPD), design, and OBIEE admin experience including deployment of RPD, Catalog Manager and security. Experience as an OBIEE dashboard and reports designer and developer. Experience in basic admin activities in Unix and Windows environments, such as server restarts.
Experience in configuration, troubleshooting, and tuning of OBIEE 12c. Preferred skill sets: OAC + OBIEE. Education qualification: B.Tech/MBA/MCA. Degrees/Field of Study required: Master of Business Administration, Bachelor of Technology. Required Skills: Oracle Database. Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis
Posted 2 days ago
4.0 - 7.0 years
6 - 10 Lacs
Kolkata
Work from Office
In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage. Why PwC? Learn more about us. We are looking for a candidate with 4 to 7 years of experience and adequate knowledge of Scala's object-oriented programming. The Scala code written in the backend underpins the finance-module reports accessed via QuickSight; the role involves assessing that Scala code, identifying issues, and fixing them. Mandatory skill sets: Scala and OOP. Preferred skill sets: Scala and OOP. Education qualification: B.Tech/MBA/MCA. Degrees/Field of Study required: Bachelor of Engineering, Master of Business Administration. Required Skills: Scala (Programming Language). Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis
Posted 2 days ago
3.0 - 6.0 years
9 - 13 Lacs
Hyderabad
Work from Office
Challenging. Meaningful. Life-changing. Those aren't words that are usually associated with a job. But working at Bristol Myers Squibb is anything but usual. Here, uniquely interesting work happens every day, in every department. From optimizing a production line to the latest breakthroughs in cell therapy, this is work that transforms the lives of patients, and the careers of those who do it. You'll get the chance to grow and thrive through opportunities uncommon in scale and scope, alongside high-achieving teams. Take your career farther than you thought possible. Bristol Myers Squibb recognizes the importance of balance and flexibility in our work environment. We offer a wide variety of competitive benefits, services and programs that provide our employees with the resources to pursue their goals, both at work and in their personal lives. Read more: careers.bms.com/working-with-us. Responsibilities will include, but are not limited to: Study Startup: Draft the EDC build timeline in collaboration with the Data Management Lead. Perform DB build tasks by creating specifications for the database and edit checks. Create test scripts and complete test data entry/UAT for Coding, Site Payment, and Safety Gateway. Collaborate with the Data Management Lead and facilitate startup meetings including, but not limited to, the EDC build kick-off, Interactive eCRF Build, and IRMs (Interactive Review Meetings) for the database and edit checks. Create and finalize study documents such as the Data Quality Management Plan, eCRF Completion Instructions, and Protocol Data Review Plan (PDRP) after study team review. Ensure all startup documents are completed per SOP and filed in the eTMF per the eTMF master plan. Study Conduct: Plan/execute post-production changes/migrations for the study (if any). Coordinate with Clinical Data Managers for the execution of data review tasks. Coordinate with external data vendors for any escalations related to vendor data.
Support Clean Patient Group delivery along with Clinical Data Management staff. Update study documents as needed during the conduct of the study. Support the DML to conduct Data Quality Review meetings. Provide data health metrics to the Data Management Lead as requested. Study Closeout: Support the Data Management Lead in planning and executing database lock activities. Perform post-lock activities as needed. Project Management: Support the DML in project management tasks to ensure the study is delivered successfully, on time and with quality. Documentation: File appropriate documents in the eTMF per the eTMF master plan. Training and Mentorship: Provide training and mentoring to junior CDM staff. Bachelor's degree required; life sciences, pharmacy or relevant fields preferred. 6 years of experience managing end-to-end Clinical Data Management tasks. Able to work on end-to-end Clinical Data Management tasks. Able to work collaboratively on multi-disciplinary project teams. Strong knowledge of the clinical drug development process, FDA/ICH guidelines and industry-standard data management practices. Strong knowledge and experience of EDC systems (Medidata RAVE preferred); demonstrated Microsoft Office skills. Strong oral and written communication skills. Strong project management skills. Travel: Yes, 5-10% (industry conferences, investigator meetings, regulatory inspections as needed). If you come across a role that intrigues you but doesn't perfectly line up with your resume, we encourage you to apply anyway. You could be one step away from work that will transform your life and career. With a single vision as inspiring as "Transforming patients' lives through science", every BMS employee plays an integral role in work that goes far beyond ordinary.
Each of us is empowered to apply our individual talents and unique perspectives in a supportive culture, promoting global participation in clinical trials, while our shared values of passion, innovation, urgency, accountability, inclusion and integrity bring out the highest potential of each of our colleagues. BMS has an occupancy structure that determines where an employee is required to conduct their work. This structure includes site-essential, site-by-design, field-based and remote-by-design jobs. The occupancy type that you are assigned is determined by the nature and responsibilities of your role: Site-essential roles require 100% of shifts onsite at your assigned facility. Site-by-design roles may be eligible for a hybrid work model with at least 50% onsite at your assigned facility. For these roles, onsite presence is considered an essential job function and is critical to collaboration, innovation, productivity, and a positive Company culture. For field-based and remote-by-design roles, the ability to physically travel to visit customers, patients or business partners and to attend meetings on behalf of BMS as directed is an essential job function. BMS is dedicated to ensuring that people with disabilities can excel through a transparent recruitment process, reasonable workplace accommodations/adjustments and ongoing support in their roles. Applicants can request a reasonable workplace accommodation/adjustment prior to accepting a job offer. If you require reasonable accommodations/adjustments in completing this application, or in any part of the recruitment process, direct your inquiries to adastaffingsupport@bms.com. Visit careers.bms.com/eeo-accessibility to access our complete Equal Employment Opportunity statement. BMS cares about your well-being and the well-being of our staff, customers, patients, and communities.
As a result, the Company strongly recommends that all employees be fully vaccinated for Covid-19 and keep up to date with Covid-19 boosters. BMS will consider for employment qualified applicants with arrest and conviction records, pursuant to applicable laws in your area. If you live in or expect to work from Los Angeles County if hired for this position, please visit this page for important additional information: https://careers.bms.com/california-residents/ Any data processed in connection with role applications will be treated in accordance with applicable data privacy policies and regulations.
Posted 2 days ago
5.0 - 8.0 years
10 - 15 Lacs
Hyderabad
Work from Office
Do you have a passion for assessing software compliance and guiding stakeholders to make informed sourcing decisions? Interested in implementing a Software Asset Management tool to optimize software spend and manage risks with a comprehensive view of entitlements and usage? Are you driven to work in a complex, global environment where ideas are valued and efforts are appreciated? We're looking for a Software License Manager to: establish and maintain Effective License Positions (ELP) for strategic vendor software products; process (read and interpret) software contracts and other commercial documents (purchase orders, invoices, quotes) to validate and ensure accurate ELPs and correct interpretation of license terms; review major software vendor product ELPs with key business partners to ensure license compliance and optimal use of software; report and escalate identified risk or potential underutilization; support the software contract renewal process or software audits with complete and accurate information and commentary; support, maintain, and improve UBS/Credit Suisse Software License Management tools, driving automation, validating and improving data quality of source inventory systems, and adopting new technologies (e.g. cloud, containerization, new license models). The Global Software License Management team currently consists of 21 members: 6 in Poland, 3 in Switzerland and 12 in India. The team combines licensing experts with many years of experience and individuals who started their software licensing careers a few months ago. You'll be part of the GCTO GSAM team at our office in Hyderabad. Our team is responsible for reviewing and assessing the Bank's software assets, maintaining compliance with software license and maintenance contracts, and onboarding commercial documents in our SAM tool to maintain the Bank's software inventory.
We also support sourcing teams with input for contract negotiations by providing current license positions and input on license-specific terms and conditions. Requirements: in-depth knowledge of the SAM market, SAM operations, and competencies, with the ability to advise on software licensing topics and audits and to produce Effective License Positions (ELPs) for software publishers; a minimum of 10 years of experience in Software Asset Management or License Management in a global organization (CSAM or a similar certification is a plus); practical knowledge and software license management experience of the product portfolio and licensing of at least one key vendor, i.e. Microsoft, IBM (PVU/RVU metrics, including ILMT bundling), Oracle (database licensing), Broadcom, Cloudera, Red Hat, BMC, CA Technologies, or SAP; good knowledge of Flexera FNMS and/or ServiceNow SAM Pro will be an added advantage; general understanding of IT software systems, client and server virtualization technologies, cloud/SaaS/PaaS solutions, infrastructure, and software procurement processes; a results-oriented individual with a strong work ethic, accountability, and excellent problem-solving skills, who also possesses strong organizational and communication abilities to interact with managers, staff, and senior stakeholders; dedicated to fostering an inclusive culture and valuing diverse perspectives; bachelor's degree in Computer Science, Information Systems, Business Administration or another related field, or equivalent.
Posted 2 days ago
4.0 - 7.0 years
15 - 17 Lacs
Pune
Work from Office
Do you want to build your software engineering skills whilst creating really impactful applications? Are you interested in being part of an externally recognized engineering community with personal development at its core? Would you like to join a team that gives back to the community and engages with a diverse group of people? We're looking for software engineers to: Design and maintain data pipelines using Starburst and related technologies. Optimize query performance and resolve data processing bottlenecks. Manage databases to ensure high availability, reliability and security. Integrate Starburst with various data sources, including cloud services and APIs. Monitor data pipelines and troubleshoot issues proactively. Collaborate with data scientists, analysts and stakeholders on data requirements. Maintain comprehensive and up-to-date documentation for data processes. Stay current with data engineering advancements and propose innovative solutions. Implement best practices for data quality assurance and testing. Demonstrate experience with Python and data libraries (NumPy, SciPy, pandas, etc.), machine learning frameworks (TensorFlow, etc.), and AI tools (ChatGPT, etc.). Have experience working with distributed systems, clustering, and replication technologies. Once you've met your team and joined our certified engineers development program, you'll become engaged in sharing and discussing knowledge with your peers through our engineering guilds, and you'll take part in a wide range of volunteering and cultural events and topics through our internal staff networks.
Requirements: a software engineer/developer focused on cloud-based data virtualization and data delivery technologies; a minimum of 2 years' experience working with Starburst; able to develop, optimize, and maintain Starburst Enterprise queries for data processing and analytics; able to integrate Starburst with various data sources (e.g., cloud storage, relational databases, data lakes); solid understanding of Spark, Trino (Presto) or related technologies; ability to code in Python within distributed computing systems such as Databricks is an added advantage (knowledge of PySpark); capable of working in a collaborative, multi-site environment to support rapid development and delivery of results and capabilities (i.e. Agile SDLC); 3+ years of hands-on experience in developing large-scale applications using data virtualization and/or data streaming technologies. UBS is the world's largest and the only truly global wealth manager. We operate through four business divisions: Global Wealth Management, Personal & Corporate Banking, Asset Management and the Investment Bank. Our global reach and the breadth of our expertise set us apart from our competitors. We have a presence in all major financial centers in more than 50 countries.
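Trino's Python client follows the DB-API cursor interface, so pulling large Starburst result sets usually means iterating in batches rather than materialising everything at once. The sketch below uses the stdlib `sqlite3` module as a stand-in for a Starburst connection to show that same cursor/batching pattern; the table, column names, and batch size are invented for illustration.

```python
import sqlite3


def fetch_in_batches(cursor, query, batch_size=2):
    """Run a query and yield rows in fixed-size batches, so a large
    result set is never held in memory all at once."""
    cursor.execute(query)
    while True:
        batch = cursor.fetchmany(batch_size)
        if not batch:
            break
        yield batch


# In-memory stand-in for a Starburst-accessible table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (id INTEGER, symbol TEXT)")
conn.executemany("INSERT INTO trades VALUES (?, ?)",
                 [(1, "UBSG"), (2, "NESN"), (3, "ROG"), (4, "NOVN"), (5, "ABBN")])

batches = list(fetch_in_batches(conn.cursor(),
                                "SELECT id, symbol FROM trades ORDER BY id"))
```

With a real Starburst cluster, only the connection object would change; the batching loop is identical for any DB-API driver.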
Posted 2 days ago
1.0 - 8.0 years
6 - 7 Lacs
Chennai
Work from Office
About the Role: We are seeking a highly skilled Data Analyst with 6 years of experience to join our dynamic team. The ideal candidate will have a sound understanding of insurance products and processes such as underwriting, claims, and policy administration, and be skilled in translating complex data into technical transformation logic. Requirements: Expertise in writing complex SQL queries to extract and manipulate data from relational databases (e.g., Oracle, SQL Server, Snowflake). Ability to analyse policy, claims, billing and underwriting data to identify trends, anomalies and opportunities. Proficiency in translating raw data into business and technical transformations. Familiarity with documenting business rules, data mappings and transformation logic to support data pipeline development and data quality initiatives. Sound knowledge of P&C insurance domains such as policy administration, claims processing, and underwriting is a plus. Experience with SQL and relational database systems. #LI-MP1 #Hybrid
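The SQL-driven anomaly analysis described above can be sketched with a small in-memory example. The schema, column names, and rule (a claim exceeding its policy's coverage limit, or referencing no policy at all) are illustrative inventions, not any specific carrier's data model; stdlib SQLite stands in for Oracle/SQL Server/Snowflake.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (claim_id TEXT, policy_id TEXT, amount REAL)")
conn.execute("CREATE TABLE policies (policy_id TEXT, coverage_limit REAL)")
conn.executemany("INSERT INTO policies VALUES (?, ?)",
                 [("P1", 10000.0), ("P2", 5000.0)])
conn.executemany("INSERT INTO claims VALUES (?, ?, ?)",
                 [("C1", "P1", 2500.0),
                  ("C2", "P2", 7500.0),   # exceeds the policy's coverage limit
                  ("C3", "P3", 100.0)])   # orphan claim: no matching policy

# Flag claims whose amount exceeds coverage, or that reference no policy.
anomalies = conn.execute("""
    SELECT c.claim_id
    FROM claims c
    LEFT JOIN policies p ON p.policy_id = c.policy_id
    WHERE p.policy_id IS NULL OR c.amount > p.coverage_limit
    ORDER BY c.claim_id
""").fetchall()
```

The `LEFT JOIN ... IS NULL` pattern is the standard way to surface referential-integrity gaps alongside business-rule violations in one pass.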
Posted 2 days ago
7.0 - 8.0 years
15 - 16 Lacs
Hyderabad
Work from Office
Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team. In this role, you will: Design, develop, and maintain scalable ETL/ELT pipelines using PySpark and Python. Build and manage real-time data ingestion and streaming pipelines using Apache Kafka. Develop and optimize data workflows and batch processes on GCP using services like BigQuery, Dataflow, Pub/Sub, and Cloud Composer. Implement data quality checks, error handling, and monitoring across pipelines. Collaborate with data scientists, analysts, and business teams to translate requirements into technical solutions. Ensure best practices in code quality, pipeline reliability, and data governance. Maintain thorough documentation of processes, tools, and infrastructure. Requirements: To be successful in this role, you should meet the following requirements: 6+ years of experience in data engineering roles. Strong programming skills in Python and PySpark. Solid experience working with Kafka for real-time data processing. Proven hands-on experience with GCP data tools and architecture. Familiarity with CI/CD, version control (Git), and workflow orchestration tools (Airflow/Composer). Strong analytical and problem-solving skills with attention to detail. Excellent communication and team collaboration skills. You'll achieve more when you join HSBC.
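One common data-quality concern in the Kafka-based ingestion described above is duplicate delivery: consumers must often be idempotent because at-least-once delivery can replay events. The pure-Python sketch below shows a minimal deduplication guard of that kind; the event shape and key name (`event_id`) are hypothetical, and a production consumer would track seen keys in a durable store rather than a local set.

```python
def dedupe_events(events, key="event_id"):
    """Drop events whose key has already been seen - a simple idempotency
    guard for when a consumer may receive duplicate deliveries."""
    seen = set()
    unique = []
    for event in events:
        k = event.get(key)
        if k is None or k in seen:
            continue  # skip duplicate deliveries and malformed events
        seen.add(k)
        unique.append(event)
    return unique


# Hypothetical slice of a consumed stream.
stream = [{"event_id": "a", "v": 1},
          {"event_id": "b", "v": 2},
          {"event_id": "a", "v": 1},   # duplicate delivery
          {"v": 3}]                    # malformed: no key
clean = dedupe_events(stream)
```

The same check often doubles as a pipeline monitoring metric: the difference between input and output counts is the duplicate/malformed rate.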
Posted 2 days ago
4.0 - 5.0 years
9 - 10 Lacs
Bengaluru
Work from Office
YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we're a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth - bringing real positive change in an increasingly virtual world - and it drives us beyond generational gaps and disruptions of the future. We are looking to hire Snowflake professionals in the following areas. JD for Senior Snowflake Developer: Snowflake, SnowSQL, PL/SQL, any ETL tool. Job Description: 4-5 years of IT experience in analysis, design, development and unit testing of data warehousing applications using industry-accepted methodologies and procedures. Write complex SQL queries to implement ETL (Extract, Transform, Load) processes and for business intelligence reporting. Strong experience with Snowpipe execution and the Snowflake data warehouse, with a deep understanding of Snowflake architecture and processing; creating and managing automated data pipelines for both batch and streaming data using DBT. Designing and implementing data models and schemas to support data warehousing and analytics within Snowflake. Writing and optimizing SQL queries for efficient data retrieval and analysis. Deliver robust solutions through query optimization, ensuring data quality. Should have experience in writing functions and stored procedures. Strong understanding of data warehouse principles using fact tables, dimension tables, and star and snowflake schema modelling. Analyse and translate functional specifications/user stories into technical specifications. Good to have design/development experience in any ETL tool. Good interpersonal skills, with experience handling communication and interactions between different teams.
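A recurring pattern in the Snowflake pipeline work described above is the MERGE-style upsert that loads staged records into a dimension table. As a rough sketch, stdlib SQLite's `ON CONFLICT ... DO UPDATE` stands in for Snowflake's `MERGE` below; the table and column names are invented for the example.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_customer (customer_id TEXT PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO dim_customer VALUES (?, ?)",
                 [("C1", "Acme"), ("C2", "Globex")])

# Incoming batch from the staging area: one updated key, one brand-new key.
staged = [("C2", "Globex Corp"), ("C3", "Initech")]

# SQLite's UPSERT clause stands in for Snowflake's MERGE statement here:
# matched keys are updated in place, unmatched keys are inserted.
conn.executemany("""
    INSERT INTO dim_customer (customer_id, name) VALUES (?, ?)
    ON CONFLICT(customer_id) DO UPDATE SET name = excluded.name
""", staged)

rows = conn.execute(
    "SELECT customer_id, name FROM dim_customer ORDER BY customer_id").fetchall()
```

In Snowflake the same logic would typically live in a DBT incremental model or a stored procedure, with the staged batch arriving via Snowpipe.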
At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided with technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; and stable employment with a great atmosphere and an ethical corporate culture.
Posted 2 days ago
8.0 - 12.0 years
20 - 25 Lacs
Pune
Work from Office
Embark on a transformative journey as a Test Automation Engineering Lead at Barclays, where you'll spearhead the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences. To be successful as a Test Automation Engineering Lead, you should have: A Bachelor's/Master's degree in Computer Science, Information Technology, or a related field. Proven experience in test automation for data engineering platforms (ETL, data lakes, data warehouses, Big Data, etc.). Hands-on expertise with automation tools such as Selenium, PyTest, Robot Framework, Apache Airflow, dbt, or similar. Strong programming/scripting skills in Python, Java, or Scala. A deep understanding of data quality, data validation, and data governance principles in the banking sector. Experience with CI/CD pipelines, DevOps practices, and cloud platforms (AWS, Azure, or GCP). Excellent communication, stakeholder management, and team leadership skills. Knowledge of regulatory and security requirements in the banking domain. Some other highly valued skills include: Experience with AI/ML model testing and validation. Certifications in test automation, data engineering, or cloud technologies. Prior experience in large-scale test transformation programs. You may be assessed on the key critical skills relevant for success in the role, such as experience with test transformation leadership, automation strategy and frameworks, as well as job-specific skill sets. This role will be based out of the Pune office. Purpose of the role: To design, develop, and execute testing strategies to validate functionality, performance, and user experience, while collaborating with cross-functional teams to identify and resolve defects, and continuously improve testing processes and methodologies, to ensure software quality and reliability.
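A staple of the ETL test automation described above is source-to-target reconciliation: assert that a load preserved row counts and content regardless of row order. The sketch below is a minimal, assumption-laden version using an order-insensitive checksum; in practice the rows would come from database cursors and the assertions would live in a PyTest suite.

```python
import hashlib


def table_checksum(rows):
    """Order-insensitive checksum of a table's rows, for comparing a
    source extract against the loaded target."""
    digests = sorted(hashlib.sha256(repr(r).encode()).hexdigest() for r in rows)
    return hashlib.sha256("".join(digests).encode()).hexdigest()


def reconcile(source_rows, target_rows):
    """Return (row_counts_match, checksums_match) for a source/target pair."""
    return (len(source_rows) == len(target_rows),
            table_checksum(source_rows) == table_checksum(target_rows))


# Hypothetical extract vs. load: same data, different physical order.
src = [("C1", 100), ("C2", 200)]
tgt = [("C2", 200), ("C1", 100)]
ok_count, ok_sum = reconcile(src, tgt)
```

Sorting the per-row digests before hashing is what makes the comparison independent of row order, which parallel loads rarely preserve.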
Accountabilities: Development and implementation of comprehensive test plans and strategies to validate software functionality and ensure compliance with established quality standards. Creation and execution of automated test scripts, leveraging testing frameworks and tools to facilitate early detection of defects and quality issues. Collaboration with cross-functional teams to analyse requirements, participate in design discussions, and contribute to the development of acceptance criteria, ensuring a thorough understanding of the software being tested. Root cause analysis for identified defects, working closely with developers to provide detailed information and support defect resolution. Collaboration with peers, participation in code reviews, and promotion of a culture of code quality and knowledge sharing. Staying informed of industry technology trends and innovations, and actively contributing to the organization's technology communities to foster a culture of technical excellence and growth. Vice President Expectations: To contribute to or set strategy, drive requirements and make recommendations for change. Plan resources, budgets, and policies; manage and maintain policies/processes; deliver continuous improvements and escalate breaches of policies/procedures. If managing a team, they define jobs and responsibilities, plan for the department's future needs and operations, counsel employees on performance and contribute to employee pay decisions/changes. They may also lead a number of specialists to influence the operations of a department, in alignment with strategic as well as tactical priorities, while balancing short- and long-term goals and ensuring that budgets and schedules meet corporate requirements. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard.
The four LEAD behaviours are: L - Listen and be authentic, E - Energise and inspire, A - Align across the enterprise, D - Develop others. For an individual contributor, they will be a subject matter expert within their own discipline and will guide technical direction. They will lead collaborative, multi-year assignments and guide team members through structured assignments, identifying the need to include other areas of specialisation to complete assignments. They will train, guide and coach less experienced specialists and provide information affecting long-term profits, organisational risks and strategic decisions. Advise key stakeholders, including functional leadership teams and senior management, on functional and cross-functional areas of impact and alignment. Manage and mitigate risks through assessment, in support of the control and governance agenda. Demonstrate leadership and accountability for managing risk and strengthening controls in relation to the work your team does. Demonstrate comprehensive understanding of the organisation's functions to contribute to achieving the goals of the business. Collaborate with other areas of work and business-aligned support areas to keep up to speed with business activity and business strategies. Create solutions based on sophisticated analytical thought, comparing and selecting complex alternatives. In-depth analysis with interpretative thinking will be required to define problems and develop innovative solutions. Adopt and include the outcomes of extensive research in problem-solving processes. Seek out, build and maintain trusting relationships and partnerships with internal and external stakeholders in order to accomplish key business objectives, using influencing and negotiating skills to achieve outcomes.
Posted 2 days ago
5.0 - 7.0 years
15 - 16 Lacs
Pune
Work from Office
Job Summary: Cummins is seeking a skilled Data Engineer to support the development, maintenance, and optimization of our enterprise data and analytics platform. This role involves hands-on experience in software development, ETL processes, and data warehousing, with strong exposure to tools like Snowflake, OBIEE, and Power BI. The engineer will collaborate with cross-functional teams, transforming data into actionable insights that enable business agility and scale. Please note: while the role is categorized as remote, it will follow a hybrid work model based out of our Pune office. Key Responsibilities: Design, develop, and maintain ETL pipelines using Snowflake and related data transformation tools. Build and automate data integration workflows that extract, transform, and load data from various sources, including Oracle EBS and other enterprise systems. Analyze, monitor, and troubleshoot data quality and integrity issues using standardized tools and methods. Develop and maintain dashboards and reports using OBIEE, Power BI, and other visualization tools for business stakeholders. Work with IT and business teams to gather reporting requirements and translate them into scalable technical solutions. Participate in data modeling and storage architecture using star and snowflake schema designs. Contribute to the implementation of data governance, metadata management, and access control mechanisms. Maintain documentation for solutions and participate in testing and validation activities. Support migration and replication of data using tools such as Qlik Replicate and contribute to cloud-based data architecture. Apply agile and DevOps methodologies to continuously improve data delivery and quality assurance processes. Why Join Cummins? Opportunity to work with a global leader in power solutions and digital transformation. Be part of a collaborative and inclusive team culture. Access to cutting-edge data platforms and tools.
Exposure to enterprise-scale data challenges and finance domain expertise. Drive impact through data innovation and process improvement. Competencies: Data Extraction & Transformation - ability to perform ETL activities from varied sources with high data accuracy. Programming - capable of writing and testing efficient code using industry standards and version control systems. Data Quality Management - detect and correct data issues for better decision-making. Solution Documentation - clearly document processes, models, and code for reuse and collaboration. Solution Validation - test and validate changes or solutions based on customer requirements. Problem Solving - address technical challenges systematically to ensure effective resolution and prevention. Customer Focus - understand business requirements and deliver user-centric data solutions. Communication & Collaboration - work effectively across teams to meet shared goals. Values Differences - promote inclusion by valuing diverse perspectives and backgrounds. Education, Licenses, Certifications: Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related technical discipline. Certifications in data engineering or relevant tools (Snowflake, Power BI, etc.) are a plus. Experience - must-have skills: 5-7 years of experience in data engineering or software development, preferably within a finance or enterprise IT environment. Proficient in ETL tools, SQL, and data warehouse development. Proficient in Snowflake, Power BI, and OBIEE reporting platforms, with implementation experience using these tools and technologies. Strong understanding of data warehousing principles, including schema design (star/snowflake), ER modeling, and relational databases. Working knowledge of Oracle databases and Oracle EBS structures. Preferred Skills: Experience with Qlik Replicate, data replication, or data migration tools.
- Familiarity with data governance, data quality frameworks, and metadata management.
- Exposure to cloud-based architectures, Big Data platforms (e.g., Spark, Hive, Kafka), and distributed storage systems (e.g., HBase, MongoDB).
- Understanding of agile methodologies (Scrum, Kanban) and DevOps practices for continuous delivery and improvement.
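The extract-transform-load pattern the responsibilities above describe can be sketched in a few lines of standard-library Python. This is a toy illustration only: SQLite stands in for a warehouse such as Snowflake, and the table, column names, and sample data are made up for the example.

```python
import csv
import io
import sqlite3

# Illustrative sample standing in for an export from a source system.
SOURCE_CSV = """order_id,amount,currency
1001,250.00,USD
1002,99.50,USD
"""

def extract(raw: str) -> list[dict]:
    """Extract: parse rows out of a CSV export."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: cast types and keep only the columns the warehouse needs."""
    return [(int(r["order_id"]), float(r["amount"])) for r in rows]

def load(rows: list[tuple]) -> int:
    """Load: insert into a warehouse table (SQLite stands in for Snowflake)."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE fact_orders (order_id INTEGER, amount REAL)")
    con.executemany("INSERT INTO fact_orders VALUES (?, ?)", rows)
    return con.execute("SELECT COUNT(*) FROM fact_orders").fetchone()[0]

loaded = load(transform(extract(SOURCE_CSV)))
print(loaded)  # → 2
```

In a real pipeline each stage would be a scheduled, monitored job; the point here is only the separation of extract, transform, and load into composable steps.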
Posted 2 days ago
5.0 - 8.0 years
17 - 25 Lacs
Bhubaneswar, Dubai, Coimbatore
Work from Office
Please Note: Only candidates willing to travel to the Middle East should apply.

Role & responsibilities:
1. Data Quality & Governance: Implement data quality frameworks, policies, and standards to ensure data accuracy, completeness, and consistency across enterprise systems.
2. Master Data Management (MDM) Implementation: Design and configure MDM solutions using Informatica MDM (On-Prem & Cloud) for key business domains (Customer, Product, Vendor, etc.).
3. Data Profiling & Cleansing: Leverage Informatica Data Quality for data profiling, cleansing, standardization, deduplication, and enrichment to improve data reliability.
4. Metadata Management & Data Lineage: Deploy and maintain Informatica Metadata Manager to enhance data discoverability, governance, and lineage tracking.
5. Integration & Interoperability: Ensure seamless integration of MDM and DQ solutions with core enterprise applications (ERP, CRM, BI tools), supporting ETL/ELT teams.
6. Stakeholder Collaboration: Act as a liaison between business and IT teams, translating business requirements into scalable MDM and DQ solutions.
7. Training & Support: Provide guidance, training, and best practices to data stewards and business users to drive a culture of data governance.

Preferred candidate profile

Education & Experience:
- Bachelor's/Master's degree in Computer Science, Data Management, Information Systems, or a related field.
- 5+ years of consulting experience in Data Quality, MDM, and Metadata Management.
- Expertise in Informatica IDQ (or IDMC Cloud Data Quality) and Informatica MDM (On-Prem & Cloud).

Technical Skills:
- Strong experience in data profiling, cleansing, standardization, and deduplication.
- Hands-on knowledge of data governance frameworks, data quality rules, and stewardship best practices.
- Expertise in SQL, data modeling, and data architecture principles.
- Experience integrating MDM and DQ solutions with enterprise applications (SAP, Salesforce, Microsoft Dynamics, etc.).
- Familiarity with cloud platforms (MS Azure), with a focus on cloud-based data governance and integration.
- Experience in designing end-to-end DQ and MDM solutions.

Preferred Industry Experience:
- Prior experience in DQ/MDM implementation within at least one of the following sectors: Oil & Gas, Financial Services, Manufacturing, Healthcare, Real Estate, Tourism, Government/Citizen Services, Mobility, Energy & Utilities, Telecom.

Consulting & Leadership Skills:
- Strong stakeholder management and client engagement skills, with experience working on DQ & MDM consulting projects.
- Pre-sales experience with the ability to build quick PoCs and client demos, and support business development efforts.
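The standardization and deduplication steps listed in this profile are, in practice, configured inside Informatica Data Quality rather than hand-coded. As a rough, tool-agnostic sketch of what those two steps do, the Python below standardizes a name field and then collapses duplicates on a match key; the customer records and the match-key choice are invented for illustration.

```python
# Toy cleansing pipeline: standardize, then deduplicate on a match key.
customers = [
    {"name": "ACME Corp.", "city": "Dubai"},
    {"name": "acme corp",  "city": "Dubai"},   # duplicate after standardization
    {"name": "Globex",     "city": "Pune"},
]

def standardize(record: dict) -> dict:
    """Standardization: trim whitespace, lowercase, strip trailing punctuation."""
    name = record["name"].strip().lower().rstrip(".")
    return {**record, "name": name}

def deduplicate(records: list[dict]) -> list[dict]:
    """Deduplication: keep the first record per (name, city) match key."""
    seen, unique = set(), []
    for r in map(standardize, records):
        key = (r["name"], r["city"])
        if key not in seen:
            seen.add(key)
            unique.append(r)
    return unique

golden = deduplicate(customers)
print(len(golden))  # → 2 surviving "golden" records
```

A real MDM implementation would add fuzzy matching, survivorship rules for merging attributes, and stewardship review of borderline matches; exact-key matching is only the simplest case.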
Posted 2 days ago