7.0 - 12.0 years
22 - 27 Lacs
Hyderabad
Work from Office
We are seeking a highly skilled Lead Data Engineer / Associate Architect to lead the design, implementation, and optimization of scalable data architectures. The ideal candidate will have a deep understanding of data modeling, ETL processes, cloud data solutions, and big data technologies. You will work closely with cross-functional teams to build robust, high-performance data pipelines and infrastructure to enable data-driven decision-making.

Experience: 7 - 12 years
Work Location: Hyderabad (Hybrid) / Remote
Mandatory skills: Python, SQL, Snowflake

Responsibilities:
- Architecture: Design and develop scalable, resilient data architectures that support business needs, analytics, and AI/ML workloads.
- Data Pipeline Development: Design and implement robust ETL/ELT processes to ensure efficient data ingestion, transformation, and storage.
- Big Data & Cloud Solutions: Architect data solutions on cloud platforms such as AWS, Azure, or GCP, leveraging services such as Snowflake, Redshift, BigQuery, and Databricks.
- Database Optimization: Ensure performance tuning, indexing strategies, and query optimization for relational and NoSQL databases.
- Data Governance & Security: Implement best practices for data quality, metadata management, compliance (GDPR, CCPA), and security.
- Collaboration & Leadership: Work closely with data engineers, analysts, and business stakeholders to translate business requirements into scalable solutions.
- Technology Evaluation: Stay current with emerging trends, assess new tools and frameworks, and drive innovation in data engineering.

Required Skills:
- Education: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- Experience: 7 - 12+ years of experience in data engineering.
- Cloud Platforms: Strong expertise in AWS/Azure data services.
- Databases: Hands-on experience with SQL, NoSQL, and columnar databases such as PostgreSQL, MongoDB, Cassandra, and Snowflake.
- Programming: Proficiency in Python, Scala, or Java for data processing and automation.
- ETL Tools: Experience with tools such as Apache Airflow, Talend, dbt, or Informatica.
- Machine Learning & AI Integration (Preferred): Understanding of how to architect data solutions for AI/ML applications.
Posted 1 week ago
1.0 - 4.0 years
3 - 6 Lacs
Mumbai
Work from Office
Developer Role and Responsibilities
Your specific duties will be based on your experience as a UiPath developer. In this role, you will be responsible for designing and delivering UiPath solutions in accordance with WonderBotz standards and best practices. You will work closely with our enthusiastic team of both business and technical specialists. You will be part of a fast-growing and successful team that helps our clients get the maximum benefit.

Expected Activities:
- Support the development of UiPath strategies, including assessing opportunities.
- Under the supervision of more experienced developers, define, design, and develop automation on UiPath platforms for clients, including POCs, pilots, and production automation. More senior developers will be expected to work independently.
- Participate in workshops and interviews with business process SMEs to gather and confirm business process details, and document process definitions. More senior developers will lead these workshops and interviews.
- Participate in design and configuration sessions and apply feedback to improve and enhance work products. More senior developers will lead these sessions.
- Work alongside newly trained developers to guide and mentor them.

Qualifications and Skills:
- Have mastered, or have a strong desire to master, a leading RPA tool (UiPath is a must; Blue Prism and Automation Anywhere are a plus), including advanced RPA vendor certification.
- At least one year of hands-on experience with at least one of the following technologies: .NET, Java, VB, C#/C, HTML/CSS, Python, Web Services, mainframe, web applications, SQL, data integration tools, or technical automation tools. More senior developers should have a minimum of 2 to 4 years of this hands-on experience.
- Reasonable proficiency in reading Microsoft Visio or an equivalent process flow-charting tool or workflow-based logic.
- Extra: Any prior work or academic experience with document management and processing tools (e.g. Kofax, ABBYY, Data Cap), data integration tools (e.g. Informatica, Microsoft SSIS), technical automation tools (e.g. shell scripting, PHP), or business process management tools (e.g. Pega).

Desired characteristics in candidates:
- Effective communication skills for technical and non-technical audiences
- Analytical and proven problem-solving skills
- High emotional intelligence
- Embraces challenges
- Team orientation rather than an individual-contributor mindset

Compensation and start dates:
- Hiring now for immediate start
- Salary: Competitive base and bonus determined by level and experience
- Benefits: Healthcare, relocation, vacation, holidays
- Training: WonderBotz provides training, depending upon experience level, with the expectation that candidates will pass the vendor developer certification exam by the end of their training period

US professional services hubs: Princeton, NJ; Las Vegas, NV; Boston, MA; and additional major cities
India RPA Factory: various metro cities
WonderBotz is an Equal Employment Opportunity employer.
Posted 1 week ago
7.0 - 10.0 years
30 - 35 Lacs
Bengaluru
Work from Office
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Manager

Job Description & Summary:
At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.

Why PwC:
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities:
- Experience in data warehousing, solution design, and data analytics.
- Experience in data modelling exercises such as dimensional modelling and data vault modelling.
- Understand, interpret, and clarify functional as well as technical requirements.
- Understand the overall system landscape, including upstream and downstream systems.
- Understand ETL technical specifications and develop code efficiently.
- Ability to apply Informatica Cloud features and functions to achieve the best results.
- Hands-on experience in performance tuning and pushdown optimization in IICS.
- Provide mentorship on debugging and problem-solving.
- Review and optimize ETL technical specifications and code developed by the team.
- Ensure alignment with overall system architecture and data flow.

Mandatory skill sets: Data Modelling, IICS or any leading ETL tool, SQL
Preferred skill sets: Python
Years of experience required: 7 - 10 yrs
Education qualification: B.Tech/MBA/MCA
Degrees/Field of Study required: Bachelor of Engineering, Master of Business Administration
Required Skills: ETL Tools
Optional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Coaching and Feedback, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion {+ 21 more}
Posted 1 week ago
6.0 - 10.0 years
10 - 15 Lacs
Mumbai, Hyderabad, Pune
Work from Office
Staff Technical Product Manager

Are you excited about the opportunity to lead a team within an industry leader in Energy Technology? Are you passionate about improving capabilities, efficiency, and performance? Join our Digital Technology Team!

As a Staff Technical Product Manager, you will operate in lock-step with product management to create a clear strategic direction for build needs and convey that vision to the service's scrum team. You will direct the team with a clear and descriptive set of requirements captured as stories, and partner with the team to determine what can be delivered by balancing the need for new features, defects, and technical debt.

Partner with the best
We are seeking a candidate with a strong background in business analysis, team leadership, and data architecture, plus hands-on development skills. The ideal candidate will excel at roadmap creation, planning with prioritization, resource allocation, key item delivery, and seamless integration of perspectives from various stakeholders, including Product Managers, Technical Anchors, Service Owners, and Developers. You should be a results-oriented leader, capable of building and executing an aligned strategy, leading the data team and cross-functional teams to meet delivery timelines.

As a Staff Technical Product Manager, you will be responsible for:
- Demonstrating wide and deep knowledge in data engineering, data architecture, and data science, with the ability to guide, lead, and work with the team to drive to the right solution.
- Engaging frequently (80%) with the development team: facilitating discussions, providing clarification, story acceptance and refinement, testing, and validation; contributing to design activities and decisions; working within both waterfall and Agile scrum frameworks.
- Owning and managing the backlog; continuously ordering and prioritizing so that 1-2 sprints/iterations of backlog are always ready.
- Collaborating with UX on design decisions, demonstrating a deep understanding of the technology stack and its impact on the final product.
- Conducting customer and stakeholder interviews and elaborating on personas.
- Demonstrating expert-level skill in problem decomposition and the ability to navigate through ambiguity.
- Partnering with the Service Owner to ensure a healthy development process and clear tracking metrics, forming a standard and trustworthy way of providing customer support.
- Designing and implementing scalable and robust data pipelines to collect, process, and store data from various sources.
- Developing and maintaining data warehouse and ETL (Extract, Transform, Load) processes for data integration and transformation.
- Optimizing and tuning the performance of data systems to ensure efficient data processing and analysis.
- Collaborating with product managers and analysts to understand data requirements and implement solutions for data modeling and analysis.
- Identifying and resolving data quality issues, ensuring data accuracy, consistency, and completeness.
- Implementing and maintaining data governance and security measures to protect sensitive data.
- Monitoring and troubleshooting data infrastructure, performing root cause analysis, and implementing necessary fixes.

Fuel your passion
To be successful in this role you will:
- Have a Bachelor's or higher degree in Computer Science, Information Systems, or a related field.
- Have a minimum of 6-10 years of proven experience as a Data Engineer or in a similar role, working with large-scale data processing and storage systems.
- Have proficiency in SQL and database management systems (e.g., MySQL, PostgreSQL, or Oracle).
- Have extensive knowledge of SAP systems, T-codes, data pipelines in SAP, and Databricks-related technologies.
- Have experience building complex jobs for SCD-type mappings using ETL tools such as PySpark, Talend, or Informatica.
- Have experience with data visualization and reporting tools (e.g., Tableau, Power BI).
- Have strong problem-solving and analytical skills, with the ability to handle complex data challenges.
- Have excellent communication and collaboration skills to work effectively in a team environment.
- Have experience in data modeling, data warehousing, and ETL principles.
- Have familiarity with cloud platforms like AWS, Azure, or GCP and their data services (e.g., S3, Redshift, BigQuery).
- Have advanced knowledge of distributed computing and parallel processing.

Preferred: experience with real-time data processing and streaming technologies (e.g., Apache Kafka, Apache Flink); knowledge of containerization and orchestration technologies (e.g., Docker, Kubernetes); certification in relevant technologies or data engineering disciplines; working knowledge of Databricks, Dremio, and SAP is highly preferred.

Work in a way that works for you
We recognize that everyone is different and that the way in which people want to work and deliver at their best is different for everyone too. In this role, we can offer the following flexible working patterns (where applicable):
- Working flexible hours - flexing the times when you work in the day to help you fit everything in and work when you are the most productive

Working with us
Our people are at the heart of what we do at Baker Hughes. We know we are better when all our people are developed, engaged, and able to bring their whole authentic selves to work. We invest in the health and well-being of our workforce, train and reward talent, and develop leaders at all levels to bring out the best in each other.

Working for you
Our inventions have revolutionized energy for over a century. But to keep going forward tomorrow, we know we must push the boundaries today. We prioritize rewarding those who embrace challenge with a package that reflects how much we value their input. Join us, and you can expect:
- Contemporary work-life balance policies and wellbeing activities.

About Us
With operations in over 120 countries, we provide better solutions for our customers and richer opportunities for our people. As a leading partner to the energy industry, we're committed to achieving net-zero carbon emissions by 2050, and we're always looking for the right people to help us get there: people who are as passionate as we are about making energy safer, cleaner, and more efficient.

Join Us
Are you seeking an opportunity to make a real difference in a company with a global reach and exciting services and clients? Come join us and grow with a team of people who will challenge and inspire you!
Posted 1 week ago
3.0 - 8.0 years
9 - 14 Lacs
Ahmedabad
Remote
Healthcare experience is mandatory.

Position Overview:
We are seeking an experienced Data Modeler/Lead with deep expertise in health plan data models and enterprise data warehousing to drive our healthcare analytics and reporting initiatives. The candidate should have hands-on experience with modern data platforms and a strong understanding of healthcare industry data standards.

Key Responsibilities:

Data Architecture & Modeling:
- Design and implement comprehensive data models for health plan operations, including member enrollment, claims processing, provider networks, and medical management
- Develop logical and physical data models that support analytical and regulatory reporting requirements (HEDIS, Stars, MLR, risk adjustment)
- Create and maintain data lineage documentation and data dictionaries for healthcare datasets
- Establish data modeling standards and best practices across the organization

Technical Leadership:
- Lead data warehousing initiatives using modern platforms like Databricks or traditional ETL tools like Informatica
- Architect scalable data solutions that handle large volumes of healthcare transactional data
- Collaborate with data engineers to optimize data pipelines and ensure data quality

Healthcare Domain Expertise:
- Apply deep knowledge of health plan operations, medical coding (ICD-10, CPT, HCPCS), and healthcare data standards (HL7, FHIR, X12 EDI)
- Design data models that support analytical, reporting, and AI/ML needs
- Ensure compliance with healthcare regulations, including HIPAA/PHI and state insurance regulations
- Partner with business stakeholders to translate healthcare business requirements into technical data solutions

Data Governance & Quality:
- Implement data governance frameworks specific to healthcare data privacy and security requirements
- Establish data quality monitoring and validation processes for critical health plan metrics
- Lead efforts to standardize healthcare data definitions across multiple systems and data sources

Required Qualifications:

Technical Skills:
- 10+ years of experience in data modeling, with at least 4 years focused on healthcare/health plan data
- Expert-level proficiency in dimensional modeling, data vault methodology, or other enterprise data modeling approaches
- Hands-on experience with Informatica PowerCenter/IICS or the Databricks platform for large-scale data processing
- Strong SQL skills and experience with Oracle Exadata and cloud data warehouses (Databricks)
- Proficiency with data modeling tools (Hackolade, ERwin, or similar)

Healthcare Industry Knowledge:
- Deep understanding of health plan data structures, including claims, eligibility, provider data, and pharmacy data
- Experience with healthcare data standards and medical coding systems
- Knowledge of regulatory reporting requirements (HEDIS, Medicare Stars, MLR reporting, risk adjustment)
- Familiarity with healthcare interoperability standards (HL7 FHIR, X12 EDI)

Leadership & Communication:
- Proven track record of leading data modeling projects in complex healthcare environments
- Strong analytical and problem-solving skills, with the ability to work with ambiguous requirements
- Excellent communication skills, with the ability to explain technical concepts to business stakeholders
- Experience mentoring team members and establishing technical standards

Preferred Qualifications:
- Experience with Medicare Advantage, Medicaid, or Commercial health plan operations
- Cloud platform certifications (AWS, Azure, or GCP)
- Experience with real-time data streaming and modern data lake architectures
- Knowledge of machine learning applications in healthcare analytics
- Previous experience in a lead or architect role within a healthcare organization
Posted 1 week ago
6.0 - 11.0 years
8 - 13 Lacs
Gurugram, Bengaluru
Work from Office
About the Role: Grade Level (for internal use): 10 S&P Global - Mobility The Role: Senior Business Analyst - Data Engineering The Team We are seeking a Senior Business Analyst in the Data Engineering Team, you will be responsible for bridging the gap between business needs and technical solutions. You will collaborate with stakeholders to gather requirements, analyze data workflows, and ensure the successful delivery of data-driven projects. The Impact In this role, you will have the opportunity to work in an Agile team, ensuring we meet our customer requirements and deliver impactful quality data. Using your technical skills, you will contribute to data analysis, design and implement complex solutions, and support the business strategy. Responsibilities Collaborate with business stakeholders to identify and document requirements for data engineering projects. Analyze existing data processes and workflows to identify opportunities for improvement and optimization. Work closely with data engineers and data scientists to translate business requirements into technical specifications. Conduct data analysis and data validation to ensure accuracy and consistency of data outputs. Develop and maintain documentation related to data processes, requirements, and project deliverables. Facilitate communication between technical teams and business stakeholders to ensure alignment on project goals and timelines. Participate in project planning and prioritization discussions, providing insights based on business needs. Support user acceptance testing (UAT) and ensure that solutions meet business requirements before deployment. Utilize Jira for project tracking, issue management, and to facilitate Agile project management practices. Stay updated on industry trends and best practices in data engineering and analytics. What Were Looking For Minimum of 6 years of experience as a Business Analyst in a data engineering environment. 
Strong understanding of data engineering concepts, data modeling, and ETL processes. Proficiency in data visualization tools (e.g., Tableau, Power BI) and SQL for data analysis. Experience with Jira for project management and tracking. Excellent analytical and problem-solving skills, with a keen attention to detail. Strong communication and interpersonal skills, with the ability to work collaboratively in a team environment. Experience with Agile methodologies and project management tools is must. Return to Work Have you taken time out for caring responsibilities and are now looking to return to workAs part of our Return-to-Work initiative (link to career site page when available), we are encouraging enthusiastic and talented returners to apply, and will actively support your return to the workplace. Statement: S&P Global delivers essential intelligence that powers decision making. We provide the worlds leading organizations with the right data, connected technologies and expertise they need to move ahead. As part of our team, youll help solve complex challenges that equip businesses, governments and individuals with the knowledge to adapt to a changing economic landscape. S&P Global Mobility turns invaluable insights captured from automotive data to help our clients understand todays market, reach more customers, and shape the future of automotive mobility. About S&P Global Mobility At S&P Global Mobility, we provide invaluable insights derived from unmatched automotive data, enabling our customers to anticipate change and make decisions with conviction. Our expertise helps them to optimize their businesses, reach the right consumers, and shape the future of mobility. We open the door to automotive innovation, revealing the buying patterns of today and helping customers plan for the emerging technologies of tomorrow. For more information, visit www.spglobal.com/mobility . Whats In It For You Our Purpose: Progress is not a self-starter. 
It requires a catalyst to be set in motion. Information, imagination, people, technologythe right combination can unlock possibility and change the world.Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you cantake care of business. We care about our people. Thats why we provide everything youand your careerneed to thrive at S&P Global. Health & WellnessHealth care coverage designed for the mind and body. Continuous LearningAccess a wealth of resources to grow your career and learn valuable new skills. Invest in Your FutureSecure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly PerksIts not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the BasicsFrom retail discounts to referral incentive awardssmall perks can make a big difference. 
For more information on benefits by country visit https://spgbenefits.com/benefit-summaries. Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, pre-employment training or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here. ---- Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. 
Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf ---- 20 - Professional (EEO-2 Job Categories-United States of America), PDMGDV202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority Ratings - (Strategic Workforce Planning)
Posted 1 week ago
8.0 - 13.0 years
10 - 15 Lacs
Hyderabad
Work from Office
About the Role: Grade Level (for internal use): 10. Title: Senior ETL and Backend Developer (Salesforce). Job Location: Hyderabad, Ahmedabad, Gurgaon, Virtual-India. The Team: We are seeking a skilled Senior ETL and Backend Developer with extensive experience in Informatica and Salesforce. The ideal candidate will be responsible for designing, developing, and maintaining ETL processes and backend systems to ensure seamless data integration and management. The team works in a challenging environment that gives ample opportunities to use innovative ideas to solve complex problems. You will have the opportunity every day to work with people from a wide variety of backgrounds and will be able to develop a close team dynamic with coworkers from around the globe. The Impact: You will make a significant contribution to building solutions for web applications using new front-end technologies and microservices. The work you do will deliver products to build solutions for S&P Global Commodity Insights customers. Responsibilities: ETL Development: Design, develop, and maintain ETL processes using Informatica PowerCenter and other ETL tools. Data Integration: Integrate data from various sources, including databases, APIs, flat files, and cloud storage, into data warehouses or data lakes. Backend Development: Develop and maintain backend systems using relevant programming languages and frameworks. Salesforce Integration: Implement and manage data integration between Salesforce and other systems. Performance Tuning: Optimize ETL processes and backend systems for speed and efficiency. Data Quality: Ensure data quality and integrity through rigorous testing and validation. Monitoring and Maintenance: Continuously monitor ETL processes and backend systems for errors or performance issues and make necessary adjustments. 
Collaboration: Work closely with data architects, data analysts, and business stakeholders to understand data requirements and deliver solutions. Qualifications: Basic Qualifications: Bachelor's/Master's degree in Computer Science, Information Systems or equivalent. A minimum of 8+ years of experience in software engineering and architecture. A minimum of 5+ years of experience in ETL development, backend development, and data integration. A minimum of 3+ years of Salesforce development, administration, and integration. Proficiency in Informatica PowerCenter and other ETL tools. Strong knowledge of SQL and database management systems (e.g., Oracle, SQL Server). Experience with Salesforce integration and administration. Proficiency in backend development languages (e.g., Java, Python, C#). Familiarity with cloud platforms (e.g., AWS, Azure) is a plus. Excellent problem-solving skills and attention to detail. Ability to work independently and as part of a team. Nice to have: GenAI, Java, Spring Boot, Knockout JS, RequireJS, Node.js, Lodash, TypeScript, VSTest/MSTest/nUnit. Preferred Qualifications: Proficient with software development lifecycle (SDLC) methodologies like SAFe, Agile, Test-driven development. Experience with other ETL tools and data integration platforms. Informatica Certified Professional; Salesforce Certified Administrator or Developer. Knowledge of back-end technologies such as C#/.NET, Java or Python. Excellent problem solving, analytical and technical troubleshooting skills. Able to work well individually and with a team. Good work ethic, self-starter, and results oriented. Excellent communication skills are essential, with strong verbal and writing proficiencies. About S&P Global Commodity Insights: At S&P Global Commodity Insights, our complete view of global energy and commodities markets enables our customers to make decisions with conviction and create long-term, sustainable value. 
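The ETL responsibilities listed above (extract from flat files and databases, validate, load into a warehouse) follow a common extract-transform-load pattern. A minimal, hypothetical sketch in Python, using `sqlite3` and an in-memory table as stand-ins for a real warehouse; the `prices` table and CSV layout are illustrative, not from any S&P Global system:

```python
import csv
import io
import sqlite3

def run_etl(csv_text: str, conn: sqlite3.Connection) -> int:
    """Extract rows from a CSV source, transform them, and load a target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS prices (symbol TEXT, price REAL)")
    rows = csv.DictReader(io.StringIO(csv_text))
    # Transform: normalize symbols to upper case and cast price to float,
    # rejecting rows that fail validation (a stand-in for real data-quality rules).
    cleaned = []
    for row in rows:
        try:
            cleaned.append((row["symbol"].strip().upper(), float(row["price"])))
        except (KeyError, ValueError):
            continue
    conn.executemany("INSERT INTO prices VALUES (?, ?)", cleaned)
    conn.commit()
    return len(cleaned)

source = "symbol,price\naapl,189.5\nmsft,not_a_number\ngoog,131.2\n"
conn = sqlite3.connect(":memory:")
loaded = run_etl(source, conn)
print(loaded)  # 2 valid rows loaded; the malformed one is rejected
```

A production tool such as Informatica PowerCenter expresses the same extract, transform, and load steps as mappings and sessions rather than code.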
We're a trusted connector that brings together thought leaders, market participants, governments, and regulators to co-create solutions that lead to progress. Vital to navigating the Energy Transition, S&P Global Commodity Insights' coverage includes oil and gas, power, chemicals, metals, agriculture and shipping. S&P Global Commodity Insights is a division of S&P Global (NYSE: SPGI). S&P Global is the world's foremost provider of credit ratings, benchmarks, analytics and workflow solutions in the global capital, commodity and automotive markets. With every one of our offerings, we help many of the world's leading organizations navigate the economic landscape so they can plan for tomorrow, today. For more information, visit http://www.spglobal.com/commodity-insights.
Posted 1 week ago
6.0 - 9.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Hi all, this is an exciting career opportunity for an Informatica Developer position. The job description: experience in Informatica, IICS, and any cloud technologies. Experience - 6 to 9 years. Location - Pune, Bangalore, Hyderabad, Chennai. Notice period - Immediate/max 10 days. If you are interested, please share your updated profile to jeyaramya.rajendran@zensar.com
Posted 1 week ago
8.0 - 13.0 years
5 - 15 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Job Title: Informatica/Stibo MDM Developer. Responsibilities: A day in the life of an Infoscion: As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized high quality code deliverables, continual knowledge management and adherence to the organizational guidelines and processes. You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you! 
Technical and Professional Requirements: Technology->Data Management - MDM->Informatica MDM; Technology->Data Management - MDM->Stibo MDM. Preferred Skills: Technology->Data Management - MDM->Informatica MDM; Technology->Data Management - MDM->Stibo MDM. Additional Responsibilities: Knowledge of more than one technology. Basics of Architecture and Design fundamentals. Knowledge of Testing tools. Knowledge of agile methodologies. Understanding of Project life cycle activities on development and maintenance projects. Understanding of one or more Estimation methodologies, knowledge of Quality processes. Basics of business domain to understand the business requirements. Analytical abilities, Strong Technical Skills, Good communication skills. Good understanding of the technology and domain. Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles and modelling methods. Awareness of latest technologies and trends. Excellent problem solving, analytical and debugging skills. Educational Requirements: Bachelor of Engineering. Service Line: Data & Analytics Unit. Location - PAN India. Experience - 5+ years
Posted 1 week ago
8.0 - 13.0 years
14 - 18 Lacs
Hyderabad
Work from Office
Overview: Customer Data Stewardship Sr Analyst (IBP). Job Overview: PepsiCo Data Governance Program Overview: PepsiCo is establishing a Data Governance program that will be the custodian of the processes, policies, rules and standards by which the Company will define its most critical data. Enabling this program will: - Define ownership and accountability of our critical data assets to ensure they are effectively managed and maintain integrity throughout PepsiCo's systems - Leverage data as a strategic enterprise asset enabling data-based decision analytics - Improve productivity and efficiency of daily business operations. Position Overview: The Customer Data Steward IBP role is responsible for working within the global data governance team and with their local businesses to maintain alignment to the Enterprise Data Governance's (EDG) processes, rules and standards set to ensure data is fit for purpose. Responsibilities: Primary Accountabilities: - Deliver key elements of Data Discovery, Source Identification, Data Quality Management, and cataloging for program and Customer Domain data. - Ensure data accuracy and adherence to PepsiCo-defined global governance practices, as well as driving acceptance of PepsiCo's enterprise data standards and policies across the various business segments. - Maintain and advise relevant stakeholders on data governance-related matters in the customer domain and with respect to Demand Planning in IBP, with a focus on the business use of the data. - Define Data Quality Rules from source systems, within the Enterprise Data Foundation and through to the end-user systems to enable end-to-end Data Quality management and deliver a seamless user experience. - Advise on various projects and initiatives to ensure that any data-related changes and dependencies are identified, communicated and managed to ensure adherence with the Enterprise Data Governance established standards. 
- Accountable for ensuring that data-centric activities are aligned with the EDG program and leverage applicable data standards, governance processes, and overall best practices. Data Governance Business Standards: - Ensures alignment of the data governance processes and standards with applicable enterprise, business segment, and local data support models. - Champions the single set of Enterprise-level data standards and the repository of key elements pertaining to their in-scope data domain (e.g., Customer, Material, Vendor, Finance, Consumer) and promotes their use throughout the PepsiCo organization. Data Domain Coordination and Collaboration: - Responsible for helping identify the need for sector-level data standards (and above) based on strategic business objectives and the evolution of enterprise-level capabilities and analytical requirements. - Collaborates across the organization to ensure consistent and effective execution of data governance and management principles across PepsiCo's enterprise and analytical systems and data domains. - Accountable for driving organizational acceptance of EDG-established data standards, policies, and definitions and process standards for critical/related enterprise data. - Promotes and champions PepsiCo's Enterprise Data Governance Capability and data management program across the organization. Qualifications: 8+ years of experience working in Customer Operations, Demand Planning, Order to Cash, Commercial Data Governance or Data Management within a global CPG organization.
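Data Quality Rules of the kind the steward defines above are typically expressed as per-field checks that flag records failing validation. A minimal sketch, assuming hypothetical field names and formats (not taken from any PepsiCo system):

```python
import re

# Hypothetical data-quality rules for customer records; field names and
# formats are illustrative only.
RULES = {
    "customer_id": lambda v: bool(re.fullmatch(r"C\d{6}", v or "")),
    "country":     lambda v: v in {"US", "IN", "GB"},
    "email":       lambda v: bool(re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v or "")),
}

def failed_rules(record: dict) -> list[str]:
    """Return the names of the fields that violate their data-quality rule."""
    return [field for field, rule in RULES.items() if not rule(record.get(field))]

good = {"customer_id": "C123456", "country": "US", "email": "a@b.com"}
bad  = {"customer_id": "X1", "country": "FR", "email": "not-an-email"}
print(failed_rules(good))  # []
print(failed_rules(bad))   # ['customer_id', 'country', 'email']
```

Keeping the rules as data rather than scattered `if` statements mirrors how governance platforms catalog rules so they can be reported on and reused across systems.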
Posted 1 week ago
12.0 - 17.0 years
25 - 30 Lacs
Hyderabad
Work from Office
Overview: We are seeking an experienced and strategic leader to join our Business Intelligence & Reporting organization as Deputy Director, BI Governance. This role will lead the design, implementation, and ongoing management of BI governance frameworks across sectors and capability centres. The ideal candidate will bring deep expertise in BI governance, data stewardship, demand management, and stakeholder engagement to ensure a standardized, scalable, and value-driven BI ecosystem across the enterprise. Responsibilities: Key Responsibilities: Governance Leadership: Define and implement the enterprise BI governance strategy, policies, and operating model. Drive consistent governance processes across sectors and global capability centers. Set standards for BI solution lifecycle, metadata management, report rationalization, and data access controls. Stakeholder Management: Serve as a trusted partner to sector business leaders, IT, data stewards, and COEs to ensure alignment with business priorities. Lead governance councils, working groups, and decision forums to drive adoption and compliance. Policy and Compliance: Establish and enforce policies related to report publishing rights, tool usage, naming conventions, and version control. Implement approval and exception processes for BI development outside the COE. Demand and Intake Governance: Lead the governance of BI demand intake and prioritization processes. Ensure transparency and traceability of BI requests and outcomes across business units. Metrics and Continuous Improvement: Define KPIs and dashboards to monitor BI governance maturity and compliance. Identify areas for process optimization and lead continuous improvement efforts. Qualifications: Experience: 12+ years in Business Intelligence, Data Governance, or related roles, with at least 4+ years in a leadership capacity. 
Domain Expertise: Strong understanding of BI platforms (Power BI, Tableau, etc.), data management practices, and governance frameworks. Strategic Mindset: Proven ability to drive change, influence at senior levels, and align governance initiatives with enterprise goals. Operational Excellence: Experience managing cross-functional governance processes and balancing centralized control with local flexibility. Education: Bachelor's degree required; MBA or Master's in Data/Analytics preferred.
Posted 1 week ago
3.0 - 5.0 years
7 - 11 Lacs
Hyderabad
Work from Office
Overview: This position will be part of the North America Beverage organization. This position contributes to the success of the Pepsi Beverages Company by supporting the sales customer team. The TPM Analyst will work with the Market team to understand sales growth and profit objectives (Volume, Net Revenue, Profit, both for PepsiCo and the Customer), build effective relationships with the customer team, and maintain planning models in the Trade Promotion Management (TPM) sales system. In addition, the role will be completing regular CDA and trade spend reconciliation reviews. Finally, the role will have responsibility for ensuring the forecast is accurate and reflects the latest customer planning. Responsibilities: Building and maintaining TPM planning models. Ensuring that aligned events that require on-ticket pricing changes and/or off-ticket adjustments are entered into all applicable systems in a timely manner (TPM). Manage expectations through verbal and written interactions with internal teams. Ensure delivery of accurate and timely data in accordance with agreed service level agreements (SLA). Work across multiple functions to aid in collecting insights for action-oriented cause-of-change analysis. Ability to focus on speed of execution and quality of service delivery rather than achievement of SLAs. Recognize opportunities and take action to improve delivery of work. Implement continued improvements and simplifications of processes, standardization of reporting and optimal use of technology (automation). Create an inclusive and collaborative environment. Qualifications: 3-5 years of experience in Finance/Sales (for L04). Bachelor's in commerce/business administration/marketing or Finance; Master's degree is a plus. Prior Fast Moving Consumer Goods (FMCG) company experience required. Analytical Skills: Ability to understand and translate delivery performance, identify opportunities and risks, and develop and implement detailed, accurate forecasts for the demand/supply team. 
Communication: Strong communication and collaboration skills. Time Management/Organization: Solid capability to manage and prioritize schedule. Support Systems Literacy: Computer literacy, Excel, PowerPoint, Word, ERT, Business Objects and SAP/ERP; willingness and ability to learn and quickly adapt to other internal PepsiCo support software systems. Communication Skills: Communication across all formats (meetings, presentations, conferences, planning sessions, weekly calls, direct communication with the field, etc.). Strong Change Management Skills: Follow up, follow through, accountability, sense of urgency and superior customer service. Ability to provide new ways of approaching situations and developing new efficient solutions. Independent and motivated individual; ability to receive direction and convert it into an action plan with coaching and feedback. Develops strong relationships/partnerships for the overall success of the team and customer. Requires a high level of analytical, critical thinking, and problem-solving skills as well as great attention to detail.
Posted 1 week ago
3.0 - 5.0 years
9 - 14 Lacs
Hyderabad
Work from Office
Overview: We are looking for an Associate Analyst role in our team. The candidate should have 3-5 years of hands-on experience with IDMC (Informatica Cloud Data Integration); 3-5 years of experience with SQL, having worked with databases like Oracle, SQL Server and MySQL; and 1-2 years of experience in Unix shell scripting. The candidate should have knowledge of scheduling tools like Control-M and AutoSys, good communication skills, and should understand requirements and articulate them in implementations. The candidate should have end-to-end project implementation knowledge, having worked on at least 2 projects end to end, and should have knowledge of testing, UAT and TCO activities. Responsibilities: Work independently with business stakeholders to gather and understand integration requirements, with minimal or no support from technical leads. Take full ownership and drive projects from initiation to successful completion. Demonstrate strong communication skills to effectively collaborate with business and technical teams. Be responsible for end-to-end implementation of projects using the Informatica Cloud Data Integration tool. Analyze requirements and develop appropriate SQL queries or Unix shell scripts as needed. Prepare comprehensive test case documents, submit detailed test reports, and support the business during SIT, UAT, and TCO phases. Design mappings, mapping tasks, and taskflows in line with project requirements, with a solid understanding of applicable transformations. Identify and resolve performance issues by tuning underperforming jobs. Qualifications: 3-5 years of hands-on experience with Informatica Cloud Data Integration (IDMC). 3-5 years of experience writing and optimizing SQL queries in Oracle, SQL Server, and MySQL environments. 1-2 years of experience in Unix shell scripting. Familiarity with job scheduling tools such as Control-M and AutoSys. Strong communication skills with the ability to understand and articulate business requirements. 
Experience in at least two end-to-end project implementations. Solid understanding of testing phases, including SIT, UAT, and TCO.
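The SIT/UAT support and test reporting described above often comes down to reconciliation queries between a source and its loaded target. A hedged sketch using Python's `sqlite3` with hypothetical table names; a real implementation would run equivalent SQL against Oracle or SQL Server:

```python
import sqlite3

# A row-count reconciliation check of the kind used during SIT/UAT: compare a
# source table with its loaded target. Table names and data are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (id INTEGER);
    CREATE TABLE tgt_orders (id INTEGER);
    INSERT INTO src_orders VALUES (1), (2), (3);
    INSERT INTO tgt_orders VALUES (1), (2);
""")

src_count = conn.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
tgt_count = conn.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]
# Identify the rows that never reached the target.
missing = conn.execute(
    "SELECT id FROM src_orders EXCEPT SELECT id FROM tgt_orders"
).fetchall()
print(src_count, tgt_count, missing)  # 3 2 [(3,)]
```

The `EXCEPT` set query is what turns a bare count mismatch into an actionable defect report: it names the specific keys that failed to load.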
Posted 1 week ago
6.0 - 8.0 years
7 - 11 Lacs
Bengaluru
Work from Office
Job Title: Salesforce Financial Services Cloud (FSC). Experience: 6-8 Years. Location: Bangalore. Job Description: Financial Services Cloud: Configure and customize Salesforce Financial Services Cloud to meet specific business requirements in the financial services domain. Salesforce Lightning Development: Develop custom Lightning Components (Aura and Lightning Web Components) to enhance user interfaces and improve client interactions. Apex and Mera Programming: Develop and maintain complex Apex code, triggers, and Mera programming for backend processes to support Salesforce customizations. Data Management and Integration: Implement and optimize data integrations with external systems, including ETL processes, APIs, and web services. Custom Workflows & Automation: Build and maintain custom workflows, process builders, flows, and approval processes. Support and Collaboration: Work closely with other developers, administrators, and stakeholders to deliver a seamless experience for end users. Testing and Deployment: Develop unit tests, perform code reviews, and deploy changes using CI/CD tools. Documentation: Maintain clear documentation for solutions, configurations, and development processes. Qualifications: Experience: 5-7 years in Salesforce development, with 4+ years in Financial Services Cloud and Lightning. Technical Skills: Proficient in Apex, Mera programming, and JavaScript. Expertise in Lightning Web Components (LWC) and Aura Components. Experience with Salesforce API integrations (REST, SOAP, and other web services). Strong understanding of Salesforce Security, Data Modeling, and Object-Oriented principles. Certifications: Salesforce Certified Platform Developer I/II; Salesforce Certified Financial Services Cloud Consultant preferred. Other Skills: Strong problem-solving skills and the ability to troubleshoot issues. Excellent communication and collaboration skills. Nice-to-Have: Experience in the financial services industry, particularly with wealth management or banking. 
Familiarity with data migration tools like Data Loader or third-party ETL tools (MuleSoft, Informatica). Knowledge of CI/CD tools like Git, Jenkins, and Bitbucket.
Posted 1 week ago
5.0 - 10.0 years
1 - 5 Lacs
Bengaluru
Work from Office
Job Title: COBOL Developer. Experience: 5-10 Years. Location: Bangalore. Technical Skills: 5+ years of experience as an ETL data developer in Informatica, CONNX, SQL Server, and Oracle DB. Set up CONNX metadata for Eligibility and Billing tables. Design and develop Informatica ETLs to convert IMF data to Certifi format, Eligibility file format and IMF Notes format. Support reconciliation of data across MMC systems and Javelina/Certifi/AWS Archive. Proficiency in designing, developing, and debugging ETL workflows and mappings. Experience with Informatica transformations like Source Qualifier, Expression, Lookup, Aggregator, and Filter. Familiarity with performance optimization techniques in Informatica (e.g., session tuning, partitioning). Experience in extracting data from SQL Server and writing it into delimited file formats like CSV. Knowledge of handling large datasets and ensuring efficient data movement. Familiarity with file-level operations like encoding, delimiter configuration, and column formatting. Skills in implementing robust error handling and logging mechanisms in ETL workflows. Ability to debug issues related to data extraction, transformation, or file generation. Ability to work independently and as part of a team. Non-Technical Skills: Candidate needs to be a good team player. Effective interpersonal, team building and communication skills. Ability to communicate complex technology to a non-tech audience in a simple and precise manner.
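The delimiter and encoding configuration mentioned above can be sketched with Python's `csv` module. The rows below stand in for a SQL Server result set, and all file and column names are illustrative, not from any MMC system:

```python
import csv
import os
import tempfile

def write_delimited(rows, header, path, delimiter=",", encoding="utf-8"):
    """Write rows (e.g. fetched from a database cursor) to a delimited file,
    with the delimiter and encoding configurable as the posting describes."""
    with open(path, "w", newline="", encoding=encoding) as f:
        writer = csv.writer(f, delimiter=delimiter, quoting=csv.QUOTE_MINIMAL)
        writer.writerow(header)
        writer.writerows(rows)

# Rows stand in for a SQL Server result set; field names are hypothetical.
rows = [("1001", "Smith, Jane", "ACTIVE"), ("1002", "Patel", "LAPSED")]
path = os.path.join(tempfile.mkdtemp(), "eligibility.csv")
write_delimited(rows, ["member_id", "name", "status"], path, delimiter="|")
print(open(path, encoding="utf-8").read())
```

Choosing a pipe delimiter here sidesteps quoting for values that contain commas ("Smith, Jane"); with the default comma delimiter, `QUOTE_MINIMAL` would wrap that field in quotes instead.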
Posted 1 week ago
10.0 - 15.0 years
1 - 5 Lacs
Bengaluru
Work from Office
Job Title: SQL, AWS Redshift, PostgreSQL. Experience: 10-15 Years. Location: Bangalore. Skills: SQL, AWS Redshift, PostgreSQL
Posted 1 week ago
6.0 - 8.0 years
1 - 5 Lacs
Bengaluru
Work from Office
Job Title: Informatica Admin (PowerCenter, IDQ, IICS). Experience: 6-8 Years. Location: Bangalore. Technical Skills: Informatica PowerCenter Administration: Install, configure, and maintain Informatica PowerCenter components (Repository Server, Integration Service, Domain Configuration) on Windows servers in AWS. Monitor and optimize PowerCenter performance, including troubleshooting and resolving issues. Informatica Data Quality (IDQ) Administration: Install, configure, and manage Informatica Data Quality (IDQ) components including IDQ Server and Data Quality Services. Ensure effective data profiling, cleansing, and enrichment processes. Informatica Intelligent Cloud Services (IICS) Migration: Plan and execute migration strategies for moving from on-premises Informatica PowerCenter and IDQ to Informatica Intelligent Cloud Services (IICS). Manage and facilitate the migration of ETL processes, data quality rules, and integrations to IICS. Ensure a smooth transition with minimal disruption to ongoing data processes. AWS Cloud Management: Manage Informatica PowerCenter, IDQ, and IICS environments within AWS, using services such as EC2 and S3. Implement AWS security and compliance measures to protect data and applications. Performance Optimization: Optimize the performance of Informatica PowerCenter, IDQ, IICS, and Oracle databases to ensure efficient data processing and high availability. Conduct regular performance tuning and system health checks. Backup & Recovery: Develop and manage backup and recovery processes for Informatica PowerCenter, IDQ, and Oracle databases. Ensure data integrity and implement effective disaster recovery plans. Security & Compliance: Configure and manage security policies, user roles, and permissions for Informatica and Oracle environments. Monitor and enforce data security and compliance standards within AWS and Informatica platforms. 
Troubleshooting & Support: Diagnose and resolve issues related to Informatica PowerCenter, IDQ, IICS, and Oracle databases. Provide technical support and guidance to development and operational teams. Documentation & Reporting: Create and maintain detailed documentation for Informatica PowerCenter, IDQ, and IICS configurations, and Oracle database settings. Generate and review performance and incident reports. Non-Technical Skills: Well-developed analytical and problem-solving skills. Strong oral and written communication skills. Excellent team player, able to work with virtual teams. Ability to learn quickly in a dynamic start-up environment. Able to talk to the client directly and report to the client/onsite team. Flexibility to work different shifts and stretch when needed.
Posted 1 week ago
5.0 - 10.0 years
2 - 5 Lacs
Chennai, Bengaluru
Work from Office
Job Title: Data Engineer. Experience: 5-10 Years. Location: Chennai, Bangalore. Minimum 5+ years of development and design experience in Informatica Big Data Management. Extensive knowledge of Oozie scheduling, HQL, Hive, HDFS (including usage of storage controllers) and data partitioning. Extensive experience working with SQL and NoSQL databases. Linux OS configuration and use, including shell scripting. Good hands-on experience with design patterns and their implementation. Well versed with Agile, DevOps and CI/CD principles (GitHub, Jenkins etc.), and actively involved in solving and troubleshooting issues in a distributed services ecosystem. Familiar with distributed services resiliency and monitoring in a production environment. Experience in designing, building, testing, and implementing security systems, including identifying security design gaps in existing and proposed architectures and recommending changes or enhancements. Responsible for adhering to established policies, following best practices, and possessing an in-depth understanding of exploits and vulnerabilities, resolving issues by taking the appropriate corrective action. Knowledge of designing security controls for source and data transfers, including CRON, ETLs, and JDBC-ODBC scripts. Understand basics of networking including DNS, proxy, ACL, policy, and troubleshooting. High-level knowledge of compliance and regulatory requirements for data, including but not limited to encryption, anonymization, data integrity, and policy control features in large-scale infrastructures. Understand data sensitivity in terms of logging, events and in-memory data storage, such as no card numbers or personally identifiable data in logs. Implements wrapper solutions for new/existing components with no/minimal security controls to ensure compliance with bank standards.
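The HDFS data partitioning mentioned above usually follows the Hive convention of encoding partition keys in directory names (e.g. `dt=2024-01-01`). A minimal local sketch with illustrative record fields; a real pipeline would write to HDFS rather than the local filesystem:

```python
import os
import tempfile
from collections import defaultdict

def write_partitioned(records, base_dir):
    """Write records into Hive-style date partitions:
    base_dir/dt=YYYY-MM-DD/part-0.csv. Record fields are illustrative;
    real layouts depend on the table design."""
    by_date = defaultdict(list)
    for rec in records:
        by_date[rec["dt"]].append(rec)
    for dt, recs in by_date.items():
        part_dir = os.path.join(base_dir, f"dt={dt}")
        os.makedirs(part_dir, exist_ok=True)
        with open(os.path.join(part_dir, "part-0.csv"), "w", encoding="utf-8") as f:
            for rec in recs:
                f.write(f"{rec['id']},{rec['amount']}\n")

base = tempfile.mkdtemp()
write_partitioned(
    [{"dt": "2024-01-01", "id": 1, "amount": 10},
     {"dt": "2024-01-02", "id": 2, "amount": 20},
     {"dt": "2024-01-01", "id": 3, "amount": 30}],
    base,
)
print(sorted(os.listdir(base)))  # ['dt=2024-01-01', 'dt=2024-01-02']
```

The payoff of this layout is partition pruning: a query filtered on `dt` only has to read the matching directories, which is why Hive and HQL queries over large tables lean so heavily on well-chosen partition keys.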
Posted 1 week ago
7.0 - 15.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Role: Data Analyst
Experience: 7-15 Years
Skills: Data warehouse concepts; ETL tool (Informatica is a must); Python (moderate); advanced SQL; data visualization tools (Power BI/MSTR); Finance domain
JD:
- 7-9 years of experience as a Data Analyst, with at least 5 years supporting Finance within the insurance industry.
- Hands-on experience with Vertica/Teradata for querying, performance optimization, and large-scale data analysis.
- Advanced SQL skills; proficiency in Python is a strong plus.
- Proven ability to write detailed source-to-target mapping documents and collaborate with technical teams on data integration.
- Experience working in hybrid onshore-offshore team environments.
- Deep understanding of data modeling concepts and experience working with relational and dimensional models.
- Strong communication skills with the ability to clearly explain technical concepts to non-technical audiences.
- A strong understanding of statistical concepts, probability, accounting standards, financial statements (balance sheet, income statement, cash flow statement), and financial ratios.
- Strong understanding of life insurance products and business processes across the policy lifecycle.
- Quantitative Finance: understanding of financial modeling, risk management, and derivatives.
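As an illustration of the "advanced SQL" this role calls for, the sketch below ranks insurance policies by premium within each product line using a window function. All table and column names are hypothetical, and SQLite stands in for Vertica/Teradata purely for demonstration; the SQL itself is standard:

```python
import sqlite3

# Hypothetical example: rank policies by annual premium within each
# product line using RANK() OVER (PARTITION BY ...). SQLite is used
# only so the snippet is self-contained.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE policy (
        policy_id TEXT, product_line TEXT, annual_premium REAL
    )
""")
conn.executemany(
    "INSERT INTO policy VALUES (?, ?, ?)",
    [("P1", "term_life", 1200.0), ("P2", "term_life", 900.0),
     ("P3", "whole_life", 2500.0), ("P4", "whole_life", 1800.0)],
)

# Highest premium gets rank 1 within its product line.
rows = conn.execute("""
    SELECT product_line, policy_id,
           RANK() OVER (PARTITION BY product_line
                        ORDER BY annual_premium DESC) AS premium_rank
    FROM policy
    ORDER BY product_line, premium_rank
""").fetchall()

for row in rows:
    print(row)
```

The same pattern (partitioned window functions) is what performance-sensitive Vertica and Teradata queries lean on instead of self-joins.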
Posted 1 week ago
50.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
About The Opportunity
Job Type: Permanent
Application Deadline: 30 July 2025
Job Description
Title: Analyst Programmer
Department: WPFH
Location: Gurgaon
Level: 2
Intro
We’re proud to have been helping our clients build better financial futures for over 50 years. How have we achieved this? By working together - and supporting each other - all over the world. So, join our Data team and feel like you’re part of something bigger.
About Your Team
The successful candidate will join the Data team, bringing data integration and distribution experience to work within the Distribution Data and Reporting team and with its consumers. The team is responsible for developing new, and supporting existing, middle-tier integration services and business services, and is committed to driving forward the development of leading-edge solutions.
About Your Role
This role is responsible for liaising with technical leads, business analysts, and various product teams to design, develop, and troubleshoot the ETL jobs for various operational data stores. The role involves understanding the technical design, development, and implementation of ETL and EAI architecture using Informatica/ETL tools. The successful candidate will demonstrate an innovative and enthusiastic approach to technology and problem solving, display good interpersonal skills, show the confidence and ability to interact professionally with people at all levels, and exhibit a high level of ownership within a demanding working environment.
Key Responsibilities
- Work with technical leads, business analysts, and other subject matter experts.
- Understand the data model/design and develop the ETL jobs.
- Bring sound technical knowledge of Informatica, taking ownership of allocated development activities and working independently.
- Apply working knowledge of Oracle databases, taking ownership of the underlying SQL for the ETL jobs (under guidance of the technical leads).
- Provide development estimates.
- Implement standards, procedures, and best practices for data maintenance, reconciliation, and exception management.
- Interact with cross-functional teams to coordinate dependencies and deliverables.
Essential Skills
Technical
- Deep knowledge and experience of the Informatica PowerCenter tool set (minimum 3 years).
- Experience in Snowflake.
- Experience with source control tools.
- Experience with job scheduling tools such as Control-M.
- Experience in UNIX scripting.
- Strong SQL or PL/SQL experience, with a minimum of 2 years’ experience.
- Experience with Data Warehouse, Datamart, and ODS concepts.
- Knowledge of data normalisation/OLAP and Oracle performance optimisation techniques.
- 3+ years’ experience of either Oracle or SQL Server and its utilities, coupled with experience of UNIX/Windows.
Functional
- 3+ years’ experience of working within financial organisations, with broad-based business process, application, and technology architecture experience.
- Experience with data distribution and access concepts, with the ability to utilise these concepts in realising a proper physical model from a conceptual one.
- Business facing, with the ability to work alongside data stewards in systems and the business.
- Strong interpersonal, communication, and client-facing skills.
- Ability to work closely with cross-functional teams.
About You
- B.E./B.Tech/MBA/M.C.A or any other bachelor’s degree.
- At least 3+ years of experience in data integration and distribution.
- Experience in building web services and APIs.
- Knowledge of Agile software development lifecycle methodologies.
Feel Rewarded
For starters, we’ll offer you a comprehensive benefits package.
We’ll value your wellbeing and support your development. And we’ll be as flexible as we can about where and when you work – finding a balance that works for all of us. It’s all part of our commitment to making you feel motivated by the work you do and happy to be part of our team. For more about our work, our approach to dynamic working and how you could build your future here, visit careers.fidelityinternational.com.
Posted 1 week ago
0 years
0 Lacs
Hyderābād
On-site
Overview: At PepsiCo, we’re accelerating our digital transformation by building the next generation of intelligent, connected, and agile systems. As part of this journey, we're seeking a CTO Assoc Manager to drive the vision, design, and implementation of enterprise-wide integration strategies. This role is critical in ensuring our systems and data flows are connected, secure, and future-ready, supporting both global scale and localized agility. You’ll partner closely with cross-functional stakeholders, product teams, and enterprise architects to define scalable integration blueprints that align with business priorities and our evolving IT roadmap. If you have a passion for modern architecture, cloud-native technologies, and unlocking value through connected systems, this is your opportunity to make a global impact.
Responsibilities:
Integration Strategy & Architecture
- Define the enterprise integration strategy, aligning with business goals and IT roadmaps.
- Design scalable, resilient, and secure integration architectures using industry best practices.
- Develop API-first and event-driven integration strategies.
- Establish governance frameworks, integration patterns, and best practices.
Technology Selection & Implementation
- Evaluate and recommend the right integration technologies, such as:
  - Middleware & ESB: TIBCO, MuleSoft, WSO2, IBM Integration Bus
  - Event Streaming & Messaging: Apache Kafka, RabbitMQ, IBM MQ
  - API Management: Apigee, Kong, AWS API Gateway, MuleSoft
  - ETL & Data Integration: Informatica, Talend, Apache NiFi
  - iPaaS (Cloud Integration): Dell Boomi, Azure Logic Apps, Workato
- Lead the implementation and configuration of these platforms.
API & Microservices Architecture
- Design and oversee API-led integration strategies.
- Implement RESTful APIs, GraphQL, and gRPC for real-time and batch integrations.
- Define API security standards (OAuth, JWT, OpenID Connect, API Gateway).
- Establish API versioning, governance, and lifecycle management.
Enterprise Messaging & Event-Driven Architecture (EDA)
- Design real-time, event-driven architectures using:
  - Apache Kafka for streaming and pub/sub messaging
  - RabbitMQ, IBM MQ, TIBCO EMS for message queuing
  - Event-driven microservices using Kafka Streams, Flink, or Spark Streaming
- Ensure event sourcing, CQRS, and eventual consistency in distributed systems.
Cloud & Hybrid Integration
- Develop hybrid integration strategies across on-premises, cloud, and SaaS applications.
- Utilize cloud-native integration tools like AWS Step Functions, Azure Event Grid, and Google Cloud Pub/Sub.
- Integrate enterprise applications (ERP, CRM, HRMS) across SAP, Oracle, Salesforce, and Workday.
Security & Compliance
- Ensure secure integration practices, including encryption, authentication, and authorization.
- Implement zero-trust security models for APIs and data flows.
- Maintain compliance with industry regulations (GDPR, HIPAA, SOC 2).
Governance, Monitoring & Optimization
- Establish enterprise integration governance frameworks.
- Use observability tools for real-time monitoring (Datadog, Splunk, New Relic).
- Optimize integration performance and troubleshoot bottlenecks.
Leadership & Collaboration
- Collaborate with business and IT stakeholders to understand integration requirements.
- Work with DevOps and cloud teams to ensure CI/CD pipelines for integration.
- Provide technical guidance to developers, architects, and integration engineers.
Qualifications:
- Extensive experience designing and executing enterprise-grade integration architectures.
- Hands-on expertise with integration tools such as Informatica, WebLogic, TIBCO, and Apache Kafka.
- Proven track record in API management, microservices architecture, and event-driven systems.
- Strong command of cloud integration patterns and hybrid deployment models.
- Deep understanding of security protocols and regulatory compliance in large-scale environments.
- Effective communicator and leader with the ability to influence across technical and non-technical audiences.
- Experience in a global, matrixed enterprise is a strong plus.
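The event sourcing idea named in the EDA responsibilities above can be shown in a few lines of plain Python: state is never mutated directly but rebuilt by replaying an append-only event log. This is a minimal, hypothetical sketch of the concept only; a production system would put the log on a platform like Kafka:

```python
from dataclasses import dataclass

# Minimal event-sourcing sketch (illustrative, invented event names).
# Current state is a pure fold over the immutable event log.
@dataclass
class Account:
    balance: float = 0.0

def apply_event(state: Account, event: dict) -> Account:
    """Pure function: fold one event into the current state."""
    if event["type"] == "deposited":
        return Account(state.balance + event["amount"])
    if event["type"] == "withdrawn":
        return Account(state.balance - event["amount"])
    return state  # unknown event types are ignored (forward compatibility)

def replay(events: list) -> Account:
    """Rebuild current state by replaying the full event log."""
    state = Account()
    for event in events:
        state = apply_event(state, event)
    return state

log = [
    {"type": "deposited", "amount": 100.0},
    {"type": "withdrawn", "amount": 30.0},
    {"type": "deposited", "amount": 5.0},
]
print(replay(log).balance)  # 75.0
```

Because the log is the source of truth, separate read models can be projected from it independently of writes, which is the essence of CQRS.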
Posted 1 week ago
4.0 years
0 Lacs
Hyderābād
On-site
At Roche you can show up as yourself, embraced for the unique qualities you bring. Our culture encourages personal expression, open dialogue, and genuine connections, where you are valued, accepted and respected for who you are, allowing you to thrive both personally and professionally. This is how we aim to prevent, stop and cure diseases and ensure everyone has access to healthcare today and for generations to come. Join Roche, where every voice matters.
The Position
Senior Analyst - Technology, Roche Services & Solutions India, Hyderabad / Chennai
A healthier future. It’s what drives us to innovate. To continuously advance science and ensure everyone has access to the healthcare they need today and for generations to come. Creating a world where we all have more time with the people we love. That’s what makes us Roche.
Roche has established the Global Analytics and Technology Center of Excellence (GATE) to drive analytics- and technology-driven solutions by partnering with Roche affiliates across the globe.
Your Opportunity:
The Senior Tech Analyst will work with the US-based Master Data Management (MDM) team of Roche and support data stewardship and MDM Operations activities. In this role, you will be expected to work with stakeholders across various business functions, the MDM team, and the GATE India & Costa Rica teams. Your role will include providing support in developing insights and strategies that optimize data-related processes, contributing to informed decision-making.
- Perform data stewardship activities and process Data Change Requests related to Health Care master data.
- Conduct matching and merging of master records.
- Ensure newly on-boarded data sets are accurate, complete, and adhere to currently defined data standards.
- Perform analysis and required maintenance of master data, including HCPs, HCOs, Payer/Managed Care, and Affiliations.
- Help devise an adaptable governance methodology to enable efficiency and effectiveness in data operations as it relates to MDM, data integration, taxonomy, and reporting & analytics.
- Foster effective communication and collaboration among cross-functional teams to understand data needs and deliver relevant information.
- Comprehend stakeholder requirements, prioritize tasks, and effectively manage day-to-day responsibilities, including liaising with MDM teams and coordinating with the GATE team.
- Present findings and recommendations to senior management on various initiatives and process improvements.
Who You Are:
- 4+ years of experience in a Data Steward/Data Analyst role, particularly in MDM Operations and Data Stewardship or related functions, preferably in the Pharma/Life Science/Biotech domain.
- Experience working on Reltio MDM Hub configurations: data modeling and data mappings, data validation, match and merge rules, building and customizing API services, parent/child relationships, workflows, and LCA.
- Knowledge of MDM systems like Informatica MDM/Reltio; Pharma CRM systems like Salesforce, OCE, Veeva CRM; cloud platforms like AWS/Google/Azure is a strong plus.
- Strong proficiency in Excel and SQL, along with knowledge of a programming language such as Python or PySpark.
- Excellent verbal and written communication skills, capable of interacting with senior leadership and stakeholders effectively.
- Proven ability to work independently, make decisions with minimal supervision, and prioritize tasks effectively.
- Ability to manage multiple priorities and meet deadlines in a fast-paced environment.
- Bachelor’s or Master’s degree (computer science, engineering, or other technical disciplines); a background in Pharma is a plus.
Who We Are
A healthier future drives us to innovate. Together, more than 100,000 employees across the globe are dedicated to advancing science, ensuring everyone has access to healthcare today and for generations to come.
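The "match and merge" step at the heart of MDM stewardship can be sketched with nothing but the standard library. This is a hypothetical toy, not how Informatica MDM or Reltio actually implement their configurable match rules and survivorship: two healthcare-provider records are matched on name similarity and merged into a golden record, with an invented survivorship rule (keep the longest non-empty value per field):

```python
from difflib import SequenceMatcher

# Toy match-and-merge sketch; record fields and thresholds are invented.
def similarity(a: str, b: str) -> float:
    """Fuzzy string similarity in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def merge(records: list) -> dict:
    """Survivorship: for each field, keep the longest non-empty value."""
    golden = {}
    for rec in records:
        for key, value in rec.items():
            if value and len(str(value)) > len(str(golden.get(key, ""))):
                golden[key] = value
    return golden

hcp_a = {"name": "Dr. Jane Smith", "npi": "", "city": "Boston"}
hcp_b = {"name": "Jane Smith", "npi": "1234567890", "city": ""}

# Match rule: merge only when names are similar enough.
if similarity(hcp_a["name"], hcp_b["name"]) > 0.8:
    golden = merge([hcp_a, hcp_b])
    print(golden)
```

Real hubs add blocking keys, weighted multi-attribute match rules, and configurable survivorship per source system, but the fold above is the underlying idea.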
Our efforts result in more than 26 million people treated with our medicines and over 30 billion tests conducted using our Diagnostics products. We empower each other to explore new possibilities, foster creativity, and keep our ambitions high, so we can deliver life-changing healthcare solutions that make a global impact. Let’s build a healthier future, together. Roche is an Equal Opportunity Employer.
Posted 1 week ago
0 years
0 Lacs
Hyderābād
On-site
Req ID: 333186
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Data Senior Engineer to join our team in Hyderabad, Telangana (IN-TG), India (IN).
Job Duties:
- Recent experience and clear responsibilities in an ETL/extracts/data engineering developer capacity.
- Experience with Informatica PowerCenter development.
- High level of proficiency with the Unix (AIX and Linux preferred) command line.
- High level of comfort with Oracle SQL.
Minimum Skills Required:
- Experience with Informatica PowerCenter development.
- Good understanding of the Unix (Linux preferred) command line.
- High level of comfort with Oracle SQL.
About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com. NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us.
This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
Posted 1 week ago
50.0 years
5 - 8 Lacs
Gurgaon
On-site
About the Opportunity
Job Type: Permanent
Application Deadline: 29 July 2025
Job Description
Title: Senior Analyst Programmer
Department: FIL India Technology - GPS
Location: Gurugram
Level: Software Engineer 3
Fidelity International offers investment solutions and services and retirement expertise to more than 2.52 million customers globally. As a privately-held, purpose-driven company with a 50-year heritage, we think generationally and invest for the long term. Operating in more than 25 locations and with $750.2 billion in total assets, our clients range from central banks, sovereign wealth funds, large corporates, financial institutions, insurers and wealth managers, to private individuals. We’re proud to have been helping our clients build better financial futures for over 50 years. How have we achieved this? By working together - and supporting each other - all over the world. So, join our GPS Data Platform team and feel like you’re part of something bigger.
About your team
GPS Lakehouse & Reporting is a team of around 100 people whose role is to develop and maintain the data warehouse and reporting platforms that we use to administer the pensions and investments of our workplace and retail customers across the world. In doing this, we are critical to the delivery of our core product and value proposition to these clients, today and in the future.
About your role
The Technology function provides IT services to the Fidelity International business, globally. These include the development and support of business applications that underpin our revenue, operational, compliance, finance, legal, customer service and marketing functions. The broader technology organisation incorporates infrastructure services that the firm relies on to operate on a day-to-day basis, including data centre, networks, proximity services, security, voice, incident management and remediation.
Below are the key responsibilities:
- Work with delivery managers, system/business analysts and other subject matter experts to understand the requirements.
- Implement Informatica mappings between inbound and target data models.
- Produce technical specifications and unit test cases for the interfaces under development.
- Provide support through all phases of implementation.
- Adhere to the source code control policies of the project.
- Implement and use appropriate change management processes.
- Develop the capability to implement business intelligence tools.
About you
Must-have technical skills:
- Strong understanding of the standard ETL tool Informatica PowerCenter, with a minimum of 3 years’ experience.
- Strong Oracle SQL/PLSQL and stored procedure experience.
- Knowledge of DevOps and configuration management tools like SVN, and CI tools.
- Experience of using job scheduling tools (Control-M preferred).
- Experience in UNIX or Python scripting.
Good-to-have technical skills:
- Familiarity with Data Warehouse, Data Mart and ODS concepts.
- Exposure to Agile (Scrum) development practices.
- Knowledge of data normalisation and Oracle performance optimisation techniques.
- Cloud technologies like AWS and Snowflake.
Feel rewarded
For starters, we’ll offer you a comprehensive benefits package. We’ll value your wellbeing and support your development. And we’ll be as flexible as we can about where and when you work – finding a balance that works for all of us. It’s all part of our commitment to making you feel motivated by the work you do and happy to be part of our team. For more about our work, our approach to dynamic working and how you could build your future here, visit careers.fidelityinternational.com.
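An ETL mapping between an inbound staging table and a target data model, of the kind an Informatica PowerCenter mapping expresses graphically, boils down to a typed, transformed SELECT-INSERT. The sketch below shows the idea as plain SQL; every table and column name is invented, and SQLite is used only so the example is self-contained:

```python
import sqlite3

# Hypothetical staging-to-target mapping: trim/case-normalise names and
# cast a string amount to a numeric type while loading the target table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_member (member_id TEXT, first_name TEXT,
                             last_name TEXT, contribution TEXT);
    CREATE TABLE dim_member (member_key INTEGER PRIMARY KEY,
                             member_id TEXT, full_name TEXT,
                             contribution REAL);
""")
conn.executemany("INSERT INTO stg_member VALUES (?, ?, ?, ?)",
                 [("M1", "ada", "lovelace", "150.50"),
                  ("M2", "alan", "turing", "99.00")])

# The "mapping": capitalise names, concatenate, cast the amount.
conn.execute("""
    INSERT INTO dim_member (member_id, full_name, contribution)
    SELECT member_id,
           UPPER(SUBSTR(first_name, 1, 1)) || SUBSTR(first_name, 2)
             || ' ' ||
           UPPER(SUBSTR(last_name, 1, 1)) || SUBSTR(last_name, 2),
           CAST(contribution AS REAL)
    FROM stg_member
""")
rows = conn.execute("SELECT member_id, full_name, contribution "
                    "FROM dim_member ORDER BY member_id").fetchall()
print(rows)
```

In PowerCenter the same logic would live in expression and lookup transformations between source and target definitions; the unit test cases the posting mentions would assert exactly this kind of row-level expectation.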
Posted 1 week ago