8.0 - 13.0 years
20 - 25 Lacs
Bengaluru
Work from Office
Design, develop, and maintain relational and non-relational database systems. Define analytical architectures including data lakes, lakehouse, data mesh, and medallion patterns. Understand and analyze business requirements and translate them into analytical or relational database designs, including designs for non-SQL datastores. Optimize SQL queries, stored procedures, and database performance. Create and maintain ETL processes for data integration from various sources. Work closely with application teams to design database schemas and support integration. Monitor, troubleshoot, and resolve database issues related to performance, storage, and replication. Implement data security, backup, recovery, and disaster recovery procedures. Ensure data integrity and enforce best practices in database development. Participate in code reviews and mentor junior developers. Collaborate with business and analytics teams on reporting and data warehousing needs. Must Have: Strong expertise in SQL and PL/SQL. Hands-on experience with at least one RDBMS: Snowflake, ADLS, BigQuery, SQL Server, Oracle, PostgreSQL, or MySQL. Experience with at least one NoSQL database: MongoDB, Cassandra, or Redis. ETL development tools: SSIS, Informatica, Talend, or equivalent (Good to have). Experience in performance tuning and optimization. Database design and modeling tools: Erwin, dbForge, or similar. Cloud platforms: design and development experience on AWS, GCP, or Azure (Must have). Understanding of indexing, partitioning, replication, and sharding. Knowledge of CI/CD pipelines and DevOps practices for database deployments (Nice to have). Cloud certification (Nice to have). Experience with Big Data technologies (Hadoop, Spark). Experience working in Agile/Scrum environments (Must have). Knowledge of star schema and snowflake schema (Must have). Qualifications: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. 8+ years of relevant experience in database design and development.
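The indexing and query-tuning skills the posting above asks for can be illustrated with a minimal sketch. This uses Python's built-in sqlite3 as a stand-in for a production RDBMS; the table and column names are hypothetical, and real engines (Oracle, SQL Server, etc.) have their own plan-inspection tools.

```python
import sqlite3

# In-memory database as a stand-in for a production RDBMS.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
                [(i % 100, i * 1.5) for i in range(1000)])

# Without an index, a lookup by customer_id scans the whole table.
plan_before = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42").fetchall()

# Adding an index lets the engine seek directly to the matching rows.
cur.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
plan_after = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42").fetchall()

print(plan_before[0][3])  # a full-table SCAN
print(plan_after[0][3])   # a SEARCH using idx_orders_customer
```

The same before/after plan comparison is the everyday workflow behind "performance tuning": check the plan, add or adjust an index or partition, and confirm the engine switched from scanning to seeking.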
Posted 1 week ago
3.0 - 6.0 years
5 - 9 Lacs
Bengaluru
Work from Office
1. Develop and maintain plans and timelines for all work within a team, including project and operational deliverables. 2. Collaborate with stakeholders to define the scope, objectives, and deliverables for team efforts. 3. Facilitate agile ceremonies such as sprint planning, daily stand-ups, sprint reviews, and retrospectives. 4. Maintain backlogs and analyse upcoming work for resource and schedule needs. 5. Track and maintain status in work tracking tools (e.g., Jira) to ensure accurate representation of progress. Good to Have: 1. PMI Agile Certified Practitioner (PMI-ACP), Certified SAFe (Scrum Master or Advanced Scrum Master), Professional Scrum Master (PSM), or Project Management Professional (PMP) certification is a plus. 2. Excellent English communication (verbal/written) skills, with an ability to manage internal and external relationships up to senior-level management. 3. Strong documentation skills - detailed tracking, metrics-based reporting and roll-ups, etc. Primary Skills: Scrum Master, JIRA, Agile Project Management, Process improvements, Workflow, Continuous improvement, Reporting
Posted 1 week ago
2.0 - 6.0 years
3 - 7 Lacs
Pune, Bengaluru
Work from Office
Your Role: 3-5 years of hands-on experience with BigID or similar data discovery/classification tools (e.g., Varonis, Informatica, MIP). Strong understanding of data governance, data privacy, and compliance regulations (GDPR, CCPA, SOX, SEBI, etc.). Experience working with structured data in RDBMS (Oracle, MS SQL Server, PostgreSQL) and unstructured data sources (file servers, SharePoint, cloud repositories). Proficiency in configuring BigID policies, classifiers, data flows, and discovery and classification operations modules. Experience integrating BigID with security tools like Microsoft Information Protection, DLP solutions, or SIEM platforms. Familiarity with metadata management, data catalogs, and data lineage concepts. Your Profile: Design, implement, and manage data discovery and classification workflows using the BigID platform for both structured (e.g., databases, data warehouses) and unstructured data (e.g., file shares, SharePoint, email). Configure and maintain BigID connectors to integrate with enterprise data sources including databases (Oracle, SQL Server, MySQL), cloud storage (AWS S3, Azure Blob), collaboration platforms (O365, Google Drive), and more. Define and customize classification policies, sensitivity labels, and data tagging rules to align with organizational data governance and compliance frameworks (e.g., GDPR, CCPA, SEBI, DORA). Collaborate with data owners, security teams, and compliance stakeholders to identify sensitive data types (PII, PCI, etc.) and apply appropriate classification and protection strategies. Integrate BigID with Microsoft Information Protection (MIP) and other DLP platforms or IRM tools to automate labeling and enforcement policies. Monitor discovery scans and classification jobs, troubleshoot issues, and optimize performance. Generate and present reports and dashboards to stakeholders, highlighting data classification coverage, risk areas, and remediation plans.
What You'll Love About Working Here: We recognize the significance of flexible work arrangements to provide support, be it remote work or flexible work hours, and you will get an environment that supports a healthy work-life balance. At the heart of our mission is your career growth. Our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies, such as Unix and SQL.
Posted 1 week ago
8.0 - 10.0 years
25 - 30 Lacs
Bengaluru
Work from Office
We are looking for an experienced Data Modelling professional, proficient in tools such as Erwin and ER/Studio. A strong understanding of Azure Databricks, Snowflake/Redshift, SAP HANA, and advanced SQL is required. Prior experience in leading teams is also preferred.
Posted 1 week ago
8.0 - 12.0 years
11 - 13 Lacs
Bengaluru
Work from Office
Wissen Technology is Hiring for DB2 Developer. About Wissen Technology: Wissen Technology is a globally recognized organization known for building solid technology teams, working with major financial institutions, and delivering high-quality solutions in IT services. With a strong presence in the financial industry, we provide cutting-edge solutions to address complex business challenges. Job Summary: DB2 Developer/DBA. We are looking for an experienced DB2 developer/DBA who has worked on a critical application with a large database. The role requires the candidate to understand the landscape of the application and the data, including its topology across the online data store and its data warehousing counterparts. The challenges we strive to solve include scalability and performance when dealing with very large data sets and multiple data sources. The role involves collaborating with global team members and provides a unique opportunity to network with a diverse group of people. The candidate who fills this database developer role in our team will be involved in building and creating solutions from the requirements stage through deployment. A successful candidate is self-motivated, innovative, thinks outside the box, has excellent communication skills, and can work with clients and stakeholders from both the business and technology with ease. Experience: 8-12 years. Location: Mumbai. Required Skills: Expertise in writing complex data retrieval queries, stored procedures, and performance tuning. Experience in migrating a large-scale database from Sybase to a new tech stack. Expertise in relational databases (Sybase, Azure SQL Server, DB2) and NoSQL databases. Strong knowledge of Linux shell scripting. Working knowledge of Python programming. Working knowledge of Informatica. Good knowledge of AutoSys or a similar scheduling tool. Detail oriented, with the ability to turn deliverables around quickly with a high degree of accuracy.
Posted 1 week ago
2.0 - 7.0 years
7 - 11 Lacs
Kota, Jaipur, Bikaner
Work from Office
We are seeking a Data Consultant (a Sr. Consultant role is also available and will be determined based on experience) to manage large data migration projects with non-profit organizations and educational institutions. Successful candidates will be ETL experts (i.e., Jitterbit, Informatica, Boomi, SSIS, MuleSoft) able to demonstrate expert-level skills and knowledge in designing and implementing data migrations. Experience with Salesforce, NPSP, and EDA is a plus. Candidates should have a strong understanding of relational databases, SOQL, and SQL. Because much of the work of this role is client-facing, communication skills and a genuine interest in helping people are very important. The ability to collaborate with peers and clients while building consensus is also critical for success. Key Responsibilities: Serve as the lead data resource on Salesforce implementation projects. Create and iterate on data conversion maps. Evaluate, design, and implement data migration solutions for non-profit and higher education clients. Plan all aspects of iterative data migration and coordinate with the project manager to align with the overall project plan. Maintain up-to-date, accurate knowledge of NPSP and/or EDA. Assess client business requirements to design architecturally sound solutions. Deliver on project assignments on time and within budget. Perpetually contribute to the betterment of Cloud for Good. Show a commitment to customer satisfaction. Provide informal mentorship and facilitate knowledge-sharing and growth opportunities. Be prepared to work in shifts to accommodate our global customer base across different time zones, including AMER, APAC, and EMEA.
Requirements & Qualifications: Experience transforming and migrating data to Salesforce via data loading tools and ETL tools (e.g., Jitterbit, Informatica, Boomi, MuleSoft), including creating initial data maps. At least 2+ years of consulting experience. Strong understanding of relational database architecture, SOQL, and SQL. Understanding of agile methodology. Salesforce.com Administrator certification (if not held, will be required to complete within the onboarding period). Strong Salesforce configuration knowledge is a plus. Familiarity working with nonprofits and/or higher education institutions. Strong consulting skills, communication, and teamwork/collaboration. Proven track record of continuously improving organizations. Preferred Skills: Strong time management skills. Strong written and verbal communication skills. Intellectual curiosity. Passion for continuous learning. Mentoring skills. Presentation skills. Signs You May Be a Great Fit - Impact: Play a pivotal role in shaping a rapidly growing venture studio. Culture: Thrive in a collaborative, innovative environment that values creativity and ownership. Growth: Access professional development opportunities and mentorship. Benefits: Competitive salary, health/wellness packages, and flexible work options.
Posted 1 week ago
10.0 - 15.0 years
37 - 45 Lacs
Pune
Work from Office
Position Summary: The engineer's role is to support external data transmissions, operations, scheduling, and middleware transmissions. Experience in Windows and Linux environments and knowledge of Informatica MFT & Data Exchange tools is required. Should be able to handle day-to-day customer transmissions and Informatica MFT/DX activities. Job Responsibilities: Design and implement complex integration solutions through collaboration with engineers, application teams, and operations teams across the global enterprise. Provide technical support to application developers when required; this includes promoting use of best practices, ensuring standardization across applications, and troubleshooting. Able to create new setups and support existing transmissions. Able to diagnose and troubleshoot transmission and connection issues. Experience in Windows administration; expertise in IBM Workload Scheduler is good to have. Hands-on experience with tools like IIS, Informatica MFT & DX console, Splunk, and IBM Workload Scheduler. Responsibilities also include planning, engineering, and implementation of new transmissions as well as migration of setups. The role will participate in the evaluation and recommendation of new products and technologies. The role will also represent the domain in relevant automation and value innovation efforts. Technical leadership, with the ability to think strategically and effectively communicate solutions to a variety of stakeholders. Able to debug production issues by analyzing logs directly and using tools like Splunk.
Begin tackling organizational impediments. Comfortable opposing anti-patterns. Strong collaboration with team members. Learn new technologies based on demand and help team members by coaching and assisting. Willing to work in rotational shifts. Good communication skills, with the ability to communicate clearly and effectively. Knowledge, Skills and Abilities - Education: Bachelor's degree in Computer Science, Information Systems, or a related field. Experience: 10+ years of total experience and at least 7+ years of experience in designing and implementing complex integration solutions through collaboration with engineers, application, and operations teams. Create new setups and support existing transmissions. Experience with tools like IIS, Informatica MFT & DX console, Splunk, and IBM Workload Scheduler. Resolving production issues by analyzing logs directly and using tools like Splunk. SSH/SSL/Tectia, Microsoft IIS, IBM Connect:Direct, IBM Sterling, Informatica MFT, operating system knowledge (Linux/Windows/AIX), troubleshooting, Azure DevOps pipeline knowledge, mainframe z/OS knowledge, OpenShift, enterprise scheduling knowledge (Maestro). Good to Have: Python and/or PowerShell, Agile SAFe for Teams, Ansible (automation), Elastic.
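A routine task in managed file transfer (MFT) operations like those described above is confirming that a transmitted file arrived intact. A minimal, tool-agnostic sketch of that check in Python (file names and the copy step are hypothetical stand-ins for a real transmission):

```python
import hashlib
import os
import shutil
import tempfile

def sha256_of(path, chunk_size=65536):
    """Stream a file through SHA-256 so large transfers need not fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Simulate a transmission: write a source file, "transmit" it by copying.
workdir = tempfile.mkdtemp()
src = os.path.join(workdir, "outbound.dat")   # hypothetical outbound file
dst = os.path.join(workdir, "received.dat")   # hypothetical received file
with open(src, "wb") as f:
    f.write(b"payload" * 10000)
shutil.copyfile(src, dst)

# The transfer is considered verified when source and destination digests match.
transfer_ok = sha256_of(src) == sha256_of(dst)
print(transfer_ok)
shutil.rmtree(workdir)
```

Real MFT platforms (Informatica MFT, Connect:Direct, Sterling) provide their own integrity checks; this just shows the underlying idea an operations engineer reaches for when diagnosing a suspect transmission manually.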
Posted 1 week ago
3.0 - 8.0 years
14 - 16 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Junior consulting position for those who are continuing to develop their expertise. Receives general instruction on routine work and detailed instruction on new projects or assignments. Career Level - IC1. As a member of a project team, follows standard practices and procedures to analyze situations/data and provide quality work products to deliver functional and technical solutions on applications and technology installations. Work involves some problem solving with assistance and guidance in understanding and applying relevant Oracle methodologies and practices. Implements Oracle products and technology in various industries to meet customer specifications.
Posted 1 week ago
3.0 - 7.0 years
13 - 17 Lacs
Bengaluru
Work from Office
We are looking for an experienced and visionary BI Architect to lead the design and evolution of our Business Intelligence architecture. In this strategic role, you'll work closely with cross-functional leaders to build scalable, high-performance data solutions that empower smarter, faster decisions across the organization. If you're passionate about driving impact through architecture and innovation, this is your opportunity to make a lasting difference.
Posted 1 week ago
4.0 - 9.0 years
4 - 7 Lacs
Hyderabad
Work from Office
At Roche you can show up as yourself, embraced for the unique qualities you bring. Our culture encourages personal expression, open dialogue, and genuine connections, where you are valued, accepted and respected for who you are, allowing you to thrive both personally and professionally. This is how we aim to prevent, stop and cure diseases and ensure everyone has access to healthcare today and for generations to come. Join Roche, where every voice matters. The Position: Senior Analyst - Technology, Roche Services & Solutions India, Hyderabad / Chennai. A healthier future. It's what drives us to innovate. To continuously advance science and ensure everyone has access to the healthcare they need today and for generations to come. Creating a world where we all have more time with the people we love. That's what makes us Roche. Roche has established the Global Analytics and Technology Center of Excellence (GATE) to drive analytics- and technology-driven solutions by partnering with Roche affiliates across the globe. Your Opportunity: The Senior Tech Analyst will work with the US-based Master Data Management (MDM) team of Roche and support data stewardship and MDM operations-related activities. In this role, you will be expected to work with stakeholders across various business functions, the MDM team, and the GATE India & Costa Rica teams. Your role will include providing support in developing insights and strategies that optimize data-related processes, contributing to informed decision-making. Perform data stewardship activities and process data change requests related to health care master data. Conduct matching and merging of master records. Ensure newly onboarded data sets are accurate, complete, and adhere to currently defined data standards. Perform analysis and required maintenance of the master data, including HCPs, HCOs, Payer / Managed Care, and Affiliations.
Help devise an adaptable governance methodology to enable efficiency and effectiveness in data operations, as it relates to MDM, data integration, taxonomy, and reporting & analytics. Foster effective communication and collaboration among cross-functional teams to understand data needs and deliver relevant information. Comprehend stakeholder requirements, prioritize tasks, and effectively manage day-to-day responsibilities, including liaising with MDM teams and coordinating with the GATE team. Present findings and recommendations to senior management on various initiatives and process improvements. Who You Are: 4+ years of experience in a Data Steward / Data Analyst role, particularly in MDM operations and data stewardship or related functions, preferably in the Pharma / Life Science / Biotech domain. Experience working on Reltio MDM Hub configurations - data modeling & data mappings, data validation, match and merge rules, building and customizing API services, parent/child relationships, workflows, and LCA. Knowledge of MDM systems like Informatica MDM / Reltio; Pharma CRM systems like Salesforce, OCE, Veeva CRM; cloud platforms like AWS / Google / Azure is a strong plus. Strong proficiency in Excel and SQL, along with knowledge of a programming language such as Python or PySpark. Excellent verbal and written communication skills, capable of interacting with senior leadership and stakeholders effectively. Proven ability to work independently, make decisions with minimal supervision, and prioritize tasks effectively. Ability to manage multiple priorities and meet deadlines in a fast-paced environment. Bachelor's or Master's degree in computer science, engineering, or another technical discipline; a background in Pharma is a plus. Who We Are: A healthier future drives us to innovate. Together, more than 100,000 employees across the globe are dedicated to advancing science, ensuring everyone has access to healthcare today and for generations to come.
Our efforts result in more than 26 million people treated with our medicines and over 30 billion tests conducted using our Diagnostics products. We empower each other to explore new possibilities, foster creativity, and keep our ambitions high, so we can deliver life-changing healthcare solutions that make a global impact. Let's build a healthier future, together. Roche is an Equal Opportunity Employer.
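The match-and-merge work central to the MDM role above follows a recognizable pattern: compute a match key for each record, group duplicates, then apply a survivorship rule to produce a "golden" record. A toy sketch of that pattern (the records, the match key, and the first-non-empty survivorship rule are all simplified assumptions; real platforms like Reltio or Informatica MDM use configurable probabilistic matching):

```python
from collections import defaultdict

def normalize(name):
    """Crude match key: lowercase, keep only letters and digits."""
    return "".join(ch for ch in name.lower() if ch.isalnum())

def merge(records):
    """Survivorship rule: for each field, keep the first non-empty value seen."""
    merged = {}
    for rec in records:
        for field, value in rec.items():
            if value and not merged.get(field):
                merged[field] = value
    return merged

# Hypothetical HCP (health care professional) records from two source systems.
records = [
    {"name": "Dr. Jane Smith", "npi": "1234567890", "city": ""},
    {"name": "JANE SMITH", "npi": "", "city": "Hyderabad"},
    {"name": "Ravi Kumar", "npi": "9876543210", "city": "Chennai"},
]

groups = defaultdict(list)
for rec in records:
    # Strip a leading title before computing the match key.
    key = normalize(rec["name"].replace("Dr.", ""))
    groups[key].append(rec)

golden = [merge(g) for g in groups.values()]
print(len(golden))  # two golden records: the Jane Smith duplicates merged
```

The merged Jane Smith record keeps the NPI from one source and the city from the other, which is exactly the completeness gain a data steward is after.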
Posted 1 week ago
5.0 - 10.0 years
6 - 11 Lacs
Bengaluru
Work from Office
Design, develop, test, and deploy Visualforce, Apex, LWC, Java, AJAX, JavaScript, CSS, and other technologies to build customized solutions that support business requirements and drive key business decisions. Contribute to team processes by participating in team activities, such as estimation, collaboration in requirements definition, code reviews, and contribution of feedback during retrospectives. Support all technical aspects of Salesforce.com, including data migrations, data quality, systems integrations, 3rd-party applications, AppExchange products, and custom code. Support the organization's utilization of SFDC to improve processes and productivity, and make recommendations to support scaling at a rapid pace. Support the global Salesforce.com team with additional projects as needed. Maintain Salesforce certifications. QUALIFICATIONS: 5 years prior development experience with Salesforce.com. Salesforce Platform App Builder and Developer I certifications preferred. Bachelor's degree in Computer Science, Software Engineering, or equivalent.
Solid understanding of and detailed experience with Salesforce.com architecture and API Experience writing Visualforce and Apex classes and triggers Working proficiency in HTML, XML, Flex, JavaScript, ASP, SQL, Java or C++, HTTP/REST and SOAP-based web services Experience using Salesforce data tools (Data Loader, Excel Connector, DemandTools, Eclipse Force.com IDE) and other development tools, including Informatica Strong understanding of relational databases Demonstrable success with multiple Salesforce.com integration projects Experience integrating Salesforce.com with other applications via real-time, batch, sync/async Experience with scripted data loader, web services, cloud or on-premise middleware and other enterprise integrating technologies Ability to work independently, as well as part of a team Strong business analysis and functional experience, including requirements gathering, creating/deploying solutions to end users Experience working in an Agile software development environment Strong attention to detail and excellent problem solving skills Strong verbal/written communication and data presentation skills, including an ability to effectively communicate with both business and technical teams Experience with Copado DevOps preferred
Posted 1 week ago
5.0 - 8.0 years
12 - 16 Lacs
Hyderabad
Work from Office
The Enterprise Operations Analytics (EOA) organization supports the data and analytical needs of Cigna's Enterprise Operations. As a center of excellence for best practices in Data Engineering, Web Application Development, Data Science, Advanced Analytics, and Business Intelligence, the team brings together data from customer contact centers, client benefit design/install, provider solutions/network operations, print/digital communications, and beyond in novel ways to provide actionable insights to business partners ranging from front-line Prior Authorization Representatives and Patient Care Advocates to Senior Leadership. By turning data into information, and information into insight, we passionately strive to make the operational services provided by Cigna/Evernorth to customers, clients, and providers the safest, most efficient, most convenient, and most trusted in the world. The Data Engineering Lead Analyst will play a crucial role in transforming raw data into actionable insights that drive business decisions. To be successful in this role, you must understand ETL processes and be experienced in supporting production-ready applications. You must also have excellent communication skills and be able to build and maintain relationships with both internal and external stakeholders. You will be responsible for partnering with key players in Customer Service, Client Service, Provider Service, and Workforce Planning, while drawing on support from Technology, Finance, Strategy, Operational Readiness, and Solutions Delivery. Roles & Responsibilities: Design and implement ETL processes to extract, transform, and load data from various sources. Collaborate with cross-functional teams to understand data requirements and deliver solutions. Optimize and maintain existing ETL workflows for improved efficiency and performance. Ensure data integrity and quality through rigorous testing and validation. Provide support on CI/CD pipelines and automation.
Create and implement best practices to ensure successful delivery and support of production-ready applications. Qualifications - Required Experience & Education: 5-8 years of relevant production support and project experience. Hands-on experience with ETL tools such as Informatica and SSIS, along with proficiency in scripting languages like Python. Experience with relational database development (Oracle, Teradata, SQL Server, PostgreSQL, or equivalent) required. Strong understanding of database concepts and experience with query optimization and performance tuning. Experience with scheduling tools such as CA Workload Automation. Excellent problem-solving skills and the ability to analyze complex issues and implement effective solutions. Required Skills: Informatica, SSIS, SQL (SQL Server, Oracle, Teradata), job scheduling (Airflow, DAG, Control-M, etc.), performance tuning, issue resolution, Python, Databricks, DevOps, basic cloud experience (AWS/Azure). Preferred Skills, Experience & Education: Cloud certification (AWS Cloud Practitioner or similar).
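The extract-transform-load cycle this role centers on can be sketched end to end with the standard library alone. The contact-center source data, field names, and quality check below are hypothetical stand-ins; production pipelines would use Informatica/SSIS or orchestrated Python jobs against a real warehouse:

```python
import csv
import io
import sqlite3

# Extract: a hypothetical contact-center export (here an in-memory CSV).
raw = io.StringIO("call_id,duration_sec,resolved\n1,240,Y\n2,95,N\n3,601,Y\n")
rows = list(csv.DictReader(raw))

# Transform: cast types and derive a boolean flag analysts will query on.
def transform(row):
    return (int(row["call_id"]), int(row["duration_sec"]), row["resolved"] == "Y")

# Load: idempotent load into a warehouse table (sqlite3 standing in).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE calls (call_id INTEGER PRIMARY KEY, duration_sec INTEGER, resolved INTEGER)")
conn.executemany("INSERT OR REPLACE INTO calls VALUES (?, ?, ?)",
                 [transform(r) for r in rows])

# Validate: row count and a simple quality check before publishing downstream.
loaded = conn.execute("SELECT COUNT(*), SUM(resolved) FROM calls").fetchone()
print(loaded)  # (3, 2): three calls loaded, two resolved
```

The `INSERT OR REPLACE` keeps the load idempotent, so a rerun after a failed batch does not duplicate rows, which is the property production-support work depends on.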
Posted 1 week ago
3.0 - 8.0 years
5 - 10 Lacs
Pune
Remote
Healthcare experience is mandatory. Position Overview: We are seeking an experienced Data Modeler/Lead with deep expertise in health plan data models and enterprise data warehousing to drive our healthcare analytics and reporting initiatives. The candidate should have hands-on experience with modern data platforms and a strong understanding of healthcare industry data standards. Key Responsibilities: Data Architecture & Modeling: - Design and implement comprehensive data models for health plan operations, including member enrollment, claims processing, provider networks, and medical management - Develop logical and physical data models that support analytical and regulatory reporting requirements (HEDIS, Stars, MLR, risk adjustment) - Create and maintain data lineage documentation and data dictionaries for healthcare datasets - Establish data modeling standards and best practices across the organization Technical Leadership: - Lead data warehousing initiatives using modern platforms like Databricks or traditional ETL tools like Informatica - Architect scalable data solutions that handle large volumes of healthcare transactional data - Collaborate with data engineers to optimize data pipelines and ensure data quality Healthcare Domain Expertise: - Apply deep knowledge of health plan operations, medical coding (ICD-10, CPT, HCPCS), and healthcare data standards (HL7, FHIR, X12 EDI) - Design data models that support analytical, reporting, and AI/ML needs - Ensure compliance with healthcare regulations including HIPAA/PHI and state insurance regulations - Partner with business stakeholders to translate healthcare business requirements into technical data solutions Data Governance & Quality: - Implement data governance frameworks specific to healthcare data privacy and security requirements - Establish data quality monitoring and validation processes for critical health plan metrics - Lead efforts to standardize healthcare data definitions across multiple systems
and data sources Required Qualifications : Technical Skills : - 10+ years of experience in data modeling with at least 4 years focused on healthcare/health plan data - Expert-level proficiency in dimensional modeling, data vault methodology, or other enterprise data modeling approaches - Hands-on experience with Informatica PowerCenter/IICS or Databricks platform for large-scale data processing - Strong SQL skills and experience with Oracle Exadata and cloud data warehouses (Databricks) - Proficiency with data modeling tools (Hackolade, ERwin, or similar) Healthcare Industry Knowledge : - Deep understanding of health plan data structures including claims, eligibility, provider data, and pharmacy data - Experience with healthcare data standards and medical coding systems - Knowledge of regulatory reporting requirements (HEDIS, Medicare Stars, MLR reporting, risk adjustment) - Familiarity with healthcare interoperability standards (HL7 FHIR, X12 EDI) Leadership & Communication : - Proven track record of leading data modeling projects in complex healthcare environments - Strong analytical and problem-solving skills with ability to work with ambiguous requirements - Excellent communication skills with ability to explain technical concepts to business stakeholders - Experience mentoring team members and establishing technical standards Preferred Qualifications : - Experience with Medicare Advantage, Medicaid, or Commercial health plan operations - Cloud platform certifications (AWS, Azure, or GCP) - Experience with real-time data streaming and modern data lake architectures - Knowledge of machine learning applications in healthcare analytics - Previous experience in a lead or architect role within healthcare organization
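The dimensional modeling this role calls for typically means a star schema: dimension tables holding descriptive attributes, and a fact table holding measures plus foreign keys. A toy health-plan star schema (table and column names are illustrative, with sqlite3 standing in for Oracle Exadata or Databricks):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimensions carry descriptive attributes; the fact table carries measures
# plus surrogate keys into each dimension (a minimal star schema).
cur.executescript("""
CREATE TABLE dim_member  (member_sk INTEGER PRIMARY KEY, member_name TEXT, plan_type TEXT);
CREATE TABLE dim_provider(provider_sk INTEGER PRIMARY KEY, provider_name TEXT, specialty TEXT);
CREATE TABLE fact_claim  (claim_id INTEGER PRIMARY KEY,
                          member_sk INTEGER REFERENCES dim_member(member_sk),
                          provider_sk INTEGER REFERENCES dim_provider(provider_sk),
                          paid_amount REAL);
""")
cur.execute("INSERT INTO dim_member VALUES (1, 'A. Rao', 'Medicare Advantage')")
cur.execute("INSERT INTO dim_provider VALUES (1, 'City Clinic', 'Cardiology')")
cur.executemany("INSERT INTO fact_claim VALUES (?, 1, 1, ?)", [(10, 120.0), (11, 80.0)])

# Typical analytical query: aggregate the fact, slice by a dimension attribute.
total = cur.execute("""
    SELECT m.plan_type, SUM(f.paid_amount)
    FROM fact_claim f
    JOIN dim_member m ON m.member_sk = f.member_sk
    GROUP BY m.plan_type
""").fetchone()
print(total)  # ('Medicare Advantage', 200.0)
```

Regulatory reports like MLR or HEDIS roll-ups are, structurally, variations of that final query: aggregate claim facts grouped by member and plan dimensions.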
Posted 1 week ago
5.0 - 8.0 years
8 - 12 Lacs
Bengaluru
Work from Office
Honeywell Connected Enterprise (HCE) is the software division of Honeywell with a strategic focus on digitization, sustainability, and OT Cybersecurity SaaS offerings and solutions. HCE was established to leverage Honeywell's domain expertise and lead the transition into a cutting-edge industrial software company. Since our inception in 2018, HCE established the category of intelligent operations and built a new platform born out of decades of operational data and insights, uniting real-time data across assets, people, and processes into a system of record for a 360-degree view. This is our flagship offering - Honeywell Forge. We are a global team of thousands of innovators with expertise spanning industrial operations, software engineering, data science, artificial intelligence, and process engineering. We are paving the way for our customers to grow responsibly. We believe the future is what we make it. As a Honeywell Futureshaper, you are a part of something bigger. You can work with highly capable people to make the world a better place and become the best you. After all, we are not imagining the future; we're building it. 4 years of experience in building advanced analytics solutions with data from enterprise systems like ERPs (SAP, Oracle, etc.), RDBMS, CRMs, and marketing & logistics tools. 4 years of hands-on experience with Spark, Pig/Hive, etc., and automation of data flows using Informatica BDM/DEI, NiFi, and/or Airflow/Oozie. As a Lead Data Engineer, you will be part of a team that delivers contemporary analytics solutions for all Honeywell business groups and functions.
You will build strong relationships with leadership to effectively deliver contemporary data analytics solutions & contribute directly to business success. You will develop solutions on various Database systems viz. Databricks, Snowflake, Hive, Hadoop, PostgreSQL, etc. You will identify and implement process improvements - and you don t like to the same thing twice so you will automate it if you can. You are always keeping an eye on scalability, optimization, and process. You have worked with Big Data before, IoT data, SAP, SQL, Azure, AWS and a bunch of other acronyms. You will work on a team including scrum masters, product owners, data architects, data engineers/designers, data scientists and DevOps. You and your team collaborate to build products from the idea phase through launch and beyond. The software you write makes it to production in couple of sprints. Your team will be working on creating a new platform using your experience of APIs, microservices, and platform development. You Must Have Bachelors or Master s degree in Computer Science, Engineering, Applied Mathematics or related field 12+ years of data engineering, data design and/or enterprise data management & analytics experience Should be able to architect large enterprise analytics projects with optimal solutions on the Bigdata platform. Should have designed, developed and deployed complex big data ingestion jobs with contemporary data ingestion tools like Spark, Informatica Power Center & BDM on Databricks/Hadoop/NoSQL/MPP platforms. Experience with dimensional modeling, data warehousing and data miningA Job posting does not exist for this global job code, please work with your HRG to develop one As a Lead Data Engineer, you will be part of a team that delivers contemporary analytics solutions for all Honeywell business groups and functions. You will build strong relationships with leadership to effectively deliver contemporary data analytics solutions & contribute directly to business success. 
Posted 1 week ago
7.0 - 12.0 years
22 - 27 Lacs
Hyderabad
Work from Office
We are seeking a highly skilled Lead Data Engineer/Associate Architect to lead the design, implementation, and optimization of scalable data architectures. The ideal candidate will have a deep understanding of data modeling, ETL processes, cloud data solutions, and big data technologies. You will work closely with cross-functional teams to build robust, high-performance data pipelines and infrastructure to enable data-driven decision-making. Experience: 7 - 12 years Work Location: Hyderabad (Hybrid) / Remote Mandatory skills: Python, SQL, Snowflake Responsibilities: Architecture: Design and develop scalable and resilient data architectures that support business needs, analytics, and AI/ML workloads. Data Pipeline Development: Design and implement robust ETL/ELT processes to ensure efficient data ingestion, transformation, and storage. Big Data & Cloud Solutions: Architect data solutions using cloud platforms like AWS, Azure, or GCP, leveraging services such as Snowflake, Redshift, BigQuery, and Databricks. Database Optimization: Ensure performance tuning, indexing strategies, and query optimization for relational and NoSQL databases. Data Governance & Security: Implement best practices for data quality, metadata management, compliance (GDPR, CCPA), and security. Collaboration & Leadership: Work closely with data engineers, analysts, and business stakeholders to translate business requirements into scalable solutions. Technology Evaluation: Stay updated with emerging trends, assess new tools and frameworks, and drive innovation in data engineering. Required Skills: Education: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field. Experience: 7 - 12+ years of experience in data engineering. Cloud Platforms: Strong expertise in AWS/Azure data services. Databases: Hands-on experience with SQL, NoSQL, and columnar databases such as PostgreSQL, MongoDB, Cassandra, and Snowflake. 
Programming: Proficiency in Python, Scala, or Java for data processing and automation. ETL Tools: Experience with tools like Apache Airflow, Talend, DBT, or Informatica. Machine Learning & AI Integration (Preferred): Understanding of how to architect data solutions for AI/ML applications
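The ETL/ELT responsibilities this posting describes - ingest raw records, apply light transformations, and load them into a warehouse table for analysis - can be illustrated with a minimal, hedged sketch in Python. Here the standard-library sqlite3 module stands in for a cloud warehouse such as Snowflake, and every table and column name is invented for illustration:

```python
import sqlite3

# Illustrative raw extract: messy types and inconsistent casing, as from a source system.
raw_orders = [
    {"order_id": 1, "amount": "120.50", "region": " west "},
    {"order_id": 2, "amount": "80.00", "region": "EAST"},
]

def transform(row):
    # Typical light transformations: cast strings to numbers, normalize text values.
    return (row["order_id"], float(row["amount"]), row["region"].strip().lower())

# "Load" step: sqlite3 stands in for the target warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, amount REAL, region TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", [transform(r) for r in raw_orders])

total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(total)  # 200.5
```

In a production pipeline the same extract-transform-load shape would typically be orchestrated by a scheduler such as Airflow, with the transform step pushed down into the warehouse for ELT.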
Posted 1 week ago
1.0 - 4.0 years
3 - 6 Lacs
Mumbai
Work from Office
Developer Role and Responsibilities Your specific duties will be based on your experience as a UiPath developer. In this role, you will be responsible for designing and delivering UiPath solutions in accordance with WonderBotz standards and best practices. You will work closely together with our enthusiastic team of both business and technical specialists. You will be part of a fast-growing and successful team that helps our clients get the maximum benefit. Expected Activities: Support development of UiPath strategies, including assessing opportunities. Under the supervision of more experienced developers, define, design, and develop automation on UiPath platforms for clients, including POCs, pilots, and production automation. More senior developers will be expected to work independently. Participate in workshops and interviews with business process SMEs to gather and confirm business process details and document process definitions. More senior developers will lead these workshops and interviews. Participate in design and configuration sessions and apply feedback to improve and enhance work products. More senior developers will lead these sessions. Work alongside newly trained developers to guide and mentor them. Qualifications and Skills: Have mastered or have a strong desire to master a leading RPA tool (UiPath a must; Blue Prism, Automation Anywhere), including advanced RPA vendor certification. At least one year of hands-on experience with at least one of the following programming languages or technologies (e.g. .NET, Java, VB, C#/C, HTML/CSS, Python, Web Services, mainframe, web applications, SQL, data integration tools, technical automation tools). More senior developers should have a minimum of 2 to 4 years of this hands-on experience. Reasonable proficiency in reading Microsoft Office Visio or another equivalent process flow-charting tool or workflow-based logic. Extra - Any prior work or academic experience with Document management and processing tools (e.g. 
Kofax, ABBYY, Data Cap), Data integration tools (e.g. Informatica, Microsoft SSIS), Technical automation tools (e.g. shell scripting, PHP), or Business process management tools (e.g. Pega). Desired characteristics in candidates Effective communication skills for technical and non-technical audiences Analytical and proven problem-solving skills High Emotional IQ Embraces challenges Team-orientation rather than an individual contributor Compensation and start dates Hiring now for immediate start Salary: Competitive base and bonus determined by level and experience Benefits: Healthcare, relocation, vacation, holidays Training: WonderBotz provides training, depending upon experience level, with the expectation that candidates will pass the vendor developer certification exam by end of their training period US professional services hubs: Princeton-NJ, Las Vegas-NV, Boston-MA, and additional major cities India RPA Factory: various metro cities WonderBotz is an Equal Employment Opportunity employer.
Posted 1 week ago
7.0 - 10.0 years
30 - 35 Lacs
Bengaluru
Work from Office
Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Manager Job Description & Summary At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage. Why PwC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. Responsibilities: Experience in Data Warehouse, Solution Design and Data Analytics. 
Experience in data modelling exercises like dimensional modelling and data vault modelling. Understand, interpret, and clarify functional requirements as well as technical requirements. Should understand the overall system landscape, including upstream and downstream systems. Should be able to understand ETL tech specifications and develop code efficiently. Ability to demonstrate Informatica Cloud features/functions to achieve the best results. Hands-on experience in performance tuning and pushdown optimization in IICS. Provide mentorship on debugging and problem-solving. Review and optimize ETL tech specifications and code developed by the team. Ensure alignment with the overall system architecture and data flow. Mandatory skill sets: Data Modelling, IICS/any leading ETL tool, SQL Preferred skill sets: Python Years of experience required: 7 - 10 yrs Education qualification: B.Tech/MBA/MCA Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Bachelor of Engineering, Master of Business Administration Degrees/Field of Study preferred: Certifications (if blank, certifications not specified) Required Skills ETL Tools Optional Skills Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Coaching and Feedback, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion {+ 21 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Available for Work Visa Sponsorship Government Clearance Required Job Posting End Date
Posted 1 week ago
6.0 - 10.0 years
10 - 15 Lacs
Mumbai, Hyderabad, Pune
Work from Office
Staff Technical Product Manager Are you excited about the opportunity to lead a team within an industry leader in Energy Technology? Are you passionate about improving capabilities, efficiency, and performance? Join our Digital Technology Team! As a Staff Technical Product Manager, this position will operate in lock-step with product management to create a clear strategic direction for build needs, and convey that vision to the service's scrum team. You will direct the team with a clear and descriptive set of requirements captured as stories, and partner with the team to determine what can be delivered through balancing the need for new features, defects, and technical debt. Partner with the best As a Staff Technical Product Manager, we are seeking a candidate with a strong background in business analysis, team leadership, data architecture, and hands-on development skills. The ideal candidate will excel in creating roadmaps, planning with prioritization, resource allocation, key item delivery, and seamless integration of perspectives from various stakeholders, including Product Managers, Technical Anchors, Service Owners, and Developers. A results-oriented leader, capable of building and executing an aligned strategy, leading the data team and cross-functional teams to meet deliverable timelines. As a Staff Technical Product Manager, you will be responsible for: Demonstrating wide and deep knowledge in data engineering, data architecture, and data science, with the ability to guide, lead, and work with the team to drive to the right solution. Engaging frequently (80%) with the development team; facilitating discussions, providing clarification, story acceptance and refinement, testing and validation; contributing to design activities and decisions; familiarity with waterfall and Agile scrum frameworks. Owning and managing the backlog; continuously ordering and prioritizing to ensure that 1-2 sprints/iterations of backlog are always ready. 
Collaborating with UX in design decisions, demonstrating deep understanding of the technology stack and its impact on the final product. Conducting customer and stakeholder interviews and elaborating on personas. Demonstrating expert-level skill in problem decomposition and the ability to navigate through ambiguity. Partnering with the Service Owner to ensure a healthy development process and clear tracking metrics to form a standard and trustworthy way of providing customer support. Designing and implementing scalable and robust data pipelines to collect, process, and store data from various sources. Developing and maintaining data warehouse and ETL (Extract, Transform, Load) processes for data integration and transformation. Optimizing and tuning the performance of data systems to ensure efficient data processing and analysis. Collaborating with product managers and analysts to understand data requirements and implement solutions for data modeling and analysis. Identifying and resolving data quality issues, ensuring data accuracy, consistency, and completeness. Implementing and maintaining data governance and security measures to protect sensitive data. Monitoring and troubleshooting data infrastructure, performing root cause analysis, and implementing necessary fixes. Fuel your passion: To be successful in this role you will require: A Bachelor's or higher degree in Computer Science, Information Systems, or a related field. A minimum of 6-10 years of proven experience as a Data Engineer or similar role, working with large-scale data processing and storage systems. Proficiency in SQL and database management systems (e.g., MySQL, PostgreSQL, or Oracle). Extensive knowledge of working with SAP systems, Tcodes, data pipelines in SAP, and Databricks-related technologies. Experience building complex jobs for SCD type mappings using ETL tools like PySpark, Talend, Informatica, etc. Experience with data visualization and reporting tools (e.g., Tableau, Power BI). 
Strong problem-solving and analytical skills, with the ability to handle complex data challenges. Excellent communication and collaboration skills to work effectively in a team environment. Experience in data modeling, data warehousing, and ETL principles. Familiarity with cloud platforms like AWS, Azure, or GCP, and their data services (e.g., S3, Redshift, BigQuery). Advanced knowledge of distributed computing and parallel processing. Experience with real-time data processing and streaming technologies (e.g., Apache Kafka, Apache Flink). Knowledge of containerization and orchestration technologies (e.g., Docker, Kubernetes). Certification in relevant technologies or data engineering disciplines. Working knowledge of Databricks, Dremio, and SAP is highly preferred. Work in a way that works for you We recognize that everyone is different and that the way in which people want to work and deliver at their best is different for everyone too. In this role, we can offer the following flexible working patterns (where applicable): Working flexible hours - flexing the times when you work in the day to help you fit everything in and work when you are the most productive. Working with us Our people are at the heart of what we do at Baker Hughes. We know we are better when all our people are developed, engaged, and able to bring their whole authentic selves to work. We invest in the health and well-being of our workforce, train and reward talent, and develop leaders at all levels to bring out the best in each other. Working for you Our inventions have revolutionized energy for over a century. But to keep going forward tomorrow, we know we must push the boundaries today. We prioritize rewarding those who embrace challenge with a package that reflects how much we value their input. Join us, and you can expect: Contemporary work-life balance policies and wellbeing activities. 
About Us With operations in over 120 countries, we provide better solutions for our customers and richer opportunities for our people. As a leading partner to the energy industry, we're committed to achieving net-zero carbon emissions by 2050 and we're always looking for the right people to help us get there. People who are as passionate as we are about making energy safer, cleaner, and more efficient. Join Us Are you seeking an opportunity to make a real difference in a company with a global reach and exciting services and clients? Come join us and grow with a team of people who will challenge and inspire you!
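The "SCD type mappings" called out in the qualifications refer to slowly changing dimensions. The most common variant, SCD Type 2, expires the current dimension row on a change and appends a new versioned row. A minimal, hedged sketch of that logic in plain Python (in practice this would be a PySpark, Talend, or Informatica job, and every field name here is invented for illustration):

```python
from datetime import date

# Existing dimension rows; is_current flags the active version of each business key.
dim_customer = [
    {"customer_id": 1, "city": "Pune", "valid_from": date(2023, 1, 1),
     "valid_to": None, "is_current": True},
]

def apply_scd2(dim, incoming, today):
    """SCD Type 2: expire the current row on a changed attribute, append a new version."""
    for row in dim:
        if (row["customer_id"] == incoming["customer_id"] and row["is_current"]
                and row["city"] != incoming["city"]):
            # Close out the old version, then insert the new one.
            row["valid_to"], row["is_current"] = today, False
            dim.append({"customer_id": incoming["customer_id"],
                        "city": incoming["city"], "valid_from": today,
                        "valid_to": None, "is_current": True})
            break
    return dim

apply_scd2(dim_customer, {"customer_id": 1, "city": "Mumbai"}, date(2024, 6, 1))
print(len(dim_customer))  # 2: the expired Pune row plus the current Mumbai row
```

On a lakehouse platform the same pattern is usually expressed as a single MERGE statement over the dimension table rather than row-by-row Python.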
Posted 1 week ago
6.0 - 10.0 years
22 - 27 Lacs
Mumbai, Hyderabad, Pune
Work from Office
Staff Technical Product Manager Are you excited about the opportunity to lead a team within an industry leader in Energy Technology? Are you passionate about improving capabilities, efficiency, and performance? Join our Digital Technology Team! As a Staff Technical Product Manager, this position will operate in lock-step with product management to create a clear strategic direction for build needs, and convey that vision to the service's scrum team. You will direct the team with a clear and descriptive set of requirements captured as stories, and partner with the team to determine what can be delivered through balancing the need for new features, defects, and technical debt. Partner with the best As a Staff Technical Product Manager, we are seeking a candidate with a strong background in business analysis, team leadership, data architecture, and hands-on development skills. The ideal candidate will excel in creating roadmaps, planning with prioritization, resource allocation, key item delivery, and seamless integration of perspectives from various stakeholders, including Product Managers, Technical Anchors, Service Owners, and Developers. A results-oriented leader, capable of building and executing an aligned strategy, leading the data team and cross-functional teams to meet deliverable timelines. As a Staff Technical Product Manager, you will be responsible for: Demonstrating wide and deep knowledge in data engineering, data architecture, and data science, with the ability to guide, lead, and work with the team to drive to the right solution. Engaging frequently (80%) with the development team; facilitating discussions, providing clarification, story acceptance and refinement, testing and validation; contributing to design activities and decisions; familiarity with waterfall and Agile scrum frameworks. Owning and managing the backlog; continuously ordering and prioritizing to ensure that 1-2 sprints/iterations of backlog are always ready. 
Collaborating with UX in design decisions, demonstrating deep understanding of the technology stack and its impact on the final product. Conducting customer and stakeholder interviews and elaborating on personas. Demonstrating expert-level skill in problem decomposition and the ability to navigate through ambiguity. Partnering with the Service Owner to ensure a healthy development process and clear tracking metrics to form a standard and trustworthy way of providing customer support. Designing and implementing scalable and robust data pipelines to collect, process, and store data from various sources. Developing and maintaining data warehouse and ETL (Extract, Transform, Load) processes for data integration and transformation. Optimizing and tuning the performance of data systems to ensure efficient data processing and analysis. Collaborating with product managers and analysts to understand data requirements and implement solutions for data modeling and analysis. Identifying and resolving data quality issues, ensuring data accuracy, consistency, and completeness. Implementing and maintaining data governance and security measures to protect sensitive data. Monitoring and troubleshooting data infrastructure, performing root cause analysis, and implementing necessary fixes. Fuel your passion: To be successful in this role you will require: A Bachelor's or higher degree in Computer Science, Information Systems, or a related field. A minimum of 6-10 years of proven experience as a Data Engineer or similar role, working with large-scale data processing and storage systems. Proficiency in SQL and database management systems (e.g., MySQL, PostgreSQL, or Oracle). Extensive knowledge of working with SAP systems, Tcodes, data pipelines in SAP, and Databricks-related technologies. Experience building complex jobs for SCD type mappings using ETL tools like PySpark, Talend, Informatica, etc. Experience with data visualization and reporting tools (e.g., Tableau, Power BI). 
Strong problem-solving and analytical skills, with the ability to handle complex data challenges. Excellent communication and collaboration skills to work effectively in a team environment. Experience in data modeling, data warehousing, and ETL principles. Familiarity with cloud platforms like AWS, Azure, or GCP, and their data services (e.g., S3, Redshift, BigQuery). Advanced knowledge of distributed computing and parallel processing. Experience with real-time data processing and streaming technologies (e.g., Apache Kafka, Apache Flink). Knowledge of containerization and orchestration technologies (e.g., Docker, Kubernetes). Certification in relevant technologies or data engineering disciplines. Working knowledge of Databricks, Dremio, and SAP is highly preferred. Work in a way that works for you We recognize that everyone is different and that the way in which people want to work and deliver at their best is different for everyone too. In this role, we can offer the following flexible working patterns (where applicable): Working flexible hours - flexing the times when you work in the day to help you fit everything in and work when you are the most productive. Working with us Our people are at the heart of what we do at Baker Hughes. We know we are better when all our people are developed, engaged, and able to bring their whole authentic selves to work. We invest in the health and well-being of our workforce, train and reward talent, and develop leaders at all levels to bring out the best in each other. Working for you Our inventions have revolutionized energy for over a century. But to keep going forward tomorrow, we know we must push the boundaries today. We prioritize rewarding those who embrace challenge with a package that reflects how much we value their input. Join us, and you can expect: Contemporary work-life balance policies and wellbeing activities. 
About Us With operations in over 120 countries, we provide better solutions for our customers and richer opportunities for our people. As a leading partner to the energy industry, we're committed to achieving net-zero carbon emissions by 2050 and we're always looking for the right people to help us get there. People who are as passionate as we are about making energy safer, cleaner, and more efficient. Join Us Are you seeking an opportunity to make a real difference in a company with a global reach and exciting services and clients? Come join us and grow with a team of people who will challenge and inspire you!
Posted 1 week ago
3.0 - 8.0 years
9 - 14 Lacs
Ahmedabad
Remote
Healthcare experience is Mandatory Position Overview : We are seeking an experienced Data Modeler/Lead with deep expertise in health plan data models and enterprise data warehousing to drive our healthcare analytics and reporting initiatives. The candidate should have hands-on experience with modern data platforms and a strong understanding of healthcare industry data standards. Key Responsibilities : Data Architecture & Modeling : - Design and implement comprehensive data models for health plan operations, including member enrollment, claims processing, provider networks, and medical management - Develop logical and physical data models that support analytical and regulatory reporting requirements (HEDIS, Stars, MLR, risk adjustment) - Create and maintain data lineage documentation and data dictionaries for healthcare datasets - Establish data modeling standards and best practices across the organization Technical Leadership : - Lead data warehousing initiatives using modern platforms like Databricks or traditional ETL tools like Informatica - Architect scalable data solutions that handle large volumes of healthcare transactional data - Collaborate with data engineers to optimize data pipelines and ensure data quality Healthcare Domain Expertise : - Apply deep knowledge of health plan operations, medical coding (ICD-10, CPT, HCPCS), and healthcare data standards (HL7, FHIR, X12 EDI) - Design data models that support analytical, reporting, and AI/ML needs - Ensure compliance with healthcare regulations including HIPAA/PHI and state insurance regulations - Partner with business stakeholders to translate healthcare business requirements into technical data solutions Data Governance & Quality : - Implement data governance frameworks specific to healthcare data privacy and security requirements - Establish data quality monitoring and validation processes for critical health plan metrics - Lead efforts to standardize healthcare data definitions across multiple systems 
and data sources Required Qualifications : Technical Skills : - 10+ years of experience in data modeling with at least 4 years focused on healthcare/health plan data - Expert-level proficiency in dimensional modeling, data vault methodology, or other enterprise data modeling approaches - Hands-on experience with Informatica PowerCenter/IICS or the Databricks platform for large-scale data processing - Strong SQL skills and experience with Oracle Exadata and cloud data warehouses (Databricks) - Proficiency with data modeling tools (Hackolade, ERwin, or similar) Healthcare Industry Knowledge : - Deep understanding of health plan data structures including claims, eligibility, provider data, and pharmacy data - Experience with healthcare data standards and medical coding systems - Knowledge of regulatory reporting requirements (HEDIS, Medicare Stars, MLR reporting, risk adjustment) - Familiarity with healthcare interoperability standards (HL7 FHIR, X12 EDI) Leadership & Communication : - Proven track record of leading data modeling projects in complex healthcare environments - Strong analytical and problem-solving skills with the ability to work with ambiguous requirements - Excellent communication skills with the ability to explain technical concepts to business stakeholders - Experience mentoring team members and establishing technical standards Preferred Qualifications : - Experience with Medicare Advantage, Medicaid, or Commercial health plan operations - Cloud platform certifications (AWS, Azure, or GCP) - Experience with real-time data streaming and modern data lake architectures - Knowledge of machine learning applications in healthcare analytics - Previous experience in a lead or architect role within a healthcare organization
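The dimensional modeling at the center of this role can be sketched as a small star schema: a claims fact table at claim-line grain joined to member and provider dimensions. This is an illustrative sketch only - sqlite3 stands in for Oracle Exadata or Databricks, and every table, column, and value is invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Conformed dimensions for a simple health-plan star schema (illustrative names).
    CREATE TABLE dim_member   (member_key   INTEGER PRIMARY KEY, member_id TEXT, plan_type TEXT);
    CREATE TABLE dim_provider (provider_key INTEGER PRIMARY KEY, npi TEXT, specialty TEXT);
    -- Fact grain: one row per claim line, with foreign keys into each dimension.
    CREATE TABLE fact_claim (
        claim_line_key INTEGER PRIMARY KEY,
        member_key     INTEGER REFERENCES dim_member(member_key),
        provider_key   INTEGER REFERENCES dim_provider(provider_key),
        icd10_code     TEXT,
        paid_amount    REAL
    );
""")
conn.execute("INSERT INTO dim_member VALUES (1, 'M100', 'Medicare Advantage')")
conn.execute("INSERT INTO dim_provider VALUES (1, '1234567890', 'Cardiology')")
conn.execute("INSERT INTO fact_claim VALUES (1, 1, 1, 'I10', 250.0)")

# A typical analytical query: paid amount rolled up by plan type.
row = conn.execute("""
    SELECT m.plan_type, SUM(f.paid_amount)
    FROM fact_claim f JOIN dim_member m USING (member_key)
    GROUP BY m.plan_type
""").fetchone()
print(row)  # ('Medicare Advantage', 250.0)
```

A real health-plan model would add date, diagnosis, and procedure dimensions and carry SCD history on the member and provider tables, but the fact-grain and surrogate-key choices shown here are the core of the design work.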
Posted 1 week ago
6.0 - 11.0 years
8 - 13 Lacs
Gurugram, Bengaluru
Work from Office
About the Role: Grade Level (for internal use): 10 S&P Global - Mobility The Role: Senior Business Analyst - Data Engineering The Team We are seeking a Senior Business Analyst for the Data Engineering Team. You will be responsible for bridging the gap between business needs and technical solutions. You will collaborate with stakeholders to gather requirements, analyze data workflows, and ensure the successful delivery of data-driven projects. The Impact In this role, you will have the opportunity to work in an Agile team, ensuring we meet our customer requirements and deliver impactful, quality data. Using your technical skills, you will contribute to data analysis, design and implement complex solutions, and support the business strategy. Responsibilities Collaborate with business stakeholders to identify and document requirements for data engineering projects. Analyze existing data processes and workflows to identify opportunities for improvement and optimization. Work closely with data engineers and data scientists to translate business requirements into technical specifications. Conduct data analysis and data validation to ensure accuracy and consistency of data outputs. Develop and maintain documentation related to data processes, requirements, and project deliverables. Facilitate communication between technical teams and business stakeholders to ensure alignment on project goals and timelines. Participate in project planning and prioritization discussions, providing insights based on business needs. Support user acceptance testing (UAT) and ensure that solutions meet business requirements before deployment. Utilize Jira for project tracking, issue management, and to facilitate Agile project management practices. Stay updated on industry trends and best practices in data engineering and analytics. What We're Looking For Minimum of 6 years of experience as a Business Analyst in a data engineering environment. 
Strong understanding of data engineering concepts, data modeling, and ETL processes. Proficiency in data visualization tools (e.g., Tableau, Power BI) and SQL for data analysis. Experience with Jira for project management and tracking. Excellent analytical and problem-solving skills, with a keen attention to detail. Strong communication and interpersonal skills, with the ability to work collaboratively in a team environment. Experience with Agile methodologies and project management tools is a must. Return to Work Have you taken time out for caring responsibilities and are now looking to return to work? As part of our Return-to-Work initiative, we are encouraging enthusiastic and talented returners to apply, and will actively support your return to the workplace. Statement: S&P Global delivers essential intelligence that powers decision making. We provide the world's leading organizations with the right data, connected technologies and expertise they need to move ahead. As part of our team, you'll help solve complex challenges that equip businesses, governments and individuals with the knowledge to adapt to a changing economic landscape. S&P Global Mobility turns invaluable insights captured from automotive data to help our clients understand today's market, reach more customers, and shape the future of automotive mobility. About S&P Global Mobility At S&P Global Mobility, we provide invaluable insights derived from unmatched automotive data, enabling our customers to anticipate change and make decisions with conviction. Our expertise helps them to optimize their businesses, reach the right consumers, and shape the future of mobility. We open the door to automotive innovation, revealing the buying patterns of today and helping customers plan for the emerging technologies of tomorrow. For more information, visit www.spglobal.com/mobility . What's In It For You Our Purpose: Progress is not a self-starter. 
It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you, and your career, need to thrive at S&P Global. Health & Wellness: Health care coverage designed for the mind and body. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference. 
For more information on benefits by country, visit https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, pre-employment training or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here.

----

Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law.
Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf ---- 20 - Professional (EEO-2 Job Categories-United States of America), PDMGDV202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority Ratings - (Strategic Workforce Planning)
Posted 1 week ago
8.0 - 13.0 years
10 - 15 Lacs
Hyderabad
Work from Office
About the Role: Grade Level (for internal use): 10

Title: Senior ETL and Backend Developer (Salesforce)
Job Location: Hyderabad, Ahmedabad, Gurgaon, Virtual-India

The Team: We are seeking a skilled Senior ETL and Backend Developer with extensive experience in Informatica and Salesforce. The ideal candidate will be responsible for designing, developing, and maintaining ETL processes and backend systems to ensure seamless data integration and management. The team works in a challenging environment that gives ample opportunities to use innovative ideas to solve complex problems. You will have the opportunity every day to work with people from a wide variety of backgrounds and will be able to develop a close team dynamic with coworkers from around the globe.

The Impact: You will make significant contributions to building solutions for web applications using new front-end technologies and microservices. The work you do will deliver products that build solutions for S&P Global Commodity Insights customers.

Responsibilities:
ETL Development: Design, develop, and maintain ETL processes using Informatica PowerCenter and other ETL tools.
Data Integration: Integrate data from various sources, including databases, APIs, flat files, and cloud storage, into data warehouses or data lakes.
Backend Development: Develop and maintain backend systems using relevant programming languages and frameworks.
Salesforce Integration: Implement and manage data integration between Salesforce and other systems.
Performance Tuning: Optimize ETL processes and backend systems for speed and efficiency.
Data Quality: Ensure data quality and integrity through rigorous testing and validation.
Monitoring and Maintenance: Continuously monitor ETL processes and backend systems for errors or performance issues and make necessary adjustments.
Collaboration: Work closely with data architects, data analysts, and business stakeholders to understand data requirements and deliver solutions.

Qualifications

Basic Qualifications:
Bachelor's/Master's degree in Computer Science, Information Systems, or equivalent.
A minimum of 8+ years of experience in software engineering and architecture.
A minimum of 5+ years of experience in ETL development, backend development, and data integration.
A minimum of 3+ years of Salesforce development, administration, and integration.
Proficiency in Informatica PowerCenter and other ETL tools.
Strong knowledge of SQL and database management systems (e.g., Oracle, SQL Server).
Experience with Salesforce integration and administration.
Proficiency in backend development languages (e.g., Java, Python, C#).
Familiarity with cloud platforms (e.g., AWS, Azure) is a plus.
Excellent problem-solving skills and attention to detail.
Ability to work independently and as part of a team.
Nice to have: GenAI, Java, Spring Boot, Knockout JS, RequireJS, Node.js, Lodash, TypeScript, VSTest/MSTest/NUnit.

Preferred Qualifications:
Proficient with software development lifecycle (SDLC) methodologies like SAFe, Agile, and test-driven development.
Experience with other ETL tools and data integration platforms.
Informatica Certified Professional; Salesforce Certified Administrator or Developer.
Knowledge of back-end technologies such as C#/.NET, Java, or Python.
Excellent problem solving, analytical and technical troubleshooting skills.
Able to work well individually and with a team.
Good work ethic, self-starter, and results oriented.
Excellent communication skills are essential, with strong verbal and writing proficiencies.

About S&P Global Commodity Insights: At S&P Global Commodity Insights, our complete view of global energy and commodities markets enables our customers to make decisions with conviction and create long-term, sustainable value.
We're a trusted connector that brings together thought leaders, market participants, governments, and regulators to co-create solutions that lead to progress. Vital to navigating the Energy Transition, S&P Global Commodity Insights' coverage includes oil and gas, power, chemicals, metals, agriculture and shipping. S&P Global Commodity Insights is a division of S&P Global (NYSE: SPGI). S&P Global is the world's foremost provider of credit ratings, benchmarks, analytics and workflow solutions in the global capital, commodity and automotive markets. With every one of our offerings, we help many of the world's leading organizations navigate the economic landscape so they can plan for tomorrow, today. For more information, visit http://www.spglobal.com/commodity-insights.
Posted 1 week ago
6.0 - 9.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Hi all, this is an exciting career opportunity for an Informatica Developer position. The job description calls for experience with Informatica, IICS, and any cloud technologies.
Experience: 6 to 9 years
Location: Pune, Bangalore, Hyderabad, Chennai
Notice period: Immediate, or a maximum of 10 days
If you are interested, please share your updated profile to jeyaramya.rajendran@zensar.com
Posted 1 week ago
8.0 - 13.0 years
5 - 15 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Job Title: Informatica/Stibo MDM Developer

Responsibilities: A day in the life of an Infoscion: As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution, and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture, and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management, and adherence to organizational guidelines and processes. You would be a key contributor to building efficient programs and systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Technical and Professional Requirements:
Technology -> Data Management - MDM -> Informatica MDM
Technology -> Data Management - MDM -> Stibo MDM

Preferred Skills:
Technology -> Data Management - MDM -> Informatica MDM
Technology -> Data Management - MDM -> Stibo MDM

Additional Responsibilities:
Knowledge of more than one technology
Basics of architecture and design fundamentals
Knowledge of testing tools
Knowledge of agile methodologies
Understanding of project life cycle activities on development and maintenance projects
Understanding of one or more estimation methodologies; knowledge of quality processes
Basics of the business domain to understand the business requirements
Analytical abilities, strong technical skills, good communication skills
Good understanding of the technology and domain
Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles, and modelling methods
Awareness of latest technologies and trends
Excellent problem solving, analytical, and debugging skills

Educational Requirements: Bachelor of Engineering
Service Line: Data & Analytics Unit
Location: PAN India
Experience: 5+ years
Posted 1 week ago
8.0 - 13.0 years
14 - 18 Lacs
Hyderabad
Work from Office
Overview: Customer Data Stewardship Sr Analyst (IBP)

Job Overview / PepsiCo Data Governance Program Overview: PepsiCo is establishing a Data Governance program that will be the custodian of the processes, policies, rules, and standards by which the Company will define its most critical data. Enabling this program will:
- Define ownership and accountability of our critical data assets to ensure they are effectively managed and maintain integrity throughout PepsiCo's systems
- Leverage data as a strategic enterprise asset enabling data-based decision analytics
- Improve productivity and efficiency of daily business operations

Position Overview: The Customer Data Steward IBP role is responsible for working within the global data governance team and with their local businesses to maintain alignment to the Enterprise Data Governance's (EDG) processes, rules, and standards set to ensure data is fit for purpose.

Responsibilities / Primary Accountabilities:
- Deliver key elements of Data Discovery, Source Identification, Data Quality Management, and cataloging for program and Customer Domain data.
- Ensure data accuracy and adherence to PepsiCo-defined global governance practices, as well as driving acceptance of PepsiCo's enterprise data standards and policies across the various business segments.
- Maintain and advise relevant stakeholders on data governance-related matters in the customer domain and with respect to Demand Planning in IBP, with a focus on the business use of the data.
- Define Data Quality Rules from source systems, within the Enterprise Data Foundation, and through to the end-user systems to enable end-to-end Data Quality management and deliver a seamless user experience.
- Advise on various projects and initiatives to ensure that any data-related changes and dependencies are identified, communicated, and managed to ensure adherence to the established Enterprise Data Governance standards.
- Accountable for ensuring that data-centric activities are aligned with the EDG program and leverage applicable data standards, governance processes, and overall best practices.

Data Governance Business Standards:
- Ensures alignment of the data governance processes and standards with applicable enterprise, business segment, and local data support models.
- Champions the single set of Enterprise-level data standards and the repository of key elements pertaining to their in-scope data domain (e.g., Customer, Material, Vendor, Finance, Consumer), promoting their use throughout the PepsiCo organization.

Data Domain Coordination and Collaboration:
- Responsible for helping identify the need for sector-level data standards (and above) based on strategic business objectives and the evolution of enterprise-level capabilities and analytical requirements.
- Collaborates across the organization to ensure consistent and effective execution of data governance and management principles across PepsiCo's enterprise and analytical systems and data domains.
- Accountable for driving organizational acceptance of EDG-established data standards, policies, definitions, and process standards for critical and related enterprise data.
- Promotes and champions PepsiCo's Enterprise Data Governance Capability and data management program across the organization.

Qualifications: 8+ years of experience working in Customer Operations, Demand Planning, Order to Cash, Commercial Data Governance, or Data Management within a global CPG organization.
Posted 1 week ago