
3093 Informatica Jobs - Page 14

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

5.0 - 10.0 years

4 - 7 Lacs

Bengaluru

Work from Office

Source: Naukri

Development of workflows and connectors for the Collibra Platform; administration and configuration of the Collibra Platform.

Duties:
- Collibra DGC administration and configuration
- Collibra Connect administration and configuration
- Development of Collibra workflows and MuleSoft connectors
- Ingesting metadata from external sources into Collibra
- Installation, upgrade, and administration of Collibra components
- Setup, support, deployment, and migration of Collibra components
- Implement application changes: review and deploy code packages, perform post-implementation verifications
- Participate in group meetings (including business partners) for problem solving, decision making, and implementation planning

Senior Collibra Developer - mandatory skills:
- Collibra Connect
- Collibra DGC
- Java
- Advanced hands-on working knowledge of Unix/Linux
- Advanced hands-on experience with UNIX scripting
- SQL Server
- Groovy

Posted 3 days ago

Apply

6.0 - 10.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Source: Naukri

We are seeking a skilled and experienced Cognos and Informatica Administrator to join our team. You will be responsible for the installation, configuration, maintenance, and support of Cognos and Informatica software in our organization. Your role will involve collaborating with cross-functional teams, resolving system issues, and ensuring the smooth functioning of the Cognos and Informatica environments.

Role Scope / Deliverables - Responsibilities:
- Install, configure, and upgrade Cognos and Informatica application components, including servers, clients, and related tools.
- Monitor and maintain the performance, availability, and security of Cognos and Informatica environments.
- Collaborate with developers, business analysts, and other stakeholders to understand requirements and provide technical guidance.
- Troubleshoot and resolve issues related to Cognos and Informatica applications, databases, servers, and integrations.
- Perform system backups, disaster recovery planning, and implementation.
- Implement and enforce best practices for Cognos and Informatica administration, security, and performance tuning.
- Manage user access, roles, and permissions within Cognos and Informatica environments.
- Coordinate with vendors for product support, patches, upgrades, and license management.
- Stay up to date with the latest trends and advancements in Cognos and Informatica technologies.
- Document technical processes, procedures, and configurations.

Nice-to-Have Skills:
- Development skills: familiarity with Cognos Report Studio, Framework Manager, Informatica PowerCenter, and other development tools to assist in troubleshooting and providing guidance to developers and users.
- Power BI experience: practiced in designing and building dashboards in Power BI, or Power BI administration experience.
- Microsoft SQL Server Analysis Services (SSAS) experience: install, configure, and maintain SSAS environments; proven knowledge as an SSAS administrator.
- Databricks experience: knowledge of Databricks and a strong understanding of its architecture, capabilities, and best practices.

Key Skills / Requirements:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Cognos and Informatica Administrator or in a similar role.
- Solid understanding of Cognos and Informatica installation, configuration, and administration.
- Familiarity with relational databases, SQL, and data warehousing concepts.
- Excellent troubleshooting and problem-solving skills.
- Ability to work independently and collaboratively in a team environment.
- Strong communication and interpersonal skills.
- Attention to detail and ability to prioritize tasks effectively.

Posted 3 days ago

Apply

5.0 - 10.0 years

4 - 7 Lacs

Mumbai

Hybrid

Source: Naukri

PF Detection is mandatory.

1. Minimum 5 years of experience in database development and ETL tools.
2. Strong expertise in SQL and database platforms (e.g., SQL Server, Oracle, PostgreSQL).
3. Proficiency in ETL tools (e.g., Informatica, SSIS, Talend, DataStage) and scripting languages (e.g., Python, Shell).
4. Experience with data modeling and schema design.
5. Familiarity with cloud databases and ETL tools (e.g., AWS Glue, Azure Data Factory, Snowflake).
6. Understanding of data warehousing concepts and best practices.
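For a sense of what the data modeling and schema design in point 4 looks like in practice, here is a minimal star-schema sketch using Python's built-in sqlite3 (the table and column names are hypothetical, purely for illustration):

```python
import sqlite3

# A minimal star schema: one fact table keyed to dimension tables,
# the kind of layout a schema-design exercise like this produces.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, day TEXT);
    CREATE TABLE fact_sales  (
        product_id INTEGER REFERENCES dim_product(product_id),
        date_id    INTEGER REFERENCES dim_date(date_id),
        units      INTEGER
    );
    INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget');
    INSERT INTO dim_date VALUES (10, '2025-06-01');
    INSERT INTO fact_sales VALUES (1, 10, 5), (2, 10, 3), (1, 10, 2);
""")

# A typical warehouse query: aggregate the fact table, label via a dimension.
rows = conn.execute("""
    SELECT p.name, SUM(f.units) AS total_units
    FROM fact_sales f
    JOIN dim_product p ON p.product_id = f.product_id
    GROUP BY p.name
    ORDER BY p.name
""").fetchall()

print(rows)   # [('gadget', 3), ('widget', 7)]
```

The same dimensional layout carries over directly to SQL Server, Oracle, PostgreSQL, or Snowflake; only the DDL dialect changes.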

Posted 3 days ago

Apply

6.0 - 11.0 years

3 - 7 Lacs

Karnataka

Hybrid

Source: Naukri

PF Detection is mandatory. Looking for a candidate with over 6 years of hands-on involvement in Snowflake. The primary expertise required is in Snowflake: the candidate must be capable of creating complex SQL queries for manipulating data and should excel at implementing complex scenarios within Snowflake. The candidate should also possess a strong foundation in Informatica PowerCenter, showcasing proficiency in executing ETL processes.
- Strong hands-on experience in SQL and RDBMS
- Strong hands-on experience in Unix shell scripting
- Knowledge of data warehousing and cloud data warehousing
- Good communication skills

Posted 3 days ago

Apply

0.0 years

6 - 9 Lacs

Hyderābād

On-site

Source: GlassDoor

Our vision is to transform how the world uses information to enrich life for all. Micron Technology is a world leader in innovating memory and storage solutions that accelerate the transformation of information into intelligence, inspiring the world to learn, communicate and advance faster than ever.

Responsibilities and Tasks:

Understand the Business Problem and the Relevant Data:
- Maintain an intimate understanding of company and department strategy
- Translate analysis requirements into data requirements
- Identify and understand the data sources that are relevant to the business problem
- Develop conceptual models that capture the relationships within the data
- Define the data-quality objectives for the solution
- Be a subject matter expert in data sources and reporting options

Architect Data Management Systems:
- Design and implement optimum data structures in the appropriate data management system (Hadoop, Teradata, SQL Server, etc.) to satisfy the data requirements
- Plan methods for archiving/deletion of information

Develop, Automate, and Orchestrate an Ecosystem of ETL Processes for Varying Volumes of Data:
- Identify and select the optimum methods of access for each data source (real-time/streaming, delayed, static)
- Determine transformation requirements and develop processes to bring structured and unstructured data from the source to a new physical data model
- Develop processes to efficiently load the transformed data into the data management system

Prepare Data to Meet Analysis Requirements:
- Work with the data scientist to implement strategies for cleaning and preparing data for analysis (e.g., outliers, missing data, etc.)
- Develop and code data extracts
- Follow standard methodologies to ensure data quality and data integrity
- Ensure that the data is fit to use for data science applications

Qualifications and Experience:
- 0-7 years of experience developing, delivering, and/or supporting data engineering, advanced analytics or business intelligence solutions
- Ability to work with multiple operating systems (e.g., MS Office, Unix, Linux, etc.)
- Experienced in developing ETL/ELT processes using Apache NiFi and Snowflake
- Significant experience with big data processing and/or developing applications and data sources via Hadoop, YARN, Hive, Pig, Sqoop, MapReduce, HBase, Flume, etc.
- Understanding of how distributed systems work
- Familiarity with software architecture (data structures, data schemas, etc.)
- Strong working knowledge of databases (Oracle, MSSQL, etc.), including SQL and NoSQL
- Strong mathematics background; analytical, problem-solving, and organizational skills
- Strong communication skills (written, verbal and presentation)
- Experience working in a global, multi-functional environment
- Minimum of 2 years' experience in any of the following: at least one high-level, object-oriented language (e.g., C#, C++, Java, Python, Perl, etc.); one or more web programming languages (PHP, MySQL, Python, Perl, JavaScript, ASP, etc.); one or more data extraction tools (SSIS, Informatica, etc.); software development
- Ability to travel as needed

Education: B.S. degree in Computer Science, Software Engineering, Electrical Engineering, Applied Mathematics or a related field of study. M.S. degree preferred.

About Micron Technology, Inc.: We are an industry leader in innovative memory and storage solutions transforming how the world uses information to enrich life for all. With a relentless focus on our customers, technology leadership, and manufacturing and operational excellence, Micron delivers a rich portfolio of high-performance DRAM, NAND, and NOR memory and storage products through our Micron® and Crucial® brands. Every day, the innovations that our people create fuel the data economy, enabling advances in artificial intelligence and 5G applications that unleash opportunities from the data center to the intelligent edge and across the client and mobile user experience. To learn more, please visit micron.com/careers

All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran or disability status. To request assistance with the application process and/or for reasonable accommodations, please contact hrsupport_india@micron.com

Micron prohibits the use of child labor and complies with all applicable laws, rules, regulations, and other international and industry labor standards. Micron does not charge candidates any recruitment fees or unlawfully collect any other payment from candidates as consideration for their employment with Micron.

AI alert: Candidates are encouraged to use AI tools to enhance their resume and/or application materials. However, all information provided must be accurate and reflect the candidate's true skills and experiences. Misuse of AI to fabricate or misrepresent qualifications will result in immediate disqualification.

Fraud alert: Micron advises job seekers to be cautious of unsolicited job offers and to verify the authenticity of any communication claiming to be from Micron by checking the official Micron careers website.

Posted 3 days ago

Apply

5.0 years

8 - 9 Lacs

Hyderābād

On-site

Source: GlassDoor

About this role: Wells Fargo is seeking a Lead Data Engineer.

In this role, you will:
- Lead complex initiatives with broad impact and act as a key participant in large-scale software planning for the Technology area
- Design, develop, and run tooling to discover problems in data and applications, and report the issues to engineering and product leadership
- Review and analyze complex software enhancement initiatives for business, operational or technical improvements that require in-depth evaluation of multiple factors, including intangibles or unprecedented factors
- Make decisions in complex and multi-faceted data engineering situations requiring understanding of software package options, programming languages, and compliance requirements that influence and lead Technology to meet deliverables and drive organizational change
- Strategically collaborate and consult with internal partners to resolve highly risky data engineering challenges

Required Qualifications:
- 5+ years of Database Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education

Desired Qualifications:
- Expertise in Informatica PowerCenter 10.2, Oracle, Unix, and Autosys

Job Expectations:
- Able to work individually and alongside the US counterpart
- Able to handle production issues at the data level
- Able to create new jobs and schedule them within the time limit; should follow the agile JIRA process and adhere to JIRA standards
- Should handle CRs effectively and get jobs to production

Posting End Date: 26 Jun 2025 *Job posting may come down early due to volume of applicants.

We Value Equal Opportunity: Wells Fargo is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other legally protected characteristic.
Employees support our focus on building strong customer relationships balanced with a strong risk mitigating and compliance-driven culture which firmly establishes those disciplines as critical to the success of our customers and company. They are accountable for execution of all applicable risk programs (Credit, Market, Financial Crimes, Operational, Regulatory Compliance), which includes effectively following and adhering to applicable Wells Fargo policies and procedures, appropriately fulfilling risk and compliance obligations, timely and effective escalation and remediation of issues, and making sound risk decisions. There is emphasis on proactive monitoring, governance, risk identification and escalation, as well as making sound risk decisions commensurate with the business unit's risk appetite and all risk and compliance program requirements. Candidates applying to job openings posted in Canada: Applications for employment are encouraged from all qualified candidates, including women, persons with disabilities, aboriginal peoples and visible minorities. Accommodation for applicants with disabilities is available upon request in connection with the recruitment process. Applicants with Disabilities To request a medical accommodation during the application or interview process, visit Disability Inclusion at Wells Fargo . Drug and Alcohol Policy Wells Fargo maintains a drug free workplace. Please see our Drug and Alcohol Policy to learn more. Wells Fargo Recruitment and Hiring Requirements: a. Third-Party recordings are prohibited unless authorized by Wells Fargo. b. Wells Fargo requires you to directly represent your own experiences during the recruiting and hiring process.

Posted 3 days ago

Apply

10.0 years

6 - 11 Lacs

Hyderābād

On-site

Source: GlassDoor

Country: India | Working Schedule: Full-Time | Work Arrangement: Hybrid | Relocation Assistance Available: No | Posted Date: 25-Jun-2025 | Job ID: 9085

Description and Requirements: This position is responsible for the design, implementation, and support of MetLife's enterprise data management and integration systems, the underlying infrastructure, and integrations with other enterprise systems and applications using AIX, Linux, or Microsoft technologies.

Job Responsibilities:
- Provide technical expertise in the planning, engineering, design, implementation and support of data management and integration system infrastructures and technologies, including the systems' operational procedures and processes
- Partner with the Capacity Management, Production Management, and Application Development teams and the business to ensure customer expectations are maintained and exceeded
- Participate in the evaluation and recommendation of new products and technologies; maintain knowledge of emerging technologies for application to the enterprise
- Identify and resolve complex data management and integration system issues (Tier 3 support) utilizing product knowledge and structured troubleshooting tools and techniques
- Support disaster recovery implementation and testing as required
- Experience in designing and developing automation/scripting (shell, Perl, PowerShell, Python, Java, etc.)
- Good decision-making skills
- Take ownership of the deliverables from the entire team
- Strong collaboration with leadership groups
- Learn new technologies based on demand
- Coach other team members and bring them up to speed
- Track project status working with team members and report to leadership
- Participate in cross-departmental efforts
- Lead initiatives within the community of practice
- Willingness to work in rotational shifts
- Good communication skills, with the ability to communicate clearly and effectively

Knowledge, Skills and Abilities

Education: Bachelor's degree in Computer Science, Information Systems, or a related field.

Experience: 10+ years of total experience, with at least 7+ years in Informatica application implementation and in support of data management and integration system infrastructures and technologies, including their operational procedures and processes. Participation in the evaluation and recommendation of new products and technologies, maintaining knowledge of emerging technologies for application to the enterprise. Good understanding of disaster recovery implementation and testing. Design and development of automation/scripting (shell, Perl, PowerShell, Python, Java, etc.).

Technologies:
- Informatica PowerCenter, PWX, DQ, DEI, B2B/DX, MFT, MDM, ILM, and Informatica Cloud (IDMC/IICS)
- Ansible (automation)
- Operating system knowledge (Linux/Windows/AIX)
- Azure DevOps pipeline knowledge
- Python and/or PowerShell
- Agile SAFe for Teams
- Enterprise scheduling knowledge (Maestro)
- Troubleshooting and communications
- CP4D, DataStage, Mainframe z/OS knowledge
- OpenShift, Elastic
- Experience in creating and working on ServiceNow tasks/tickets

About MetLife: Recognized on Fortune magazine's list of the 2025 "World's Most Admired Companies" and Fortune World's 25 Best Workplaces™ for 2024, MetLife, through its subsidiaries and affiliates, is one of the world's leading financial services companies, providing insurance, annuities, employee benefits and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East. Our purpose is simple: to help our colleagues, customers, communities, and the world at large create a more confident future. United by purpose and guided by empathy, we're inspired to transform the next century in financial services. At MetLife, it's #AllTogetherPossible. Join us!

Posted 3 days ago

Apply

3.0 - 7.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Source: LinkedIn

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Job Title: Informatica IDG Specialist / Consultant - Senior

Job Summary: We are looking for an experienced Informatica IDG (Data Governance) professional to lead and support our enterprise data governance initiatives. The candidate will be responsible for configuring and deploying Informatica Axon, Enterprise Data Catalog (EDC), and Data Privacy Management (DPM) tools to establish robust governance, data discovery, metadata management, and regulatory compliance across the organization.

Key Responsibilities:
- Implement and configure Informatica IDG components, including Axon Data Governance, Enterprise Data Catalog (EDC), and Data Privacy Management (DPM).
- Collaborate with data owners, stewards, and business users to define and maintain business glossaries, data domains, policies, and governance workflows.
- Integrate IDG with other platforms (IDQ, MDM, IICS, PowerCenter, Snowflake, etc.) to ensure metadata lineage and impact analysis.
- Design and implement data governance strategies that align with data privacy regulations (GDPR, CCPA, etc.) and internal compliance requirements.
- Create and maintain data lineage maps, stewardship dashboards, and data quality insights using Informatica tools.
- Define and enforce role-based access controls and security configurations within IDG tools.
- Support adoption of data governance processes, including stewardship, policy approval, and issue resolution.
- Train business users and data stewards on using Axon, EDC, and other governance components.
- Ensure the sustainability of governance programs through change management, documentation, and governance councils.

Required Qualifications:
- 3-7 years of experience in Informatica Data Governance (IDG) or related tools.
- Strong hands-on experience with Informatica Axon, Enterprise Data Catalog (EDC), and Data Privacy Management (DPM).
- Understanding of data governance frameworks, metadata management, and policy management.
- Familiarity with data classification, data lineage, and data stewardship workflows.
- Experience with metadata ingestion and cataloging across hybrid/cloud platforms.
- Solid SQL skills and familiarity with cloud data platforms (AWS, Azure, GCP, Snowflake, etc.).
- Strong communication, stakeholder engagement, and documentation skills.

Preferred Qualifications:
- Informatica certifications in Axon, EDC, or Data Governance.
- Experience with Data Quality (IDQ), Master Data Management (MDM), or IICS.
- Knowledge of CDGC will be an added value.
- Knowledge of industry-specific regulations and data governance mandates.
- Familiarity with governance best practices from DAMA-DMBOK, DCAM, or similar frameworks.

EY | Building a better working world: EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 3 days ago

Apply

7.0 - 12.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Source: Naukri

Hi, greetings from IDESLABS. This is Navya from IDESLABS; we have a requirement on ETL Testing for one of our clients for a contract-to-hire role.

Job details:
- Skills: ETL Testing
- Experience: 7+ years
- Location: Bangalore
- Job type: Contract to hire
- Payroll company: IDESLABS
- Work model: Hybrid

JD (primary skill: ETL testing / strong SQL): Looking for an ETL/DB tester with 5+ years of experience.
- Should have strong SQL skills
- Should have hands-on coding knowledge in any scripting language
- Should be able to design and write SQL queries for data validation
- Should be able to verify and test ETL processes
- Understanding of data warehousing concepts is a plus
- Good communication skills and a testing mindset
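The "SQL queries for data validation" asked for here typically reconcile an ETL target against its source. A minimal sketch using Python's built-in sqlite3 (the tables and values are hypothetical, just to show the shape of the checks):

```python
import sqlite3

# In-memory database standing in for a source system and an ETL target
# (hypothetical tables; real validation runs against the actual warehouse).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 25.0);  -- row 3 missing, row 2 altered
""")

# Row-count reconciliation: source vs. target
src_count = conn.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
tgt_count = conn.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]

# Row-level diff: rows present in source but absent (or altered) in target
missing = conn.execute("""
    SELECT order_id, amount FROM src_orders
    EXCEPT
    SELECT order_id, amount FROM tgt_orders
    ORDER BY order_id
""").fetchall()

print(src_count, tgt_count)   # 3 2
print(missing)                # [(2, 20.0), (3, 30.0)]
```

A tester would wrap queries like these in the scripting language of choice and fail the test run whenever the diff is non-empty.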

Posted 3 days ago

Apply

3.0 - 6.0 years

6 - 9 Lacs

Hyderabad

Work from Office

Source: Naukri

Spark & Delta Lake:
- Understanding of Spark core concepts: RDDs, DataFrames, Datasets, Spark SQL and Spark Streaming
- Experience with Spark optimization techniques
- Deep knowledge of Delta Lake features such as time travel, schema evolution, and data partitioning
- Ability to design and implement data pipelines using Spark, with Delta Lake as the data storage layer
- Proficiency in Python/Scala/Java for Spark development and integration with ETL processes
- Knowledge of data ingestion techniques from various sources (flat files, CSV, API, database)
- Understanding of data quality best practices and data validation techniques

Other Skills:
- Understanding of data warehouse concepts and data modelling techniques
- Expertise in Git for code management
- Familiarity with CI/CD pipelines and containerization technologies
- Nice to have: experience using data integration tools like DataStage/Prophecy/Informatica/Ab Initio
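The ingestion-plus-validation items above can be sketched with Python's standard csv module (the file layout and rules are hypothetical; a production pipeline would express the same checks as Spark transformations over DataFrames):

```python
import csv
import io

# Hypothetical flat-file extract; in practice this would be read from disk or object storage.
raw = io.StringIO(
    "order_id,customer,amount\n"
    "1,alice,10.5\n"
    "2,,20.0\n"        # missing customer
    "3,carol,oops\n"   # non-numeric amount
)

expected_columns = ["order_id", "customer", "amount"]
errors = []

reader = csv.DictReader(raw)
# Schema check: the header must match the expected layout exactly.
if reader.fieldnames != expected_columns:
    errors.append(f"unexpected header: {reader.fieldnames}")

# Row-level data-quality checks: required fields and type validity.
for line_no, row in enumerate(reader, start=2):
    if not row["customer"]:
        errors.append(f"line {line_no}: customer is empty")
    try:
        float(row["amount"])
    except ValueError:
        errors.append(f"line {line_no}: amount {row['amount']!r} is not numeric")

print(errors)
```

In a Spark/Delta pipeline the same intent shows up as schema enforcement on write and filter/quarantine steps for rows failing validation.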

Posted 3 days ago

Apply

3.0 years

9 - 10 Lacs

Gurgaon

On-site

Source: GlassDoor

About the Role: Grade Level (for internal use): 09

S&P Global Mobility

The Role: ETL Developer

The Team: The ETL team forms an integral part of Global Data Operations (GDO) and caters to the North America & EMEA automotive business line. Core responsibilities include translating business requirements into technical design and ETL jobs, along with unit testing, integration testing, regression testing, deployments & production operations. The team is an energetic and dynamic group of individuals, always looking to work through a challenge. Ownership, raising the bar and innovation is what the team runs on!

The Impact: The ETL team, being part of GDO, caters to the automotive business line and helps stakeholders with an optimum solution for their data needs. The role requires close coordination with global teams such as other development teams, research analysts, quality assurance analysts, architects etc. The role is vital for the automotive business as it involves providing highly efficient data solutions with high accuracy to various stakeholders. The role forms a bridge between the business and technical stakeholders.

What’s in it for you:
- Constant learning, working in a dynamic and challenging environment!
- Total Rewards. Monetary, beneficial, and developmental rewards!
- Work Life Balance. You can't do a good job if your job is all you do!
- Diversity & Inclusion. HeForShe!
- Internal Mobility. Grow with us!

Responsibilities: Using prior experience with file loading, cleansing and standardization, translate business requirements into ETL design and efficient ETL solutions using Informatica PowerCenter (mandatory) and Talend Enterprise (preferred); knowledge of TIBCO would be a preferred skill as well. Understand relational database technologies and data warehousing concepts and processes. Using prior experience with high-volume data processing, deal with complex technical issues. Works closely with all levels of management and employees across the Automotive business line. Participates as part of cross-functional teams responsible for investigating issues, proposing solutions and implementing corrective actions. Good communication skills are required for interfacing with various stakeholder groups; detail-oriented with analytical skills.

What We’re Looking For: The ETL development team within the Mobility domain is looking for a Software Engineer to work on design, development & operations efforts in the ETL (Informatica) domain.

Primary skills and qualifications required:
- Experience with Informatica and/or Talend ETL tools
- Bachelor’s degree in Computer Science, with at least 3+ years of development and maintenance of ETL systems on Informatica PowerCenter and 1+ years of SQL experience
- 3+ years of Informatica design and architecture experience and 1+ years of optimization and performance tuning of ETL code on Informatica
- 1+ years of Python development experience, plus SQL and XML experience
- Working knowledge or greater of cloud-based technologies, development, and operations is a plus

About S&P Global Mobility: At S&P Global Mobility, we provide invaluable insights derived from unmatched automotive data, enabling our customers to anticipate change and make decisions with conviction. Our expertise helps them to optimize their businesses, reach the right consumers, and shape the future of mobility. We open the door to automotive innovation, revealing the buying patterns of today and helping customers plan for the emerging technologies of tomorrow. For more information, visit www.spglobal.com/mobility .

What’s In It For You? Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world.
Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include: Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. 
Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com . S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, “pre-employment training” or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here . 
----------------------------------------------------------- Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf ----------------------------------------------------------- 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning) Job ID: 316976 Posted On: 2025-06-25 Location: Gurgaon, Haryana, India

Posted 3 days ago

Apply

15.0 years

0 Lacs

Bhubaneshwar

On-site


Project Role: Advanced Application Engineer
Project Role Description: Develop innovative technology solutions for emerging industries and products. Interpret system requirements into design specifications.
Must have skills: Informatica MDM
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As an Advanced Application Engineer, you will engage in the development of innovative technology solutions tailored for emerging industries and products. Your typical day will involve interpreting system requirements and translating them into detailed design specifications, ensuring that the solutions meet the needs of the business and its clients. You will collaborate with cross-functional teams to refine these specifications and contribute to the overall success of the projects you are involved in, while staying updated on the latest technological advancements in your field.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of design specifications and system requirements.
- Engage in continuous learning to stay abreast of industry trends and technologies.
Professional & Technical Skills:
- Must Have Skills: Proficiency in Informatica MDM.
- Strong understanding of data integration and data quality processes.
- Experience with data modeling and metadata management.
- Familiarity with ETL processes and data warehousing concepts.
- Ability to troubleshoot and resolve data-related issues efficiently.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Informatica MDM.
- This position is based at our Bhubaneswar office.
- A 15 years full time education is required.

Posted 3 days ago

Apply

7.0 - 12.0 years

6 - 9 Lacs

Hyderabad

Work from Office


Understanding of Spark core concepts like RDDs, DataFrames, Datasets, Spark SQL and Spark Streaming. Experience with Spark optimization techniques. Deep knowledge of Delta Lake features like time travel, schema evolution, and data partitioning. Ability to design and implement data pipelines using Spark with Delta Lake as the data storage layer. Proficiency in Python/Scala/Java for Spark development and integration with ETL processes. Knowledge of data ingestion techniques from various sources (flat files, CSV, API, database). Understanding of data quality best practices and data validation techniques. Other Skills: Understanding of data warehouse concepts and data modelling techniques. Expertise in Git for code management. Familiarity with CI/CD pipelines and containerization technologies. Nice to have: experience using data integration tools like DataStage/Prophecy/Informatica/Ab Initio.
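The data validation techniques this role mentions can be sketched independently of Spark. Below is a minimal, hypothetical rule-based record validator in plain Python; the rule names and fields are invented for illustration, not taken from the posting:

```python
# Minimal rule-based record validator, illustrating the kind of data quality
# checks a pipeline might apply before loading. All names are illustrative.

def validate_record(record, rules):
    """Return the names of all rules the record violates."""
    return [name for name, check in rules.items() if not check(record)]

# Example rules: non-null key, positive amount, known currency code.
RULES = {
    "id_present": lambda r: r.get("id") is not None,
    "amount_positive": lambda r: isinstance(r.get("amount"), (int, float)) and r["amount"] > 0,
    "currency_known": lambda r: r.get("currency") in {"USD", "EUR", "INR"},
}

good = {"id": 1, "amount": 10.5, "currency": "INR"}
bad = {"id": None, "amount": -3, "currency": "XYZ"}

print(validate_record(good, RULES))  # []
print(validate_record(bad, RULES))   # ['id_present', 'amount_positive', 'currency_known']
```

In a real Spark job the same rules would typically be expressed as DataFrame filter expressions, so that invalid rows can be routed to a quarantine table rather than dropped silently.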

Posted 3 days ago

Apply

3.0 years

7 - 9 Lacs

Calcutta

On-site


Line of Service: Advisory
Industry/Sector: FS X-Sector
Specialism: Data, Analytics & AI
Management Level: Associate
Job Description & Summary: At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.
Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Responsibilities:
Senior Associate, Exp: 3 - 6 Years, Location: Kolkata
Technical Skills:
· Strong expertise in Azure Databricks, Azure Data Factory (ADF), PySpark, SQL Server, and Python.
· Solid understanding of Azure Functions and their application in data processing workflows.
· Understanding of DevOps practices and CI/CD pipelines for data solutions.
· Experience with other ETL tools such as Informatica Intelligent Cloud Services (IICS) is a plus.
· Strong problem-solving skills and ability to work independently and collaboratively in a fast-paced environment.
· Excellent communication skills to effectively convey technical concepts to non-technical stakeholders.
Key Responsibilities:
· Develop, maintain, and optimize scalable data pipelines using Azure Databricks, Azure Data Factory (ADF), and PySpark.
· Collaborate with data architects and business stakeholders to translate requirements into technical solutions.
· Implement and manage data integration processes using SQL Server and Python.
· Design and deploy Azure Functions to support data processing workflows.
· Monitor and troubleshoot data pipeline performance and reliability issues.
· Ensure data quality, security, and compliance with industry standards and best practices.
· Document technical specifications and maintain clear and concise project documentation.
Mandatory skill sets: Azure Databricks, Azure Data Factory (ADF), and PySpark.
Preferred skill sets: Azure Databricks, Azure Data Factory (ADF), and PySpark.
Years of experience required: 3-6 Years
Education qualification: B.E. (B.Tech) / M.E. / M.Tech
Degrees/Field of Study required: Bachelor of Engineering, Master of Engineering
Required Skills: Microsoft Azure
Optional Skills: Accepting Feedback, Active Listening, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis, Intellectual Curiosity, Java (Programming Language), Market Development {+ 11 more}
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No

Posted 3 days ago

Apply

3.0 years

7 - 9 Lacs

Calcutta

On-site


Line of Service: Advisory
Industry/Sector: FS X-Sector
Specialism: Operations
Management Level: Senior Associate
Job Description & Summary: At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.
Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Job Description & Summary – Senior Associate – Azure Data Engineer
Responsibilities:
Role: Senior Associate, Exp: 3 - 6 Years, Location: Kolkata
Technical Skills:
· Strong expertise in Azure Databricks, Azure Data Factory (ADF), PySpark, SQL Server, and Python.
· Solid understanding of Azure Functions and their application in data processing workflows.
· Understanding of DevOps practices and CI/CD pipelines for data solutions.
· Experience with other ETL tools such as Informatica Intelligent Cloud Services (IICS) is a plus.
· Strong problem-solving skills and ability to work independently and collaboratively in a fast-paced environment.
· Excellent communication skills to effectively convey technical concepts to non-technical stakeholders.
Key Responsibilities:
· Develop, maintain, and optimize scalable data pipelines using Azure Databricks, Azure Data Factory (ADF), and PySpark.
· Collaborate with data architects and business stakeholders to translate requirements into technical solutions.
· Implement and manage data integration processes using SQL Server and Python.
· Design and deploy Azure Functions to support data processing workflows.
· Monitor and troubleshoot data pipeline performance and reliability issues.
· Ensure data quality, security, and compliance with industry standards and best practices.
· Document technical specifications and maintain clear and concise project documentation.
Mandatory skill sets: Azure Databricks, Azure Data Factory (ADF), and PySpark.
Preferred skill sets: Azure Databricks, Azure Data Factory (ADF), and PySpark.
Years of experience required: 3-6 Years
Education qualification: B.E. (B.Tech) / M.E. / M.Tech
Degrees/Field of Study required: Bachelor of Engineering, Master of Engineering
Required Skills: Microsoft Azure, PySpark
Optional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis {+ 16 more}
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No

Posted 3 days ago

Apply

5.0 - 10.0 years

6 - 10 Lacs

Hyderabad

Work from Office


SSIS Senior Developer: At least 5+ years of data integration (sourcing, staging, mapping, loading) experience, SSIS preferred. Demonstrated experience with an enterprise-class integration tool such as SSIS, Informatica, Ab Initio, or DataStage. Demonstrated experience working in a team development environment using an IDE.

Posted 3 days ago

Apply

3.0 - 8.0 years

3 - 7 Lacs

Hyderabad

Work from Office


Minimum 3 to 5 years of Talend developer experience. Work on user stories and develop Talend jobs following best practices. Create detailed technical design documents for Talend job development work. Work with the SIT team on defect fixing for Talend components. Note: IBM Maximo tool knowledge would be an advantage for ConEd, but is not mandatory.

Posted 3 days ago

Apply

3.0 - 7.0 years

0 Lacs

Kanayannur, Kerala, India

On-site

Linkedin logo

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Job Title: Informatica IDG Specialist / Consultant-Senior Job Summary: We are looking for an experienced Informatica IDG (Data Governance) professional to lead and support our enterprise data governance initiatives. The candidate will be responsible for configuring and deploying Informatica Axon, Enterprise Data Catalog (EDC), and Data Privacy Management (DPM) tools to establish robust governance, data discovery, metadata management, and regulatory compliance across the organization. Key Responsibilities: Implement and configure Informatica IDG components including Axon Data Governance, Enterprise Data Catalog (EDC), and Data Privacy Management (DPM). Collaborate with data owners, stewards, and business users to define and maintain business glossaries, data domains, policies, and governance workflows. Integrate IDG with other platforms (IDQ, MDM, IICS, PowerCenter, Snowflake, etc.) to ensure metadata lineage and impact analysis. Design and implement data governance strategies that align with data privacy regulations (GDPR, CCPA, etc.) and internal compliance requirements. Create and maintain data lineage maps, stewardship dashboards, and data quality insights using Informatica tools. Define and enforce role-based access controls and security configurations within IDG tools. Support adoption of data governance processes, including stewardship, policy approval, and issue resolution. Train business users and data stewards on using Axon, EDC, and other governance components. Ensure the sustainability of governance programs through change management, documentation, and governance councils.
Required Qualifications: 3-7 years of experience in Informatica Data Governance (IDG) or related tools. Strong hands-on experience with Informatica Axon, Enterprise Data Catalog (EDC), and Data Privacy Management (DPM). Understanding of data governance frameworks, metadata management, and policy management. Familiarity with data classification, data lineage, and data stewardship workflows. Experience with metadata ingestion and cataloging across hybrid/cloud platforms. Solid SQL skills and familiarity with cloud data platforms (AWS, Azure, GCP, Snowflake, etc.). Strong communication, stakeholder engagement, and documentation skills. Preferred Qualifications: Informatica certifications in Axon, EDC, or Data Governance. Experience with Data Quality (IDQ), Master Data Management (MDM), or IICS. Knowledge of CDGC will be an added advantage. Knowledge of industry-specific regulations and data governance mandates. Familiarity with governance best practices from DAMA-DMBOK, DCAM, or similar frameworks. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 3 days ago

Apply

12.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

About Atos Atos is a global leader in digital transformation, operating in 73 countries with annual revenue of €12 billion. European number one in Cloud, Cybersecurity and High-Performance Computing, the Group provides end-to-end Orchestrated Hybrid Cloud, Big Data, Business Applications and Digital Workplace solutions. The Group is the Worldwide Information Technology Partner for the Olympic & Paralympic Games and operates under the brands Atos, Atos|Syntel, and Unify. Atos is an SE (Societas Europaea), listed on the CAC40 Paris stock index. The purpose of Atos is to help design the future of the information space. Its expertise and services support the development of knowledge, education and research in a multicultural approach and contribute to the development of scientific and technological excellence. Across the world, the Group enables its customers, employees, and members of societies at large to live, work and develop sustainably, in a safe and secure information space. Role Overview The Technical Architect specializes in traditional ETL tools such as Informatica Intelligent Cloud Services (IICS) and similar technologies. The jobholder designs, implements, and oversees robust ETL solutions to support our organization's data integration and transformation needs. Responsibilities Design and develop scalable ETL architectures using tools like IICS and other traditional ETL platforms. Collaborate with stakeholders to gather requirements and translate them into technical solutions. Ensure data quality, integrity, and security throughout the ETL processes. Optimize ETL workflows for performance and reliability. Provide technical leadership and mentorship to development teams. Troubleshoot and resolve complex technical issues related to ETL processes. Document architectural designs and decisions for future reference. Stay updated with emerging trends and technologies in ETL and data integration.
Key Technical Skills & Responsibilities 12+ years of experience in data integration and ETL development, with at least 3 years in an Informatica architecture role. Extensive expertise in Informatica PowerCenter, IICS, and related tools (Data Quality, EDC, MDM). Proven track record of designing ETL solutions for enterprise-scale data environments. Advanced proficiency in Informatica PowerCenter and IICS for ETL/ELT design and optimization. Strong knowledge of SQL, Python, or Java for custom transformations and scripting. Experience with data warehousing platforms (Snowflake, Redshift, Azure Synapse) and data lakes. Familiarity with cloud platforms (AWS, Azure, GCP) and their integration services. Expertise in data modeling, schema design, and integration patterns. Knowledge of CI/CD, Git, and infrastructure-as-code (e.g., Terraform). Experience working on proposals, customer workshops, assessments, etc. is preferred. Must have good communication and presentation skills. Primary Skills: Informatica, IICS, Data Lineage and Metadata Management, Data Modeling, Data Governance, Data Integration Architectures, Informatica Data Quality. Eligibility Criteria Bachelor's degree in Computer Science, Information Technology, or a related field. Proven experience in ETL architecture and development using tools like IICS. Strong understanding of data integration, transformation, and warehousing concepts. Proficiency in SQL and scripting languages. Experience with cloud-based ETL solutions is a plus. Familiarity with Agile development methodologies. Excellent problem-solving and analytical skills. Strong communication and leadership abilities. Knowledge of data governance and compliance standards. Ability to work in a fast-paced environment and manage multiple priorities. Let’s grow together.

Posted 3 days ago

Apply

6.0 - 11.0 years

10 - 20 Lacs

Chennai

Hybrid


Data Engineer, AWS We're looking for a skilled Data Engineer with 5+ years of experience to join our team. You'll play a crucial role in building and optimizing our data infrastructure on AWS, transforming raw data into actionable insights that drive our business forward. What You'll Do: Design & Build Data Pipelines: Develop robust, scalable, and efficient ETL/ELT pipelines using AWS services and Informatica Cloud to move and transform data from diverse sources into our data lake (S3) and data warehouse (Redshift). Optimize Data Models & Architecture: Create and maintain performant data models and contribute to the overall data architecture to support both analytical and operational needs. Ensure Data Quality & Availability: Monitor and manage data flows, ensuring data accuracy, security, and consistent availability for all stakeholders. Collaborate for Impact: Work closely with data scientists, analysts, and business teams to understand requirements and deliver data solutions that drive business value. What You'll Bring: 5+ years of experience as a Data Engineer, with a strong focus on AWS. Proficiency in SQL for complex data manipulation and querying. Hands-on experience with core AWS data services: Storage: Amazon S3 (data lakes, partitioning). Data Warehousing: Amazon Redshift. ETL/ELT: AWS Glue (Data Catalog, crawlers). Serverless & Orchestration: AWS Lambda, AWS Step Functions. Security: AWS IAM. Reports: Looker and Power BI. Experience with big data technologies like PySpark. Experience with Informatica Cloud or similar ETL tools. Strong problem-solving skills and the ability to optimize data processes. Excellent communication skills and a collaborative approach. Added Advantage: Experience with Python for data manipulation and scripting. Looker or Power BI experience is desirable.
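The S3 partitioning this role references usually means Hive-style key prefixes, which AWS Glue crawlers and Redshift Spectrum can read as partition columns. A minimal sketch in plain Python; the bucket and table names are hypothetical:

```python
# Sketch of Hive-style date partitioning for an S3 data lake, of the kind
# referenced above ("Amazon S3 (data lakes, partitioning)"). The bucket and
# table names are invented for illustration; no AWS call is made here.
from datetime import date

def partition_key(bucket: str, table: str, d: date) -> str:
    """Build an S3 prefix with year/month/day partitions (Hive-style)."""
    return (f"s3://{bucket}/{table}/"
            f"year={d.year}/month={d.month:02d}/day={d.day:02d}/")

print(partition_key("analytics-lake", "orders", date(2025, 6, 25)))
# s3://analytics-lake/orders/year=2025/month=06/day=25/
```

Writing files under prefixes like this lets query engines prune partitions by date predicates instead of scanning the whole table.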

Posted 3 days ago

Apply

2.0 - 5.0 years

2 - 5 Lacs

Gurugram

Work from Office


Hiring for ServiceNow Integrations Developer specialists Job Title: ServiceNow Integrations Developer Location: Gurgaon, Haryana Experience: 2 to 4 Years Skill Required: ServiceNow with Integration Development Job Description: We are seeking a skilled and motivated ServiceNow Integration Developer with 2-4 years of experience to join our team. The ideal candidate will have hands-on experience designing, developing, and maintaining system integrations using modern middleware tools and APIs. You will work closely with business analysts, solution architects, and other developers to deliver scalable and secure integrations across platforms. Key Responsibilities: Design, develop, and maintain integrations using middleware platforms and APIs. Collaborate with stakeholders to gather integration requirements and develop technical designs. Build and manage RESTful and SOAP web services for system connectivity. Ensure data consistency and accuracy across systems through reliable integration practices. Monitor, troubleshoot, and enhance integration performance and system reliability. Maintain comprehensive documentation of integration workflows and configurations. Adhere to best practices for security, scalability, and high availability in development. Essential Skills: 2-4 years of experience in integration development or software engineering roles. Hands-on proficiency in integration platforms like MuleSoft, Dell Boomi, Apache Camel, WSO2, or Informatica. Strong understanding of API development (REST, SOAP), messaging systems (Kafka, RabbitMQ), and data formats (JSON, XML). Experience in scripting/programming languages such as Java, Python, or JavaScript. Familiarity with SQL Server, MySQL, or Oracle databases. Understanding of OAuth 2.0, SAML, SSL/TLS, and other security protocols. Strong problem-solving skills and the ability to work both independently and in a collaborative team. Desirable Skills (Good to Have): ServiceNow Integration Hub or Flow Designer experience.
Exposure to ITSM/ITOM processes within ServiceNow. Certification in integration platforms or ServiceNow would be an added advantage. Interested candidates can share their updated CV to mageswari.r@kiya.ai.
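The REST-plus-OAuth integration work described above can be sketched with nothing but the Python standard library. This builds (but does not send) an authenticated POST to a ServiceNow-style table endpoint; the instance name and token are hypothetical:

```python
# Illustrative only: constructing an authenticated REST request of the kind
# this integration role describes (REST + OAuth 2.0 bearer token). The
# instance and token are placeholders; the request is built but never sent.
import json
import urllib.request

def build_incident_request(instance: str, token: str, payload: dict) -> urllib.request.Request:
    """Return a POST request to the incident table with a bearer token."""
    url = f"https://{instance}/api/now/table/incident"
    body = json.dumps(payload).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        method="POST",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )

req = build_incident_request("example.service-now.com", "dummy-token",
                             {"short_description": "Disk space alert"})
print(req.get_method(), req.full_url)
```

In production a middleware platform (MuleSoft, Boomi, etc.) or ServiceNow Integration Hub would manage the token lifecycle and retries; the request shape stays the same.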

Posted 3 days ago

Apply

2.0 years

0 Lacs

Gurugram, Haryana, India

On-site


About ProcDNA ProcDNA is a global consulting firm. We fuse design thinking with cutting-edge tech to create game-changing Commercial Analytics and Technology solutions for our clients. We're a passionate team of 275+ across 6 offices, all growing and learning together since our launch during the pandemic. Here, you won't be stuck in a cubicle - you'll be out in the open water, shaping the future with brilliant minds. At ProcDNA, innovation isn't just encouraged, it's ingrained in our DNA. What We Are Looking For As the Associate Engagement Lead, you’ll leverage data to unravel complexities, devising strategic solutions that deliver tangible results for our clients. We are seeking an individual who not only possesses the requisite expertise but also thrives in the dynamic landscape of a fast-paced global firm. What You'll Do Collaborate with various teams to drive data design, identify architectural risks, and improve the data landscape by developing and refining data models. Utilize technical expertise in cloud, master data management, data warehousing, and data transformation to develop and test high-quality data solutions. Lead hands-on development, with familiarity in data mastering and configuration of Reltio and Informatica MDM for HCP and HCO subject areas. Document complex MDM solutions, including source-to-target mappings, match-merge rules, survivorship rules, and MDM architecture design considerations. Lead and deliver data-centric projects with a focus on data quality, adherence to data standards, and best practices.
Must Have Computer Science degree with 2-5 years in MDM configurations in the pharmaceutical domain. Should have done configurations on the MDM tool (match & merge, survivorship rules, base objects, queries and packages, and more). Expertise in MDM concepts and implementing complex MDM systems from scratch. Excellent communication and collaboration skills with the ability to manage multiple projects with a quality-focused approach. Experience in data modeling, development, and testing for enterprise solutions. Must have knowledge of Azure and AWS. Familiarity with healthcare data sources like SP/SD, IQVIA, and claims is a plus. Skills: Reltio, Informatica MDM, data management, data accuracy, testing, data, MDM configurations, AWS, data quality, MDM, commercial analytics, match-merge rules, data warehousing, Informatica, data transformation, SQL, data standards, cloud technologies (Azure, AWS), MDM systems
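The survivorship rules mentioned in this listing decide which attribute values win when matched records are merged into a golden record. A common rule is "most recently updated non-null value wins", sketched here in plain Python; the field names and source systems are invented for illustration, not tied to Reltio or Informatica MDM specifics:

```python
# Hedged sketch of an MDM survivorship rule: when merging matched records,
# the most recently updated non-null value survives per attribute.
# Sources, fields, and dates below are hypothetical examples.
from datetime import date

def survive(records, fields):
    """Merge matched records: latest non-null value per field survives."""
    ordered = sorted(records, key=lambda r: r["updated"], reverse=True)
    golden = {}
    for f in fields:
        golden[f] = next((r[f] for r in ordered if r.get(f) is not None), None)
    return golden

crm = {"source": "CRM", "updated": date(2024, 5, 1), "name": "Dr. A. Rao", "email": None}
erp = {"source": "ERP", "updated": date(2024, 3, 10), "name": "A Rao", "email": "a.rao@example.com"}

print(survive([crm, erp], ["name", "email"]))
# {'name': 'Dr. A. Rao', 'email': 'a.rao@example.com'}
```

Real MDM tools let you configure such rules per attribute (source-priority, most-recent, longest-value, etc.); the point here is only the shape of the logic.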

Posted 3 days ago

Apply

5.0 years

0 Lacs

Ahmedabad, Gujarat, India

Remote


Greetings from Synergy Resource Solutions, a leading recruitment consultancy firm. Our client is an ISO 27001:2013 and ISO 9001 certified company and a pioneer web design and development company from India. The company has also been voted among the Top 10 mobile app development companies in India. It is a leading IT consulting and web solution provider for custom software, websites, games, custom web applications, enterprise mobility, mobile apps and cloud-based application design & development. The company is ranked as one of the fastest-growing web design and development companies in India, with 3900+ successfully delivered projects across the United States, UK, UAE, Canada and other countries. A client retention rate of over 95% demonstrates their level of service and client satisfaction. Position: Senior Database Administrator (WFO) Experience: 5-8 Years relevant experience Education Qualification: Bachelor's or Master’s degree in Computer Science, Information Technology, or a related field. Job Location: Ahmedabad Shift: 11 AM – 8.30 PM CTC: 18 to 25 Lacs Key Responsibilities: Our client is seeking an experienced and motivated Senior Data Engineer to join their AI & Automation team. The ideal candidate will have 5–8 years of experience in data engineering, with a proven track record of designing and implementing scalable data solutions. A strong background in database technologies, data modeling, and data pipeline orchestration is essential. Additionally, hands-on experience with generative AI technologies and their applications in data workflows will set you apart. In this role, you will lead data engineering efforts to enhance automation, drive efficiency, and deliver data-driven insights across the organization. Job Description: • Design, build, and maintain scalable, high-performance data pipelines and ETL/ELT processes across diverse database platforms. • Architect and optimize data storage solutions to ensure reliability, security, and scalability.
• Leverage generative AI tools and models to enhance data engineering workflows, drive automation, and improve insight generation. • Collaborate with cross-functional teams (Data Scientists, Analysts, and Engineers) to understand and deliver on data requirements. • Develop and enforce data quality standards, governance policies, and monitoring systems to ensure data integrity. • Create and maintain comprehensive documentation for data systems, workflows, and models. • Implement data modeling best practices and optimize data retrieval processes for better performance. • Stay up-to-date with emerging technologies and bring innovative solutions to the team. Qualifications: • Bachelor's or Master’s degree in Computer Science, Information Technology, or a related field. • 5–8 years of experience in data engineering, designing and managing large-scale data systems. The mandatory skills are as follows: SQL; NoSQL (MongoDB, Cassandra, or Cosmos DB); one of the following: Snowflake, Redshift, BigQuery, or Microsoft Fabric; Azure. Strong expertise in database technologies, including: o SQL Databases: PostgreSQL, MySQL, SQL Server o NoSQL Databases: MongoDB, Cassandra o Data Warehouse / Unified Platforms: Snowflake, Redshift, BigQuery, Microsoft Fabric • Hands-on experience implementing and working with generative AI tools and models in production workflows. • Proficiency in Python and SQL, with experience in data processing frameworks (e.g., Pandas, PySpark). • Experience with ETL tools (e.g., Apache Airflow, MS Fabric, Informatica, Talend) and data pipeline orchestration platforms. • Strong understanding of data architecture, data modeling, and data governance principles. • Experience with cloud platforms (preferably Azure) and associated data services. Skills: • Advanced knowledge of Database Management Systems and ETL/ELT processes. • Expertise in data modeling, data quality, and data governance.
• Proficiency in Python programming, version control systems (Git), and data pipeline orchestration tools. • Familiarity with AI/ML technologies and their application in data engineering. • Strong problem-solving and analytical skills, with the ability to troubleshoot complex data issues. • Excellent communication skills, with the ability to explain technical concepts to non-technical stakeholders. • Ability to work independently, lead projects, and mentor junior team members. • Commitment to staying current with emerging technologies, trends, and best practices in the data engineering domain. If your profile is matching with the requirement & if you are interested for this job, please share your updated resume with details of your present salary, expected salary & notice period.

Posted 3 days ago

Apply

130.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Job Description

Manager, Data Visualization

Based in Hyderabad, join a global healthcare biopharma company and be part of a 130-year legacy of success backed by ethical integrity, forward momentum, and an inspiring mission to achieve new milestones in global healthcare. Our Technology centers focus on creating a space where teams can come together to deliver business solutions that save and improve lives.

An integral part of the company's IT operating model, Tech centers are globally distributed locations where each IT division has employees to enable our digital transformation journey and drive business outcomes. These locations, in addition to our other sites, are essential to supporting our business and strategy. A focused group of leaders in each tech center helps ensure we can manage and improve each location: from investing in the growth, success, and well-being of our people, to making sure colleagues from each IT division feel a sense of belonging, to managing critical emergencies. Together, we leverage the strength of our team to collaborate globally, optimize connections, and share best practices across the Tech Centers.

Role Overview

A unique opportunity to be part of an Insight & Analytics Data hub for a leading biopharmaceutical company and help define a culture that creates a compelling customer experience. Bring your entrepreneurial curiosity and learning spirit into a career of purpose, personal growth, and leadership. We are seeking those who have a passion for using data, analytics, and insights to drive decision-making that will allow us to tackle some of the world's greatest health threats.

As a Manager in Data Visualization, you will focus on designing and developing compelling data visualization solutions that enable actionable insights and facilitate intuitive information consumption for internal business stakeholders.
The ideal candidate will demonstrate competency in building user-centric visuals and dashboards that empower stakeholders with data-driven insights and decision-making capability. Our Quantitative Sciences team uses big data to analyze the safety and efficacy claims of our potential medical breakthroughs. We review the quality and reliability of clinical studies using deep scientific knowledge, statistical analysis, and high-quality data to support decision-making in clinical trials.

What Will You Do In This Role

Design and develop user-centric data visualization solutions utilizing complex data sources.
Identify and define key business metrics and KPIs in partnership with business stakeholders.
Define and develop scalable data models in alignment with, and with support from, data engineering and IT teams.
Lead UI/UX workshops to develop user stories and wireframes, and build intuitive visualizations.
Collaborate with data engineering, data science, and IT teams to deliver business-friendly dashboard and reporting solutions.
Apply best practices in data visualization design and continuously improve the user experience for business stakeholders.
Provide thought leadership and data visualization best practices to the broader Data & Analytics organization.
Identify opportunities to apply data visualization technologies to streamline and enhance manual/legacy reporting deliveries.
Provide training and coaching to internal stakeholders to enable a self-service operating model.
Co-create information governance and apply data privacy best practices to solutions.
Continuously innovate on visualization best practices and technologies by reviewing external resources and the marketplace.
What Should You Have

5 years' relevant experience in data visualization, infographics, and interactive visual storytelling.
Working experience and knowledge in Power BI, Qlik, Spotfire, Tableau, and other data visualization technologies.
Working experience and knowledge in ETL processes and data modeling techniques and platforms (Alteryx, Informatica, Dataiku, etc.).
Experience working with database technologies (Redshift, Oracle, Snowflake, etc.) and data processing languages (SQL, Python, R, etc.).
Experience in leveraging and managing third-party vendors and contractors.
Self-motivation, proactivity, and ability to work independently with minimum direction.
Excellent interpersonal and communication skills.
Excellent organizational skills, with the ability to navigate a complex matrix environment and organize/prioritize work efficiently and effectively.
Demonstrated ability to collaborate and lead with diverse groups of work colleagues and positively manage ambiguity.
Experience in the Pharma and/or Biotech industry is a plus.

Our technology teams operate as business partners, proposing ideas and innovative solutions that enable new organizational capabilities. We collaborate internationally to deliver services and solutions that help everyone be more productive and enable innovation.

Who We Are

We are known as Merck & Co., Inc., Rahway, New Jersey, USA in the United States and Canada, and MSD everywhere else. For more than a century, we have been inventing for life, bringing forward medicines and vaccines for many of the world's most challenging diseases. Today, our company continues to be at the forefront of research to deliver innovative health solutions and advance the prevention and treatment of diseases that threaten people and animals around the world.

What We Look For

Imagine getting up in the morning for a job as important as helping to save and improve lives around the world. Here, you have that opportunity.
You can put your empathy, creativity, digital mastery, or scientific genius to work in collaboration with a diverse group of colleagues who pursue and bring hope to countless people who are battling some of the most challenging diseases of our time. Our team is constantly evolving, so if you are among the intellectually curious, join us and start making your impact today.

#HYDIT2025

Current Employees apply HERE
Current Contingent Workers apply HERE

Search Firm Representatives, Please Read Carefully
Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs/resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails.

Employee Status: Regular
Relocation:
VISA Sponsorship:
Travel Requirements:
Flexible Work Arrangements: Hybrid
Shift:
Valid Driving License:
Hazardous Material(s):

Required Skills: Business Intelligence (BI), Data Management, Data Modeling, Data Visualization, Measurement Analysis, Stakeholder Relationship Management, Waterfall Model
Preferred Skills:
Job Posting End Date: 07/7/2025

A job posting is effective until 11:59:59 PM on the day before the listed job posting end date. Please ensure you apply to a job posting no later than the day before the job posting end date.

Requisition ID: R335923

Posted 3 days ago

Apply

1.0 - 4.0 years

6 - 10 Lacs

Hyderabad

Work from Office

Naukri logo

Soul AI is a pioneering company founded by IIT Bombay and IIM Ahmedabad alumni, with a strong founding team from IITs, NITs, and BITS. We specialize in delivering high-quality human-curated data, AI-first scaled operations services, and more.

We are looking for a highly skilled Software Developer/Engineer with at least 2 years of experience to design, develop, and deploy high-performance software solutions that support our growing business needs.

Key Responsibilities:
Write clean, maintainable, and efficient code for both frontend and backend systems.
Work closely with cross-functional teams to design and implement software solutions.
Troubleshoot, debug, and optimize software to improve performance and scalability.

Required Qualifications:
2+ years of experience in software development and engineering.
Proficiency in programming languages such as Python, Java, C++, or JavaScript.
Experience with web development frameworks, cloud platforms, and version control systems (Git).

Why Join Us:
Competitive pay (₹1200/hour).
Flexible hours.
Remote opportunity.

Note: Pay will vary by project and typically is up to Rs. 1200 per hour (if you work an average of 3 hours every day, that could be as high as Rs. …).

Shape the future of AI with Soul AI!

Posted 3 days ago

Apply
cta

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies