
2793 Informatica Jobs - Page 5

JobPe aggregates these listings for easy access; applications are submitted directly on the original job portal.

4.0 - 9.0 years

17 - 19 Lacs

Bengaluru

Work from Office


We are looking for a talented Data Engineer cum Database Developer with a strong background in the banking sector. The ideal candidate will have experience with SQL Server, AWS PostgreSQL, AWS Glue, and ETL tools, along with expertise in data ingestion frameworks and Control-M scheduling.

Key Responsibilities: Design, develop, and maintain scalable data pipelines to support data ingestion and transformation processes. Collaborate with cross-functional teams to gather requirements and implement solutions tailored to banking applications. Utilize SQL Server and AWS PostgreSQL for database development, optimization, and management. Implement data ingestion frameworks to ensure efficient and reliable data flow. Develop and maintain ETL processes using AWS Glue or other ETL tools, with Control-M for scheduling. Ensure data quality and integrity through validation and testing processes. Monitor and optimize system performance to support business analytics and reporting needs. Document data architecture, processes, and workflows for reference and compliance purposes. Stay updated on industry trends and best practices related to data engineering and management.

Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. 4+ years of experience in data engineering and database development, preferably in the banking sector. Proficiency in SQL Server and AWS PostgreSQL. Experience with Databricks, AWS Glue, or any other ETL tool (e.g., Informatica, ADF). Strong understanding of data ingestion frameworks and methodologies. Excellent problem-solving skills and attention to detail. Knowledge of securitization in the banking industry would be a plus. Strong communication skills for effective collaboration with stakeholders. Familiarity with cloud-based data architectures and services. Experience with data warehousing concepts and practices. Knowledge of data privacy and security regulations in banking.
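As a rough illustration of the kind of ingestion pipeline described above, the sketch below moves a table from SQL Server into a PostgreSQL staging schema with PySpark. It is a minimal example only: the host names, table names, and credentials are placeholders, the job assumes the SQL Server and PostgreSQL JDBC drivers are on the Spark classpath, and a production AWS Glue job would typically use Glue's own job and catalog APIs rather than plain Spark.

```python
from pyspark.sql import SparkSession

# Assumes the SQL Server and PostgreSQL JDBC driver jars are on the Spark classpath.
spark = SparkSession.builder.appName("loan_accounts_ingestion").getOrCreate()

source = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://<sqlserver-host>:1433;databaseName=core_banking")
    .option("dbtable", "dbo.loan_accounts")        # placeholder source table
    .option("user", "<user>")
    .option("password", "<password>")
    .load()
)

# Light-touch cleansing before staging: drop duplicate accounts and rows missing keys.
cleaned = (
    source.dropDuplicates(["account_id"])
          .na.drop(subset=["account_id", "open_date"])
)

(
    cleaned.write.format("jdbc")
    .option("url", "jdbc:postgresql://<postgres-host>:5432/analytics")
    .option("dbtable", "staging.loan_accounts")    # placeholder target table
    .option("user", "<user>")
    .option("password", "<password>")
    .mode("append")
    .save()
)
```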

Posted 1 day ago

Apply

3.0 - 10.0 years

7 - 8 Lacs

Bengaluru

Work from Office


Snowflake - Must Have: Relevant experience of 3-10 years in Snowflake development (not migration). SQL: basic and advanced SQL is a must (joins, null handling, performance tuning, windowing functions such as PARTITION BY and RANK). Working knowledge of Snowflake architecture, stored procedures, and ETL/ELT/ETLT pipelines. Basic Python scripting. Snowflake features: Time Travel, zero-copy cloning, Data Sharing. Good to Have: advanced Python, DBT.
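To make the windowing-function requirement above concrete, here is a small, hedged sketch that runs a PARTITION BY / RANK query through the Snowflake Python connector. The table, columns, and connection parameters are placeholders; features like Time Travel would be exercised with clauses such as AT(OFFSET => ...) on the FROM clause.

```python
import snowflake.connector  # pip install snowflake-connector-python

# Table, columns, and credentials below are placeholders for illustration only.
query = """
SELECT customer_id,
       order_date,
       amount,
       RANK() OVER (PARTITION BY customer_id ORDER BY amount DESC) AS amount_rank,
       SUM(amount) OVER (PARTITION BY customer_id)                 AS customer_total
FROM   sales.orders
-- keep only each customer's three largest orders
QUALIFY RANK() OVER (PARTITION BY customer_id ORDER BY amount DESC) <= 3
"""

conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>",
    warehouse="<warehouse>", database="SALES_DB", schema="PUBLIC",
)
try:
    for row in conn.cursor().execute(query):
        print(row)
finally:
    conn.close()
```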

Posted 1 day ago

Apply

3.0 - 5.0 years

4 - 5 Lacs

Bengaluru

Work from Office


Req ID: 328866. NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a SQL, PL/SQL, Informatica, Unix/Linux, EKS, AWS/Azure - Developer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

SQL, PL/SQL, Informatica, Unix/Linux, EKS, AWS/Azure - Developer (FMRJP00035038 - Systems Engineer 2, 3-5 years)

Technical/Process:
- SQL & PL/SQL - expertise in writing and debugging/troubleshooting SQL and PL/SQL code, stored procedures, functions and triggers
- Informatica or any similar ETL tool - expertise in debugging/troubleshooting session/workflow logs and understanding mapping flows
- Hands-on experience with Autosys/Control-M is a must
- Unix/Linux - Shell/Perl scripting and programming experience and knowledge of basic commands
- Experience in supporting/developing applications in Cloud & EKS; AWS/Azure certification is good to have
- Hands-on experience with Splunk/SiteScope/Datadog
- Ability to handle incident bridge calls and crisis situations to mitigate incident impact
- Experience in incident management and problem management
- Ability to understand the business criticality of various applications as they relate to complex business processes
- Familiarity with the ITIL framework and/or Agile project management
- Good analytical, reporting and problem-solving skills
- Apache Tomcat & Core Java - experience in supporting Java-based applications (good to have)
- Working knowledge of basic investment terms and practices is desirable

Minimum experience on key skills: 3-5 years

General Expectations:
1) Must have good communication skills
2) Must be ready to work a 10:30 AM to 8:30 PM shift
3) Flexible to work at the client locations GV, Manyata or EGL, Bangalore
4) Must be ready to work from the office in a hybrid work environment; fully remote work is not an option
5) Expect full return to office from Feb/Mar 2025

Pre-requisites before submitting profiles:
1) Must have genuine and digitally signed Form 16 for all employments
2) All employment history/details must be present in UAN/PPF statements
3) Candidates must be screened via video to ensure they are genuine and have a proper work setup
4) Candidates must have real work experience in the mandatory skills mentioned in the JD
5) Profiles must list the companies they are on the payroll of, not the client names, as their employers
6) As these are competitive positions and the client will not wait for 60 days and carry the risk of drop-outs, candidates must have a notice period of 0 to 3 weeks
7) Candidates must be screened for any gaps after education and during employment, and the genuineness of the reasons

Posted 1 day ago

Apply

5.0 - 7.0 years

8 - 12 Lacs

Bengaluru

Work from Office


Req ID: 322003. NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Sr. ETL Developer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Job Duties: Minimum of 5 to 7 years of ETL development experience. Knowledge of ETL concepts, tools, and data structures. Capability to analyse and troubleshoot complicated data sets. Outstanding verbal and written communication abilities. Outstanding interpersonal skills. Strong conceptual abilities. Exceptional analytical abilities.

Minimum Skills Required: SQL competence (query performance tuning, index management, etc.) and a knowledge of database structure are required. Understanding of data modelling concepts. Understanding of standard ETL/ELT processes and flow. Knowledge of at least one ETL tool (Informatica, SSIS, Talend etc.). Knowledge of different SQL/NoSQL data storage techniques and technologies. Passionate about sophisticated data structures and problem solutions. Ability to quickly learn new data tools and concepts (important).
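For the query performance tuning and index management requirement above, a quick way to reason about indexes is to compare query plans before and after adding one. The sketch below uses Python's built-in sqlite3 module purely as a self-contained illustration; the table and index names are made up, and the same plan-comparison habit carries over to SQL Server, Oracle, or PostgreSQL with their respective EXPLAIN tools.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)")
cur.executemany(
    "INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
    [(i % 1000, i * 1.5) for i in range(50_000)],
)

query = "SELECT SUM(amount) FROM orders WHERE customer_id = 42"

# Before indexing: the planner reports a full table scan.
print(cur.execute("EXPLAIN QUERY PLAN " + query).fetchall())

cur.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")

# After indexing: the same query is answered via an index search.
print(cur.execute("EXPLAIN QUERY PLAN " + query).fetchall())
conn.close()
```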

Posted 1 day ago

Apply

4.0 - 6.0 years

10 - 15 Lacs

Bengaluru

Work from Office


Job Title: o9 Demand Planning Implementation Senior Consultant
Location: Bangalore
Department: Supply Chain Consulting

About the Role: We are seeking a highly motivated and skilled o9 Solutions Implementation Consultant to join our dynamic team. In this role, you will play a critical part in delivering successful end-to-end implementations of o9 Solutions, focusing specifically on the Demand Planning module. You will work closely with cross-functional teams including client stakeholders, business analysts, and technical teams to ensure seamless deployment of the o9 platform.

Key Responsibilities: Lead the functional implementation of o9's Demand Planning solution for global and regional supply chain firms. Work closely with clients to gather business requirements and translate them into effective o9 platform configurations. Drive data integration by collaborating with clients' IT teams and o9 technical consultants (data ingestion, transformation, modeling). Develop and execute test scenarios to validate functional and system performance. Support user training and change management initiatives during deployment. Act as a trusted advisor, guiding clients through system adoption and continuous improvement post-implementation. Coordinate with o9 technical teams and client-side stakeholders to troubleshoot and resolve issues.

Required Qualifications: 4-6 years of relevant experience in Supply Chain Planning or IT consulting, with at least 1-2 full-cycle implementations of the o9 platform, specifically in Demand Planning. Strong understanding of demand forecasting concepts, statistical modeling, and S&OP processes. Hands-on experience in configuring o9 Demand Planning modules, including DP foundational blocks, setting up master data and associated hierarchies, IBPL, creating active rules and procedures, and setting up UIs and user access roles. Knowledge of SQL, Python, or integration tools (e.g., Informatica, SSIS) is a strong advantage. Strong analytical, problem-solving, and communication skills. Bachelor's or Master's degree in Engineering, Supply Chain, Operations, or related disciplines.

Preferred Qualifications: Experience with other advanced planning systems (e.g., Kinaxis, Blue Yonder, SAP IBP) is a plus. Familiarity with Agile project management methodologies. o9 certification(s) in Technical Level 1 & 2, DP Ref Model.

Posted 1 day ago

Apply

7.0 - 10.0 years

10 - 12 Lacs

Noida

Work from Office


Req ID: 319585. NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking an Informatica Support Sr. Developer to join our team in Noida, Uttar Pradesh (IN-UP), India (IN).

Informatica Support Sr. Developer - Job Responsibilities:
- 7-10 years of hands-on experience with Informatica PowerCenter/IDQ/MDM/IICS, SQL query editor tools, data modeling and managing databases
- Hands-on experience with data warehousing, table structures, fact and dimension tables, logical and physical database design, data modeling, reporting process metadata, and ETL processes
- Strong experience in design, development, implementation, administration, troubleshooting and support of ETL processes using Informatica
- Experience with shell scripting on Unix/Linux is a plus
- Exposure to AutoSys is a plus
- Strong oral and written communication skills
- Must be flexible to work on weekends and afternoon shifts

Position General Duties and Tasks:
- Support a large production environment of Informatica PowerCenter/IDQ/MDM
- Tune the performance of Informatica sessions for large data files by increasing block size, data cache size, sequence buffer length and target-based commit interval
- Resolve technical blockers and manage the day-to-day workload across different ETL jobs
- Help other team members with technical issues
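Much of the production-support work described above starts with scanning session and workflow logs for failures. The snippet below is a generic, hedged sketch of that triage step in plain Python: the log directory and message format are assumptions (PowerCenter log locations and layouts vary by installation), and it simply counts and prints lines flagged ERROR or FATAL.

```python
import re
from collections import Counter
from pathlib import Path

# The directory and message format are assumptions; adjust the path and pattern
# to match your PowerCenter installation's session/workflow log layout.
LOG_DIR = Path("/opt/informatica/logs/sessions")
PATTERN = re.compile(r"\b(ERROR|FATAL)\b", re.IGNORECASE)

error_counts = Counter()
for log_file in sorted(LOG_DIR.glob("*.log")):
    for line in log_file.read_text(errors="ignore").splitlines():
        if PATTERN.search(line):
            error_counts[log_file.name] += 1
            print(f"{log_file.name}: {line.strip()}")

print("Error lines per session log:", dict(error_counts))
```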

Posted 1 day ago

Apply

8.0 - 10.0 years

7 - 11 Lacs

Chennai

Work from Office


Req ID: 328613. NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Data Engineer / Developer to join our team in Chennai, Tamil Nadu (IN-TN), India (IN).

Data Engineer / Developer
Primary skillset (must have): Oracle PL/SQL, SnapLogic
Secondary skillset (good to have): Informatica, Aerospike
Tertiary skillset (nice to have): Python scripting
Minimum experience on key skills: 8-10 years

General Expectations:
1) Must have good communication skills
2) Must be ready to work a 10:30 AM to 8:30 PM shift
3) Flexible to work at the client location: Ramanujan IT Park, Taramani, Chennai
4) Must be ready to work from the office in a hybrid work environment; fully remote work is not an option
5) Expect full return to office in 2025

Pre-requisites before submitting profiles:
1) Must have genuine and digitally signed Form 16 for all employments
2) All employment history/details must be present in UAN/PPF statements
3) Candidates must be screened via video to ensure they are genuine and have a proper work setup
4) Candidates must have real work experience in the mandatory skills mentioned in the JD
5) Profiles must list the companies they are on the payroll of, not the client names, as their employers
6) As these are competitive positions and the client will not wait for 60 days and carry the risk of drop-outs, candidates must have a notice period of 0 to 3 weeks
7) Candidates must be screened for any gaps after education and during employment, and the genuineness of the reasons.

Posted 1 day ago

Apply

12.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Skill: Technical Business Analyst
Exp: 10+ Yrs
Location: Gurgaon

Key Responsibilities
Requirements Gathering & Analysis
– Engage with stakeholders (business users, risk, compliance, operations) to elicit detailed functional and non-functional requirements.
– Translate complex banking processes into clear user stories, process flows, and specification documents.
Solution Design & Validation
– Collaborate with data engineers, architects, and BI teams to define data models, ETL pipelines, and integration points.
– Validate technical designs against business needs and regulatory requirements (e.g., KYC, AML, Basel norms).
Development Support
– Leverage your hands-on development experience (SQL, Python/Java) to prototype data extracts, transformations, and reports.
– Assist the development team with code reviews, test data setup, and troubleshooting.
Testing & Quality Assurance
– Define acceptance criteria; design and execute system, integration, and user-acceptance test cases.
– Coordinate defect triage and ensure timely resolution.
Documentation & Training
– Maintain up-to-date functional specifications, data dictionaries, and user guides.
– Conduct workshops and training sessions for users and support teams.
Project & Stakeholder Management
– Track project deliverables, highlight risks and dependencies, and communicate progress.
– Act as the primary point of contact between business, IT, and external vendors.

Required Skills & Experience
Domain Expertise:
– 10–12 years’ experience in banking (retail, corporate, or investment) with focus on data-driven initiatives.
– Strong understanding of banking products (loans, deposits, payments) and regulatory landscape.
Technical Proficiency:
– Hands-on development background: advanced SQL; scripting in Python or Java.
– Experience designing and supporting ETL processes (Informatica, Talend, or equivalent).
– Familiarity with data warehousing concepts, dimensional modeling, and metadata management.
– Exposure to cloud data services (AWS Redshift, Azure Synapse, GCP BigQuery) is a plus.
Analytical & Process Skills:
– Solid experience in data profiling, data quality assessment, and root-cause analysis.
– Comfortable with Agile methodologies; adept at sprint planning and backlog management.
Communication & Collaboration:
– Excellent verbal and written skills; able to explain technical concepts to non-technical audiences.
– Proven track record of stakeholder management at all levels.
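Since the role calls for hands-on prototyping of data extracts and data profiling, the short sketch below shows one way such a profiling pass might look in Python with pandas. The file name, columns, and thresholds (loan_transactions.csv, txn_id, amount, txn_date, the 1%-style rules) are purely illustrative examples, not regulatory checks from the posting.

```python
import pandas as pd

# Hypothetical extract; the file name, columns, and thresholds are illustrative only.
df = pd.read_csv("loan_transactions.csv", parse_dates=["txn_date"])

# Column-level profile: types, null percentage, and cardinality.
profile = pd.DataFrame({
    "dtype": df.dtypes.astype(str),
    "null_pct": (df.isna().mean() * 100).round(2),
    "distinct_values": df.nunique(),
})
print(profile)

# Simple rule-based checks that could feed a data-quality findings log.
issues = {
    "duplicate_txn_ids": int(df["txn_id"].duplicated().sum()),
    "negative_amounts": int((df["amount"] < 0).sum()),
    "future_dated_txns": int((df["txn_date"] > pd.Timestamp.today()).sum()),
}
print(issues)
```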

Posted 1 day ago

Apply

3.0 years

0 Lacs

India

On-site


Extensive technical experience with complex SQL queries in SQL Server.
• 3 years of experience in IICS (Informatica Intelligent Cloud Services) and Snowflake, with strong SQL skills
• Experience working with Azure technologies such as Dataverse, ADF, ADLS, Power Platform, Synapse, etc.
• Should have strong experience in all phases of the project life cycle, from requirements gathering to implementation
• Should have expertise in extracting and loading data from source/target systems such as Snowflake, Oracle SQL, flat files, XML sources, etc.
• Experienced in using transformations like Expression, Router, Filter, Lookup, Hierarchy Builder, Hierarchy Parser, Business Service, Update Strategy, Union, Joiner and Aggregator
• Strong technical experience building data integration processes by constructing mappings, mapping tasks, task flows, schedules, and parameter files
• Experienced in developing data integration mappings using REST APIs and configuring Swagger files and REST connections
• Experienced with performance optimization, error handling, debugging, and monitoring
• Experienced in writing complex SQL queries and knowledge of SQL analytical functions
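For the REST API integration point above, here is a minimal, hedged sketch of pulling records from a REST endpoint and flattening them for staging. The endpoint, token, and payload shape are placeholders; in IICS itself this source would normally be configured through the REST V2 connector and a Swagger/OpenAPI definition rather than hand-written Python.

```python
import pandas as pd
import requests

# Placeholder endpoint, token, and payload shape for illustration only.
response = requests.get(
    "https://api.example.com/v1/customers",
    headers={"Authorization": "Bearer <token>"},
    params={"updated_since": "2024-01-01"},
    timeout=30,
)
response.raise_for_status()

# Flatten the nested JSON into a tabular frame before staging it for the warehouse.
records = response.json()["items"]
df = pd.json_normalize(records, sep="_")
df.to_csv("customers_stage.csv", index=False)
print(f"Staged {len(df)} customer records")
```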

Posted 1 day ago

Apply

7.0 years

0 Lacs

Andhra Pradesh, India

On-site


At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives.

Those in Salesforce system integration at PwC will focus on connecting Salesforce with other systems, applications, or databases to enable seamless data flow and process automation. You will be responsible for designing, developing, and implementing integration solutions using various integration technologies and tools, such as Salesforce APIs, middleware platforms, and web services.

Growing as a strategic advisor, you leverage your influence, expertise, and network to deliver quality results. You motivate and coach others, coming together to solve complex problems. As you increase in autonomy, you apply sound judgment, recognising when to take action and when to escalate. You are expected to solve through complexity, ask thoughtful questions, and clearly communicate how things fit together. Your ability to develop and sustain high performing, diverse, and inclusive teams, and your commitment to excellence, contributes to the success of our Firm.

Skills
Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to: Craft and convey clear, impactful and engaging messages that tell a holistic story. Apply systems thinking to identify underlying problems and/or opportunities. Validate outcomes with clients, share alternative perspectives, and act on client feedback. Direct the team through complexity, demonstrating composure through ambiguous, challenging and uncertain situations. Deepen and evolve your expertise with a focus on staying relevant. Initiate open and honest coaching conversations at all levels. Make difficult decisions and take action to resolve issues hindering team effectiveness. Model and reinforce professional and technical standards (e.g. refer to specific PwC tax and audit guidance), the Firm's code of conduct, and independence requirements.

Job Summary
A career in our Managed Services team will provide you an opportunity to collaborate with a wide array of teams to help our clients implement and operate new capabilities, achieve operational efficiencies, and harness the power of technology. Our Application Evolution Services team will provide you with the opportunity to help organizations harness the power of their enterprise applications by optimizing the technology while driving transformation and innovation to increase business performance. We assist our clients in capitalizing on technology improvements, implementing new capabilities, and achieving operational efficiencies by managing and maintaining their application ecosystems. We help our clients maximize the value of their Salesforce investment by managing the support and continuous transformation of their solutions in areas such as sales, service, marketing and customer relationship management.
Minimum Degree Required (BQ) *: Bachelor’s Degree Degree Preferred Bachelor's degree Required Field(s) Of Study (BQ) Preferred Field(s) of Study: Computer and Information Science, Management Information Systems Minimum Year(s) of Experience (BQ) *: US Certification(s) Preferred Minimum of 7 years of experience Salesforce.com certifications: Certified Administrator, Certified Developer, Certified Business Analyst, or Certified Sales/Service Consultant; Additional specialized Salesforce.com certifications such as Marketing Cloud, Experience Cloud, App Builder, AI Associate Preferred Knowledge/Skills *: Demonstrates intimate level of abilities and/or a proven record of success identifying and addressing client needs: Acts as Salesforce domain specialist and provides coaching, mentoring, guidance and feedback to develop skills of team members; Analyzes and customizes Salesforce seasonal release notes for engagements, presenting them to clients; Serve as an SME for resolving complex production support issues and enhancements, staying current on Salesforce’s product roadmap and proposing solutions to clients; Leads ticket procedure calls with clients in their area of expertise; Leads design, development, and deployment of enhancements; Designs and develops deliverables/processes to improve delivery quality and efficiency; Leads aspects of delivery on multiple engagements; Manages client relationships and expectations, confirming client satisfaction of services; Leads delivery resource recruitment efforts; Develops training and certification plans for delivery resources; Conceptualizes, designs, and develops deliverables/processes to improve delivery quality and efficiency; Facilitates team operations management of multiple engagements and clients; Possesses strong functional and/or technical skills in Salesforce to provide solution architecture, design trade-offs, and ability to decipher design choices; Has ability to provide functional and/or technical leadership setting industry leading practices including quality of design, implementation, maintenance, and support; Possesses extensive experience in Force.com platform using Apex, Lightning and LWC; Proven experience with software configuration, Mobile solutions, Apex coding, or Visualforce coding experience with Salesforce and/or Veeva; Understanding of enterprise applications to which Salesforce.com clouds (for example: Sales, Service, Marketing, Revenue, Slack, MuleSoft) is commonly integrated to enable an end-to-end ecosystem for enterprise customers (e.g., SAP, Oracle, Marketo and related cloud and/or on-premises ERP business applications); Extensive abilities and/or a proven record of success serving as a solution or technical architect on one or more Salesforce Managed services engagements; Leads continuous improvement of solutions; Identifies automations and designs solutions to improve service delivery or simplify application processes for end users; Oversees transitioning and leading application support operations; Understands the common issues facing PwC's clients of specific industry sectors; Manages teams to deliver contracted services including troubleshooting and resolving production issues, developing and testing enhancements, and assessing impact of solutions within applications; Provides consistent communication of status to clients, including managing client expectations in regard to scope and service levels; Experience in: Development methodologies including Agile; Application technology stack for Salesforce; DevOps 
processes and tools; and, ITIL process knowledge/understanding is highly preferred. Should have strong technical skills in Salesforce to provide solution architecture, design trade-offs, and ability to decipher design choices. Should have managed multi environments, multi regions complex implementation support projects and therefore able to define scalable and robust solutions. Extensive experience in Force.com platform using Apex, Lightning and LWC. Solid implementation support experience using Sales / Service / Marketing /Custom cloud. Should have strong experience in working with middleware that supports Salesforce platforms like Mulesoft, Boomi, Informatica, Tibco, and Fusion middleware. Demonstrated solutioning experience in handling one or more Industry domain. Deep expertise in one or more Salesforce domain products – CPQ, CLM, nCino, Vlocity, FSL etc. Ability to address security complexities, and design solutions aligning with Salesforce security models. Experience in working with a broad range of emerging Salesforce products – B2B Commerce, Tableau CRM, CG Cloud, MFG Cloud, Loyalty cloud and Slack. Demonstrating ability to develop value propositions, solution approaches, and business proposals to meet client goals. Good experience with proposal activities like RFI/RFP analysis, RAID analysis, resource and effort estimation for Salesforce projects. Demonstrating communication skills to lead client executive discussions focused on scope, approach, design and implementation support considerations. Extensive experience managing and delivering multiple projects using Agile Methodology. Able to run practice initiatives and enable capabilities within the Salesforce practice. Good experience in articulating Point of Values and defining Go-to market solution. Review releases from Salesforce.com on a regular basis to determine new features that are appropriate for end users. Define, develop, and follow best practices in Salesforce. Able to handle data management inclusive of data load, data translation, data hygiene, data migration and integration. Proven ability to look at technical processes from a strategic standpoint and understand the inter-relationships. Recommend to team members or customers the appropriate and optimal use/configuration of a custom build solution. Familiarity building custom solutions on: SAP, Oracle, MS-SQL Server, or other RDMS. Proven track record of writing, interpreting, and managing deliverables of a consulting engagement. Awareness of the changing Cloud ecosystem and adjust to new technologies, methods and apps. Strong communication skills, negotiation skills, and conflict resolution. Possess advanced Salesforce certifications and Certified as Scrum Master. Demonstrating and directing multi-competency teams to deliver complex, quote-to-cash transformation programs. Additional Information Experience Level: 12-15 years

Posted 1 day ago

Apply

5.0 years

2 - 9 Lacs

Mumbai Metropolitan Region

On-site


Senior MIS Analyst Industry: Digital Transformation & Analytics Consulting We empower enterprises to unlock data-driven efficiency by building robust Management Information Systems (MIS) that turn raw operational data into strategic insight. Join our on-site analytics hub in India and steer mission-critical reporting initiatives end-to-end. Role & Responsibilities Own the complete MIS lifecycle—data extraction, transformation, validation, visualization, and scheduled distribution. Design automated dashboards and reports in Excel/Power BI that track KPIs, SLAs, and cost metrics for cross-functional leadership. Write optimized SQL queries and ETL scripts to consolidate data from ERP, CRM, and cloud platforms into a single reporting warehouse. Establish strong data governance, ensuring integrity, version control, and auditability of all reports. Collaborate with finance, operations, and technology teams to gather requirements, translate into reporting specs, and deliver within committed timelines. Mentor junior analysts on advanced Excel functions, VBA macros, and visualization best practices. Skills & Qualifications Must-Have Bachelor’s degree in Information Systems, Computer Science, or equivalent. 5+ years professional experience in MIS or Business Intelligence. Expert-level Excel with pivots, Power Query, and VBA scripting. Proficiency in SQL and relational databases (MySQL/SQL Server/PostgreSQL). Hands-on building interactive dashboards in Power BI or Tableau. Demonstrated ability to translate raw data into executive-ready insights. Preferred Experience with ETL tools (Informatica, Talend) or Python pandas. Knowledge of cloud data stacks (Azure Synapse, AWS Redshift, or GCP BigQuery). Understanding of statistical methods for forecasting and trend analysis. Benefits & Culture Highlights High-ownership role with direct visibility to C-suite decision makers. Continuous learning budget for BI certifications and advanced analytics courses. Collaborative, innovation-first culture that rewards data-driven thinking. Apply now to transform complex datasets into strategic clarity and accelerate your analytics career. Skills: business intelligence,ims,automation,sql,fms,google sheets,data governance,vba,excel,etl,analytics,pms,data visualization,dashboard design,data analysis,advanced,looker,power bi,dashboards

Posted 1 day ago

Apply

3.0 - 6.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Role Description
Hiring Locations: Chennai, Trivandrum, Kochi
Experience Range: 3 to 6 years

The L1 Data Ops Analyst / Data Pipeline Developer is responsible for developing, testing, and maintaining robust data pipelines and monitoring operational dashboards to ensure smooth data flow. This role demands proficiency in data engineering tools, SQL, and cloud platforms, with the ability to work independently and in 24x7 shift environments. The candidate should be capable of analyzing data, troubleshooting issues using SOPs, and collaborating effectively across support levels.

Key Responsibilities
Development & Engineering: Design, code, test, and implement scalable and efficient data pipelines. Develop features in accordance with requirements and low-level design. Write optimized, clean code using Python, PySpark, SQL, and ETL tools. Conduct unit testing and validate data integrity. Maintain comprehensive documentation of work.
Monitoring & Support: Monitor dashboards, pipelines, and databases across assigned shifts. Identify, escalate, and resolve anomalies using defined SOPs. Collaborate with L2/L3 teams to ensure timely issue resolution. Analyze trends and anomalies using SQL and Excel.
Process Adherence & Contribution: Follow configuration and release management processes. Participate in estimation, knowledge sharing, and defect management. Adhere to SLA and compliance standards. Contribute to internal documentation and knowledge bases.

Mandatory Skills
Strong command of SQL for data querying and analysis. Proficiency in Python or PySpark for data manipulation. Experience in ETL tools (any of the following): Informatica, Talend, Apache Airflow, AWS Glue, Azure ADF, GCP DataProc/DataFlow. Experience working with cloud platforms (AWS, Azure, or GCP). Hands-on experience with data validation and performance tuning. Working knowledge of data schemas and data modeling.

Good to Have Skills
Certification in Azure, AWS, or GCP (foundational or associate level). Familiarity with monitoring tools and dashboard platforms. Understanding of data warehouse concepts. Exposure to BigQuery, ADLS, or similar services.

Soft Skills
Excellent written and verbal communication in English. Strong attention to detail and analytical skills. Ability to work in a 24x7 shift model, including night shifts. Ability to follow SOPs precisely and escalate issues appropriately. Self-motivated with minimal supervision. Team player with good interpersonal skills.

Outcomes Expected
Timely and error-free code delivery. Consistent adherence to engineering processes and release cycles. Documented and trackable issue handling with minimal escalations. Certification and training compliance. High availability and uptime of monitored pipelines and dashboards.

Skills: SQL, Data Analysis, MS Excel, Dashboards
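For the "validate data integrity" responsibility above, a minimal sketch of a batch validation step is shown below in Python/pandas. The column names, thresholds, and S3 path are placeholders (reading straight from S3 additionally assumes s3fs is installed); in practice such checks would be wired into the pipeline's scheduler and monitoring dashboards so failures surface to the on-shift analyst.

```python
import pandas as pd

def validate_batch(df: pd.DataFrame) -> list:
    """Return human-readable data-quality failures for one pipeline batch."""
    failures = []
    if df.empty:
        failures.append("batch is empty")
    if df["order_id"].duplicated().any():
        failures.append("duplicate order_id values found")
    null_rate = df["customer_id"].isna().mean()
    if null_rate > 0.01:
        failures.append(f"customer_id null rate {null_rate:.2%} exceeds 1% threshold")
    return failures

# Illustrative path and schema; adjust to the actual landing location.
batch = pd.read_parquet("s3://example-bucket/orders/dt=2024-06-01/")
problems = validate_batch(batch)
if problems:
    # Raising makes the failure visible to the scheduler / monitoring dashboards.
    raise ValueError("; ".join(problems))
```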

Posted 1 day ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Key Responsibilities: Develop, manage, design, and performance-tune SQL and PL/SQL code. Design, develop, and maintain robust Databricks pipelines to support data processing and analytics. Implement and optimize Spark engine concepts for efficient data processing. Collaborate with business stakeholders to understand and translate business requirements into technical solutions. Utilize Informatica for data integration and ETL processes. Write complex SQL queries to extract, manipulate, and analyze data. Perform data analysis to support business decision-making and identify trends and insights. Ensure data quality and integrity across various data sources and platforms. Communicate effectively with cross-functional teams to deliver data solutions that meet business needs. Stay updated with the latest industry trends and technologies in data engineering and analytics.

Qualifications: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. 5+ years of experience in data engineering or a related role. Strong expertise in Databricks and Spark engine concepts. Proficiency in Informatica for ETL processes. Advanced SQL skills for data extraction and analysis. Excellent analytical skills with the ability to interpret complex data sets. Strong communication skills to effectively collaborate with business stakeholders and technical teams. Experience with cloud platforms (e.g., AWS, Azure, GCP) is a plus. Knowledge of data warehousing concepts and tools is desirable.
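To make the "Databricks pipelines and Spark engine concepts" requirement more concrete, here is a small, generic PySpark sketch that stages a parquet dataset, runs a Spark SQL aggregation over a temp view, and writes a partitioned output. Paths, table names, and columns are illustrative; on Databricks the SparkSession already exists and Delta tables would usually replace raw parquet.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("orders_daily_summary").getOrCreate()

# Illustrative input path and columns; on Databricks this would typically be a Delta table.
orders = spark.read.parquet("/mnt/raw/orders")
orders.createOrReplaceTempView("orders")

daily = spark.sql("""
    SELECT order_date,
           region,
           COUNT(*)    AS order_count,
           SUM(amount) AS total_amount
    FROM   orders
    WHERE  status = 'COMPLETE'
    GROUP  BY order_date, region
""")

# Partitioning the output by date keeps downstream reads and pruning efficient.
(daily.write
      .mode("overwrite")
      .partitionBy("order_date")
      .parquet("/mnt/curated/orders_daily_summary"))
```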

Posted 1 day ago

Apply

3.0 years

0 Lacs

Gurugram, Haryana, India

On-site


About The Role Grade Level (for internal use): 09 S&P Global Mobility The Role: ETL Developer The Team The ETL team forms an integral part of Global Data Operations (GDO) and caters to the North America & EMEA automotive business line. Core responsibilities include translating business requirements into technical design and ETL jobs along with unit testing, integration testing, regression testing, deployments & production operations. The team has an energetic and dynamic group of individuals, always looking to work through a challenge. Ownership, raising the bar and innovation is what the team runs on! The Impact The ETL team, being part of GDO, caters to the automotive business line and helps stakeholders with an optimum solution for their data needs. The role requires close coordination with global teams such as other development teams, research analysts, quality assurance analysts, architects etc. The role is vital for the automotive business as it involves providing highly efficient data solutions with high accuracy to various stakeholders. The role forms a bridge between the business and technical stakeholders. What’s In It For You Constant learning, working in a dynamic and challenging environment! Total Rewards. Monetary, beneficial, and developmental rewards! Work Life Balance. You can't do a good job if your job is all you do! Diversity & Inclusion. HeForShe! Internal Mobility. Grow with us! Responsibilities Using prior experience with file loading, cleansing and standardization, should be able to translate business requirements into ETL design and efficient ETL solutions using Informatica Powercenter (mandatory) and Talend Enterprise (preferred). Knowledge of tibco would be a preferred skill as well. Understand relational database technologies and data warehousing concepts and processes. Using prior experiences with High Volume data processing, be able to deal with complex technical issues Works closely with all levels of management and employees across the Automotive business line. Participates as part of cross-functional teams responsible for investigating issues, proposing solutions and implementing corrective actions. Good communication skills required for interface with various stakeholder groups; detail oriented with analytical skills What We’re Looking For The ETL development team within the Mobility domain is looking for a Software Engineer to work on design, development & operations efforts in the ETL (Informatica) domain. Primary Skills And Qualifications Required Experience with Informatica and/or Talend ETL tools Bachelor’s degree in Computer Science, with at least 3+ years of development and maintenance of ETL systems on Informatica PowerCenter and 1+ year of SQL experience. 3+ years of Informatica Design and Architecture experience and 1+ years of Optimization and Performance tuning of ETL code on Informatica 1+ years of python development experience and SQL, XML experience Working knowledge or greater of Cloud Based Technologies, Development, Operations a plus. About S&P Global Mobility At S&P Global Mobility, we provide invaluable insights derived from unmatched automotive data, enabling our customers to anticipate change and make decisions with conviction. Our expertise helps them to optimize their businesses, reach the right consumers, and shape the future of mobility. We open the door to automotive innovation, revealing the buying patterns of today and helping customers plan for the emerging technologies of tomorrow. 
For more information, visit www.spglobal.com/mobility. What’s In It For You? Our Purpose Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our Benefits Include Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring And Opportunity At S&P Global At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. 
Recruitment Fraud Alert If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, “pre-employment training” or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here. Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning) Job ID: 316976 Posted On: 2025-06-25 Location: Gurgaon, Haryana, India

Posted 1 day ago

Apply

4.0 - 8.0 years

0 Lacs

Kolkata, West Bengal, India

On-site


Greetings! TCS is conducting an in-person interview drive in Kolkata on 28-Jun-25.
Job Role: Data Architect
Experience: 4-8 years
Location: Kolkata

JOB DESCRIPTION:
Languages – Java, Python, Scala
AWS – S3, EMR, Glue, Redshift, Athena, Lambda
Azure – Blob, ADLS, ADF, Synapse, Power BI
Google Cloud – BigQuery, DataProc, Looker
Snowflake
Databricks
CDH – Hive, Spark, HDFS, Kafka, etc.
ETL – Informatica, DBT, Matillion

Posted 1 day ago

Apply

7.0 years

5 - 9 Lacs

Hyderābād

On-site


Key responsibilities
• Assists IDQ leads & architects in the development of the enterprise MDM data model
• Develops efficient and tuned data pipelines as per the project data models
• Evaluates and recommends new and emerging MDM/DQ technologies, methodologies and standards
• Ensures consistency and compatibility with all other system technology components
• Assists in developing the MDM data model and architecture
• Utilizes a data modeling tool to develop "blueprints" for the conceptual data model of the MDM architecture
• Implements enterprise conceptual models to integrate master data from diverse data sources into a conformed, high-quality, and referentially sound enterprise master data model
• Implements master data modeling standards that include naming standards, data quality indicators, data categorization, ETL attribute usage, data consistency and reuse, and data quality
• Implements logical master data models using the data modeling tool for all layers in the MDM technical architecture framework or DQ dashboards
• Mitigates performance implications of various data modeling and architecture decisions at the logical and physical layer as related to ETL, ad hoc query access, and static reporting
• Reviews and validates master data models with stakeholders and subject matter experts

Qualifications and Requirements
Essential qualifications
• 7 years of experience in data quality development is required
• 5 years of experience in data integration or data management development
• 5 years of experience in data warehousing required
• Informatica code migration through Ansible playbooks or another significant CI/CD process
• Hands-on experience with the Informatica Data Quality tool is a must
• Working experience on an Informatica MDM project as an IDQ developer will be an added advantage
• Working understanding of Power BI or a similar in-memory analysis tool is preferred
• Experience with Agile development methodologies and concepts preferred
• Experience with the following disciplines is preferred: Data Warehousing, EAI, Metadata Management, Master Data Management (implementation and data stewardship), Data Quality Management, Relational Databases, Dimensional Databases, Semantics, Data Lifecycle Management, SQL, PL/SQL, UML, XML
• Common manufacturing operation procedures (like raw materials, finished goods, suppliers, etc.) and the processes and procedures utilized to carry out business systems
• Good written communication and meeting facilitation skills
• Excellent written communication and meeting facilitation skills required

About Regal Rexnord
Regal Rexnord is a publicly held global industrial manufacturer with 30,000 associates around the world who help create a better tomorrow by providing sustainable solutions that power, transmit and control motion. The Company's electric motors and air moving subsystems provide the power to create motion. A portfolio of highly engineered power transmission components and subsystems efficiently transmits motion to power industrial applications. The Company's automation offering, comprised of controls, actuators, drives, and precision motors, controls motion in applications ranging from factory automation to precision control in surgical tools. The Company's end markets benefit from meaningful secular demand tailwinds, and include factory automation, food & beverage, aerospace, medical, data center, warehouse, alternative energy, residential and commercial buildings, general industrial, construction, metals and mining, and agriculture.
Regal Rexnord is comprised of three operating segments: Industrial Powertrain Solutions, Power Efficiency Solutions, and Automation & Motion Control. Regal Rexnord has offices and manufacturing, sales and service facilities worldwide. For more information, including a copy of our Sustainability Report, visit RegalRexnord.com.

Posted 1 day ago

Apply

0 years

4 - 10 Lacs

Gurgaon

On-site


Req ID: 327967. NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a .Net Consultant SME to join our team in Bangalore, Karnātaka (IN-KA), India (IN).

- Cloud Computing (AWS) - Expert
- .NET experience - Strong/Expert
- SQL DB - Strong
- SSIS / IICS (Informatica) - Desirable
- Life Insurance domain - Desirable

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.

Posted 1 day ago

Apply

3.0 - 8.0 years

9 - 19 Lacs

Pune

Work from Office


Basic Qualifications:
• Bachelor's degree in computer science, engineering or a related field
• Data: 3+ years of experience with data analytics and warehousing in the Investment & Finance domain
• Areas of expertise: capital markets, investment banking, asset management, asset classes, trade life cycle, trade settlement, securities, CUSIPs, custodians, fixed income
• SQL: deep knowledge of SQL and query optimization
• ELT: good understanding of ELT methodologies and tools; Informatica experience is a must
• Troubleshooting: experience with troubleshooting and root cause analysis to determine and remediate potential issues
• Communication: excellent communication, problem solving, organizational and analytical skills
• Able to work independently and to provide leadership to small teams of developers
• 3+ years of coding and scripting (Python, Java, Scala) and design experience
• 3+ years of experience with the Spark framework
• Experience with Vertica or any columnar database
• Strong data integrity, analytical and multitasking skills

Preferred Qualifications:
• Master's degree in computer science, engineering or a related field
• Cloud: experience working in a cloud environment (e.g. AWS)
• Python: hands-on experience developing with Python
• Advanced data processing: experience using data processing technologies such as Apache Spark or Kafka
• Workflow: good knowledge of orchestration and scheduling tools (e.g. Apache Airflow)
• Reporting: experience with data reporting (e.g. MicroStrategy, Tableau, Looker) and data cataloging tools (e.g. Alation)

Contact: Kavya P (8341137995)
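Since orchestration with Apache Airflow is called out above, here is a minimal, hedged sketch of a nightly ETL DAG. The DAG id, task bodies, and schedule are placeholders, and the import paths assume Airflow 2.4+ (older 2.x releases use schedule_interval instead of schedule).

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Task bodies, DAG id, and schedule are placeholders for illustration.
def extract():
    print("pull trade and settlement files from upstream sources")

def transform():
    print("apply business rules and conform to the warehouse model")

def load():
    print("load curated data into the reporting schema")

with DAG(
    dag_id="trade_settlement_etl",
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",   # nightly at 02:00
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```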

Posted 1 day ago

Apply

0 years

4 - 10 Lacs

Noida

On-site


Req ID: 327967. NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a .Net Consultant SME to join our team in Bangalore, Karnātaka (IN-KA), India (IN).

- Cloud Computing (AWS) - Expert
- .NET experience - Strong/Expert
- SQL DB - Strong
- SSIS / IICS (Informatica) - Desirable
- Life Insurance domain - Desirable

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.

Posted 1 day ago

Apply

3.0 - 7.0 years

5 - 9 Lacs

Mumbai, Hyderabad, New Delhi

Work from Office


We are seeking a highly skilled and experienced OBIEE Consultant with over 5 years of expertise in OBIEE reporting and RPD development, including at least 2 years working on the BI 12c version. The role requires strong SQL skills to write and debug scripts effectively. The ideal candidate will have experience managing large-scale projects, with a solid understanding of project lifecycles and OBIEE security configurations. Proficiency in OBIEE reporting, Informatica, and DAC is essential. The consultant should be adept at accessing Informatica tools for log analysis and checking schedules in DAC. The position demands strong problem-solving skills and the ability to work collaboratively within a fast-paced IT environment. Immediate joiners are preferred for this remote opportunity. Location: Remote - Delhi / NCR, Bangalore / Bengaluru, Hyderabad / Secunderabad, Chennai, Pune, Kolkata, Ahmedabad, Mumbai

Posted 1 day ago

Apply

8.0 years

25 - 30 Lacs

India

Remote


Job Title: Informatica ETL Developer
Primary Skills: Informatica ETL, SQL, Data Migration, Redshift
Total Experience Required: 8+ years
Relevant Informatica Experience: minimum 5+ years
Location: [Remote]
Employment Type: [Contract]

Job Overview
We are seeking an experienced Informatica ETL Developer with strong expertise in ETL development, data migration, and SQL optimization, especially on Amazon Redshift. The ideal candidate will have a solid foundation in data warehousing principles and hands-on experience with Informatica PowerCenter and/or Informatica Cloud, along with proven experience migrating ETL processes from platforms like Talend or IBM DataStage to Informatica.

Key Responsibilities
Lead or support the migration of ETL processes from Talend or DataStage to Informatica. Design, develop, and maintain efficient ETL workflows using Informatica PowerCenter or Informatica Cloud. Write, optimize, and troubleshoot complex SQL queries, especially in Amazon Redshift. Work on data lakehouse architectures and ensure smooth integration with ETL processes. Understand and implement data warehousing concepts, including star/snowflake schema design, SCDs, data partitioning, etc. Ensure ETL performance, data integrity, scalability, and data quality in all stages of processing. Collaborate with business analysts, data engineers, and other developers to gather requirements and design end-to-end data solutions. Perform performance tuning, issue resolution, and support production ETL jobs as needed. Contribute to design and architecture discussions, documentation, and code reviews. Work with structured and unstructured data and transform it as per business logic and reporting needs.

Required Skills & Qualifications
Minimum 8+ years of experience in data engineering or ETL development. At least 5+ years of hands-on experience with Informatica PowerCenter and/or Informatica Cloud. Experience in ETL migration projects, specifically from Talend or DataStage to Informatica. Proficiency in Amazon Redshift and advanced SQL scripting, tuning, and debugging. Strong grasp of data warehousing principles, dimensional modeling, and ETL performance optimization. Experience working with data lakehouse architecture (e.g., S3, Glue, Athena, etc. with Redshift). Ability to handle large data volumes, complex transformations, and data reconciliation. Strong understanding of data integrity, security, and governance best practices. Effective communication skills and ability to work cross-functionally with both technical and non-technical stakeholders.

Nice To Have
Experience with CI/CD for data pipelines or version control tools like Git. Exposure to Agile/Scrum development methodologies. Familiarity with Informatica Intelligent Cloud Services (IICS). Experience with Python or shell scripting for automation.

Skills: ETL performance optimization, data integrity, Redshift, Informatica ETL, Amazon Redshift, SQL, Informatica, data migration, data lakehouse architecture, ETL development, data warehousing, data governance, architecture, ETL
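The slowly changing dimension (SCD) requirement above is often handled with an expire-then-insert pattern. Below is a hedged sketch of an SCD Type 2 load against Redshift driven from Python with psycopg2; the dimension and staging table names and tracked columns are invented for illustration, and newer Redshift releases could express the same logic with MERGE. In an Informatica mapping the equivalent logic would typically live in Lookup and Update Strategy transformations rather than hand-written SQL.

```python
import psycopg2  # Redshift speaks the PostgreSQL wire protocol

# Dimension/staging table names and tracked columns are invented for illustration.
EXPIRE_CHANGED_ROWS = """
UPDATE dim_customer d
SET    is_current = FALSE, valid_to = CURRENT_DATE
FROM   stg_customer s
WHERE  d.customer_id = s.customer_id
  AND  d.is_current
  AND  (d.email <> s.email OR d.segment <> s.segment);
"""

INSERT_NEW_VERSIONS = """
INSERT INTO dim_customer (customer_id, email, segment, valid_from, valid_to, is_current)
SELECT s.customer_id, s.email, s.segment, CURRENT_DATE, NULL, TRUE
FROM   stg_customer s
LEFT JOIN dim_customer d
       ON d.customer_id = s.customer_id AND d.is_current
WHERE  d.customer_id IS NULL;   -- brand-new customers plus the rows expired above
"""

with psycopg2.connect(host="<cluster-endpoint>", port=5439, dbname="analytics",
                      user="<user>", password="<password>") as conn:
    with conn.cursor() as cur:
        cur.execute(EXPIRE_CHANGED_ROWS)
        cur.execute(INSERT_NEW_VERSIONS)
# Both statements commit together when the connection context exits.
```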

Posted 1 day ago

Apply

8.0 - 10.0 years

0 Lacs

Gurugram, Haryana, India

On-site


As Associate Manager, Data Engineering, you will: Lead the team of Data Engineers and develop innovative approaches to performance optimization & automation. Analyze enterprise specifics to understand the current-state data schema and data model, and contribute to defining the future-state data schema, data normalization and schema integration as required by the project. Apply coding expertise, best practices and guidance in Python, SQL, Informatica and cloud data platform development to members of the team. Collaborate with clients to harden, scale, and parameterize code to be scalable across brands and regions. Understand business objectives and develop business intelligence applications that help to monitor & improve critical business metrics. Monitor project timelines, ensuring deliverables are being met by team members. Communicate frequently to stakeholders on project requirements, statuses and risks. Manage the monitoring of productionized processes to ensure pipelines are executed successfully every day, communicating delays as required to stakeholders. Contribute to the design of scalable data integration frameworks to move and transform a variety of large data sets. Develop robust work products by following best practices through all stages of development, testing & deployment.

Skills and Qualifications
B.Tech / master's degree in a quantitative field (statistics, business analytics, computer science). Team management experience is a must. 8-10 years of experience (with at least 2-4 years of experience in managing a team). Vast background in all things data related. Intermediate level of proficiency with Python and data-related libraries (PySpark, Pandas, etc.). High level of proficiency with SQL (Snowflake a big plus). Snowflake is REQUIRED; we need someone with a high level of Snowflake experience, and certification is a big plus. AWS data platform development experience. High level of proficiency with data warehousing and data modeling. Experience with ETL tools (Informatica, Talend, DataStage) required. Informatica is our tool and is required; IICS or PowerCenter is accepted. Ability to coach team members, setting them up for success in their roles. Capable of connecting with team members and inspiring them to be their best.

The Yum! Brands story is simple. We have the four distinctive, relevant and easy global brands -- KFC, Pizza Hut, Taco Bell and The Habit Burger Grill -- born from the hopes and dreams, ambitions and grit of passionate entrepreneurs. And we want more of this to create our future! As the world's largest restaurant company we have a clear and compelling mission: to build the world's most loved, trusted and fastest-growing restaurant brands. The key and not-so-secret ingredient in our recipe for growth is our unrivaled talent and culture, which fuels our results. We're looking for talented, motivated, visionary and team-oriented leaders to join us as we elevate and personalize the customer experience across our 48,000 restaurants, operating in 145 countries and territories around the world! We put pizza, chicken and tacos in the hands of customers through customized ordering, unique delivery approaches, app experiences, and click-and-collect services and consumer data analytics, creating unique customer dining experiences, and we are only getting started. Employees may work for a single brand and potentially grow to support all company-owned brands depending on their role.
Regardless of where they work, as a company opening an average of 8 restaurants a day worldwide, the growth opportunities are endless. Taco Bell has been named of the 10 Most Innovative Companies in the World by Fast Company; Pizza Hut delivers more pizzas than any other pizza company in the world and KFC’s still use its 75-year-old finger lickin’ good recipe including secret herbs and spices to hand-bread its chicken every day. Yum! and its brands have offices in Chicago, IL, Louisville KY, Irvine, CA, Plano, TX and other markets around the world. We don’t just say we are a great place to work – our commitments to the world and our employees show it. Yum! has been named to the Dow Jones Sustainability North America Index and ranked among the top 100 Best Corporate Citizens by Corporate Responsibility Magazine in addition to being named to the Bloomberg Gender-Equality Index. Our employees work in an environment where the value of “believe in all people” is lived every day, enjoying benefits including but not limited to: 4 weeks’ vacation PLUS holidays, sick leave and 2 paid days to volunteer at the cause of their choice and a dollar-for-dollar matching gift program; generous parental leave; competitive benefits including medical, dental, vision and life insurance as well as a 6% 401k match – all encompassed in Yum!’s world-famous recognition culture.

Posted 1 day ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must Have Skills: Informatica Data Quality
Good To Have Skills: NA
Minimum 3 Year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with team members to enhance data workflows and contribute to the overall efficiency of data management practices.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the design and implementation of data architecture to support data initiatives.
- Monitor and optimize data pipelines for performance and reliability.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Informatica Data Quality.
- Strong understanding of data integration techniques and ETL processes.
- Experience with data profiling and data cleansing methodologies.
- Familiarity with database management systems and SQL.
- Knowledge of data governance and data quality best practices.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Informatica Data Quality.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.
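As a purely illustrative sketch of the data profiling work mentioned above (the table and column names are hypothetical, not part of the role description), a first-pass profile of a candidate key and an email field could be expressed in plain SQL before handing the rules over to a tool such as IDQ:

-- Profile completeness, uniqueness, and duplication in a raw source table.
SELECT COUNT(*)                               AS total_rows,
       COUNT(*) - COUNT(email)                AS null_emails,
       COUNT(DISTINCT email)                  AS distinct_emails,
       COUNT(*) - COUNT(DISTINCT customer_id) AS duplicate_customer_ids
FROM customer_raw;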

Posted 1 day ago

Apply

4.0 - 9.0 years

0 - 2 Lacs

Hyderabad, Bengaluru

Hybrid


Position: Contract to Hire (C2H)
Skill: Informatica IDQ & IDE + Informatica PowerCenter + DBMS (SQL or PL/SQL) + three end-to-end IDQ implementations
Experience: 5+ years
Location: Hyderabad / Bangalore
Notice Period: Immediate to 15 days

Job Description
- Experience in Informatica Data Quality (IDQ) and Informatica Data Explorer (IDE).
- Experience in admin activities associated with configuring IDQ and IDE.
- Experience in at least one address validation and cleansing tool (Address Doctor, Trillium, etc.).
- Experience in at least one ETL tool (Informatica PowerCenter, DataStage, SAP BODS, etc.).
- Experience in DBMS concepts, SQL, PL/SQL, and Java (desired).
- Experience in integrating ETL tools with Informatica Data Quality.
- Experience in integrating IDQ with downstream and upstream applications through a batch/real-time interface.
- Experience working on at least three end-to-end implementations of IDQ solutions.
- Experience in fine-tuning the match/merge process and troubleshooting performance issues in IDQ.
- Good knowledge of data quality concepts, data quality trends, and other tools in the market.
- Good knowledge of data profiling and data monitoring concepts.

Posted 1 day ago

Apply

6.0 - 11.0 years

10 - 20 Lacs

Hyderabad, Pune, Bengaluru

Hybrid


Job Title: IICS Developer
Location: Pune, Gurgaon, Bangalore, Chennai, Hyderabad

We are seeking an experienced IICS Developer with a minimum of 3 years of experience in Cloud Data Integration (CDI) and Cloud Application Integration (CAI) development. The candidate must be skilled in ETL development, data pipeline creation, data transformation, data migration, SQL, and unit testing. Exposure to Informatica MDM, MuleSoft, Core Java, IDMP platform development, the life sciences industry, and GxP compliance is a plus. This role focuses on building robust, scalable solutions for enterprise data processing and integration workflows while ensuring compliance and quality.

Key Responsibilities:
ETL Development & Data Transformation: Design, implement, and optimize ETL workflows and data transformation processes using IICS CDI.
Data Pipeline & Migration: Create and manage data pipelines for seamless integration and migrate data across heterogeneous systems.
Cloud Application Integration (CAI): Build CAI workflows for real-time and batch application integration processes.
SQL Proficiency: Write, debug, and optimize SQL for data manipulation and reporting.
Unit Testing: Perform comprehensive unit testing on workflows to ensure reliability and quality across environments.
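For illustration, the unit-testing responsibility above often comes down to reconciliation checks between source and target after a pipeline run; a minimal SQL sketch (with hypothetical table names) could compare row counts and a key aggregate like this:

-- Compare row counts and a key aggregate between source and target tables.
SELECT 'row_count' AS check_name,
       (SELECT COUNT(*) FROM src_orders) AS source_value,
       (SELECT COUNT(*) FROM tgt_orders) AS target_value
UNION ALL
SELECT 'total_amount' AS check_name,
       (SELECT SUM(order_amount) FROM src_orders) AS source_value,
       (SELECT SUM(order_amount) FROM tgt_orders) AS target_value;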

Posted 1 day ago

Apply


Featured Companies