
143 IICS Jobs

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

12.0 - 17.0 years

12 - 17 Lacs

Pune

Work from Office

Role Overview: The Technical Architect specializes in traditional ETL tools such as Informatica Intelligent Cloud Services (IICS) and similar technologies. The jobholder designs, implements, and oversees robust ETL solutions to support the organization's data integration and transformation needs.

Responsibilities:
- Design and develop scalable ETL architectures using IICS and other traditional ETL platforms.
- Collaborate with stakeholders to gather requirements and translate them into technical solutions.
- Ensure data quality, integrity, and security throughout the ETL processes.
- Optimize ETL workflows for performance and reliability.
- Provide technical leadership and mentorship to development teams.
- Troubleshoot and resolve complex technical issues related to ETL processes.
- Document architectural designs and decisions for future reference.
- Stay updated with emerging trends and technologies in ETL and data integration.

Key Technical Skills & Responsibilities:
- 12+ years of experience in data integration and ETL development, with at least 3 years in an Informatica architecture role.
- Extensive expertise in Informatica PowerCenter, IICS, and related tools (Data Quality, EDC, MDM).
- Proven track record of designing ETL solutions for enterprise-scale data environments.
- Advanced proficiency in Informatica PowerCenter and IICS for ETL/ELT design and optimization.
- Strong knowledge of SQL, Python, or Java for custom transformations and scripting.
- Experience with data warehousing platforms (Snowflake, Redshift, Azure Synapse) and data lakes.
- Familiarity with cloud platforms (AWS, Azure, GCP) and their integration services.
- Expertise in data modeling, schema design, and integration patterns.
- Knowledge of CI/CD, Git, and infrastructure-as-code (e.g., Terraform).
- Experience working on proposals, customer workshops, assessments, etc. is preferred.
- Good communication and presentation skills are a must.

Primary Skills:
- Informatica, IICS
- Data Lineage and Metadata Management
- Data Modeling
- Data Governance
- Data integration architectures
- Informatica Data Quality

Eligibility Criteria:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience in ETL architecture and development using tools like IICS.
- Strong understanding of data integration, transformation, and warehousing concepts.
- Proficiency in SQL and scripting languages.
- Experience with cloud-based ETL solutions is a plus.
- Familiarity with Agile development methodologies.
- Excellent problem-solving and analytical skills.
- Strong communication and leadership abilities.
- Knowledge of data governance and compliance standards.
- Ability to work in a fast-paced environment and manage multiple priorities.
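The posting above asks for SQL, Python, or Java for custom transformations and for ensuring data quality throughout ETL processes. As a hedged illustration only (not part of the job description, and not tied to any Informatica API; field names and rules are invented), a minimal Python sketch of a row-level transformation behind a simple quality gate might look like:

```python
from datetime import datetime

def transform_row(row):
    """Normalize one source record: trim strings, parse the date,
    and derive a full_name field (illustrative rules only)."""
    return {
        "customer_id": int(row["customer_id"]),
        "full_name": f'{row["first_name"].strip()} {row["last_name"].strip()}',
        "signup_date": datetime.strptime(row["signup_date"], "%Y-%m-%d").date(),
    }

def quality_gate(rows, required=("customer_id", "first_name", "last_name", "signup_date")):
    """Split records into valid/rejected before loading, so bad data
    never reaches the target (a common ETL data-quality pattern)."""
    valid, rejected = [], []
    for row in rows:
        if all(row.get(col) not in (None, "") for col in required):
            valid.append(transform_row(row))
        else:
            rejected.append(row)
    return valid, rejected

source = [
    {"customer_id": "1", "first_name": " Asha ", "last_name": "Rao", "signup_date": "2024-01-15"},
    {"customer_id": "2", "first_name": "", "last_name": "Mehta", "signup_date": "2024-02-01"},
]
valid, rejected = quality_gate(source)
```

In a real mapping the transformation and the required-column list would mirror the actual source and target schemas; the reject stream would typically be persisted for audit.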

Posted 7 hours ago

Apply

5.0 - 9.0 years

0 Lacs

Maharashtra

On-site

You should have strong experience in AWS, Informatica PowerCenter, IICS, Unix, and Unix or Python scripting. Strong SQL experience is required, and you should be an expert in Snowflake. A minimum of 5 years of experience in a Data Engineering role is necessary. Good communication skills are also essential for this position.

Posted 14 hours ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

Bengaluru

Work from Office

Gyansys is looking for a resource with Informatica CDGC skills for an opportunity with one of our direct customers. Role & responsibilities: 5+ years of experience with Informatica and good hands-on experience in CDGC.

Posted 3 days ago

Apply

3.0 - 8.0 years

30 - 45 Lacs

Gurugram

Work from Office

Requirement: Data Architect & Business Intelligence
Experience: 5-12 Years
Work Type: Full-Time

Job Summary: We are looking for a Data Architect & Business Intelligence Expert who will be responsible for designing and implementing enterprise-level data architecture solutions. The ideal candidate will have extensive experience in data warehousing, data modeling, and BI frameworks, with a strong focus on Salesforce, Informatica, DBT, IICS, and Snowflake.

Key Responsibilities:
- Design and implement scalable and efficient data architecture solutions for enterprise applications.
- Develop and maintain robust data models that support business intelligence and analytics.
- Build data warehouses to support structured and unstructured data storage needs.
- Optimize data pipelines, ETL processes, and real-time data processing.
- Work with business stakeholders to define data strategies that support analytics and reporting.
- Ensure seamless integration of Salesforce, Informatica, DBT, IICS, and Snowflake into the data ecosystem.
- Establish and enforce data governance, security policies, and best practices.
- Conduct performance tuning and optimization for large-scale databases and data processing systems.
- Provide technical leadership and mentorship to development teams.

Key Skills & Requirements:
- Strong experience in data architecture, data warehousing, and data modeling.
- Hands-on expertise with Salesforce, Informatica, DBT, IICS, and Snowflake.
- Deep understanding of ETL pipelines, real-time data streaming, and cloud-based data solutions.
- Experience in designing scalable, high-performance, and secure data environments.
- Ability to work with big data frameworks and BI tools for reporting and visualization.
- Strong analytical, problem-solving, and communication skills.

Posted 4 days ago

Apply

3.0 - 8.0 years

30 - 45 Lacs

Noida

Work from Office

Requirement: Data Architect & Business Intelligence
Experience: 5-12 Years
Work Type: Full-Time

Job Summary: We are looking for a Data Architect & Business Intelligence Expert who will be responsible for designing and implementing enterprise-level data architecture solutions. The ideal candidate will have extensive experience in data warehousing, data modeling, and BI frameworks, with a strong focus on Salesforce, Informatica, DBT, IICS, and Snowflake.

Key Responsibilities:
- Design and implement scalable and efficient data architecture solutions for enterprise applications.
- Develop and maintain robust data models that support business intelligence and analytics.
- Build data warehouses to support structured and unstructured data storage needs.
- Optimize data pipelines, ETL processes, and real-time data processing.
- Work with business stakeholders to define data strategies that support analytics and reporting.
- Ensure seamless integration of Salesforce, Informatica, DBT, IICS, and Snowflake into the data ecosystem.
- Establish and enforce data governance, security policies, and best practices.
- Conduct performance tuning and optimization for large-scale databases and data processing systems.
- Provide technical leadership and mentorship to development teams.

Key Skills & Requirements:
- Strong experience in data architecture, data warehousing, and data modeling.
- Hands-on expertise with Salesforce, Informatica, DBT, IICS, and Snowflake.
- Deep understanding of ETL pipelines, real-time data streaming, and cloud-based data solutions.
- Experience in designing scalable, high-performance, and secure data environments.
- Ability to work with big data frameworks and BI tools for reporting and visualization.
- Strong analytical, problem-solving, and communication skills.

Posted 4 days ago

Apply

3.0 - 8.0 years

30 - 45 Lacs

Pune

Work from Office

Requirement: Data Architect & Business Intelligence
Experience: 5-12 Years
Work Type: Full-Time

Job Summary: We are looking for a Data Architect & Business Intelligence Expert who will be responsible for designing and implementing enterprise-level data architecture solutions. The ideal candidate will have extensive experience in data warehousing, data modeling, and BI frameworks, with a strong focus on Salesforce, Informatica, DBT, IICS, and Snowflake.

Key Responsibilities:
- Design and implement scalable and efficient data architecture solutions for enterprise applications.
- Develop and maintain robust data models that support business intelligence and analytics.
- Build data warehouses to support structured and unstructured data storage needs.
- Optimize data pipelines, ETL processes, and real-time data processing.
- Work with business stakeholders to define data strategies that support analytics and reporting.
- Ensure seamless integration of Salesforce, Informatica, DBT, IICS, and Snowflake into the data ecosystem.
- Establish and enforce data governance, security policies, and best practices.
- Conduct performance tuning and optimization for large-scale databases and data processing systems.
- Provide technical leadership and mentorship to development teams.

Key Skills & Requirements:
- Strong experience in data architecture, data warehousing, and data modeling.
- Hands-on expertise with Salesforce, Informatica, DBT, IICS, and Snowflake.
- Deep understanding of ETL pipelines, real-time data streaming, and cloud-based data solutions.
- Experience in designing scalable, high-performance, and secure data environments.
- Ability to work with big data frameworks and BI tools for reporting and visualization.
- Strong analytical, problem-solving, and communication skills.

Posted 4 days ago

Apply

3.0 - 8.0 years

30 - 45 Lacs

Bengaluru

Work from Office

Requirement: Data Architect & Business Intelligence
Experience: 5-12 Years
Work Type: Full-Time

Job Summary: We are looking for a Data Architect & Business Intelligence Expert who will be responsible for designing and implementing enterprise-level data architecture solutions. The ideal candidate will have extensive experience in data warehousing, data modeling, and BI frameworks, with a strong focus on Salesforce, Informatica, DBT, IICS, and Snowflake.

Key Responsibilities:
- Design and implement scalable and efficient data architecture solutions for enterprise applications.
- Develop and maintain robust data models that support business intelligence and analytics.
- Build data warehouses to support structured and unstructured data storage needs.
- Optimize data pipelines, ETL processes, and real-time data processing.
- Work with business stakeholders to define data strategies that support analytics and reporting.
- Ensure seamless integration of Salesforce, Informatica, DBT, IICS, and Snowflake into the data ecosystem.
- Establish and enforce data governance, security policies, and best practices.
- Conduct performance tuning and optimization for large-scale databases and data processing systems.
- Provide technical leadership and mentorship to development teams.

Key Skills & Requirements:
- Strong experience in data architecture, data warehousing, and data modeling.
- Hands-on expertise with Salesforce, Informatica, DBT, IICS, and Snowflake.
- Deep understanding of ETL pipelines, real-time data streaming, and cloud-based data solutions.
- Experience in designing scalable, high-performance, and secure data environments.
- Ability to work with big data frameworks and BI tools for reporting and visualization.
- Strong analytical, problem-solving, and communication skills.

Posted 4 days ago

Apply

8.0 - 13.0 years

25 - 40 Lacs

Hyderabad

Hybrid

Job Title: Tech Lead - GCP Data Engineer
Location: Hyderabad, India
Experience: 5+ Years
Job Type: Full-Time
Industry: IT / Software Services
Functional Area: Data Engineering / Cloud / Analytics
Role Category: Cloud Data Engineering

Position Overview: We are seeking a GCP Data Engineer with strong expertise in SQL, Python, and Google Cloud Platform (GCP) services, including BigQuery, Cloud Composer, and Airflow. The ideal candidate will play a key role in building scalable, high-performance data solutions to support marketing analytics initiatives. This role involves collaboration with cross-functional global teams and provides an opportunity to work on cutting-edge technologies in a dynamic marketing data landscape.

Key Responsibilities:
- Lead technical teams and coordinate with global stakeholders.
- Manage and estimate data development tasks and delivery timelines.
- Build and optimize data pipelines using GCP, especially BigQuery, Cloud Storage, and Cloud Composer.
- Work with Airflow DAGs, REST APIs, and data orchestration workflows.
- Collaborate on development and debugging of ETL pipelines, including IICS and Ascend IO (preferred).
- Perform complex data analysis across multiple sources to support business goals.
- Implement CI/CD pipelines and manage version control using Git.
- Troubleshoot and upgrade existing data systems and ETL chains.
- Contribute to data quality, performance optimization, and cloud-native solution design.

Required Skills & Qualifications:
- Bachelor's or Master's in Computer Science, IT, or a related field.
- 5+ years of experience in Data Engineering or relevant roles.
- Strong expertise in GCP, BigQuery, Cloud Composer, and Airflow.
- Proficient in SQL, Python, and REST API development.
- Hands-on experience with IICS, MySQL, and data warehousing solutions.
- Knowledge of ETL tools like Ascend IO is a plus.
- Exposure to marketing analytics tools (e.g., Google Analytics, Blueconic, Klaviyo) is desirable.
- Familiarity with performance marketing concepts (segmentation, A/B testing, attribution modeling, etc.).
- Excellent communication and analytical skills.
- GCP certification is a strong plus.
- Experience working in Agile environments.

To apply, send your resume to: krishnanjali.m@technogenindia.com
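The role above centers on Airflow DAGs and orchestration workflows. The core idea of a DAG run, executing each task only after its upstream dependencies finish, can be sketched tool-free in plain Python (no Airflow dependency; the task names below are invented) using a topological sort:

```python
from collections import deque

def run_order(dag):
    """Return a valid execution order for a DAG given as
    {task: [upstream_tasks]}; raises on cycles (Kahn's algorithm)."""
    indegree = {t: len(ups) for t, ups in dag.items()}
    downstream = {t: [] for t in dag}
    for task, ups in dag.items():
        for up in ups:
            downstream[up].append(task)
    ready = deque(sorted(t for t, d in indegree.items() if d == 0))
    order = []
    while ready:
        task = ready.popleft()
        order.append(task)
        for nxt in downstream[task]:
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                ready.append(nxt)
    if len(order) != len(dag):
        raise ValueError("cycle detected in DAG")
    return order

# Hypothetical marketing-analytics pipeline: extract, load, model, publish.
dag = {
    "extract_pos": [],
    "load_bigquery": ["extract_pos"],
    "build_marts": ["load_bigquery"],
    "refresh_dashboard": ["build_marts"],
}
order = run_order(dag)
```

Airflow performs this scheduling itself (plus retries, sensors, and parallelism); the sketch only shows the dependency-resolution idea behind a DAG.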

Posted 4 days ago

Apply

7.0 - 9.0 years

10 - 13 Lacs

Bengaluru

Work from Office

Educational Requirements: Bachelor of Engineering
Service Line: Data & Analytics Unit

Responsibilities:
- Knowledge of more than one technology
- Basics of architecture and design fundamentals
- Knowledge of testing tools
- Knowledge of agile methodologies
- Understanding of project life cycle activities on development and maintenance projects
- Understanding of one or more estimation methodologies; knowledge of quality processes
- Basics of the business domain to understand the business requirements
- Analytical abilities, strong technical skills, and good communication skills
- Good understanding of the technology and domain
- Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles, and modelling methods
- Awareness of the latest technologies and trends
- Excellent problem-solving, analytical, and debugging skills

Technical and Professional Requirements: Technology - Data on Cloud - Datastore - Cloud-based Integration Platforms - Informatica Intelligent Cloud Services (IICS)

Preferred Skills: Technology - Data on Cloud - Datastore - Cloud-based Integration Platforms - Informatica Intelligent Cloud Services (IICS)

Posted 5 days ago

Apply

6.0 - 11.0 years

7 - 17 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Informatica IICS Developer

We are looking for a highly experienced Senior Informatica IICS Developer to lead the design, development, and optimization of cloud-based ETL solutions. This role demands deep expertise in Informatica Intelligent Cloud Services (IICS), strong data integration skills, and the ability to collaborate across business and technical teams to deliver scalable data pipelines.

Key Responsibilities:
- Design and develop complex ETL workflows, mappings, and transformations using Informatica IICS and PowerCenter.
- Implement data integration solutions across cloud and on-premise systems, including Oracle, SQL Server, Snowflake, and AWS Redshift.
- Collaborate with business analysts and stakeholders to gather requirements and translate them into technical specifications.
- Optimize ETL processes for performance, scalability, and reliability.
- Lead code reviews, enforce best practices, and mentor junior developers.
- Participate in deployment planning, documentation, and production support.
- Ensure data quality, governance, and security compliance across all data flows.

Required Qualifications: https://forms.office.com/r/CHAgRiU4E6

Posted 5 days ago

Apply

15.0 - 24.0 years

20 - 35 Lacs

Hyderabad

Remote

Senior Technology Manager - Data

Position Overview: We are seeking an experienced Senior Technology Manager - Data to lead our data warehousing initiatives and manage complex ETL projects. This role combines hands-on technical expertise with strategic leadership to manage high-performing teams delivering mission-critical data solutions.

Key Responsibilities:

Team Leadership & Management:
- Lead and mentor a team of data engineers, ETL developers, and data QAs
- Collaborate with cross-functional teams, including client teams and third-party partners
- Drive agile development practices and ensure project delivery timelines are met
- Foster a culture of innovation, continuous learning, and technical excellence
- Provide guidance and career development for team members

Data Warehousing & Architecture:
- Drive modernization initiatives for legacy data warehouse systems
- Oversee data integration strategies across multiple source systems
- Ensure data quality, consistency, and reliability across all data platforms

ETL Development & Management:
- Manage complex ETL project implementations using Informatica and other enterprise tools
- Establish ETL development standards, coding practices, and deployment procedures
- Evaluate and implement new ETL technologies and methodologies

Strategic Planning & Operations:
- Develop technology roadmaps aligned with business objectives and data strategy
- Manage project budgets, resource allocation, and vendor relationships
- Collaborate with stakeholders to gather requirements and work with the technical team to translate business needs into technical solutions
- Ensure compliance with data privacy regulations and security standards
- Present technical strategies and project status to senior leadership

Required Qualifications:

Technical Expertise:
- 15+ years of experience in data warehousing, ETL development, and data management
- 5+ years of hands-on experience with Informatica PowerCenter/IICS or another ETL tool
- Knowledge of cloud data platforms (AWS, Azure, GCP) and modern data stack technologies

Leadership & Management:
- 8+ years of experience managing large technical teams
- Proven track record of successfully delivering large-scale data warehousing projects
- Experience with project management methodologies (Agile, Scrum, Waterfall)
- Strong vendor management and budget oversight experience

Preferred Qualifications:
- Experience with real-time data processing and streaming technologies
- Knowledge of data governance tools and frameworks
- Background in healthcare or retail industries
- Experience with DevOps practices for data pipelines (CI/CD, version control)
- Familiarity with data visualization tools
- Understanding of machine learning and advanced analytics concepts

Key Competencies:
- Strategic Thinking: translate business requirements into scalable technical solutions
- Communication: excellent written and verbal communication skills for technical and non-technical audiences
- Problem Solving: strong analytical and troubleshooting capabilities
- Adaptability: comfortable working in fast-paced, evolving technology environments
- Collaboration: proven ability to work effectively across departments and with external partners

Posted 5 days ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

You will be working at Everest DX, a Digital Platform Services company based in Stamford. The company focuses on enabling digital transformation for enterprises through services such as Design, Build, Develop, Integrate, and Manage cloud solutions. They specialize in modernizing data centers, building cloud-native applications, and migrating existing applications into secure, multi-cloud environments. Their Digital Platform Services aim to reduce IT resource requirements, improve productivity, lower costs, and speed up digital transformation efforts.

Responsibilities:
- Design, develop, and optimize ETL workflows using Informatica PowerCenter.
- Implement cloud-based ETL solutions using Informatica IDMC and IICS.
- Lead data migration projects from on-premise to cloud environments.
- Write complex SQL queries and perform data validation and transformation.
- Troubleshoot and optimize ETL processes.
- Collaborate with cross-functional teams to gather requirements and design solutions.
- Create and maintain documentation for ETL processes and system configurations.
- Implement industry best practices for data integration and performance tuning.

Required Skills:
- Hands-on experience with Informatica PowerCenter, IDMC, and IICS.
- Strong expertise in writing complex SQL queries.
- Experience in data migration projects from on-premise to cloud environments.
- Strong data analysis skills and a solid understanding of ETL design and development concepts.
- Familiarity with cloud platforms like AWS and Azure.
- Experience with version control tools like Git and deployment processes.

Preferred Skills:
- Experience with data lakes, data warehousing, or big data platforms.
- Familiarity with Agile methodologies.
- Knowledge of other ETL tools.

If you are interested in joining Everest DX and contributing to their digital transformation initiatives, visit http://www.everestdx.com to learn more about the company and its services.
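The role above leans on complex SQL and data validation during on-premise-to-cloud migrations. One common first-pass validation is reconciling row counts and a key-column checksum between source and target tables. A generic sketch of that check (simulated here with SQLite; the table and column names are hypothetical):

```python
import sqlite3

def reconcile(conn, source_table, target_table, key_col):
    """Compare row counts and the sum of a key column between two
    tables: a cheap first-pass migration validation."""
    cur = conn.cursor()
    checks = {}
    for label, table in (("source", source_table), ("target", target_table)):
        cur.execute(f"SELECT COUNT(*), COALESCE(SUM({key_col}), 0) FROM {table}")
        checks[label] = cur.fetchone()
    return checks["source"] == checks["target"], checks

# Simulate a migrated table pair in memory.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 20.0);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 20.0);
""")
ok, detail = reconcile(conn, "src_orders", "tgt_orders", "order_id")
```

Count-plus-checksum catches dropped or duplicated rows cheaply; a full migration sign-off would add column-level hash comparisons or sampled row diffs.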

Posted 5 days ago

Apply

9.0 - 14.0 years

20 - 32 Lacs

Pune, Bengaluru, Delhi / NCR

Hybrid

Skills: Informatica Intelligent Cloud Services (IICS), Cloud Application Integration (CAI); IICS integration with CAI (mandatory)
Experience: 8-16 yrs
Location: All LTIM office locations

Job Description:
- Experience with IICS Application Integration components such as Processes, Service Connectors, and Process Objects.
- Ability to integrate diverse cloud applications seamlessly and efficiently, and to build high-volume, mission-critical, cloud-native applications.
- Strong understanding across cloud and infrastructure components (server, storage, network, data, and applications) to deliver end-to-end cloud infrastructure architectures and designs.
- Build service connectors for real-time integration with third-party applications.
- Experience integrating Informatica Cloud with other applications such as SAP, Workday, and ServiceNow.
- Experience installing add-on connectors and drivers for IICS.
- Expertise in methodologies for data extraction, transformation, and loading using transformations such as Expression, Router, Filter, Lookup, Update Strategy, Union, and Aggregator.
- Strong technical experience building data integration processes by constructing mappings, tasks, taskflows, schedules, and parameter files.

Posted 6 days ago

Apply

4.0 - 8.0 years

8 - 14 Lacs

Noida, Pune, Bengaluru

Work from Office

Job Description:
- 3+ years of experience in Informatica B2B (DX, DT)
- 3+ years of experience in Informatica PowerCenter and IICS
- 3+ years of experience with databases (MS SQL Server) and experience with application monitoring tools
- Experience in Informatica Cloud Data Governance Catalog is preferred
- Analytical skills: ability to diagnose and solve complex technical problems
- Communication: strong verbal and written communication skills; able to explain technical concepts to non-technical users
- Customer service: ability to provide excellent customer service under pressure and manage competing priorities
- Knowledge of ITIL processes (incident, problem, and change management)

Technical Support:
- Provide L1, L2, and L3 support for software applications.
- Troubleshoot and resolve application-related issues for end-users.
- Collaborate with developers, IT teams, and external vendors to address issues and implement fixes.
- Escalate unresolved issues to higher-level support or specialized teams.

Monitoring and Maintenance:
- Monitor scheduled jobs and ensure their successful completion.
- Perform routine maintenance tasks, including system updates, backups, and configuration changes.
- Assist with system upgrades, patches, and migrations to ensure continuity of service.

Incident Management:
- Log, track, and manage incidents and service requests via ticketing systems.
- Follow established procedures for incident escalation and resolution.
- Participate in root cause analysis and problem management efforts.

Documentation and Reporting:
- Maintain and update application documentation, including configuration settings and user guides.
- Create and present reports on system performance, issues, and resolution times.
- Document changes, fixes, and enhancements in a knowledge base for future reference.

Collaboration:
- Work with cross-functional teams (IT, development, business users) to gather requirements and improve applications.
- Participate in system testing and quality assurance activities.
- Assist in the development and implementation of new application modules or features.

Posted 1 week ago

Apply

3.0 - 5.0 years

3 - 5 Lacs

Hyderabad, Telangana, India

On-site

NTT DATA strives to hire exceptional, innovative, and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Lead Engineer to join our team in Hyderabad, Telangana (IN-TG), India (IN).

Job Description: Support Engineer (Data Practice) - Snowflake, Airflow, IICS

Role: Support Engineer

Job Summary: We are looking for a Support Engineer in Data Practice with expertise in Snowflake, Apache Airflow, and Informatica IICS (Intelligent Cloud Services). The ideal candidate will be responsible for monitoring, troubleshooting, and optimizing data pipelines, ETL workflows, and cloud-based data warehouses to ensure seamless data operations.

Key Responsibilities:

L1/L2 Support & Monitoring:
- Monitor Snowflake workloads, Airflow DAGs, and IICS ETL jobs, ensuring smooth execution.
- Troubleshoot data pipeline failures, performance bottlenecks, and job scheduling issues.
- Provide L1/L2-level support by analyzing logs and errors and resolving real-time data ingestion issues.
- Perform incident management and root cause analysis (RCA) for recurring issues.

Data Pipeline & ETL Support:
- Assist in maintaining and optimizing Snowflake data warehouse performance.
- Support ETL workflows in Informatica IICS, ensuring data integrity and transformation accuracy.
- Handle Apache Airflow job scheduling, orchestration, and task dependencies.
- Work on data validation, troubleshooting missing or incorrect records, and implementing fixes.

Security, Maintenance & Documentation:
- Ensure data security, access control, and role-based permissions in Snowflake.
- Assist in managing data backup, retention policies, and performance tuning.
- Maintain detailed runbooks, support documentation, and troubleshooting guides for issue resolution.
- Collaborate with data engineers and developers to enhance system reliability.

Required Skills & Qualifications:
- 3+ years of experience in Data Engineering Support or ETL Support.
- Strong understanding of Snowflake architecture, SQL queries, and performance tuning.
- Hands-on experience in Apache Airflow DAG monitoring and troubleshooting.
- Knowledge of Informatica IICS (Cloud Data Integration, taskflows, and error handling).
- Basic understanding of Python or SQL scripting for debugging data issues.
- Experience in data warehousing concepts, ETL workflows, and cloud platforms (AWS/Azure/GCP).
- Strong analytical and problem-solving skills for resolving data-related issues.

Preferred Qualifications:
- Exposure to CI/CD pipelines for data deployment.
- Knowledge of cloud storage (S3, ADLS, or GCS) and API-based data integrations.
- Familiarity with Git for version control and JIRA for ticket tracking.
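Much of the L1/L2 work described above amounts to scanning job logs for failures and grouping recurring errors to drive root cause analysis. A small, generic Python sketch of that triage step (the log format and job names are invented for illustration; real Airflow or IICS logs differ):

```python
from collections import Counter
import re

# One log line: timestamp, level, job name, message (illustrative format).
LOG_LINE = re.compile(r"^(?P<ts>\S+) (?P<level>\w+) (?P<job>\S+): (?P<msg>.*)$")

def triage(log_text):
    """Count ERROR messages per (job, message) pair so recurring
    failures surface first: a starting point for RCA."""
    failures = Counter()
    for line in log_text.splitlines():
        m = LOG_LINE.match(line)
        if m and m.group("level") == "ERROR":
            failures[(m.group("job"), m.group("msg"))] += 1
    return failures.most_common()

sample = """\
2024-05-01T02:00:01 INFO load_sales: started
2024-05-01T02:03:11 ERROR load_sales: snowflake timeout
2024-05-02T02:03:09 ERROR load_sales: snowflake timeout
2024-05-02T04:10:00 ERROR sync_crm: missing source file
"""
report = triage(sample)
```

Sorting by recurrence separates one-off glitches from systemic issues (here the repeated timeout would be the RCA candidate); a production version would also bucket near-identical messages.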

Posted 1 week ago

Apply

4.0 - 7.0 years

7 - 17 Lacs

Bengaluru, Delhi / NCR, Mumbai (All Areas)

Work from Office

Key Responsibilities:
- Design, develop, and maintain data transformation pipelines using dbt/IICS on Snowflake.
- Write optimized SQL and Python scripts for complex data modeling and processing tasks.
- Collaborate with data analysts, engineers, and business teams to implement scalable ELT workflows.
- Create and manage data models, schemas, and documentation in dbt.
- Optimize Snowflake performance using best practices (clustering, caching, virtual warehouses).
- Manage data integration from data lakes, external systems, and cloud sources.
- Ensure data quality, lineage, version control, and compliance across all environments.
- Participate in code reviews, testing, and deployment activities using CI/CD pipelines.

Required Skills:
- 5-8 years of experience in Data Engineering or Data Platform Development.
- Hands-on experience with Snowflake: data warehousing, architecture, and performance tuning.
- Proficient in dbt (Data Build Tool): model creation, Jinja templates, macros, testing, and documentation.
- Hands-on experience creating mappings and workflows in IICS, with extensive experience in performance tuning and troubleshooting.
- Strong Python scripting for data transformation and automation.
- Advanced skills in SQL: writing, debugging, and tuning queries.
- Experience with Data Lake and Data Warehouse concepts and implementations.
- Familiarity with Git-based workflows and version control in dbt projects.

Preferred Skills (Good to Have):
- Experience with Airflow, Dagster, or other orchestration tools.
- Knowledge of cloud platforms like AWS, Azure, or GCP.
- Exposure to BI tools like Power BI, Tableau, or Looker.
- Understanding of Data Governance, Security, and Compliance.
- Experience leading a development team.
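The dbt incremental-model pattern this role relies on boils down to: on each run, load only source rows newer than the target's current high-water mark. A tool-free Python sketch of that watermark logic (dbt generates comparable SQL via its `is_incremental()` filter; the field names here are hypothetical):

```python
def incremental_merge(target, source_rows, ts_field="updated_at"):
    """Append only rows newer than the target's high-water mark,
    mimicking the filter inside a dbt incremental model."""
    # High-water mark: the latest timestamp already loaded ("" if empty).
    watermark = max((r[ts_field] for r in target), default="")
    new_rows = [r for r in source_rows if r[ts_field] > watermark]
    target.extend(new_rows)
    return len(new_rows)

target = [{"id": 1, "updated_at": "2024-03-01"}]
source = [
    {"id": 1, "updated_at": "2024-03-01"},  # already loaded -> skipped
    {"id": 2, "updated_at": "2024-03-05"},  # newer -> appended
]
loaded = incremental_merge(target, source)
```

The same watermark idea underlies IICS incremental mappings; the production version would handle late-arriving updates with a merge on the key rather than a pure append.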

Posted 1 week ago

Apply

1.0 - 5.0 years

0 Lacs

Hyderabad, Telangana

On-site

The ideal candidate should have at least 4 years of experience as an ETL/Informatica developer, including a minimum of 1 year working with Snowflake and 1 year with IICS. Hands-on experience developing specifications, test scripts, and code coverage for all integrations is essential, as is supporting the migration of integration code from lower to higher environments, such as production. In this role, you will be responsible for full and incremental ETL using Informatica PowerCenter. Expertise in developing ETL/Informatica for data warehouse integration from various data sources will be valuable, along with experience supporting integration configurations with iPaaS through connected apps or web services. The ability to work within an Agile framework is a must, and the successful candidate should be willing to be on call for selected off-shift hours. If you meet the requirements and are interested in this onsite position in Hyderabad, please share your resume with bhavana@ketsoftware.com or contact 91828 22519.

Posted 1 week ago

Apply

7.0 - 12.0 years

0 - 1 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Role & responsibilities:
• 6+ years of experience in Informatica B2B (DX, DT).
• 6+ years of experience in Informatica PowerCenter and IICS.
• 6+ years of experience with databases (MS SQL Server); experience with application monitoring tools.
• Experience in Informatica Cloud Data Governance Catalog is preferred.
• Analytical skills: ability to diagnose and solve complex technical problems.
• Communication: strong verbal and written communication skills; able to explain technical concepts to non-technical users.
• Customer service: ability to provide excellent customer service under pressure and manage competing priorities.
• Knowledge of ITIL processes (incident, problem, and change management).

Technical Support:
• Provide L1, L2, and L3 support for software applications.
• Troubleshoot and resolve application-related issues for end users.
• Collaborate with developers, IT teams, and external vendors to address issues and implement fixes.
• Escalate unresolved issues to higher-level support or specialized teams.

Monitoring and Maintenance:
• Monitor scheduled jobs and ensure their successful completion.
• Perform routine maintenance tasks, including system updates, backups, and configuration changes.
• Assist with system upgrades, patches, and migrations to ensure continuity of service.

Incident Management:
• Log, track, and manage incidents and service requests via ticketing systems.
• Follow established procedures for incident escalation and resolution.
• Participate in root cause analysis and problem management efforts.

Documentation and Reporting:
• Maintain and update application documentation, including configuration settings and user guides.
• Create and present reports on system performance, issues, and resolution times.
• Document changes, fixes, and enhancements in a knowledge base for future reference.

Collaboration:
• Work with cross-functional teams (IT, development, business users) to gather requirements and improve applications.
• Participate in system testing and quality assurance activities.
• Assist in the development and implementation of new application modules or features.
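The job-monitoring and incident-escalation duties described above can be sketched as a small triage check. This is a minimal, hypothetical Python example: the job names, the SLA threshold, and the `jobs` structure are illustrative, not taken from any Informatica or scheduler API (a real setup would query the monitoring service instead).

```python
# Hypothetical job-run records; in practice these would come from the
# scheduler's or Informatica's monitoring API.
jobs = [
    {"name": "dx_partner_feed", "status": "SUCCEEDED", "runtime_min": 12},
    {"name": "pc_nightly_load", "status": "FAILED",    "runtime_min": 3},
    {"name": "iics_sync_crm",   "status": "SUCCEEDED", "runtime_min": 95},
]

RUNTIME_SLA_MIN = 60  # illustrative SLA threshold, not a real contract value

def triage(job_runs):
    """Classify runs into incidents (failures) and warnings (SLA breaches)."""
    incidents, warnings = [], []
    for job in job_runs:
        if job["status"] != "SUCCEEDED":
            incidents.append(job["name"])    # escalate: raise a ticket
        elif job["runtime_min"] > RUNTIME_SLA_MIN:
            warnings.append(job["name"])     # monitor: candidate for tuning
    return incidents, warnings

incidents, warnings = triage(jobs)
print(incidents)  # -> ['pc_nightly_load']
print(warnings)   # -> ['iics_sync_crm']
```

Separating hard failures from SLA breaches mirrors the ITIL split between incident management (restore service) and problem management (investigate recurring slowness).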

Posted 1 week ago

Apply

3.0 - 8.0 years

15 - 22 Lacs

Gurugram

Remote

The details of the position are:

Position Details:
Job Title: Data Engineer
Client: Yum! Brands
Job ID: 1666-1
Location: Remote
Project Duration: 06 months (Contract)

Job Description:
We are seeking a skilled Data Engineer who is knowledgeable about and loves working with modern data integration frameworks, big data, and cloud technologies. Candidates must also be proficient with data programming languages (e.g., Python and SQL). The Yum! data engineer will build a variety of data pipelines and models to support advanced AI/ML analytics projects, with the intent of elevating the customer experience and driving revenue and profit growth in our restaurants globally. The candidate will work in our office in Gurgaon, India.

Key Responsibilities
As a data engineer, you will:
• Partner with KFC, Pizza Hut, Taco Bell & Habit Burger to build data pipelines that enable best-in-class restaurant technology solutions.
• Play a key role in our Data Operations team, developing data solutions responsible for driving Yum! growth.
• Design and develop data pipelines (streaming and batch) to move data from point-of-sale, back-of-house, operational platforms, and more to our Global Data Hub.
• Contribute to standardizing and developing a framework to extend these pipelines across brands and markets.
• Develop on the Yum! data platform by building applications using a mix of open-source frameworks (PySpark, Kubernetes, Airflow, etc.) and best-of-breed SaaS tools (Informatica Cloud, Snowflake, Domo, etc.).
• Implement and manage production support processes around data lifecycle, data quality, coding utilities, storage, reporting, and other data integration points.

Skills and Qualifications:
• Broad background in all things data-related.
• AWS platform development experience (EKS, S3, API Gateway, Lambda, etc.).
• Experience with modern ETL tools such as Informatica, Matillion, or dbt; Informatica CDI is a plus.
• High level of proficiency with SQL (Snowflake a big plus).
• Proficiency with Python for transforming data and automating tasks.
• Experience with Kafka, Pulsar, or other streaming technologies.
• Experience orchestrating complex task flows across a variety of technologies.
• Bachelor's degree from an accredited institution, or relevant experience.
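A batch pipeline step of the kind described (extract point-of-sale records, transform, load into a warehouse table) can be sketched with the standard library. This is a sketch only: sqlite3 stands in for a warehouse such as Snowflake, and the table, column, and store names are all illustrative.

```python
import sqlite3

# In-memory SQLite stands in for the warehouse target (e.g., Snowflake).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_sales (store TEXT, amount REAL)")

# Extract: records as they might arrive from a point-of-sale feed.
rows = [("KFC-001", 12.5), ("KFC-001", 7.0), ("PH-042", 20.0)]
conn.executemany("INSERT INTO raw_sales VALUES (?, ?)", rows)

# Transform + load: aggregate into a reporting table, ELT-style, letting
# the database engine do the set-based work.
conn.execute("""
    CREATE TABLE daily_store_sales AS
    SELECT store, SUM(amount) AS total
    FROM raw_sales
    GROUP BY store
""")

totals = dict(conn.execute("SELECT store, total FROM daily_store_sales"))
print(totals)  # -> {'KFC-001': 19.5, 'PH-042': 20.0}
```

In a real pipeline the same pattern appears at larger scale: staged raw data, then a set-based transform pushed down to the warehouse rather than looped in Python.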

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 11 Lacs

Telangana

Work from Office

Key Responsibilities:
• ETL Development: Design and implement ETL processes using Informatica PowerCenter, Cloud Data Integration, or other Informatica tools.
• Data Integration: Integrate data from various sources, ensuring data accuracy, consistency, and high availability.
• Performance Optimization: Optimize ETL processes for performance and efficiency, ensuring minimal downtime and maximum throughput.
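One common optimization behind the "performance and throughput" responsibility is replacing row-by-row operations with bulk, set-based ones. A minimal stdlib sketch (sqlite3 and the table name are illustrative stand-ins; real ETL engines expose analogous array-insert or bulk-load APIs):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, qty INTEGER)")
rows = [(i, i % 5) for i in range(1000)]

# Bulk load: one executemany call instead of 1000 single-row INSERTs,
# cutting per-statement round-trip overhead.
conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)

# Set-based transform: let the engine aggregate instead of looping in Python.
total_qty = conn.execute("SELECT SUM(qty) FROM orders").fetchone()[0]
print(total_qty)  # -> 2000
```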

Posted 1 week ago

Apply

5.0 - 7.0 years

25 - 40 Lacs

Gurugram

Work from Office

Our world is transforming, and PTC is leading the way. Our software brings the physical and digital worlds together, enabling companies to improve operations, create better products, and empower people in all aspects of their business. Our people make all the difference in our success. Today, we are a global team of nearly 7,000, and our main objective is to create opportunities for our team members to explore, learn, and grow, all while seeing their ideas come to life and celebrating the differences that make us who we are and the work we do possible.

About PTC: PTC (NASDAQ: PTC) enables global manufacturers to achieve significant digital transformation through our market-leading software solutions. We empower customers to innovate faster, improve operations, and drive business growth, whether on-premises, in the cloud, or through our SaaS platform. At PTC, we don't just imagine a better world; we enable it.

Role Overview: As a Senior Technical Support Specialist, you will serve as a key technical advisor and escalation point within the Servigistics Support organization. You will bring your rich industry experience to drive strategic customer success, mentor junior team members, and lead complex troubleshooting efforts. You will work cross-functionally with engineering, product management, and customer teams to ensure seamless and proactive technical support delivery.

Key Responsibilities:
• Serve as the primary technical contact for high-priority and complex customer escalations.
• Lead resolution of mission-critical issues involving product functionality, performance, and deployment.
• Partner with global cross-functional teams to ensure holistic and timely resolution of customer challenges.
• Proactively identify and drive improvements in support processes and product usability.
• Contribute to and review KCS-aligned knowledge articles and promote customer self-service strategies.
• Collaborate with product and engineering teams to influence the product roadmap based on customer feedback and insights.
• Mentor and guide junior technical support engineers; provide coaching and best practices.
• Represent support in customer meetings, escalations, and business reviews.
• Maintain high SLA compliance for enterprise customers with complex environments.
• Available to work 24x7 on a rotational basis, with willingness to support weekend shifts when scheduled, ensuring readiness for global support needs.

Required Skills & Competencies:
• Strong experience in diagnosing and resolving enterprise-grade application issues across multiple layers (web, application, and database).
• Deep expertise in SQL (Oracle and SQL Server), with the ability to write and optimize complex queries.
• Hands-on experience with ETL tools (Informatica, IICS, Kettle/Pentaho) and resolving batch job failures.
• Solid understanding of open-source web technologies such as Apache Tomcat and Apache Web Server.
• Experience in performance tuning, server configuration, log analysis, and application scalability.
• Knowledge of Java-based enterprise applications and the implementation or support lifecycle.
• Familiarity with enterprise IT environments (networks, load balancing, security protocols, integrations).
• Proven ability to work independently under pressure while managing multiple complex issues.

Preferred Qualifications:
• Experience with UNIX/Linux environments and command-line utilities.
• Knowledge of cloud platforms such as AWS, including services such as S3.
• Exposure to machine learning concepts and their integration within enterprise systems.
• Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
• 6+ years of relevant technical support, implementation, or consulting experience in enterprise software.
• Excellent written and verbal communication skills; able to interact confidently with senior stakeholders.

Why Join PTC?
• Work with innovative products and talented global teams.
• Collaborative and inclusive culture where your voice matters.
• Extensive benefits, including: best-in-class insurance; employee stock purchase plan and RSUs; generous PTO and paid parental leave; flexible work hours and no probation clause; career growth opportunities and higher education support.

Life at PTC is about more than working with today's most cutting-edge technologies to transform the physical world. It's about showing up as you are and working alongside some of today's most talented industry leaders to transform the world around you. If you share our passion for problem-solving through innovation, you'll likely become just as passionate about the PTC experience as we are. Are you ready to explore your next career move with us? We respect the privacy rights of individuals and are committed to handling Personal Information responsibly and in accordance with all applicable privacy and data protection laws. Review our Privacy Policy here.
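The log-analysis and batch-failure troubleshooting skills this support role calls for can be sketched as a small log triage. The log lines, their format, and the component names below are illustrative (real lines would come from Tomcat or application server log files, whose format differs by configuration):

```python
import re
from collections import Counter

# Illustrative log lines; the format is a common but hypothetical one.
log_lines = [
    "2024-05-01 02:10:11 INFO  [scheduler] batch started",
    "2024-05-01 02:14:03 ERROR [etl.load] ORA-00001: unique constraint violated",
    "2024-05-01 02:14:05 ERROR [etl.load] retry 1 failed",
    "2024-05-01 02:20:44 WARN  [web] slow response 4100 ms",
]

# date, time, level, [component] -- capture level and component.
LOG_RE = re.compile(r"^\S+ \S+ (\w+)\s+\[([^\]]+)\]")

def error_counts(lines):
    """Count ERROR entries per component to locate the failing layer."""
    counts = Counter()
    for line in lines:
        m = LOG_RE.match(line)
        if m and m.group(1) == "ERROR":
            counts[m.group(2)] += 1
    return counts

print(error_counts(log_lines))  # -> Counter({'etl.load': 2})
```

Counting errors per component is a quick first pass before reading individual stack traces: it tells you whether the web, application, or database layer is the one misbehaving.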

Posted 1 week ago

Apply

6.0 - 8.0 years

1 - 5 Lacs

Bengaluru

Work from Office

Job Title: Informatica Admin (PowerCenter, IDQ, IICS)
Experience: 6-8 Years
Location: Bangalore

Technical Skills:
• Informatica PowerCenter Administration: Install, configure, and maintain Informatica PowerCenter components (Repository Server, Integration Service, Domain Configuration) on Windows servers in AWS. Monitor and optimize PowerCenter performance, including troubleshooting and resolving issues.
• Informatica Data Quality (IDQ) Administration: Install, configure, and manage Informatica Data Quality (IDQ) components, including the IDQ Server and Data Quality Services. Ensure effective data profiling, cleansing, and enrichment processes.
• Informatica Intelligent Cloud Services (IICS) Migration: Plan and execute migration strategies for moving from on-premises Informatica PowerCenter and IDQ to Informatica Intelligent Cloud Services (IICS). Manage and facilitate the migration of ETL processes, data quality rules, and integrations to IICS. Ensure a smooth transition with minimal disruption to ongoing data processes.
• AWS Cloud Management: Manage Informatica PowerCenter, IDQ, and IICS environments within AWS, using services such as EC2 and S3. Implement AWS security and compliance measures to protect data and applications.
• Performance Optimization: Optimize the performance of Informatica PowerCenter, IDQ, IICS, and Oracle databases to ensure efficient data processing and high availability. Conduct regular performance tuning and system health checks.
• Backup & Recovery: Develop and manage backup and recovery processes for Informatica PowerCenter, IDQ, and Oracle databases. Ensure data integrity and implement effective disaster recovery plans.
• Security & Compliance: Configure and manage security policies, user roles, and permissions for Informatica and Oracle environments. Monitor and enforce data security and compliance standards within AWS and Informatica platforms.
• Troubleshooting & Support: Diagnose and resolve issues related to Informatica PowerCenter, IDQ, IICS, and Oracle databases. Provide technical support and guidance to development and operational teams.
• Documentation & Reporting: Create and maintain detailed documentation for Informatica PowerCenter, IDQ, and IICS configurations and Oracle database settings. Generate and review performance and incident reports.

Non-Technical Skills:
• Well-developed analytical and problem-solving skills.
• Strong oral and written communication skills.
• Excellent team player, able to work with virtual teams.
• Ability to learn quickly in a dynamic start-up environment.
• Able to talk to the client directly and report to client/onsite teams.
• Flexibility to work different shifts and stretch when needed.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

hyderabad, telangana

On-site

As an experienced IICS Developer, you will be responsible for supporting a critical data migration project from Oracle to Snowflake. This remote opportunity requires working night-shift hours to align with the U.S. team. Your primary focus will be on developing and optimizing ETL/ELT workflows, collaborating with architects and DBAs on schema conversion, and ensuring data quality, consistency, and validation throughout the migration process.

To excel in this role, you must possess strong hands-on experience with IICS (Informatica Intelligent Cloud Services), a solid background in Oracle databases (including SQL, PL/SQL, and data modeling), and a working knowledge of Snowflake, specifically data staging, architecture, and data loading. Your responsibilities will also include building mappings, tasks, and parameter files in IICS, as well as understanding data pipeline performance tuning to enhance efficiency.

In addition, you will be expected to implement error handling, performance monitoring, and scheduling to support the migration process effectively. Your role will extend to providing assistance during the go-live phase and post-migration stabilization to ensure a seamless transition.

This position offers the flexibility of engagement as either a contract or full-time role, based on availability and fit. If you are looking to apply your expertise in IICS development to a challenging data migration project, this opportunity aligns with your skill set and availability. The shift timings for this role are from 7:30 PM IST to 1:30 AM EST, allowing you to collaborate effectively with the U.S. team members.
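The migration-validation work described above usually starts with a reconciliation pass between source and target. A minimal sketch, assuming nothing about the real schemas: two in-memory SQLite databases stand in for the Oracle source and the Snowflake target, and the table and data are illustrative. A production check would add per-column checksums and sampling on top of row counts.

```python
import sqlite3

# In-memory databases stand in for the Oracle source and Snowflake target.
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
for db in (src, tgt):
    db.execute("CREATE TABLE customers (id INTEGER, name TEXT)")

src.executemany("INSERT INTO customers VALUES (?, ?)",
                [(1, "Asha"), (2, "Ravi"), (3, "Meena")])
tgt.executemany("INSERT INTO customers VALUES (?, ?)",
                [(1, "Asha"), (2, "Ravi")])  # one row missing after the load

def reconcile(source, target, table):
    """Compare row counts between source and target for one table."""
    query = f"SELECT COUNT(*) FROM {table}"
    s = source.execute(query).fetchone()[0]
    t = target.execute(query).fetchone()[0]
    return {"source": s, "target": t, "match": s == t}

print(reconcile(src, tgt, "customers"))
# -> {'source': 3, 'target': 2, 'match': False}
```

A mismatch like this is what the error-handling and monitoring steps in the posting would surface before go-live, rather than after.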

Posted 2 weeks ago

Apply

5.0 - 10.0 years

1 - 6 Lacs

Bengaluru

Remote

Key Responsibilities:
• Lead and execute end-to-end implementation of Informatica MDM solutions, including Cloud Customer 360 (C360), Reference 360, IICS, and other Informatica tools.
• Design and configure match & merge rules and survivorship logic, and conduct performance tuning of MDM applications.
• Develop and maintain batch and API-based data integrations across systems using Informatica tools.
• Define and implement data quality rules and build dashboards to monitor data health and compliance.
• Create and maintain logical, physical, hierarchical, and reference data models supporting business needs.
• Perform root cause analysis, provide resolutions for data-related issues, and create detailed technical documentation and solution designs.
• Collaborate with business stakeholders to gather requirements, translate them into technical solutions, and ensure alignment with data governance policies.

Preferred Candidate Profile

Functional Skills:
• Strong ability to communicate with stakeholders, elicit business and technical requirements, and present solutions clearly.
• In-depth knowledge of Customer, Product, and Vendor Master Data domains and their business impact.
• Exposure to Agile, Scrum, or other project management methodologies for sprint planning and project execution (preferred).

Required Skills & Expertise:
• Informatica Suite: Cloud Customer 360 (C360), Master Data Management (MDM), Reference 360, IICS.
• MDM Implementation: Configuration, match/merge logic, performance optimization.
• Data Integration: Batch and API-level integration skills.
• Data Governance & Quality: Design and implementation of governance frameworks and monitoring dashboards.
• Data Modeling: Logical and physical modeling, including hierarchical and reference data structures.
• Troubleshooting & Documentation: Root cause analysis, documentation, and solution delivery.

Qualifications:
• Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
• Minimum 5 years of relevant experience in Informatica MDM implementations.
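The match/merge and survivorship logic central to this role can be illustrated in miniature. This is a hypothetical sketch, not Informatica MDM configuration: records match on email, and the survivorship rule is "most recently updated non-null value wins" (one of several common strategies, alongside source-system ranking).

```python
from datetime import date

# Hypothetical duplicate customer records from two source systems.
records = [
    {"email": "a@x.com", "name": "A. Kumar", "phone": None,
     "updated": date(2024, 1, 5), "source": "CRM"},
    {"email": "a@x.com", "name": "Arun Kumar", "phone": "98765",
     "updated": date(2024, 3, 2), "source": "ERP"},
]

def merge_by_email(recs):
    """Match on email; survivorship: latest non-null value per field wins."""
    golden = {}
    for rec in sorted(recs, key=lambda r: r["updated"]):
        g = golden.setdefault(rec["email"], {})
        for field, value in rec.items():
            if value is not None:     # later non-null values overwrite earlier
                g[field] = value
    return golden

golden = merge_by_email(records)
print(golden["a@x.com"]["name"])   # -> Arun Kumar
print(golden["a@x.com"]["phone"])  # -> 98765
```

Note how the merged "golden record" keeps the newer name from ERP but would retain a CRM field if ERP had left it null; that trust-by-recency tradeoff is exactly what survivorship rules encode.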

Posted 2 weeks ago

Apply

6.0 - 11.0 years

4 - 9 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Dear Candidate,

Wonderful job opportunity for an Informatica Developer. Greetings from LTIMindtree!

Informatica IICS Developer

We are looking for a highly experienced Senior Informatica IICS Developer to lead the design, development, and optimization of cloud-based ETL solutions. This role demands deep expertise in Informatica Intelligent Cloud Services (IICS), strong data integration skills, and the ability to collaborate across business and technical teams to deliver scalable data pipelines.

Key Responsibilities:
• Design and develop complex ETL workflows, mappings, and transformations using Informatica IICS and PowerCenter.
• Implement data integration solutions across cloud and on-premise systems, including Oracle, SQL Server, Snowflake, and AWS Redshift.
• Collaborate with business analysts and stakeholders to gather requirements and translate them into technical specifications.
• Optimize ETL processes for performance, scalability, and reliability.
• Lead code reviews, enforce best practices, and mentor junior developers.
• Participate in deployment planning, documentation, and production support.
• Ensure data quality, governance, and security compliance across all data flows.

Required Qualifications
https://forms.office.com/r/Nr0f3qMG2t

Posted 2 weeks ago

Apply