
4334 Informatica Jobs - Page 26

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

7.0 - 12.0 years

30 - 45 Lacs

Bengaluru

Work from Office

Experience required: 7+ years with the Informatica data governance tool.

Key Responsibilities:
Data Governance Framework Development: Develop, implement, and maintain data governance frameworks, policies, and standards to ensure high-quality, consistent, and secure data across the organization. Collaborate with business units and stakeholders to define and enforce data governance policies, ensuring alignment with business goals and regulatory requirements.
Data Quality Management: Define and enforce data quality standards, monitoring key data quality metrics. Identify, analyze, and resolve data quality issues across various data sources and platforms. Work with cross-functional teams to implement data quality improvement initiatives.
Data Lineage & Metadata Management: Implement and maintain data lineage and metadata management solutions to ensure visibility and traceability of data throughout its lifecycle. Work with data architects and engineers to establish and document data flows, transformations, and dependencies.
Data Security & Compliance: Ensure that data governance practices comply with relevant regulatory requirements (e.g., GDPR, CCPA, HIPAA). Implement data security controls to protect sensitive data and manage access to sensitive information.
Stakeholder Collaboration: Partner with data architects, data engineers, data scientists, and business analysts to ensure alignment between technical and business needs for data governance. Provide training and support for teams on data governance policies, best practices, and tools.
Data Governance Tools & Technologies: Lead the implementation and optimization of data governance tools and platforms. Continuously evaluate emerging tools and technologies to improve data governance processes.
Reporting & Documentation: Develop and maintain comprehensive data governance documentation and reports. Provide regular updates to senior management on the status of data governance initiatives, risks, and areas for improvement.

Requirements:
Experience: 7+ years of experience in data governance, data management, or related fields. Proven track record in implementing data governance frameworks and policies at an enterprise level. In-depth knowledge of data governance concepts, including data quality, data lineage, metadata management, and data security.
Technical Skills: Experience with data governance tools such as Collibra, Informatica, Alation, or similar. Strong understanding of databases, data warehousing, and big data platforms (e.g., Hadoop, Spark). Familiarity with data integration, ETL processes, and data modeling. Proficiency in SQL and other scripting languages (e.g., Python, Shell).
Regulatory Knowledge: Solid understanding of data privacy and compliance regulations (GDPR, CCPA, HIPAA, etc.). Ability to assess and mitigate compliance risks related to data handling.
Soft Skills: Excellent communication and interpersonal skills. Strong problem-solving skills and the ability to collaborate across teams. Ability to manage multiple projects and deadlines in a fast-paced environment.
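For illustration of the data quality monitoring this role describes, a minimal sketch of computing completeness and duplicate metrics with pandas; the file name, column names, and the 95% threshold are hypothetical, not part of the posting:

```python
# Illustrative only: simple data quality metrics with pandas.
import pandas as pd

df = pd.read_csv("customers.csv")  # hypothetical source extract

metrics = {
    "row_count": len(df),
    "duplicate_rows": int(df.duplicated().sum()),
    # completeness: share of non-null values per column
    "completeness": (1 - df.isna().mean()).round(3).to_dict(),
}

# Flag columns that fall below a 95% completeness threshold
violations = {col: pct for col, pct in metrics["completeness"].items() if pct < 0.95}

print(metrics)
print("Columns below threshold:", violations)
```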

Posted 1 week ago

Apply

8.0 - 12.0 years

30 - 35 Lacs

Chennai

Work from Office

Minimum of 4+ years of hands-on experience in Ab Initio development. Develop and optimize ETL workflows and processes for data extraction, transformation, and loading. Proficiency in the different Ab Initio suite components. Strong understanding of ETL concepts, data warehousing principles, and relational databases. Experience in designing and implementing ETL processes for large-scale data sets. Solid knowledge of SQL and scripting languages for data manipulation and analysis. Excellent problem-solving skills and ability to work independently or as part of a team. Strong communication skills and the ability to interact effectively with stakeholders at various levels. Perform unit testing, debugging, and troubleshooting of Ab Initio graphs and applications to ensure data accuracy and integrity.

Posted 1 week ago

Apply

8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Location: Pan India

Job Summary: We are seeking a highly skilled and motivated Data Quality Expert with 5-8 years of hands-on experience in implementing and managing data quality initiatives using tools like IBM Data Quality, IBM Cloud Pak for Data, or other industry-standard data quality platforms. The ideal candidate will work closely with business and technical teams to ensure data integrity, accuracy, completeness, and alignment with enterprise data governance policies.

Key Responsibilities:
Collaborate with data stewards, data owners, and business stakeholders to understand data quality requirements and priorities.
Configure, implement, and maintain data quality rules, profiling, cleansing, and monitoring using tools like IBM Data Quality, IBM Cloud Pak for Data, etc.
Conduct regular data profiling and root cause analysis to identify data quality issues and define remediation plans.
Define and enforce data quality metrics, dashboards, and scorecards to track data health.
Participate in data governance activities, including metadata management, data lineage tracking, and policy enforcement.
Support the implementation of data quality frameworks and standards across projects and domains.
Work closely with the Data Governance team to align data quality initiatives with governance policies and regulatory compliance needs.
Document data quality findings, technical specifications, workflows, and standard operating procedures.
Recommend improvements to business processes, systems, and data flows to enhance data quality.
Actively contribute to data quality assessments and audits as needed.

Required Skills & Experience:
5 to 8 years of experience in data quality management and implementation.
Hands-on expertise with one or more of the following tools: IBM Information Analyzer / IBM Data Quality, IBM Cloud Pak for Data, Informatica Data Quality (IDQ), Talend Data Quality; other similar tools are a plus.
Strong understanding of data profiling, data cleansing, matching, standardization, and data validation techniques.
Working knowledge of data governance concepts and frameworks (e.g., DAMA DMBOK).
Good understanding of relational databases, data warehousing, ETL processes, and SQL.
Familiarity with data catalogs, metadata management, and data lineage tools is desirable.
Experience working in Agile environments and using tools like Jira, Confluence, or Azure DevOps.
Strong analytical thinking, problem-solving skills, and attention to detail.
Effective verbal and written communication skills for collaboration with business and technical teams.

Preferred Qualifications:
Bachelor's or Master's degree in Computer Science, Information Systems, Data Science, or a related field.
IBM Certified Data Quality or Cloud Pak for Data certification is a plus.
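As a rough illustration of the rule-based profiling and scorecards mentioned above, a hedged pandas sketch; the rules, columns, and sample values are hypothetical:

```python
# Illustrative only: rule-based validation producing a simple scorecard.
import re
import pandas as pd

df = pd.DataFrame({
    "email": ["a@example.com", "bad-address", None],
    "age": [34, -2, 51],
})

rules = {
    "email_format": df["email"].fillna("").str.match(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
    "age_in_range": df["age"].between(0, 120),
}

# Percentage of rows passing each rule
scorecard = {name: round(passed.mean() * 100, 1) for name, passed in rules.items()}
print(scorecard)  # e.g. {'email_format': 33.3, 'age_in_range': 66.7}
```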

Posted 1 week ago

Apply

7.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

This is a full-time role based in Bangalore. The Platform Owner - Data will lead a team of Data and ML practitioners – data engineers, ML engineers, and technical engineers. The job holder is responsible for all data used at the Bank, including databases (Oracle, PostgreSQL, MongoDB), Hadoop, SAP HANA, Informatica, Spark, Kafka, and Redis. The role is required to work closely with Business and Technology stakeholders. This role involves ensuring the data platforms meet security requirements and are used efficiently by different workloads, e.g., applications, data scientists, analytical users. There is also the need to audit and monitor activity. A key part of the role is resource management, e.g., working with resource and product vendors and scaling to provide resources to meet delivery demand. A key requirement is servant leadership, creating an environment where the engineers are empowered and encouraged to grow and build expertise and ownership. The role involves team management, including recruiting of FTEs and contractors and ensuring individuals' careers and performance are managed according to HR policies. Leading platform architectures that power the Bank with new ML technologies. Ensure timely optimization and proper maintenance of ML processes. The role requires an Agile mindset as well as building an Agile culture as defined by the LEAP Strategy. The role is also required to adopt a continuous improvement approach to delivery and automation implementation, e.g., DevOps, CI/CD.

Roles & Responsibilities:
Drive Data Platform strategy
Monitor platform capacity and usage metrics; capture performance metrics
Monitor user access
Ensure data security policy and procedures have been implemented; implement measures
Allocate user workspace and sandbox; ensure data is protected
Manage platform capacity
Resource planning and team skills matrix; manage resource utilization, demand, and skills shortages
Team development and empowerment
Create project codes; ensure resource cost recovery; ensure vendor billing is accurate; ensure invoices are processed in a timely manner
Adopt Continuous Integration and Continuous Deployment; test case automation

Requirements:
Bachelor of Computer Science or equivalent; relevant certification
Minimum 7 years of professional experience in the Banking & Finance industry or a technology company
Minimum 5 years of professional experience in Data Warehouse and Analytics
Knowledge of/exposure to modern Data Warehouse architecture and analytics platforms
Agile working practices
5 years of Data Engineering or analytics experience
2 years' experience of DevOps, CI/CD
5 years' experience of managing a large team (>10)
Solid knowledge of Data Warehouse principles
5 years' experience of working with databases (Oracle, PostgreSQL, MongoDB)
5 years' experience of working with Hadoop and HANA
4+ years of working experience with an ETL tool, preferably Informatica
3+ years' experience of implementing ML technologies
Experience of working with Hadoop-related tools, e.g., Spark, Sqoop
2 years' experience of working in Agile teams as part of a squad
2 years' experience of the technical aspects of Data Governance, including glossary, metadata, data quality, master data management & data lifecycle management
Excellent communications, people management, coaching, leadership and strategic planning
Team management experience for 10-plus staff members
Ability to develop collaborative and trustful relationships within a complex, outcome-focused organization. Customer-oriented work ethic; shows ownership and results orientation. Analytical thinking, ability to influence others, leadership, achievement orientation, innovation, and excellent written and verbal communication. Ability to work across the organization at all levels. Ability to influence, coach/mentor, and listen. Demonstrated ability to develop business cases and take broader stakeholder groups on the journey. Demonstrated ability to identify the strategic drivers and benefits for change. Good experience in influencing people and coordinating, facilitating, and motivating senior management. Strong organizational and people management skills developed in a metrics-based environment, including experience with metrics-based performance management. Strong verbal presentation and written communication skills, with experience in senior management reporting and presentation. Ability to recognize the need to respond rapidly when necessary while maintaining a strategic stance. Demonstrated ability to manage complex vendor and supplier relationships. Proven experience in contributing to and executing technology service development plans or roadmaps.

Posted 1 week ago

Apply

7.0 years

0 Lacs

Bangalore Urban, Karnataka, India

On-site

The Platform Owner - Data will lead a team of Data and ML practitioners – data engineers, ML engineers, and technical engineers. The job holder is responsible for all data used at the Bank, including databases (Oracle, PostgreSQL, MongoDB), Hadoop, SAP HANA, Informatica, Spark, Kafka, and Redis. The role is required to work closely with Business and Technology stakeholders. This role involves ensuring the data platforms meet security requirements and are used efficiently by different workloads, e.g., applications, data scientists, analytical users. There is also the need to audit and monitor activity. A key part of the role is resource management, e.g., working with resource and product vendors and scaling to provide resources to meet delivery demand. A key requirement is servant leadership, creating an environment where the engineers are empowered and encouraged to grow and build expertise and ownership. The role involves team management, including recruiting of FTEs and contractors and ensuring individuals' careers and performance are managed according to HR policies. Leading platform architectures that power the Bank with new ML technologies. Ensure timely optimization and proper maintenance of ML processes. The role requires an Agile mindset as well as building an Agile culture as defined by the LEAP Strategy. The role is also required to adopt a continuous improvement approach to delivery and automation implementation, e.g., DevOps, CI/CD.

Roles & Responsibilities:
Drive Data Platform strategy
Monitor platform capacity and usage metrics; capture performance metrics
Monitor user access
Ensure data security policy and procedures have been implemented; implement measures
Allocate user workspace and sandbox; ensure data is protected
Manage platform capacity
Resource planning and team skills matrix; manage resource utilization, demand, and skills shortages
Team development and empowerment
Create project codes; ensure resource cost recovery; ensure vendor billing is accurate; ensure invoices are processed in a timely manner
Adopt Continuous Integration and Continuous Deployment; test case automation

Requirements:
Bachelor of Computer Science or equivalent; relevant certification
Minimum 7 years of professional experience in the Banking & Finance industry or a technology company
Minimum 5 years of professional experience in Data Warehouse and Analytics
Knowledge of/exposure to modern Data Warehouse architecture and analytics platforms
Agile working practices
5 years of Data Engineering or analytics experience
2 years' experience of DevOps, CI/CD
5 years' experience of managing a large team (>10)
Solid knowledge of Data Warehouse principles
5 years' experience of working with databases (Oracle, PostgreSQL, MongoDB)
5 years' experience of working with Hadoop and HANA
4+ years of working experience with an ETL tool, preferably Informatica
3+ years' experience of implementing ML technologies
Experience of working with Hadoop-related tools, e.g., Spark, Sqoop
2 years' experience of working in Agile teams as part of a squad
2 years' experience of the technical aspects of Data Governance, including glossary, metadata, data quality, master data management & data lifecycle management
Excellent communications, people management, coaching, leadership and strategic planning
Team management experience for 10-plus staff members
Ability to develop collaborative and trustful relationships within a complex, outcome-focused organization. Customer-oriented work ethic; shows ownership and results orientation. Analytical thinking, ability to influence others, leadership, achievement orientation, innovation, and excellent written and verbal communication. Ability to work across the organization at all levels. Ability to influence, coach/mentor, and listen. Demonstrated ability to develop business cases and take broader stakeholder groups on the journey. Demonstrated ability to identify the strategic drivers and benefits for change. Good experience in influencing people and coordinating, facilitating, and motivating senior management. Strong organizational and people management skills developed in a metrics-based environment, including experience with metrics-based performance management. Strong verbal presentation and written communication skills, with experience in senior management reporting and presentation. Ability to recognize the need to respond rapidly when necessary while maintaining a strategic stance. Demonstrated ability to manage complex vendor and supplier relationships. Proven experience in contributing to and executing technology service development plans or roadmaps.

Posted 1 week ago

Apply

10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Responsibilities: Accountability for the practice in Asia, including driving quality, sales, recruiting, account management, consulting, and all operational aspects, including: Practice Building – Drive overall growth of practice area through a combination of business development, pre-sales and estimating, delivery work, and thought leadership. Team & Individual Development – Maximise team performance through an effective team approach that increases productivity and job satisfaction. Manage the allocation of our offshore resources to local projects. Engagement Management – Manage engagement risk, project economics including planning and budgeting, defining deliverable content, and ensuring buy-in of proposed solutions from top management levels at the client. Manage and deliver MuleSoft engagements initially while we are building the practice. Responsible for the profitability of all MuleSoft offerings, with revenue management expectations Builds and develops relationships with MuleSoft Alliance and Sales teams Owns joint sales pursuits in partnership with MuleSoft Identifies opportunities for growth and maturation of MuleSoft offerings Provides oversight and governance of all sold and managed MuleSoft projects Initially manage projects. Drives business development with the proper information, tools, and subject matter expertise to sell engagements within the offering Builds and develops relationship / partnership with local market teams, aligning on sales pursuits, resource capacity and capabilities, and awareness across the region Develops or supports the development of case studies and training materials; conducts brown bags and provides guidance for MuleSoft Practice Develops or supports the development and delivery of best practices, delivery templates, and point-of-view papers Oversees quality assurance of project delivery Facilitates client satisfaction surveys (where applicable) Oversees alignment of global resources to projects based on appropriate skills and availability, while being responsible for the overall utilization numbers of the team. Supports recruiting and onboarding of new employees. Profile Minimum Bachelor’s Degree in Software Development or Engineering 10+ years’ experience in a large consulting environment Deep technical understanding in the Integration and API Management space 6+ years prior experience leading projects built on integration platforms, preferably MuleSoft, or the following: Boomi, Informatica, TIBCO Expert at project delivery, including all aspects of program management and the SDLC Expert business development skills as well as managing relationships with both clients and internal stakeholders Expert communication (verbal and written) Expert business operations (e.g., invoicing, SOWs, margins, utilization) Skilled at managing multiple clients Excellent mentoring and leadership skills.

Posted 1 week ago

Apply

5.0 years

0 Lacs

India

On-site

Job Summary: We are seeking a skilled and analytical Data Architect & Business Intelligence Specialist to design, model, and implement robust data architectures, pipelines, and reporting frameworks. This role will be responsible for building and maintaining data models, overseeing data migrations, and developing scalable data warehouse solutions to support business intelligence and analytics initiatives.

Key Responsibilities:
1. Data Architecture & Modeling: Design and maintain the enterprise data architecture aligned with business and technical requirements. Develop logical and physical data models using industry best practices. Establish and maintain metadata standards and data dictionaries. Ensure data consistency, quality, and governance across all systems.
2. Data Pipelines & ETL/ELT: Design and build efficient and scalable data pipelines for structured and unstructured data. Develop ETL/ELT processes using tools like Apache Airflow, Talend, Informatica, or Azure Data Factory. Optimize data ingestion, transformation, and loading procedures to support analytics.
3. Data Migration: Plan and execute data migration projects from legacy systems to modern data platforms. Ensure data integrity and minimal downtime during migration activities. Collaborate with stakeholders to map old data structures to the new architecture.
4. Data Warehousing: Design, implement, and manage modern data warehouses (e.g., Snowflake, Redshift, BigQuery, Synapse). Ensure high performance, scalability, and security of data warehousing environments. Implement data partitioning, indexing, and performance tuning techniques.
5. Business Intelligence & Reporting: Collaborate with business stakeholders to gather reporting and analytics requirements. Build interactive dashboards and reports using tools like Power BI, Tableau, Looker, or Qlik. Enable self-service reporting and ensure data accuracy in BI platforms. Monitor data usage and performance, and drive continuous improvement in reporting frameworks.

Requirements:
Education & Experience: Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related field. 5+ years of experience in data architecture, modeling, pipelines, and BI/reporting.
Technical Skills: Strong expertise in SQL and data modeling (3NF, dimensional, star/snowflake schemas). Experience with data warehouse technologies and cloud platforms (AWS, Azure, GCP). Proficiency in BI/reporting tools and data visualization best practices. Knowledge of Python, Scala, or other scripting languages is a plus. Familiarity with data governance, security, and compliance standards.
Soft Skills: Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills with both technical and non-technical stakeholders. Ability to translate complex technical concepts into business language.
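Since the role calls for ETL/ELT orchestration with tools such as Apache Airflow, here is a minimal, hedged sketch of a daily pipeline skeleton; Airflow 2.4+ is assumed, and the DAG id and task bodies are placeholders:

```python
# Illustrative only: a minimal Airflow DAG for a daily ELT job.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull data from the source system")  # placeholder logic

def load():
    print("load into the warehouse")  # placeholder logic

with DAG(
    dag_id="daily_elt_example",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
```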

Posted 1 week ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Role: Data Engineer (Informatica BDM)
Experience: 5+ years
Location: Bangalore / Chennai
Notice Period: Immediate

JOB DESCRIPTION:
· Minimum 5+ years of development and design experience in Informatica Big Data Management.
· Extensive knowledge of Oozie scheduling, HQL, Hive, HDFS (including usage of storage controllers) and data partitioning.
· Extensive experience working with SQL and NoSQL databases.
· Linux OS configuration and use, including shell scripting.
· Good hands-on experience with design patterns and their implementation.
· Well versed in Agile, DevOps and CI/CD principles (GitHub, Jenkins, etc.), and actively involved in solving and troubleshooting issues in a distributed services ecosystem.
· Familiar with distributed services resiliency and monitoring in a production environment.
· Experience in designing, building, testing, and implementing security systems – including identifying security design gaps in existing and proposed architectures and recommending changes or enhancements.
· Responsible for adhering to established policies, following best practices, developing and possessing an in-depth understanding of exploits and vulnerabilities, and resolving issues by taking the appropriate corrective action.
· Knowledge of security controls for designing source and data transfers, including CRON, ETLs, and JDBC-ODBC scripts.
· Understand basics of networking, including DNS, Proxy, ACL, Policy, and troubleshooting.
· High-level knowledge of compliance and regulatory requirements for data, including but not limited to encryption, anonymization, data integrity, and policy control features in large-scale infrastructures.
· Understand data sensitivity in terms of logging, events and in-memory data storage – such as no card numbers or personally identifiable data in logs.
· Implement wrapper solutions for new/existing components with no/minimal security controls to ensure compliance with bank standards.
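To illustrate the Hive/HDFS partitioning knowledge the description asks for, a minimal PySpark sketch that prunes partitions by filtering on the partition column; the database, table, and column names are hypothetical:

```python
# Illustrative only: reading a partitioned Hive table with PySpark.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("partition_pruning_example")
    .enableHiveSupport()
    .getOrCreate()
)

# Filtering on the partition column (e.g. business_date) lets Hive/HDFS
# skip irrelevant partitions instead of scanning the whole table.
df = (
    spark.table("sales_db.transactions")
    .where("business_date = '2024-01-31'")
)

df.groupBy("channel").count().show()
```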

Posted 1 week ago

Apply

3.0 years

0 Lacs

Greater Kolkata Area

On-site

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Informatica Data Quality
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop innovative solutions and ensure seamless application functionality.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Develop and implement Informatica Data Quality solutions.
- Collaborate with cross-functional teams to analyze and address data quality issues.
- Create and maintain documentation for data quality processes.
- Participate in data quality improvement initiatives.
- Assist in training junior professionals in data quality best practices.

Professional & Technical Skills:
- Must-have skills: Proficiency in Informatica Data Quality.
- Strong understanding of data quality principles.
- Experience with ETL processes and data integration.
- Knowledge of data profiling and cleansing techniques.
- Familiarity with data governance and metadata management.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Informatica Data Quality.
- This position is based at our Kolkata office.
- 15 years of full-time education is required.

Posted 1 week ago

Apply

3.0 years

0 Lacs

Bhubaneswar, Odisha, India

On-site

Project Role: Advanced Application Engineer
Project Role Description: Develop innovative technology solutions for emerging industries and products. Interpret system requirements into design specifications.
Must-have skills: Informatica MDM
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Advanced Application Engineer, you will engage in the development of innovative technology solutions tailored for emerging industries and products. Your typical day will involve interpreting system requirements and translating them into detailed design specifications, ensuring that the solutions meet the needs of the business and its clients. You will collaborate with cross-functional teams to refine these specifications and contribute to the overall success of the projects you are involved in, while also staying updated on the latest technological advancements in your field.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of design specifications and system requirements.
- Engage in continuous learning to stay abreast of industry trends and technologies.

Professional & Technical Skills:
- Must-have skills: Proficiency in Informatica MDM.
- Strong understanding of data integration and data quality processes.
- Experience with data modeling and metadata management.
- Familiarity with ETL processes and data warehousing concepts.
- Ability to troubleshoot and resolve data-related issues efficiently.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Informatica MDM.
- This position is based at our Bhubaneswar office.
- 15 years of full-time education is required.
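For context on the kind of record standardization and matching that MDM platforms such as Informatica MDM automate, a toy Python sketch using only the standard library; the sample records and the 0.8 threshold are hypothetical:

```python
# Illustrative only: naive standardization and fuzzy matching of party names.
from difflib import SequenceMatcher

def standardize(name: str) -> str:
    # lower-case, drop punctuation, collapse whitespace
    return " ".join(name.lower().replace(".", "").split())

def similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, standardize(a), standardize(b)).ratio()

records = [("Acme Corp.", "ACME CORP"), ("Globex Ltd", "Initech Inc")]
for a, b in records:
    score = similarity(a, b)
    print(a, "|", b, "->", round(score, 2), "MATCH" if score > 0.8 else "review")
```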

Posted 1 week ago

Apply

0 years

0 Lacs

India

On-site

The SQL Developer is responsible for designing, developing, and maintaining database systems and complex SQL queries to support business operations, reporting, and analytics. This role involves working with stakeholders to ensure data integrity, performance, and accessibility across multiple systems.

Responsibilities:
Design, write, and optimize SQL queries, stored procedures, functions, and triggers
Develop and maintain relational databases and data models
Perform database tuning, performance monitoring, and optimization
Create and maintain reports, dashboards, and data visualizations as needed
Collaborate with software developers, data analysts, and business users to meet data needs
Ensure data accuracy, consistency, and security across all database systems
Document database structures, processes, and procedures
Assist in database upgrades, backups, and recovery processes
Analyze existing queries for performance improvements
Support data migration, integration, and ETL (Extract, Transform, Load) processes

Requirements:
Proficiency in writing complex SQL queries and working with relational databases (e.g., SQL Server, MySQL, PostgreSQL, Oracle)
Experience with stored procedures, indexing, and query optimization
Familiarity with database design and data normalization principles
Knowledge of data warehousing and ETL tools (e.g., SSIS, Talend, Informatica) is a plus
Understanding of BI tools (e.g., Power BI, Tableau) is beneficial
Strong analytical and problem-solving skills
Good communication and teamwork abilities
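As a small, hedged illustration of the query optimization this role involves, the following uses Python's built-in sqlite3 module to compare query plans before and after adding an index; the schema and data are made up:

```python
# Illustrative only: index-driven query optimization with sqlite3.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)")
cur.executemany(
    "INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(10_000)],
)

query = "SELECT SUM(amount) FROM orders WHERE customer_id = 42"

# Without an index the planner scans the whole table...
print(cur.execute("EXPLAIN QUERY PLAN " + query).fetchall())

# ...after adding an index it can seek directly to the matching rows.
cur.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
print(cur.execute("EXPLAIN QUERY PLAN " + query).fetchall())
conn.close()
```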

Posted 1 week ago

Apply

3.0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: Informatica MDM
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions for data generation, collection, and processing. You will create data pipelines, ensure data quality, and implement ETL processes to migrate and deploy data across systems.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Develop and maintain data pipelines for efficient data processing.
- Implement ETL processes to migrate and deploy data across systems.
- Ensure data quality and integrity throughout the data lifecycle.
- Collaborate with cross-functional teams to optimize data solutions.
- Stay updated with industry trends and best practices in data engineering.

Professional & Technical Skills:
- Must-have skills: Proficiency in Informatica MDM.
- Strong understanding of data modeling and database concepts.
- Experience with data integration tools and technologies.
- Hands-on experience in designing and implementing data solutions.
- Knowledge of data governance and data security practices.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Informatica MDM.
- This position is based at our Indore office.
- 15 years of full-time education is required.

Posted 1 week ago

Apply

3.0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site

Project Role: Software Development Engineer
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must-have skills: Oracle Procedural Language Extensions to SQL (PLSQL)
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will be responsible for ensuring the functionality and efficiency of the applications. This role requires a strong understanding of Oracle Procedural Language Extensions to SQL (PLSQL) and the ability to work collaboratively with the team to provide solutions to work-related problems.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to gather and analyze requirements.
- Design, develop, and test PLSQL code to meet business needs.
- Troubleshoot and debug application issues to ensure optimal performance.
- Optimize database queries and improve application performance.
- Document technical specifications and user guides for developed applications.

Professional & Technical Skills:
- Must-have skills: Proficiency in Oracle Procedural Language Extensions to SQL (PLSQL) and Informatica.
- Strong understanding of database concepts and SQL.
- Experience in performance tuning and query optimization.
- Knowledge of software development life cycle (SDLC) methodologies.
- Familiarity with version control systems such as Git or SVN.

Additional Information:
- The candidate should have a minimum of 4 years of experience in Oracle Procedural Language Extensions to SQL (PLSQL).
- This position is based in Gurugram.
- 15 years of full-time education is required.

Posted 1 week ago

Apply

0.0 - 5.0 years

8 - 18 Lacs

Chennai, Tamil Nadu

On-site

Senior ETL Developer / Lead Data Engineer

Job Summary
Experience: 5-8 years | Hybrid mode | Full time/Contract | Chennai | Immediate joiner | US shift timings

Job Overview
We are looking for a Senior ETL Developer with strong expertise in Talend, PostgreSQL, AWS, and Linux who can take ownership of projects end-to-end (from design to delivery), lead technical implementation, mentor junior developers, and drive technical best practices in ETL, data integration, and cloud data workflows. The ideal candidate will have 5-8 years of hands-on experience in data integration, cloud-based ETL pipelines, data versioning, and automation, and must be ready to work in a hybrid setup from Chennai or Madurai.

Responsibilities:
· Design and implement scalable ETL workflows using Talend and PostgreSQL.
· Handle complex data transformations and integrations across structured/unstructured sources.
· Develop automation scripts using Shell/Python in a Linux environment.
· Build and maintain stable ETL pipelines integrated with AWS services (S3, Glue, RDS, Redshift).
· Ensure data quality, governance, and version control using tools like Git and Quilt.
· Troubleshoot data pipeline issues and optimize for performance.
· Schedule and manage jobs using tools like Apache Airflow, Cron, or Jenkins.
· Mentor team members, review code, and promote technical best practices.
· Drive continuous improvement and training on modern data tools and techniques.

ETL & Integration
· Must have: Talend (Open Studio / DI / Big Data)
· Also good: SSIS, SSRS, SAS
· Bonus: Apache NiFi, Informatica
Databases
· Required: PostgreSQL (3+ years)
· Bonus: Oracle, SQL Server, MySQL
Cloud Platforms
· Required: AWS (S3, Glue, RDS, Redshift)
· Bonus: Azure Data Factory, GCP
· Certifications: AWS / Azure (good to have)
OS & Scripting
· Required: Linux, shell scripting
· Preferred: Python scripting
Data Versioning & Source Control
· Required: Quilt, Git/GitHub/Bitbucket
· Bonus: DVC, LakeFS, Git LFS
Scheduling & Automation
· Apache Airflow, Cron, Jenkins, Talend JobServer
Bonus Tools
· REST APIs, JSON/XML, Spark, Hive, Hadoop
Visualization & Reporting
· Power BI / Tableau (nice to have)

· Strong verbal and written communication.
· Proven leadership and mentoring capabilities.
· Ability to manage projects independently.
· Comfortable adopting and teaching new tools and methodologies.
· Willingness to work in a hybrid setup from Chennai or Madurai.

Job Types: Full-time, Contractual / Temporary
Pay: ₹800,000.00 - ₹1,800,000.00 per year
Benefits: Flexible schedule
Schedule: Evening shift, Monday to Friday, Rotational shift, UK shift, US shift, Weekend availability
Application Question(s): Do you have experience in AWS, GCP, Snowflake, or Databricks? If yes, mention your field.
Experience: ETL developer: 5 years (Required); Talend/Informatica: 5 years (Required)
Location: Chennai, Tamil Nadu (Required)
Work Location: In person
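To illustrate the kind of AWS-integrated ETL step described above, a minimal hedged sketch that copies a CSV landed in S3 into a PostgreSQL staging table using boto3 and psycopg2; the bucket, key, table, and credentials are placeholders:

```python
# Illustrative only: load a CSV from S3 into a PostgreSQL staging table.
import io
import boto3
import psycopg2

s3 = boto3.client("s3")
obj = s3.get_object(Bucket="example-landing-bucket", Key="exports/customers.csv")
payload = io.StringIO(obj["Body"].read().decode("utf-8"))

conn = psycopg2.connect(host="localhost", dbname="warehouse", user="etl", password="secret")
with conn, conn.cursor() as cur:
    # COPY streams the CSV straight into the staging table
    cur.copy_expert("COPY staging.customers FROM STDIN WITH CSV HEADER", payload)
conn.close()
```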

Posted 1 week ago

Apply

5.0 years

0 Lacs

Trivandrum, Kerala, India

Remote

Position: SAP MDM Specialist Location: 100% Remote Employment Type: Contract Experience Level: 5+ years Qualifications Required: • 5+ years of experience in SAP MDM or MDG (Master Data Governance), including hands-on experience with SAP ECC and S/4HANA. • Experience with data governance frameworks, data standards, and best practices. • Familiarity with SAP MDM/MDG configuration, SAP Data Services, SAP LSMW, and ETL tools • Experience with SQL and Excel for data analysis is also highly beneficial. • Good understanding of SAP modules such as MM, SD, FI, and how master data impacts business processes across the organization. • Bachelor's degree in Information Technology, Business Administration, Data Management, or a related field. Qualifications Preferred: • SAP certification in MDM or MDG. • Experience with SAP Data Services, Informatica, or other data migration tools. • Previous experience in industries such as manufacturing, finance, retail, or pharmaceuticals where SAP systems are critical. Beverage industry is a plus. • Strong analytical, problem-solving, and communication skills. Ability to collaborate with cross-functional teams and manage multiple priorities in a fast paced environment.

Posted 1 week ago

Apply

10.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

About The Organisation
DataFlow Group is a pioneering global provider of specialized Primary Source Verification (PSV) solutions, and background screening and immigration compliance services that assist public and private organizations in mitigating risks to make informed, cost-effective decisions regarding their Applicants and Registrants.

About The Role
We are looking for a highly skilled and experienced Senior ETL & Data Streaming Engineer with over 10 years of experience to play a pivotal role in designing, developing, and maintaining our robust data pipelines. The ideal candidate will have deep expertise in both batch ETL processes and real-time data streaming technologies, coupled with extensive hands-on experience with AWS data services. A proven track record of working with Data Lake architectures and traditional Data Warehousing environments is essential.

Duties And Responsibilities
Design, develop, and implement highly scalable, fault-tolerant, and performant ETL processes using industry-leading ETL tools to extract, transform, and load data from various source systems into our Data Lake and Data Warehouse.
Architect and build batch and real-time data streaming solutions using technologies like Talend, Informatica, Apache Kafka or AWS Kinesis to support immediate data ingestion and processing requirements.
Utilize and optimize a wide array of AWS data services.
Collaborate with data architects, data scientists, and business stakeholders to understand data requirements and translate them into efficient data pipeline solutions.
Ensure data quality, integrity, and security across all data pipelines and storage solutions.
Monitor, troubleshoot, and optimize existing data pipelines for performance, cost-efficiency, and reliability.
Develop and maintain comprehensive documentation for all ETL and streaming processes, data flows, and architectural designs.
Implement data governance policies and best practices within the Data Lake and Data Warehouse environments.
Mentor junior engineers and contribute to fostering a culture of technical excellence and continuous improvement.
Stay abreast of emerging technologies and industry best practices in data engineering, ETL, and streaming.

Qualifications
10+ years of progressive experience in data engineering, with a strong focus on ETL, ELT and data pipeline development.
Deep expertise in ETL tools: extensive hands-on experience with commercial ETL tools (Talend).
Strong proficiency in data streaming technologies: proven experience with real-time data ingestion and processing using platforms such as AWS Glue, Apache Kafka, AWS Kinesis, or similar.
Extensive AWS data services experience: proficiency with AWS S3 for data storage and management; hands-on experience with AWS Glue for ETL orchestration and data cataloging; familiarity with AWS Lake Formation for building secure data lakes; experience with AWS EMR for big data processing is good to have.
Data Warehouse (DWH) knowledge: strong background in traditional data warehousing concepts, dimensional modeling (Star Schema, Snowflake Schema), and DWH design principles.
Programming languages: proficient in SQL and at least one scripting language (e.g., Python, Scala) for data manipulation and automation.
Database skills: strong understanding of relational databases and NoSQL databases.
Version control: experience with version control systems (e.g., Git).
Problem-solving: excellent analytical and problem-solving skills with a keen eye for detail.
Communication: strong verbal and written communication skills, with the ability to articulate complex technical concepts to both technical and non-technical audiences.
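For illustration of the real-time streaming work described above, a minimal hedged sketch using the kafka-python client to produce and consume JSON events; the broker address and topic are placeholders, and AWS Kinesis would follow a similar pattern via boto3:

```python
# Illustrative only: producing and consuming JSON events with kafka-python.
import json
from kafka import KafkaProducer, KafkaConsumer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("orders", {"order_id": 123, "status": "verified"})
producer.flush()

consumer = KafkaConsumer(
    "orders",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    consumer_timeout_ms=5000,  # stop iterating if no new messages arrive
)
for message in consumer:
    print(message.value)
```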

Posted 1 week ago

Apply

5.0 - 8.0 years

10 - 20 Lacs

Greater Noida

Work from Office

We're Hiring: Informatica IDMC. Location: Greater Noida Only. Experience: 5 to 8 Years Company: Coforge Ltd Send your CV to Gaurav.2.Kumar@coforge.com WhatsApp at 9667427662 for queries. We are seeking a skilled ETL Developer with strong expertise in Informatica Intelligent Data Management Cloud (IDMC/IICS) to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining scalable ETL pipelines that support our data integration and analytics initiatives. Key Responsibilities: Design, develop, and maintain robust ETL pipelines using Informatica IDMC (IICS) . Collaborate with data architects, analysts, and business stakeholders to gather and understand data requirements. Integrate data from diverse sources including databases, APIs, and flat files. Optimize data workflows for performance, scalability, and reliability. Monitor and troubleshoot ETL jobs and resolve data quality issues. Implement data governance and security best practices. Maintain comprehensive documentation of data flows, transformations, and architecture. Participate in code reviews and contribute to continuous improvement initiatives. Required Skills & Qualifications: Strong hands-on experience with Informatica IDMC (IICS) and cloud-based ETL tools. Proficiency in SQL and experience with relational databases such as Oracle, SQL Server, PostgreSQL . Experience working with cloud platforms like AWS, Azure, or GCP . Familiarity with data warehousing concepts and tools such as Snowflake, Redshift, or BigQuery . Excellent problem-solving abilities and strong communication skills. Preferred Qualifications: Experience with CI/CD pipelines and version control systems. Knowledge of data modeling and metadata management. Certification in Informatica or cloud platforms is a plus. About Company:- Coforge is a leading global IT solutions organization, enabling its clients to transform at the intersect of unparalleled domain expertise and emerging technologies to achieve real-world business impact. A focus on very selects industries, a detailed understanding of the underlying processes of those industries and partnerships with leading platforms provides us a distinct vantage. We leverage AI, Cloud and Insight driven technologies, allied with our industry expertise, to transform client businesses into intelligent, high growth enterprises. Coforge is an Indian publicly traded IT software and services company, operating in many countries. The company's stock trades on the Bombay Stock Exchange and on the National Stock Exchange of India under the ticker symbol COFORGE. Coforge Web Link:- https://www.coforge.com/ Coforge LinkedIn Page Link - https://lnkd.in/dJ9n2HuT

Posted 1 week ago

Apply

0 years

0 Lacs

India

On-site

Required Skills: Oracle Cloud, Oracle R12, SQL

Requirements:
• Familiarity with Oracle Product Data (PIM, PBH) and foundational data domains.
• Understanding of the Oracle Cloud ERP data model.
• Strong SQL skills for data mining across multiple sources.
• Excel proficiency – advanced formulas, VLOOKUPs, and working with large data sets.
• Detail-oriented, organized, and able to follow instructions precisely.
• Fluent in English, able to work during European hours.

Preferred:
• **MUST HAVE: Background in Oracle R12 or other Oracle ERP systems.
• **MUST HAVE: Experience in Oracle Cloud.
• Experience working with PL/SQL, Oracle BI tools, or APEX.
• Exposure to ETL tools (Informatica preferred) and data conversion projects.

Notes: All conversions will go through IDMC. Good communication skills are important; the role interacts directly with the business to understand requirements, discuss next steps, etc.

Job Description Overview
The team is supporting an initiative to replace legacy ERP systems (e.g., Oracle 11i) with Oracle Cloud. This role is critical to managing data conversion and transformation efforts across multiple global releases, including a major rollout in North America and Europe (Netherlands, Germany, France) by April 2026.

Key Responsibilities
Collaborate with consultants and business teams on data profiling, governance, and enrichment.
Extract data from 3–6 legacy databases using SQL.
Validate data against transformation rules and prepare it for Oracle Cloud loading.
Map data from source systems to Oracle Cloud tables, including complex logic and new field creation.
Support ETL validation and enrichment across mock loads, SIT, UAT, and production.
Provide structured Excel templates and repositories to business users.
Manually clean and remediate poor-quality data as needed.

Required Qualifications
Experience with Oracle R12 or other Oracle ERP systems.
Hands-on experience with Oracle Cloud.
Familiarity with PL/SQL, Oracle BI tools, or APEX.
Strong SQL skills for data mining and transformation.
Exposure to ETL tools (Informatica preferred) and data conversion projects.
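To illustrate the "validate data against transformation rules" step in plain terms, a small hedged Python sketch; the legacy columns, code mappings, and allowed values are hypothetical, not the project's actual mapping:

```python
# Illustrative only: checking extracted legacy rows against mapping rules
# before preparing them for an Oracle Cloud load.
legacy_rows = [
    {"ITEM_STATUS": "A", "UOM": "EA"},
    {"ITEM_STATUS": "X", "UOM": "EACH"},
]

status_map = {"A": "Active", "I": "Inactive"}   # legacy code -> target value
allowed_uom = {"EA", "CS", "KG"}                # allowed unit-of-measure codes

errors = []
for i, row in enumerate(legacy_rows, start=1):
    if row["ITEM_STATUS"] not in status_map:
        errors.append(f"row {i}: unmapped ITEM_STATUS '{row['ITEM_STATUS']}'")
    if row["UOM"] not in allowed_uom:
        errors.append(f"row {i}: invalid UOM '{row['UOM']}'")

print(errors or "all rows pass transformation rules")
```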

Posted 1 week ago

Apply

0 years

0 Lacs

India

On-site

Required Skills: Oracle Cloud, Oracle R12, SQL

Requirements:
• Familiarity with Oracle Product Data (PIM, PBH) and foundational data domains.
• Understanding of the Oracle Cloud ERP data model.
• Strong SQL skills for data mining across multiple sources.
• Excel proficiency – advanced formulas, VLOOKUPs, and working with large data sets.
• Detail-oriented, organized, and able to follow instructions precisely.
• Fluent in English, able to work during European hours.

Preferred:
• **MUST HAVE: Background in Oracle R12 or other Oracle ERP systems.
• **MUST HAVE: Experience in Oracle Cloud.
• Experience working with PL/SQL, Oracle BI tools, or APEX.
• Exposure to ETL tools (Informatica preferred) and data conversion projects.

Notes: All conversions will go through IDMC. Good communication skills are important; the role interacts directly with the business to understand requirements, discuss next steps, etc.

Overview
The team is supporting an initiative to replace legacy ERP systems (e.g., Oracle 11i) with Oracle Cloud. This role is critical to managing data conversion and transformation efforts across multiple global releases, including a major rollout in North America and Europe (Netherlands, Germany, France) by April 2026.

Key Responsibilities
Collaborate with consultants and business teams on data profiling, governance, and enrichment.
Extract data from 3–6 legacy databases using SQL.
Validate data against transformation rules and prepare it for Oracle Cloud loading.
Map data from source systems to Oracle Cloud tables, including complex logic and new field creation.
Support ETL validation and enrichment across mock loads, SIT, UAT, and production.
Provide structured Excel templates and repositories to business users.
Manually clean and remediate poor-quality data as needed.

Required Qualifications
Experience with Oracle R12 or other Oracle ERP systems.
Hands-on experience with Oracle Cloud.
Familiarity with PL/SQL, Oracle BI tools, or APEX.
Strong SQL skills for data mining and transformation.
Exposure to ETL tools (Informatica preferred) and data conversion projects.

EOE

Posted 1 week ago

Apply

7.0 years

0 Lacs

India

On-site

Job Description

Required Skills
7+ years of software development (Java and Python)
Prior project expertise in a data privacy platform integration/implementation (OneTrust, TrustArc, Osano, Captain Compliance, DataGrail, Securiti.ai)
Proficient in writing complex SQL queries
Expertise in deploying code to an enterprise cloud provider (AWS, Google, and/or Azure)
Expertise in working with multiple databases (relational and non-relational)
Extensive RESTful API development
English proficiency (written and oral)

Desired Skills
Prior experience integrating Securiti.ai for a client

Daily Role
You will be joining a Fortune 100 client, within their Data Governance/Stewardship organization. The client is currently implementing Securiti.ai to manage their data privacy at the global level. There are 6 modules to be implemented, followed by integration of various solutions. These two team members will be part of a larger team (which will expand in future months). As part of this team, you will assist in the development and implementation of Securiti.ai in the following scope: several business applications, platforms, and data lakes processing personal data across different client markets may need to be connected to Securiti.ai. Depending on the underlying technical architecture of each data source, integration will be done either via Informatica EDC (Enterprise Data Catalog) or Securiti SDI (Sensitive Data Intelligence), with the goal of automating personal data discovery. Upcoming integrations include connecting Securiti with Informatica Cloud, ServiceNow, Salesforce Consumer Data Source, and several local consumer data lakes. There are two immediate roles to be onboarded: Integration Lead and Technical Developer, both located in India.
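As a rough sketch of the RESTful integration work this role involves, the following shows the general shape of registering a data source with a privacy platform over HTTP using the requests library. The base URL, endpoint path, payload fields, and token are hypothetical placeholders, not the actual Securiti.ai API:

```python
# Illustrative only: generic REST call to register a data source with a
# privacy platform. Endpoint, fields, and token are hypothetical.
import requests

API_BASE = "https://privacy-platform.example.com/api/v1"   # placeholder URL
HEADERS = {"Authorization": "Bearer <token>", "Content-Type": "application/json"}

payload = {
    "system_name": "consumer-data-lake-emea",   # hypothetical data source
    "connection_type": "jdbc",
    "scan_schedule": "weekly",
}

resp = requests.post(f"{API_BASE}/data-sources", json=payload, headers=HEADERS, timeout=30)
resp.raise_for_status()
print(resp.json())
```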

Posted 1 week ago

Apply

10.0 years

0 Lacs

Pune, Maharashtra, India

Remote

About Fusemachines
Fusemachines is a 10+ year old AI company, dedicated to delivering state-of-the-art AI products and solutions to a diverse range of industries. Founded by Sameer Maskey, Ph.D., an Adjunct Associate Professor at Columbia University, our company is on a steadfast mission to democratize AI and harness the power of global AI talent from underserved communities. With a robust presence in four countries and a dedicated team of over 400 full-time employees, we are committed to fostering AI transformation journeys for businesses worldwide. At Fusemachines, we not only bridge the gap between AI advancement and its global impact but also strive to deliver the most advanced technology solutions to the world.

About the role:
This is a remote, full-time consulting position (contract) responsible for designing, building, and maintaining the infrastructure required for data integration, storage, processing, and analytics (BI, visualization and advanced analytics) to optimize digital channels and technology innovations, with the end goal of creating competitive advantages for the food services industry around the globe. We're looking for a solid lead engineer who brings fresh ideas from past experiences and is eager to tackle new challenges. We're in search of a candidate who is knowledgeable about and loves working with modern data integration frameworks, big data and cloud technologies. Candidates must also be proficient with data programming languages (Python and SQL), AWS cloud and the Snowflake Data Platform. The data engineer will build a variety of data pipelines and models to support advanced AI/ML analytics projects, with the intent of elevating the customer experience and driving revenue and profit growth globally.

Qualification & Experience:
- Must have a full-time Bachelor's degree in Computer Science or similar from an accredited institution
- At least 3 years of experience as a data engineer with strong expertise in Python, Snowflake, PySpark, and AWS
- Proven experience delivering large-scale projects and products for Data and Analytics, as a data engineer

Skill Set Requirement:
- Vast background in all things data-related
- 3+ years of real-world data engineering development experience in Snowflake and AWS (certifications preferred)
- Highly skilled in one or more programming languages, must have Python, and proficient in writing efficient and optimized code for data integration, storage, processing, manipulation and automation
- Strong experience in working with ELT and ETL tools and the ability to develop custom integration solutions as needed, from different sources such as APIs, databases, flat files, and event streaming; includes experience with modern ETL tools such as Informatica, Matillion, or DBT (Informatica CDI is a plus)
- Strong experience with scalable and distributed data technologies such as Spark/PySpark, DBT and Kafka, to handle large volumes of data
- Strong programming skills in SQL, with proficiency in writing efficient and optimized code for data integration, storage, processing, and manipulation
- Strong experience in designing and implementing Data Warehousing solutions in AWS with Snowflake
- Good understanding of Data Modelling and Database Design Principles, with the ability to design and implement efficient database schemas that meet the requirements of the data architecture to support data solutions
- Proven experience as a Snowflake Developer, with a strong understanding of Snowflake architecture and concepts
- Proficient in Snowflake services such as Snowpipe, stages, stored procedures, views, materialized views, tasks and streams
- Robust understanding of data partitioning and other optimization techniques in Snowflake
- Knowledge of data security measures in Snowflake, including role-based access control (RBAC) and data encryption
- Experience with Kafka, Pulsar, or other streaming technologies
- Experience orchestrating complex task flows across a variety of technologies, Apache Airflow preferred
- Expert in Cloud Computing in AWS, including deep knowledge of a variety of AWS services like Lambda, Kinesis, S3, Lake Formation, EC2, ECS/ECR, IAM, CloudWatch, EKS, API Gateway, etc.
- Good understanding of Data Quality and Governance, including implementation of data quality checks and monitoring processes to ensure that data is accurate, complete, and consistent
- Good problem-solving skills: able to troubleshoot data processing pipelines and identify performance bottlenecks and other issues

Responsibilities:
- Follow established designs and constructed data architectures
- Develop and maintain data pipelines (streaming and batch), ensuring data flows smoothly from source (point-of-sale, back of house, operational platforms and more of a Global Data Hub) to destination
- Handle ETL/ELT processes, including extraction, transformation, and loading of data from various sources into Snowflake, to enable best-in-class technology solutions
- Play a key role in the Data Operations team, developing data solutions responsible for driving growth
- Contribute to standardizing and developing a framework to extend these pipelines globally, across markets and business areas
- Develop on a data platform by building applications using a mix of open-source frameworks (PySpark, Kubernetes, Airflow, etc.) and best-in-breed SaaS tools (Informatica Cloud, Snowflake, Domo, etc.)
- Implement and manage production support processes around data lifecycle, data quality, coding utilities, storage, reporting and other data integration points
- Ensure the reliability, scalability, and efficiency of data systems are maintained at all times
- Assist in the configuration and management of Snowflake data warehousing and data lake solutions, working under the guidance of senior team members
- Work with cross-functional teams, including Product, Engineering, Data Science, and Analytics teams, to understand and fulfill data requirements
- Contribute to data quality assurance through validation checks and support data governance initiatives, including cataloging and lineage tracking
- Take ownership of the storage layer and SQL database management tasks, including schema design, indexing, and performance tuning
- Continuously evaluate and integrate new technologies to enhance data engineering capabilities, and actively participate in our Agile team meetings and improvement activities

Fusemachines is an Equal Opportunity employer, committed to diversity and inclusion. All qualified applicants will receive consideration for employment without regard to race, colour, religion, sex, sexual orientation, gender identity, national origin, disability, or any other characteristic protected by applicable federal, state, or local laws.
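As a minimal illustration of the Snowpipe/Streams/Tasks style of incremental processing mentioned above, here is a sketch using the snowflake-connector-python package to set up a stream on a raw table and a scheduled task that merges new rows into a curated layer. The warehouse, database, table, stream, and task names are invented for the example, and credentials are assumed to come from environment variables.

```python
import os
import snowflake.connector

# Connection details are placeholders for this sketch.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ETL_WH",
    database="SALES_DB",
    schema="RAW",
)

statements = [
    # Capture inserts on the raw point-of-sale table.
    "CREATE STREAM IF NOT EXISTS POS_ORDERS_STREAM ON TABLE POS_ORDERS",
    # A scheduled task that loads new rows into the curated layer
    # whenever the stream has data.
    """
    CREATE TASK IF NOT EXISTS LOAD_CURATED_ORDERS
      WAREHOUSE = ETL_WH
      SCHEDULE = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('POS_ORDERS_STREAM')
    AS
      INSERT INTO CURATED.ORDERS (ORDER_ID, STORE_ID, AMOUNT, ORDER_TS)
      SELECT ORDER_ID, STORE_ID, AMOUNT, ORDER_TS
      FROM POS_ORDERS_STREAM
      WHERE METADATA$ACTION = 'INSERT'
    """,
    # Tasks are created suspended; resume to start the schedule.
    "ALTER TASK LOAD_CURATED_ORDERS RESUME",
]

cur = conn.cursor()
try:
    for sql in statements:
        cur.execute(sql)
finally:
    cur.close()
    conn.close()
```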

Posted 1 week ago

Apply

5.0 - 10.0 years

20 - 35 Lacs

Pune, Bengaluru, India

Work from Office

Looking for an OBIA consultant with 5-10 years of experience in OBIEE, Oracle BI Apps implementation/customization, ETL (Informatica/ODI), DAC, PL/SQL, and data modeling, along with functional knowledge of EBS, Siebel, and PeopleSoft modules.

Posted 1 week ago

Apply

5.0 - 7.0 years

7 - 9 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

Requirement: Sound domain knowledge of FCCM (Financial Crime and Compliance Management): Anti-Money Laundering (AML), Customer Screening (CS), KYC, transaction monitoring and filtering, FATCA management, CTR, and STR. Experience as a Business Analyst in implementing the OFSAA FCCM suite: CS, KYC, AML, ECM, transaction filtering, FCCM analytics and studio.

Posted 1 week ago

Apply

7.0 - 12.0 years

22 - 37 Lacs

Bengaluru, Mumbai (All Areas)

Hybrid

Hiring: Data Engineering Senior Software Engineer / Tech Lead / Senior Tech Lead
Location: Mumbai & Bengaluru - Hybrid (3 days from office) | Shift: 2 PM - 11 PM IST | Experience: 5 to 12+ years (based on role & grade)

Open Grades/Roles:
- Senior Software Engineer: 5-8 years
- Tech Lead: 7-10 years
- Senior Tech Lead: 10-12+ years

Job Description - Data Engineering Team

Core Responsibilities (common to all levels):
- Design, build and optimize ETL/ELT pipelines using tools like Pentaho, Talend, or similar
- Work on traditional databases (PostgreSQL, MSSQL, Oracle) and MPP/modern systems (Vertica, Redshift, BigQuery, MongoDB)
- Collaborate cross-functionally with BI, Finance, Sales, and Marketing teams to define data needs
- Participate in data modeling (ER/DW/Star schema), data quality checks, and data integration
- Implement solutions involving messaging systems (Kafka), REST APIs, and scheduler tools (Airflow, Autosys, Control-M); see the orchestration sketch after this description
- Ensure code versioning and documentation standards are followed (Git/Bitbucket)

Additional Responsibilities by Grade:
- Senior Software Engineer (5-8 yrs): Focus on hands-on development of ETL pipelines, data models, and data inventory. Assist in architecture discussions and POCs. Good to have: Tableau/Cognos, Python/Perl scripting, GCP exposure.
- Tech Lead (7-10 yrs): Lead mid-sized data projects and small teams. Decide on ETL strategy (push down/push up) and performance tuning. Strong working knowledge of orchestration tools, resource management, and agile delivery.
- Senior Tech Lead (10-12+ yrs): Drive data architecture, infrastructure decisions, and internal framework enhancements. Oversee large-scale data ingestion, profiling, and reconciliation across systems. Mentor junior leads and own stakeholder delivery end-to-end. Advantageous: experience with AdTech/Marketing data and the Hadoop ecosystem (Hive, Spark, Sqoop).

Must-Have Skills (all levels):
- ETL Tools: Pentaho / Talend / SSIS / Informatica
- Databases: PostgreSQL, Oracle, MSSQL, Vertica / Redshift / BigQuery
- Orchestration: Airflow / Autosys / Control-M / JAMS
- Modeling: Dimensional Modeling, ER Diagrams
- Scripting: Python or Perl (preferred)
- Agile environment, Git-based version control
- Strong communication and documentation
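As a minimal illustration of the scheduler/orchestration work listed above, here is a sketch of an Apache Airflow DAG wiring an extract step to a load step. The DAG name, task callables, and source/target systems are invented for the example; a real pipeline would call Pentaho/Talend jobs or dedicated operators instead of placeholder functions.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders(**context):
    # Placeholder: pull the day's orders from the source system
    # (e.g. PostgreSQL) and stage them as a file or temp table.
    print("extracting orders for", context["ds"])


def load_orders(**context):
    # Placeholder: load the staged data into the warehouse
    # (e.g. Vertica/Redshift/BigQuery) and run data quality checks.
    print("loading orders for", context["ds"])


with DAG(
    dag_id="orders_daily_etl",          # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_orders", python_callable=load_orders)

    extract >> load                     # load runs only after extract succeeds
```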

Posted 1 week ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Bengaluru

Work from Office

About the Role:
We are seeking a skilled and detail-oriented Data Migration Specialist with hands-on experience in Alteryx and Snowflake. The ideal candidate will be responsible for analyzing existing Alteryx workflows, documenting the logic and data transformation steps, and converting them into optimized, scalable SQL queries and processes in Snowflake. The ideal candidate should have solid SQL expertise and a strong understanding of data warehousing concepts. This role plays a critical part in our cloud modernization and data platform transformation initiatives.

Key Responsibilities:
- Analyze and interpret complex Alteryx workflows to identify data sources, transformations, joins, filters, aggregations, and output steps.
- Document the logical flow of each Alteryx workflow, including inputs, business logic, and outputs.
- Translate Alteryx logic into equivalent SQL scripts optimized for Snowflake, ensuring accuracy and performance (see the translation sketch after this description).
- Write advanced SQL queries and stored procedures, and use Snowflake-specific features like Streams, Tasks, Cloning, Time Travel, and Zero-Copy Cloning.
- Implement data ingestion strategies using Snowpipe, stages, and external tables.
- Optimize Snowflake performance through query tuning, partitioning, clustering, and caching strategies.
- Collaborate with data analysts, engineers, and stakeholders to validate transformed logic against expected results.
- Handle data cleansing, enrichment, aggregation, and business logic implementation within Snowflake.
- Suggest improvements and automation opportunities during migration.
- Conduct unit testing and support UAT (User Acceptance Testing) for migrated workflows.
- Maintain version control, documentation, and an audit trail for all converted workflows.

Required Skills:
- Bachelor's or Master's degree in Computer Science, Information Technology, Data Science, or a related field.
- At least 4 years of hands-on experience in designing and developing scalable data solutions using the Snowflake Data Cloud platform.
- Extensive experience with Snowflake, including designing and implementing Snowflake-based solutions.
- 1+ years of experience with Alteryx Designer, including advanced workflow development and debugging.
- Strong proficiency in SQL, with 3+ years specifically working with Snowflake or other cloud data warehouses.
- Python programming experience focused on data engineering.
- Experience with data APIs and batch/stream processing.
- Solid understanding of data transformation logic such as joins, unions, filters, formulas, aggregations, pivots, and transpositions.
- Experience in performance tuning and optimization of SQL queries in Snowflake.
- Familiarity with Snowflake features like CTEs, Window Functions, Tasks, Streams, Stages, and External Tables.
- Exposure to migration or modernization projects from ETL tools (like Alteryx/Informatica) to SQL-based cloud platforms.
- Strong documentation skills and attention to detail.
- Experience working in Agile/Scrum development environments.
- Good communication and collaboration skills.
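To make the Alteryx-to-Snowflake translation concrete, here is a minimal sketch of how a workflow that filters, joins, and aggregates might be expressed as a single Snowflake SQL statement and executed from Python. The table and column names are invented for the example; the real mapping depends on the documented logic of each workflow.

```python
import os
import snowflake.connector

# Equivalent of a simple Alteryx workflow:
#   Input (ORDERS) -> Filter (STATUS = 'COMPLETE') -> Join (CUSTOMERS on CUSTOMER_ID)
#   -> Summarize (sum of AMOUNT per customer) -> Output (CUSTOMER_REVENUE)
TRANSLATED_SQL = """
CREATE OR REPLACE TABLE ANALYTICS.CUSTOMER_REVENUE AS
SELECT
    c.CUSTOMER_ID,
    c.CUSTOMER_NAME,
    SUM(o.AMOUNT) AS TOTAL_REVENUE
FROM RAW.ORDERS o
JOIN RAW.CUSTOMERS c
  ON o.CUSTOMER_ID = c.CUSTOMER_ID
WHERE o.STATUS = 'COMPLETE'               -- Alteryx Filter tool
GROUP BY c.CUSTOMER_ID, c.CUSTOMER_NAME   -- Alteryx Summarize tool
"""

# Connection details are placeholders for this sketch.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="MIGRATION_WH",
    database="SALES_DB",
)

with conn.cursor() as cur:
    cur.execute(TRANSLATED_SQL)
    # Basic validation: compare the row count against the original Alteryx output.
    cur.execute("SELECT COUNT(*) FROM ANALYTICS.CUSTOMER_REVENUE")
    print("rows loaded:", cur.fetchone()[0])

conn.close()
```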

Posted 1 week ago

Apply