
5099 Informatica Jobs - Page 35

Set up a Job Alert
JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

15.0 - 20.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Data Engineering
Good to have skills: Cloud Infrastructure
Minimum 7.5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: We are seeking a StreamSets SME with deep expertise in IBM StreamSets and working knowledge of cloud infrastructure to support ongoing client engagements. The ideal candidate should have a strong data engineering background with hands-on experience in metadata handling, pipeline management, and discovery dashboard interpretation. As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that project goals are met, facilitating discussions to address challenges, and guiding your team through the development process. You will also engage in strategic planning to align application development with organizational objectives, ensuring that all stakeholders are informed and involved in key decisions throughout the project lifecycle.

Key Responsibilities:
- Lead and manage the configuration, monitoring, and optimization of StreamSets pipelines.
- Work closely with client teams and internal stakeholders to interpret metadata extracts and validate pipeline inputs from StreamSets and Pulse Discovery Dashboards.
- Evaluate metadata samples provided by the client and identify gaps or additional requirements for full extraction.
- Coordinate with SMEs and client contacts to validate technical inputs and assessments.
- Support the preparation, analysis, and optimization of full metadata extracts for ongoing project phases.
- Collaborate with cloud infrastructure teams to ensure seamless deployment and monitoring of StreamSets on cloud platforms.
- Provide SME-level inputs and guidance during design sessions, catch-ups, and technical reviews.
- Ensure timely support during critical assessments and project milestones.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing and mentoring within the team to enhance overall performance.
- Monitor project progress and implement necessary adjustments to meet deadlines and quality standards.

Preferred Qualifications:
- Proven experience in StreamSets (IBM preferred) pipeline development and administration.
- Familiarity with Discovery Dashboard and metadata structures.
- Exposure to cloud platforms such as AWS, including infrastructure setup and integration with data pipelines.
- Strong communication skills for client interaction and stakeholder engagement.
- Ability to work independently in a fast-paced, client-facing environment.

Professional & Technical Skills:
- Must To Have Skills: SME-level proficiency in IBM StreamSets.
- Good To Have Skills: Experience with Cloud Infrastructure; proficiency in Data Engineering.
- Strong understanding of data modeling and ETL processes.
- Experience with big data technologies such as Hadoop or Spark.
- Familiarity with database management systems, including SQL and NoSQL databases.

Additional Information:
- The candidate should have minimum 7.5 years of experience in Data Engineering.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Posted 1 week ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Mumbai

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Oracle Business Intelligence Enterprise Edition (OBIEE) Plus
Good to have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are optimized for performance and usability. You will also engage in problem-solving activities, providing support and enhancements to existing applications while maintaining a focus on quality and efficiency in your work.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Oracle Business Intelligence Enterprise Edition (OBIEE) Plus.
- Strong understanding of data modeling and ETL processes.
- Experience with dashboard creation and report generation.
- Familiarity with SQL and database management.
- Ability to troubleshoot and optimize application performance.

Additional Information:
- The candidate should have minimum 5 years of experience in Oracle Business Intelligence Enterprise Edition (OBIEE) Plus.
- This position is based in Mumbai.
- A 15 years full time education is required.

Posted 1 week ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Noida

Work from Office

Project Role: Data Modeler
Project Role Description: Work with key business representatives, data owners, end users, application designers and data architects to model current and new data.
Must have skills: Data Modeling Techniques and Methodologies
Good to have skills: NA
Minimum 7.5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As a Data Modeler, you will engage with key business representatives, data owners, end users, application designers, and data architects to model both current and new data. Your typical day will involve collaborating with various stakeholders to understand their data needs, analyzing existing data structures, and designing effective data models that support business objectives. You will also be responsible for ensuring that the data models are aligned with best practices and industry standards, facilitating smooth data integration and accessibility across the organization. This role requires a proactive approach to problem-solving and a commitment to delivering high-quality data solutions that enhance decision-making processes.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate workshops and meetings to gather requirements and feedback from stakeholders.
- Develop and maintain comprehensive documentation of data models and architecture.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Data Modeling Techniques and Methodologies, mainly Data Vault 2.0 (see the illustrative sketch below).
- Good To Have Skills: Experience with data governance frameworks, Snowflake warehouse/AWS.
- Strong understanding of relational and non-relational database systems.
- Familiarity with data warehousing concepts and ETL processes.
- Experience in using data modeling tools such as Erwin or IBM InfoSphere Data Architect.

Additional Information:
- The candidate should have minimum 7.5 years of experience in Data Modeling Techniques and Methodologies.
- This position is based at any location.
- A 15 years full time education is required.
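For context on the Data Vault 2.0 approach this posting emphasizes, below is a minimal, hypothetical sketch of its core pattern — a hub keyed by a hashed business key plus a descriptive satellite — written in Python with the standard-library sqlite3 module. The table and column names are illustrative only and are not taken from the job description.

```python
# Illustrative only: a minimal Data Vault 2.0 pattern (one hub + one satellite)
# using SQLite. Table and column names are hypothetical.
import hashlib
import sqlite3
from datetime import datetime, timezone

def hash_key(business_key: str) -> str:
    """Data Vault 2.0 style hash key derived from the business key."""
    return hashlib.md5(business_key.strip().upper().encode("utf-8")).hexdigest()

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE hub_customer (
    customer_hk   TEXT PRIMARY KEY,   -- hash of the business key
    customer_id   TEXT NOT NULL,      -- business key from the source system
    load_date     TEXT NOT NULL,
    record_source TEXT NOT NULL
);
CREATE TABLE sat_customer_details (
    customer_hk   TEXT NOT NULL REFERENCES hub_customer(customer_hk),
    load_date     TEXT NOT NULL,
    name          TEXT,
    city          TEXT,
    PRIMARY KEY (customer_hk, load_date)
);
""")

now = datetime.now(timezone.utc).isoformat()
hk = hash_key("CUST-1001")
conn.execute("INSERT INTO hub_customer VALUES (?, ?, ?, ?)", (hk, "CUST-1001", now, "CRM"))
conn.execute("INSERT INTO sat_customer_details VALUES (?, ?, ?, ?)", (hk, now, "Acme Ltd", "Noida"))

for row in conn.execute("""
    SELECT h.customer_id, s.name, s.city
    FROM hub_customer h
    JOIN sat_customer_details s ON s.customer_hk = h.customer_hk
"""):
    print(row)
```

In a production warehouse the same shape would typically be built on Snowflake or a similar platform, with link tables relating hubs; the point here is only the hash-key/hub/satellite structure the methodology is named for.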

Posted 1 week ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Mumbai

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Microsoft SQL Server
Good to have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, addressing any challenges that arise, and providing guidance to team members to foster a productive work environment. You will also engage in strategic discussions to align project goals with organizational objectives, ensuring that the applications developed meet the needs of stakeholders effectively.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and implement necessary adjustments to meet deadlines.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Microsoft SQL Server.
- Strong understanding of database design and management.
- Experience with performance tuning and optimization of SQL queries.
- Familiarity with data integration and ETL processes.
- Ability to troubleshoot and resolve database-related issues.

Additional Information:
- The candidate should have minimum 5 years of experience in Microsoft SQL Server.
- This position is based in Mumbai.
- A 15 years full time education is required.

Posted 1 week ago

Apply

3.0 - 8.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 7.5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various stakeholders to gather requirements, overseeing the development process, and ensuring that the applications meet the specified needs. You will also be responsible for troubleshooting issues and providing guidance to team members, fostering a collaborative environment that encourages innovation and efficiency in application development.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute in providing solutions to work related problems.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Mentor junior team members to support their professional growth.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration and ETL processes.
- Experience with cloud computing platforms and services.
- Familiarity with data governance and compliance standards.
- Ability to work with large datasets and perform data analysis.

Additional Information:
- The candidate should have minimum 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Posted 1 week ago

Apply

13.0 - 23.0 years

50 - 55 Lacs

Hyderabad

Work from Office

Role: Snowflake Practice Lead / Architect / Solution Architect
Experience: 13+ years
Work Location: Hyderabad

Position Overview: We are seeking a highly skilled and experienced Snowflake Practice Lead to drive our data strategy, architecture, and implementation using Snowflake. This leadership role requires a deep understanding of Snowflake's cloud data platform, data engineering best practices, and enterprise data management. The ideal candidate will be responsible for defining best practices, leading a team of Snowflake professionals, and driving successful Snowflake implementations for clients.

Key Responsibilities:

Leadership & Strategy:
- Define and drive the Snowflake practice strategy, roadmap, and best practices.
- Act as the primary subject matter expert (SME) for Snowflake architecture, implementation, and optimization.
- Collaborate with stakeholders to understand business needs and align data strategies accordingly.

Technical Expertise & Solutioning:
- Design and implement scalable, high-performance data architectures using Snowflake.
- Develop best practices for data ingestion, transformation, modeling, and security within Snowflake.
- Guide clients on Snowflake migrations, ensuring a seamless transition from legacy systems.
- Optimize query performance, storage utilization, and cost efficiency in Snowflake environments.

Team Leadership & Mentorship:
- Lead and mentor a team of Snowflake developers, data engineers, and architects.
- Provide technical guidance, conduct code reviews, and establish best practices for Snowflake development.
- Train internal teams and clients on Snowflake capabilities, features, and emerging trends.

Client & Project Management:
- Engage with clients to understand business needs and design tailored Snowflake solutions.
- Lead end-to-end Snowflake implementation projects, ensuring quality and timely delivery.
- Work closely with data scientists, analysts, and business stakeholders to maximize data utilization.

Required Skills & Experience:
- 10+ years of experience in data engineering, data architecture, or cloud data platforms.
- 5+ years of hands-on experience with Snowflake in large-scale enterprise environments.
- Strong expertise in SQL, performance tuning, and cloud-based data solutions.
- Experience with ETL/ELT processes, data pipelines, and data integration tools (e.g., Talend, Matillion, dbt, Informatica).
- Proficiency in cloud platforms such as AWS, Azure, or GCP, particularly their integration with Snowflake.
- Knowledge of data security, governance, and compliance best practices.
- Strong leadership, communication, and client-facing skills.
- Experience in migrating from traditional data warehouses (Oracle, Teradata, SQL Server) to Snowflake.
- Familiarity with Python, Spark, or other big data technologies is a plus.

Preferred Qualifications:
- Snowflake SnowPro certification (e.g., SnowPro Core, Advanced Architect, Data Engineer).
- Experience in building data lakes, data marts, and real-time analytics solutions.
- Hands-on experience with DevOps, CI/CD pipelines, and Infrastructure as Code (IaC) in Snowflake environments.

Why Join Us?
- Opportunity to lead cutting-edge Snowflake implementations in a dynamic, fast-growing environment.
- Work with top-tier clients across industries, solving complex data challenges.
- Continuous learning and growth opportunities in cloud data technologies.
- Competitive compensation, benefits, and a collaborative work culture.

Posted 1 week ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Ab Initio
Good to have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions, and ensuring that applications function seamlessly to support business operations. You will engage in problem-solving activities, contribute to key decisions, and manage the development process to deliver high-quality applications that align with organizational goals.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing and mentoring within the team to enhance overall performance.
- Monitor project progress and ensure timely delivery of application development milestones.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Ab Initio.
- Strong understanding of data integration and ETL processes.
- Experience with data modeling and database design.
- Familiarity with performance tuning and optimization techniques.
- Ability to troubleshoot and resolve application issues effectively.

Additional Information:
- The candidate should have minimum 5 years of experience in Ab Initio.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Posted 1 week ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Talend ETL
Good to have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application requirements are met, overseeing the development process, and providing guidance to team members. You will also engage in problem-solving activities, ensuring that the applications are aligned with business objectives and user needs, while maintaining a focus on quality and efficiency throughout the project lifecycle.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of milestones.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Talend ETL.
- Strong understanding of data integration processes and methodologies.
- Experience with data warehousing concepts and practices.
- Familiarity with SQL and database management systems.
- Ability to troubleshoot and resolve technical issues efficiently.

Additional Information:
- The candidate should have minimum 5 years of experience in Talend ETL.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Posted 1 week ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Mumbai

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Informatica PowerCenter
Good to have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities and contribute to the overall success of projects by leveraging your expertise in application development.

Roles & Responsibilities:
- Expected to be an SME; collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Informatica PowerCenter.
- Strong understanding of ETL processes and data integration techniques.
- Experience with database management systems and SQL.
- Familiarity with application development methodologies and best practices.
- Ability to troubleshoot and resolve application issues efficiently.

Additional Information:
- The candidate should have minimum 5 years of experience in Informatica PowerCenter.
- This position is based at our Mumbai office.
- A 15 years full time education is required.

Posted 1 week ago

Apply

3.0 - 8.0 years

10 - 14 Lacs

Pune

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various stakeholders to gather requirements, overseeing the development process, and ensuring that the applications meet the specified needs. You will also engage in problem-solving discussions with your team, providing guidance and support to ensure successful project outcomes. Your role will require you to stay updated with the latest technologies and methodologies to enhance application performance and user experience.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute in providing solutions to work related problems.
- Facilitate communication between technical teams and stakeholders to ensure alignment on project goals.
- Mentor junior team members, providing them with guidance and support in their professional development.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration and ETL processes.
- Experience with cloud computing platforms and services.
- Familiarity with data governance and compliance standards.
- Ability to analyze and optimize application performance.

Additional Information:
- The candidate should have minimum 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Pune office.
- A 15 years full time education is required.

Posted 1 week ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Navi Mumbai

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Google BigQuery
Good to have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, contribute to key decisions, and manage the development process to deliver high-quality applications that enhance operational efficiency and user experience.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Google BigQuery.
- Strong understanding of data warehousing concepts and ETL processes.
- Experience with SQL and database management.
- Familiarity with cloud computing platforms and services.
- Ability to analyze and optimize query performance (see the illustrative query sketch below).

Additional Information:
- The candidate should have minimum 5 years of experience in Google BigQuery.
- This position is based in Mumbai.
- A 15 years full time education is required.
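As a rough illustration of the query-performance awareness this posting asks for, here is a hedged sketch using the google-cloud-bigquery Python client. The project, dataset, table, and column names are placeholders, and running it would require the client library and valid credentials; the idea shown (filtering on a date column, ideally the table's partition column, with query parameters so less data is scanned) is a common first step in BigQuery cost and performance tuning.

```python
# Hypothetical example: run a parameterized, date-filtered BigQuery query.
# "example-project", the dataset/table, and column names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

sql = """
    SELECT order_date, SUM(amount) AS total_amount
    FROM `example-project.sales.orders`
    WHERE order_date BETWEEN @start AND @end   -- restricts the data scanned
    GROUP BY order_date
    ORDER BY order_date
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("start", "DATE", "2024-01-01"),
        bigquery.ScalarQueryParameter("end", "DATE", "2024-01-31"),
    ]
)

for row in client.query(sql, job_config=job_config).result():
    print(row.order_date, row.total_amount)
```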

Posted 1 week ago

Apply

6.0 years

0 Lacs

India

On-site

Job Description: We are seeking a highly skilled OBIEE Developer with 6+ years of experience and strong experience in Informatica (on-premise version) to support a post-merger integration. The ideal candidate will be responsible for resolving issues resulting from a poorly executed Informatica implementation by a previous organization. This role will focus on data cleanup, dashboard/report accuracy, and system performance.

Key Responsibilities:
- Analyze and clean up OBIEE reports and dashboards affected by data integration issues.
- Troubleshoot and resolve data inconsistencies and transformation problems within Informatica.
- Collaborate with business users to identify and correct data discrepancies.
- Optimize existing OBIEE reports for better performance and usability.
- Work closely with data warehouse and ETL teams to ensure proper data flow and transformation.
- Create and maintain documentation on fixes, enhancements, and configuration changes.
- Provide technical expertise and support during UAT and production rollout.

Requirements:
- 6 to 9 years of experience in OBIEE development and administration.
- Strong hands-on experience with Informatica PowerCenter (on-premises).
- Expertise in BI reporting, RPD development, and performance tuning in OBIEE.
- Experience resolving data integrity issues in post-merger environments.
- Strong SQL and data modeling skills.
- Excellent communication and problem-solving abilities.

Posted 1 week ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Mumbai

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Microsoft Business Intelligence (BI) Testing
Good to have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application requirements are met, overseeing the development process, and providing guidance to team members. You will also engage in problem-solving activities, ensuring that the applications are delivered on time and meet the quality standards expected by stakeholders. Your role will require you to balance technical expertise with effective communication, ensuring that all team members are aligned with project goals and objectives.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and implement necessary adjustments to meet deadlines.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Microsoft Business Intelligence (BI) Testing.
- Strong analytical skills to assess application performance and identify areas for improvement.
- Experience with data integration and ETL processes.
- Familiarity with reporting tools and dashboard creation.
- Ability to work collaboratively in a team-oriented environment.

Additional Information:
- The candidate should have minimum 5 years of experience in Microsoft Business Intelligence (BI) Testing.
- This position is based in Mumbai.
- A 15 years full time education is required.

Posted 1 week ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Project Role: Software Development Engineer
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must have skills: Varicent
Good to have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As a Software Development Engineer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code across multiple clients. Your typical day will involve collaborating with team members to ensure the successful implementation of software solutions, performing maintenance and enhancements, and contributing to the overall development process. You will be responsible for delivering high-quality code while adhering to best practices and project timelines, ensuring that the applications meet the needs of the clients effectively.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Continuously evaluate and improve development processes to increase efficiency.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Varicent.
- Strong understanding of software development life cycle methodologies.
- Experience with application performance tuning and optimization.
- Familiarity with version control systems such as Git.
- Ability to troubleshoot and resolve software issues effectively.

Additional Information:
- The candidate should have minimum 5 years of experience in Varicent.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Posted 1 week ago

Apply

1.0 - 7.0 years

7 - 8 Lacs

Hyderabad

Work from Office

Req ID: 328472

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking an Informatica Developer to join our team in Hyderabad, Telangana (IN-TG), India (IN).

Responsibilities:
- Work as an ETL developer using Oracle SQL, PL/SQL, Informatica, Linux scripting, and Tidal: ETL code development, unit testing, source code control, technical specification writing, and production implementation.
- Develop and deliver ETL applications using Informatica, Teradata, Oracle, Tidal and Linux.
- Interact with BAs and other teams for requirement clarification, query resolution, testing and sign-offs.
- Develop software that conforms to a design model within its constraints.
- Prepare documentation for design, source code, and unit test plans.
- Ability to work as part of a global development team.
- Should have good knowledge of the healthcare domain and Data Warehousing concepts.

Posted 1 week ago

Apply

10.0 - 12.0 years

17 - 18 Lacs

Hyderabad

Work from Office

Req ID: 333186

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Senior Data Engineer to join our team in Hyderabad, Telangana (IN-TG), India (IN).

Job Duties:
- Recent experience and clear responsibilities in an ETL / Extracts / Data Engineering developer capacity.
- Experience with Informatica PowerCenter development.
- High level of proficiency with the Unix (AIX and Linux preferred) command line.
- High level of comfort with Oracle SQL.

Minimum Skills Required:
- Experience with Informatica PowerCenter development.
- Good understanding of the Unix (Linux preferred) command line.
- High level of comfort with Oracle SQL.

Posted 1 week ago

Apply

15.0 - 20.0 years

11 - 14 Lacs

Bengaluru

Work from Office

Job Title: Lead Data/AI Engineering

Job Summary: The Lead Data Engineer is responsible for architecting, developing, and optimizing robust data solutions that support the unique analytical and reporting needs of telecom finance. This role leads a team in building scalable data pipelines, integrating diverse financial and operational data sources, and implementing advanced analytics and AI models to drive business insights and efficiency. Collaborating closely with finance, IT, and business stakeholders, the Lead Data Engineer ensures data quality, governance, and compliance while leveraging cutting-edge technologies in data warehousing, cloud platforms, and automation. The ideal candidate brings deep expertise in both telecom and financial data domains, excels at solving complex business challenges, and mentors the team to deliver data-driven solutions that enable strategic decision-making and financial growth.

Roles and Responsibilities:
- Design, Development, Testing, and Deployment: Drive the development of scalable Data & AI warehouse applications by leveraging software engineering best practices such as automation, version control, and CI/CD. Implement comprehensive testing strategies to ensure reliability and optimal performance. Manage deployment processes end-to-end, effectively addressing configuration, environment, and security concerns.
- Engineering and Analytics: Transform Data Warehouse and AI use case requirements into robust data models and efficient pipelines, ensuring data integrity by applying statistical quality controls and advanced AI methodologies.
- API & Microservice Development: Design and build secure, scalable APIs and microservices for seamless data integration across warehouse platforms, ensuring usability, strong security, and adherence to best practices.
- Platform Scalability & Optimization: Assess and select the most suitable technologies for cloud and on-premises data warehouse deployments, implementing strategies to ensure scalability, robust performance monitoring, and cost-effective operations.
- Leadership: Lead the execution of complex data engineering and AI projects to solve critical business problems and deliver impactful results.
- Technologies: Leverage deep expertise in Data & AI technologies (such as Spark, Kafka, Databricks, and Snowflake), programming (including Java, Scala, Python, and SQL), API integration patterns (like HTTP/REST and GraphQL), and leading cloud platforms (Azure, AWS, GCP) to design and deliver data warehousing solutions.
- Financial Data Integration: Oversee the integration of highly granular financial, billing, and operational data from multiple telecom systems (e.g., OSS/BSS, ERP, CRM) to ensure a single source of truth for finance analytics.
- Regulatory & Compliance Reporting: Ensure that data solutions support strict telecom and financial regulatory requirements (e.g., SOX, GAAP, IFRS, FCC mandates), including automated audit trails and comprehensive reporting.
- Revenue Assurance & Fraud Detection: Develop and maintain data pipelines and analytical models for revenue assurance, leakage detection, and fraud monitoring specific to telecom finance operations.
- Real-Time & Near Real-Time Analytics: Architect solutions that deliver real-time or near real-time insights for finance operations, supporting decision-making in fast-paced telecom environments.
- Cost Allocation & Profitability Analysis: Implement advanced data models and processes for cost allocation, margin analysis, and profitability reporting across telecom products, services, and regions.
- Intercompany & Affiliate Billing: Design and manage data flows for complex intercompany transactions and affiliate billing, ensuring financial accuracy across business units and partnerships.

Shift timing: 12:30 PM to 9:30 PM IST

Location / Additional Location: Bangalore, Hyderabad

Overall Experience:
- Typically requires a minimum of 15 years of progressive experience in data engineering, data architecture, or related fields.
- At least 3-5 years of hands-on experience in the telecom or finance domain, preferably integrating and managing large-scale financial and operational data systems.
- Demonstrated experience leading complex data projects, managing teams, and delivering end-to-end data solutions in large or matrixed organizations is highly valued.
- Experience with cloud data platforms, big data technologies, and implementing best practices in data governance and DevOps is strongly preferred.

Primary / Mandatory Skills:
- Delivery: Proven experience in managing and delivering complex data engineering and AI solutions for major business challenges.
- Telecom Data Domain Expertise: Deep understanding of telecom data structures, including OSS/BSS, CDRs, billing, customer, and product hierarchies.
- Financial Data Modeling: Experience designing data models for financial reporting, revenue recognition, cost allocation, and profitability analysis specific to telecom finance.
- Regulatory Compliance Knowledge: Familiarity with telecom and finance regulatory frameworks (e.g., SOX, IFRS, FCC reporting) and ability to implement compliant data solutions.
- Data Reconciliation & Audit: Strong skills in building automated data reconciliation, validation, and audit trails to ensure financial integrity.
- Data Architecture & Modeling: Expertise in designing scalable, high-performance data architectures (e.g., data warehouses, data lakes, data marts) and creating robust data models.
- ETL/ELT Development: Advanced skills in building, optimizing, and maintaining data pipelines using modern ETL/ELT tools (e.g., Informatica, Talend, dbt, Azure Data Factory).
- Cloud Platforms: Proficiency with cloud data services and platforms such as AWS, Azure, or Google Cloud (e.g., Redshift, Snowflake, Databricks, BigQuery).
- Programming Languages: Strong coding ability in SQL and at least one general-purpose language (e.g., Python, Scala, Java).
- Big Data Technologies: Experience with distributed data processing frameworks (e.g., Spark, Hadoop) and real-time streaming tools (e.g., Kafka).
- Data Governance & Quality: Knowledge of data governance practices, data lineage, data cataloging, and implementing data quality checks.
- CI/CD & Automation: Experience in automating data workflows, version control (e.g., Git), and deploying CI/CD pipelines for data applications.
- Analytics & AI/ML Integration: Ability to support advanced analytics and integrate machine learning pipelines with core data platforms.
- Leadership & Collaboration: Proven track record in leading teams, mentoring engineers, and collaborating with business, analytics, and IT stakeholders.
- Problem Solving & Communication: Strong analytical, troubleshooting, and communication skills to translate business needs into technical solutions.

Secondary / Desired Skills:
- Data Visualization: Experience with BI tools such as Power BI, Tableau, or Looker for creating dashboards and visual analytics.
- AI/ML Model Operationalization: Familiarity with deploying, monitoring, and scaling machine learning models in production environments (MLOps).
- API & Microservices Development: Understanding of building and consuming RESTful APIs and microservices for data integration.
- Data Security & Privacy: Knowledge of data encryption, access controls, and compliance with data privacy regulations (GDPR, CCPA, SOX).
- Data Catalogs & Metadata Management: Experience with tools like Alation, Collibra, or Azure Purview for cataloging and managing metadata.
- Workflow Orchestration: Hands-on with workflow tools (e.g., Apache Airflow, Control-M, Prefect) for scheduling and monitoring data jobs.
- Performance Tuning: Skills in optimizing queries, storage, and processing for cost and speed.
- Change/Data Release Management: Experience in managing data schema evolution, versioning, and deployment coordination.
- GitHub & Copilot Proficiency: Proficient in using GitHub for version control, collaboration, and CI/CD pipelines; experience leveraging GitHub Copilot to enhance coding efficiency and foster team productivity.
- DevOps for Data: Exposure to infrastructure-as-code (Terraform, CloudFormation) and containerization (Docker, Kubernetes) for data workloads.
- Domain Knowledge: Understanding of Finance, Telecom, Retail, or the relevant business domain to better align data solutions with business needs.
- Project Management: Familiarity with Agile, Scrum, or Kanban methodologies for managing data projects.
- Stakeholder Management: Ability to effectively engage with non-technical users, translate requirements, and manage expectations.

Additional information:
- Leadership & Mentorship: Expected to mentor and develop junior engineers, foster a culture of knowledge sharing, and lead by example in adopting best practices.
- Cross-Functional Collaboration: Will work closely with data scientists, business analysts, product managers, and IT teams to deliver end-to-end solutions that meet business needs.
- Innovation & Continuous Improvement: Encouraged to stay current with emerging technologies and trends in data engineering, AI/ML, and cloud platforms, and to proactively recommend and implement improvements.
- Ownership & Accountability: Responsible for the entire data engineering lifecycle, including architecture, implementation, monitoring, and optimization.
- Communication Skills: Must be able to translate complex technical concepts into clear, actionable insights for non-technical stakeholders and leadership.
- Change Management: Experience managing change in fast-paced environments and guiding teams through technology transformations is highly valued.
- Quality & Compliance Focus: Commitment to data quality, security, and compliance is essential, with experience in implementing and maintaining controls and standards.
- Business Impact: Expected to contribute to measurable business outcomes by enabling data-driven decision-making and supporting organizational goals.

Education Qualification: A Bachelor's degree in Computer Science, Information Technology, or a related field is required; a Master's degree in Data Science, Computer Science, Engineering, or a related discipline is preferred.

Certifications (if any specific):
- Cloud Platform Certifications (AWS, Azure, GCP)
- Data Engineering & Big Data (Databricks, CCDP)
- Database & Data Warehousing (SnowPro, GCP)
- General Data & AI (CDMP, AI/ML integration, Microsoft)
- DevOps & Automation (GitHub, GitLab CI/CD)
- Relevant certifications in financial data analytics or telecom data management

Location: IND:KA:Bengaluru / Innovator Building, ITPB, Whitefield Rd - Adm: Intl Tech Park, Innovator Bldg
Job ID: R-73693
Date posted: 07/14/2025

Posted 1 week ago

Apply

1.0 - 6.0 years

50 - 80 Lacs

Hyderabad

Work from Office

PRISM's (Profitability Insights Manager) vision is to be the source of truth for AWS profitability and provide Finance and the business with insights that help guide investment decisions and optimize profitability. PRISM generates profit and loss statements across multiple intersections such as customer, contract, partner, service and sales. As a data engineer in PRISM, you will be responsible for integrating finance data from upwards of 15+ systems at scale. We run complex allocation logic for our customers to provide a profitability view (a simplified illustration of this kind of allocation appears in the sketch below). We also compare the modelling with actuals and provide insights through AI. The team provides ample opportunity to innovate and solve ambiguous problems at scale.

Key responsibilities:
1. Deliver features in the data pipelines to meet customer needs on profitability.
2. Generate actionable insights from the data so that Finance customers can do differentiated work.
3. Scale the data pipeline for the ever-increasing data volume.
4. Learn and leverage the latest data analytical tools.
5. Work on the end-to-end experience, from sourcing the data and defining the data model/architecture to implementing a reporting layer and generating analytics and insights.

A day in the life:
1. Collaborate with stakeholders across customer finance, sales finance, service finance, and deal modelling teams.
2. Constantly strive to improve the efficiency of the data pipeline, e.g., reduce the time taken to load data, define a data model that simplifies visualization, and adhere to data governance frameworks.

About the team: PRISM (Profitability Insights Manager) is the profit and loss (P&L) statement solution for AWS. The web-based interface currently enables AWS Finance users to quickly generate customer and sales P&Ls. In the future, users will also be able to generate P&Ls across product and infrastructure region dimensions, delivering a detailed historical P&L for any customer, territory, or product intersection. PRISM is important because it provides AWS Finance with visibility into the financial performance of all customers, whether they're on public or private pricing. Private pricing means offering financial incentives to specific customers, and has become an important selling lever for AWS. A granular, period-over-period P&L gives analysts and leaders the data they need to drive insights that impact the business.

Qualifications:
- 1+ years of data engineering experience.
- Experience with data modeling, warehousing and building ETL pipelines.
- Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala).
- Experience with one or more scripting languages (e.g., Python, KornShell).
- Experience with big data technologies such as Hadoop, Hive, Spark, EMR.
- Experience with any ETL tool, such as Informatica, ODI, SSIS, BODI, Datastage, etc.
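The posting does not describe PRISM's actual allocation logic; the following is only a toy Python sketch of proportional cost allocation, the general kind of step a profitability pipeline runs when spreading a shared cost across customers by a usage driver. All figures and names are invented.

```python
# Toy illustration: allocate a shared cost to customers in proportion to usage.
# Not PRISM's real logic; amounts, keys, and the usage driver are hypothetical.
from decimal import Decimal

shared_cost = Decimal("10000.00")     # e.g., a shared infrastructure cost to allocate

usage_by_customer = {                 # hypothetical usage driver (e.g., compute hours)
    "customer_a": Decimal("120"),
    "customer_b": Decimal("60"),
    "customer_c": Decimal("20"),
}

total_usage = sum(usage_by_customer.values())
allocated = {
    customer: (shared_cost * usage / total_usage).quantize(Decimal("0.01"))
    for customer, usage in usage_by_customer.items()
}

for customer, amount in allocated.items():
    print(f"{customer}: {amount}")
# customer_a: 6000.00, customer_b: 3000.00, customer_c: 1000.00
```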

Posted 1 week ago

Apply

6.0 - 8.0 years

8 - 9 Lacs

Bengaluru

Work from Office

We enable #HumanFirstDigital

Required Skills & Qualifications:
- Minimum 6-8 years of hands-on experience in ETL development, data warehousing, and business intelligence.
- Extensive hands-on experience with Informatica PowerCenter (versions 9.x/10.x), including Designer, Workflow Manager, and Workflow Monitor.
- Extensive expertise in Oracle SQL and PL/SQL, including advanced concepts (e.g., analytical functions, complex joins, dynamic SQL, exception handling); a small window-function example in this spirit appears below.
- Proven experience with Oracle Database 11g, 12c, 19c or higher.
- Strong understanding of ETL methodologies, data warehousing principles, and dimensional modeling (Star Schema, Snowflake Schema).
- Experience with performance tuning of Informatica PowerCenter mappings/workflows and large-scale Oracle databases.
- Proficiency in shell scripting (e.g., Bash, Korn Shell) for automation of ETL jobs and pre/post-session commands.
- Familiarity with version control systems (e.g., Git).
- Excellent problem-solving, analytical, and debugging skills.
- Strong communication (verbal and written) and interpersonal skills.
- Ability to work independently and as part of a collaborative team in a fast-paced environment.
- Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field.

Preferred Skills:
- Experience with Oracle PL/SQL ETL tools (e.g., Informatica).
- Familiarity with Agile development methodologies.
- Proficiency in Git for source code control, including code migration and deployment workflows.
- Demonstrated ability to write and optimize complex SQL queries for large datasets.
- Design, develop, test, and maintain robust and highly efficient Oracle PL/SQL stored procedures, functions, packages, and database triggers.
- Implement complex business logic, data transformations, and automated tasks using PL/SQL to support ETL processes and application requirements.

Our Commitment to Diversity & Inclusion:

Our Perks and Benefits: Our benefits and rewards program has been thoughtfully designed to recognize your skills and contributions, elevate your learning/upskilling experience and provide care and support for you and your loved ones. As an Apexon Associate, you get continuous skill-based development, opportunities for career advancement, and access to comprehensive health and well-being benefits and assistance. We also offer:
- Group Health Insurance covering a family of 4
- Term Insurance and Accident Insurance
- Paid Holidays & Earned Leaves
- Paid Parental Leave
- Learning & Career Development
- Employee Wellness

Job Location: Bengaluru, India
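To make the "analytical functions" requirement concrete, here is a small, hedged example of window functions (ranking and a partitioned average), shown with Python's built-in sqlite3 module; SQLite's window-function syntax (available in SQLite 3.25+) closely mirrors Oracle's analytical functions, and the table and data below are made up.

```python
# Illustrative window-function ("analytical function") query; sample data only.
# Requires a Python build bundling SQLite 3.25+ for window-function support.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE emp (name TEXT, dept TEXT, salary INTEGER);
INSERT INTO emp VALUES
  ('Asha',   'ETL', 90),
  ('Bala',   'ETL', 80),
  ('Chitra', 'BI',  85),
  ('Dev',    'BI',  70);
""")

query = """
SELECT name, dept, salary,
       RANK() OVER (PARTITION BY dept ORDER BY salary DESC) AS dept_rank,
       AVG(salary) OVER (PARTITION BY dept)                 AS dept_avg
FROM emp
ORDER BY dept, dept_rank
"""
for row in conn.execute(query):
    print(row)
```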

Posted 1 week ago

Apply

7.0 - 12.0 years

13 - 18 Lacs

Bengaluru

Work from Office

Roles and Responsibilities:
- This is a strong technology and solution delivery role, accountable for the successful design, development, and delivery of ETL solutions on the corporate Data Platform.
- Responsible for the delivery of the development activities he/she owns, including ETL standards, patterns and best practices; work with the team to ensure standards and patterns are followed.
- Determine organizational strategies for data integrity validation processes.
- Establish policies and best practices for optimizing ETL data throughput/accessibility.
- Identify opportunities for new initiatives; make recommendations on increasing the scalability and robustness of ETL platforms and solutions.
- Partner with the data modelers to drive improvements and enhancements to the current data landscape and future strategy.
- Remain current on new ETL techniques and methodologies and communicate trends and opportunities to management and other developers as needed; identify opportunities to use those technologies to enhance current or anticipated information systems and business goals.
- Evaluate existing applications that could address client requirements and make recommendations for complex projects; identify opportunities for solution sharing and reuse.
- Assist in the development of ETL-related Service Level Agreements; communicate risks and ensure understanding of these risks.

Qualifications:

Education: B.E / B.Tech / Masters in Computer Science, Electronics, or a relevant technical certification.

Technology Skills and Project Experience:
- 7 years of experience in modeling and business system designs.
- 5 years of hands-on experience in SQL and Informatica ETL development is a must.
- 3 years of Redshift or Oracle (or comparable database) experience with BI/DW deployments.
- Must have proven experience with STAR and SNOWFLAKE schema techniques.
- A minimum of 1 year of development experience in Python scripting is mandatory; Unix scripting is an added advantage.
- Proven track record as an ETL developer in delivering successful business intelligence developments with complex data sources.
- Strong analytical skills and enjoys solving complex technical problems.

Business / Soft Skills:
- Must have solid presentation, communication (on complex technical solutions) and interpersonal skills.
- Ability to work effectively with globally dispersed business stakeholders.
- Ability to manage multiple priorities in a fast-paced environment.
- A data-driven mindset; ability to clearly communicate complex business problems and technical solutions.
- Ability to manage and make decisions about competing priorities and resources.
- Ability to delegate where appropriate.
- Must be a strong team player.

Posted 1 week ago

Apply

3.0 - 8.0 years

4 - 7 Lacs

Hyderabad

Work from Office

Required Skills:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- Minimum 3 years of hands-on experience with SnapLogic or similar iPaaS tools (e.g., MuleSoft, Dell Boomi, Informatica).
- Strong integration skills with various databases (SQL, NoSQL) and enterprise systems.
- Proficiency in working with REST/SOAP APIs, JSON, XML, and data transformation techniques (a simplified REST/JSON transformation sketch appears below).
- Experience with cloud platforms (AWS, Azure, or GCP).
- Solid understanding of data flow, ETL/ELT processes, and integration patterns.
- Excellent analytical, problem-solving, and communication skills.
- Exposure to DevOps tools and CI/CD pipelines.
- Experience integrating with enterprise platforms (e.g., Salesforce).

Key Responsibilities:
- Design, develop, and maintain scalable integration pipelines using SnapLogic.
- Integrate diverse systems including relational databases, cloud platforms, SaaS applications, and on-premise systems.
- Collaborate with cross-functional teams to gather requirements and deliver robust integration solutions.
- Monitor, troubleshoot, and optimize SnapLogic pipelines for performance and reliability.
- Ensure data consistency, quality, and security across integrated systems.
- Maintain technical documentation and follow best practices in integration development.
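As a simplified, hypothetical illustration of the REST/JSON data-transformation pattern the role describes (SnapLogic itself is a low-code platform, so this is not SnapLogic code), here is a short Python sketch using the requests library and the public httpbin.org echo service as a stand-in endpoint; the field names and target structure are invented.

```python
# Hypothetical REST + JSON transformation sketch; requires the `requests` package
# and network access. httpbin.org simply echoes the posted JSON back.
import requests

source_record = {"customer_id": "C-42", "first_name": "Asha", "last_name": "Rao", "amount": "1250.50"}

# Call a REST endpoint (httpbin returns the posted payload under the "json" key).
response = requests.post("https://httpbin.org/post", json=source_record, timeout=10)
response.raise_for_status()
payload = response.json()["json"]

# Transform the source shape into the shape a downstream system expects.
target_record = {
    "id": payload["customer_id"],
    "full_name": f'{payload["first_name"]} {payload["last_name"]}',
    "amount": float(payload["amount"]),
}
print(target_record)
```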

Posted 1 week ago

Apply

12.0 - 15.0 years

13 - 18 Lacs

Gurugram

Work from Office

Project Role: Data Architect
Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage and integration.
Must have skills: SAP Data Services Development
Good to have skills: NA
Minimum 12 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As a Data Architect, you will define the data requirements and structure for the application. Your typical day will involve modeling and designing the application data structure, storage, and integration, ensuring that the data architecture aligns with the overall business objectives and technical specifications. You will collaborate with various stakeholders to gather requirements and translate them into effective data solutions, while also addressing any challenges that arise during the design and implementation phases. Your role will be pivotal in ensuring that the data architecture is robust, scalable, and capable of supporting future growth and innovation within the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate knowledge sharing and best practices among team members.
- Monitor and evaluate the effectiveness of data solutions and make necessary adjustments.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in SAP Data Migration.
- Experience with SAP Data & Development.
- Strong understanding of data modeling techniques and best practices.
- Familiarity with data integration tools and methodologies.
- Ability to design and implement data governance frameworks.

Additional Information:
- The candidate should have minimum 12 years of experience in SAP Data Migration.
- This position is based at our Gurugram office.
- A 15 years full time education is required.

Posted 1 week ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Project Role: Software Development Engineer
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must have skills: SAP FI S/4HANA Accounting
Good to have skills: NA
Educational Qualification: 15 years full time education

Summary: As a Software Development Engineer, you will engage in a dynamic environment where you will analyze, design, code, and test various components of application code across multiple clients. Your typical day will involve collaborating with team members to perform maintenance and enhancements, ensuring that the applications meet the evolving needs of the clients while adhering to best practices in software development.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Continuously evaluate and improve development processes to increase efficiency.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in SAP FI S/4HANA Accounting.
- Strong understanding of financial accounting principles and practices.
- Experience with integration of SAP modules and third-party applications.
- Familiarity with reporting tools and techniques within SAP.
- Ability to troubleshoot and resolve issues related to SAP FI S/4HANA.

Additional Information:
- The candidate should have minimum 7.5 years of experience in SAP FI S/4HANA Accounting.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Posted 1 week ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: SAP BW/4HANA Data Modeling & Development
Good to have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are optimized for performance and usability. You will also engage in problem-solving activities, providing support and enhancements to existing applications while staying updated with the latest technologies and methodologies in application development.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in SAP BW/4HANA Data Modeling & Development.
- Strong understanding of data warehousing concepts and best practices.
- Experience with ETL processes and data integration techniques.
- Familiarity with reporting tools and data visualization techniques.
- Ability to troubleshoot and optimize data models for performance.

Additional Information:
- The candidate should have minimum 5 years of experience in SAP BW/4HANA Data Modeling & Development.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Posted 1 week ago

Apply

3.0 - 8.0 years

10 - 14 Lacs

Hyderabad

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Google BigQuery
Good to have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various stakeholders to gather requirements, overseeing the development process, and ensuring that the applications meet the specified needs. You will also engage in problem-solving discussions with your team, providing guidance and support to ensure successful project outcomes. Additionally, you will monitor project progress, address any challenges that arise, and facilitate communication among team members to foster a productive work environment.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute in providing solutions to work related problems.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Develop and maintain documentation for application processes and workflows.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Google BigQuery.
- Strong understanding of data warehousing concepts and architecture.
- Experience with SQL and data manipulation techniques.
- Familiarity with cloud computing platforms and services.
- Ability to troubleshoot and optimize application performance.

Additional Information:
- The candidate should have minimum 3 years of experience in Google BigQuery.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
