
4166 Informatica Jobs - Page 22

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

0 years

4 - 6 Lacs

Chennai

On-site

Responsibilities:
- Design and execute test plans for ETL processes, ensuring data accuracy, completeness, and integrity.
- Develop automated test scripts using Python or R for data validation and reconciliation.
- Perform source-to-target data verification, transformation-logic testing, and regression testing.
- Collaborate with data engineers and analysts to understand business requirements and data flows.
- Identify data anomalies and work with development teams to resolve issues.
- Maintain test documentation, including test cases, test results, and defect logs.
- Participate in performance testing and optimization of data pipelines.

Required Skills & Qualifications:
- Strong experience in ETL testing across various data sources and targets.
- Proficiency in Python or R for scripting and automation.
- Solid understanding of SQL and relational databases.
- Familiarity with data warehousing concepts and tools (e.g., Power BI, QlikView, Informatica, Talend, SSIS).
- Experience with test management tools (e.g., JIRA, TestRail).
- Knowledge of data profiling, data quality frameworks, and validation techniques.
- Excellent analytical and communication skills.

About Virtusa
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities, and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence.
Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
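The source-to-target verification duties above can be pictured with a small Python sketch. Everything here is illustrative: the in-memory rows stand in for source and target query results, which a real test harness would fetch via SQL.

```python
import hashlib

def row_checksum(row, columns):
    """Build a stable checksum over the selected columns of a row."""
    payload = "|".join(str(row[c]) for c in columns)
    return hashlib.md5(payload.encode("utf-8")).hexdigest()

def reconcile(source_rows, target_rows, key, columns):
    """Compare source and target row sets: counts, keys missing in the
    target, and per-key checksum mismatches over the compared columns."""
    src = {r[key]: row_checksum(r, columns) for r in source_rows}
    tgt = {r[key]: row_checksum(r, columns) for r in target_rows}
    return {
        "count_match": len(src) == len(tgt),
        "missing_in_target": sorted(set(src) - set(tgt)),
        "mismatched": sorted(k for k in src.keys() & tgt.keys()
                             if src[k] != tgt[k]),
    }

# Hypothetical extracts: the target dropped id 3 and garbled the city for id 2.
source = [{"id": 1, "city": "Chennai"}, {"id": 2, "city": "Mumbai"},
          {"id": 3, "city": "Pune"}]
target = [{"id": 1, "city": "Chennai"}, {"id": 2, "city": "Mumbay"}]

report = reconcile(source, target, key="id", columns=["city"])
```

A defect log entry would then record the missing and mismatched keys the report surfaces.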

Posted 1 week ago

Apply

8.0 years

3 - 6 Lacs

Bengaluru

On-site

Req ID: 331538

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Database-Oracle Developer to join our team in Bengaluru/Hyderabad/Chennai/Gurugram/Noida, India (IN).

Job Title: Developer

Role Description
The Developer is responsible for analyzing technical requirements, making code changes, performing unit testing, and facilitating integration testing, UAT, and production release. The Developer also provides L3 support for fixing production defects.

Your key responsibilities
- Design, develop, test, deploy, maintain, and improve software
- Manage individual project priorities, deadlines, and deliverables
- Directly interact with a broad spectrum of stakeholders in multiple regions
- Liaise with other technical areas, conduct technology research, and evaluate software required for maintaining the development environment
- Perform research on various technologies and define architectural improvements; build prototypes or core features as needed
- Help set the technical direction that will achieve the best user experience

Your skills and experience / Qualifications
- Overall 8+ years of experience
- Technical skills required: Oracle SQL, PL/SQL; Unix; Control-M; Informatica (good to have); Python (good to have)
- Knowledge of DevOps configuration management tools (Chef, Puppet, Docker, TeamCity, Jenkins, uDeploy, Kubernetes, Maven, etc.)
- Experience with tooling across the SDLC: Sonar, Crucible, JIRA, HP ALM, HP UFT, Confluence, Nexus, Artifactory, TeamCity, Git/Bitbucket
- Experience in the banking/wealth management domain
- Experience working in an Agile environment
- Relevant and demonstrable experience in designing and documenting requirements in agile teams
- An organized self-starter able to manage in a complex environment
- A team player who continually collaborates and shares information
- Continually looks to simplify and standardize solutions; actively seeks to reduce complexity and do the "right thing"
- Persistent in your drive for quality and excellence
- Architecturally minded, with an ability to simplify complex activities
- An influencer and problem solver
- Fluent in English (written/verbal); additional languages are an advantage
- Familiar with Excel, PowerPoint, Visio, etc.
- Ability to work in a matrix organization with stakeholders spread across geographies
- Understanding of executing projects in agile (Scrum) methodology
- Ability to identify and interpret stakeholder needs and requirements
- Self-motivated, with the flexibility to work autonomously as well as in virtual teams and matrix/global organizations, including appreciating different cultures while collaborating and sharing
- Ability to influence and motivate other team members and stakeholders through strong dialogue, facilitation, and persuasiveness

Preferred domain knowledge: wealth management and banking regulations
Nice-to-have skills: Banking, Database, BigQuery, Google Cloud knowledge

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity.
We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com.

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.

Posted 1 week ago

Apply

0 years

0 Lacs

Noida

On-site

Req ID: 328482

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking an ETL Informatica/IICS Developer to join our team in Noida, Uttar Pradesh (IN-UP), India (IN).

Responsibilities
- Work as an ETL developer using Oracle SQL, PL/SQL, Informatica, Linux scripting, and Tidal
- ETL code development, unit testing, source code control, technical specification writing, and production implementation
- Develop and deliver ETL applications using Informatica, Teradata, Oracle, Tidal, and Linux
- Interact with BAs and other teams for requirement clarification, query resolution, testing, and sign-offs
- Develop software that conforms to a design model within its constraints
- Prepare documentation for design, source code, and unit test plans
- Ability to work as part of a global development team
- Good knowledge of the healthcare domain and data warehousing concepts

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com.

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
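The unit-testing duty in the ETL role above is worth a concrete sketch. The transformation rule here is invented purely for illustration (trimming and title-casing a member name, defaulting a missing region); the point is that mapping logic should be testable outside the workflow.

```python
def transform_member(record):
    """Hypothetical mapping logic: clean up a raw member record the way an
    Informatica expression transformation might, so it can be unit tested."""
    return {
        "member_id": int(record["member_id"]),
        "name": record.get("name", "").strip().title(),
        "region": record.get("region") or "UNKNOWN",
    }

# A unit check runnable before any workflow is deployed to production.
clean = transform_member({"member_id": "42", "name": "  jane doe ", "region": None})
```

The same pattern extends to edge cases (empty names, non-numeric IDs) that a production defect would otherwise surface first.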

Posted 1 week ago

Apply

0 years

7 - 9 Lacs

Calcutta

On-site

Ready to shape the future of work? At Genpact, we don’t just adapt to change—we drive it. AI and digital innovation are redefining industries, and we’re leading the charge. Genpact’s AI Gigafactory, our industry-first accelerator, is an example of how we’re scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies’ most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that’s shaping the future, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Principal Consultant, Informatica PowerCenter!

Responsibilities
- Hands-on experience in Informatica PowerCenter and IICS administration.
- Experience in developing data feeds or ETLs using Informatica PowerCenter.
- Install, upgrade, configure, optimize, and administer the Informatica platform, either on-premises or on IICS.
- Manage the Informatica platform components related to configuration, security, and connectivity.
- Knowledge of UNIX and Linux administration.
- Daily health checks and usual admin activities: migrations, upgrades, and connection creation, especially on the application side (e.g., HTTP, Salesforce).
- Monthly and weekly code deployments from lower environments to higher environments.
- Support production jobs with performance tuning, metrics, and troubleshooting.
- Experience in domain and repository backup activities.
- Provide timely and effective technical support to end users, developers, and stakeholders who encounter problems with Informatica tools and data flows.
- Monitor Informatica environments, ETL jobs, and data pipelines to ensure their availability, performance, and reliability.
- Perform routine maintenance tasks such as restarting services, clearing caches, and optimizing workflows.

Qualifications we seek in you
Minimum qualifications
- BE/B.Tech/MCA
- Excellent written and verbal communication skills

Mandatory qualifications
- Strong understanding of ETL concepts, data integration patterns, and data warehousing.
- Hands-on experience in Informatica PowerCenter and IICS administration.
- Knowledge of UNIX and Linux administration.
- Excellent problem-solving and analytical skills.
- Detail-oriented approach and commitment to maintaining accurate records.
- Quick joiners preferred.

Why join Genpact?
- Be a transformation leader – work at the cutting edge of AI, automation, and digital innovation
- Make an impact – drive change for global enterprises and solve business challenges that matter
- Accelerate your career – get hands-on experience, mentorship, and continuous learning opportunities
- Work with the best – join 140,000+ bold thinkers and problem-solvers who push boundaries every day
- Thrive in a values-driven culture – our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let’s build tomorrow together.
Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Job: Principal Consultant
Primary Location: India-Kolkata
Schedule: Full-time
Education Level: Bachelor's / Graduation / Equivalent
Job Posting: Jul 7, 2025, 7:30:16 AM
Unposting Date: Ongoing
Master Skills List: Digital
Job Category: Full Time
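The daily health-check and monitoring duties in the admin role above are easiest to picture with a sketch. The workflow-status records here are invented; a real administrator would pull them from the PowerCenter repository or the IICS monitoring API.

```python
from datetime import datetime, timedelta

def failing_or_stuck(runs, now, max_runtime=timedelta(hours=2)):
    """Flag workflow runs that failed outright or have run past a threshold."""
    flagged = []
    for run in runs:
        if run["status"] == "FAILED":
            flagged.append((run["workflow"], "failed"))
        elif run["status"] == "RUNNING" and now - run["started"] > max_runtime:
            flagged.append((run["workflow"], "stuck"))
    return flagged

# Hypothetical morning snapshot of overnight workflow runs.
now = datetime(2025, 7, 7, 9, 0)
runs = [
    {"workflow": "wf_load_customers", "status": "SUCCEEDED",
     "started": now - timedelta(hours=1)},
    {"workflow": "wf_load_orders", "status": "FAILED",
     "started": now - timedelta(minutes=30)},
    {"workflow": "wf_load_usage", "status": "RUNNING",
     "started": now - timedelta(hours=3)},
]
alerts = failing_or_stuck(runs, now)
```

Anything flagged would feed the escalation or restart procedures the posting describes.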

Posted 1 week ago

Apply

0 years

0 Lacs

Calcutta

On-site

Ready to shape the future of work? At Genpact, we don’t just adapt to change—we drive it. AI and digital innovation are redefining industries, and we’re leading the charge. Genpact’s AI Gigafactory, our industry-first accelerator, is an example of how we’re scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies’ most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that’s shaping the future, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Consultant - Informatica

Responsibilities
- Hands-on experience in Informatica PowerCenter and IICS administration.
- Install, upgrade, configure, optimize, and administer the Informatica platform, either on-premises or on IICS.
- Manage the Informatica platform components related to configuration, security, and connectivity.
- Knowledge of UNIX and Linux administration.
- Daily health checks and usual admin activities: migrations, upgrades, and connection creation, especially on the application side (e.g., HTTP, Salesforce).
- Monthly and weekly code deployments from lower environments to higher environments.
- Support production jobs with performance tuning, metrics, and troubleshooting.
- Experience in domain and repository backup activities.
- Provide timely and effective technical support to end users, developers, and stakeholders who encounter problems with Informatica tools and data flows.
- Monitor Informatica environments, ETL jobs, and data pipelines to ensure their availability, performance, and reliability.
- Perform routine maintenance tasks such as restarting services, clearing caches, and optimizing workflows.

Qualifications we seek in you
Minimum qualifications
- BE/B.Tech/MCA
- Excellent written and verbal communication skills

Mandatory qualifications
- Strong understanding of ETL concepts, data integration patterns, and data warehousing.
- Hands-on experience in Informatica PowerCenter and IICS administration.
- Knowledge of UNIX and Linux administration.
- Excellent problem-solving and analytical skills.
- Detail-oriented approach and commitment to maintaining accurate records.
- Quick joiners preferred.

Why join Genpact?
- Be a transformation leader – work at the cutting edge of AI, automation, and digital innovation
- Make an impact – drive change for global enterprises and solve business challenges that matter
- Accelerate your career – get hands-on experience, mentorship, and continuous learning opportunities
- Work with the best – join 140,000+ bold thinkers and problem-solvers who push boundaries every day
- Thrive in a values-driven culture – our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let’s build tomorrow together.
Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Job: Consultant
Primary Location: India-Kolkata
Schedule: Full-time
Education Level: Bachelor's / Graduation / Equivalent
Job Posting: Jul 7, 2025, 7:26:09 AM
Unposting Date: Ongoing
Master Skills List: Digital
Job Category: Full Time

Posted 1 week ago

Apply

2.0 - 6.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Type: Internship
Duration: 6 months

About Phoenix
Phoenix is Myntra's initiative specifically designed to offer a launchpad to women on a career break. It is a six-month internship that ensures a conducive environment facilitating a smooth transition back to work. With structured onboarding, customized learning and development programs, mentorship opportunities, on-the-job learning, and best-in-class benefits, we aim to provide an environment that is supportive, so that you can rediscover your career with us. During your internship, you will get the opportunity to work with the best talent in the e-commerce industry on projects that match your interests and abilities, and that could lead to full-time employment with Myntra.

Key Responsibilities:
- Collect, process, and analyze data from various internal systems and external sources.
- Develop and maintain dashboards, reports, and visualizations to support business objectives.
- Identify trends, anomalies, and opportunities through data analysis and communicate findings to stakeholders.
- Collaborate with business, product, and engineering teams to define KPIs and performance metrics.
- Perform exploratory data analysis to uncover business insights and support strategic planning.
- Use statistical techniques to develop predictive models or support A/B testing.
- Ensure data quality and integrity by validating datasets and troubleshooting discrepancies.
- Document data processes and maintain version control for reports and code.

Requirements:
- Bachelor's degree in Statistics, Mathematics, Computer Science, Economics, or a related field.
- 1-3 years of experience as a Data Analyst or in a related analytical role.
- Proficient in SQL, Excel, and at least one data visualization tool (e.g., Power BI, Tableau, Looker).
- Experience with data analysis tools/languages such as Python, R, or similar.
- Strong analytical and problem-solving skills with attention to detail.
- Excellent communication skills and the ability to present complex data in a clear and actionable manner.
- Experience in the e-commerce industry.
- Understanding of basic statistical methods and predictive analytics.
- Should have a minimum six-month career gap at present.
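The A/B-testing duty above often reduces to a significance check on two conversion rates. A pure-Python sketch of a two-sided two-proportion z-test follows; the conversion counts are made up for illustration.

```python
import math

def two_proportion_ztest(successes_a, n_a, successes_b, n_b):
    """Two-sided z-test for the difference between two proportions."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided tail probability
    return z, p_value

# Hypothetical experiment: control converts 200/1000, variant 260/1000.
z, p = two_proportion_ztest(200, 1000, 260, 1000)
```

In practice a library such as statsmodels provides the same test; the stdlib version just makes the arithmetic visible.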

Posted 1 week ago

Apply

0 years

0 Lacs

Andhra Pradesh, India

On-site

Design and execute test plans for ETL processes, ensuring data accuracy, completeness, and integrity. Develop automated test scripts using Python or R for data validation and reconciliation. Perform source-to-target data verification, transformation logic testing, and regression testing. Collaborate with data engineers and analysts to understand business requirements and data flows. Identify data anomalies and work with development teams to resolve issues. Maintain test documentation, including test cases, test results, and defect logs. Participate in performance testing and optimization of data pipelines. Required Skills & Qualifications Strong experience in ETL testing across various data sources and targets. Proficiency in Python or R for scripting and automation. Solid understanding of SQL and relational databases. Familiarity with data warehousing concepts and tools (e.g., Power BI, QlikView, Informatica, Talend, SSIS). Experience with test management tools (e.g., JIRA, TestRail). Knowledge of data profiling, data quality frameworks, and validation techniques. Excellent analytical and communication skills.

Posted 1 week ago

Apply

2.0 - 6.0 years

5 - 10 Lacs

Mumbai Suburban, Thane, Mumbai (All Areas)

Hybrid

Hi Candidates,

We are hiring for ETL Support.
Location: Mumbai
Experience: 2 to 5 years
Skills: ETL support, scripting, SQL

Direct Responsibilities
- 3+ years of experience in Informatica development/production support.
- Ability to understand and architect ETL solutions for production support activities.
- Provide L1/L2 tier production support (on-call rotation) for ETL jobs in production.
- Able to understand, write, and troubleshoot shell scripts and SQL (Oracle) for regular maintenance jobs to improve performance and uptime of the Informatica platform.
- Collaborate with internal and external teams to maintain SLAs and handle escalations.
- Maintain the Informatica ETL platform infrastructure (upgrades, patches, health monitoring).
- Work with change management tools to track all incidents and changes and to plan production changes per change management policies.
- Troubleshoot ETL jobs and follow escalation procedures to resolve issues.
- Collaborate with the development team on L3 issues and follow tasks to completion.
- Provide on-call support for Informatica ETL jobs (24x7, on rotation).
- Participate in change deployment releases.
- Ensure adherence to bank and information systems policies and procedures, security measures, and audit standards.
- Able to understand processes quickly and apply them in day-to-day activities.
- Team player who mentors junior support engineers.

Interested candidates, share your resume at singh.nikita@kiya.ai
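The SLA-watching side of L1/L2 support above often means scanning scheduler output for long-running jobs. A sketch with an invented log format (real Control-M or Informatica logs differ, and the SQL side would query run tables instead):

```python
from datetime import datetime

def sla_breaches(log_lines, sla_minutes):
    """Pair START/END lines per job and report jobs whose runtime, in
    minutes, exceeded the SLA threshold."""
    starts, breaches = {}, {}
    for line in log_lines:
        ts_text, event, job = line.split(" ", 2)
        ts = datetime.strptime(ts_text, "%H:%M")
        if event == "START":
            starts[job] = ts
        elif event == "END":
            minutes = (ts - starts[job]).total_seconds() / 60
            if minutes > sla_minutes:
                breaches[job] = minutes
    return breaches

# Hypothetical overnight batch log.
log = [
    "01:00 START wf_nightly_load",
    "01:05 START wf_refresh_marts",
    "01:35 END wf_refresh_marts",
    "03:10 END wf_nightly_load",
]
late = sla_breaches(log, sla_minutes=60)
```

A breach report like this is what drives the escalation procedures the posting mentions.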

Posted 1 week ago

Apply

2.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Key Responsibilities:
- Design, build, and maintain scalable data pipelines on Snowflake.
- Experience or knowledge of Snowpipe, Time Travel, and Fail-safe.
- Write and optimize SQL queries for data extraction and transformation.
- Develop ETL processes to integrate various data sources into Snowflake.
- Monitor and troubleshoot data warehouse performance issues.
- Implement security measures and data governance practices.
- Sound knowledge of Snowflake architecture.
- Knowledge of Fivetran is an added advantage.
- Collaborate with cross-functional teams to support analytical and reporting needs.

Experience: 2 to 8 years

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience with Snowflake and data warehousing concepts.
- Proficiency in SQL and ETL tools (e.g., Talend, Informatica).

Company Details:
One of the top-ranked IT companies in Ahmedabad, Gujarat. We are an ISO 9001:2015 and ISO 27001:2013 certified, leading global technology solution provider. Globally present, with a core services focus on the USA, the Middle East, and Canada. Constantly enhancing our span of services around custom software development, Enterprise Mobility Solutions, and the Internet of Things. A family of multicultural, multi-talented, passionate, and well-experienced resources who consistently work to set new standards for customer satisfaction by implementing industry best practices.

Why Stridely?
· You will have opportunities to work on international, enterprise-level projects of big-ticket size
· Interaction and coordination with US customers
· Employee-first approach along with a customer-first approach
· Continuous learning, training, and knowledge-enhancement opportunities
· Self-development, career counseling, and growth within the organization
· Democratic and metro culture
· Strong and solid leadership
· Based on your potential, overseas visits, transfers, and exposure

URL: www.stridelysolutions.com
Employee strength: 500+
Working days: 5 days a week
Location: Ahmedabad / Pune / Vadodara
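The incremental-load side of the Snowflake pipeline work above usually rests on the MERGE/upsert pattern. A plain-Python sketch of the idea (in-memory dicts stand in for the target table; real code would issue a SQL MERGE against Snowflake):

```python
def merge_upsert(target, changes, key):
    """Emulate a SQL MERGE: update rows whose key matches, insert new ones."""
    by_key = {row[key]: dict(row) for row in target}
    for row in changes:
        by_key.setdefault(row[key], {}).update(row)
    return sorted(by_key.values(), key=lambda r: r[key])

# Hypothetical target table and a change batch from a staged extract.
target = [{"id": 1, "status": "active"}, {"id": 2, "status": "active"}]
changes = [{"id": 2, "status": "churned"}, {"id": 3, "status": "active"}]
merged = merge_upsert(target, changes, key="id")
```

The equivalent MERGE statement carries the same matched/not-matched logic, with Snowpipe typically landing the change batch into the staging table first.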

Posted 1 week ago

Apply

0 years

0 Lacs

India

Remote

Join Appsierra, a dynamic leader in the Technology, Information, and Internet industry, as an IICS Lead / IICS Lead Data Engineer. In this pivotal role, you will spearhead innovative projects, driving data integration and transformation initiatives using Informatica Intelligent Cloud Services (IICS). You will collaborate with cross-functional teams to design, implement, and optimize cutting-edge data solutions that empower decision-making and enhance operational efficiency. Your expertise will be crucial in ensuring seamless data flow and accuracy, contributing to the company's strategic objectives. At Appsierra, we value forward-thinking professionals who are passionate about leveraging technology to solve complex challenges. If you are an adept problem-solver with a keen eye for detail and a commitment to excellence, we invite you to become a part of our esteemed team, where your contributions will have a lasting impact on our success and growth in the tech industry.

Tech Skills for the IICS Lead
- Prior experience as an IICS Lead
- Extensive experience with IDCS (Oracle Identity Cloud Service)
- Hands-on experience with Informatica Cloud
- Proficiency in programming languages such as SQL
- Strong experience in data modelling, data warehousing, and database management
- Knowledge of cloud platforms (e.g., AWS, Azure, Google Cloud Platform)
- Ability to convey technical information clearly to both technical and non-technical stakeholders

Tech Skills for the IICS Data Engineers
- Same as above, with prior experience as an IICS data engineer rather than as a lead

Requirements
- Number of resources: 1 Lead / 2 Data Engineers
- Office: 100% remote work
- Seniority: Lead / Intermediate to Senior
- Duration: long term
- Soft skills: fluent English to work directly with an international team

Posted 1 week ago

Apply

7.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Senior Data Modeller – Telecom Domain

About the Role:
We are seeking an experienced Telecom Senior Data Modeller to join our team. In this role, you will be responsible for the design and standardization of enterprise-wide data models across multiple domains such as Customer, Product, Billing, and Network. The ideal candidate will work closely with cross-functional teams to translate business needs into scalable and governed data structures. You will work closely with customers and technology partners to deliver data solutions that address complex telecommunications business requirements, including customer experience management, network optimization, revenue assurance, and digital transformation initiatives.

Responsibilities:
- Design logical and physical data models aligned with enterprise and industry standards
- Develop comprehensive data models aligned with TM Forum guidelines for telecommunications domains such as Customer, Product, Service, Resource, and Partner management
- Create and maintain data models for the Customer, Product, Usage, and Service domains
- Align models with TM Forum SID, telecom standards, and data mesh principles
- Translate business requirements into normalized and analytical schemas (star/snowflake)
- Define and maintain entity relationships, hierarchy levels (Customer → Account → MSISDN), and attribute lineage
- Standardize attribute definitions across systems and simplify legacy structures
- Collaborate with engineering teams to implement models in cloud data platforms (e.g., Databricks)
- Collaborate with domain stewards to simplify and standardize legacy data structures
- Work with governance teams to tag attributes for privacy, compliance, and data quality
- Document metadata and lineage, and maintain version control of data models
- Support analytics, reporting, and machine learning teams by enabling standardized data access
- Design solutions leveraging Microsoft Azure and Databricks for telecom data processing and analytics

Qualifications:
- Bachelor's or Master's degree in Computer Science, Telecommunications Engineering, Data Science, or a related technical field
- 7+ years of experience in data modelling roles, with at least 3-4 years in the telecommunications industry
- Hands-on experience building data models and platforms aligned with TM Forum standards and telecommunications business processes
- Excellent understanding of TM Forum SID / eTOM / ODA
- Strong experience with data modeling tools (Azure Analysis Services, SSAS, dbt, Informatica)
- Hands-on experience with modern cloud data platforms (Databricks, Azure Synapse, Snowflake)
- Deep understanding of data warehousing concepts and normalized/denormalized models
- Proven experience in telecom data modeling (CRM, billing, network usage, campaigns)
- Expertise in SQL, data profiling, schema design, and metadata documentation
- Familiarity with domain-driven design, data mesh, and modular architecture
- Experience in large-scale transformation or modernization programs
- Knowledge of regulatory frameworks such as GDPR or data privacy-by-design
- Background in telecom, networking, or other data-rich industries
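The Customer → Account → MSISDN hierarchy mentioned above is the kind of structure a model has to make navigable for reporting. A toy Python sketch of rolling each MSISDN up through its account to the owning customer (the sample identifiers are invented):

```python
def build_rollup(customers_by_account, accounts_by_msisdn):
    """Resolve each MSISDN through its account up to the owning customer,
    yielding the flattened hierarchy a reporting model would expose."""
    return {
        msisdn: {
            "account_id": account,
            "customer_id": customers_by_account[account],
        }
        for msisdn, account in accounts_by_msisdn.items()
    }

# Hypothetical hierarchy: two MSISDNs on one account, one on another.
accounts_by_msisdn = {"919800000001": "ACC-10", "919800000002": "ACC-10",
                      "919800000003": "ACC-20"}
customers_by_account = {"ACC-10": "CUST-1", "ACC-20": "CUST-2"}
rollup = build_rollup(customers_by_account, accounts_by_msisdn)
```

In a star schema this flattening typically becomes a conformed customer dimension that fact tables keyed on MSISDN join against.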

Posted 1 week ago

Apply

5.0 years

0 Lacs

India

On-site

Company Description ThreatXIntel is a startup cyber security company that offers customized and affordable solutions to protect businesses and organizations from cyber threats. With a focus on cloud security, web and mobile security testing, and DevSecOps, we aim to provide peace of mind to our clients. Our proactive approach involves continuous monitoring and testing to identify vulnerabilities before they can be exploited. Role Description We are seeking a business-focused Lead Data Engineer to design, develop, and maintain a Client Master MDM solution supporting both institutional and wealth management channels. This role will involve working across business and technology stakeholders, designing modern cloud-native data pipelines using Azure, ADF, Snowflake, DBT, and Semarchy (or Informatica). You will play a critical role in implementing master data strategies, improving data quality, and delivering scalable, secure, and high-performing data engineering solutions. The ideal candidate is a hands-on contributor with deep MDM expertise and cloud data architecture experience. Key Responsibilities
- Develop and maintain a Client Master MDM solution across institutional and wealth management data domains.
- Collaborate with data stewards, product owners, and business stakeholders to gather and analyze data requirements.
- Design and build cloud-based data pipelines using Azure Data Factory (ADF), DBT, Fivetran, and Snowflake.
- Work with Semarchy MDM (preferred) or Informatica MDM to manage the master data lifecycle and workflows.
- Develop scalable solutions using SQL, Python, or Java for high-performance data transformation.
- Optimize data pipelines for scalability, performance, and reliability.
- Ensure data accuracy, integrity, and quality through proactive monitoring and troubleshooting.
- Translate complex data and business problems into robust, reusable technical solutions.
- Promote best practices in data governance, data modeling, and data quality management.
- Communicate technical solutions clearly with business and engineering teams.
Required Skills & Experience
- 5+ years of experience in data engineering, with a focus on Master Data Management
- In-depth knowledge of MDM principles, architectures, and tools (Semarchy preferred, Informatica acceptable)
- Strong experience with Azure-based data pipeline tools: ADF, Snowflake, DBT, Fivetran
- Strong SQL development skills (must-have)
- Programming experience in Python or Java
- Proficiency in data modeling, ETL architecture, and data warehousing concepts
- Hands-on experience with data quality, governance, and monitoring frameworks
- Experience in fast-paced or startup-like environments with agile execution
Nice to Have
- Familiarity with wealth management or institutional asset management data models
- Understanding of data lineage, metadata management, and compliance regulations
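As a rough illustration of the master-data work described above, the sketch below implements a simple "survivorship" merge of duplicate client records into one golden record. It is tool-agnostic pure Python: the rule (newest non-empty value wins) and all field names are assumptions for illustration, not Semarchy or Informatica MDM behavior.

```python
from datetime import date

def golden_record(records):
    """Merge duplicate records into one golden record.

    records: list of dicts, each carrying an 'updated' date plus attributes.
    Rule (illustrative): the most recently updated non-empty value per
    attribute survives.
    """
    merged = {}
    for rec in sorted(records, key=lambda r: r["updated"]):  # oldest first
        for key, value in rec.items():
            if key != "updated" and value:  # a newer non-empty value overwrites
                merged[key] = value
    return merged

# Hypothetical duplicate client records from two source systems.
dupes = [
    {"updated": date(2023, 1, 5), "name": "J. Smith", "email": "j@old.com", "phone": ""},
    {"updated": date(2024, 3, 1), "name": "Jane Smith", "email": "", "phone": "555-0100"},
]
print(golden_record(dupes))
# {'name': 'Jane Smith', 'email': 'j@old.com', 'phone': '555-0100'}
```

Real MDM platforms add matching, stewardship workflows, and lineage on top, but the per-attribute survivorship decision is the same shape.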

Posted 1 week ago

Apply

7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Role: Data Analyst/Data Engineer
Experience: 7-14 Yrs
Location: Hyderabad
Primary Skills: ETL, Informatica, Python, SQL, BI tools, and the Investment Domain
Please share your resumes to rajamahender.n@technogenindia.com
Job Description:
• 7-9 years of experience with data analytics, data modeling, and database design.
• 3+ years of coding and scripting (Python, Java, Scala) and design experience.
• 3+ years of experience with the Spark framework.
• 5+ years of experience with ELT methodologies and tools.
• 5+ years of mastery in designing, developing, tuning, and troubleshooting SQL.
• Knowledge of Informatica PowerCenter and Informatica IDMC.
• Knowledge of distributed, column-oriented technologies used to build high-performance databases such as Vertica and Snowflake.
• Strong data analysis skills for extracting insights from financial data.
• Proficiency in reporting tools (e.g., Power BI, Tableau).
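Since the role stresses data analysis and profiling of financial data, here is a minimal, hypothetical sketch of column profiling (null rate and distinct count), the kind of first-pass check typically run before modeling. The sample rows and field names are invented.

```python
# Illustrative column-profiling sketch: null rate and distinct count per
# column. Pure Python; real work would run against a warehouse via SQL.
rows = [
    {"account": "A1", "balance": 100.0, "ccy": "USD"},
    {"account": "A2", "balance": None,  "ccy": "USD"},
    {"account": "A3", "balance": 250.0, "ccy": "EUR"},
]

def profile(rows):
    stats = {}
    for col in rows[0].keys():
        values = [r[col] for r in rows]
        non_null = [v for v in values if v is not None]
        stats[col] = {
            "null_rate": 1 - len(non_null) / len(values),
            "distinct": len(set(non_null)),
        }
    return stats

stats = profile(rows)
print(stats["balance"])  # one null out of three rows, two distinct values
```

Checks like this feed directly into data-quality rules (e.g., fail the load if a key column's null rate exceeds a threshold).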

Posted 1 week ago

Apply

10.0 - 15.0 years

35 - 45 Lacs

Bengaluru

Work from Office

Seeking a Technical Lead – MDM Solutions with 10+ years of data management experience and expertise in implementing MDM architectures, data governance, and cloud integration using platforms like Snowflake and AWS.

Posted 1 week ago

Apply

10.0 - 15.0 years

30 - 40 Lacs

Noida

Work from Office

About the Organisation DataFlow Group is a pioneering global provider of specialized Primary Source Verification (PSV) solutions, and background screening and immigration compliance services that assist public and private organizations in mitigating risks to make informed, cost-effective decisions regarding their Applicants and Registrants. About the Role: We are looking for a highly skilled and experienced Senior ETL & Data Streaming Engineer with over 10 years of experience to play a pivotal role in designing, developing, and maintaining our robust data pipelines. The ideal candidate will have deep expertise in both batch ETL processes and real-time data streaming technologies, coupled with extensive hands-on experience with AWS data services. A proven track record of working with Data Lake architectures and traditional Data Warehousing environments is essential. Duties and Responsibilities: Design, develop, and implement highly scalable, fault-tolerant, and performant ETL processes using industry-leading ETL tools to extract, transform, and load data from various source systems into our Data Lake and Data Warehouse. Architect and build batch and real-time data streaming solutions using technologies like Talend, Informatica, Apache Kafka, or AWS Kinesis to support immediate data ingestion and processing requirements. Utilize and optimize a wide array of AWS data services. Collaborate with data architects, data scientists, and business stakeholders to understand data requirements and translate them into efficient data pipeline solutions. Ensure data quality, integrity, and security across all data pipelines and storage solutions. Monitor, troubleshoot, and optimize existing data pipelines for performance, cost-efficiency, and reliability. Develop and maintain comprehensive documentation for all ETL and streaming processes, data flows, and architectural designs.
Implement data governance policies and best practices within the Data Lake and Data Warehouse environments. Mentor junior engineers and contribute to fostering a culture of technical excellence and continuous improvement. Stay abreast of emerging technologies and industry best practices in data engineering, ETL, and streaming. Qualifications: 10+ years of progressive experience in data engineering, with a strong focus on ETL, ELT, and data pipeline development. Deep expertise in ETL tools: extensive hands-on experience with commercial ETL tools (Talend). Strong proficiency in data streaming technologies: proven experience with real-time data ingestion and processing using platforms such as AWS Glue, Apache Kafka, AWS Kinesis, or similar. Extensive AWS data services experience: proficiency with AWS S3 for data storage and management; hands-on experience with AWS Glue for ETL orchestration and data cataloging; familiarity with AWS Lake Formation for building secure data lakes; good to have experience with AWS EMR for big data processing. Data Warehouse (DWH) knowledge: strong background in traditional data warehousing concepts, dimensional modeling (Star Schema, Snowflake Schema), and DWH design principles. Programming languages: proficient in SQL and at least one scripting language (e.g., Python, Scala) for data manipulation and automation. Database skills: strong understanding of relational and NoSQL databases. Version control: experience with version control systems (e.g., Git). Problem-solving: excellent analytical and problem-solving skills with a keen eye for detail. Communication: strong verbal and written communication skills, with the ability to articulate complex technical concepts to both technical and non-technical audiences. Preferred Qualifications: certifications in AWS Data Analytics or other relevant areas.
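The real-time streaming work described above centers on windowed aggregation. Below is a pure-Python sketch of a tumbling-window sum, the core pattern a Kafka or Kinesis consumer would apply; the event data is invented, and a production pipeline would use an actual consumer client rather than an in-memory list.

```python
from collections import defaultdict

# Illustrative (epoch_seconds, amount) events standing in for a stream.
events = [(0, 10), (12, 5), (61, 7), (65, 3), (130, 1)]

def tumbling_window_sums(events, window_seconds=60):
    """Sum amounts per non-overlapping (tumbling) time window."""
    sums = defaultdict(int)
    for ts, amount in events:
        window_start = ts - ts % window_seconds  # bucket by window start time
        sums[window_start] += amount
    return dict(sums)

print(tumbling_window_sums(events))  # {0: 15, 60: 10, 120: 1}
```

Kafka Streams, Kinesis Data Analytics, and Glue streaming jobs all offer this windowing natively; the sketch just makes the bucketing rule explicit.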

Posted 1 week ago

Apply

8.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Greetings from Analytix Solutions…!!! We are seeking an experienced and motivated Senior Data Engineer to join our AI & Automation team. The ideal candidate will have 6–8 years of experience in data engineering, with a proven track record of designing and implementing scalable data solutions. A strong background in database technologies, data modeling, and data pipeline orchestration is essential. Additionally, hands-on experience with generative AI technologies and their applications in data workflows will set you apart. In this role, you will lead data engineering efforts to enhance automation, drive efficiency, and deliver data-driven insights across the organization. Company Name: Analytix Business Solutions (US-based MNC) Company at a Glance: We are a premier knowledge process outsourcing unit based in Ahmedabad, fully owned by Analytix Solutions LLC, headquartered in the USA. We customize a wide array of business solutions including IT services, Audio-Visual services, Data management services & Finance and accounting services for small and mid-size companies across diverse industries. We partner with and offer our services to Restaurants, Dental services, Dunkin' Donuts franchises, Hotels, Veterinary services, and others including Start-ups from any other industry. For more details about our organization, please click on https://www.analytix.com/ LinkedIn: Analytix Business Solutions (India) Pvt. Ltd.: My Company | LinkedIn Roles & Responsibilities: Design, build, and maintain scalable, high-performance data pipelines and ETL/ELT processes across diverse database platforms. Architect and optimize data storage solutions to ensure reliability, security, and scalability. Leverage generative AI tools and models to enhance data engineering workflows, drive automation, and improve insight generation. Collaborate with cross-functional teams (Data Scientists, Analysts, and Engineers) to understand and deliver on data requirements.
Develop and enforce data quality standards, governance policies, and monitoring systems to ensure data integrity. Create and maintain comprehensive documentation for data systems, workflows, and models. Implement data modeling best practices and optimize data retrieval processes for better performance. Stay up-to-date with emerging technologies and bring innovative solutions to the team. Competencies & Skills: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. 6–8 years of experience in data engineering, designing and managing large-scale data systems. Advanced knowledge of Database Management Systems and ETL/ELT processes. Expertise in data modeling, data quality, and data governance. Proficiency in Python programming, version control systems (Git), and data pipeline orchestration tools. Familiarity with AI/ML technologies and their application in data engineering. Strong problem-solving and analytical skills, with the ability to troubleshoot complex data issues. Excellent communication skills, with the ability to explain technical concepts to non-technical stakeholders. Ability to work independently, lead projects, and mentor junior team members. Commitment to staying current with emerging technologies, trends, and best practices in the data engineering domain. Technology Stacks: Strong expertise in database technologies, including:
- SQL Databases: PostgreSQL, MySQL, SQL Server
- NoSQL Databases: MongoDB, Cassandra
- Data Warehouse / Unified Platforms: Snowflake, Redshift, BigQuery, Microsoft Fabric
Hands-on experience implementing and working with generative AI tools and models in production workflows. Proficiency in Python and SQL, with experience in data processing frameworks (e.g., Pandas, PySpark). Experience with ETL tools (e.g., Apache Airflow, MS Fabric, Informatica, Talend) and data pipeline orchestration platforms.
Strong understanding of data architecture, data modeling, and data governance principles. Experience with cloud platforms (preferably Azure) and associated data services. Our EVP (Employee Value Proposition): 5 days working; 24 total Earned & Casual leaves and 8 public holidays; Compensatory Off; Personal Development Allowances; opportunity to work with USA clients; career progression and Learning & Development; Loyalty Bonus benefits; Medical Reimbursement; standard salary as per market norms; magnificent and dynamic culture.
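The pipeline orchestration tools this posting lists (e.g., Apache Airflow) fundamentally schedule tasks by dependency order. As a minimal illustration (not Airflow's API), the sketch below topologically sorts a toy extract-load-transform DAG using Python's standard library; task names are invented.

```python
from graphlib import TopologicalSorter

# task -> set of upstream tasks it depends on (illustrative pipeline steps)
dag = {
    "extract": set(),
    "load_raw": {"extract"},
    "transform": {"load_raw"},
    "publish": {"transform"},
}

# static_order() yields tasks so that every task runs after its dependencies.
order = list(TopologicalSorter(dag).static_order())
print(order)  # ['extract', 'load_raw', 'transform', 'publish']
```

An orchestrator layers scheduling, retries, and backfills on top, but a valid topological order of the DAG is the starting point.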

Posted 1 week ago

Apply

8.0 - 13.0 years

0 Lacs

India

On-site

Oracle Database Developer Are you a seasoned Oracle Database Developer seeking a new opportunity? We are looking for a skilled professional to join our team and take on a pivotal role in our projects. Role Overview: Join us as an Oracle Database Developer, where you will be instrumental in our technical projects. Your duties will involve analyzing technical requirements, implementing code changes, conducting unit testing, overseeing integration testing and UAT, and ensuring smooth production releases. You will also be the go-to person for L3 support, efficiently addressing any production defects. Key Responsibilities: - Design, develop, test, deploy, and enhance software solutions. - Effectively manage project priorities, deadlines, and deliverables. - Engage with stakeholders across various regions. - Collaborate with technical teams, conduct technology research, and assess software requirements. - Keep abreast of emerging technologies, propose architectural enhancements, and develop prototypes or core features as needed. - Contribute to defining the technical direction to enhance user experience. Skills and Qualifications: - 8-13 years of overall experience. - Proficiency in Oracle SQL, PL/SQL, Unix, and Control-M. - Familiarity with Informatica and Python is advantageous. - Knowledge of DevOps tools like Chef, Puppet, Docker, Jenkins, etc. - Experience with SDLC tools such as Sonar, JIRA, Confluence, Git/BitBucket, etc. - Background in the Banking/Wealth Management domain and Agile environments. - Strong ability to design and document requirements in agile setups. - Self-motivated, organized, and adept at managing complex environments. - Team player with a focus on collaboration and information sharing. - Solution-oriented mindset with a drive for quality and excellence. - Fluent in English; additional languages are beneficial. - Proficient in Excel, PowerPoint, Visio, etc. - Capability to work effectively in a global, matrixed organization.
- Skilled in executing projects using agile methodologies.

Posted 1 week ago

Apply

9.0 years

0 Lacs

Gurugram, Haryana, India

On-site

About the Company As a Solution Architect, you are part of the Enterprise Architecture team, focusing on driving innovation, improving delivery effectiveness, and enhancing overall business agility. About the Role You will be responsible for building hybrid solutions with seamless integrations across multiple IT components on-premises and on-cloud. You bring deep technical expertise in designing and delivering end-to-end high-performance, scalable, and flexible solutions using cutting-edge and emerging technologies, including Cloud Computing (IaaS, PaaS, and Containerization), Mobility/Responsive Web apps, APIs, microservices, data integration, and DevOps. You work under the direction of the Senior Solution Architect and Enterprise Architect to co-create solutions with the rest of the IT delivery teams. Responsibilities Work closely with various business partners and other departments to create future-proof solutions covering digital, automation, APIs, integration, and data. Provide technical expertise in solving and troubleshooting performance and other non-functional requirements. Design integrations and patterns in accordance with architectural standards and drive changes to standards, policies, and procedures based on input from service managers and service partners. Support critical projects in all phases of delivery on a need basis. Review Application, Integration, and Solution Architecture to analyze the current IT ecosystem and develop opportunities for improvements. Experiment, explore, and demonstrate the application of new technologies by conducting quick prototypes to solve business problems. Maintain good technical relationships with partners' (banca/broker/aggregator/vendor) technical teams. Other Responsibilities Defining and reviewing continuous delivery, continuous integration, and continuous testing (DevOps pipelines) that serve the purpose of provisioning and quality code delivery. Stakeholder management at strategic levels in both technical and business functions.
Focus on continuous service improvement and thought leadership, and drive strategic initiatives to help the business and partners achieve their goals. Measures of Success Alignment of the IT landscape to the overall vision and blueprints. Faster, better, and cheaper delivery of applications via technology. Exceptional user (internal/external) experience, automation, and operational efficiency through the adoption of cutting-edge technology solutions to solve business problems. Trusted partnership with other departments of IT and the business. Stay up to date with emerging technologies, industry trends, and best practices. Qualifications B.E / MCA from a reputed institute. 9+ years of designing solution and technical architectures with a hands-on software development background. 7+ years of experience in application migration to cloud (e.g., AWS, Azure), building JSON/REST API integrations, and Application and Data Integration Stacks - IIB, API Connect, DataPower, Informatica, AWS services. Should have architected and delivered at least 2-3 hybrid projects in a technology organization. Experience in presenting solutions architecture to peers and senior management. Should be working or should have worked with reputed organizations or startups. Required Skills Extensive experience in areas of technologies like Node JS/Java frameworks, databases, queues, streams, AWS cloud serverless, and containers. At least 2 years of designing hybrid cloud applications and migrating existing workloads to the cloud. Able to recommend the appropriate AWS service based on data, compute, database, or security requirements. Demonstrated competency with application and data integration platforms (MuleSoft, Apigee, EDW, etc.) and patterns - SOA, APIs, Web services, Microservices, ETL, Event Processing, BPM, ESB. Understanding of the BFSI domain, Open Architecture, and application integration discipline, concepts, and best practices.
Define cost control mechanisms by suggesting architectural changes and optimizing resource utilization. Any prior experience with implementing AI and Data Analytics will be a plus.

Posted 1 week ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Our client is a prominent Indian multinational corporation specializing in information technology (IT), consulting, and business process services. It is headquartered in Bengaluru, has gross revenue of ₹222.1 billion and a global workforce of 234,054, is listed on NASDAQ, operates in over 60 countries, and serves clients across various industries, including financial services, healthcare, manufacturing, retail, and telecommunications. The company consolidated its cloud, data, analytics, AI, and related businesses under the tech services business line. Major delivery centers in India include Chennai, Pune, Hyderabad, Bengaluru, Kochi, Kolkata, and Noida. Job Title: Test Data Manager · Location: Hyderabad (Hybrid) · Experience: 5+ yrs · Job Type: Contract to hire · Notice Period: Immediate joiners. Mandatory Skills: 5 years' experience working as a data quality engineer. Designs, builds, and maintains quality assurance processes in support of data analytics, integration, and reporting initiatives. Develops data integration tests to ensure data meets its intended use. Works with business and technology to understand data requests and build or enhance processes and procedures to meet expected outcomes. Works with data integration QA tools such as Informatica, SSIS, SQL, SSRS, Power BI, etc.
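A data integration test of the kind this role builds often reduces to source-to-target reconciliation. The following is a tool-agnostic, illustrative Python sketch; the row data and function name are invented.

```python
def reconcile(source_rows, target_rows, key):
    """Compare source and target row sets on a key column.

    Reports keys missing from the target and keys whose row contents differ.
    Illustrative only; real suites would also compare counts, sums, and types.
    """
    src = {r[key]: r for r in source_rows}
    tgt = {r[key]: r for r in target_rows}
    missing = sorted(set(src) - set(tgt))  # rows that never arrived
    mismatched = sorted(k for k in set(src) & set(tgt) if src[k] != tgt[k])
    return {"missing_in_target": missing, "mismatched": mismatched}

# Hypothetical extract of source and loaded target data.
source = [{"id": 1, "amt": 10}, {"id": 2, "amt": 20}, {"id": 3, "amt": 30}]
target = [{"id": 1, "amt": 10}, {"id": 2, "amt": 25}]
print(reconcile(source, target, "id"))
# {'missing_in_target': [3], 'mismatched': [2]}
```

The same comparison is usually expressed in SQL (e.g., a full outer join on the key) when both sides live in a database.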

Posted 1 week ago

Apply

4.0 years

0 Lacs

India

On-site

• Strong hands-on experience with SQL Server, including advanced SQL development, indexing, and query performance optimization.
• 4+ years of experience in Informatica PowerCenter and/or Informatica IDMC (Cloud).
• Good understanding of the Snowflake cloud data platform (basic to intermediate skills).
• Expertise in data analysis, profiling, and validation across large datasets.
• Strong SQL and ETL debugging skills to troubleshoot and resolve issues quickly.
• Deep understanding of data warehousing concepts, data marts, and dimensional modelling (e.g., fact/dimension tables).
• Proven ability to conduct root cause analysis for complex data discrepancies.
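Indexing and query performance work like that described above usually starts with inspecting the query plan. As a hedged illustration, the sketch below uses SQLite's EXPLAIN QUERY PLAN from Python (SQL Server exposes execution plans differently); the table and index names are invented.

```python
import sqlite3

# Illustrative check that an equality filter uses an index rather than a
# full table scan. SQLite stands in for SQL Server here; the idea carries
# over even though the plan syntax differs.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE fact_sales (id INTEGER PRIMARY KEY, cust_id INTEGER, amt REAL)")
cur.execute("CREATE INDEX ix_sales_cust ON fact_sales(cust_id)")

plan = cur.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amt) FROM fact_sales WHERE cust_id = ?", (42,)
).fetchall()
# The plan detail should mention ix_sales_cust (an index search, not a scan).
print(plan[0][-1])
```

Dropping the index and rerunning the EXPLAIN shows the plan fall back to a table scan, which is the usual before/after evidence in a tuning exercise.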

Posted 1 week ago

Apply

7.0 years

0 Lacs

Bengaluru, Karnataka

Remote

Location: Bengaluru, Karnataka, India Job ID: R0098148 Date Posted: 2025-07-08 Company Name: HITACHI ENERGY INDIA LIMITED Profession (Job Category): Quality Management Job Schedule: Full time Remote: No Job Description: The opportunity: The Quality Assurance (QA) and Control Manager will oversee the planning, coordination, and execution of QA activities for a large-scale SAP ERP setup. This role ensures that the SAP Center of Expertise (CoE) meets internal quality standards, industry best practices, and business requirements. The manager will also be responsible for designing and managing governance frameworks to monitor process improvements and maintain long-term operational excellence in ERP-enabled processes aligned to the strategic objectives of the SAP CoE. How you’ll make an impact: Define and implement a comprehensive quality assurance strategy and plan specific to service management (defect/incident management and related interfaces), specification and development of new functionality, project management, and operations. Develop and enforce quality standards, testing protocols, and documentation procedures across SAP modules (e.g., FI/CO, MM, SD, PP). Conduct quality gate reviews on SAP CoE projects. Monitor deliverables from SAP consultants, developers, and business stakeholders to ensure they meet agreed-upon quality criteria. Provide special input when reviewing testing procedures and the development and execution of testing strategies, including Unit Testing, Integration Testing, User Acceptance Testing (UAT), and Regression Testing. Ensure a qualitative process in defect management. Establish control mechanisms to ensure that implemented ERP processes are compliant with internal policies and external regulations (e.g., SOX, GDPR). Work closely with BU/FU leads and business process owners to align SAP processes with organizational objectives and continuous improvement efforts.
Define KPIs and dashboards to monitor process adherence and performance post-implementation. Implement and drive continuous improvements in the SAP CoE. Maintain the quality document management system. Identify, document, and manage quality-related risks. Conduct root cause analysis for defects or process failures and ensure corrective/preventive actions are implemented. Conduct periodic process audits and implement corrective actions. Ensure process compliance through effective documentation and process traceability. Provide regular QA status reports to management and steering committees. Facilitate workshops and meetings with functional teams to ensure quality awareness and continuous engagement. Act as a point of contact for QA/QC-related issues and escalate critical quality risks appropriately. Responsible for ensuring compliance with applicable external and internal regulations, procedures, and guidelines. Living Hitachi Energy’s core values of safety and integrity, which means taking responsibility for your own actions while caring for your colleagues and the business. Your background: Bachelor’s or master’s degree in information technology, engineering, or a related field. 15+ years of experience in large-scale SAP ERP implementation, with at least 7+ years in quality assurance/control on SAP/ERP projects. Strong understanding of SAP modules and implementation methodologies (e.g., SAP Activate, ASAP, ADO, Panaya). Certification in Quality Management (e.g., Six Sigma, ISO 9001) and SAP Quality Assurance. Knowledge of data tools (Syniti, Informatica, SAP Data Intelligence) and testing tools (Worksoft, Tricentis, Selenium, etc.). Proven experience in enterprise process design, process mapping, and control frameworks. Proficiency in both spoken and written English is required. Qualified individuals with a disability may request a reasonable accommodation if you are unable or limited in your ability to use or access the Hitachi Energy career site as a result of your disability.
You may request reasonable accommodations by completing a general inquiry form on our website. Please include your contact information and specific details about your required accommodation to support you during the job application process. This is solely for job seekers with disabilities requiring accessibility assistance or an accommodation in the job application process. Messages left for other purposes will not receive a response.

Posted 1 week ago

Apply

8.0 years

0 Lacs

India

Remote

About Company Papigen is a fast-growing global technology services company, delivering innovative digital solutions through deep industry experience and cutting-edge expertise. We specialize in technology transformation, enterprise modernization, and dynamic areas like Cloud, Big Data, Java, React, DevOps, and more. Our client-centric approach combines consulting, engineering, and data science to help businesses evolve and scale efficiently. About The Role We are seeking a skilled and experienced Technical Project Manager to lead complex projects in Identity and Access Management (IAM) and Data Privacy domains. This role requires a strong background in Azure Cloud Security , agile project management, and cross-functional stakeholder collaboration. The ideal candidate will oversee multiple concurrent projects, align technology with strategic objectives, and ensure secure, scalable delivery. Key Responsibilities Lead end-to-end project planning, execution, and delivery for security engineering and data privacy initiatives Coordinate Agile ceremonies (Scrum, Sprint Planning, Retrospectives) and ensure SAFe Agile compliance Manage cross-functional teams and align onsite-offshore delivery efforts Design and implement IAM solutions using technologies such as PlainID, Microsoft Entra ID, SailPoint, and AWS IAM Oversee integration of IAM and Data Privacy tools like OneTrust, Microsoft Priva, Informatica DPM, and Collibra Create project documentation, architecture diagrams, risk assessments, and compliance reports Support CI/CD best practices using Azure DevOps Pipelines Enforce data governance policies and access control frameworks Provide technical leadership across solution design and architecture reviews Monitor delivery timelines, budgets, and risk mitigation plans Conduct performance audits, security assessments, and gap analyses Required Skills & Experience Master’s degree in Computer Science, IT, or related field 8+ years of project management experience (including 
5+ in IAM/security/data privacy projects) Strong Agile/SAFe project delivery and stakeholder management experience Hands-on experience with IAM tools (e.g., PlainID, Microsoft Entra ID, SailPoint, Okta) Experience with Azure Cloud Security practices and infrastructure Familiarity with Data Privacy platforms like OneTrust, Informatica DPM, Microsoft Priva Proven experience in reference architecture design and secure policy enforcement Excellent communication and leadership skills with the ability to manage global delivery teams Familiarity with Azure DevOps for CI/CD, workflow automation, and reporting Benefits & Perks Opportunity to work with leading global clients Flexible work arrangements with remote options Exposure to modern technology stacks and tools Supportive and collaborative team environment Continuous learning and career development opportunities Skills: Microsoft Priva, PlainID, CI/CD, Azure Cloud Security, OneTrust, Technical Project Manager, Okta, Azure Cloud, Agile Project Management, Informatica DPM, Data Privacy, SailPoint, Microsoft Entra ID, Security Engineering, Project Delivery, Collibra, Agile, Access Governance, Azure DevOps, IAM

Posted 1 week ago

Apply

8.0 years

0 Lacs

Itanagar, Arunachal Pradesh, India

Remote

Job Title: Freelance Data Pipeline Engineer (ETL, OLAP, Healthcare: HL7, FHIR)
Employment Type: Full-Time Freelancer (Independent Contractor)
Work Mode: Remote / Permanent Work From Home
Work Schedule: Monday to Friday, 8:00 PM to 5:00 AM IST (Night Shift)
Compensation: 95K-100K per month (subject to applicable TDS deductions)
Contract Duration: Initial three-month engagement, extendable based on performance and project requirements
Required Skills & Qualifications: 8+ years of experience in data engineering, ETL development, and pipeline automation. Strong understanding of data warehousing and OLAP concepts. Proven expertise in handling healthcare data using HL7, FHIR, and CCD/C-CDA. Proficiency in SQL and scripting languages such as Python or Bash. Experience with data integration tools (e.g., Apache NiFi, Talend, Informatica, SSIS). Familiarity with cloud data services (AWS, Azure, or GCP) is a plus. Strong knowledge of data quality assurance, data profiling, and data governance. Bachelor's degree in Computer Science, Information Systems, or a related field (Master's preferred). For more details, kindly share your resume: (ref:hirist.tech)
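As a flavor of the HL7 work this role involves, the sketch below splits an HL7 v2 message into segments and fields on the standard delimiters. It is deliberately minimal and illustrative: a real pipeline would use a proper HL7/FHIR library and handle components, repetitions, and escape sequences, and the sample message is invented.

```python
# Minimal HL7 v2 parsing sketch: segments are separated by carriage returns,
# fields by the pipe character. Illustrative only.
msg = ("MSH|^~\\&|LAB|HOSP|EHR|CLINIC|202401010800||ADT^A01|12345|P|2.5\r"
       "PID|1||MRN001||DOE^JANE")

def parse_hl7(message):
    """Return {segment_name: [field0, field1, ...]} for each segment."""
    segments = {}
    for segment in message.split("\r"):
        fields = segment.split("|")
        segments[fields[0]] = fields
    return segments

seg = parse_hl7(msg)
print(seg["PID"][3])  # MRN001, the patient identifier in this sample
```

Note that HL7's own field numbering treats the MSH field separator specially, which is one of several details a real library (rather than a split on pipes) handles for you.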

Posted 1 week ago

Apply

10.0 years

0 Lacs

Pune, Maharashtra, India

Remote

JOB_POSTING-3-72216-5 Job Description Role Title: VP, Data Engineering Tech Lead (L12) Company Overview: Synchrony (NYSE: SYF) is a premier consumer financial services company delivering one of the industry’s most complete digitally enabled product suites. Our experience, expertise and scale encompass a broad spectrum of industries including digital, health and wellness, retail, telecommunications, home, auto, outdoors, pet and more. We have recently been ranked #5 among India’s Best Companies to Work for 2023, #21 under LinkedIn Top Companies in India list, and received Top 25 BFSI recognition from Great Place To Work India. We have been ranked Top 5 among India’s Best Workplaces in Diversity, Equity, and Inclusion, and Top 10 among India’s Best Workplaces for Women in 2022. We offer 100% Work from Home flexibility for all our Functional employees and provide some of the best-in-class Employee Benefits and Programs catering to work-life balance and overall well-being. In addition to this, we also have Regional Engagement Hubs across India and a co-working space in Bangalore. Organizational Overview: This role will be part of the Data Architecture & Analytics group within the CTO organization. The Data team is responsible for designing and developing scalable data pipelines for efficient data ingestion, transformation, and loading (ETL); collaborating with cross-functional teams to integrate new data sources and ensure data quality and consistency; and building and maintaining data models to facilitate data access and analysis by Data Scientists and Analysts. The team is also responsible for the SYF public cloud platform & services: governing the health, performance, capacity, and costs of resources, ensuring adherence to service levels, and building well-defined processes for cloud application development and service enablement.
Role Summary/Purpose We are seeking a highly skilled Cloud Technical Lead with expertise in Data Engineering who will work in multi-disciplinary environments harnessing data to provide valuable impact for our clients. The Cloud Technical Lead will work closely with technology and functional teams to drive migration of legacy on-premises data systems/platforms to cloud-based solutions. The successful candidate will need to develop intimate knowledge of SYF key data domains (originations, loan activity, collection, etc.) and maintain a holistic view across SYF functions to minimize redundancies and optimize the analytics environment. Key Responsibilities Manage the end-to-end project lifecycle, including planning, execution, and delivery of cloud-based data engineering projects. Provide guidance on suitable options, designing and creating data pipelines for analytical solutions across data lakes, data warehouses, and cloud implementations. Architect and design robust data pipelines and ETL processes leveraging Ab Initio and Amazon Redshift. Ensure data integration, transformation, and storage processes are optimized for scalability and performance in a cloud environment. Ensure data security, governance, and compliance in the cloud infrastructure. Provide leadership and guidance to data engineering teams, ensuring best practices are followed. Ensure timely delivery of high-quality solutions in an Agile environment. Required Skills/Knowledge Minimum 10+ years of experience with a Bachelor's degree in Computer Science or a similar technical field of study, or, in lieu of a degree, 12+ years of relevant experience Minimum 10+ years of experience in managing large-scale data platform (Data Warehouse/Data Lake/Cloud) environments Minimum 10+ years of financial services experience Minimum 6+ years of experience working with Data Warehouses/Data Lakes/Cloud 6+ years of hands-on programming experience in ETL tools - Ab Initio or Informatica highly preferred.
Ability to read and reverse engineer the logic in Ab Initio graphs.
Hands-on experience with cloud platforms such as S3, Redshift, Snowflake, etc.
Working knowledge of Hive, Spark, Kafka and other data lake technologies.
Strong familiarity with data governance, data lineage, data processes, DML, and data architecture control execution.
Experience analyzing system requirements and implementing migration methods for existing data.
Ability to develop and maintain strong collaborative relationships at all levels across IT and the business.
Excellent written and oral communication skills, along with a strong ability to lead and influence others.
Experience working iteratively in a fast-paced agile environment.
Demonstrated ability to drive change and work effectively across business and geographical boundaries.
Expertise in evaluating technology and solution engineering, with a strong focus on architecture and deployment of new technology.
Superior decision-making, client relationship, and vendor management skills.

Desired Skills/Knowledge
Prior work experience in a credit card/banking/fintech company.
Experience dealing with sensitive data in a highly regulated environment.
Demonstrated implementation of complex and innovative solutions.
Agile experience using JIRA or similar Agile tools.

Eligibility Criteria
Bachelor's degree in Computer Science or a similar technical field of study (Master's degree preferred).
Minimum 12+ years of experience managing large-scale data platform (Data Warehouse/Data Lake/Cloud) environments.
Minimum 12+ years of financial services experience.
Minimum 8+ years of experience working with Oracle Data Warehouses/Data Lakes/Cloud.
8+ years of programming experience in ETL tools; Ab Initio or Informatica highly preferred.
Ability to read and reverse engineer the logic in Ab Initio graphs.
Hands-on experience with cloud platforms such as S3, Redshift, Snowflake, etc.
Rigorous data analysis through SQL in Oracle and various Hadoop technologies.
Involvement in large-scale data analytics migration from on-premises to a public cloud.
Strong familiarity with data governance, data lineage, data processes, DML, and data architecture control execution.
Experience analyzing system requirements and implementing migration methods for existing data.
Excellent written and oral communication skills, along with a strong ability to lead and influence others.
Experience working iteratively in a fast-paced agile environment.

Work Timings: 3:00 PM IST to 12:00 AM IST
(This role qualifies for Enhanced Flexibility and Choice offered in Synchrony India and will require the incumbent to be available between 06:00 AM Eastern Time and 11:30 AM Eastern Time (timings are anchored to US Eastern hours and will adjust twice a year locally). This window is for meetings with India and US teams. The remaining hours are flexible for the employee to choose. Exceptions may apply periodically due to business needs. Please discuss this with the hiring manager for more details.)

For Internal Applicants
Understand the criteria or mandatory skills required for the role before applying.
Inform your manager and HRM before applying for any role on Workday.
Ensure that your professional profile is updated (fields such as education, prior experience, other skills); it is mandatory to upload your updated resume (Word or PDF format).
Must not be on any corrective action plan (First Formal/Final Formal, PIP).
Only L10+ employees who have completed 18 months in the organization and 12 months in their current role and level are eligible to apply.

Level/Grade: 12
Job Family Group: Information Technology
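The migration-validation work described above typically includes source-to-target reconciliation after each load. A minimal sketch of such a check, using in-memory SQLite databases to stand in for the on-premises source and cloud target (the `loans` table and column names are hypothetical, not Synchrony's actual schema):

```python
# Illustrative source-to-target reconciliation sketch (assumed schema, not
# any specific vendor's tooling): compare row counts and a key checksum
# between a "source" and "target" database after a migration load.
import sqlite3

def reconcile(src_conn, tgt_conn, table, key):
    """Return whether row counts and key sums match between source and target."""
    src_count = src_conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    tgt_count = tgt_conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    src_sum = src_conn.execute(f"SELECT COALESCE(SUM({key}), 0) FROM {table}").fetchone()[0]
    tgt_sum = tgt_conn.execute(f"SELECT COALESCE(SUM({key}), 0) FROM {table}").fetchone()[0]
    return {"rows_match": src_count == tgt_count, "keys_match": src_sum == tgt_sum}

# Demo: two in-memory databases standing in for source and target systems.
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
for conn in (src, tgt):
    conn.execute("CREATE TABLE loans (loan_id INTEGER, amount REAL)")
src.executemany("INSERT INTO loans VALUES (?, ?)", [(1, 100.0), (2, 250.0)])
tgt.executemany("INSERT INTO loans VALUES (?, ?)", [(1, 100.0), (2, 250.0)])
result = reconcile(src, tgt, "loans", "loan_id")
print(result)  # {'rows_match': True, 'keys_match': True}
```

In practice this kind of check is run per table with stronger checksums (e.g. hashing all columns), but the count-plus-checksum pattern is the common starting point.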

Posted 1 week ago


10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Position Overview
Job Title: Senior Engineer – Data SQL Engineer
Corporate Title: AVP
Location: Pune, India

Role Description
As a SQL Engineer, you will be responsible for the design, development and optimization of complex database systems. You will write efficient SQL queries and stored procedures, and possess expertise in data modeling, performance optimization and working with large-scale relational databases.

What We'll Offer You
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
Best-in-class leave policy.
Gender-neutral parental leave.
100% reimbursement under the childcare assistance benefit (gender neutral).
Sponsorship for industry-relevant certifications and education.
Employee Assistance Program for you and your family members.
Comprehensive hospitalization insurance for you and your dependents.
Accident and term life insurance.
Complimentary health screening for those 35 years and above.

Your Key Responsibilities
Design, develop and optimize complex SQL queries, stored procedures, views and functions.
Work with large datasets to perform data extraction, transformation and loading (ETL).
Develop and maintain scalable database schemas and models.
Troubleshoot and resolve database-related issues, including performance bottlenecks and data quality concerns.
Maintain data security and compliance with data governance policy.

Your Skills And Experience
10+ years of hands-on experience with SQL in relational databases: SQL Server, Oracle, MySQL, PostgreSQL.
Strong working experience with PL/SQL and T-SQL.
Strong understanding of data modelling, normalization and relational DB design.

Desirable Skills That Will Help You Excel
Ability to write highly performant, heavily resilient queries in Oracle/PostgreSQL/MSSQL.
Working knowledge of database modelling techniques such as Star Schema, Fact-Dimension models and Data Vault.
Awareness of database tuning methods such as AWR reports, indexing, partitioning of data sets, and defining tablespace sizes and user roles.
Hands-on experience with ETL tools: Pentaho, Informatica, or StreamSets.
Good experience in performance tuning, query optimization and indexing.
Hands-on experience with object storage and scheduling tools.
Experience with cloud-based data services such as data lakes, data pipelines, and machine learning platforms.
Experience in GCP, cloud database migration, and hands-on experience with Postgres.

How We'll Support You
Training and development to help you excel in your career.
Coaching and support from experts in your team.
A culture of continuous learning to aid progression.
A range of flexible benefits that you can tailor to suit your needs.

About Us And Our Teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
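The window-function and query-optimization skills this role calls for can be illustrated with a small example. The sketch below uses SQLite (available in Python's standard library, window functions since SQLite 3.25) rather than Oracle or SQL Server, and the `trades` schema is hypothetical:

```python
# Illustrative sketch: "largest row per group" via a window function, a
# common pattern in the kind of analytical SQL this role involves. The
# window form scans the table once, unlike a correlated subquery, which
# matters on large datasets. Schema and data are invented for the example.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (desk TEXT, trade_id INTEGER, notional REAL)")
conn.executemany(
    "INSERT INTO trades VALUES (?, ?, ?)",
    [("rates", 1, 500.0), ("rates", 2, 900.0), ("fx", 3, 300.0), ("fx", 4, 700.0)],
)

# Rank trades within each desk by notional, then keep the top-ranked row.
rows = conn.execute(
    """
    SELECT desk, trade_id, notional FROM (
        SELECT desk, trade_id, notional,
               ROW_NUMBER() OVER (PARTITION BY desk ORDER BY notional DESC) AS rn
        FROM trades
    ) WHERE rn = 1
    ORDER BY desk
    """
).fetchall()
print(rows)  # [('fx', 4, 700.0), ('rates', 2, 900.0)]
```

The same `ROW_NUMBER() OVER (PARTITION BY ...)` pattern carries over directly to Oracle, PostgreSQL and SQL Server, though tuning details (indexes, partitioning) differ per engine.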

Posted 1 week ago
