
235 Snowflake Jobs - Page 2

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

8.0 - 9.0 years

8 - 9 Lacs

Bengaluru, Karnataka, India

On-site

Source: Foundit

Key Responsibilities:
- Lead and manage cross-functional data engineering teams for successful delivery of data initiatives.
- Architect, design, and implement scalable and efficient data pipelines and ETL workflows.
- Develop high-quality, reusable code using Python and SQL for data processing and transformation.
- Build and maintain cloud-based data infrastructure, primarily on AWS and Redshift, with exposure to Snowflake.
- Collaborate with stakeholders to gather requirements and deliver impactful data solutions.
- Maintain and improve the architecture of the data warehouse and production data systems.
- Provide technical leadership, mentorship, and best practices in data engineering.
- Ensure data quality, governance, and security standards are upheld.
- Troubleshoot data-related issues and support data operations as needed.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 8+ years of experience in data engineering or related fields.
- Strong proficiency in Python, SQL, Amazon Redshift, and the AWS ecosystem (e.g., S3, Lambda, Glue).
- Experience with Snowflake or other cloud data warehouse platforms is a strong plus.
- Demonstrated experience in leading teams and managing end-to-end data projects.
- Strong problem-solving skills and experience with complex data architectures.
- Excellent verbal and written communication skills.
- Familiarity with big data tools and frameworks (e.g., Spark, Hadoop) is a plus.
- Ability to work independently as well as collaboratively in a team environment.
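
Below is a brief, hedged sketch of one step from the stack this listing describes: loading a partition of S3 data into Amazon Redshift with a COPY command issued through psycopg2. The cluster host, bucket, table, and IAM role are hypothetical placeholders; the listing does not prescribe this exact pattern.

```python
# Minimal sketch of one ETL step on the Python/SQL/S3/Redshift stack above.
# All names (host, bucket, table, IAM role) are hypothetical placeholders.
import psycopg2

def load_events(ds: str) -> None:
    conn = psycopg2.connect(
        host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",  # placeholder
        port=5439, dbname="analytics", user="etl_user", password="***",
    )
    copy_sql = f"""
        COPY analytics.events
        FROM 's3://example-bucket/events/ds={ds}/'
        IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy-role'
        FORMAT AS PARQUET;
    """
    with conn, conn.cursor() as cur:
        cur.execute(copy_sql)  # Redshift ingests the partition in parallel
    conn.close()

if __name__ == "__main__":
    load_events("2024-01-01")
```

A scheduled job (Lambda, Glue, or an orchestrator) would call a function like this once per partition; the COPY statement keeps the heavy lifting inside the warehouse.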

Posted 3 days ago

Apply

8.0 - 9.0 years

8 - 9 Lacs

Hyderabad, Telangana, India

On-site

Source: Foundit

Key Responsibilities:
- Lead and manage cross-functional data engineering teams for successful delivery of data initiatives.
- Architect, design, and implement scalable and efficient data pipelines and ETL workflows.
- Develop high-quality, reusable code using Python and SQL for data processing and transformation.
- Build and maintain cloud-based data infrastructure, primarily on AWS and Redshift, with exposure to Snowflake.
- Collaborate with stakeholders to gather requirements and deliver impactful data solutions.
- Maintain and improve the architecture of the data warehouse and production data systems.
- Provide technical leadership, mentorship, and best practices in data engineering.
- Ensure data quality, governance, and security standards are upheld.
- Troubleshoot data-related issues and support data operations as needed.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 8+ years of experience in data engineering or related fields.
- Strong proficiency in Python, SQL, Amazon Redshift, and the AWS ecosystem (e.g., S3, Lambda, Glue).
- Experience with Snowflake or other cloud data warehouse platforms is a strong plus.
- Demonstrated experience in leading teams and managing end-to-end data projects.
- Strong problem-solving skills and experience with complex data architectures.
- Excellent verbal and written communication skills.
- Familiarity with big data tools and frameworks (e.g., Spark, Hadoop) is a plus.
- Ability to work independently as well as collaboratively in a team environment.

Posted 3 days ago

Apply

8.0 - 9.0 years

8 - 9 Lacs

Delhi, India

On-site

Source: Foundit

Key Responsibilities:
- Lead and manage cross-functional data engineering teams for successful delivery of data initiatives.
- Architect, design, and implement scalable and efficient data pipelines and ETL workflows.
- Develop high-quality, reusable code using Python and SQL for data processing and transformation.
- Build and maintain cloud-based data infrastructure, primarily on AWS and Redshift, with exposure to Snowflake.
- Collaborate with stakeholders to gather requirements and deliver impactful data solutions.
- Maintain and improve the architecture of the data warehouse and production data systems.
- Provide technical leadership, mentorship, and best practices in data engineering.
- Ensure data quality, governance, and security standards are upheld.
- Troubleshoot data-related issues and support data operations as needed.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 8+ years of experience in data engineering or related fields.
- Strong proficiency in Python, SQL, Amazon Redshift, and the AWS ecosystem (e.g., S3, Lambda, Glue).
- Experience with Snowflake or other cloud data warehouse platforms is a strong plus.
- Demonstrated experience in leading teams and managing end-to-end data projects.
- Strong problem-solving skills and experience with complex data architectures.
- Excellent verbal and written communication skills.
- Familiarity with big data tools and frameworks (e.g., Spark, Hadoop) is a plus.
- Ability to work independently as well as collaboratively in a team environment.

Posted 3 days ago

Apply

5.0 - 7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: Foundit

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Senior BI DWH SQL Developer to join our team in Hyderabad, Telangana (IN-TG), India (IN).

- 5+ years of experience in Snowflake cloud databases, with a minimum of 3 years of relevant experience.
- Hands-on knowledge of data management concepts (required).
- Hands-on working knowledge of data warehousing and ETL projects.
- Experience with utilities to load data into Snowflake from different/disparate source systems.
- Expertise in writing, rewriting, and dissecting complex SQL queries using multiple join conditions, CASE statements, arithmetic and aggregate functions, and query tuning/optimization techniques (required).
- Advanced SQL skills, including use of derived tables, unions, and multi-table inner/outer joins.
- Able to communicate effectively with customers, peers, and management at all levels in and outside the organization.
- Knowledge of different schemas (Star and Snowflake) to fit reporting, query, and business analysis requirements.
- Experienced in writing shell wrapper scripts to call ETL jobs.
- Excellent team player and quick learner with a positive attitude and self-motivation.
- Excellent communication, interpersonal, problem-solving, and implementation skills.

Tech Skills: Snowflake, SQL, DWH, BI

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at .

NTT DATA endeavors to make its website accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at . This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click . If you'd like more information on your EEO rights under the law, please click . For Pay Transparency information, please click .
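
To make the SQL expectations concrete, here is an illustrative query of the kind described (a derived table, multi-table outer joins, CASE-based aggregates), executed against Snowflake with the official Python connector. Connection parameters and table names are invented for the example.

```python
# Illustrative only: the style of complex SQL this posting asks for,
# run against Snowflake via snowflake-connector-python.
# Account, credentials, and table names are hypothetical placeholders.
import snowflake.connector

SQL = """
SELECT c.region,
       COUNT(DISTINCT o.order_id)                             AS orders,
       SUM(CASE WHEN o.status = 'RETURNED' THEN 1 ELSE 0 END) AS returns,
       AVG(t.order_total)                                     AS avg_total
FROM customers c
LEFT OUTER JOIN orders o ON o.customer_id = c.customer_id
LEFT OUTER JOIN (                      -- derived table: per-order totals
    SELECT order_id, SUM(quantity * unit_price) AS order_total
    FROM order_lines
    GROUP BY order_id
) t ON t.order_id = o.order_id
GROUP BY c.region
ORDER BY orders DESC;
"""

conn = snowflake.connector.connect(
    account="xy12345", user="etl_user", password="***",
    warehouse="ANALYTICS_WH", database="SALES", schema="PUBLIC",
)
for row in conn.cursor().execute(SQL):
    print(row)
conn.close()
```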

Posted 4 days ago

Apply

3.0 - 5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: Foundit

Req ID: 325282

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Snowflake DBT Developer to join our team in Hyderabad, Telangana (IN-TG), India (IN).

Snowflake and Data Vault 2 (optional) Consultant:
- Extensive expertise in DBT, including macros, modeling, and automation techniques.
- Proficiency in SQL, Python, or other scripting languages for automation.
- Experience leveraging Snowflake for scalable data solutions.
- Familiarity with Data Vault 2.0 methodologies is an advantage.
- Strong capability in optimizing database performance and managing large datasets.
- Excellent problem-solving and analytical skills.
- Minimum of 3+ years of relevant experience, with a total of 5+ years of overall experience.

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at .

NTT DATA endeavors to make its website accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at . This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click . If you'd like more information on your EEO rights under the law, please click . For Pay Transparency information, please click .
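
A small automation sketch in the spirit of the DBT work described above, using the programmatic entry point dbt Core exposes from version 1.5 onward (an assumption about the project's setup; the model selector and target are likewise hypothetical):

```python
# Hedged sketch: invoking dbt programmatically rather than via the CLI.
# Assumes dbt-core >= 1.5; selector and target names are placeholders.
from dbt.cli.main import dbtRunner, dbtRunnerResult

runner = dbtRunner()

# Run the staging models and everything downstream of them,
# then fail loudly if any model errored.
result: dbtRunnerResult = runner.invoke(
    ["run", "--select", "staging+", "--target", "dev"]
)
if not result.success:
    raise SystemExit(f"dbt run failed: {result.exception}")
```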

Posted 4 days ago

Apply

0.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: Foundit

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose - the relentless pursuit of a world that works better for people - we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Lead Consultant, Java Backend Developer. In this role, you will be responsible for coding, testing, and delivering high-quality deliverables, and should be willing to learn new technologies.

Responsibilities
. Has an in-depth understanding of the full software development life cycle, with deep familiarity with data & business analysis and modern software design & development concepts across a range of languages & platforms.
. Can work within an Agile team to ensure that all key technical requirements are identified, estimated, designed, implemented, and tested.
. Can work on a collaborative cross-technology team where Java, APIs (services), and PL/SQL interact with a range of business logic interfaces & systems.
. Can own small to medium scale strategic initiatives/projects, enhances process standards and best practices, and supports issues during UAT/system rollout phases.
. Understands business needs and priorities and provides thought leadership on potential solutions and the opportunities for technology to create positive impact on the business.
. Has excellent analytical and problem-solving skills coupled with strong communication; anticipates issues and deals with them proactively.
. Is team oriented, highly collaborative, and works effectively to build strong long-term partnerships with stakeholders at all levels of the organization and across a variety of business and IT functions.
. Is self-sufficient and shows the ability to lead, given the opportunity.
. Demonstrates a passion for technology innovation balanced with a pragmatic approach to developing and deploying solutions that best benefit the business.
. Is nimble, adaptable, able to express ideas in meetings & design discussions, comfortable with ambiguity, and able to course-correct when circumstances change.

Qualifications we seek in you!
Minimum Qualifications
. BE/B Tech/MCA
. Excellent written and verbal communication skills

Preferred Qualifications/Skills
. Hands-on experience with Java 11 and the Spring Cloud microservices ecosystem (multi-threading, AWS, data structures, design patterns, OOP practices, etc.)
. Ability to query relational databases like DB2 and Sybase, and cloud data sources like Snowflake
. Spring, SOAP & REST web services working with XML/JSON-based data
. Experience with DevOps tools and Git
. Test-driven development (TDD) and experience working in a disciplined development environment

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. For more information, visit www.genpact.com. Follow us on Twitter, Facebook, LinkedIn, and YouTube. Furthermore, please do note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Posted 4 days ago

Apply

6.0 - 8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: Foundit

Job Description: YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we're a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth - bringing real positive changes in an increasingly virtual world - and it drives us beyond generational gaps and disruptions of the future.

We are looking forward to hiring SnapLogic professionals in the following areas:

Experience: 6 to 8 years
- Configuring and deploying SnapLogic pipelines to integrate data from various sources.
- Troubleshooting and resolving issues related to ETL processes.
- Developing and maintaining user documentation for ETL processes.
- Resource to work from the CAT office 3 days a week.
- Proven 3+ years of experience in building and designing solutions for a data warehouse and in working with large data sets.
- 3-4 years of development experience in building SnapLogic pipelines, error handling, scheduling tasks & alerts.
- Analyze & translate functional specifications/user stories into technical specifications.
- Performs a senior developer role in end-to-end SnapLogic implementations.
- Strong database knowledge, i.e., RDBMS Oracle/PLSQL, Snowflake.
- Proven experience with cloud data storage and access using Snowflake/S3.
- Experienced in business interfacing; possesses a strong data background and a good understanding of requirements analysis and design.
- Data movement and ETL experience.
- Experience with AWS/Azure cloud environment development and deployment.
- Knowledge of APIs and any scripting language is a plus.

Note: The resource should be able to provide technical guidance and mentorship to development teams and team leads, review and optimize existing pipelines for performance and efficiency, and collaborate with stakeholders to understand business requirements and turn them into technical solutions.

At YASH, you are empowered to create a career that will take you to where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence aided with technology for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; and stable employment with a great atmosphere and ethical corporate culture.

Posted 4 days ago

Apply

4.0 - 6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: Foundit

Job Description: YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we're a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth - bringing real positive changes in an increasingly virtual world - and it drives us beyond generational gaps and disruptions of the future.

We are looking forward to hiring Snowflake professionals in the following areas:

Senior Snowflake Developer

Job description:
- Responsible for designing and implementing data pipelines, ETL processes, and data modeling in Snowflake.
- Responsible for translating business requirements into ELT pipelines using data replication tools and data transformation tools (such as DBT) or advanced SQL scripting (views, Snowflake stored procedures, UDFs).
- Deep understanding of Snowflake architecture and processing.
- Experience with performance tuning of a Snowflake data warehouse; experience working with Snowflake functions; hands-on experience with Snowflake utilities, stage and file upload features, time travel, and fail-safe.
- Responsible for development, deployment, code reviews, and production support.
- Maintain and implement best practices for Snowflake infrastructure.
- Hands-on in complex SQL and parsing complex data sets.

Primary Skills:
- Must have 4 to 6 years in IT, 3+ years working as a Snowflake Developer, and 5+ years in data warehouse, ETL, and BI projects.
- Must have experience in at least one complex implementation of a Snowflake data warehouse, plus hands-on DBT experience.
- Expertise in Snowflake data modeling, ELT using Snowflake SQL or modern data replication tools, Snowflake stored procedures/UDFs/advanced SQL scripting, and standard data lake/data warehouse concepts.
- Expertise in Snowflake advanced concepts like setting up resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, zero-copy clone, and time travel.
- Expertise in deploying Snowflake features such as data sharing, events, and lake-house patterns.
- Hands-on experience with Snowflake utilities, SnowSQL, SnowPipe, and big data modeling techniques.
- Deep understanding of relational data stores, methods, and approaches (star and snowflake, dimensional modeling).
- Hands-on experience with DBT Core or DBT Cloud, including dev and prod deployment using CI/CD (Bitbucket), is a plus.
- Should be able to develop and maintain documentation of the data architecture, data flow, and data models of the data warehouse.
- Good communication skills.
- Python and API experience is a plus.

At YASH, you are empowered to create a career that will take you to where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence aided with technology for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; and stable employment with a great atmosphere and ethical corporate culture.
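
Two of the Snowflake features named above, zero-copy clone and time travel, are compact enough to show directly. This is a hedged sketch using snowflake-connector-python; all account and object names are placeholders.

```python
# Hedged sketch of zero-copy cloning and time travel in Snowflake.
# Account, credentials, and table names are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345", user="dev_user", password="***",
    warehouse="DEV_WH", database="DWH", schema="CORE",
)
cur = conn.cursor()

# Zero-copy clone: an instant, storage-free copy of a table for testing.
cur.execute("CREATE OR REPLACE TABLE orders_dev CLONE orders;")

# Time travel: query the table as it looked one hour ago.
cur.execute("SELECT COUNT(*) FROM orders AT(OFFSET => -3600);")
print(cur.fetchone())

conn.close()
```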

Posted 4 days ago

Apply

0.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: Foundit

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose - the relentless pursuit of a world that works better for people - we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Lead Consultant - Snowflake!

Responsibilities:
. Ability to design and implement effective analytics solutions and models with Snowflake.
. Hands-on experience in Snowflake SQL: writing SQL queries against Snowflake and developing scripts (Unix, Python, etc.) to extract, load, and transform data.
. Hands-on experience with Snowflake utilities such as SnowSQL, SnowPipe, Tasks, Streams, Time Travel, Optimizer, Metadata Manager, data sharing, and stored procedures.
. Should be able to implement Snowpipe and stage and file upload to a Snowflake database.
. Hands-on experience with any RDBMS/NoSQL database, with strong SQL writing skills.
. In-depth understanding of Data Warehouse/ODS, ETL concepts, and modeling structure principles.
. Experience in data warehousing - OLTP, OLAP, dimensions, facts, and data modeling.
. Hands-on experience with Azure Blob.

Qualifications we seek in you!
Minimum Qualifications/Skills
. SnowSQL, SnowPipe, Tasks, Streams, Time Travel.
. Certified SnowPro Core.
. Good understanding of data warehousing & reporting tools.
. Able to work on own initiative and as a team player.
. Good organizational skills with cultural awareness and sensitivity.
. Education: ME/M.Tech./MS (Engg/Sciences) or BE/BTech (Engineering).
. Industry: Manufacturing/Industrial.

Behavioral Requirements:
. Lives the client's core values of courage and curiosity to deliver the best business solutions for EL-Business.
. Ability to work in diversified teams, convey messages and ideas clearly to users and project members, and listen, understand, appreciate, and appropriately respond to users.
. Excellent team player with strong oral and written communication skills.
. Possesses strong time management skills.
. Keeps up to date and informed on the client technology landscape and planned or ad-hoc changes to the client IS strategy.

Preferred Skills/Qualifications
. Azure storage services such as Blob, Data Lake, Cosmos DB, and SQL Server.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Furthermore, please do note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
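
The stage-and-load flow mentioned above (file upload to a stage, then COPY INTO a table) can be sketched as follows with snowflake-connector-python. The file path, stage, and table names are hypothetical; a Snowpipe would execute an equivalent COPY automatically as files arrive.

```python
# Hedged sketch of Snowflake stage + file upload + COPY INTO.
# Account, credentials, file path, and table names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345", user="loader", password="***",
    warehouse="LOAD_WH", database="RAW", schema="LANDING",
)
cur = conn.cursor()

# Upload a local CSV to the table's internal stage (compressed automatically).
cur.execute("PUT file:///tmp/sales_2024.csv @%sales AUTO_COMPRESS=TRUE;")

# Load the staged file; a Snowpipe would run an equivalent COPY on arrival.
cur.execute("""
    COPY INTO sales
    FROM @%sales
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    ON_ERROR = 'ABORT_STATEMENT';
""")
conn.close()
```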

Posted 4 days ago

Apply

0.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: Foundit

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose - the relentless pursuit of a world that works better for people - we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Senior Principal Consultant, AI/ML Engineer! In this role, we are looking for candidates who have relevant years of experience in designing and developing machine learning and deep learning systems, who have professional software development experience, and who are hands-on in running machine learning tests and experiments and implementing appropriate ML algorithms.

Responsibilities
. Drive the vision for a modern data and analytics platform to deliver well-architected and engineered data and analytics products leveraging a cloud tech stack and third-party products.
. Close the gap between ML research and production to create ground-breaking new products and features and solve problems for our customers.
. Design, develop, test, and deploy data pipelines, machine learning infrastructure, and client-facing products and services.
. Build and implement machine learning models and prototype solutions for proof-of-concept.
. Scale existing ML models into production on a variety of cloud platforms.
. Analyze and resolve architectural problems, working closely with engineering, data science, and operations teams.

Qualifications we seek in you!
Minimum Qualifications/Skills
. Bachelor's degree in computer science engineering or information technology, or BSc in Computer Science, Mathematics, or a similar field; Master's degree is a plus.
. Integration - APIs, micro-services, and ETL/ELT patterns.
. DevOps (good to have) - Ansible, Jenkins, ELK.
. Containerization - Docker, Kubernetes, etc.
. Orchestration - Airflow, Step Functions, Ctrl-M, etc.
. Languages and scripting - Python, Scala, Java, etc.
. Cloud services - AWS, GCP, Azure, and cloud-native.
. Analytics and ML tooling - SageMaker, ML Studio.
. Execution paradigm - low latency/streaming, batch.

Preferred Qualifications/Skills
. Data platforms - Big Data (Hadoop, Spark, Hive, Kafka, etc.) and Data Warehouse (Teradata, Redshift, BigQuery, Snowflake, etc.).
. Visualization tools - PowerBI, Tableau.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Furthermore, please do note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
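
As a minimal proof-of-concept of the train-and-evaluate loop this role covers, here is a self-contained scikit-learn example on synthetic data; it stands in for the real models and pipelines, which the posting does not specify.

```python
# Minimal, hedged sketch of a train/evaluate loop on synthetic data.
# The dataset, model choice, and metric are stand-ins for illustration only.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=5000, n_features=20, random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=42)

model = GradientBoostingClassifier().fit(X_tr, y_tr)
print("AUC:", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))
```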

Posted 4 days ago

Apply

5.0 - 11.0 years

5 - 11 Lacs

Bengaluru, Karnataka, India

On-site

Source: Foundit

Responsibilities:
- Lead and manage cross-functional teams to ensure successful project delivery.
- Code, design, and manage complex data projects.
- Code and develop solutions using Python, SQL, Redshift, and AWS.
- Utilize experience with Redshift or Snowflake to enhance data solutions.
- Communicate effectively with stakeholders to understand and meet their requirements.
- Implement and manage data processes for the data warehouse and production systems.
- Maintain and update the data engineering architecture.
- Design and implement ETL processes.
- Provide technical leadership in the data space.

Specific Qualifications:
- Proven experience in problem-solving and engineering complex use cases.
- Proficiency in Python, Redshift, SQL, and AWS.
- Experience with Snowflake.
- Strong project management skills and the ability to design complex data projects.
- Ability to lead teams and communicate effectively with stakeholders.
- Experience in developing blueprints for analytics applications using Big Data infrastructure.
- Excellent communication and interpersonal skills.

Posted 4 days ago

Apply

2.0 - 8.0 years

2 - 8 Lacs

Hyderabad, Telangana, India

On-site

Source: Foundit

Let's change the world! We're looking for a skilled Data Engineer to join our Enterprise Data RunOps Team in Hyderabad. In this role, you'll be instrumental in developing, supporting, and optimizing data pipelines and operational workflows that empower our enterprise data teams. You'll ensure seamless data access, integration, and governance across the organization. We're seeking a hands-on engineer with a deep understanding of modern data architectures, strong experience in cloud-native technologies, and a passion for delivering reliable, well-governed, and high-performing data infrastructure within a regulated biotech environment.

Roles & Responsibilities:
- Design, build, and support data ingestion, transformation, and delivery pipelines across structured and unstructured sources within enterprise data engineering.
- Manage and monitor day-to-day operations of the data engineering environment, ensuring high availability, performance, and data integrity.
- Collaborate with data architects, data governance, platform engineering, and business teams to support data integration use cases across R&D, Clinical, Regulatory, and Commercial functions.
- Integrate data from laboratory systems, clinical platforms, regulatory systems, and third-party data sources into enterprise data repositories.
- Implement and maintain metadata capture, data lineage, and data quality checks across pipelines to meet governance and compliance requirements.
- Support real-time and batch data flows using technologies such as Databricks, Kafka, Delta Lake, or similar.
- Work within GxP-aligned environments, ensuring compliance with data privacy, audit, and quality control standards.
- Partner with data stewards and business analysts to support self-service data access, reporting, and analytics enablement.
- Maintain operational documentation, runbooks, and process automation scripts for continuous improvement of data fabric operations.
- Participate in incident resolution and root cause analysis, ensuring timely and effective remediation of data pipeline issues.
- Create documentation, playbooks, and best practices for metadata ingestion, data lineage, and catalog usage.
- Work in an Agile and Scaled Agile (SAFe) environment, collaborating with cross-functional teams, product owners, and Scrum Masters to deliver incremental value.
- Use JIRA, Confluence, and Agile DevOps tools to manage sprints, backlogs, and user stories.
- Support continuous improvement, test automation, and DevOps practices in the data engineering lifecycle.
- Collaborate and communicate effectively with product teams and cross-functional teams to understand business requirements and translate them into technical solutions.

Must-Have Skills:
- Experience building and maintaining data pipelines that ingest and update metadata into enterprise data catalog platforms, preferably in biotech, life sciences, or pharma.
- Hands-on experience with data engineering technologies such as Databricks, PySpark, SparkSQL, Apache Spark, AWS, Python, SQL, and Scaled Agile methodologies.
- Proficiency in workflow orchestration and performance tuning of big data processing.
- 2+ years of experience in data engineering, data operations, or related roles, with at least 2+ years in life sciences, biotech, or pharmaceutical environments.
- Experience with cloud platforms (e.g., AWS, Azure, or GCP) for data pipeline and storage solutions.
- Understanding of data governance frameworks, metadata management, and data lineage tracking.
- Strong problem-solving skills, attention to detail, and ability to manage multiple priorities in a dynamic environment.
- Effective communication and collaboration skills to work across technical and business stakeholders.
- Experience with Scaled Agile Framework (SAFe), Agile delivery practices, and DevOps practices.

Good-to-Have Skills:
- Data engineering experience in the biotechnology or pharma industry.
- Experience writing APIs to make data available to consumers.
- Experience with SQL/NoSQL databases and vector databases for large language models.
- Experience with data modeling and performance tuning for both OLAP and OLTP databases.
- Experience with software engineering best practices, including version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps.

Education and Professional Certifications:
- Master's degree and 3 to 4+ years of Computer Science, IT, or related field experience, OR Bachelor's degree and 5 to 8+ years of Computer Science, IT, or related field experience.
- Preferred: AWS Certified Data Engineer, Databricks certification, and Scaled Agile SAFe certification.

Soft Skills:
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, virtual teams.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Ability to learn quickly, be organized, and detail-oriented.
- Strong presentation and public speaking skills.
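
One plausible shape for the batch ingestion work described above, a PySpark job that applies a simple data-quality gate before appending to a Delta table, is sketched below. Paths and column names are hypothetical, and a Databricks/Delta Lake runtime is assumed.

```python
# Hedged sketch: PySpark batch ingestion with a basic data-quality gate,
# writing to Delta. Paths and columns are placeholders; assumes a
# Databricks or otherwise Delta-enabled Spark runtime.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("lab-results-ingest").getOrCreate()

raw = spark.read.json("s3://example-bucket/raw/lab_results/")

# Quality gate: reject rows missing the keys downstream governance requires.
clean = raw.filter(F.col("sample_id").isNotNull() & F.col("assay").isNotNull())
rejected = raw.count() - clean.count()

(clean.withColumn("ingested_at", F.current_timestamp())
      .write.format("delta")
      .mode("append")
      .save("s3://example-bucket/curated/lab_results/"))

print(f"rows rejected by quality checks: {rejected}")
```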

Posted 5 days ago

Apply

3.0 - 8.0 years

3 - 8 Lacs

Hyderabad, Telangana, India

On-site

Source: Foundit

We are looking for a highly motivated, expert Senior Data Engineer who can own the design and development of complex data pipelines, solutions, and frameworks. The ideal candidate will design, develop, and optimize data pipelines, data integration frameworks, and metadata-driven architectures that enable seamless data access and analytics. This role calls for deep expertise in big data processing, distributed computing, data modeling, and governance frameworks to support self-service analytics, AI-driven insights, and enterprise-wide data management.

Roles & Responsibilities:
- Design, develop, and maintain scalable ETL/ELT pipelines to support structured, semi-structured, and unstructured data processing across the Enterprise Data Fabric.
- Implement real-time and batch data processing solutions, integrating data from multiple sources into a unified, governed data fabric architecture.
- Optimize big data processing frameworks using Apache Spark, Hadoop, or similar distributed computing technologies to ensure high availability and cost efficiency.
- Work with metadata management and data lineage tracking tools to enable enterprise-wide data discovery and governance.
- Ensure data security, compliance, and role-based access control (RBAC) across data environments.
- Optimize query performance, indexing strategies, partitioning, and caching for large-scale data sets.
- Develop CI/CD pipelines for automated data pipeline deployments, version control, and monitoring.
- Implement data virtualization techniques to provide seamless access to data across multiple storage systems.
- Collaborate with cross-functional teams, including data architects, business analysts, and DevOps teams, to align data engineering strategies with enterprise goals.
- Stay up to date with emerging data technologies and best practices, ensuring continuous improvement of Enterprise Data Fabric architectures.

Must-Have Skills:
- Hands-on experience with data engineering technologies such as Databricks, PySpark, SparkSQL, Apache Spark, AWS, Python, SQL, and Scaled Agile methodologies.
- Proficiency in workflow orchestration and performance tuning of big data processing.
- Strong understanding of AWS services.
- Experience with Data Fabric, Data Mesh, or similar enterprise-wide data architectures.
- Ability to quickly learn, adapt, and apply new technologies.
- Strong problem-solving and analytical skills.
- Excellent communication and teamwork skills.
- Experience with Scaled Agile Framework (SAFe), Agile delivery practices, and DevOps practices.

Good-to-Have Skills:
- Deep expertise in the biotech & pharma industries.
- Experience writing APIs to make data available to consumers.
- Experience with SQL/NoSQL databases and vector databases for large language models.
- Experience with data modeling and performance tuning for both OLAP and OLTP databases.
- Experience with software engineering best practices, including version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps.

Education and Professional Certifications:
- Master's degree and 3 to 4+ years of Computer Science, IT, or related field experience, OR Bachelor's degree and 5 to 8+ years of Computer Science, IT, or related field experience.
- AWS Certified Data Engineer preferred; Databricks certification preferred; Scaled Agile SAFe certification preferred.

Soft Skills:
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, virtual teams.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Ability to learn quickly, be organized, and detail-oriented.
- Strong presentation and public speaking skills.
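
Two of the optimizations the posting calls out, partitioning and caching, can be illustrated in PySpark as follows. Table paths and column names are invented for the sketch.

```python
# Hedged sketch of partitioning and caching in PySpark.
# Paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("pipeline-tuning").getOrCreate()

events = spark.read.format("delta").load("s3://example-bucket/events/")

# Cache a small dimension reused by several downstream joins in this job.
dims = spark.read.format("delta").load("s3://example-bucket/dim_site/").cache()

# Repartition on the join key so the shuffle happens once, then write
# partitioned by date so later queries can prune whole directories.
(events.repartition("site_id")
       .join(dims, "site_id")
       .withColumn("event_date", F.to_date("event_ts"))
       .write.format("delta")
       .partitionBy("event_date")
       .mode("overwrite")
       .save("s3://example-bucket/enriched_events/"))
```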

Posted 5 days ago

Apply

4.0 - 12.0 years

4 - 12 Lacs

Hyderabad, Telangana, India

On-site

Source: Foundit

This role involves working with large datasets, developing reports, supporting and executing data initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.

Responsibilities:
- Design, develop, and maintain data solutions for data generation, collection, and processing.
- Be a key team member assisting in the design and development of the data pipeline.
- Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems.
- Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions.
- Take ownership of data pipeline projects from inception to deployment; manage scope, timelines, and risks.
- Collaborate with cross-functional teams to understand data requirements and design solutions that meet business needs.
- Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency.
- Implement data security and privacy measures to protect sensitive data.
- Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions.
- Collaborate and communicate effectively with product teams.
- Collaborate with data architects, business SMEs, and data scientists to design and develop end-to-end data pipelines that meet fast-paced business needs across geographic regions.
- Identify and resolve complex data-related challenges.
- Adhere to best practices for coding, testing, and designing reusable code/components.
- Explore new tools and technologies that will help to improve ETL platform performance.
- Participate in sprint planning meetings and provide estimations on technical implementation.

What we expect of you: Master's degree and 4 to 6 years of Computer Science, IT, or related field experience, OR Bachelor's degree and 6 to 8 years of Computer Science, IT, or related field experience, OR Diploma and 10 to 12 years of Computer Science, IT, or related field experience.

Basic Qualifications:
- Hands-on experience with big data technologies and platforms such as Databricks, Apache Spark (PySpark, SparkSQL), and Snowflake, including workflow orchestration and performance tuning of big data processing.
- Proficiency in data analysis tools (e.g., SQL); proficient in SQL for extracting, transforming, and analyzing complex datasets from relational data stores.
- Excellent problem-solving skills and the ability to work with large, complex datasets.
- Experience with ETL tools such as Apache Spark, and with various Python packages related to data processing and machine learning model development.
- Strong understanding of data modeling, data warehousing, and data integration concepts.
- Proven ability to optimize query performance on big data platforms.

Preferred Qualifications:
- Experience with software engineering best practices, including version control, infrastructure-as-code, CI/CD, and automated testing.
- Knowledge of Python/R, Databricks, SageMaker, and cloud data platforms.
- Strong understanding of data governance frameworks, tools, and best practices.
- Knowledge of data protection regulations and compliance requirements (e.g., GDPR, CCPA).

Professional Certifications:
- AWS Certified Data Engineer preferred.
- Databricks certification preferred.

Soft Skills:
- Excellent critical-thinking and problem-solving skills.
- Strong communication and collaboration skills.
- Demonstrated awareness of how to function in a team setting.
- Demonstrated presentation skills.
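
As one concrete example of the SQL analysis this role involves, here is an illustrative Spark SQL window-function query (top three products per category per day). The data source and columns are hypothetical.

```python
# Illustrative Spark SQL analysis with a window function.
# The parquet path, view name, and columns are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sales-analysis").getOrCreate()
spark.read.parquet("s3://example-bucket/sales/").createOrReplaceTempView("sales")

top = spark.sql("""
    SELECT category, product, sale_date, revenue,
           RANK() OVER (PARTITION BY category, sale_date
                        ORDER BY revenue DESC) AS rnk
    FROM sales
""").filter("rnk <= 3")

top.show()
```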

Posted 5 days ago

Apply

15.0 - 17.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Source: Foundit

Job Description

Remaining Positions: 1

Details: 15+ years of experience required. Requires knowledge of Talend, plus knowledge of other data-related tools such as Databricks or Snowflake.

The Senior Talend Developer/Architect is responsible for leading the design and development of, and managing, the INSEAD data infrastructure for the CRM ecosystem, developing Talend jobs & flows, and acting as a mentor for the other 3-4 Talend developers. This role will be instrumental in driving data pipeline architecture and ensuring data integrity, performance, and scalability using the Talend platform. The role is a key part of the HARMONIA project team while the engagement is active, and is also part of the Harmonia Data Quality project and Data Operations scrum teams. It will contribute to additional activities such as data modelling & design, architecture, and integration, and will help propose technology strategy. The position holder must organize and plan her/his work with special consideration for a frictionless information flow between Digital Solutions and the relevant business department. He/she will collaborate closely with cross-functional teams to deliver high-quality data solutions that support strategic business objectives.

Job Requirements:
- Design, develop, and deploy scalable ETL/ELT solutions using Talend (e.g., Data Stewardship, Management Console, Studio).
- Architect end-to-end data integration workflows.
- Establish development best practices, reusable components, and job templates to optimize performance and maintainability.
- Deliver robust data architecture and tested, validated, deployable jobs/flows to production environments, following Talend best practices and the JIRA development framework.
- Translate business requirements into efficient and scalable Talend solutions, and provide developer input/feedback on those requirements wherever necessary, including actively leading brainstorming sessions arranged by the project manager.
- Work closely with the Manager of Data Operations and Quality, the project manager, business analysts, data analysts, developers, and other subject matter experts to align technical solutions with operational needs.
- Ensure alignment with data governance, security, and compliance standards.
- Ensure that new developments follow the standard styles already present in the current Talend jobs & flows developed by the current integrator and INSEAD teams.
- Actively participate in project-related activities and ensure the SDLC process is followed.
- Participate in the implementation and execution of data cleansing, normalization, deduplication, and transformation projects.
- Conduct performance tuning, error handling, monitoring, and troubleshooting of Talend jobs and environments.
- Contribute to sprint planning and agile ceremonies with the Harmonia project team and Data Operations team.
- Document technical solutions, data flows, and design decisions to support operational transparency.
- Stay current with Talend product enhancements and industry trends, recommending upgrades or changes where appropriate.

No budget responsibility. Personnel responsibility: provide technical mentorship to junior Talend developers and contribute to developing the internal knowledge base (INSEAD and external).

Pay Range: Based on Experience

Posted 6 days ago

Apply

6.0 - 8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: Foundit

Job Description: YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we're a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth - bringing real positive changes in an increasingly virtual world - and it drives us beyond generational gaps and disruptions of the future.

We are looking forward to hiring Snowflake professionals in the following areas:

Experience: 6-8 years

Job Description - Sr Snowflake Developer:
- Design and develop our Snowflake data platform, including data pipeline building, data transformation, and access management.
- Minimum 4+ years of experience in Snowflake; strong in SQL.
- Develop data warehouse and data mart solutions for business teams.
- Accountable for designing robust, scalable database and data extraction, transformation, and loading (ETL) solutions.
- Understand and evaluate business requirements that impact the Caterpillar enterprise.
- Liaise with data creators to support project planning, training, guidance on standards, and the efficient creation/maintenance of high-quality data.
- Contribute to policies, procedures, and standards as well as technical requirements.
- Ensure compliance with the latest data standards supported by the company and with brand, legal, and information security (data security and privacy) requirements.
- Document data models for the domains to be deployed, including a logical data model, candidate source lists, and canonical formats.
- Create, update, and enhance metadata policies, processes, and catalogs.
- Good communication skills and experience interacting with client SMEs.
- Should have the capability to lead a team of 4-5 members.
- Snowflake certification is mandatory.

At YASH, you are empowered to create a career that will take you to where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence aided with technology for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; and stable employment with a great atmosphere and ethical corporate culture.

Posted 6 days ago

Apply

12.0 - 15.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: Foundit

Req ID: 323754

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Data & AI Technical Solution Architect to join our team in Chennai, Tamil Nadu (IN-TN), India (IN).

Job Duties: The Data & AI Architect is a seasoned-level expert responsible for participating in the delivery of multi-technology consulting services to clients by providing strategies and solutions for all aspects of infrastructure and related technology components. This role collaborates with other stakeholders on the development of the architectural approach for one or more layers of a solution. The primary objective is to work on strategic projects that ensure the optimal functioning of the client's technology infrastructure.

Key Responsibilities:
- Hold conversations with the CEO, business owners, and CTO/CDO.
- Break down intricate business challenges, devise effective solutions, and focus on client needs.
- Craft high-level, innovative solution approaches for complex business problems.
- Utilize best practices and creativity to address challenges.
- Leverage market research, formulate perspectives, and communicate insights to clients.
- Establish strong client relationships and interact at appropriate levels to ensure client satisfaction.

Knowledge and Attributes:
- Ability to focus on detail with an understanding of how it impacts the business strategically.
- Excellent client service orientation.
- Ability to work in high-pressure situations.
- Ability to establish and manage processes and practices through collaboration and the understanding of business.
- Ability to create new and repeat business for the organization.
- Ability to contribute information on relevant vertical markets.
- Ability to contribute to the improvement of internal effectiveness by improving current methodologies, processes, and tools.

Academic Qualifications and Certifications:
- BE/BTech or equivalent in Information Technology and/or Business Management or a related field.
- Scaled Agile certification desirable.
- Relevant consulting and technical certifications preferred, for example TOGAF.

Required Experience: 12-15 years
- Seasoned, demonstrable experience in a similar role within a large-scale (preferably multinational) technology services environment.
- Very good understanding of Data, AI, Gen AI, and Agentic AI.
- Must have Data Architecture and solutioning experience; capable of end-to-end Data Architecture and GenAI solution design.
- Must be able to work on Data & AI RFP responses as a Solution Architect.
- 10+ years of experience in solution architecting of Data & Analytics, AI/ML, and Gen AI as a technical architect.
- Develop cloud-native technical approaches and proposal plans identifying best-practice solutions that meet the requirements for a successful proposal; create, edit, and review documents, diagrams, and other artifacts in response to RFPs/RFQs; contribute to and participate in presentations to customers regarding proposed solutions.
- Proficient with Snowflake, Databricks, Azure, AWS, and GCP cloud, and with data engineering & AI tools.
- Experience with large-scale consulting and program execution engagements in AI and data.
- Seasoned multi-technology infrastructure design experience.
- Seasoned, demonstrable expertise coupled with consulting and client engagement experience, including client needs assessment and change management.

Additional Career Level Description:
- Knowledge and application: a seasoned, experienced professional with complete knowledge and understanding of the area of specialization; uses evaluation, judgment, and interpretation to select the right course of action.
- Problem solving: works on problems of diverse scope where analysis of information requires evaluation of identifiable factors; resolves and assesses a wide range of issues in creative ways and suggests variations in approach.
- Interaction: enhances relationships and networks with senior internal/external partners who are not familiar with the subject matter, often requiring persuasion.

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at .

NTT DATA endeavors to make its website accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at . This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click . If you'd like more information on your EEO rights under the law, please click . For Pay Transparency information, please click .

Posted 1 week ago

Apply

5.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: Foundit

Position Title: Product Manager - Healthcare Data & Analytics

About the Role
As Product Manager, you will lead the strategy, execution, and commercialization of innovative data and analytics products for the U.S. healthcare market. This is a highly collaborative role where you'll work cross-functionally with Engineering, Sales, Design, and Delivery teams to build scalable, interoperable solutions that address core challenges across payers and providers. You'll partner with the solution offering manager to deliver on the product vision and roadmap, conduct customer discovery, track success metrics, and ensure timely delivery of high-impact features. This role carries revenue responsibilities and is key to EXL Health's broader growth agenda.

Core Responsibilities

Product Strategy & Leadership
- Develop and own the quarterly roadmap for the healthcare data and analytics solution.
- Manage the product backlog and ensure alignment with evolving client needs, compliance mandates (e.g., CMS, FHIR), and company objectives.
- Translate customer pain points and regulatory changes into innovative data-driven products and services.
- Champion a customer-first approach while ensuring technical feasibility and commercial viability.
- Stay ahead of technology and market trends, especially in AI, value-based care, and care management.
- Collaborate closely with Engineering and Design teams to define and prioritize product requirements.

Client Engagement & Sales Support
- Meet directly with clients to shape strategy, gather feedback, and build trusted relationships.
- Serve as the bridge between client expectations and solution capabilities, ensuring alignment and delivery excellence.

Qualifications

Experience
- Minimum 5-8 years of experience in analytics, data platforms, or product management, preferably within the U.S. healthcare ecosystem.
- At least 3 years in a leadership or client-facing product role, including experience managing end-to-end product development and revenue accountability.
- Proven success in bringing data or analytics products to market, from ideation through launch and iteration.

Healthcare Domain Expertise
- Deep familiarity with U.S. payer or provider environments, including claims, payments, risk adjustment, population health, or care management.
- Working knowledge of regulatory and interoperability standards (e.g., CMS-0057, FHIR, TEFCA).
- Hands-on understanding of how data management, analytics, and AI/ML drive value in clinical or operational workflows.

Technical Skills
Practical experience with or exposure to:
- Cloud platforms: Snowflake, AWS, Azure, GCP
- BI & visualization tools: Tableau, Power BI, Qlik
- ETL/data integration: Informatica, Talend, SSIS, Erwin
- Data science/AI/ML: experience collaborating with data science teams on AI initiatives
- Agile/tools: Jira, Confluence, Asana, Agile/Scrum methodologies

Personal Attributes
- Strategic thinker who can dive deep into execution.
- Exceptional written and verbal communication, with the ability to translate technical concepts for non-technical audiences.
- Strong organizational, problem-solving, and analytical skills.
- Passion for innovation, continuous improvement, and cross-functional collaboration.
- A team-first leader with high emotional intelligence and the ability to mentor others.

Education
Bachelor's or Master's degree in Computer Science, Engineering, Data Science, Statistics, Business, or a related field from a top-tier institution.

Posted 1 week ago

Apply

0.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose - the relentless pursuit of a world that works better for people - we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Principal Consultant, DB ETL Developer

In this role, you will be responsible for coding, testing, and delivering high-quality deliverables, and should be willing to learn new technologies.

Responsibilities
Design, code, and maintain databases, ensuring their stability, reliability, and performance.
Research and suggest new database products, services, and protocols.
Ensure all database programs meet company and performance requirements.
Collaborate with other database teams and owners of different applications.
Modify databases according to requests and perform tests.
Maintain and own databases in all environments.

Qualifications we seek in you!

Minimum Qualifications
BE/B Tech/MCA
Excellent written and verbal communication skills

Preferred Qualifications/Skills
A bachelor's degree in Computer Science or a related field.
Hands-on developer in Sybase, DB2, and ETL technologies.
Worked extensively on data integration, designing and developing reusable interfaces.
Advanced experience in Sybase, shell scripting, Unix, database design and modelling, and ETL technologies such as Informatica.
Hands-on experience with Snowflake or Informatica, including:
Expertise in Snowflake data modelling and ELT using Snowflake SQL, implementing complex stored procedures and best practices aligned with data warehouse and ETL concepts.
Designing, implementing, and testing cloud computing solutions using Snowflake technology.
Expertise in Snowflake advanced concepts such as setting up resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, zero-copy clone, and time travel, and understanding how to use these features (a short sketch of several of these follows below).
Creating, monitoring, and optimizing ETL/ELT processes (Talend, Informatica) and migrating solutions from on-premises to public cloud platforms.
Expert-level understanding of data warehouse and core database concepts and relational database design.
Skilled at writing and editing large, complicated SQL statements.
Experience in writing stored procedures, optimization, and performance tuning.
Strong technology acumen and a deep strategic mindset.
Proven track record of delivering results.
Proven analytical skills and experience making decisions based on hard and soft data.
A desire and openness to learning and continuous improvement, both of yourself and your team members.
Exposure to SDLC tools such as JIRA, Confluence, SVN, TeamCity, Jenkins, Nolio, and Crucible.
Experience with DevOps, CI/CD, and Agile methodology.
Good to have experience with Business Intelligence tools.
Familiarity with Postgres and Python is a plus.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. For more information, visit www.genpact.com. Follow us on Twitter, Facebook, LinkedIn, and YouTube. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
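To make the Snowflake items above concrete, here is a minimal, hedged sketch using the snowflake-connector-python package; the account, warehouse, and object names are placeholders, not details from the role.

```python
# A minimal sketch of the Snowflake features named above (zero-copy
# clone, Time Travel, resource monitors). All names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",    # placeholder
    user="etl_user",         # placeholder
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
cur = conn.cursor()

# Zero-copy clone: an instant, storage-free copy for testing changes.
cur.execute("CREATE OR REPLACE TABLE orders_dev CLONE orders")

# Time Travel: query the table as it looked 15 minutes (900 s) ago.
cur.execute("SELECT COUNT(*) FROM orders AT(OFFSET => -900)")
print("rows 15 minutes ago:", cur.fetchone()[0])

# Resource monitor: cap credits and suspend the warehouse at 100%
# (creating one normally requires the ACCOUNTADMIN role).
cur.execute(
    """
    CREATE OR REPLACE RESOURCE MONITOR etl_monitor
      WITH CREDIT_QUOTA = 100
      TRIGGERS ON 100 PERCENT DO SUSPEND
    """
)
cur.execute("ALTER WAREHOUSE ETL_WH SET RESOURCE_MONITOR = etl_monitor")

cur.close()
conn.close()
```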

Posted 1 week ago

Apply

0.0 years

0 Lacs

Kolkata, West Bengal, India

On-site


Ready to shape the future of work? At Genpact, we don't just adapt to change; we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Lead Consultant - Snowflake Data Engineer (Snowflake + Python + Cloud)!

In this role, the Snowflake Data Engineer is responsible for providing technical direction and leading a group of one or more developers to address a goal.

Job Description:
Experience in the IT industry.
Working experience building productionized data ingestion and processing pipelines in Snowflake.
Strong understanding of Snowflake architecture.
Well-versed in data warehousing concepts.
Expertise and excellent understanding of Snowflake features and of integrating Snowflake with other data processing systems.
Able to create data pipelines for ETL/ELT.
Excellent presentation and communication skills, both written and verbal.
Ability to problem-solve and architect in an environment with unclear requirements.
Able to create high-level and low-level design documents based on requirements.
Hands-on experience in configuring, troubleshooting, testing, and managing data platforms, on premises or in the cloud.
Awareness of data visualisation tools and methodologies.
Works independently on business problems and generates meaningful insights.
Good to have some experience/knowledge of Snowpark, Streamlit, or GenAI, but not mandatory.
Should have experience implementing Snowflake best practices.
Snowflake SnowPro Core Certification will be an added advantage.

Roles and Responsibilities:
Requirement gathering, creating design documents, providing solutions to customers, working with offshore teams, etc.
Writing SQL queries against Snowflake and developing scripts to extract, load, and transform data.
Hands-on experience with Snowflake utilities such as SnowSQL, bulk copy, Snowpipe, Tasks, Streams, Time Travel, cloning, the optimizer, Metadata Manager, data sharing, stored procedures and UDFs, Snowsight, and Streamlit.
Experience with the Snowflake cloud data warehouse and AWS S3 buckets or Azure Blob Storage containers for integrating data from multiple source systems.
Some experience with AWS services (S3, Glue, Lambda) or Azure services (Blob Storage, ADLS Gen2, ADF).
Good experience in Python/PySpark integration with Snowflake and cloud (AWS/Azure), with the ability to leverage cloud services for data processing and storage.
Proficiency in the Python programming language, including knowledge of data types, variables, functions, loops, conditionals, and other Python-specific concepts.
Knowledge of ETL (Extract, Transform, Load) processes and tools, and the ability to design and develop efficient ETL jobs using Python or PySpark.
Some experience with Snowflake RBAC and data security.
Good experience implementing CDC or SCD type-2 (a minimal SCD type-2 sketch follows below).
Good experience implementing Snowflake best practices.
In-depth understanding of data warehouse and ETL concepts and data modelling.
Experience in requirement gathering, analysis, design, development, and deployment.
Experience building data ingestion pipelines.
Optimize and tune data pipelines for performance and scalability.
Able to communicate with clients and lead a team.
Proficiency in working with Airflow or other workflow management tools for scheduling and managing ETL jobs.
Good to have experience in deployment using CI/CD tools and experience with repositories such as Azure Repos, GitHub, etc.

Qualifications we seek in you!
Minimum qualifications
B.E./Master's in Computer Science, Information Technology, or Computer Engineering, or an equivalent degree with good IT experience and relevant experience as a Snowflake Data Engineer.
Skill Matrix: Snowflake, Python/PySpark, AWS/Azure, ETL concepts, and data warehousing concepts

Why join Genpact?
Be a transformation leader - Work at the cutting edge of AI, automation, and digital innovation
Make an impact - Drive change for global enterprises and solve business challenges that matter
Accelerate your career - Get hands-on experience, mentorship, and continuous learning opportunities
Work with the best - Join 140,000+ bold thinkers and problem-solvers who push boundaries every day
Thrive in a values-driven culture - Our courage, curiosity, and incisiveness - built on a foundation of integrity and inclusion - allow your ideas to fuel progress

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
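As a concrete illustration of the SCD type-2 requirement above, here is a minimal, hedged sketch in Python with embedded Snowflake SQL; the dimension and staging table names (customers_dim, customers_stg) and the tracked columns are hypothetical, not from the posting.

```python
# Minimal SCD type-2 sketch: close off changed rows, then insert the
# new current versions. Assumes snowflake-connector-python; all table
# and column names are illustrative placeholders.
import snowflake.connector

SQL_CLOSE_OFF = """
UPDATE customers_dim d
   SET is_current = FALSE,
       valid_to   = CURRENT_TIMESTAMP()
  FROM customers_stg s
 WHERE d.customer_id = s.customer_id
   AND d.is_current
   AND (d.email <> s.email OR d.city <> s.city)
"""

SQL_INSERT_NEW = """
INSERT INTO customers_dim
       (customer_id, email, city, valid_from, valid_to, is_current)
SELECT s.customer_id, s.email, s.city, CURRENT_TIMESTAMP(), NULL, TRUE
  FROM customers_stg s
  LEFT JOIN customers_dim d
    ON d.customer_id = s.customer_id AND d.is_current
 WHERE d.customer_id IS NULL  -- new key, or key whose current row was just closed off
"""

conn = snowflake.connector.connect(account="my_account", user="etl_user",
                                   password="***", database="ANALYTICS",
                                   schema="PUBLIC", warehouse="ETL_WH")
try:
    cur = conn.cursor()
    cur.execute("BEGIN")
    cur.execute(SQL_CLOSE_OFF)   # step 1: expire changed rows
    cur.execute(SQL_INSERT_NEW)  # step 2: add the new current rows
    cur.execute("COMMIT")
finally:
    conn.close()
```

The same pattern can be expressed as a single MERGE with a multi-row source, but splitting it into two statements inside one transaction keeps the change-detection logic easy to audit.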

Posted 1 week ago

Apply

0.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Ready to shape the future of work? At Genpact, we don't just adapt to change; we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Principal Consultant - Sr. Snowflake Data Engineer (Snowflake + Python + Cloud)!

In this role, the Sr. Snowflake Data Engineer is responsible for providing technical direction and leading a group of one or more developers to address a goal.

Job Description:
Experience in the IT industry.
Working experience building productionized data ingestion and processing pipelines in Snowflake.
Strong understanding of Snowflake architecture.
Well-versed in data warehousing concepts.
Expertise and excellent understanding of Snowflake features and of integrating Snowflake with other data processing systems.
Able to create data pipelines for ETL/ELT.
Good to have DBT experience.
Excellent presentation and communication skills, both written and verbal.
Ability to problem-solve and architect in an environment with unclear requirements.
Able to create high-level and low-level design documents based on requirements.
Hands-on experience in configuring, troubleshooting, testing, and managing data platforms, on premises or in the cloud.
Awareness of data visualisation tools and methodologies.
Works independently on business problems and generates meaningful insights.
Good to have some experience/knowledge of Snowpark, Streamlit, or GenAI, but not mandatory.
Should have experience implementing Snowflake best practices.
Snowflake SnowPro Core Certification will be an added advantage.

Roles and Responsibilities:
Requirement gathering, creating design documents, providing solutions to customers, working with offshore teams, etc.
Writing SQL queries against Snowflake and developing scripts to extract, load, and transform data.
Hands-on experience with Snowflake utilities such as SnowSQL, bulk copy, Snowpipe, Tasks, Streams, Time Travel, cloning, the optimizer, Metadata Manager, data sharing, stored procedures and UDFs, and Snowsight (a Streams-and-Tasks sketch follows below).
Experience with the Snowflake cloud data warehouse and AWS S3 buckets or Azure Blob Storage containers for integrating data from multiple source systems.
Some experience with AWS services (S3, Glue, Lambda) or Azure services (Blob Storage, ADLS Gen2, ADF).
Good experience in Python/PySpark integration with Snowflake and cloud (AWS/Azure), with the ability to leverage cloud services for data processing and storage.
Proficiency in the Python programming language, including knowledge of data types, variables, functions, loops, conditionals, and other Python-specific concepts.
Knowledge of ETL (Extract, Transform, Load) processes and tools, and the ability to design and develop efficient ETL jobs using Python and PySpark.
Some experience with Snowflake RBAC and data security.
Good experience implementing CDC or SCD type-2.
Good experience implementing Snowflake best practices.
In-depth understanding of data warehouse and ETL concepts and data modelling.
Experience in requirement gathering, analysis, design, development, and deployment.
Experience building data ingestion pipelines.
Optimize and tune data pipelines for performance and scalability.
Able to communicate with clients and lead a team.
Proficiency in working with Airflow or other workflow management tools for scheduling and managing ETL jobs.
Good to have experience in deployment using CI/CD tools and experience with repositories such as Azure Repos, GitHub, etc.

Qualifications we seek in you!
Minimum qualifications
B.E./Master's in Computer Science, Information Technology, or Computer Engineering, or an equivalent degree with good IT experience and relevant experience as a Senior Snowflake Data Engineer.
Skill Matrix: Snowflake, Python/PySpark, AWS/Azure, ETL concepts, data modeling, and data warehousing concepts

Why join Genpact?
Be a transformation leader - Work at the cutting edge of AI, automation, and digital innovation
Make an impact - Drive change for global enterprises and solve business challenges that matter
Accelerate your career - Get hands-on experience, mentorship, and continuous learning opportunities
Work with the best - Join 140,000+ bold thinkers and problem-solvers who push boundaries every day
Thrive in a values-driven culture - Our courage, curiosity, and incisiveness - built on a foundation of integrity and inclusion - allow your ideas to fuel progress

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
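The sketch below illustrates the Streams-and-Tasks pattern referenced above for incremental loading; it assumes the snowflake-connector-python package, and the warehouse, table names, and five-minute schedule are hypothetical.

```python
# Incremental-load sketch with a Snowflake Stream and Task. An
# append-only stream records new rows in the raw table; a scheduled
# task drains it only when data is waiting. All names are placeholders.
import snowflake.connector

STATEMENTS = [
    # Append-only stream: captures inserts on the raw landing table.
    "CREATE OR REPLACE STREAM orders_stream ON TABLE raw_orders APPEND_ONLY = TRUE",
    # Scheduled task: fires every 5 minutes, but only when the stream has data.
    """
    CREATE OR REPLACE TASK load_orders
      WAREHOUSE = ETL_WH
      SCHEDULE  = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
    AS
      INSERT INTO orders (order_id, status, loaded_at)
      SELECT order_id, status, CURRENT_TIMESTAMP()
        FROM orders_stream
    """,
    # Tasks are created suspended; RESUME starts the schedule.
    "ALTER TASK load_orders RESUME",
]

conn = snowflake.connector.connect(account="my_account", user="etl_user",
                                   password="***", database="ANALYTICS",
                                   schema="PUBLIC", warehouse="ETL_WH")
try:
    cur = conn.cursor()
    for stmt in STATEMENTS:
        cur.execute(stmt)
finally:
    conn.close()
```

Selecting from the stream inside the task consumes the captured offsets, so each batch of new rows is processed once; handling updates and deletes as well would use a standard stream plus the METADATA$ACTION column.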

Posted 1 week ago

Apply

0.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Ready to shape the future of work? At Genpact, we don't just adapt to change; we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Senior Principal Consultant - Senior Data Engineer - Snowflake, AWS, Cortex AI & Horizon Catalog

Role Summary:
We are seeking an experienced Senior Data Engineer with deep expertise in modernizing Data & Analytics platforms on Snowflake, leveraging AWS services, Cortex AI, and Horizon Catalog for high-performance, AI-driven data management. The role involves designing scalable data architectures, integrating AI-powered automation, and optimizing data governance, lineage, and analytics frameworks.

Key Responsibilities:
Architect and modernize enterprise Data & Analytics platforms on Snowflake, utilizing AWS, Cortex AI, and Horizon Catalog.
Design and optimize Snowflake-based Lakehouse architectures, integrating AWS services (S3, Redshift, Glue, Lambda, EMR, etc.).
Leverage Cortex AI for AI-driven data automation, predictive analytics, and workflow orchestration.
Implement Horizon Catalog for enhanced data lineage, governance, metadata management, and security.
Develop high-performance ETL/ELT pipelines, integrating Snowflake with AWS and AI-powered automation frameworks.
Utilize Snowflake's native capabilities such as Snowpark, Streams, Tasks, and Dynamic Tables for real-time data processing (a short Snowpark sketch follows below).
Establish data quality automation, lineage tracking, and AI-enhanced data governance strategies.
Collaborate with data scientists, ML engineers, and business stakeholders to drive AI-led data initiatives.
Continuously evaluate emerging AI and cloud-based data engineering technologies to improve efficiency and innovation.

Qualifications we seek in you!
Minimum Qualifications
Experience in Data Engineering, AI-powered automation, and cloud-based analytics.
Expertise in Snowflake (warehousing, Snowpark, Streams, Tasks, Dynamic Tables).
Strong experience with AWS services (S3, Redshift, Glue, Lambda, EMR).
Deep understanding of Cortex AI for AI-driven data engineering automation.
Proficiency in Horizon Catalog for metadata management, lineage tracking, and data governance.
Advanced knowledge of SQL, Python, and Scala for large-scale data processing.
Experience in modernizing Data & Analytics platforms and migrating on-premises solutions to Snowflake.
Strong expertise in data quality, AI-driven observability, and ModelOps for data workflows.
Familiarity with vector databases and Retrieval-Augmented Generation (RAG) architectures for AI-powered analytics.
Excellent leadership, problem-solving, and stakeholder collaboration skills.

Preferred Skills:
Experience with knowledge graphs (Neo4j, TigerGraph) for structured enterprise data systems.
Exposure to Kubernetes, Terraform, and CI/CD pipelines for scalable cloud deployments.
Background in streaming technologies (Kafka, Kinesis, AWS MSK, Snowflake Snowpipe).

Why Join Us?
Lead Data & AI platform modernization initiatives using Snowflake, AWS, Cortex AI, and Horizon Catalog.
Work on cutting-edge AI-driven automation for cloud-native data architectures.
Competitive salary, career progression, and an opportunity to shape next-gen AI-powered data solutions.

Why join Genpact?
Be a transformation leader - Work at the cutting edge of AI, automation, and digital innovation
Make an impact - Drive change for global enterprises and solve business challenges that matter
Accelerate your career - Get hands-on experience, mentorship, and continuous learning opportunities
Work with the best - Join 140,000+ bold thinkers and problem-solvers who push boundaries every day
Thrive in a values-driven culture - Our courage, curiosity, and incisiveness - built on a foundation of integrity and inclusion - allow your ideas to fuel progress

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
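As a small illustration of the Snowpark capability listed above, here is a hedged sketch assuming the snowflake-snowpark-python package; the connection parameters and table names are placeholders, not details from the role.

```python
# Minimal Snowpark sketch: the aggregation is pushed down to Snowflake
# and nothing is pulled locally until an action runs. All connection
# parameters and table names are illustrative placeholders.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_

session = Session.builder.configs({
    "account": "my_account",    # placeholder
    "user": "data_engineer",    # placeholder
    "password": "***",
    "warehouse": "ANALYTICS_WH",
    "database": "SALES",
    "schema": "PUBLIC",
}).create()

# Lazily build a transformation over the orders table.
revenue = (
    session.table("orders")
    .filter(col("status") == "SHIPPED")
    .group_by("region")
    .agg(sum_(col("amount")).alias("revenue"))
)

# Materialize the result as a table inside Snowflake.
revenue.write.save_as_table("region_revenue", mode="overwrite")
session.close()
```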

Posted 1 week ago

Apply

3.0 - 5.0 years

0 Lacs

Kolkata, West Bengal, India

On-site


Ready to shape the future of work? At Genpact, we don't just adapt to change; we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Lead Consultant - Sr. Data Engineer (DBT + Snowflake)!

In this role, the Sr. Data Engineer is responsible for providing technical direction and leading a group of one or more developers to address a goal.

Job Description:
Develop, implement, and optimize data pipelines using Snowflake, with a focus on Cortex AI capabilities.
Extract, transform, and load (ETL) data from various sources into Snowflake, ensuring data integrity and accuracy.
Implement Conversational AI solutions using Snowflake Cortex AI to facilitate data interaction through chatbot agents.
Collaborate with data scientists and AI developers to integrate predictive analytics and AI models into data workflows.
Monitor and troubleshoot data pipelines to resolve data discrepancies and optimize performance.
Utilize Snowflake's advanced features, including Snowpark, Streams, and Tasks, to enable data processing and analysis.
Develop and maintain data documentation, best practices, and data governance protocols.
Ensure data security, privacy, and compliance with organizational and regulatory guidelines.

Roles and Responsibilities:
Bachelor's degree in Computer Science, Data Engineering, or a related field.
Experience in data engineering, with at least 3 years of experience working with Snowflake.
Proven experience in Snowflake Cortex AI, focusing on data extraction, chatbot development, and Conversational AI.
Strong proficiency in SQL, Python, and data modeling.
Experience with data integration tools (e.g., Matillion, Talend, Informatica).
Knowledge of cloud platforms such as AWS, Azure, or GCP.
Excellent problem-solving skills, with a focus on data quality and performance optimization.
Strong communication skills and the ability to work effectively in a cross-functional team.
Proficiency in using DBT's testing and documentation features to ensure the accuracy and reliability of data transformations.
Understanding of data lineage and metadata management concepts, and the ability to track and document data transformations using DBT's lineage capabilities.
Understanding of software engineering best practices and the ability to apply these principles to DBT development, including version control, code reviews, and automated testing.
Experience building data ingestion pipelines.
Experience with Snowflake utilities such as SnowSQL, Snowpipe, bulk copy, Snowpark, tables, Tasks, Streams, Time Travel, cloning, the optimizer, Metadata Manager, data sharing, stored procedures and UDFs, and Snowsight.
Good experience implementing CDC or SCD type-2.
Proficiency in working with Airflow or other workflow management tools for scheduling and managing ETL jobs (an orchestration sketch follows below).
Good to have experience with repository tools such as GitHub/GitLab and Azure Repos.

Skill Matrix: DBT (Core or Cloud), Snowflake, AWS/Azure, SQL, ETL concepts, Airflow or any orchestration tool, and data warehousing concepts

Qualifications/Minimum qualifications
B.E./Master's in Computer Science, Information Technology, or Computer Engineering, or an equivalent degree with good IT experience and relevant working experience as a Sr. Data Engineer with DBT + Snowflake skills.

Why join Genpact?
Be a transformation leader - Work at the cutting edge of AI, automation, and digital innovation
Make an impact - Drive change for global enterprises and solve business challenges that matter
Accelerate your career - Get hands-on experience, mentorship, and continuous learning opportunities
Work with the best - Join 140,000+ bold thinkers and problem-solvers who push boundaries every day
Thrive in a values-driven culture - Our courage, curiosity, and incisiveness - built on a foundation of integrity and inclusion - allow your ideas to fuel progress

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
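To make the Airflow orchestration item above concrete, here is a hedged sketch of a daily DAG that runs dbt models and then dbt's built-in tests; the project path, target name, and schedule are hypothetical, and it assumes Airflow 2.x with the dbt CLI available on the worker.

```python
# Hypothetical daily orchestration of a dbt + Snowflake project with
# Airflow: run the models, then run dbt's tests. The project path,
# target, and schedule are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_snowflake_daily",
    start_date=datetime(2024, 1, 1),
    schedule="0 6 * * *",   # 06:00 daily (Airflow 2.4+ `schedule` argument)
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/analytics && dbt run --target prod",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt/analytics && dbt test --target prod",
    )
    # Tests only run if the models built successfully.
    dbt_run >> dbt_test
```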

Posted 1 week ago

Apply

6.0 - 10.0 years

5 - 7 Lacs

Jaipur, Rajasthan, India

On-site


Key Responsibilities:
Design and implement data ingestion workflows into Salesforce Data Cloud.
Unify data from multiple sources to create a 360° customer view.
Develop integrations using APIs, ETL tools, and middleware (e.g., MuleSoft); an illustrative ingestion sketch follows below.
Collaborate with cross-functional teams to gather and fulfill data integration requirements.
Monitor integration performance and ensure real-time data availability.
Ensure compliance with data privacy and governance standards.
Enable data activation across Salesforce Marketing, Sales, and Service Clouds.

Must-Have Skills:
Experience with cloud data platforms (e.g., Snowflake, Redshift, BigQuery).
Salesforce certifications (e.g., Data Cloud Consultant, Integration Architect).
Hands-on experience with Salesforce Data Cloud (CDP).
Proficiency in ETL, data transformation, and data mapping.
Strong knowledge of REST/SOAP APIs and integration tools.
Solid understanding of data modeling and customer data platforms.
Familiarity with data privacy regulations (e.g., GDPR, CCPA).
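The sketch below is illustrative only: it shows the general shape of pushing records to a REST ingestion endpoint, as the integration responsibilities above describe. The URL, payload shape, and token handling are hypothetical placeholders, not the actual Salesforce Data Cloud Ingestion API contract.

```python
# Illustrative-only REST ingestion sketch using only the standard
# library. The endpoint, payload shape, and auth handling are
# hypothetical, not the real Salesforce Data Cloud API.
import json
import urllib.request

TOKEN = "***"  # assume an OAuth access token was obtained separately
ENDPOINT = "https://example.my.salesforce.com/ingest/customer_events"  # hypothetical

records = [
    {"customer_id": "C-1001", "event": "page_view", "ts": "2024-01-01T00:00:00Z"},
]

req = urllib.request.Request(
    ENDPOINT,
    data=json.dumps({"data": records}).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Content-Type": "application/json",
    },
    method="POST",
)
with urllib.request.urlopen(req) as resp:
    print("ingestion status:", resp.status)
```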

Posted 1 week ago

Apply

8.0 - 12.0 years

5 - 8 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site


Design, develop, and execute comprehensive ETL testing strategies and test plans specific to Snowflake data pipelines.
Create and maintain test cases to validate data extraction, transformation, and loading processes within Snowflake.
Conduct end-to-end testing of ETL workflows to ensure data integrity and accuracy throughout the entire pipeline (a minimal reconciliation-test sketch follows below).
Collaborate closely with data engineers, developers, and stakeholders to understand data requirements and business logic.
Identify, document, and report defects or discrepancies in data transformations or loading processes.
Develop and maintain automated test scripts and frameworks for ETL testing in Snowflake.
Perform performance testing to evaluate the efficiency and scalability of Snowflake data pipelines.
Troubleshoot issues and work with cross-functional teams to resolve data-related problems.
Ensure compliance with data governance, security, and regulatory standards during testing processes.
Document test results, findings, and recommendations for improvements in testing procedures.
Strong hands-on experience in SQL.
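As one hedged example of the automated testing described above, the pytest-style sketch below reconciles a staging table against its transformed target in Snowflake; the table names, connection details, and checked column are placeholders.

```python
# Minimal pytest-style reconciliation checks between a staging table
# and its Snowflake target: row counts and a column total should
# survive the transformation. All names are placeholders.
import pytest
import snowflake.connector


@pytest.fixture(scope="module")
def cur():
    conn = snowflake.connector.connect(account="my_account", user="qa_user",
                                       password="***", database="ANALYTICS",
                                       schema="PUBLIC", warehouse="QA_WH")
    yield conn.cursor()
    conn.close()


def scalar(cur, sql):
    """Run a query and return its single scalar result."""
    cur.execute(sql)
    return cur.fetchone()[0]


def test_row_counts_match(cur):
    assert scalar(cur, "SELECT COUNT(*) FROM stg_orders") == \
           scalar(cur, "SELECT COUNT(*) FROM orders")


def test_amount_total_matches(cur):
    # A cheap content check: the amount total should be unchanged.
    assert scalar(cur, "SELECT SUM(amount) FROM stg_orders") == \
           scalar(cur, "SELECT SUM(amount) FROM orders")
```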

Posted 1 week ago

Apply