0.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
________________________________________
Ready to shape the future of work?
At Genpact, we don't just adapt to change - we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.
Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.
Inviting applications for the role of Senior Principal Consultant - Senior Data Engineer - Snowflake, AWS, Cortex AI & Horizon Catalog
Role Summary: We are seeking an experienced Senior Data Engineer with deep expertise in modernizing Data & Analytics platforms on Snowflake, leveraging AWS services, Cortex AI, and Horizon Catalog for high-performance, AI-driven data management. The role involves designing scalable data architectures, integrating AI-powered automation, and optimizing data governance, lineage, and analytics frameworks.
Key Responsibilities:
* Architect and modernize enterprise Data & Analytics platforms on Snowflake, utilizing AWS, Cortex AI, and Horizon Catalog.
* Design and optimize Snowflake-based Lakehouse architectures, integrating AWS services (S3, Redshift, Glue, Lambda, EMR, etc.).
* Leverage Cortex AI for AI-driven data automation, predictive analytics, and workflow orchestration.
* Implement Horizon Catalog for enhanced data lineage, governance, metadata management, and security.
* Develop high-performance ETL/ELT pipelines, integrating Snowflake with AWS and AI-powered automation frameworks.
* Utilize Snowflake's native capabilities such as Snowpark, Streams, Tasks, and Dynamic Tables for real-time data processing (see the sketch after this posting).
* Establish data quality automation, lineage tracking, and AI-enhanced data governance strategies.
* Collaborate with data scientists, ML engineers, and business stakeholders to drive AI-led data initiatives.
* Continuously evaluate emerging AI and cloud-based data engineering technologies to improve efficiency and innovation.
Qualifications we seek in you!
Minimum Qualifications:
* Experience in Data Engineering, AI-powered automation, and cloud-based analytics.
* Expertise in Snowflake (Warehousing, Snowpark, Streams, Tasks, Dynamic Tables).
* Strong experience with AWS services (S3, Redshift, Glue, Lambda, EMR).
* Deep understanding of Cortex AI for AI-driven data engineering automation.
* Proficiency in Horizon Catalog for metadata management, lineage tracking, and data governance.
* Advanced knowledge of SQL, Python, and Scala for large-scale data processing.
* Experience in modernizing Data & Analytics platforms and migrating on-premises solutions to Snowflake.
* Strong expertise in Data Quality, AI-driven Observability, and ModelOps for data workflows.
* Familiarity with Vector Databases and Retrieval-Augmented Generation (RAG) architectures for AI-powered analytics.
* Excellent leadership, problem-solving, and stakeholder collaboration skills.
Preferred Skills:
* Experience with Knowledge Graphs (Neo4j, TigerGraph) for structured enterprise data systems.
* Exposure to Kubernetes, Terraform, and CI/CD pipelines for scalable cloud deployments.
* Background in streaming technologies (Kafka, Kinesis, AWS MSK, Snowflake Snowpipe).
Why Join Us:
* Lead Data & AI platform modernization initiatives using Snowflake, AWS, Cortex AI, and Horizon Catalog.
* Work on cutting-edge AI-driven automation for cloud-native data architectures.
* Competitive salary, career progression, and an opportunity to shape next-gen AI-powered data solutions.
________________________________________
Why join Genpact?
* Be a transformation leader - Work at the cutting edge of AI, automation, and digital innovation.
* Make an impact - Drive change for global enterprises and solve business challenges that matter.
* Accelerate your career - Get hands-on experience, mentorship, and continuous learning opportunities.
* Work with the best - Join 140,000+ bold thinkers and problem-solvers who push boundaries every day.
* Thrive in a values-driven culture - Our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress.
Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.
Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation.
Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
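As a rough illustration of the Streams, Tasks, and Dynamic Tables capabilities this role calls for, here is a minimal sketch driven from Snowpark for Python. All object names, columns, and connection parameters are hypothetical placeholders, not taken from the posting.
```python
# Sketch: change capture (Stream), scheduled merge (Task), and a declaratively
# maintained aggregate (Dynamic Table) in Snowflake. Names are hypothetical.
from snowflake.snowpark import Session

# Assumed connection details -- replace with real account credentials.
session = Session.builder.configs({
    "account": "my_account",
    "user": "my_user",
    "password": "my_password",
    "warehouse": "ENG_WH",
    "database": "ANALYTICS",
    "schema": "RAW",
}).create()

# A stream captures row-level changes (CDC) on a source table.
session.sql("""
    CREATE OR REPLACE STREAM raw_orders_stream ON TABLE raw_orders
""").collect()

# A task periodically merges the captured changes into a curated table
# (curated_orders is assumed to exist).
session.sql("""
    CREATE OR REPLACE TASK merge_orders_task
      WAREHOUSE = ENG_WH
      SCHEDULE = '5 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('RAW_ORDERS_STREAM')
    AS
      INSERT INTO curated_orders
      SELECT order_id, customer_id, amount, order_ts
      FROM raw_orders_stream
      WHERE METADATA$ACTION = 'INSERT'
""").collect()
session.sql("ALTER TASK merge_orders_task RESUME").collect()

# A dynamic table keeps an aggregate fresh within a declared lag target.
session.sql("""
    CREATE OR REPLACE DYNAMIC TABLE daily_order_totals
      TARGET_LAG = '10 minutes'
      WAREHOUSE = ENG_WH
    AS
      SELECT customer_id, DATE_TRUNC('day', order_ts) AS order_day,
             SUM(amount) AS total_amount
      FROM curated_orders
      GROUP BY customer_id, DATE_TRUNC('day', order_ts)
""").collect()
```
The design point: Streams plus Tasks give imperative, incremental pipelines, while Dynamic Tables express the same freshness goal declaratively; near-real-time workloads often combine both.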
Posted 5 days ago
2.0 - 7.0 years
3 - 6 Lacs
Hospet
Work from Office
You will be responsible for providing an excellent and consistent level of administrative support to your customers. The Learning Officer assists the Human Resources Manager in the efficient running of the Training Department, in line with Hyatt International's corporate strategies and brand standards, while meeting employee, guest, and owner expectations, and helps ensure the smooth and efficient running of the Personnel Department within the Human Resources Division. Qualifications: University degree/diploma, with preference given to Human Resources or business-related degrees. Experience in hotel-related operational positions would be beneficial.
Posted 5 days ago
2.0 - 7.0 years
3 - 6 Lacs
Raipur
Work from Office
You will be responsible for providing an excellent and consistent level of administrative support to your customers. The Learning Officer assists the Human Resources Manager in the efficient running of the Human Resources Department. Qualifications: University degree/diploma, with preference given to Human Resources or business-related degrees. Experience in hotel-related operational positions would be beneficial.
Posted 5 days ago
2.0 - 7.0 years
3 - 6 Lacs
Kochi
Work from Office
You will be responsible for providing an excellent and consistent level of administrative support to your customers. The Training Officer assists the Training Manager in the efficient running of the Training Department. Qualifications: University degree/diploma, with preference given to Human Resources or business-related degrees. Experience in hotel-related operational positions would be beneficial.
Posted 5 days ago
2.0 - 6.0 years
2 - 6 Lacs
Chennai, Tamil Nadu, India
On-site
DBT - Designing and developing technical architecture, data pipelines, and performance scaling, using tools such as Talend to integrate data and ensure data quality in a big data environment.
* Very strong PL/SQL skills: queries, procedures, and JOINs.
* Snowflake SQL: writing SQL queries against Snowflake and developing scripts in Unix, Python, etc., to perform Extract, Load, and Transform (ELT) operations (see the sketch after this posting).
* Talend knowledge and hands-on experience is good to have; candidates who have worked in production (PROD) support are preferred.
* Hands-on experience with Snowflake utilities such as SnowSQL, Snowpipe, Python, Tasks, Streams, Time Travel, Optimizer, Metadata Manager, data sharing, and stored procedures.
* Perform data analysis, troubleshoot data issues, and provide technical support to end users.
* Develop and maintain data warehouse and ETL processes, ensuring data quality and integrity.
* Complex problem-solving capability and a continuous-improvement approach.
* Talend or Snowflake certification is desirable.
* Excellent SQL coding, communication, and documentation skills.
* Familiarity with the Agile delivery process.
* Analytical, creative, and self-motivated; able to work effectively within a global team environment.
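To make the ELT-script responsibility concrete, here is a minimal sketch using the snowflake-connector-python package; the stage, tables, columns, and credentials are hypothetical placeholders, and the connector choice is an assumption rather than something the posting specifies.
```python
# Sketch: load staged files, transform in-warehouse, and use Time Travel to
# troubleshoot a load. All names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # assumed account identifier
    user="my_user",
    password="my_password",
    warehouse="ETL_WH",
    database="SALES",
    schema="STAGING",
)
cur = conn.cursor()

# Load: bulk-copy staged files into a raw table (Snowpipe automates this same
# COPY INTO statement for continuous loading).
cur.execute("""
    COPY INTO raw_transactions
    FROM @landing_stage/transactions/
    FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
""")

# Transform: derive a cleaned table entirely inside Snowflake.
cur.execute("""
    CREATE OR REPLACE TABLE clean_transactions AS
    SELECT txn_id, customer_id, TRY_TO_NUMBER(amount) AS amount
    FROM raw_transactions
    WHERE txn_id IS NOT NULL
""")

# Time Travel: inspect the table as it looked an hour ago, e.g. to compare
# row counts before and after a suspect load.
cur.execute("SELECT COUNT(*) FROM clean_transactions AT(OFFSET => -3600)")
print(cur.fetchone())

cur.close()
conn.close()
```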
Posted 1 week ago
5.0 - 10.0 years
0 - 1 Lacs
Ahmedabad, Chennai, Bengaluru
Hybrid
Job Summary: We are seeking an experienced Snowflake Data Engineer to design, develop, and optimize data pipelines and data architecture using the Snowflake cloud data platform. The ideal candidate will have a strong background in data warehousing, ETL/ELT processes, and cloud platforms, with a focus on creating scalable and high-performance solutions for data integration and analytics.
---
Key Responsibilities:
* Design and implement data ingestion, transformation, and loading processes (ETL/ELT) using Snowflake.
* Build and maintain scalable data pipelines using tools such as dbt, Apache Airflow, or similar orchestration tools (see the sketch after this posting).
* Optimize data storage and query performance in Snowflake using best practices in clustering, partitioning, and caching.
* Develop and maintain data models (dimensional/star schema) to support business intelligence and analytics initiatives.
* Collaborate with data analysts, scientists, and business stakeholders to gather data requirements and translate them into technical solutions.
* Manage Snowflake environments, including security (roles, users, privileges), performance tuning, and resource monitoring.
* Integrate data from multiple sources, including cloud storage (AWS S3, Azure Blob), APIs, third-party platforms, and streaming data.
* Ensure data quality, reliability, and governance through testing and validation strategies.
* Document data flows, definitions, processes, and architecture.
---
Required Skills and Qualifications:
* 3+ years of experience as a Data Engineer or in a similar role working with large-scale data systems.
* 2+ years of hands-on experience with Snowflake, including SnowSQL, Snowpipe, Streams, Tasks, and Time Travel.
* Strong experience in SQL and performance tuning for complex queries and large datasets.
* Proficiency with ETL/ELT tools such as dbt, Apache NiFi, Talend, Informatica, or custom scripts.
* Solid understanding of data modeling concepts (star schema, snowflake schema, normalization, etc.).
* Experience with cloud platforms (AWS, Azure, or GCP), particularly services like S3, Redshift, Lambda, Azure Data Factory, etc.
* Familiarity with Python, Java, or Scala for data manipulation and pipeline development.
* Experience with CI/CD processes and tools like Git, Jenkins, or Azure DevOps.
* Knowledge of data governance, data quality, and data security best practices.
* Bachelor's degree in Computer Science, Information Systems, or a related field.
---
Preferred Qualifications:
* Snowflake SnowPro Core Certification or Advanced Architect Certification.
* Experience integrating BI tools like Tableau, Power BI, or Looker with Snowflake.
* Familiarity with real-time streaming technologies (Kafka, Kinesis, etc.).
* Knowledge of Data Vault 2.0 or other advanced data modeling methodologies.
* Experience with data cataloging and metadata management tools (e.g., Alation, Collibra).
* Exposure to machine learning pipelines and data science workflows is a plus.
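For the pipeline-orchestration responsibility, here is a minimal sketch of an Apache Airflow DAG that drives dbt transformations on Snowflake. The DAG id, schedule, and dbt project path are hypothetical, and invoking dbt through BashOperator is one common pattern among several, not something the posting prescribes.
```python
# Sketch: a daily Airflow DAG that runs dbt models and then dbt tests.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="snowflake_dbt_pipeline",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # run once per day
    catchup=False,
) as dag:
    # Run all dbt models; the transformations execute inside Snowflake.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",
    )

    # Validate results with dbt tests before downstream consumers see them.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics",
    )

    dbt_run >> dbt_test
```
Sequencing tests after the run means a failed data-quality check halts the pipeline rather than silently publishing bad data.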
Posted 3 weeks ago
5.0 - 10.0 years
14 - 19 Lacs
Bengaluru, Delhi / NCR, Mumbai (All Areas)
Work from Office
Role & responsibilities
Urgent hiring for one of the reputed MNCs.
Experience: 5+ years
Location: Pan India
Immediate joiners only.
Skills: Snowflake developer, PySpark, Python, API, CI/CD, cloud services (Azure), Azure DevOps. Profiles should have strong PySpark experience.
Job Description:
* Strong hands-on experience in Snowflake development, including Streams, Tasks, and Time Travel.
* Deep understanding of Snowpark for Python and its application to data engineering workflows (see the sketch after this posting).
* Proficient in PySpark, Spark SQL, and distributed data processing.
* Experience with API development.
* Proficiency in cloud services (preferably Azure, but AWS/GCP also acceptable).
* Solid understanding of CI/CD practices and tools like Azure DevOps, GitHub Actions, GitLab, or Jenkins for Snowflake.
* Knowledge of Delta Lake, Data Lakehouse principles, and schema evolution is a plus.
Preferred candidate profile
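As a rough illustration of a Snowpark for Python data engineering step, here is a minimal sketch that reads a table, aggregates it, and persists the result inside Snowflake; the table names, columns, and connection parameters are hypothetical placeholders.
```python
# Sketch: push-down dataframe transformation with Snowpark for Python.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_

# Assumed connection details -- replace with real account credentials.
session = Session.builder.configs({
    "account": "my_account",
    "user": "my_user",
    "password": "my_password",
    "warehouse": "DEV_WH",
    "database": "SALES",
    "schema": "PUBLIC",
}).create()

# Lazily reference a source table; computation runs inside Snowflake.
orders = session.table("orders")

# Aggregate revenue per customer, keeping only completed orders.
revenue = (
    orders
    .filter(col("status") == "COMPLETED")
    .group_by("customer_id")
    .agg(sum_(col("amount")).alias("total_revenue"))
)

# Materialize the result as a table for downstream consumers.
revenue.write.mode("overwrite").save_as_table("customer_revenue")
```
Unlike a PySpark job, no data leaves the warehouse: Snowpark compiles the dataframe operations to SQL that executes on the Snowflake virtual warehouse.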
Posted 1 month ago