Home
Jobs

235 Snowflake Jobs - Page 7

Filter Interviews
Min: 0 years
Max: 25 years
Min: ₹0
Max: ₹10000000
Set up a Job Alert
Filter
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

8.0 - 13.0 years

3 - 11 Lacs

Hyderabad / Secunderabad, Telangana, India

Remote

Foundit logo

You will provide 24/7 administrative support (on-premises and Atlas Cloud) for MongoDB clusters, Postgres, and Snowflake, and support on-premises and Confluent Cloud Kafka clusters. You will review database designs to ensure all technical and business requirements are met, perform database optimization and testing to ensure service level agreements are met, and provide support during system implementation and in production. You will handle Snowflake administrative tasks (data pipelines, object creation, access) and participate in a weekday and weekend on-call rotation supporting products running on MongoDB, SQL, Kafka, Snowflake, and other RDBMS systems. This role has no managerial responsibilities; it is an individual contributor role reporting to the Sr. Manager, Reliability Engineering.

What Your Responsibilities Will Be: 8+ years of experience managing MongoDB on-premises and in Atlas Cloud. Be a part of the database team developing next-generation database systems. Provide administration and performance monitoring services for database-related systems. Develop system administration standards and procedures to maintain good practices. Support backup and recovery strategies. Contribute creatively to improving architectural designs and implementing new architectures. Expertise in delivering efficiency and cost effectiveness. Monitor and support capacity planning and analysis. Monitor performance, troubleshoot issues, and proactively tune databases and workloads. Sound knowledge of Terraform and Grafana; manage infrastructure as code using Terraform and GitLab. Ability to work remotely.

What You'll Need to Be Successful: Working knowledge of MongoDB (6.0 or above). Experience with sharding and replica sets. Working knowledge of database installation, setup, creation, and maintenance processes. Working knowledge of Change Streams and Mongo ETLs to replicate live changes to downstream analytics systems. Experience running MongoDB in containerized environments (EKS clusters). Support for reliability engineering tasks across other database platforms (SQL, MySQL, Postgres, Snowflake, Kafka). Experience with Cloud or Ops Manager (a plus). Understanding of networking components on AWS and GCP clouds. Technical knowledge of backup/recovery, disaster recovery, and high-availability techniques. Strong technical knowledge of writing shell scripts to support database administration. Good understanding of Kafka and Snowflake administration; a good understanding of Debezium, Kafka, Zookeeper, and Snowflake is a plus.
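The listing above names MongoDB Change Streams as the mechanism for replicating live changes to downstream analytics systems. Purely as an illustration (not part of the posting), a minimal PyMongo sketch of that pattern could look like the following; the connection URI, the shop/orders collection, and the print stand-in for a Snowflake loader are hypothetical placeholders, and change streams require a replica set or Atlas cluster.

```python
# Minimal sketch: tailing a MongoDB change stream so inserts/updates can be
# replicated to a downstream analytics system (e.g. Snowflake). The URI,
# database, and collection names are placeholders.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # hypothetical cluster URI
orders = client["shop"]["orders"]

# full_document="updateLookup" returns the post-image of updated documents.
with orders.watch(full_document="updateLookup") as stream:
    for change in stream:
        op = change["operationType"]       # insert, update, delete, ...
        doc = change.get("fullDocument")   # post-image when available
        token = change["_id"]              # resume token to persist
        print(op, doc, token)              # stand-in for a Snowflake loader
```

Persisting the resume token after each processed event is what lets such a tailer restart without missing changes.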

Posted 2 weeks ago

Apply

8.0 - 13.0 years

2 - 11 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Foundit logo

What Your Responsibilities Will Be: Avalara is looking for a data analytics engineer who can solve and scale real-world big data challenges, with end-to-end analytics experience and the ability to tell a complex data story through data models and reliable, applicable metrics. Build and deploy data science models using complex SQL, Python, DBT data modelling, and reusable visualization components (Power BI, Tableau, Hex, R Shiny, etc.). Expert-level experience in Power BI, SQL, and Snowflake. Solve needs at large scale by applying your software engineering and complex data skills. Lead and help develop a roadmap for the area and the team. Analyze fault tolerance and high availability issues, performance and scale challenges, and solve them. Lead programs and collaborate with engineers, product managers, and technical program managers across teams. Understand the trade-offs between consistency, durability, and costs to build solutions that can meet the demands of growing services. Ensure the operational readiness of the services and meet the commitments to our customers regarding availability and performance. Manage end-to-end project plans and ensure on-time delivery. Communicate the status and the big picture to the project team and management. Work with business and engineering teams to identify scope, constraints, dependencies, and risks. Identify risks and opportunities across the business and guide solutions.

What You'll Need to Be Successful: Bachelor's degree in Computer Science or a related field. 8+ years of enterprise-class experience with large-scale cloud solutions in data science/analytics and engineering projects. Expert-level experience in Power BI, SQL, and Snowflake. Experience with data visualization, Python, data modeling, and data storytelling. Experience architecting complex data marts using DBT. Ability to architect and build data solutions that apply data quality and anomaly detection best practices. Experience building production analytics using the Snowflake data platform. Experience with AWS and Snowflake tools and services. Good to have: Snowflake certification is a plus; relevant certifications in data warehousing or cloud platforms; experience architecting complex data marts using DBT and Airflow.

Posted 2 weeks ago

Apply

6.0 - 12.0 years

6 - 12 Lacs

Chennai, Tamil Nadu, India

On-site

Foundit logo

Job Title: DBT Data Engineer. Key Responsibilities: As a DBT Data Engineer, you will: SQL Development & Optimization: Apply strong SQL skills, including Common Table Expressions (CTEs), for data manipulation and optimization. Database Management: Use Snowflake (or equivalent strong database experience) for data handling, including SQL tuning. DBT Implementation: Work with DBT Core or DBT Cloud for data transformation and modeling. Development Best Practices: Adhere to and promote development best practices, including peer reviews and unit testing. Data Modeling: Apply fundamental data modeling concepts such as 3NF, star, and snowflake schemas, and understand the grain of the data. Data Analysis & Profiling: Perform data analysis and data profiling tasks to ensure data quality and understanding.

Mandatory Skills & Experience: Strong SQL skills, including Common Table Expressions (CTEs). Experience with Snowflake or very strong database experience, including SQL tuning. Experience with DBT Core or DBT Cloud. Good understanding of development best practices, peer reviews, and unit testing. Understanding of fundamental data modeling concepts such as 3NF, star, and snowflake schemas, and the grain of data. Experience in data analysis and data profiling tasks.
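As context for the CTE and grain-of-data items above, here is a minimal illustrative sketch (not from the posting) of a CTE that rolls raw orders up to a one-row-per-customer-per-day grain, run through the Snowflake Python connector; the credentials and the orders table are placeholders.

```python
# Minimal sketch of CTE-based SQL run via the Snowflake Python connector.
# Account, credentials, and the ORDERS table are placeholders.
import snowflake.connector

QUERY = """
WITH daily_orders AS (            -- CTE: roll raw orders up to one row per
    SELECT customer_id,           -- customer per day (the "grain")
           DATE_TRUNC('day', order_ts) AS order_day,
           SUM(amount)            AS daily_amount
    FROM   orders
    GROUP  BY customer_id, DATE_TRUNC('day', order_ts)
)
SELECT order_day, AVG(daily_amount) AS avg_daily_spend
FROM   daily_orders
GROUP  BY order_day
ORDER  BY order_day;
"""

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="...",  # placeholders
    warehouse="ANALYTICS_WH", database="DEMO", schema="PUBLIC",
)
try:
    for row in conn.cursor().execute(QUERY):
        print(row)
finally:
    conn.close()
```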

Posted 2 weeks ago

Apply

8.0 - 13.0 years

3 - 11 Lacs

Hyderabad / Secunderabad, Telangana, India

On-site

Foundit logo

What Your Responsibilities Will Be: Avalara is looking for a data analytics engineer who can solve and scale real-world big data challenges, with end-to-end analytics experience and the ability to tell a complex data story through data models and reliable, applicable metrics. Build and deploy data science models using complex SQL, Python, DBT data modelling, and reusable visualization components (Power BI, Tableau, Hex, R Shiny, etc.). Expert-level experience in Power BI, SQL, and Snowflake. Solve needs at large scale by applying your software engineering and complex data skills. Lead and help develop a roadmap for the area and the team. Analyze fault tolerance and high availability issues, performance and scale challenges, and solve them. Lead programs and collaborate with engineers, product managers, and technical program managers across teams. Understand the trade-offs between consistency, durability, and costs to build solutions that can meet the demands of growing services. Ensure the operational readiness of the services and meet the commitments to our customers regarding availability and performance. Manage end-to-end project plans and ensure on-time delivery. Communicate the status and the big picture to the project team and management. Work with business and engineering teams to identify scope, constraints, dependencies, and risks. Identify risks and opportunities across the business and guide solutions.

What You'll Need to Be Successful: Bachelor's degree in Computer Science or a related field. 8+ years of enterprise-class experience with large-scale cloud solutions in data science/analytics and engineering projects. Expert-level experience in Power BI, SQL, and Snowflake. Experience with data visualization, Python, data modeling, and data storytelling. Experience architecting complex data marts using DBT. Ability to architect and build data solutions that apply data quality and anomaly detection best practices. Experience building production analytics using the Snowflake data platform. Experience with AWS and Snowflake tools and services. Good to have: Snowflake certification is a plus; relevant certifications in data warehousing or cloud platforms; experience architecting complex data marts using DBT and Airflow.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

5 - 10 Lacs

Chennai, Tamil Nadu, India

On-site

Foundit logo

Job Title: Senior Data Engineer. Key Responsibilities: As a Senior Data Engineer, you will: Data Pipeline Development: Design, build, and maintain scalable data pipelines using PySpark and Python. AWS Cloud Integration: Work with AWS cloud services (S3, Lambda, Glue, EMR, Redshift) for data ingestion, processing, and storage. ETL Workflow Management: Implement and maintain ETL workflows using DBT and orchestration tools (e.g., Airflow). Data Warehousing: Design and manage data models in Snowflake, ensuring performance and reliability. SQL Optimization: Utilize SQL for querying and optimizing datasets across different databases. Data Integration: Integrate and manage data from MongoDB, Kafka, and other streaming or NoSQL sources. Collaboration & Support: Collaborate with data scientists, analysts, and other engineers to support advanced analytics and Machine Learning (ML) initiatives. Data Quality & Governance: Ensure data quality, lineage, and governance through best practices and tools.

Mandatory Skills & Experience: Strong programming skills in Python and PySpark. Hands-on experience with AWS data services (S3, Lambda, Glue, EMR, Redshift). Proficiency in SQL and experience with DBT for data transformation. Experience with Snowflake for data warehousing. Knowledge of MongoDB, Kafka, and data streaming concepts. Good understanding of data architecture, data modeling, and data governance. Familiarity with large-scale data platforms.

Essential Professional Skills: Excellent problem-solving skills. Ability to work independently or as part of a team. Experience with CI/CD and DevOps practices in a data engineering environment (a plus).

Qualifications: Proven hands-on experience working with large-scale data platforms. Strong background in Python, PySpark, AWS, and modern data warehousing tools such as Snowflake and DBT. Familiarity with NoSQL databases like MongoDB and real-time streaming platforms like Kafka.
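To make the pipeline shape above concrete, here is a minimal, illustrative PySpark sketch (not part of the posting): read raw events from S3, aggregate, and write to Snowflake via the Snowflake Spark connector. The bucket path, table name, and connection options are placeholders, and the connector JAR plus S3 credentials are assumed to be configured on the cluster.

```python
# Minimal PySpark sketch: S3 -> aggregate -> Snowflake. All names are
# placeholders; the Snowflake Spark connector must be on the classpath.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("events-to-snowflake").getOrCreate()

events = spark.read.json("s3a://my-bucket/raw/events/")   # hypothetical path

daily = (events
         .withColumn("event_day", F.to_date("event_ts"))
         .groupBy("event_day", "event_type")
         .count())

sf_options = {  # placeholder connection options for the Spark connector
    "sfURL": "my_account.snowflakecomputing.com",
    "sfUser": "etl_user", "sfPassword": "...",
    "sfDatabase": "ANALYTICS", "sfSchema": "PUBLIC", "sfWarehouse": "ETL_WH",
}
(daily.write.format("net.snowflake.spark.snowflake")
      .options(**sf_options)
      .option("dbtable", "DAILY_EVENT_COUNTS")
      .mode("overwrite")
      .save())
```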

Posted 2 weeks ago

Apply

8.0 - 10.0 years

1 Lacs

Chennai, Tamil Nadu, India

On-site

Foundit logo

As a Solution Architect (Snowflake), you will play a crucial role in designing, building, and maintaining scalable data pipelines and infrastructure. You will work with a variety of technologies, including Scala, Python, Spark, AWS services, and SQL, to support our data processing and analytics needs.

Responsibilities: Collaborate with stakeholders to finalize the scope of enhancements and development projects, and gather detailed requirements. Apply expertise in ETL/ELT processes and tools to design and implement data pipelines that fulfil business requirements. Provide expertise as a technical resource to solve complex business issues that translate into data integration and database systems designs. Migrate and modernize existing legacy ETL jobs for Snowflake, ensuring data integrity and optimal performance. Analyze existing ETL jobs and identify opportunities for creating reusable patterns and components to expedite future development. Develop and implement a configuration-driven data ingestion framework that enables efficient onboarding of new source tables (see the sketch below). Collaborate with cross-functional teams, including business analysts and solution architects, to align data engineering initiatives with business goals. Drive continuous improvement initiatives to enhance data engineering processes, tools, and frameworks. Ensure compliance with data quality, security, and privacy standards across all data engineering activities. Participate in code reviews, provide constructive feedback, and ensure high-quality, maintainable code. Prepare and present technical documentation, including data flow diagrams, ETL specifications, and architectural designs.

Educational Qualifications: Engineering degree (BE/ME/BTech/MTech/BSc/MSc). Cloud certifications (AWS, etc.) and relevant technical certifications in multiple technologies are desirable.

Skills. Mandatory technical skills: Strong experience in Snowflake; must have executed development and migration projects involving Snowflake. Strong working experience in ETL tools (preferably Matillion, DBT, Fivetran, or ADF). Strong SQL-writing skills, including the ability to write complex queries; experience with flattening tables and working with JSON is good to have. Strong understanding of SQL, good coding experience in Python, and experience deploying pipelines into the Snowflake data warehouse. Experience with large databases. Working knowledge of AWS (S3, KMS, and more) or Azure/GCP. Design, develop, and thoroughly test new ETL/ELT code, ensuring accuracy, reliability, and adherence to best practices. Core stack: Snowflake; Python/Spark/JavaScript; AWS/Azure/GCP; SQL. Good-to-have skills: CI/CD (DevOps).
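As an illustration of the configuration-driven ingestion framework mentioned in the responsibilities (a sketch under assumed names, not the client's actual framework): each source table becomes a config entry that renders a Snowflake COPY INTO statement, so onboarding a new table means adding a dict rather than writing new code.

```python
# Minimal sketch of a configuration-driven ingestion framework. Stage,
# table, and file-format names are placeholders; in a real pipeline the
# rendered SQL would run through a Snowflake cursor.
SOURCES = [
    {"table": "CUSTOMERS", "stage": "@raw_stage/customers/", "fmt": "csv_fmt"},
    {"table": "ORDERS",    "stage": "@raw_stage/orders/",    "fmt": "csv_fmt"},
]

def copy_statement(cfg: dict) -> str:
    """Render a COPY INTO statement from one config entry."""
    return (
        f"COPY INTO {cfg['table']} "
        f"FROM {cfg['stage']} "
        f"FILE_FORMAT = (FORMAT_NAME = {cfg['fmt']}) "
        f"ON_ERROR = 'ABORT_STATEMENT';"
    )

for cfg in SOURCES:
    sql = copy_statement(cfg)
    print(sql)  # stand-in for cursor.execute(sql)
```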

Posted 2 weeks ago

Apply

7.0 - 8.0 years

1 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Foundit logo

As a Technical Lead (Azure Snowflake DBT), you will be part of an Agile team building healthcare applications and implementing new features while adhering to the best coding and development standards. Responsibilities: Design, develop, and maintain data processing systems using Azure Snowflake. Design and develop robust data integration solutions using Data Build Tool (DBT) and other data pipeline tools. Work with complex SQL functions and transform large data sets to meet business requirements. Drive creation and maintenance of data models that support analytics use cases and business objectives. Collaborate with various stakeholders, including technical teams, functional SMEs, and business users, to understand and address the data needs. Create low-level design documents and unit test strategies and plans in adherence to defined processes and guidelines. Perform code reviews and unit test plan reviews to ensure high quality of code and deliverables. Ensure data quality and integrity through validation, cleansing, and enrichment processes. Support end-to-end testing and validation, including UAT and product testing. Take ownership of problems, demonstrate a proactive approach to problem solving, and lead solutions to completion. Educational Qualifications: Engineering degree (BE/ME/BTech/MTech/BSc/MSc). Technical certification in multiple technologies is desirable. Skills. Mandatory technical skills: Over 5 years of experience in cloud data architecture and analytics. Proficient in Azure, Snowflake, SQL, and DBT. Extensive experience in designing and developing data integration solutions using DBT and other data pipeline tools. Excellent communication and teamwork skills. Self-initiated problem solver with a strong sense of ownership. Good-to-have skills: Experience in other data processing tools and technologies. Familiarity with agile development methodologies. Strong analytical and problem-solving skills. Experience in the healthcare domain.

Posted 2 weeks ago

Apply

8.0 - 10.0 years

0 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Foundit logo

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Cloud Solution Delivery Lead Consultant to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Data Engineer Lead: Robust hands-on experience with industry-standard tooling and techniques, including SQL, Git, and CI/CD pipelines (mandatory). Management, administration, and maintenance of data streaming tools such as Kafka/Confluent Kafka and Flink. Experience with software support for applications written in Python and SQL. Administration, configuration, and maintenance of Snowflake and DBT. Experience with data product environments that use tools such as Kafka Connect, Snyk, Confluent Schema Registry, Atlan, IBM MQ, SonarQube, Apache Airflow, Apache Iceberg, DynamoDB, Terraform, and GitHub. Debugging issues, root cause analysis, and applying fixes. Management and maintenance of ETL processes (bug fixing and batch job monitoring). Training & Certification: Apache Kafka Administration; Snowflake Fundamentals/Advanced Training. Experience: 8 years of experience in a technical role working with AWS; at least 2 years in a leadership or management role.

About NTT DATA: NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future.

NTT DATA endeavors to make its application process accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status.

Posted 2 weeks ago

Apply

0.0 - 5.0 years

0 - 5 Lacs

Pune, Maharashtra, India

On-site

Foundit logo

As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

Your Role and Responsibilities. In this role, your responsibilities may include: Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results.

Required Education: Bachelor's Degree. Preferred Education: Master's Degree.

Required Technical and Professional Expertise: Expertise in designing and implementing scalable data warehouse solutions on Snowflake, including schema design, performance tuning, and query optimization. Strong experience in building data ingestion and transformation pipelines using Talend to process structured and unstructured data from various sources. Proficiency in integrating data from cloud platforms into Snowflake using Talend and native Snowflake capabilities. Hands-on experience with dimensional and relational data modeling techniques to support analytics and reporting requirements.

Preferred Technical and Professional Experience: Understanding of optimizing Snowflake workloads, including clustering keys, caching strategies, and query profiling. Ability to implement robust data validation, cleansing, and governance frameworks within ETL processes. Proficiency in SQL and/or shell scripting for custom transformations and automation tasks.
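For the clustering-key and query-profiling items under preferred experience, here is a small illustrative sketch (placeholder table and columns, not IBM's code): define a clustering key on a large fact table and inspect clustering quality via SYSTEM$CLUSTERING_INFORMATION.

```python
# Illustrative sketch: add a clustering key and check clustering quality.
# Table/column names are placeholders; `cursor` is assumed to be an open
# Snowflake cursor from snowflake-connector-python.
STATEMENTS = [
    # Cluster by the columns most queries filter on.
    "ALTER TABLE fact_sales CLUSTER BY (sale_date, region);",
    # Returns JSON including average clustering depth; lower = better pruning.
    "SELECT SYSTEM$CLUSTERING_INFORMATION('fact_sales', '(sale_date, region)');",
]

def inspect_clustering(cursor):
    for stmt in STATEMENTS:
        cursor.execute(stmt)
        print(cursor.fetchall())
```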

Posted 2 weeks ago

Apply

0.0 - 5.0 years

0 - 5 Lacs

Navi Mumbai, Maharashtra, India

On-site

Foundit logo

Role Overview: In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role and Responsibilities. Data Strategy and Planning: Develop and implement data architecture strategies that align with organizational goals and objectives. Collaborate with business stakeholders to understand data requirements and translate them into actionable plans. Data Modeling: Design and implement logical and physical data models to support business needs. Ensure data models are scalable, efficient, and comply with industry best practices. Database Design and Management: Oversee the design and management of databases, selecting appropriate database technologies based on requirements. Optimize database performance and ensure data integrity and security. Data Integration: Define and implement data integration strategies to facilitate the seamless flow of information across systems.

Responsibilities: Experience in data architecture and engineering. Proven expertise with the Snowflake data platform. Strong understanding of ETL/ELT processes and data integration. Experience with data modeling and data warehousing concepts. Familiarity with performance tuning and optimization techniques. Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills.

Required Education: Bachelor's Degree. Preferred Education: Master's Degree.

Required Technical and Professional Expertise: Cloud & Data Architecture: AWS, Snowflake. ETL & Data Engineering: AWS Glue, Apache Spark, Step Functions. Big Data & Analytics: Athena, Presto, Hadoop. Database & Storage: SQL, SnowSQL. Security & Compliance: IAM, KMS, Data Masking.

Preferred Technical and Professional Experience: Cloud Data Warehousing: Snowflake (Data Modeling, Query Optimization). Data Transformation: DBT (Data Build Tool) for ELT pipeline management. Metadata & Data Governance: Alation (Data Catalog, Lineage, Governance).
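To illustrate the Data Masking item under Security & Compliance, here is a minimal sketch of a Snowflake dynamic masking policy (placeholder names, not part of the listing): email values are hidden from every role except a privileged one.

```python
# Illustrative sketch: a Snowflake dynamic data masking policy applied to an
# email column. Policy, table, column, and role names are placeholders;
# `cursor` is assumed to be an open Snowflake cursor.
MASKING_DDL = [
    """
    CREATE OR REPLACE MASKING POLICY email_mask AS (val STRING)
    RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() IN ('PII_ADMIN') THEN val
           ELSE '***MASKED***' END;
    """,
    # Attach the policy so non-privileged roles see masked values.
    "ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY email_mask;",
]

def apply_masking(cursor):
    for stmt in MASKING_DDL:
        cursor.execute(stmt)
```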

Posted 2 weeks ago

Apply

5.0 - 7.0 years

5 - 7 Lacs

Pune, Maharashtra, India

On-site

Foundit logo

Required technical and professional expertise: 5+ years of experience with BI tools, with expertise and/or certification in at least one major BI platform (Tableau preferred). Advanced knowledge of SQL, including the ability to write complex stored procedures, views, and functions. Proven capability in data storytelling and visualization, delivering actionable insights through compelling presentations. Excellent communication skills, with the ability to convey complex analytical findings to non-technical stakeholders in a clear, concise, and meaningful way. Experience identifying and analyzing industry trends, geographic variations, competitor strategies, and emerging customer behavior. Preferred technical and professional experience: Troubleshooting capabilities to debug data controls. Capable of converting business requirements into a workable model. Good communication skills, willingness to learn new technologies, team player, self-motivated, positive attitude. Must have a thorough understanding of SQL and advanced SQL (joins and relationships).

Posted 2 weeks ago

Apply

3.0 - 7.0 years

3 - 7 Lacs

Mumbai, Maharashtra, India

On-site

Foundit logo

Job description: A Data Platform Engineer specialises in the design, build, and maintenance of cloud-based data infrastructure and platforms for data-intensive applications and services. They develop Infrastructure as Code and manage the foundational systems and tools for efficient data storage, processing, and management. This role involves architecting robust and scalable cloud data infrastructure, including selecting and implementing suitable storage solutions, data processing frameworks, and data orchestration tools. Additionally, a Data Platform Engineer ensures the continuous evolution of the data platform to meet changing data needs and leverage technological advancements, while maintaining high levels of data security, availability, and performance. They are also tasked with creating and managing processes and tools that enhance operational efficiency, including optimising data flow and ensuring seamless data integration, all of which are essential for enabling developers to build, deploy, and operate data-centric applications efficiently.

Job Description - Grade Specific: A senior leadership role that entails the oversight of multiple teams or a substantial team of data platform engineers, the management of intricate data infrastructure projects, and the making of strategic decisions that shape technological direction within the realm of data platform engineering. Key responsibilities encompass: Strategic Leadership: Leading multiple data platform engineering teams, steering substantial projects, and setting the strategic course for data platform development and operations. Complex Project Management: Supervising the execution of intricate data infrastructure projects, ensuring alignment with client objectives and the delivery of value. Technical and Strategic Decision-Making: Making well-informed decisions concerning data platform architecture, tools, and processes, balancing technical considerations with broader business goals. Influencing Technical Direction: Utilising profound technical expertise in data platform engineering to influence the direction of the team and the client, driving enhancements in data platform technologies and processes. Innovation and Contribution to the Discipline: Serving as innovators and influencers within the field of data platform engineering, contributing to the advancement of the discipline through thought leadership and the sharing of knowledge. Leadership and Mentorship: Offering mentorship and guidance to both managers and technical personnel, cultivating a culture of excellence and innovation within the domain of data platform engineering.

Posted 2 weeks ago

Apply

0.0 - 5.0 years

0 - 5 Lacs

Pune, Maharashtra, India

On-site

Foundit logo

In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your role and responsibilities: Provide expertise in analysis, requirements gathering, design, coordination, customization, testing, and support of reports in the client's environment. Develop and maintain a strong working relationship with business and technical members of the team. Maintain a relentless focus on quality and continuous improvement. Perform root cause analysis of report issues. Handle development and evolutionary maintenance of the environment, performance, capability, and availability. Assist in defining technical requirements and developing solutions. Ensure effective content and source-code management, troubleshooting, and debugging.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise: Tableau Desktop Specialist; strong understanding of SQL for querying databases. Good to have: Python, Snowflake, statistics, ETL experience. Extensive knowledge of creating impactful visualizations using Tableau. Must have a thorough understanding of SQL and advanced SQL (joins and relationships). Must have experience working with different databases and how to blend and create relationships in Tableau. Must have extensive knowledge of creating custom SQL to pull desired data from databases. Troubleshooting capabilities to debug data controls.

Preferred technical and professional experience: Troubleshooting capabilities to debug data controls. Capable of converting business requirements into a workable model. Good communication skills, willingness to learn new technologies, team player, self-motivated, positive attitude. Must have a thorough understanding of SQL and advanced SQL.

Posted 2 weeks ago

Apply

3.0 - 10.0 years

1 - 2 Lacs

Pune, Maharashtra, India

On-site

Foundit logo

Our client is an EU subsidiary of a Global Financial Bank working in multiple markets and asset classes. The DWH/ETL Tester will work closely with the Development Team to design and build interfaces and integrate data from a variety of internal and external data sources into the new Enterprise Data Warehouse environment. The ETL Tester will be primarily responsible for testing the Enterprise Data Warehouse using automation, within industry-recognized ETL standards, architecture, and best practices.

Responsibilities: Perform intake of new ETL projects and initiatives, and make the high-level assessment in collaboration with the leadership of the roadmap. Design the Test Strategy and Test Plan to address the needs of cloud-based ETL pipelines. Contribute to and manage testing deliverables. Ensure the implementation of test standards and best practices for the agile model and contribute to their development. Engage with internal stakeholders in various areas of the organization to seek alignment and collaboration. Deal with external stakeholders/vendors. Identify risks/issues and present associated mitigating actions, taking into account the criticality of the domain of the underlying business. Contribute to continuous improvement of standard testing processes.

Skills: Expert-level knowledge of Data Warehouse and RDBMS concepts. Expertise in new-age cloud-based Data Warehouse solutions: ADF, Snowflake, GCP, etc. Hands-on expertise in writing complex SQL using multiple JOINs and highly complex functions to test various transformations and ETL requirements. Knowledge of and experience in creating test automation for database and ETL testing regression suites. Automation using Selenium with Python (or JavaScript), Python scripts, and shell scripts. Knowledge of framework design and REST API testing of databases using Python. Experience using the Atlassian tool set and Azure DevOps. Experience in code and version management: Git, Bitbucket, Azure Repos, etc.

Qualifications: A bachelor's degree or equivalent experience in computer science or similar. Experience in crafting test strategies and supervising ETL/DWH test activities on multi-platform, sophisticated cloud-based environments. Strong analytical mindset with the ability to extract relevant information from documentation, system data, clients, and colleagues, and analyze the captured information. ISTQB Foundation Certificate in Software Testing. Optional/preferred: experience in the financial industry, knowledge of Regulatory Reporting and the terms/terminology used.

Important to have: Proficiency in English (read/write/speak). Able to demonstrate your ability to learn new technologies. Able to easily adapt to new circumstances/technologies/procedures. Stress-resistant and constructive whatever the context. Able to align with existing standards and act with attention to detail. A true standout colleague who demonstrates good interpersonal skills. Able to summarize complex technical situations in simple terms. Solution- and customer-focused. Good communication skills, a positive attitude, and a competitive but team-oriented focus are key elements to be successful in this challenging environment. Nice to have: Experience in the financial industry, knowledge of Regulatory Reporting and the terms/terminology used.
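To make the "complex SQL to test transformations" requirement concrete, here is a small, self-contained reconciliation sketch; sqlite3 stands in for the real source and warehouse connections, and in practice the same count/sum comparisons would run against, e.g., Snowflake.

```python
# Minimal, runnable sketch of ETL reconciliation checks: compare row counts
# and a column total between a "source" and a "target" table. sqlite3 is a
# stand-in; table and column names are placeholders.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE staging_orders (order_id INT, amount REAL);
    CREATE TABLE fact_orders   (order_id INT, amount REAL);
    INSERT INTO staging_orders VALUES (1, 10.5), (2, 20.0);
    INSERT INTO fact_orders    VALUES (1, 10.5), (2, 20.0);
""")

def scalar(sql):
    """Run a query that returns a single value."""
    return conn.execute(sql).fetchone()[0]

checks = {
    "row_count":  ("SELECT COUNT(*) FROM staging_orders",
                   "SELECT COUNT(*) FROM fact_orders"),
    "amount_sum": ("SELECT SUM(amount) FROM staging_orders",
                   "SELECT SUM(amount) FROM fact_orders"),
}
for name, (src_sql, tgt_sql) in checks.items():
    src, tgt = scalar(src_sql), scalar(tgt_sql)
    status = "PASS" if src == tgt else "FAIL"
    print(f"{name}: source={src} target={tgt} -> {status}")
```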

Posted 2 weeks ago

Apply

4.0 - 8.0 years

4 - 8 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Foundit logo

Experience in data architecture and engineering. Proven expertise with the Snowflake data platform. Strong understanding of ETL/ELT processes and data integration. Experience with data modeling and data warehousing concepts. Familiarity with performance tuning and optimization techniques. Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise: Cloud & Data Architecture: AWS, Snowflake. ETL & Data Engineering: AWS Glue, Apache Spark, Step Functions. Big Data & Analytics: Athena, Presto, Hadoop. Database & Storage: SQL, SnowSQL. Security & Compliance: IAM, KMS, Data Masking.

Preferred technical and professional experience: Cloud Data Warehousing: Snowflake (Data Modeling, Query Optimization). Data Transformation: DBT (Data Build Tool) for ELT pipeline management. Metadata & Data Governance: Alation (Data Catalog, Lineage, Governance).

Posted 2 weeks ago

Apply

5.0 - 15.0 years

22 - 24 Lacs

Gurgaon / Gurugram, Haryana, India

On-site

Foundit logo

This role is for one of Weekday's clients. Salary range: Rs 2200000 - Rs 2400000 (i.e., INR 22-24 LPA). Min experience: 5 years. Location: Bengaluru, Chennai, Gurgaon. Job type: full-time.

We are looking for an experienced Snowflake Developer to join our Data Engineering team. The ideal candidate will possess a deep understanding of Data Warehousing, SQL, ETL tools like Informatica, and visualization platforms such as Power BI. This role involves building scalable data pipelines, optimizing data architectures, and collaborating with cross-functional teams to deliver impactful data solutions.

Requirements. Key Responsibilities: Data Engineering & Warehousing: Leverage over 5 years of hands-on experience in Data Engineering with a focus on Data Warehousing and Business Intelligence. Pipeline Development: Design and maintain ELT pipelines using Snowflake, Fivetran, and DBT to ingest and transform data from multiple sources. SQL Development: Write and optimize complex SQL queries and stored procedures to support robust data transformations and analytics. Data Modeling & ELT: Implement advanced data modeling practices, including SCD Type-2, and build high-performance ELT workflows using DBT. Requirement Analysis: Partner with business stakeholders to capture data needs and convert them into scalable technical solutions. Data Quality & Troubleshooting: Conduct root cause analysis on data issues, maintain high data integrity, and ensure reliability across systems. Collaboration & Documentation: Collaborate with engineering and business teams; develop and maintain thorough documentation for pipelines, data models, and processes.

Skills & Qualifications: Expertise in Snowflake for large-scale data warehousing and ELT operations. Strong SQL skills with the ability to create and manage complex queries and procedures. Proven experience with Informatica PowerCenter for ETL development. Proficiency with Power BI for data visualization and reporting. Hands-on experience with Fivetran for automated data integration. Familiarity with DBT, Sigma Computing, Tableau, and Oracle. Solid understanding of data analysis, requirement gathering, and source-to-target mapping. Knowledge of cloud ecosystems such as Azure (including ADF, Databricks); experience with AWS or GCP is a plus. Experience with workflow orchestration tools like Airflow, Azkaban, or Luigi. Proficiency in Python for scripting and data processing (Java or Scala is a plus). Bachelor's or graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related field.

Key Tools & Technologies: Snowflake, SnowSQL, Snowpark, SQL, Informatica, Power BI, DBT, Python, Fivetran, Sigma Computing, Tableau, Airflow, Azkaban, Azure, Databricks, ADF.
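The SCD Type-2 item above has a standard two-step implementation, sketched below with placeholder table and column names (the posting does not prescribe this exact code, and DBT snapshots automate the same pattern): first expire current dimension rows whose attributes changed, then insert the new versions.

```python
# Illustrative SCD Type-2 load cycle expressed as two Snowflake statements.
# dim_customer / stg_customer and the attrs_hash column are placeholders.
EXPIRE = """
UPDATE dim_customer d
SET    valid_to = CURRENT_TIMESTAMP(), is_current = FALSE
WHERE  d.is_current
AND    EXISTS (SELECT 1 FROM stg_customer s
               WHERE s.customer_id = d.customer_id
               AND   s.attrs_hash <> d.attrs_hash);
"""

INSERT_NEW = """
INSERT INTO dim_customer
      (customer_id, name, segment, attrs_hash, valid_from, valid_to, is_current)
SELECT s.customer_id, s.name, s.segment, s.attrs_hash,
       CURRENT_TIMESTAMP(), NULL, TRUE
FROM   stg_customer s
LEFT   JOIN dim_customer d
       ON d.customer_id = s.customer_id AND d.is_current
WHERE  d.customer_id IS NULL;   -- brand-new keys or keys just expired above
"""

def run_scd2(cursor):
    """Apply one SCD-2 load cycle; `cursor` is an open Snowflake cursor."""
    cursor.execute(EXPIRE)      # close out changed current rows
    cursor.execute(INSERT_NEW)  # insert new versions / new keys
```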

Posted 2 weeks ago

Apply

3.0 - 10.0 years

5 - 7 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Foundit logo

Key Skills: Snowflake, SQL, AWS, DBT. Must-have Skills: 3+ years of data engineering experience, including practical experience using Snowflake for data engineering tasks. A working knowledge of RESTful APIs, SQL, semi-structured datasets, and cloud-native concepts. Experience in ELT tools (Fivetran, Qlik Replicate, Matillion); experience with DBT is a must. Experience creating stored procedures and UDFs in Snowflake. Experience in Snowpark (Python). Source data from data lakes, APIs, and on-premises systems. Transform, replicate, and share data across cloud platforms. Design end-to-end near-real-time streams. Design scalable compute solutions for data engineering workloads. Evaluate performance metrics.
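For the Snowpark (Python) and UDF items above, here is a minimal illustrative Snowpark sketch (connection parameters and the ORDERS table are placeholders, not part of the listing): register a Python function as a temporary Snowflake UDF and call it in a DataFrame query.

```python
# Minimal Snowpark (Python) sketch: register a UDF and use it in a query.
# Connection values and table/column names are placeholders.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import call_udf, col
from snowflake.snowpark.types import FloatType

session = Session.builder.configs({
    "account": "my_account", "user": "etl_user", "password": "...",
    "warehouse": "ETL_WH", "database": "DEMO", "schema": "PUBLIC",
}).create()

# Register a simple Python function as a temporary Snowflake UDF.
session.udf.register(
    lambda amount: amount * 1.18,          # hypothetical: add 18% tax
    name="with_tax", return_type=FloatType(),
    input_types=[FloatType()], replace=True,
)

df = session.table("ORDERS").select(
    col("ORDER_ID"),
    call_udf("with_tax", col("AMOUNT")).alias("AMOUNT_WITH_TAX"),
)
df.show()
```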

Posted 2 weeks ago

Apply

0.0 - 5.0 years

0 - 5 Lacs

Pune, Maharashtra, India

On-site

Foundit logo

Role Overview: In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers). These centers are where we provide deep technical and industry expertise to a wide range of public and private sector clients globally. Our delivery centers offer clients locally based skills and technical expertise to drive innovation and the adoption of new technology.

Your Role and Responsibilities: Provide expertise in analysis, requirements gathering, design, coordination, customization, testing, and support of reports in the client's environment. Develop and maintain a strong working relationship with business and technical members of the team. Maintain a relentless focus on quality and continuous improvement. Perform root cause analysis of report issues. Handle development and evolutionary maintenance of the environment, performance, capability, and availability. Assist in defining technical requirements and developing solutions. Ensure effective content and source-code management, troubleshooting, and debugging.

Required Education: Bachelor's Degree. Preferred Education: Master's Degree.

Required Technical and Professional Expertise: Tableau Desktop Specialist; strong understanding of SQL for querying databases. Good to have: Python, Snowflake, statistics, ETL experience. Extensive knowledge of creating impactful visualizations using Tableau. Must have a thorough understanding of SQL and advanced SQL (joins and relationships). Must have experience working with different databases and how to blend and create relationships in Tableau. Must have extensive knowledge of creating custom SQL to pull desired data from databases. Troubleshooting capabilities to debug data controls.

Preferred Technical and Professional Experience: Troubleshooting capabilities to debug data controls. Capable of converting business requirements into workable models. Good communication skills, willingness to learn new technologies, team player, self-motivated, positive attitude. Must have a thorough understanding of SQL and advanced SQL (joins and relationships).

Posted 2 weeks ago

Apply

2.0 - 5.0 years

2 - 5 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Foundit logo

This position will play a key role on the First Line Risk and Control team, supporting Consumer Monitoring & Testing and driving the implementation of horizontal Consumer risk programs. This individual will be responsible for executing risk-based testing and liaising with product, operations, compliance, and legal teams to ensure regulatory adherence. The role will also provide the opportunity to drive development and enhancement of risk and control programs. Execute testing and monitoring of regulatory, policy, and process compliance. Gather and synthesize data to determine root causes and trends related to testing failures. Propose effective and efficient methods to enhance testing and sampling strategies (including automation) to ensure the most effective risk detection, analyses, and control solutions. Proactively identify potential business risks, process deficiencies, and improvement opportunities, and make recommendations for additional controls and corrective action to enhance the efficiency and effectiveness of risk mitigation processes. Maintain effective communication with stakeholders and support teams in remediation of testing errors; assist with implementation of corrective actions related to testing failures and non-compliance with policies and procedures. Identify continuous improvement opportunities to meet changing requirements, driving maximum visibility to the executive audience. Work closely with enterprise risk teams to ensure business line risks are being shared and rolled up to firm-wide risk summaries.

Your Skills: 2-4 years of testing, audit, or compliance experience in consumer financial services. Bachelor's degree or equivalent military experience. Knowledge of applicable U.S. federal and state consumer lending laws and regulations as well as industry association standards, including, among others, the Truth in Lending Act (Reg Z), Equal Credit Opportunity Act (Reg B), Fair Credit Reporting Act (Reg V), and UDAAP. Understanding of test automation frameworks (data-driven, hybrid-driven, etc.). Knowledge of testing concepts, methodologies, and technologies. Genuine excitement and passion for leading root cause analysis, troubleshooting technical process failures, and implementing fixes to operationalize a process. Analytical, critical thinking, and problem-solving skills. Highly motivated self-starter with strong organizational skills, attention to detail, and the ability to remain organized in a fast-paced environment. Interpersonal and relationship management skills. Integrity, ethical standards, and sound judgment; ability to exercise discretion with respect to sensitive information. Ability to summarize observations and present them in a clear, concise manner to peers, managers, and senior Consumer Compliance management. Ability to quickly grasp complex concepts, including global business and regulatory matters. Confidence in expressing a point of view with management. Plus: CPA, audit experience, CRCM, proficiency in Aqua Data Studio, Snowflake, Splunk, Excel macros, Tableau, Hadoop/PySpark/Spark/Python/R.

Posted 3 weeks ago

Apply

0.0 years

0 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Foundit logo

Ready to shape the future of work? At Genpact, we don't just adapt to change, we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's industry-first accelerator is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. Our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today.

Inviting applications for the role of Consultant - Sr. Data Engineer (DBT + Snowflake)! In this role, the Sr. Data Engineer is responsible for providing technical direction and leading a group of one or more developers to address a goal.

Job Description: Develop, implement, and optimize data pipelines using Snowflake, with a focus on Cortex AI capabilities. Extract, transform, and load (ETL) data from various sources into Snowflake, ensuring data integrity and accuracy. Implement Conversational AI solutions using Snowflake Cortex AI to facilitate data interaction through chatbot agents. Collaborate with data scientists and AI developers to integrate predictive analytics and AI models into data workflows. Monitor and troubleshoot data pipelines to resolve data discrepancies and optimize performance. Utilize Snowflake's advanced features, including Snowpark, Streams, and Tasks, to enable data processing and analysis. Develop and maintain data documentation, best practices, and data governance protocols. Ensure data security, privacy, and compliance with organizational and regulatory guidelines.

Responsibilities: Bachelor's degree in Computer Science, Data Engineering, or a related field. Experience in data engineering, including experience working with Snowflake. Proven experience in Snowflake Cortex AI, focusing on data extraction, chatbot development, and Conversational AI. Strong proficiency in SQL, Python, and data modeling. Experience with data integration tools (e.g., Matillion, Talend, Informatica). Knowledge of cloud platforms such as AWS, Azure, or GCP. Excellent problem-solving skills, with a focus on data quality and performance optimization. Strong communication skills and the ability to work effectively in a cross-functional team. Proficiency in using DBT's testing and documentation features to ensure the accuracy and reliability of data transformations. Understanding of data lineage and metadata management concepts, and the ability to track and document data transformations using DBT's lineage capabilities. Understanding of software engineering best practices and the ability to apply these principles to DBT development, including version control, code reviews, and automated testing. Should have experience building data ingestion pipelines. Should have experience with Snowflake utilities such as SnowSQL, Snowpipe, bulk copy, Snowpark, tables, Tasks, Streams, Time Travel, Cloning, Optimizer, Metadata Manager, data sharing, stored procedures and UDFs, and Snowsight. Should have good experience implementing CDC or SCD Type 2. Proficiency in working with Airflow or other workflow management tools for scheduling and managing ETL jobs. Good to have: experience with repository tools like GitHub/GitLab or Azure Repos.

Qualifications/Minimum qualifications: B.E./Master's in Computer Science, Information Technology, or Computer Engineering, or any equivalent degree, with good IT experience and relevant working experience as a Sr. Data Engineer with DBT + Snowflake skill sets. Skill matrix: DBT (Core or Cloud), Snowflake, AWS/Azure, SQL, ETL concepts, Airflow or any orchestration tool, data warehousing concepts.

Why join Genpact? Be a transformation leader - work at the cutting edge of AI, automation, and digital innovation. Make an impact - drive change for global enterprises and solve business challenges that matter. Accelerate your career - get hands-on experience, mentorship, and continuous learning opportunities. Work with the best - join 140,000+ bold thinkers and problem-solvers who push boundaries every day. Thrive in a values-driven culture - our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress. Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
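As a concrete (and purely illustrative) example of the Streams and Tasks utilities listed above, with placeholder object names: a stream captures changes on a raw table, and a scheduled task applies pending rows forward whenever the stream has data.

```python
# Illustrative Snowflake Streams + Tasks deployment. Table, stream, task,
# and warehouse names are placeholders; DDL runs via a Snowflake cursor.
DDL = [
    # Stream: records inserts/updates/deletes on RAW_ORDERS since last read.
    "CREATE OR REPLACE STREAM raw_orders_stream ON TABLE raw_orders;",
    # Task: every 5 minutes, apply pending stream rows to the curated table.
    """
    CREATE OR REPLACE TASK apply_orders
      WAREHOUSE = ETL_WH
      SCHEDULE  = '5 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('RAW_ORDERS_STREAM')
    AS
      INSERT INTO curated_orders
      SELECT order_id, amount, updated_at
      FROM   raw_orders_stream
      WHERE  METADATA$ACTION = 'INSERT';
    """,
    "ALTER TASK apply_orders RESUME;",  # tasks are created suspended
]

def deploy(cursor):
    for stmt in DDL:
        cursor.execute(stmt)
```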

Posted 3 weeks ago

Apply

0.0 years

0 Lacs

Hyderabad / Secunderabad, Telangana, India

On-site

Foundit logo

Ready to shape the future of work At Genpact, we don&rsquot just adapt to change&mdashwe drive it. AI and digital innovation are redefining industries, and we&rsquore leading the charge. Genpact&rsquos , our industry-first accelerator, is an example of how we&rsquore scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to , our breakthrough solutions tackle companies most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that&rsquos shaping the future, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions - we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation , our teams implement data, technology, and AI to create tomorrow, today. Get to know us at and on , , , and . Inviting applications for the role of Lead Consultant- Snowflake Data Engineer ( Snowflake+ Python+Cloud ) ! In this role, the Snowflake Data Engineer is responsible for providing technical direction and lead a group of one or more developer to address a goal. Job Description: Experience in IT industry Working experience with building productionized data ingestion and processing data pipelines in Snowflake Strong understanding on Snowflake Architecture Fully well-versed with data warehousing concepts. Expertise and excellent understanding of Snowflake features and integration of Snowflake with other data processing. Able to create the data pipeline for ETL/ELT Excellent presentation and communication skills, both written and verbal Ability to problem solve and architect in an environment with unclear requirements. Able to create the high level and low-level design document based on requirement. Hands on experience in configuration, troubleshooting, testing and managing data platforms, on premises or in the cloud. Awareness on data visualisation tools and methodologies Work independently on business problems and generate meaningful insights Good to have some experience/knowledge on Snowpark or Streamlit or GenAI but not mandatory. Should have experience on implementing Snowflake Best Practices Snowflake SnowPro Core Certification will be added an advantage Roles and Responsibilities: Requirement gathering, creating design document, providing solutions to customer, work with offshore team etc. Writing SQL queries against Snowflake, developing scripts to do Extract, Load, and Transform data. Hands-on experience with Snowflake utilities such as SnowSQL , Bulk copy, Snowpipe , Tasks, Streams, Time travel, Cloning, Optimizer, Metadata Manager, data sharing, stored procedures and UDFs, Snowsight , Steamlit Have experience with Snowflake cloud data warehouse and AWS S3 bucket or Azure blob storage container for integrating data from multiple source system. Should have have some exp on AWS services (S3, Glue, Lambda) or Azure services ( Blob Storage, ADLS gen2, ADF) Should have good experience in Python/ Pyspark.integration with Snowflake and cloud (AWS/Azure) with ability to leverage cloud services for data processing and storage. Proficiency in Python programming language, including knowledge of data types, variables, functions, loops, conditionals, and other Python-specific concepts. 
Qualifications we seek in you!
Minimum qualifications: B.E. or Master's in Computer Science, Information Technology, or Computer Engineering, or any equivalent degree, with good IT experience and relevant experience as a Snowflake Data Engineer.
Skill Matrix: Snowflake, Python/PySpark, AWS/Azure, ETL concepts, and data warehousing concepts

Why join Genpact?
Be a transformation leader - Work at the cutting edge of AI, automation, and digital innovation
Make an impact - Drive change for global enterprises and solve business challenges that matter
Accelerate your career - Get hands-on experience, mentorship, and continuous learning opportunities
Work with the best - Join 140,000+ bold thinkers and problem-solvers who push boundaries every day
Thrive in a values-driven culture - Our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Posted 3 weeks ago

Apply

0.0 years

0 Lacs

Hyderabad / Secunderabad, Telangana, Telangana, India

On-site

Foundit logo

Ready to build the future with AI? At Genpact, we don't just keep up with technology, we set the pace. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's industry-first accelerator is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. Our breakthrough solutions, from large-scale models onwards, tackle companies' most complex challenges. If you thrive in a fast-moving, innovation-driven environment, love building and deploying cutting-edge AI solutions, and want to push the boundaries of what's possible, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today.

Inviting applications for the role of Consultant - PySpark/Python Data Engineer!

We are looking for a passionate Python developer to join our team at Genpact. You will be responsible for developing and implementing high-quality software solutions for data transformation and analytics using cutting-edge programming features and frameworks, and for collaborating with other teams in the firm to define, design, and ship new features. As an active part of our company, you will brainstorm and chalk out solutions to suit our requirements and meet our business goals. You will also work on data engineering problems and build data pipelines. You will get ample opportunities to work on challenging and innovative projects using the latest technologies and tools. If you enjoy working in a fast-paced and collaborative environment, we encourage you to apply for this exciting role. We offer industry-standard compensation packages, relocation assistance, and professional growth and development opportunities.

Responsibilities
Develop, test, and maintain high-quality solutions using PySpark/Python
Participate in the entire software development lifecycle, building, testing, and delivering high-quality data pipelines (a minimal pipeline sketch follows this listing's qualifications)
Collaborate with cross-functional teams to identify and solve complex problems
Write clean and reusable code that can be easily maintained and scaled
Keep up to date with emerging trends and technologies in Python development

Qualifications we seek in you!
Minimum qualifications
Experience as a Python developer with a strong portfolio of projects
Bachelor's degree in Computer Science, Software Engineering, or a related field
Experience developing pipelines on cloud platforms such as AWS or Azure using AWS Glue or ADF
In-depth understanding of the Python software development stack, ecosystem, frameworks, and tools such as NumPy, SciPy, pandas, Dask, spaCy, NLTK, Great Expectations, Splink, and PyTorch
Experience with data platforms such as Databricks/Snowflake
Experience with front-end development using HTML or Python
Familiarity with database technologies such as SQL and NoSQL
Excellent problem-solving ability with solid communication and collaboration skills

Preferred skills and qualifications
Experience with popular Python frameworks such as Django, Flask, FastAPI, or Pyramid
Knowledge of GenAI concepts and LLMs
Contributions to open-source Python projects or active involvement in the Python community
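As a flavor of the pipeline work above, here is a minimal PySpark extract-transform-load sketch. Paths and column names are hypothetical, and a production job would add schema enforcement, logging, and data-quality checks (for example with Great Expectations).

```python
# Hedged sketch: a small PySpark ETL job. Paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: raw CSV; in practice this could be an s3a:// or abfss:// path.
raw = spark.read.option("header", True).csv("/data/raw/orders.csv")

# Transform: cast types, drop duplicates, filter bad rows, derive a partition key.
clean = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
       .dropDuplicates(["order_id"])
       .filter(F.col("amount") > 0)
       .withColumn("order_month", F.date_format("order_date", "yyyy-MM"))
)

# Load: partitioned Parquet for downstream analytics consumers.
clean.write.mode("overwrite").partitionBy("order_month").parquet("/data/curated/orders")
spark.stop()
```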
Why join Genpact?
Lead AI-first transformation - Build and scale AI solutions that redefine industries
Make an impact - Drive change for global enterprises and solve business challenges that matter
Accelerate your career - Gain hands-on experience, world-class training, mentorship, and AI certifications to advance your skills
Grow with the best - Learn from top engineers, data scientists, and AI experts in a dynamic, fast-moving workplace
Committed to ethical AI - Work in an environment where governance, transparency, and security are at the core of everything we build
Thrive in a values-driven culture - Our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress

Come join the 140,000+ coders, tech shapers, and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Posted 3 weeks ago

Apply

1.0 - 4.0 years

1 - 4 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Foundit logo

The Analyst/Associate role (1-4 years of experience) involves collaborating across business and engineering teams, translating business problems into detailed data specifications, and then designing, building, and deploying scalable relational data models that serve as the source for business users' and consumers' analytical use cases. The role requires end-to-end skills in data engineering, ETL, data modeling, distributed databases, and math/logic, plus a good grasp of SDLC best practices and data governance. In the course of building this data solution, the engineer will benefit from, and be required to learn, financial data engineering as it is performed at a top-tier financial firm.

Skills & Experience We Are Looking For
Academic qualifications: a Bachelor's or Master's degree in a computational field (Computer Science, Applied Mathematics, Engineering, or a related quantitative discipline)
1-4 years of relevant work experience in a global, team-oriented environment
Strong object-oriented design skills and hands-on experience in at least one programming language (such as Java, Python, or C++), using object-oriented design techniques and best practices
Deep understanding of the multidimensionality of data, data curation, and data quality, covering traceability, security, performance latency, and correctness across supply and demand processes
In-depth knowledge of relational and columnar SQL databases, including database design
Expertise in data warehousing concepts (e.g. star schema, entitlement implementations, SQL modeling, milestoning, indexing, partitioning); a minimal star-schema sketch follows this listing
Excellent communication skills and the ability to work with subject matter experts to extract critical business concepts and gather business requirements
Independent thinker, willing to engage, challenge, or learn
Ability to stay commercially focused and to always push for quantifiable commercial impact
Strong work ethic, a sense of ownership, and urgency
Strong analytical and problem-solving skills
Ability to collaborate effectively across global teams and communicate complex ideas in a simple manner

Preferred Qualifications
Industry experience in data engineering
Exposure to cloud databases (such as Snowflake or SingleStore)
Exposure to cloud infrastructure (AWS, Azure, or GCP) and infrastructure as code (Terraform)
Experience with programming for extract, transform, load (ETL) operations and data analysis
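To make the star-schema expectation concrete, here is an illustrative sketch using Python's standard-library sqlite3 module; the trade fact table and its dimensions are hypothetical, and the same shape carries over to the relational and columnar engines mentioned above.

```python
# Illustrative star schema: one fact table keyed to two dimension tables.
# The trades domain and all names here are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension tables hold descriptive attributes...
cur.execute("CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, cal_date TEXT, year INTEGER)")
cur.execute("CREATE TABLE dim_instrument (instrument_key INTEGER PRIMARY KEY, ticker TEXT, asset_class TEXT)")

# ...while the fact table holds measures plus foreign keys to each dimension.
cur.execute("""
    CREATE TABLE fact_trade (
        date_key INTEGER REFERENCES dim_date(date_key),
        instrument_key INTEGER REFERENCES dim_instrument(instrument_key),
        quantity REAL,
        notional REAL
    )
""")

cur.execute("INSERT INTO dim_date VALUES (20240102, '2024-01-02', 2024)")
cur.execute("INSERT INTO dim_instrument VALUES (1, 'ACME', 'Equity')")
cur.execute("INSERT INTO fact_trade VALUES (20240102, 1, 100, 5230.0)")

# A typical analytical query joins the fact to its dimensions and aggregates.
cur.execute("""
    SELECT i.ticker, d.year, SUM(f.notional)
    FROM fact_trade f
    JOIN dim_date d ON f.date_key = d.date_key
    JOIN dim_instrument i ON f.instrument_key = i.instrument_key
    GROUP BY i.ticker, d.year
""")
print(cur.fetchall())   # [('ACME', 2024, 5230.0)]
conn.close()
```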

Posted 3 weeks ago

Apply

2.0 - 4.0 years

2 - 4 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Foundit logo

Role Overview: The Site Reliability Engineering team is responsible for the design, implementation, and end-to-end ownership of the infrastructure platform and services that protect Trellix's consumer security customers. The services provide continuous protection to our customers, with a very strong focus on quality and an extensible services platform for internal partners and product teams.

This role is a Site Reliability Engineer for commercial cloud-native solutions, deployed and managed in public cloud environments such as AWS and GCP. You will be part of a team responsible for the Trellix Cloud Services that provide continuous protection for the endpoint products. Responsibilities of this role include supporting cloud service measurement, monitoring, and reporting, deployments, and security. You will contribute to improving overall operational quality through common practices and by working with the Engineering, QA, and product DevOps teams. You will also support efforts that improve the operational excellence and availability of Trellix production environments.

You will have access to the latest tools and technology, and an incredible career path with the world's cybersecurity leader. You will have the opportunity to immerse yourself in complex and demanding deployment architectures and see the big picture, all while helping to drive continuous improvement in all aspects of a dynamic and high-performing engineering organization. If you are passionate about running, and continuously improving as, a world-class Site Reliability Engineering team, this is a unique opportunity to build your career with us and gain experience working with high-performance cloud systems.

About the Role:
Be part of a global 24x7x365 team providing operational coverage, including event response and recovery efforts for critical services
Periodically deploy features, patches, and hotfixes to maintain the security posture of our cloud services
Work in shifts on a rotational basis and participate in on-call duties
Take ownership of and responsibility for the high availability of production environments
Contribute to the monitoring of systems, applications, and supporting data
Report on system uptime and availability
Collaborate with other team members on best practices
Assist with creating and updating runbooks and SOPs
Build a strong relationship with the Cloud DevOps, Dev, and QA teams and become a domain expert for the cloud services in your remit
You will be provided the support required for growth and development in this role

About you:
2 to 4 years of hands-on experience supporting production of large-scale cloud services
Strong production support background and experience with in-depth troubleshooting
Experience working with solutions in both Linux and Windows environments
Experience using modern monitoring and alerting tools (Prometheus, Grafana, PagerDuty, etc.); a minimal probe sketch follows this list
Excellent written and verbal communication skills
Experience with Python or other scripting languages
Proven ability to work independently in deploying, testing, and troubleshooting systems
Experience supporting high-availability systems and scalable solutions hosted on AWS or GCP
Familiarity with security tools and practices (Wiz, Tenable)
Familiarity with containerization and associated management tools (Docker, Kubernetes)
Significant experience developing and maintaining relationships with a wide range of customers at all levels
Understanding of Incident, Change, Problem, and Vulnerability Management processes
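As a small illustration of the monitoring and scripting skills listed above, here is a hedged sketch of a Python health probe that exports metrics for Prometheus to scrape, assuming the prometheus_client package. The probed endpoint and metric names are hypothetical; real alert routing would live in Prometheus, Grafana, or PagerDuty.

```python
# Hedged sketch: periodic service health probe exposing Prometheus metrics.
# The endpoint URL and metric names are hypothetical.
import time
import urllib.request
from prometheus_client import start_http_server, Gauge, Counter

SERVICE_URL = "https://example.internal/healthz"   # hypothetical endpoint

up = Gauge("service_up", "1 if the last health check succeeded, else 0")
checks_total = Counter("health_checks_total", "Total health checks performed")

def probe() -> None:
    checks_total.inc()
    try:
        with urllib.request.urlopen(SERVICE_URL, timeout=5) as resp:
            up.set(1 if resp.status == 200 else 0)
    except OSError:            # covers URLError, timeouts, connection errors
        up.set(0)

if __name__ == "__main__":
    start_http_server(9100)    # expose /metrics for the Prometheus scraper
    while True:
        probe()
        time.sleep(30)         # probe interval; alert rules live server-side
```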
Desired:
Awareness of ITIL best practices
AWS certification and/or Kubernetes certification
Experience with Snowflake
Automation/CI/CD experience (Jenkins, Ansible, GitHub Actions, Argo CD)

Company Benefits and Perks: We believe that the best solutions are developed by teams who embrace each other's unique experiences, skills, and abilities. We work hard to create a dynamic workforce where we encourage everyone to bring their authentic selves to work every day. We offer a variety of social programs, flexible work hours, and family-friendly benefits to all of our employees.
Retirement Plans
Medical, Dental and Vision Coverage
Paid Time Off
Paid Parental Leave
Support for Community Involvement

We're serious about our commitment to a workplace where everyone can thrive and contribute to our industry-leading products and customer support, which is why we prohibit discrimination and harassment based on race, color, religion, gender, national origin, age, disability, veteran status, marital status, pregnancy, gender expression or identity, sexual orientation, or any other legally protected status.

Posted 3 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies