
2244 Snowflake Jobs - Page 41

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

10.0 - 18.0 years

0 - 3 Lacs

Hyderabad

Work from Office

Greetings from Cognizant! We have an exciting opportunity in Azure infrastructure with Cognizant; if you match the criteria below, apply with us immediately. Skill: Azure Data Factory. Experience: 11 to 18 years. Location: Hyderabad. Notice period: immediate to 30 days. Interview mode: virtual. Required qualifications: Azure Data Engineer profiles, strong in Azure ADF, Snowflake, SQL, and DWH concepts.

Posted 3 weeks ago

Apply

6.0 - 11.0 years

8 - 15 Lacs

Chennai

Hybrid

Role & responsibilities: Backend Developer (4+ years) with strong expertise in Python and SQL technologies to develop and maintain high-performance big data architecture. Should have hands-on experience with Hive, Impala, and Airflow, plus project experience in Agile methodologies and competitive-level Snowflake skills. Key skills: Python, big data technologies, Snowflake, and good communication.

Posted 3 weeks ago

Apply

8.0 - 13.0 years

17 - 20 Lacs

Bengaluru

Work from Office

Project description: We are seeking an experienced Senior Project Manager with a strong background in delivering data engineering and Python-based development projects. In this role, you will manage cross-functional teams and lead Agile delivery for high-impact, cloud-based data initiatives. You'll work closely with data engineers, scientists, architects, and business stakeholders to ensure projects are delivered on time, within scope, and aligned with strategic objectives. The ideal candidate combines technical fluency, strong leadership, and Agile delivery expertise in data-centric environments. Responsibilities: Lead and manage data engineering and Python-based development projects, ensuring timely delivery and alignment with business goals. Work closely with data engineers, data scientists, architects, and product owners to gather requirements and define project scope. Translate complex technical requirements into actionable project plans and user stories. Oversee sprint planning, backlog grooming, daily stand-ups, and retrospectives in Agile/Scrum environments. Ensure best practices in Python coding, data pipeline design, and cloud-based data architecture are followed. Identify and mitigate risks, manage dependencies, and escalate issues when needed. Own stakeholder communications, reporting, and documentation of all project artifacts. Track KPIs and delivery metrics to ensure accountability and continuous improvement. Skills - Must have: Experience: Minimum 8+ years of project management experience, including 3+ years managing data and Python-based development projects. Agile expertise: Strong experience delivering projects in Agile/Scrum environments with distributed or hybrid teams. Technical fluency: Solid understanding of Python, data pipelines, and ETL/ELT workflows. Familiarity with cloud platforms such as AWS, Azure, or GCP. Exposure to tools like Airflow, dbt, Spark, Databricks, or Snowflake is a plus. Tools: Proficiency with JIRA, Confluence, Git, and project dashboards (e.g., Power BI, Tableau). Soft skills: Strong communication, stakeholder management, and leadership skills. Ability to translate between technical and non-technical audiences. Skilled in risk management, prioritization, and delivery tracking. Nice to have: N/A. Other: Languages: English (C1 Advanced). Seniority: Senior.

Posted 3 weeks ago

Apply

5.0 - 9.0 years

25 - 30 Lacs

Pune, Gurugram, Bengaluru

Hybrid

Looking for a Snowflake Data Engineer with the following technical skills: able to write SnowSQL queries and stored procedures; good understanding of Snowflake warehouse architecture and design; sound troubleshooting skills; knowledge of how to fix query performance issues in Snowflake; familiarity with AWS services (S3, Lambda functions, Glue jobs, etc.); hands-on PySpark. The candidate must also have the right attitude, quick learning, and analytical skills, and should be a good team player. Insurance (Claims & Settlements) domain knowledge is preferred.
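
For context on the skills listed above, here is a minimal, hypothetical sketch (not from the employer) of the kind of task this role involves: running an analytical query and triaging a slow query from Python with the snowflake-connector-python package. All account, table, and column names are made up.

```python
# Hedged sketch of a routine task for this role: query Snowflake from Python
# and triage recent slow queries. Requires snowflake-connector-python; all
# identifiers below are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # hypothetical account identifier
    user="my_user",
    password="my_password",
    warehouse="ANALYTICS_WH",
    database="CLAIMS_DB",
    schema="PUBLIC",
)
cur = conn.cursor()
try:
    # An ordinary analytical query over a (hypothetical) claims table.
    cur.execute("""
        SELECT policy_id, SUM(settlement_amount) AS total_settled
        FROM claims
        WHERE claim_status = 'SETTLED'
        GROUP BY policy_id
        ORDER BY total_settled DESC
        LIMIT 10
    """)
    for row in cur.fetchall():
        print(row)

    # Performance triage: find the longest-running recent queries via the
    # standard ACCOUNT_USAGE view (requires appropriate privileges).
    cur.execute("""
        SELECT query_id, total_elapsed_time, query_text
        FROM snowflake.account_usage.query_history
        WHERE start_time > DATEADD('day', -1, CURRENT_TIMESTAMP())
        ORDER BY total_elapsed_time DESC
        LIMIT 5
    """)
    for row in cur.fetchall():
        print(row)
finally:
    cur.close()
    conn.close()
```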

Posted 3 weeks ago

Apply

6.0 - 8.0 years

10 - 20 Lacs

Bengaluru

Work from Office

Job location: Bangalore. Job title: Module Lead - Snowflake. Experience: 6-8 years. Job description (Sr. Snowflake Developer): Design and develop our Snowflake data platform, including data pipeline building, data transformation, and access management. Minimum 4+ years of experience in Snowflake and strong SQL skills. Develop data warehouse and data mart solutions for business teams. Accountable for designing robust, scalable database and data extraction, transformation, and loading (ETL) solutions. Understand and evaluate business requirements that impact the Caterpillar enterprise. Liaise with data creators to support project planning, training, guidance on standards, and the efficient creation/maintenance of high-quality data. Contribute to policies, procedures, and standards as well as technical requirements. Ensure compliance with the latest data standards supported by the company and with brand, legal, and information security (data security and privacy) requirements. Document data models for domains to be deployed, including a logical data model, candidate source lists, and canonical formats. Create, update, and enhance metadata policies, processes, and catalogs. Good communication skills and experience interacting with client SMEs. Should be capable of leading a team of 4-5 members. Snowflake certification is mandatory.

Posted 3 weeks ago

Apply

7.0 - 12.0 years

14 - 18 Lacs

Bengaluru

Work from Office

As a Fortune 50 company with more than 400,000 team members worldwide, Target is an iconic brand and one of America's leading retailers. Joining Target means promoting a culture of mutual care and respect and striving to make the most meaningful and positive impact. Becoming a Target team member means joining a community that values different voices and lifts each other up. Here, we believe your unique perspective is important, and you'll build relationships by being authentic and respectful. Overview about TII: At Target, we have a timeless purpose and a proven strategy, and that hasn't happened by accident. Some of the best minds from different backgrounds come together at Target to redefine retail in an inclusive learning environment that values people and delivers world-class outcomes. That winning formula is especially apparent in Bengaluru, where Target in India operates as a fully integrated part of Target's global team and has more than 4,000 team members supporting the company's global strategy and operations. Team overview: Every time a guest enters a Target store or browses Target.com or the app, they experience the impact of Target's investments in technology and innovation. We're the technologists behind one of the most loved retail brands, delivering joy to millions of our guests, team members, and communities. Join our global in-house technology team of more than 5,000 engineers, data scientists, architects, and product managers striving to make Target the most convenient, safe, and joyful place to shop. We use agile practices and leverage open-source software to adapt and build best-in-class technology for our team members and guests, and we do so with a focus on diversity and inclusion, experimentation, and continuous learning. Position overview: As a Lead Data Engineer, you will serve as the technical anchor for the engineering team, responsible for designing and developing scalable, high-performance data solutions. You will own and drive data architecture that supports both functional and non-functional business needs, ensuring reliability, efficiency, and scalability. Your expertise in big data technologies, distributed systems, and cloud platforms will help shape the engineering roadmap and best practices for data processing, analytics, and real-time data serving. You will play a key role in architecting and optimizing data pipelines using Hadoop, Spark, Scala/Java, and cloud technologies to support enterprise-wide data initiatives. Additionally, experience with API development for serving low-latency data and with Customer Data Platforms (CDP) will be a strong plus. Key responsibilities: Architect and build scalable, high-performance data pipelines and distributed data processing solutions using Hadoop, Spark, Scala/Java, and cloud platforms (AWS/GCP/Azure). Design and implement real-time and batch data processing solutions, ensuring data is efficiently processed and made available for analytical and operational use. Develop APIs and data services to expose low-latency, high-throughput data for downstream applications, enabling real-time decision-making. Optimize and enhance data models, workflows, and processing frameworks to improve performance, scalability, and cost-efficiency. Drive data governance, security, and compliance best practices. Collaborate with data scientists, product teams, and business stakeholders to understand requirements and deliver data-driven solutions. Lead the design, implementation, and lifecycle management of data services and solutions.
Stay up to date with emerging technologies and drive adoption of best practices in big data engineering, cloud computing, and API development. Provide technical leadership and mentorship to engineering teams, promoting best practices in data engineering and API design. About you: 7+ years of experience in data engineering, software development, or distributed systems. Expertise in big data technologies such as Hadoop, Spark, and distributed processing frameworks. Strong programming skills in Scala and/or Java (Python is a plus). Experience with cloud platforms (AWS, GCP, or Azure) and their data ecosystems (e.g., S3, BigQuery, Databricks, EMR, Snowflake, etc.). Proficiency in API development using REST, GraphQL, or gRPC to serve real-time and batch data. Experience with real-time and streaming data architectures (Kafka, Flink, Kinesis, etc.). Strong knowledge of data modeling, ETL pipeline design, and performance optimization. Understanding of data governance, security, and compliance in large-scale data environments. Experience with Customer Data Platforms (CDP) or customer-centric data processing is a strong plus. Strong problem-solving skills and ability to work in complex, unstructured environments. Excellent communication and collaboration skills, with experience working in cross-functional teams. Why join us: Work with cutting-edge big data, API, and cloud technologies in a fast-paced, collaborative environment. Influence and shape the future of data architecture and real-time data services at Target. Solve high-impact business problems using scalable, low-latency data solutions. Be part of a culture that values innovation, learning, and growth. Life at Target: https://india.target.com/ Benefits: https://india.target.com/life-at-target/workplace/benefits Culture: https://india.target.com/life-at-target/belonging
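
For context on the streaming pipelines this role describes, here is a minimal, hypothetical PySpark sketch (not Target's code) of the pattern: consuming events from a Kafka topic and landing them for batch analytics. The broker, topic, and paths are made up; running it requires the spark-sql-kafka package on the Spark classpath.

```python
# Hedged sketch of a Kafka-to-storage streaming pipeline with PySpark.
# All names (topic, broker, paths) are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("guest-events-stream").getOrCreate()

# Read a Kafka topic as a streaming DataFrame.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")  # hypothetical broker
    .option("subscribe", "guest-events")                # hypothetical topic
    .load()
    .select(col("key").cast("string"), col("value").cast("string"))
)

# Write the stream to Parquet with checkpointing so the job can recover.
query = (
    events.writeStream.format("parquet")
    .option("path", "/data/guest_events")               # hypothetical path
    .option("checkpointLocation", "/chk/guest_events")
    .start()
)
query.awaitTermination()
```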

Posted 3 weeks ago

Apply

14.0 - 20.0 years

30 - 45 Lacs

Bengaluru

Hybrid

Hiring a Middle and Back Office Data Analyst (Senior Manager) with the following skills and experience: 14+ years in data analysis / business analysis; excellent knowledge of the data life cycle that drives Middle and Back Office capabilities; Snowflake (must); Asset Management industry (must). Technical skills (must have): 14+ years, with a minimum of 5 years as a senior business/technical/data analyst adhering to Agile methodology, delivering data solutions using industry-leading data platforms such as Snowflake, State Street Alpha Data, Refinitiv Eikon, SimCorp Dimension, BlackRock Aladdin, FactSet, etc. Excellent knowledge of the data life cycle that drives Middle and Back Office capabilities such as trade execution, matching, confirmation, trade settlement, record keeping, accounting, fund & cash positions, custody, collateral/margin movements, corporate actions, and derivations and calculations such as holiday handling, portfolio turnover rates, and funds-of-funds look-through. Excellent hands-on SQL, advanced Excel, Python, ML (optional).

Posted 3 weeks ago

Apply

4.0 - 9.0 years

8 - 12 Lacs

Hyderabad, Pune

Work from Office

Sr. MuleSoft Developer: Design and implement MuleSoft solutions using Anypoint Studio, Mule ESB, and other related technologies. Collaborate with cross-functional teams to gather requirements and deliver high-quality ETL processes. Develop and maintain APIs using RAML and other industry standards. Strong understanding of RAML (RESTful API Modeling Language) and its usage in API design. Develop complex integrations between various systems, including cloud-based applications such as Snowflake. Ensure seamless data flow by troubleshooting issues and optimizing existing integrations. Provide technical guidance on best practices for data warehousing, ETL development, and the PL/SQL programming language. Strong understanding of SQL concepts, including database schema design, query optimization, and performance tuning. Proficiency in developing complex ETL processes using various technologies such as cloud platforms and data warehousing tools (Snowflake). Experience working with multiple databases and ability to write efficient PL/SQL code snippets.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Kolkata, Hyderabad, Pune

Work from Office

IICS Developer - Job overview: We are seeking an experienced IICS (Informatica Intelligent Cloud Services) Developer with hands-on experience in the IICS platform. The ideal candidate must have strong knowledge of Snowflake and be proficient in building and managing integrations between different systems and databases. The role will involve working with cloud-based integration solutions, ensuring data flows seamlessly across platforms, and optimizing performance for large-scale data processes. Key responsibilities: Design, develop, and implement data integration solutions using IICS (Informatica Intelligent Cloud Services). Work with Snowflake data warehouse solutions, including data loading, transformation, and querying. Build, monitor, and maintain efficient data pipelines between cloud-based systems and Snowflake. Troubleshoot and resolve integration issues within the IICS platform and Snowflake. Ensure optimal data processing performance and manage data flow between various cloud applications and databases. Collaborate with data architects, analysts, and stakeholders to gather requirements and design integration solutions. Implement best practices for data governance, security, and data quality within the integration solutions. Perform unit testing and debugging of IICS data integration tasks. Optimize integration workflows to ensure they meet performance and scalability needs. Key skills: Hands-on experience with IICS (Informatica Intelligent Cloud Services). Strong knowledge of and experience working with Snowflake as a cloud data warehouse. Proficient in building ETL/ELT workflows, including integration of various data sources into Snowflake. Experience with SQL and writing complex queries for data transformation and manipulation. Familiarity with data integration techniques and best practices for cloud-based platforms. Experience with cloud integration platforms and working with RESTful APIs and other integration protocols. Ability to troubleshoot, optimize, and maintain data pipelines effectively. Knowledge of data governance, security principles, and data quality standards. Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience). Minimum of 5 years of experience in data integration development. Proficiency in Snowflake and cloud-based data solutions. Strong understanding of ETL/ELT processes and integration design principles. Experience working in Agile or similar development methodologies. Location: Pune, Hyderabad, Kolkata, Jaipur, Chandigarh.
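
As context for the Snowflake loading work described, here is a hedged sketch of the bulk-load step such pipelines typically end with, using snowflake-connector-python. IICS itself is configured in Informatica's UI, so this only illustrates the Snowflake side; the stage, file, and table names are hypothetical, and the stage is assumed to already exist.

```python
# Hedged sketch: stage a local file and bulk-load it into a Snowflake table.
# All identifiers are hypothetical; assumes @orders_stage already exists.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="my_password",
    warehouse="LOAD_WH", database="SALES_DB", schema="STAGING",
)
cur = conn.cursor()
try:
    # Upload a local CSV to an internal named stage.
    cur.execute("PUT file:///tmp/orders.csv @orders_stage")
    # Bulk-load the staged file; fail fast rather than skip bad rows.
    cur.execute("""
        COPY INTO orders_raw
        FROM @orders_stage/orders.csv
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        ON_ERROR = 'ABORT_STATEMENT'
    """)
    print(cur.fetchall())  # COPY INTO returns per-file load results
finally:
    cur.close()
    conn.close()
```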

Posted 3 weeks ago

Apply

3.0 - 7.0 years

5 - 9 Lacs

Kolkata, Hyderabad, Pune

Work from Office

ETL QA Tester - Job summary: We are looking for an experienced ETL tester to ensure the quality and integrity of our data processing and reporting systems. The ideal candidate will have a strong background in ETL processes and data warehousing, with experience in Snowflake and Tableau. This role involves designing and executing test plans, identifying and resolving data quality issues, and collaborating with development teams to enhance data processing systems. Key responsibilities: Design, develop, and execute comprehensive test plans and test cases for ETL processes. Validate data transformation, extraction, and loading processes to ensure accuracy and integrity. Perform data validation and data quality checks using Snowflake and Tableau. Identify, document, and track defects and data quality issues. Collaborate with developers, business analysts, and stakeholders to understand requirements and provide feedback on data-related issues. Create and maintain test data, test scripts, and test environments. Generate and analyze reports using Tableau to validate data accuracy and completeness. Conduct performance testing and optimization of ETL processes. Develop and maintain automated testing scripts and frameworks for ETL testing. Ensure compliance with data governance and security standards. Location: Pune, Hyderabad, Kolkata, Chandigarh.
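
For context on the validation work described, here is a small, hypothetical sketch of an automated reconciliation check between a source and target system using generic DB-API connections. The table, columns, and checks are illustrative only.

```python
# Hedged sketch of a basic ETL reconciliation an ETL QA tester might
# automate: compare row counts, a column checksum, and a null-key count
# between source and target. All table/column names are hypothetical.
def fetch_one(conn, sql):
    cur = conn.cursor()
    try:
        cur.execute(sql)
        return cur.fetchone()[0]
    finally:
        cur.close()

def reconcile(src_conn, tgt_conn):
    checks = {
        "row_count": "SELECT COUNT(*) FROM orders",
        "amount_sum": "SELECT COALESCE(SUM(amount), 0) FROM orders",
        "null_keys": "SELECT COUNT(*) FROM orders WHERE order_id IS NULL",
    }
    failures = []
    for name, sql in checks.items():
        src_val = fetch_one(src_conn, sql)
        tgt_val = fetch_one(tgt_conn, sql)
        if src_val != tgt_val:
            failures.append(f"{name}: source={src_val} target={tgt_val}")
    return failures  # empty list means the load reconciled cleanly
```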

Posted 3 weeks ago

Apply

6.0 - 10.0 years

15 - 25 Lacs

Bengaluru

Work from Office

Greetings of the day! We have a job opening for a SnapLogic Lead with one of our clients. If you are interested, please share your updated resume. Job title: Tech Lead - SnapLogic. Job description: Experience: 6 to 8 years. Configuring and deploying SnapLogic pipelines to integrate data from various sources. Troubleshooting and resolving issues related to ETL processes. Developing and maintaining user documentation for ETL processes. The resource will work from the CAT office 3 days a week. Overall 6-8 years of experience, including proven 3+ years of experience in building and designing data warehouse solutions and working with large data sets. 3-4 years of development experience building SnapLogic pipelines, error handling, scheduling tasks, and alerts. Analyze and translate functional specifications / user stories into technical specifications. Performs a senior developer role in end-to-end SnapLogic implementations. Strong database knowledge, i.e., RDBMS (Oracle/PL/SQL), Snowflake. Proven experience with cloud data storage and access using Snowflake / S3. Experienced in business interfacing, with a strong data background and a good understanding of requirements analysis and design. Data movement and ETL experience. Experience with AWS/Azure cloud environment development and deployment. Knowledge of APIs and any scripting language is a plus. Note: The resource should be able to provide technical guidance and mentorship to development teams and team leads, review and optimize existing pipelines for performance and efficiency, and collaborate with stakeholders to understand business requirements and turn them into technical solutions.

Posted 3 weeks ago

Apply

5.0 - 7.0 years

7 - 11 Lacs

Bengaluru

Work from Office

Design, develop, and maintain system integration solutions that connect various applications and platforms using APIs, middleware, or other integration tools. Collaborate with business analysts, architects, and development teams to gather requirements and implement robust integration workflows. Monitor and troubleshoot integration processes, ensuring data consistency, accuracy, and performance. Create technical documentation, perform testing, and resolve any integration-related issues. Ensure compliance with security and data governance standards while optimizing system connectivity and scalability. Stay updated with integration trends and tools to enhance system interoperability.

Posted 3 weeks ago

Apply

8.0 - 13.0 years

15 - 20 Lacs

Chennai

Work from Office

Role & responsibilities: Title: Shift Lead Engineer - L3. Experience: 8+ years. Skills: Oracle PL/SQL, RevPro experience, RightRev experience. Location: Chennai. Job type: full time. Notice: immediate. Hybrid: yes. CTC: 18 - 24 Lacs. Shift: morning & evening shifts. About the role: We are seeking a highly skilled and technically proficient L3 Application Lead Engineer with expertise in Oracle PL/SQL to join our dynamic team. In this role, you will be responsible for providing deep technical analysis, troubleshooting, and resolution of complex issues in business-critical software applications. As part of the 24/5 support operations, you will work closely with internal teams and external clients to diagnose root causes, implement fixes, and optimize application performance. Your expertise will play a key role in ensuring system stability, minimizing downtime, and enhancing overall efficiency. Key responsibilities: Analyze and resolve complex application issues by leveraging deep expertise in Oracle PL/SQL and Snowflake. Perform advanced debugging and root cause analysis, utilizing logs, scripts, and database queries to identify and fix issues efficiently. Optimize SQL queries, stored procedures, and database performance to ensure high availability and efficiency of enterprise applications. Proactively monitor application and database performance, identifying bottlenecks and implementing corrective measures to minimize downtime. Troubleshoot and resolve data integrity, performance, and security issues across on-premise and cloud-based environments. Work closely with development, DevOps, and infrastructure teams to escalate and resolve complex technical problems. Manage high-priority incidents and problem resolution processes within defined SLAs, ensuring minimal business impact. Automate repetitive operational tasks using scripts and database procedures to improve efficiency and reliability. Develop and maintain in-depth technical documentation, including troubleshooting steps, system workflows, and best practices. Lead application upgrades, patches, and configuration changes, ensuring compatibility and system stability. Assist in data validation, analytics, and reporting, supporting revenue optimization and business intelligence initiatives. Collaborate with software vendors and third-party service providers to troubleshoot and resolve application-related issues. Stay up to date with industry trends, best practices, and emerging technologies related to application support, cloud databases, and performance tuning. Required qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. 8+ years of experience in application support, with at least 2-4 years specializing in revenue management systems. Must be willing to work in shifts, including night shifts, weekends, and public holidays, as per business needs. Extensive expertise in Oracle PL/SQL development with at least 4-6 years of hands-on coding experience, including writing complex queries, stored procedures, functions, triggers, materialized views, and performance tuning. Deep understanding of advanced PL/SQL concepts, such as bulk collections, pipelined functions, dynamic SQL, query optimization techniques (e.g., indexing, partitioning, execution plans, hints), and handling large datasets efficiently. Strong experience in database performance tuning and troubleshooting, including identifying slow-running queries, optimizing execution plans, and resolving deadlocks, locks, and contention issues.
Proficiency in Snowflake data warehouse solutions, including data modeling, schema design, query optimization, and stored procedures using Snowflake SQL. Expertise in handling large-scale ETL processes, working with SQL*Loader, external tables, bulk data processing, and integrating data between Oracle and Snowflake environments. Strong troubleshooting skills with expertise in analyzing application logs, system logs, and debugging SQL issues in a production environment. Understanding of ITIL principles and best practices. Excellent problem-solving, communication, and interpersonal skills. Ability to work in a fast-paced environment and manage multiple priorities. Strong analytical skills with attention to detail. Preferred qualifications: Experience with cloud platforms (AWS, Azure, Google Cloud) and DevOps practices. Familiarity with CI/CD pipelines, APIs, and microservices architecture. Knowledge of containerization technologies such as Docker and Kubernetes. Anyone interested, please DM for more information and share your updated CV to: anusha.r@rrootshell.com Thanks & Regards, Anusha R. Ph: 7989093547. Mail: anusha.r@rrootshell.com

Posted 3 weeks ago

Apply

13.0 - 18.0 years

14 - 18 Lacs

Gurugram

Work from Office

Who are we? In one sentence: We are seeking a Java Full Stack Architect & People Manager with strong technical depth and leadership capabilities to lead our Java modernization projects. The ideal candidate will possess a robust understanding of Java full stack, databases, and cloud-based solution delivery, combined with proven experience in managing high-performing technical teams. This role requires a visionary who can translate business challenges into scalable distributed solutions while nurturing talent and fostering innovation. What will your job look like? Lead the design and implementation of Java full-stack solutions covering frontend, backend, batch processes, and interface integrations across business use cases. Translate business requirements into technical architectures using Azure/AWS cloud platforms. Manage and mentor a multidisciplinary team of engineers, leads, and specialists. Drive adoption of Databricks and Python in addition to Java-based frameworks within solution development. Collaborate closely with product owners, data engineering teams, and customer IT & business stakeholders. Ensure high standards in code quality, system performance, and model governance. Track industry trends and continuously improve the technology stack, adopting newer trends and showcasing productization, automation, and innovative ideas. Oversee the end-to-end lifecycle: use case identification, PoC, MVP, production deployment, and support. Define and monitor KPIs to measure team performance and project impact. All you need is... 13+ years of overall IT experience with a strong background in the Telecom domain (preferred). Proven hands-on experience with Java full-stack technologies and cloud DBs. Strong understanding of design principles and patterns for distributed applications, on-prem as well as on-cloud. Demonstrated experience in building and deploying on Azure or AWS via CI/CD practices. Strong expertise in Java, databases, Python, Kafka, and Linux scripting. In-depth understanding of cloud-native architecture, microservices, and data pipelines. Solid people management experience: team building, mentoring, performance reviews. Strong analytical thinking and communication skills. Ability to be hands-on with coding and reviews during development and production support. Good-to-have skills: Familiarity with Databricks and PySpark. Familiarity with Snowflake. Why you will love this job: You will be challenged with leading and mentoring a few development teams & projects. You will join a strong team with lots of activities, technologies, business challenges, and a progression path. You will have the opportunity to work with the industry's most advanced technologies.

Posted 3 weeks ago

Apply

7.0 - 12.0 years

35 - 45 Lacs

Noida, Thiruvananthapuram

Work from Office

As a Data Engineer, you will work with the development team to construct a data streaming platform and data warehouse that serve as the data foundations for our product. Help us scale our business to meet the needs of our growing customer base and develop new products on our platform. You'll be a critical part of our growing company, working on a cross-functional team to implement best practices in technology, architecture, and process. You'll have the chance to work in an open and collaborative environment, receive hands-on mentorship, and have ample opportunities to grow and accelerate your career! Responsibilities: Build our next-generation data warehouse. Build our event stream platform. Translate user requirements for reporting and analysis into actionable deliverables. Enhance automation, operation, and expansion of real-time and batch data environments. Manage numerous projects in an ever-changing work environment. Extract, transform, and load complex data into the data warehouse using cutting-edge technologies. Build processes for top-notch security, performance, reliability, and accuracy. Provide mentorship and collaborate with fellow team members. Qualifications: Bachelor's or Master's degree in Computer Science, Information Systems, Operations Research, or a related field is required. 7+ years of experience building data pipelines. 7+ years of experience building data frameworks for unit testing, data lineage tracking, and automation. Fluency in Scala is required. Working knowledge of Apache Spark. Familiarity with streaming technologies (e.g., Kafka, Kinesis, Flink). Nice to have: Experience with machine learning. Familiarity with Looker is a plus. Knowledge of additional server-side programming languages (e.g., Golang, C#, Ruby).

Posted 3 weeks ago

Apply

12.0 - 18.0 years

1 - 3 Lacs

Bengaluru

Hybrid

Job description: 12+ years of experience as a Technical Architect or similar role with a focus on Azure Databricks, Power BI, and ETL. Expertise in designing and implementing data architectures using Azure Databricks (ADB). Strong proficiency in Power BI for building scalable reports and dashboards. In-depth knowledge of ETL tools and processes, particularly Azure Data Factory and other Azure-based ETL solutions. Proficiency in SQL and familiarity with data warehousing concepts (e.g., star schema, snowflake schema). Strong understanding of cloud computing and Azure services, including storage, compute, and security best practices. Experience with data lake architecture, data pipelines, and data governance. Ability to understand complex business requirements and translate them into technical solutions. Strong communication skills with the ability to collaborate across business and technical teams. Leadership and mentoring experience, guiding junior team members to achieve project goals. Preferred qualifications: Certification in Azure (e.g., Azure Solutions Architect, Azure Data Engineer). Experience with other BI tools or visualization platforms (e.g., Power BI, PowerApps). Knowledge of programming/scripting languages such as Python, Scala, or DAX. Familiarity with DevOps practices in data pipelines and CI/CD workflows. Experience with Agile methodologies and project management tools like JIRA or Azure DevOps.

Posted 3 weeks ago

Apply

3.0 - 6.0 years

19 - 34 Lacs

Noida, Gurugram

Hybrid

We are seeking a data science or machine learning professional with strong expertise in both regression and classification algorithms. The ideal candidate should have hands-on experience with a wide range of techniques including, but not limited to: Supervised learning algorithms: Gradient Descent, Linear Regression, Random Forest, Support Vector Machines (SVM), K-Nearest Neighbors (KNN), and Neural Networks. Unsupervised learning techniques: DBSCAN, Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), and Autoencoders. Loss functions and optimization: Proficient in implementing and optimizing loss functions such as Mean Squared Error (MSE), Root Mean Squared Error (RMSE), and Hinge Loss for model performance improvement.
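
As background for the loss functions named in this posting, here is a small reference sketch in NumPy (not part of the job description) implementing MSE, RMSE, and hinge loss, with a tiny worked example.

```python
# Reference implementations of the loss functions named above, using NumPy.
import numpy as np

def mse(y_true, y_pred):
    """Mean squared error for regression."""
    return np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2)

def rmse(y_true, y_pred):
    """Root mean squared error: MSE reported on the target's original scale."""
    return np.sqrt(mse(y_true, y_pred))

def hinge_loss(y_true, scores):
    """Mean hinge loss for binary classification.

    y_true holds labels in {-1, +1}; scores are raw margins (e.g., w.x + b),
    as used by SVM-style classifiers.
    """
    margins = 1 - np.asarray(y_true) * np.asarray(scores)
    return np.mean(np.maximum(0, margins))

# Tiny worked example:
print(mse([3.0, 5.0], [2.5, 5.5]))       # ((0.5)^2 + (-0.5)^2) / 2 = 0.25
print(rmse([3.0, 5.0], [2.5, 5.5]))      # sqrt(0.25) = 0.5
print(hinge_loss([1, -1], [0.8, -2.0]))  # mean(max(0, 0.2), max(0, -1)) = 0.1
```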

Posted 3 weeks ago

Apply

1.0 - 3.0 years

15 - 19 Lacs

Bengaluru

Work from Office

Job title: Industry & Function AI Data Engineer Analyst - S&C Global Network. Location: primary - Bengaluru, secondary - Gurugram. Management level: 11 - Analyst. Must-have skills: Proficiency in Python, SQL, and data engineering tools; experience with cloud platforms like AWS, Azure, or GCP; knowledge of data warehousing concepts and ETL processes; strong problem-solving and analytical skills. Good-to-have skills: Familiarity with big data technologies (e.g., Hadoop, Spark); experience with data pipeline tools (e.g., Apache Airflow); understanding of MLOps practices; strong communication and collaboration skills; Snowflake, dbt, etc. Job summary: As a Data Engineer Analyst, you will play a critical role in designing, implementing, and optimizing data infrastructure to power analytics, machine learning, and enterprise decision-making. You will collaborate closely with stakeholders, translate business needs into data-driven solutions, and ensure seamless integration into enterprise ecosystems. Roles & responsibilities: Build and maintain scalable data pipelines and systems for data ingestion, transformation, and storage. Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver actionable solutions. Ensure data accuracy, consistency, and reliability through robust validation and governance practices. Stay updated with the latest tools and technologies in data engineering through professional development opportunities. Professional & technical skills: Excellent problem-solving abilities. Strong analytical thinking and attention to detail. Strong knowledge of traditional statistical methods and basic machine learning techniques. Ability to work independently and as part of a team. Effective time management and organizational skills. Advanced knowledge of data engineering techniques and tools. Experience with data wrangling and preprocessing. Familiarity with version control systems like Git. Additional information: An ideal candidate for the Data Engineer Analyst role at Accenture should have a strong technical background in data engineering, with proficiency in Python, SQL, and data engineering tools. They should be experienced with cloud platforms like AWS, Azure, or GCP, and have a solid understanding of data warehousing concepts and ETL processes. Strong problem-solving and analytical skills are essential for this role. About our company: Accenture. Qualification - Experience: Minimum 1-3 years in data engineering or related roles. Experience in Consumer Goods and Services (CG&S) is a plus. Educational qualification: Bachelor's or master's degree in a quantitative field such as Computer Science, Information Systems, or Engineering. MBA is a plus.

Posted 3 weeks ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Hyderabad

Work from Office

Project role: Application Lead. Project role description: Lead the effort to design, build, and configure applications, acting as the primary point of contact. Must-have skills: Tableau. Good-to-have skills: NA. Minimum 5 year(s) of experience is required. Educational qualification: 15 years of full-time education. Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project goals are met, facilitating discussions to address challenges, and guiding your team through the development process. You will also engage in strategic planning to align application functionalities with business objectives, ensuring that the solutions provided are effective and efficient. Your role will require you to be proactive in identifying areas for improvement and implementing best practices to enhance application performance and user experience. Roles & responsibilities: Expected to be an SME. Collaborate with and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute on key decisions. Provide solutions to problems for the immediate team and across multiple teams. Facilitate training and development opportunities for team members to enhance their skills. Monitor project progress and ensure timely delivery of application features. Design, develop, and maintain advanced Tableau dashboards and reports based on business requirements. Collaborate with business analysts, data engineers, and stakeholders to gather requirements and define key performance indicators (KPIs). Ensure data accuracy and consistency by implementing robust data validation and QA processes. Optimize dashboard performance and troubleshoot data or performance issues. Integrate Tableau with different data sources such as SQL Server, Excel, Snowflake, or cloud-based databases. Develop and document technical design specifications and business rules. Support ad-hoc reporting and visualization requests from various business units. Provide training and guidance to junior Tableau developers and end users. Stay up to date with new Tableau features and best practices to continually improve the BI environment.
Professional & technical skills: The candidate should have a minimum of 5 years of experience in Tableau. Strong analytical skills to interpret data and provide actionable insights. Experience with data visualization best practices to create effective dashboards. Familiarity with SQL for data extraction and manipulation. Ability to work collaboratively in a team environment and communicate effectively. Strong analytical and problem-solving skills. Excellent communication and interpersonal abilities. Ability to work independently and in a collaborative, fast-paced team environment. Detail-oriented with a commitment to delivering high-quality, accurate, and user-friendly dashboards. Strong organizational skills and ability to manage multiple projects simultaneously. Proactive in identifying opportunities for automation and process improvement. 6+ years of professional experience with Tableau Desktop and Tableau Server. Proficient in creating interactive dashboards, visualizations, and complex calculations using LOD expressions, parameters, sets, and filters. Strong SQL skills with experience querying and managing large datasets. Experience with data blending, data modeling, and creating extracts from multiple data sources. Knowledge of Tableau Prep or other ETL tools is a plus. Experience integrating Tableau with cloud platforms like AWS, Azure, or Google Cloud is preferred. Familiarity with scripting languages (Python, R) for advanced analytics is an advantage. Exposure to Agile/Scrum development methodologies. Additional information: This position is based at our Hyderabad office. 15 years of full-time education is required.

Posted 3 weeks ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Gurugram

Work from Office

Project role: Application Developer. Project role description: Design, build, and configure applications to meet business process and application requirements. Must-have skills: Snowflake Data Warehouse. Good-to-have skills: NA. Minimum 3 year(s) of experience is required. Educational qualification: 15 years of full-time education. Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will play a crucial role in developing innovative solutions to enhance business operations and user experience. Roles & responsibilities: Expected to perform independently and become an SME. Active participation and contribution in team discussions is required. Contribute to providing solutions to work-related problems. Develop and implement software solutions to meet business requirements. Collaborate with cross-functional teams to design and deliver high-quality applications. Troubleshoot and debug applications to ensure optimal performance. Stay updated with industry trends and technologies to enhance application development processes. Provide technical guidance and support to junior team members. Professional & technical skills: Must-have skills: Proficiency in Snowflake Data Warehouse. Strong understanding of database concepts and SQL. Experience in ETL processes and data modeling. Knowledge of cloud platforms like AWS or Azure. Hands-on experience in developing scalable and efficient applications. Additional information: The candidate should have a minimum of 3 years of experience in Snowflake Data Warehouse. This position is based at our Gurugram office. 15 years of full-time education is required.

Posted 3 weeks ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project role: Application Developer. Project role description: Design, build, and configure applications to meet business process and application requirements. Must-have skills: Data Analysis & Interpretation. Good-to-have skills: Data Engineering. Minimum 5 year(s) of experience is required. Educational qualification: Minimum 15 years of full-time education. Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are optimized for performance and usability. You will also engage in problem-solving activities, providing insights and recommendations to enhance application functionality and user experience. Roles & responsibilities: Expected to be an SME. Collaborate with and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute on key decisions. Provide solutions to problems for the immediate team and across multiple teams. Facilitate knowledge-sharing sessions to enhance team capabilities. Monitor project progress and ensure alignment with business goals. Professional & technical skills: Must-have skills: Advanced proficiency in Snowflake Data Cloud technology, DBT, and cloud data warehousing. Good-to-have skills: Experience with Talend. Strong analytical skills to interpret complex data sets. Additional information: The candidate should have a minimum of 5 years of experience in Snowflake Data Cloud technology. A minimum of 15 years of full-time education is required.

Posted 3 weeks ago

Apply

15.0 - 20.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Project role: Application Support Engineer. Project role description: Act as software detectives, providing a dynamic service identifying and solving issues within multiple components of critical business systems. Must-have skills: Cloud Data Architecture. Good-to-have skills: NA. Minimum 7.5 year(s) of experience is required. Educational qualification: 15 years of full-time education. Data Architect: Kemper is seeking a Data Architect to join our team. You will work as part of a distributed team and with Infrastructure, Enterprise Data Services, and Application Development teams to coordinate the creation, enrichment, and movement of data throughout the enterprise. Your central responsibility as an architect will be improving the consistency, timeliness, quality, security, and delivery of data as part of Kemper's Data Governance framework. In addition, the architect must streamline data flows and optimize cost management in a hybrid cloud environment. Your duties may include assessing architectural models and supervising data migrations across IaaS, PaaS, SaaS, and on-premises systems, as well as data platform selection and on-boarding of data management solutions that meet the technical and operational needs of the company. To succeed in this role, you should know how to examine new and legacy requirements and define cost-effective patterns to be implemented by other teams. You must then be able to represent required patterns during implementation projects. The ideal candidate will have proven experience in cloud (Snowflake, AWS, and Azure) architectural analysis and management. Responsibilities: Define architectural standards and guidelines for data products and processes. Assess and document when and how to use existing and newly architected producers and consumers, the technologies to be used for various purposes, and models of selected entities and processes. The guidelines should encourage reuse of existing data products, as well as address issues of security, timeliness, and quality. Work with Information & Insights, Data Governance, Business Data Stewards, and implementation teams to define standard and ad-hoc data products and data product sets. Work with Enterprise Architecture, Security, and implementation teams to define the transformation of data products throughout hybrid cloud environments, assuring that both functional and non-functional requirements are addressed. This includes the ownership, frequency of movement, the source and destination of each step, how the data is transformed as it moves, and any aggregation or calculations. Work with Data Governance and project teams to model and map data sources, including descriptions of the business meaning of the data, its uses, its quality, the applications that maintain it, and the technologies in which it is stored. Documentation of a data source must describe the semantics of the data so that the occasional subtle differences in meaning are understood. Define integrative views of data to draw together data from across the enterprise. Some views will use data stores of extracted data and others will bring together data in near real time. Solutions must consider data currency, availability, response times, data volumes, etc. Work with modeling and storage teams to define conceptual, logical, and physical data views, limiting technical debt as data flows through transformations. Investigate and lead participation in POCs of emerging technologies and practices. Leverage and evolve existing [core] data products and patterns.
Communicate and lead understanding of data architectural services across the enterprise. Ensure a focus on data quality by working effectively with data and system stewards. Qualifications: Bachelor's degree in Computer Science, Computer Engineering, or equivalent experience. A minimum of 3 years' experience in a similar role. Demonstrable knowledge of Secure DevOps and SDLC processes. Must have AWS or Azure experience. Experience with Data Vault 2 is required; Snowflake is a plus. Familiarity with system concepts and tools within an enterprise architecture framework, including cataloging, MDM, RDM, data lakes, storage patterns, etc. Excellent organizational and analytical abilities. Outstanding problem solver. Good written and verbal communication skills.

Posted 3 weeks ago

Apply

3.0 - 8.0 years

5 - 8 Lacs

Kolkata

Work from Office

We are looking for a Senior Python Developer with a passion for AI research and API development to join our growing team. In this role, you will be responsible for building scalable, high-performance APIs and contributing to AI/ML research and implementation. You will work closely with data scientists, researchers, and product teams to design and deploy intelligent systems that power our next-generation applications. Key responsibilities: Design, develop, and maintain Python-based APIs for AI/ML models and services. Collaborate with AI researchers to implement and optimize machine learning models. Conduct research into new AI/ML techniques and evaluate their applicability to business problems. Build RESTful and GraphQL APIs using frameworks like FastAPI, Flask, or Django REST Framework. Write clean, testable, and maintainable Python code with a focus on performance and scalability. Participate in code reviews, mentor junior developers, and contribute to best practices. Integrate AI models with backend systems and frontend applications. Stay up to date with AI/ML trends, Python libraries (e.g., PyTorch, TensorFlow, Scikit-learn), and API design patterns. Work in an agile environment, delivering high-quality software in iterative sprints. Qualifications: Bachelor's or Master's degree in Computer Science, Data Science, or a related field. 4+ years of professional experience in software development, with 3+ years in Python. Strong experience with Python web frameworks (e.g., FastAPI, Flask, Django). What we're looking for in a candidate: A curious mind with a passion for AI and software development. A team player who can mentor and guide others. A self-starter who can take initiative and deliver results. A lifelong learner who stays current with emerging technologies and trends. Why join us? Work on cutting-edge AI projects with real-world impact. Collaborate with top-tier researchers and engineers. Flexible work environment and remote-friendly options. Competitive salary and performance-based incentives. Opportunities for professional growth and leadership. A culture that values innovation, collaboration, and continuous learning.
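
As background for the API-building duties above, here is a minimal, hypothetical FastAPI sketch (not the employer's codebase) of serving a model behind a typed, validated endpoint; the scoring logic is a stand-in for a real loaded model.

```python
# Hedged sketch: expose an ML model behind a typed, validated API with
# FastAPI and Pydantic. The scoring logic is a stand-in; a real service
# would call a model loaded at startup (e.g., PyTorch or scikit-learn).
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="model-service")

class PredictRequest(BaseModel):
    features: list[float]

class PredictResponse(BaseModel):
    score: float

@app.post("/predict", response_model=PredictResponse)
def predict(req: PredictRequest) -> PredictResponse:
    # Stand-in scoring: the mean of the features. A real implementation
    # would run model.predict(...) here.
    score = sum(req.features) / max(len(req.features), 1)
    return PredictResponse(score=score)

# Run with: uvicorn app:app --reload   (assuming this file is app.py)
```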

Posted 3 weeks ago

Apply

9.0 - 14.0 years

0 - 0 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Role & responsibilities: Design and develop conceptual, logical, and physical data models for enterprise and application-level databases. Translate business requirements into well-structured data models that support analytics, reporting, and operational systems. Define and maintain data standards, naming conventions, and metadata for consistency across systems. Collaborate with data architects, engineers, and analysts to implement models into databases and data warehouses/lakes. Analyze existing data systems and provide recommendations for optimization, refactoring, and improvements. Create entity relationship diagrams (ERDs) and data flow diagrams to document data structures and relationships. Support data governance initiatives including data lineage, quality, and cataloging. Review and validate data models with business and technical stakeholders. Provide guidance on normalization, denormalization, and performance tuning of database designs. Ensure models comply with organizational data policies, security, and regulatory requirements. In short: we are looking for a Data Modeler Architect to design conceptual, logical, and physical data models and translate business needs into scalable models for analytics and operational systems; strong in normalization, denormalization, ERDs, and data governance practices, with experience in star/snowflake schemas and medallion architecture preferred, working in close collaboration with architects, engineers, and analysts. Keywords: data modeling, normalization, denormalization, star and snowflake schemas, medallion architecture, ERD, logical data model, physical data model, conceptual data model.
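
As background for the star/snowflake schema terminology above, here is a hedged sketch of a minimal star schema, expressed as ANSI-style DDL strings applied through a generic DB-API connection. All table and column names are hypothetical.

```python
# Hedged sketch of a star schema: one fact table keyed to denormalized
# dimension tables. In a snowflake schema, the dimensions would themselves
# be normalized into sub-tables (e.g., region split out of dim_customer).
DDL = [
    # Dimension tables: denormalized descriptive attributes.
    """CREATE TABLE dim_customer (
        customer_key INTEGER PRIMARY KEY,
        customer_name VARCHAR(200),
        region VARCHAR(50)
    )""",
    """CREATE TABLE dim_date (
        date_key INTEGER PRIMARY KEY,
        full_date DATE,
        fiscal_quarter VARCHAR(10)
    )""",
    # Fact table: narrow foreign keys plus additive measures.
    """CREATE TABLE fact_sales (
        customer_key INTEGER REFERENCES dim_customer(customer_key),
        date_key INTEGER REFERENCES dim_date(date_key),
        quantity INTEGER,
        net_amount NUMERIC(18, 2)
    )""",
]

def create_schema(conn):
    """Apply the DDL through any DB-API 2.0 connection."""
    cur = conn.cursor()
    for stmt in DDL:
        cur.execute(stmt)
    conn.commit()
```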

Posted 3 weeks ago

Apply

7.0 - 12.0 years

20 - 25 Lacs

Ahmedabad

Hybrid

Experience: 7+ years. Shift: (GMT+05:30) Asia/Kolkata (IST). Opportunity type: Hybrid (Ahmedabad). Must-have skills: Snowflake, dbt, Airflow. Inferenz is looking for: Position: Senior Data Engineer (Snowflake + dbt + Airflow). Location: Pune, Ahmedabad. Required experience: 5+ years. Preferred: immediate joiners. Job overview: We are looking for a highly skilled Senior Data Engineer (Snowflake) to join our team. The ideal candidate will have extensive experience with Snowflake and cloud platforms, along with a strong understanding of ETL processes, data warehousing concepts, and programming languages. If you have a passion for working with large datasets, designing scalable database schemas, and solving complex data problems, we would love to hear from you. Key responsibilities: Design, implement, and optimize data pipelines and workflows using Apache Airflow. Develop incremental and full-load strategies with monitoring, retries, and logging. Build scalable data models and transformations in dbt, ensuring modularity, documentation, and test coverage. Develop and maintain data warehouses in Snowflake. Ensure data quality, integrity, and reliability through validation frameworks and automated testing. Tune performance through clustering keys, warehouse scaling, materialized views, and query optimization. Monitor job performance and resolve data pipeline issues proactively. Build and maintain data quality frameworks (null checks, type checks, threshold alerts). Partner with data analysts, scientists, and business stakeholders to translate reporting and analytics requirements into technical specifications. Qualifications: Snowflake (data modeling, performance tuning, access control, external tables, streams & tasks). Apache Airflow (DAG design, task dependencies, dynamic tasks, error handling). dbt (modular SQL development, Jinja templating, testing, documentation). Proficiency in SQL, Spark, and Python. Experience building data pipelines on cloud platforms like AWS, GCP, or Azure. Strong knowledge of data warehousing concepts and ELT best practices. Familiarity with version control systems (e.g., Git) and CI/CD practices. Familiarity with infrastructure-as-code tools like Terraform for provisioning Snowflake or Airflow environments. Excellent problem-solving skills and the ability to work independently. Ability to work collaboratively in a team environment. Skills: Snowflake, dbt, Airflow.
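
As context for the Airflow and dbt skills this posting asks for, here is a hedged sketch (not the employer's code) of a small Airflow DAG with retries and an extract-then-transform dependency, written against a recent Airflow 2.x API. The DAG id, schedule, paths, and commands are illustrative only.

```python
# Hedged sketch of an Airflow DAG: an incremental extract task followed by
# a dbt build step, with retries configured. All names are hypothetical.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator

def extract_increment(**context):
    # Placeholder for an incremental extract, e.g. pulling rows newer than
    # the last successful watermark and staging them to cloud storage.
    print("extracting increment for", context["ds"])

with DAG(
    dag_id="snowflake_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract = PythonOperator(
        task_id="extract_increment",
        python_callable=extract_increment,
    )
    # dbt builds the modular SQL models and runs their tests.
    dbt_build = BashOperator(
        task_id="dbt_build",
        bash_command="dbt build --project-dir /opt/dbt/analytics",
    )
    extract >> dbt_build  # task dependency: transform only after extract
```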

Posted 3 weeks ago

Apply