
2364 Snowflake Jobs - Page 14

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

As a member of the Security Solutions, Platform and Analytics (SPA) team at Snowflake, your primary responsibility will be to develop custom solutions that enhance the security of Snowflake's Data Cloud. Leveraging your expertise in SQL, Python, and security domain knowledge, you will analyze security logs and event data to translate security requirements into effective technical solutions. Your role will involve developing advanced analytics techniques and scalable solutions to identify patterns, anomalies, and trends in security data.

In this role at Snowflake, you will have the opportunity to:
- Develop and optimize data pipelines, data models, and visualization dashboards for security analytics
- Design and implement scalable automated solutions in collaboration with various security teams
- Take ownership of database management tasks, including data modeling and performance optimization
- Utilize tools like DBT to streamline data transformations and ensure high data quality
- Conduct research and propose innovative approaches to enhance security posture
- Translate security requirements into technical solutions that align with organizational goals

To be successful in this role, we are looking for candidates who possess:
- A Bachelor's degree in Computer Science, Information Security, or a related field
- 5-8 years of experience in Data Analytics with strong SQL and Python skills
- Experience in data visualization, DBT, and data pipeline development
- Hands-on experience with Snowflake; familiarity with Cortex functions is a plus
- Strong understanding of databases, data modeling, and data warehousing
- Security domain knowledge, including experience with SIEM systems and threat intelligence platforms
- Proven ability to analyze complex security events and effectively communicate findings

Joining our team at Snowflake offers you the opportunity to work with cutting-edge technology and contribute to the security of a rapidly growing data platform. We value innovation, continuous learning, and the chance to make a significant impact on enterprise-scale security solutions. Snowflake is committed to growth and is seeking individuals who share our values, challenge conventional thinking, and drive innovation while building a future for themselves and Snowflake. If you are interested in making an impact and contributing to our team, we encourage you to explore the job posting on the Snowflake Careers Site (careers.snowflake.com) for salary and benefits information.
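
The posting contains no code, but as an illustration of the kind of security-log analytics it describes, here is a minimal, hypothetical Python sketch: it runs a window-function query against a placeholder Snowflake table of login events and flags accounts whose daily failed-login count spikes well above their trailing average. The account, table, and column names are invented for the example and are not from the posting.

```python
# Minimal sketch (hypothetical table and connection details): flag accounts whose
# daily failed-login count exceeds 3 standard deviations of their trailing
# 30-day average.
import snowflake.connector

ANOMALY_SQL = """
WITH daily AS (
    SELECT user_name,
           DATE_TRUNC('day', event_time) AS day,
           COUNT(*)                      AS failed_logins
    FROM security.raw.login_events
    WHERE event_status = 'FAILED'
    GROUP BY 1, 2
),
scored AS (
    SELECT *,
           AVG(failed_logins)    OVER (PARTITION BY user_name ORDER BY day
                                       ROWS BETWEEN 30 PRECEDING AND 1 PRECEDING) AS avg_30d,
           STDDEV(failed_logins) OVER (PARTITION BY user_name ORDER BY day
                                       ROWS BETWEEN 30 PRECEDING AND 1 PRECEDING) AS std_30d
    FROM daily
)
SELECT user_name, day, failed_logins
FROM scored
WHERE std_30d > 0 AND failed_logins > avg_30d + 3 * std_30d
ORDER BY day DESC
"""

def find_login_anomalies(conn):
    """Run the anomaly query and return rows as (user_name, day, failed_logins)."""
    with conn.cursor() as cur:
        cur.execute(ANOMALY_SQL)
        return cur.fetchall()

if __name__ == "__main__":
    conn = snowflake.connector.connect(
        account="my_account",          # placeholder
        user="analytics_user",         # placeholder
        authenticator="externalbrowser",
    )
    for row in find_login_anomalies(conn):
        print(row)
```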

Posted 6 days ago

Apply

6.0 - 10.0 years

0 Lacs

karnataka

On-site

You should have 6-9 years of experience and possess the following skills:
- Experience with test automation tools and frameworks such as Selenium
- Expertise in programming languages like Java or Python
- Strong working knowledge of relational databases like Oracle
- Experience with SQL, including complex queries and views
- Experience with UI, microservices architecture, and RESTful APIs
- Experience with DevOps and CI/CD tools like Jenkins

Good-to-have skills:
- Experience in the Procurement domain
- Experience with low-code testing tools like Katalon, Playwright, or Power Automate Desktop
- Experience with Jira and test management solutions like Xray
- Experience with Snowflake
- Knowledge of Oracle EBS
- Experience executing projects in an Agile environment

Location: Bangalore

Posted 6 days ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

As a Data Quality Engineer, you will collaborate with product, engineering, and customer teams to gather requirements and develop a comprehensive data quality strategy. You will lead data governance processes, including data preparation, obfuscation, integration, slicing, and quality control. Testing data pipelines, ETL processes, APIs, and system performance to ensure reliability and accuracy will be a key responsibility. Additionally, you will prepare test data sets, conduct data profiling, and perform benchmarking to identify inconsistencies or inefficiencies.

Creating and implementing strategies to verify the quality of data products and ensuring alignment with business standards will be crucial. You will set up data quality environments and applications in compliance with defined standards, contributing to CI/CD process improvements. Participation in the design and maintenance of data platforms, as well as building automation frameworks for data quality testing and resolving potential issues, will be part of your role. Providing support in troubleshooting data-related issues to ensure timely resolution is also expected. It is essential to ensure that all data quality processes and tools align with organizational goals and industry best practices. Collaboration with stakeholders to enhance data platforms and optimize data quality workflows will be necessary to drive success in this role.

Requirements:
- Bachelor's degree in Computer Science or a related technical field involving coding, such as physics or mathematics
- At least three years of hands-on experience in Data Management, Data Quality verification, Data Governance, or Data Integration
- Strong understanding of data pipelines, Data Lakes, and ETL testing methodologies
- Proficiency in CI/CD principles and their application in data processing
- Comprehensive knowledge of SQL, including aggregation and window functions
- Experience in scripting with Python or similar programming languages
- Databricks and Snowflake experience is a must, with good exposure to notebooks, SQL editors, etc.
- Experience in developing test automation frameworks for data quality assurance
- Familiarity with Big Data principles and their application in modern data systems
- Experience in data analysis and requirements validation, including gathering and interpreting business needs
- Experience in maintaining QA environments to ensure smooth testing and deployment processes
- Hands-on experience in Test Planning, Test Case design, and Test Result Reporting in data projects
- Strong analytical skills, with the ability to approach problems methodically and communicate solutions effectively
- English proficiency at B2 level or higher, with excellent verbal and written communication skills

Nice to have:
- Familiarity with advanced data visualization tools to enhance reporting and insights
- Experience working with distributed data systems and frameworks like Hadoop
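
As a rough illustration of the automated data-quality checks this role calls for (not taken from the posting), the sketch below implements two generic checks, a not-null check and a uniqueness check, over any DB-API-style connection such as the Snowflake or Databricks SQL connectors. The table and column names are placeholders.

```python
# Minimal data-quality check sketch (hypothetical table names and thresholds).
from dataclasses import dataclass

@dataclass
class CheckResult:
    name: str
    passed: bool
    detail: str

def check_not_null(conn, table: str, column: str) -> CheckResult:
    """Fail if the column contains any NULL values."""
    with conn.cursor() as cur:
        cur.execute(f"SELECT COUNT(*) FROM {table} WHERE {column} IS NULL")
        nulls = cur.fetchone()[0]
    return CheckResult(f"not_null:{table}.{column}", nulls == 0, f"{nulls} null rows")

def check_unique_key(conn, table: str, key: str) -> CheckResult:
    """Fail if the key column has duplicate values."""
    with conn.cursor() as cur:
        cur.execute(
            f"SELECT COUNT(*) FROM (SELECT {key} FROM {table} "
            f"GROUP BY {key} HAVING COUNT(*) > 1) AS dup"
        )
        dupes = cur.fetchone()[0]
    return CheckResult(f"unique:{table}.{key}", dupes == 0, f"{dupes} duplicated keys")

def run_checks(conn) -> bool:
    """Run the suite, print a PASS/FAIL line per check, return overall status."""
    results = [
        check_not_null(conn, "analytics.orders", "order_id"),   # placeholder table
        check_unique_key(conn, "analytics.orders", "order_id"),
    ]
    for r in results:
        print(("PASS" if r.passed else "FAIL"), r.name, "-", r.detail)
    return all(r.passed for r in results)
```

In practice checks like these would run inside the CI/CD pipeline the posting mentions, failing the build when a check fails.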

Posted 6 days ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

You will be joining an innovative company that is revolutionizing retail checkout experiences by using cutting-edge Computer Vision technology to replace traditional barcodes. Our platform aims to create seamless, faster, and smarter checkout processes, enhancing the shopping experience for both retailers and consumers. As we grow rapidly, we are seeking an experienced Senior Data Engineer to join our team and help shape the future of retail technology.

As a Senior Data Engineer, you will be an integral part of our expanding data team. Your primary responsibilities will involve building and optimizing data infrastructure, pipelines, and tooling to support analytics, machine learning, and product development. This role requires a strong background in cloud-native data engineering, a passion for scalable systems, and the ability to work independently with minimal supervision.

Key Responsibilities:
- Design, build, and maintain scalable data pipelines and ETL/ELT workflows using tools such as Kestra or Prefect
- Architect and manage cloud-based data infrastructure on platforms like Snowflake, MySQL, and LanceDB
- Implement and uphold data quality, lineage, and governance best practices
- Collaborate with analytics, BI, and product teams to establish data models for reporting, experimentation, and operational use cases
- Optimize query performance, storage costs, and data reliability across platforms
- Oversee data ingestion from internal and external systems through APIs, CDC, or streaming technologies like Kafka and MQTT
- Develop automated data validation, testing, and monitoring frameworks to ensure data integrity
- Contribute to infrastructure-as-code and deployment processes using CI/CD pipelines and version control systems like Git
- Work independently and drive projects forward with minimal supervision

Skills and Qualifications:
- 5+ years of experience as a data engineer or software engineer in large-scale data systems
- Proficiency in SQL, Python, and modern data transformation frameworks
- Hands-on experience building and maintaining production-level ETL/ELT pipelines
- Familiarity with cloud data warehouses like Snowflake and Redpanda Cloud
- Expertise in workflow orchestration tools such as Airflow, Kestra, or Prefect
- Understanding of data modeling techniques like dimensional modeling and normalization
- Experience with cloud platforms such as AWS and Azure for data infrastructure and services
- Ability to work independently and lead projects with minimal guidance

Nice to Have:
- Experience with streaming data technologies, specifically Redpanda
- Knowledge of data security, privacy, and compliance practices, including GDPR and HIPAA
- Background in DevOps for data, encompassing containerization and observability tools
- Previous experience in a retail or e-commerce data environment

Software Qualifications:
- Languages: Python, SQL, Rust
- Data Warehousing: Snowflake, MySQL
- ETL/ELT Orchestration Tools: Kestra, Prefect
- Version Control & CI/CD: Git, GitHub Actions
- Orchestration & Infrastructure: Docker, Kubernetes, Redpanda, Cloudflare
- Monitoring: OpenobserveAI, Keep

Why Join Us:
- Become part of a forward-thinking company shaping the future of retail technology
- Collaborate with a dynamic and innovative team that values creativity
- Contribute to cutting-edge projects and enhance your skills
- Competitive salary and benefits package
- Flexible work environment with opportunities for career growth

Posted 6 days ago

Apply

1.0 - 5.0 years

0 Lacs

karnataka

On-site

At Goldman Sachs, our Engineers are dedicated to making the impossible possible. We are committed to changing the world by bridging the gap between people and capital with innovative ideas. Our mission is to tackle the most complex engineering challenges for our clients, crafting massively scalable software and systems, designing low-latency infrastructure solutions, proactively safeguarding against cyber threats, and harnessing the power of machine learning in conjunction with financial engineering to transform data into actionable insights. Join our engineering teams to pioneer new businesses, revolutionize finance, and seize opportunities in the fast-paced world of global markets.

Engineering at Goldman Sachs, consisting of our Technology Division and global strategists groups, stands at the heart of our business. Our dynamic environment demands creative thinking and prompt, practical solutions. If you are eager to explore the limits of digital possibilities, your journey starts here. Goldman Sachs Engineers embody innovation and problem-solving skills, developing solutions in domains such as risk management, big data, and mobile technology. We seek imaginative collaborators who can adapt to change and thrive in a high-energy, global setting.

The Data Engineering group at Goldman Sachs plays a pivotal role across all aspects of our business. Focused on offering a platform, processes, and governance to ensure the availability of clean, organized, and impactful data, Data Engineering aims to scale, streamline, and empower our core businesses. As a Site Reliability Engineer (SRE) on the Data Engineering team, you will oversee observability, cost, and capacity, with operational responsibility for some of our largest data platforms. We are actively involved in the entire lifecycle of platforms, from design to decommissioning, employing an SRE strategy tailored to this lifecycle.

We are looking for individuals who have a development background and are proficient in code. Candidates should prioritize Reliability, Observability, Capacity Management, DevOps, and SDLC (Software Development Lifecycle). As a self-driven leader, you should be comfortable tackling problems with varying degrees of complexity and translating them into data-driven outcomes. You should be actively engaged in strategy development, participate in team activities, conduct postmortems, and possess a problem-solving mindset.

Your responsibilities as a Site Reliability Engineer (SRE) will include driving the adoption of cloud technology for data processing and warehousing, formulating SRE strategies for major platforms like Lakehouse and Data Lake, collaborating with data consumers and producers to align reliability and cost objectives, and devising strategies with data using relevant technologies such as Snowflake, AWS, Grafana, PromQL, Python, Java, OpenTelemetry, and GitLab.

Basic qualifications for this role include a Bachelor's or Master's degree in a computational field, 1-4+ years of relevant work experience in a team-oriented environment, at least 1-2 years of hands-on developer experience, familiarity with DevOps and SRE principles, experience with cloud infrastructure (AWS, Azure, or GCP), a proven track record in driving data-oriented strategies, and a deep understanding of data multi-dimensionality, curation, and quality.

Preferred qualifications include familiarity with Data Lake / Lakehouse technologies, experience with cloud databases like Snowflake and BigQuery, understanding of data modeling concepts, working knowledge of open-source tools such as AWS Lambda and Prometheus, and proficiency in coding with Java or Python. Strong analytical skills, excellent communication abilities, a commercial mindset, and a proactive approach to problem-solving are essential traits for success in this role.

Posted 6 days ago

Apply

1.0 - 5.0 years

0 Lacs

karnataka

On-site

The company Loyalytics is a rapidly growing Analytics consulting and product organization headquartered in Bangalore. They specialize in assisting large retail clients worldwide to capitalize on their data assets through consulting projects and product accelerators. With a team of over 100 analytics practitioners, Loyalytics is at the forefront of utilizing cutting-edge tools and technologies in the industry. The technical team at Loyalytics comprises data scientists, data engineers, and business analysts who handle over 1 million data points daily. The company operates in a massive multi-billion dollar global market opportunity and boasts a leadership team with a combined experience of over 40 years. Loyalytics has gained a strong reputation in the market, with word-of-mouth and referral-driven marketing strategies that have attracted prestigious retail brands in the GCC regions like Lulu and GMG. One of the key distinguishing factors of Loyalytics is its 10-year history as a bootstrapped company that continues to expand its workforce, currently employing over 100 individuals. They are now seeking a passionate and detail-oriented BI Consultant Tableau with 1-2 years of experience to join their analytics team. The ideal candidate for this role should have a solid foundation in SQL and hands-on expertise in developing dashboards using Tableau. Responsibilities include designing, developing, and maintaining interactive dashboards and reports, writing efficient SQL queries, collaborating with cross-functional teams, ensuring data accuracy, and optimizing dashboard performance. Strong analytical and problem-solving skills, along with good communication and documentation abilities, are essential for success in this position. Required skills and qualifications for the BI Consultant Tableau role at Loyalytics include 1-2 years of professional experience in BI/Data Analytics roles, proficiency in writing complex SQL queries, hands-on experience with Tableau Desktop, understanding of data modeling concepts and ETL workflows, familiarity with other BI tools like Power BI and Qlik, exposure to Tableau Server or Tableau Cloud, and knowledge of cloud platforms or databases such as AWS, GCP, Azure, Snowflake, or BigQuery. This is an exciting opportunity to join a dynamic and innovative team at Loyalytics and contribute to transforming data into valuable insights for clients in the retail industry.,

Posted 6 days ago

Apply

6.0 - 10.0 years

0 Lacs

pune, maharashtra

On-site

You have a minimum of 6 years of experience in software engineering, with a strong foundation in the design and development of distributed systems. You excel at creating large-scale data processing pipelines for streaming and computing, utilizing data technologies such as AWS, Snowflake, or any Relational Database, and Kafka. Your expertise extends to containers, including Kubernetes, OpenShift, EKS, or similar technologies. You have hands-on experience in building cloud-native applications that operate on containers. Proficiency in at least one of the following programming languages is essential: Java, Scala, Python, or Go. You possess a keen enthusiasm and adaptability to learn new languages and concepts as required. Your effective communication skills, problem-solving abilities, and a continuous learning mindset enable you to establish enduring relationships. Moreover, you have a track record of writing well-structured, high-quality code that is easily maintainable by your peers.,

Posted 6 days ago

Apply

3.0 - 12.0 years

0 Lacs

hyderabad, telangana

On-site

You are invited to join our team as a Lead / Senior ETL & Data Migration QA Engineer in Hyderabad, India. In this role, you will play a crucial part in a significant data migration project, bringing your expertise in ETL testing, data validation, and cloud migration. Your primary responsibilities will include designing and implementing test strategies, executing test cases, validating data accuracy, and ensuring the integrity of large-scale data transformations. You will lead QA efforts across global teams, collaborating to deliver high-quality results. Your key responsibilities will involve developing robust test strategies for data migration and ETL processes, executing detailed test cases for data validation, performing SQL-based data testing, and using ETL tools like Talend for data pipeline validation. Additionally, you will lead QA activities for cloud data migration projects, coordinate testing efforts across teams, document test results and defects, and contribute to the development of automated testing frameworks. To qualify for this role, you should have at least 3 years of experience in QA with a focus on ETL testing, data validation, and data migration. Proficiency in SQL, hands-on experience with ETL tools such as Talend, Informatica PowerCenter or DataStage, familiarity with cloud data platforms like Snowflake, and understanding of semi-structured data formats like JSON and XML are essential requirements. Strong analytical and problem-solving skills, experience in leading QA efforts, and the ability to work in distributed teams are also necessary. Preferred skills for this position include experience with automated testing tools for ETL processes, knowledge of data governance and quality standards, familiarity with cloud ecosystems like AWS, and certification in software testing such as ISTQB. If you are passionate about quality assurance, data migration, and ETL processes, and are looking to make a significant impact in a dynamic work environment, we encourage you to apply for this role and be a part of our team dedicated to driving continuous improvement and excellence in QA practices.,

Posted 6 days ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

Zimetrics is a technology services and solutions provider specializing in Data, AI, and Digital. We help enterprises leverage the economic potential and business value of data from systems, machines, connected devices, and human-generated content. Our core principles are Integrity, Intellect, and Ingenuity, guiding our value system, engineering expertise, and organizational behavior. We are problem solvers and innovators who challenge conventional wisdom and believe in possibilities. You will be responsible for designing scalable and secure cloud-based data architecture solutions. Additionally, you will lead data modeling, integration, and migration strategies across platforms. It will be essential to engage directly with clients to understand their business needs and translate them into technical solutions. Moreover, you will support sales/pre-sales teams with solution architecture, technical presentations, and proposals. Collaboration with cross-functional teams including engineering, BI, and product will also be a part of your role. Ensuring best practices in data governance, security, and performance optimization is a key responsibility. To be successful in this role, you must have strong experience with Cloud platforms such as AWS, Azure, or GCP. A deep understanding of Data Warehousing concepts and tools like Snowflake, Redshift, BigQuery, etc., is essential. Proven expertise in data modeling, including conceptual, logical, and physical modeling, is required. Excellent communication and client engagement skills are a must. Previous experience in pre-sales or solution consulting will be advantageous. You should also have the ability to present complex technical concepts to non-technical stakeholders effectively.,

Posted 6 days ago

Apply

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

The ideal candidate should have expertise in Databricks, Lakebridge, AWS or other cloud platforms, and experience in technical team management. Proficiency in other cloud analytics platforms such as Redshift, BigQuery, or Snowflake would be an added advantage. As part of this role, you will be responsible for finalizing the solution architecture, system architecture, and technology architecture for the Databricks migration program/projects. You will also play a key role in finalizing the conversion methodology, including ETL, data pipelines, and visualization tools, using third-party accelerators and Databricks tools like Lakebridge. Additionally, you will provide technical guidance to a group of data engineers on converting code and database objects to Databricks-compliant formats, data loading, and data reconciliation between the old and new systems. An advanced Databricks certification will be a plus for this role.

Posted 6 days ago

Apply

3.0 - 7.0 years

0 Lacs

coimbatore, tamil nadu

On-site

You will be responsible for designing, building, and optimizing NetSuite analytics solutions and enterprise data warehouses as a NetSuite Analytics Developer & Data Warehousing expert. Your main focus will be on leveraging NetSuite's SuiteAnalytics tools in conjunction with external data warehousing platforms like Oracle Analytics Warehouse, Google Cloud Platform (GCP), and Snowflake. This will enable you to deliver scalable, data-driven insights through advanced visualizations and reporting across the organization. Your key responsibilities will include designing, developing, and maintaining SuiteAnalytics reports, saved searches, and dashboards within NetSuite to meet the evolving business needs. You will also be required to build and optimize data pipelines and ETL processes to integrate NetSuite data into enterprise data warehouses, develop data models and schemas, and maintain data marts to support business intelligence and analytical requirements. Additionally, you will implement advanced visualizations and reports using tools such as Tableau, Power BI, or Looker, ensuring high performance and usability. Collaboration with business stakeholders to gather requirements and translate them into effective technical solutions will be crucial in this role. You will also be responsible for monitoring, troubleshooting, and optimizing data flow and reporting performance, ensuring data governance, security, and quality standards are maintained across analytics and reporting systems. Providing documentation, training, and support to end-users on analytics solutions will also be part of your responsibilities. To qualify for this position, you should hold a Bachelor's degree in Computer Science, Information Systems, or a related field, along with at least 3 years of experience working with NetSuite, including SuiteAnalytics (saved searches, datasets, workbooks). Strong expertise in data warehousing concepts, ETL processes, data modeling, proficiency in SQL, and experience with BI and visualization tools like Tableau, Power BI, or Looker are essential. An understanding of data governance, compliance, and best practices in data security is also required. In summary, as a NetSuite Analytics Developer & Data Warehousing expert, you will play a vital role in designing, building, and optimizing analytics solutions and data warehouses to drive data-driven insights and reporting across the organization. Your expertise in SuiteAnalytics, data warehousing, ETL processes, and BI tools will be key in meeting the evolving business needs and ensuring high-quality analytics solutions are delivered.,

Posted 6 days ago

Apply

4.0 - 8.0 years

0 Lacs

pune, maharashtra

On-site

At Medtronic, you can embark on a life-long career dedicated to exploration and innovation, all while contributing to the cause of advancing healthcare access and equity for all. Your role will be pivotal in leading with purpose to break down barriers to innovation in a more connected and compassionate world.

As a PySpark Data Engineer at Medtronic's new MiniMed India Hub, you will play a crucial part in designing, developing, and maintaining data pipelines using PySpark. Collaborating closely with data scientists, analysts, and other stakeholders, your responsibilities will revolve around ensuring the efficient processing and analysis of large datasets, managing complex transformations and aggregations. This opportunity allows you to make a significant impact within Medtronic's Diabetes business. With the announcement of the intention to separate the Diabetes division to drive future growth and innovation, you will have the chance to operate with increased speed and agility. This move is expected to unlock potential and drive innovation to enhance the impact on patient care.

Key Responsibilities:
- Design, develop, and maintain scalable and efficient ETL pipelines using PySpark
- Collaborate with data scientists and analysts to understand data requirements and deliver high-quality datasets
- Implement data quality checks, ensure data integrity, and troubleshoot data pipeline issues
- Stay updated with the latest trends and technologies in big data and distributed computing

Required Knowledge and Experience:
- Bachelor's degree in Computer Science, Engineering, or a related field
- 4-5 years of experience in data engineering with a focus on PySpark
- Proficiency in Python and Spark, with strong coding and debugging skills
- Strong knowledge of SQL and experience with relational databases
- Hands-on experience with cloud platforms such as AWS, Azure, or Google Cloud Platform
- Experience with data warehousing solutions like Redshift, Snowflake, Databricks, or Google BigQuery
- Familiarity with data lake architectures, big data technologies, and data storage solutions
- Excellent problem-solving skills and ability to troubleshoot complex issues
- Strong communication and collaboration skills

Preferred Skills:
- Experience with Databricks and orchestration tools like Apache Airflow or AWS Step Functions
- Knowledge of machine learning workflows and data security best practices
- Familiarity with streaming data platforms, real-time data processing, and CI/CD pipelines

Medtronic offers a competitive salary and flexible benefits package. The company values its employees and provides resources and compensation plans to support their growth at every career stage. This position is eligible for the Medtronic Incentive Plan (MIP).

About Medtronic: Medtronic is a global healthcare technology leader committed to addressing the most challenging health problems facing humanity. With a mission to alleviate pain, restore health, and extend life, the company unites a team of over 95,000 passionate individuals who work tirelessly to generate real solutions for real people through engineering and innovation.
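
For context on what a PySpark ETL step like those described above might look like, below is a minimal, hypothetical sketch: it reads raw device readings, deduplicates and filters them, aggregates daily metrics, applies a simple quality gate, and writes a partitioned output. Paths, column names, and thresholds are invented for the example and are not Medtronic specifics.

```python
# Minimal PySpark ETL sketch (placeholder paths, columns, and thresholds).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("device-readings-etl").getOrCreate()

# Read raw readings from a landing zone (placeholder path)
raw = spark.read.parquet("s3://example-bucket/raw/device_readings/")

# Clean: drop duplicates, filter implausible values, derive a date column
clean = (
    raw.dropDuplicates(["device_id", "reading_ts"])
       .filter(F.col("glucose_mg_dl").between(20, 600))
       .withColumn("reading_date", F.to_date("reading_ts"))
)

# Aggregate daily metrics per device
daily = (
    clean.groupBy("device_id", "reading_date")
         .agg(F.avg("glucose_mg_dl").alias("avg_glucose"),
              F.count("*").alias("n_readings"))
)

# Simple data-quality gate before publishing
assert daily.filter(F.col("n_readings") <= 0).count() == 0, "empty aggregation groups"

# Write the curated, partitioned output (placeholder path)
(daily.write.mode("overwrite")
      .partitionBy("reading_date")
      .parquet("s3://example-bucket/curated/daily_glucose/"))
```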

Posted 6 days ago

Apply

6.0 - 10.0 years

0 Lacs

chennai, tamil nadu

On-site

Infiligence is a global technology company with offices in Chennai, India, Hyderabad, and California, USA. We are committed to delivering innovative data solutions to our clients worldwide. We offer a collaborative work environment, competitive compensation, and comprehensive employee benefits. As a Data Engineer at Infiligence, your primary responsibility will be to design, develop, and maintain scalable data pipelines for batch and real-time data processing using Azure (preferred) or AWS cloud platforms. You will build and optimize ETL processes to ensure high-quality, secure, and efficient data flow across systems. Collaborating with cross-functional teams, you will translate business requirements into robust data models and solutions. It is essential to implement data quality, data governance, and data security standards throughout the data lifecycle. You will also develop and maintain documentation, conduct code reviews, unit testing, and peer reviews to ensure code quality and compliance. Troubleshooting, monitoring, and resolving data pipeline and infrastructure issues to minimize business impact will be part of your daily tasks. Moreover, staying updated with new technologies and evaluating their organizational impact will be crucial for the role. To qualify for this position, you must have a minimum of 5-7 years of experience in data engineering, with a strong background in building and managing large-scale data pipelines. Hands-on experience with Azure Data Services (Data Factory, Data Lake, Synapse, Databricks, etc.) is preferred, while experience with AWS data services is also acceptable. Proficiency in SQL, Python, or Scala for data processing and transformation is required. Additionally, experience with data warehousing (e.g., Snowflake, SQL Server, MongoDB) and real-time databases is essential. A strong understanding of data architecture, data ingestion, curation, and consumption patterns is necessary. Familiarity with data quality management, metadata management, data lineage, and data security best practices will be advantageous for the role. Excellent communication skills and the ability to work collaboratively with global teams are also essential. Preferred skills for this role include experience with CI/CD processes and source control for data engineering workflows, knowledge of data observability and self-testing pipelines, as well as exposure to business intelligence and reporting platforms. At Infiligence, we offer comprehensive insurance coverage (health and statutory benefits as per Indian law), a competitive salary in line with industry standards, opportunities for professional growth, including support for technical certifications, and an inclusive and collaborative work culture. If you are interested in this position, please apply with your updated CV and a cover letter to our job URL. Shortlisted candidates will undergo an HR, technical assessment, and interviews. For any queries regarding the position or application process, please contact our Talent team at Infiligence, US or Chennai offices through careers@infiligence.com.,

Posted 6 days ago

Apply

5.0 - 9.0 years

0 Lacs

maharashtra

On-site

As an Intermediate Data Engineer at Assent, you will play a crucial role in advancing the data engineering practices of the company. Your primary responsibility will involve contributing to the development of secure, robust, scalable, and high-performing data platforms that align with Assent's business objectives. Collaborating with other data engineers, you will actively participate in designing, developing, and implementing sophisticated data solutions to enhance our data systems.

Key Requirements & Responsibilities:
- Support the design and implementation of scalable and high-performing data systems to develop data engineering solutions
- Develop and maintain data pipelines and infrastructure to ensure the reliability and efficiency of data processing workflows
- Assist in evaluating and selecting data technologies, infrastructure, and tools, and contribute to the implementation of data architectures and workflows
- Follow coding standards and best practices within the Data Engineering & Operations team to ensure quality and consistency through code reviews
- Collaborate with cross-functional teams, including database developers, software development, product management, and AI/ML developers, to align data initiatives with Assent's organizational goals
- Monitor progress, adjust priorities, and meet project deadlines and objectives by working closely with team members
- Identify opportunities for process improvements, including automation of manual processes and optimization of data delivery
- Expand knowledge and skills by learning from senior team members, sharing insights, and contributing to technical discussions
- Adhere to the corporate security policies and procedures set by Assent for data handling and management

Qualifications:
- Degree in a related field and a minimum of 5 years of experience in data engineering
- Proficiency in tools and languages such as AWS, dbt, Snowflake, Git, R, Python, SQL, and SQL Server
- Effective organizational skills with the ability to manage tasks and communicate technical concepts clearly
- Proficiency in collecting, organizing, and analyzing data with attention to detail and accuracy
- Understanding of data management systems, warehouse methodologies, data quality principles, data modeling techniques, and governance best practices
- Familiarity with agile work environments and scrum ceremonies
- Basic business acumen and experience aligning data initiatives with business objectives

Life at Assent: Assent values the well-being of its team members and offers comprehensive benefits packages, including vacation time that increases with tenure, life leave days, and more. Financial benefits include a competitive base salary, a corporate bonus program, retirement savings options, and additional perks. Assent provides flexible work options, volunteer days, and opportunities for team members to engage in corporate giving initiatives. Professional development days are available from the start to encourage lifelong learning.

Assent is committed to fostering an inclusive environment where all team members are valued, heard, and respected. Diversity, equity, and inclusion practices are championed by the Diversity and Inclusion Working Group and Employee Resource Groups to ensure a culture of belonging and equal opportunity for all team members. If you require assistance or accommodation during the interview process, please contact talent@assent.com for support.

Posted 6 days ago

Apply

3.0 - 7.0 years

0 Lacs

hyderabad, telangana

On-site

We are looking for a highly skilled Data Quality Manager with expertise in SQL, PySpark, Databricks, Snowflake, and CI/CD processes. As a Data Quality Manager, you will be responsible for designing, developing, and maintaining scalable data pipelines and infrastructure to support our data analytics and business intelligence requirements. You will collaborate closely with data scientists, analysts, and stakeholders to ensure the efficient processing and delivery of high-quality data.

Your key responsibilities will include designing, developing, and optimizing data pipelines using PySpark; writing complex SQL queries for data extraction, transformation, and loading (ETL); working with Databricks to build and maintain collaborative and scalable data solutions; implementing and managing CI/CD processes for data pipeline deployments; collaborating with data scientists and business analysts to understand data requirements; ensuring data quality, integrity, and security; and monitoring and troubleshooting data pipelines and workflows.

To qualify for this role, you should have a Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field. You must possess proven experience with PySpark, advanced proficiency in SQL, hands-on experience with Databricks, a strong understanding of CI/CD pipelines, familiarity with cloud platforms, excellent problem-solving skills, attention to detail, and strong communication and collaboration skills. Preferred skills for this role include knowledge of data warehousing concepts and tools (e.g., Snowflake, Redshift) and familiarity with the Kedro framework.

Novartis is dedicated to helping people with diseases and their families, and we believe this requires a community of smart, passionate individuals like you. If you are ready to collaborate, support, and inspire each other to achieve breakthroughs that positively impact patients' lives, we invite you to join us in creating a brighter future together.

If you are interested in this opportunity or want to stay connected for future career opportunities at Novartis, please visit our talent community at https://talentnetwork.novartis.com/network. For more information about the benefits and rewards we offer, please refer to our handbook at https://www.novartis.com/careers/benefits-rewards. Novartis is committed to an inclusive work environment and diverse teams that reflect the patients and communities we serve.

Posted 6 days ago

Apply

3.0 - 7.0 years

0 Lacs

hyderabad, telangana

On-site

You are a skilled Data Engineer with a strong background in Python, Snowflake, and AWS. Your primary responsibility will involve constructing and refining scalable data pipelines, integrating various data sources, and supporting analytics and business intelligence solutions within a cloud-based setting. An essential aspect of your role will entail the design and supervision of AWS Glue Jobs to facilitate efficient, serverless ETL workflows. Your key duties will revolve around designing and executing robust data pipelines using AWS Glue, Lambda, and Python. You will extensively collaborate with Snowflake for data warehousing, modeling, and analytics assistance. Managing ETL/ELT jobs using AWS Glue to ensure consistent data reliability will be a crucial part of your responsibilities. Furthermore, you will be tasked with migrating data between CRM systems, particularly from Snowflake to Salesforce, adhering to defined business protocols and ensuring data precision. It will also be your responsibility to optimize SQL/SOQL queries, manage large data volumes, and sustain high-performance levels. Additionally, implementing data normalization and quality checks will be essential to guarantee accurate, consistent, and deduplicated records. Your required skills include strong proficiency in Python, hands-on experience with Snowflake Data Warehouse, and familiarity with AWS services such as Glue, S3, Lambda, Redshift, and CloudWatch. You should have experience in ETL/ELT pipelines and data integration using AWS Glue Jobs, along with expertise in SQL and SOQL for data extraction and transformation. Moreover, an understanding of data modeling, normalization, and performance optimization is essential for this role. It would be advantageous if you have experience with Salesforce Data Loader, ETL mapping, and metadata-driven migration, as well as exposure to CI/CD tools, DevOps, and version control systems like Git. Previous work experience in Agile/Scrum environments will also be beneficial for this position.,
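
To illustrate the Glue-to-Snowflake pattern this posting outlines (a sketch only, not the employer's actual pipeline), the following hypothetical AWS Glue job reads raw CRM extracts from S3, normalizes and deduplicates records, and loads them into a staging table via the Snowflake Spark connector. It assumes the connector is attached to the Glue job; all names, paths, and credentials are placeholders.

```python
# Minimal AWS Glue job sketch (placeholder paths, table names, and credentials;
# assumes the Snowflake Spark connector JARs are attached to the job).
import sys
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext
from pyspark.sql import functions as F

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# 1. Read raw CRM extracts from S3 (placeholder path)
accounts = spark.read.json("s3://example-bucket/raw/crm/accounts/")

# 2. Normalize and deduplicate before loading
normalized = (
    accounts.withColumn("email", F.lower(F.trim("email")))
            .dropDuplicates(["email"])
)

# 3. Load into a Snowflake staging table via the Spark connector
sf_options = {
    "sfURL": "myaccount.snowflakecomputing.com",  # placeholder
    "sfUser": "etl_user",                         # placeholder
    "sfPassword": "REPLACE_ME",                   # fetch from Secrets Manager in real jobs
    "sfDatabase": "ANALYTICS",
    "sfSchema": "STAGING",
    "sfWarehouse": "LOAD_WH",
}
(normalized.write.format("net.snowflake.spark.snowflake")
           .options(**sf_options)
           .option("dbtable", "STG_ACCOUNTS")
           .mode("overwrite")
           .save())

job.commit()
```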

Posted 6 days ago

Apply

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

Job Description: As an MSBI Developer, you will be responsible for utilizing your expertise in Power BI, SQL Server, and other related technologies to develop and maintain business intelligence solutions. With at least 5 years of experience and a relevant graduate degree, you will play a key role in creating data visualizations, optimizing data processes, and ensuring the smooth functioning of BI tools.

Your primary responsibilities will include:
- Demonstrating proficiency in Power BI tools such as PBI Desktop, the DAX language, Power Query, the M language, PBI Report Server, and RLS
- Utilizing advanced T-SQL skills in the SSMS environment to work with SQL Server databases
- Possessing a basic understanding of Oracle PL/SQL for database management tasks
- Developing and debugging packages using Microsoft SSIS, and deploying them to the SSIS Catalogue for efficient data integration
- Leveraging your expertise in Microsoft SSRS to build and customize reports using the Report Builder tool
- Familiarity with Microsoft Visual Studio data tools for enhancing BI solutions
- Having a strong grasp of star and snowflake schema modelling concepts to design efficient data models

Key Skills:
- Microsoft Power BI
- PBI Desktop
- DAX language
- Power Query
- M language
- PBI Report Server
- RLS
- SSMS
- PL/SQL
- T-SQL
- SSRS
- SSIS
- Snowflake

If you are a proactive and detail-oriented individual with a passion for data analytics and business intelligence, and possess the required skills and experience, we invite you to join our team as a Senior MSBI Developer in Mumbai. This is a full-time, permanent position in the IT/Computers - Software industry. Please apply with Job Code: GO/JC/040/2025.

Posted 6 days ago

Apply

5.0 - 10.0 years

20 - 25 Lacs

Noida

Remote

Architect and manage data solutions using Snowflake and advanced SQL. Design and implement data pipelines, data warehouses, and data lakes, ensuring efficient data transformation. Develop best practices for data security, access control, and compliance.

Required candidate profile: 8-14 years of experience; strong data architecture background with SQL and Snowflake expertise is a must; ability to collaborate with cross-functional teams and translate their requirements into robust data architectures; manufacturing industry experience is a must.

Posted 6 days ago

Apply

8.0 - 13.0 years

20 - 35 Lacs

Gurugram

Work from Office

- A minimum of 10 years of experience in data architecture, data engineering, or a related field
- Proven expertise in Snowflake and data transformation processes within Snowflake
- Strong background in Data Warehouse (DWH) and Business Intelligence (BI) architecture
- Experience with Salesforce and CPQ data and architecture
- Proficiency with BI tools such as Tableau, Power BI, Sigma Computing, and others
- In-depth understanding of financial bookings and revenue reports in Salesforce and the DWH
- Excellent problem-solving skills and the ability to work under pressure in a fast-paced environment
- Strong leadership and team management skills, with the ability to motivate and guide a team of technical professionals
- Exceptional communication and collaboration skills, with the ability to interact effectively with stakeholders at all levels

Posted 6 days ago

Apply

6.0 - 11.0 years

15 - 25 Lacs

Kochi, Chennai, Bengaluru

Hybrid

Job Role: Data Quality Integration Engineer
Location: PAN India

Role Overview:
As a Data Quality Integration Engineer, you will be responsible for embedding data quality capabilities across enterprise data landscapes. You will lead the integration of advanced data quality tools such as Ataccama and Collibra with cloud data platforms like Snowflake and SQL databases. This role is essential in ensuring our data governance standards are met with robust, scalable, and automated data quality processes.

Role Proficiency:
- Develop scalable applications using suitable technical options
- Optimize application development, maintenance, and performance
- Reuse proven design patterns and manage peer development activities

Key Responsibilities:

Technical & Functional Responsibilities:
- Design and implement integration of data quality tools (Ataccama, Collibra, etc.) with Snowflake and SQL-based platforms
- Develop automated pipelines and connectors for profiling, cleansing, monitoring, and validating data
- Configure and manage data quality rules and workflows aligned to governance policies and KPIs
- Troubleshoot integration issues, monitor performance, and optimize reliability and efficiency
- Collaborate with Data Governance, Architecture, and Engineering teams to align solutions with business needs
- Maintain comprehensive documentation for integration solutions and configurations

Software Engineering Deliverables:
- Code: Adhere to coding standards, perform peer reviews, and write optimized code
- Documentation: Create/review design documents, templates, test cases, and checklists
- Testing: Develop/review unit and integration test cases; support QA teams
- Configuration: Define and manage configuration management practices
- Release: Execute and oversee release processes

Project & Team Management:
- Estimate efforts for project deliverables and track timelines
- Perform defect RCA, trend analysis, and propose quality improvements
- Set and review FAST goals for self and team
- Mentor team members, manage aspirations, and keep the team engaged

Key Outcomes & Metrics:
- Timely adherence to engineering and project standards
- Minimal post-delivery defects and technical issues
- Compliance with mandatory training and documentation processes
- Increased customer satisfaction and domain relevance

Skills & Technologies:

Mandatory Skills:
- Strong experience with data quality tools (Ataccama, Collibra)
- Hands-on experience with Snowflake and SQL databases (e.g., PostgreSQL, SQL Server, Oracle)
- Proficiency in SQL scripting and data pipeline development (Python or Scala preferred)
- Sound understanding of data profiling, cleansing, enrichment, and monitoring
- Familiarity with REST APIs and metadata integration techniques

Desirable Skills:
- Experience with cloud platforms (AWS, Azure) hosting Snowflake
- Certification in Collibra, Ataccama, or Snowflake
- Exposure to financial services or regulated industries
- Prior involvement in data stewardship/governance initiatives

Soft Skills:
- Strong analytical and problem-solving abilities
- Ability to manage high-pressure environments and multiple priorities
- Effective communication and presentation skills
- Ability to mentor and guide junior engineers
- Business etiquette in professional interactions

Certifications (Preferred):
- Ataccama/Collibra Certified Professional
- Snowflake Architect/Developer Certification
- AWS/Azure Data Engineering Certifications

Domain Knowledge:
- Deep understanding of enterprise data architecture and governance
- Knowledge of financial services, insurance, or asset management domains is an advantage
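
As an informal sketch of the profile-and-publish pattern this role revolves around (not an actual Ataccama or Collibra integration), the Python snippet below profiles a placeholder Snowflake table with plain SQL and posts the resulting metrics to a generic REST endpoint. The real tools expose their own documented APIs; the endpoint, payload, and table names here are hypothetical.

```python
# Minimal sketch: profile a Snowflake table and push metrics to a DQ/governance
# tool over REST. Endpoint, credentials, and table names are placeholders.
import requests
import snowflake.connector

PROFILE_SQL = """
SELECT
    COUNT(*)                         AS row_count,
    COUNT_IF(customer_email IS NULL) AS null_emails,
    COUNT(DISTINCT customer_id)      AS distinct_ids
FROM analytics.curated.customers
"""

def profile_and_publish(dq_api_url: str, api_token: str) -> None:
    conn = snowflake.connector.connect(
        account="my_account", user="dq_user", authenticator="externalbrowser"  # placeholders
    )
    try:
        with conn.cursor() as cur:
            cur.execute(PROFILE_SQL)
            row_count, null_emails, distinct_ids = cur.fetchone()
    finally:
        conn.close()

    metrics = {
        "asset": "analytics.curated.customers",
        "rowCount": row_count,
        "nullEmailRatio": (null_emails / row_count) if row_count else None,
        "distinctIds": distinct_ids,
    }
    # Placeholder endpoint; replace with the tool's documented metadata/DQ API.
    resp = requests.post(
        dq_api_url,
        json=metrics,
        headers={"Authorization": f"Bearer {api_token}"},
        timeout=30,
    )
    resp.raise_for_status()
```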

Posted 6 days ago

Apply

6.0 - 10.0 years

30 - 35 Lacs

Bengaluru

Work from Office

Role Description: As a Data Engineering Lead, you will play a crucial role in overseeing the design, development, and maintenance of our organization's data architecture and infrastructure. You will be responsible for designing and developing the architecture for the data platform, ensuring the efficient and effective processing of large volumes of data and enabling the business to make informed decisions based on reliable and high-quality data. The ideal candidate will have a strong background in data engineering, excellent leadership skills, and a proven track record of successfully managing complex data projects.

Responsibilities:
- Data Architecture and Design: Design and implement scalable and efficient data architectures to support the organization's data processing needs. Work closely with cross-functional teams to understand data requirements and ensure that data solutions align with business objectives.
- ETL Development: Oversee the development of robust ETL processes to extract, transform, and load data from various sources into the data warehouse. Ensure data quality and integrity throughout the ETL process, implementing best practices for data cleansing and validation.
- Big Data Technology: Stay abreast of emerging trends and technologies in big data and analytics, and assess their applicability to the organization's data strategy. Implement and optimize big data technologies to process and analyze large datasets efficiently.
- Cloud Integration: Collaborate with the IT infrastructure team to integrate data engineering solutions with cloud platforms, ensuring scalability, security, and performance.
- Performance Monitoring and Optimization: Implement monitoring tools and processes to track the performance of data pipelines and proactively address any issues. Optimize data processing.
- Documentation: Maintain comprehensive documentation for data engineering processes, data models, and system architecture. Ensure that team members follow documentation standards and best practices.
- Collaboration and Communication: Collaborate with data scientists, analysts, and other stakeholders to understand their data needs and deliver solutions that meet those requirements. Communicate effectively with technical and non-technical stakeholders, providing updates on project status, challenges, and opportunities.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field
- 6-8 years of professional experience in data engineering
- In-depth knowledge of data modeling, ETL processes, and data warehousing
- In-depth knowledge of building a data warehouse using Snowflake
- Experience in data ingestion, data lakes, data mesh, and data governance
- Experience in Python programming
- Strong understanding of big data technologies and frameworks such as Hadoop, Spark, and Kafka
- Experience with cloud platforms such as AWS, Azure, or Google Cloud
- Familiarity with database systems like SQL, NoSQL, and data pipeline orchestration tools
- Excellent problem-solving and analytical skills
- Strong communication and interpersonal skills
- Proven ability to work collaboratively in a fast-paced, dynamic environment

Posted 6 days ago

Apply

7.0 - 12.0 years

15 - 30 Lacs

Hyderabad

Work from Office

Responsibilities:
- Design, develop, and optimize ETL/ELT pipelines for data ingestion, transformation, and loading using Snowflake
- Build and maintain scalable and robust data warehousing solutions
- Work closely with data architects, analysts, and business stakeholders to gather requirements and deliver solutions
- Optimize Snowflake performance through cluster and warehouse management and query tuning
- Monitor data pipelines and troubleshoot any issues related to data ingestion or transformation
- Implement data security, governance, and compliance best practices within the Snowflake environment
- Write complex SQL queries and stored procedures for data manipulation and reporting
- Collaborate with BI and analytics teams to support data extraction and reporting needs
- Document processes, architecture, and best practices

Required Skills:
- Strong experience with the Snowflake data platform (warehouses, micro-partitions, streams, tasks)
- Expertise in ETL tools and frameworks (e.g., Talend, Informatica, Apache NiFi, or native Snowflake tasks)
- Proficiency in SQL and performance tuning
- Experience with data modeling and dimensional modeling techniques
- Familiarity with cloud platforms like AWS, Azure, or GCP is a plus
- Good understanding of data governance, data security, and compliance
- Strong analytical, problem-solving, and communication skills
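
For readers unfamiliar with the Snowflake streams and tasks mentioned above, here is a minimal, hypothetical sketch of that pattern: a stream captures changes on a staging table and a scheduled task merges them into a target table. Object names, the schedule, and the MERGE columns are placeholders, executed through the Snowflake Python connector.

```python
# Minimal stream + task deployment sketch (placeholder object names).
import snowflake.connector

STATEMENTS = [
    # Change-capture stream on the staging table
    "CREATE OR REPLACE STREAM STG.ORDERS_STREAM ON TABLE STG.ORDERS",
    # Scheduled task that merges captured changes into the target table
    """
    CREATE OR REPLACE TASK STG.LOAD_ORDERS_TASK
      WAREHOUSE = TRANSFORM_WH
      SCHEDULE = '5 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('STG.ORDERS_STREAM')
    AS
      MERGE INTO CORE.ORDERS t
      USING STG.ORDERS_STREAM s
        ON t.ORDER_ID = s.ORDER_ID
      WHEN MATCHED THEN UPDATE SET t.STATUS = s.STATUS, t.UPDATED_AT = s.UPDATED_AT
      WHEN NOT MATCHED THEN INSERT (ORDER_ID, STATUS, UPDATED_AT)
        VALUES (s.ORDER_ID, s.STATUS, s.UPDATED_AT)
    """,
    # Tasks are created suspended; resume to start the schedule
    "ALTER TASK STG.LOAD_ORDERS_TASK RESUME",
]

def deploy(conn) -> None:
    with conn.cursor() as cur:
        for stmt in STATEMENTS:
            cur.execute(stmt)

if __name__ == "__main__":
    conn = snowflake.connector.connect(
        account="my_account", user="elt_user", authenticator="externalbrowser",  # placeholders
        database="ANALYTICS", warehouse="TRANSFORM_WH",
    )
    deploy(conn)
```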

Posted 6 days ago

Apply

0.0 - 1.0 years

1 - 2 Lacs

Lucknow

Work from Office

- Develop and maintain cloud-based data pipelines using tools like Apache Airflow or AWS Glue
- Support ETL processes and ensure data quality and consistency
- Monitor and troubleshoot cloud infrastructure performance
- Implement data security and compliance

Posted 6 days ago

Apply

7.0 - 12.0 years

19 - 22 Lacs

Bengaluru

Hybrid

Role & responsibilities: We are looking for a Senior Snowflake Developer for Bangalore (hybrid, 2 days in office) with 7+ years of experience in Snowflake, stored procedures, Python, and cloud platforms.

Posted 6 days ago

Apply

2.0 - 5.0 years

20 - 25 Lacs

Hyderabad

Work from Office

About the Role:
We are looking for an Analytics Engineer with 2+ years of experience to help build and maintain our modern data platform. You'll work with dbt, Snowflake, and Airflow to develop clean, well-documented, and trusted datasets. This is a hands-on role ideal for someone who wants to grow their technical skills while contributing to a high-impact analytics function.

Key Responsibilities:
- Build and maintain scalable data models using dbt and Snowflake
- Develop and orchestrate data pipelines with Airflow or similar tools
- Partner with teams across DAZN to translate business needs into robust datasets
- Ensure data quality through testing, validation, and monitoring practices
- Follow best practices in code versioning, CI/CD, and data documentation
- Contribute to the evolution of our data architecture and team standards

What We're Looking For:
- 2+ years of experience in analytics/data engineering or similar roles
- Strong skills in SQL and working knowledge of cloud data warehouses (Snowflake preferred)
- Experience with dbt for data modeling and transformation
- Familiarity with Airflow or other workflow orchestration tools
- Understanding of ELT processes, data modeling, and data governance principles
- Strong collaboration and communication skills

Nice to Have:
- Experience working in media, OTT, or sports technology domains
- Familiarity with BI tools like Looker, Tableau, or Power BI
- Exposure to testing frameworks like dbt tests or Great Expectations
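
As a small illustration of the dbt-plus-Airflow orchestration this role describes (a sketch under assumed paths and Airflow 2.4+ APIs, not DAZN's actual setup), the DAG below runs dbt models against Snowflake and then runs dbt tests. The dbt profiles and Snowflake credentials are assumed to be configured separately.

```python
# Minimal Airflow DAG sketch (placeholder project path and schedule) that runs
# dbt models, then dbt tests, against a Snowflake target.
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

DBT_DIR = "/opt/airflow/dbt/analytics_project"  # placeholder

with DAG(
    dag_id="dbt_snowflake_daily",
    start_date=datetime(2024, 1, 1),
    schedule="0 6 * * *",   # daily at 06:00 UTC (Airflow 2.4+ 'schedule' argument)
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command=f"cd {DBT_DIR} && dbt run --target prod",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command=f"cd {DBT_DIR} && dbt test --target prod",
    )
    dbt_run >> dbt_test
```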

Posted 6 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.


Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies