
2470 Snowflake Jobs - Page 29

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 7.0 years

0 Lacs

Delhi

On-site

As a Snowflake DBT Lead at Pyramid Consulting, you will oversee Snowflake data transformation and validation processes in Delhi, India. Your role includes ensuring efficient data handling, maintaining data quality, and collaborating closely with cross-functional teams. To excel in this role, you should have strong expertise in Snowflake, DBT, and SQL; your experience in data transformation, modeling, and validation will be crucial, and proficiency in ETL processes and data warehousing is essential. Excellent problem-solving and communication skills will enable you to address challenges effectively and work seamlessly with team members. Candidates should hold a Bachelor's degree in Computer Science or a related field. Your ability to lead and collaborate within a team environment will be key to delivering high-quality solutions and driving impactful results for our clients.

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

Hyderabad, Telangana

On-site

Optum is a global organization dedicated to delivering care, using technology to improve the lives of millions of people. Your work with our team will directly enhance health outcomes by providing individuals with access to care, pharmacy benefits, data, and the resources necessary for their well-being. Our culture is defined by diversity and inclusion, talented colleagues, comprehensive benefits, and opportunities for career development. Join us in making a positive impact on the communities we serve while advancing global health equity through caring, connecting, and growing together.

In this role, your primary responsibilities will include:
- Analyzing client requirements and complex business scenarios
- Designing innovative and fully automated products and solutions
- Serving as a BI Developer for key projects and ensuring high-quality execution of products
- Providing consulting to teammates, leaders, and clients
- Offering extensive solutions in ETL strategies

Requirements:
- An undergraduate degree or equivalent experience
- Expertise in ETL processes and data integration using Azure Data Factory
- Proficiency in Power BI semantic model creation, report development, and data visualization, with Snowflake and Azure Data Warehouse as primary data sources
- A strong understanding of data modeling concepts, relational database systems, Snowflake, and Azure Data Warehouse
- Familiarity with Databricks for data engineering, advanced analytics, and machine learning tasks (preferred)
- Proficiency in Azure Cloud services such as Azure Data Factory, Azure SQL Data Warehouse, Azure Data Lake Storage, and Azure Analytics
- Solid programming skills in Python, SQL, and other scripting languages
- Proven problem-solving abilities, effective communication and collaboration skills, and the capacity to manage multiple tasks simultaneously
- Microsoft certifications in Power BI, Azure Cloud, Snowflake, or related fields are a plus

The role is based in Hyderabad, Telangana, IN.

Posted 2 weeks ago

Apply

8.0 - 13.0 years

0 Lacs

Hyderabad, Telangana

On-site

At Techwave, we are committed to fostering a culture of growth and inclusivity. We ensure that every individual associated with our brand is challenged at every step and provided with the opportunities to excel professionally and personally. People are at the core of everything we do.

Techwave is a leading global IT and engineering services and solutions company dedicated to revolutionizing digital transformations. Our mission is to enable clients to maximize their potential and achieve greater market share through a wide array of technology services, including Enterprise Resource Planning, Application Development, Analytics, Digital solutions, and the Internet of Things (IoT). Founded in 2004 and headquartered in Houston, TX, USA, Techwave leverages its expertise in Digital Transformation, Enterprise Applications, and Engineering Services to help businesses accelerate their growth. We are a team of dreamers and doers who constantly push the boundaries of what's possible, and we want YOU to be a part of it.

Job Title: Data Lead
Experience: 10+ Years
Mode of Hire: Full-time

Key Skills: As a senior-level ETL developer with 10-13 years of experience, you will build relational and data warehousing applications. Your primary role will involve supporting the existing EDW, designing and developing the various layers of our data, and testing, documenting, and optimizing the ETL process. You will collaborate within a team environment to design and develop frameworks and services according to specifications. Your responsibilities will also include preparing detailed system documentation, performing unit and system tests, coordinating with Operations staff on application deployment, and ensuring that all activities meet quality and compliance standards. Additionally, you will design and implement ETL batches that meet SLAs; develop data collection, staging, movement, quality, and archiving strategies; and design automation processes to control data access and movement.

To excel in this role, you must have 8-10 years of ETL/ELT experience, strong SQL skills, and proficiency in stored procedures and database development. Experience in Azure Data Lake, Synapse, Azure Data Factory, and Databricks, as well as Snowflake, is essential. You should have a good understanding of data warehouse ETL and ELT design best practices, be able to work independently, and have strong database experience with DB2, SQL Server, and Azure. You should also be adept at designing relational and dimensional data models, have a good grasp of enterprise reporting (particularly Power BI), and understand Agile practices and methodologies. Your role will also involve analyzing and extracting relevant information from historical business data to support Business Intelligence initiatives, conducting Proofs of Concept for new technology selection, and proposing data warehouse architecture enhancements.

If you are a self-starter with the required skills and experience, we invite you to join our dynamic team at Techwave and be part of our journey towards innovation and excellence.

Posted 2 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Lead Data Engineer with 8-12 years of experience, you will handle a variety of tasks for one of our clients in Hyderabad. This is a full-time position with an immediate start date. Proficiency in Python, Spark, SQL, Snowflake, Airflow, AWS, and DBT is essential for this role. You will work on a range of data engineering tasks using this skill set; your expertise in these technologies will be crucial for developing efficient data pipelines, ensuring data quality, and optimizing data workflows. Furthermore, you will collaborate with cross-functional teams to understand data requirements, design and implement data solutions, and provide technical guidance on best practices. Your ability to communicate effectively and work well in a team setting will be key to your success. If you are interested in this opportunity and possess the required skill set, please share your profile with us at srujanat@teizosoft.com. We look forward to potentially having you join our team in Hyderabad.
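As an illustration of how this stack typically fits together (not part of the posting itself), here is a minimal Airflow DAG sketch that stages data, loads it into Snowflake, and hands transformation off to DBT. The DAG id, task bodies, and dbt project path are hypothetical placeholders.

```python
# Minimal sketch, assuming Airflow 2.4+ with the TaskFlow API. All names,
# paths, and connection details are hypothetical illustrations.
from datetime import datetime

from airflow.decorators import dag, task
from airflow.operators.bash import BashOperator


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def daily_snowflake_pipeline():
    @task
    def extract_to_stage() -> str:
        # A real task might land S3 files in a Snowflake stage; here we
        # just return a placeholder stage path.
        return "@raw_stage/sales/"

    @task
    def load_to_snowflake(stage_path: str) -> None:
        # A COPY INTO would typically run here via the Snowflake provider.
        print(f"COPY INTO raw.sales FROM {stage_path}")

    # Transformations are delegated to DBT once the raw data has landed.
    run_dbt = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",
    )

    load_to_snowflake(extract_to_stage()) >> run_dbt


daily_snowflake_pipeline()
```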

Posted 2 weeks ago

Apply

3.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Talend Data Engineer specializing in Talend and Snowflake, you will develop and maintain ETL workflows using Talend and Snowflake. Your role involves designing and optimizing data pipelines for performance and scalability, and collaborating with various teams to address data integration and reporting requirements. You will also ensure data quality, governance, and security protocols. To excel in this role, you should have 3 to 9 years of experience working with Talend ETL and Snowflake. Proficiency in SQL and Python and working knowledge of cloud platforms such as AWS, Azure, and Google Cloud are essential. Experience constructing end-to-end data pipelines and familiarity with data warehouses are key requirements; experience in Snowflake performance tuning and an understanding of Agile methodologies are advantageous. This is a full-time position on a day shift from Monday to Friday, based at our office in Nagpur, Pune, Bangalore, or Chennai. Join us in this dynamic opportunity to leverage your expertise in Talend and Snowflake to drive impactful data solutions while collaborating with cross-functional teams to meet business objectives.

Posted 2 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

Maharashtra

On-site

If you are a software engineering leader ready to take the reins and drive impact, we've got an opportunity just for you. As a Director of Software Engineering at JPMorgan Chase within the Asset and Wealth Management LOB, you lead a data technology area and drive impact within teams, technologies, and deliveries. You will utilize your in-depth knowledge of software, applications, technical processes, and product management to drive multiple complex initiatives, serve as a primary decision maker for your teams, and be a driver of engineering innovation and solution delivery. The current role focuses on delivering data solutions for some of the Wealth Management businesses.

Job responsibilities:
- Leads engineering and delivery of data and analytics solutions
- Makes decisions that influence teams' resources, budget, tactical operations, and the execution and implementation of processes and procedures
- Carries governance accountability for coding decisions, control obligations, and measures of success such as cost of ownership and maintainability
- Delivers technical solutions that can be leveraged across multiple businesses and domains
- Influences and collaborates with peer leaders and senior stakeholders across the business, product, and technology teams
- Champions the firm's culture of diversity, equity, inclusion, and respect

Required qualifications, capabilities, and skills:
- Experience managing data solutions across a large, global consumer community in the Financial Services domain
- Experience hiring, developing, and leading cross-functional teams of technologists
- Experience handling multiple, global stakeholders across business, technology, and product
- Appreciation of the data product: modeling, sourcing, quality, lineage, discoverability, access management, visibility, purging, etc.
- Experience researching and upgrading to the latest technologies in the continuously evolving data ecosystem
- Practical hybrid cloud-native experience, preferably AWS
- Experience using current technologies such as GraphQL, Glue, Spark, Snowflake, SNS, SQS, Kinesis, Lambda, ECS, EventBridge, QlikSense, etc.
- Experience with Java and/or Python programming languages
- Expertise in Computer Science, Computer Engineering, Mathematics, or a related technical field

Preferred qualifications, capabilities, and skills:
- Comfortable being hands-on as required to drive solutions and solve challenges for the team
- Exposure to and appreciation of the continuously evolving data science space
- Exposure to the Wealth Management business

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

Location: Pune/Nagpur. Immediate joiners only.

Job Description

Key Responsibilities:
- Design, develop, and maintain robust data pipelines and ETL processes using Snowflake and other cloud technologies
- Work with large datasets, ensuring their availability, quality, and performance across systems
- Implement data models and optimize storage and query performance on Snowflake
- Write complex SQL queries for data extraction, transformation, and reporting purposes
- Develop, test, and implement data solutions leveraging Python scripting and Snowflake's native features (see the sketch below)
- Collaborate with data scientists, analysts, and business stakeholders to deliver scalable data solutions
- Monitor and troubleshoot data pipelines to ensure smooth operation and efficiency
- Perform data migration, integration, and processing tasks across cloud platforms
- Stay updated with the latest developments in Snowflake, SQL, and cloud technologies

Required Skills:
- Snowflake: expertise in building, optimizing, and managing data warehousing solutions on Snowflake
- SQL: strong knowledge of SQL for querying and managing relational databases, writing complex queries and stored procedures, and performance tuning
- Python: proficiency in Python for scripting, automation, and integration within data pipelines
- Experience developing and managing ETL processes and ensuring data accuracy and performance
- Hands-on experience with data migration and integration processes across cloud platforms
- Familiarity with data security and governance best practices
- Strong problem-solving skills with the ability to troubleshoot and resolve data-related issues

(ref:hirist.tech)
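For context on the "Python scripting plus Snowflake" combination this listing describes, below is a minimal sketch using the snowflake-connector-python package to run an idempotent transformation. The credentials, warehouse, and table names are hypothetical placeholders, not details from the posting.

```python
# Minimal sketch, assuming the snowflake-connector-python package. All
# connection parameters and identifiers are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # placeholder account identifier
    user="etl_user",
    password="***",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="STAGING",
)
try:
    cur = conn.cursor()
    # MERGE keeps the daily aggregation idempotent, so the pipeline can be
    # re-run safely after a failure.
    cur.execute("""
        MERGE INTO reporting.daily_orders AS t
        USING (
            SELECT order_date, COUNT(*) AS order_count, SUM(amount) AS revenue
            FROM staging.raw_orders
            GROUP BY order_date
        ) AS s
        ON t.order_date = s.order_date
        WHEN MATCHED THEN UPDATE SET
            order_count = s.order_count, revenue = s.revenue
        WHEN NOT MATCHED THEN INSERT (order_date, order_count, revenue)
            VALUES (s.order_date, s.order_count, s.revenue)
    """)
    print(f"Rows affected: {cur.rowcount}")
finally:
    conn.close()
```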

Posted 2 weeks ago

Apply

10.0 - 17.0 years

0 Lacs

Hyderabad, Telangana

On-site

We have an exciting opportunity for an ETL Data Architect position with an AI/ML-driven SaaS solution product company in Hyderabad. As an ETL Data Architect, you will play a crucial role in designing and implementing a robust Data Access Layer that provides consistent data access over the underlying heterogeneous storage layer. You will also develop and enforce data governance policies to ensure data security, quality, and compliance across all systems. In this role, you will lead the architecture and design of data solutions that leverage the latest tech stack and AWS cloud services. Collaboration with product managers, tech leads, and cross-functional teams will be essential to align data strategy with business objectives. Additionally, you will oversee data performance optimization, scalability, and reliability of data systems while guiding and mentoring team members on data architecture, design, and problem-solving.

The ideal candidate has at least 10 years of experience in data-related roles, with a minimum of 5 years in a senior leadership position overseeing data architecture and infrastructure. A deep background in designing and implementing enterprise-level data infrastructure, preferably in a SaaS environment, is required, along with extensive knowledge of data architecture principles, data governance frameworks, security protocols, and performance optimization techniques. Hands-on experience with AWS services such as RDS, Redshift, S3, Glue, and DocumentDB, as well as other services like MongoDB and Snowflake, is highly desirable. Familiarity with big data technologies (e.g., Hadoop, Spark) and modern data warehousing solutions is a plus. Proficiency in at least one programming language (e.g., Node.js, Java, Golang, Python) is a must.

Excellent communication skills are crucial in this role, including the ability to translate complex technical concepts for non-technical stakeholders. Proven leadership experience, including team management and cross-functional collaboration, is also required. A Bachelor's degree in Computer Science, Information Systems, or a related field is necessary; a Master's degree is preferred. Preferred qualifications include experience with Generative AI and Large Language Models (LLMs) and their applications in data solutions, as well as familiarity with financial back-office operations and the FinTech domain. You should stay updated on emerging trends in data technology, particularly AI/ML applications for finance.

Industry: IT Services and IT Consulting

Posted 2 weeks ago

Apply

7.0 - 10.0 years

30 - 32 Lacs

Hyderabad

Work from Office

- 6+ years of Java development
- Strong knowledge of SQL
- Agile development methodologies
- Working experience with Snowflake and its native features (Snowpark, data shares) (see the sketch below)
- Python
- Understanding of core AWS services and cloud infrastructure
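For context, "Snowflake native features" like Snowpark let transformations run inside the warehouse. A minimal sketch follows, shown with Snowpark for Python to keep all examples on this page in one language (the role itself centers on Java, and Snowpark also ships a Java/Scala API). Connection parameters and table names are hypothetical.

```python
# Minimal Snowpark-for-Python sketch; all identifiers are placeholders.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_

session = Session.builder.configs({
    "account": "my_account",
    "user": "dev_user",
    "password": "***",
    "warehouse": "DEV_WH",
    "database": "ANALYTICS",
    "schema": "PUBLIC",
}).create()

# DataFrame operations are pushed down to Snowflake as SQL, so the data
# never leaves the warehouse.
orders = session.table("raw_orders")
daily = (
    orders.filter(col("status") == "COMPLETE")
          .group_by(col("order_date"))
          .agg(sum_(col("amount")).alias("revenue"))
)
daily.write.save_as_table("daily_revenue", mode="overwrite")
session.close()
```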

Posted 2 weeks ago

Apply

10.0 - 20.0 years

20 - 30 Lacs

Pune

Remote

Role & responsibilities:
- Minimum 10+ years of experience developing, designing, and implementing data engineering solutions
- Collaborate with data engineers and architects to design and optimize data models for Snowflake Data Warehouse
- Optimize query performance and data storage in Snowflake by utilizing clustering, partitioning, and other optimization techniques (see the sketch below)
- Experience working on projects housed within an Amazon Web Services (AWS) cloud environment
- Experience working on projects using Tableau and DBT
- Work closely with business stakeholders to understand requirements and translate them into technical solutions
- Excellent presentation and communication skills, both written and verbal; ability to problem-solve and design in an environment with unclear requirements
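To make the clustering/partitioning bullet concrete, here is a hedged sketch of Snowflake clustering in practice: defining a clustering key so queries prune micro-partitions, then checking its effectiveness. Table and column names are hypothetical; SYSTEM$CLUSTERING_INFORMATION is a real Snowflake function.

```python
# Minimal sketch using snowflake-connector-python; identifiers are
# placeholders, not details from the posting.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="de_user", password="***", warehouse="TUNE_WH"
)
cur = conn.cursor()

# Cluster on the columns most queries filter by, so Snowflake can prune
# micro-partitions instead of scanning the whole table.
cur.execute("ALTER TABLE analytics.fact_sales CLUSTER BY (sale_date, region)")

# Inspect clustering depth/overlap to confirm the key is effective.
cur.execute(
    "SELECT SYSTEM$CLUSTERING_INFORMATION("
    "'analytics.fact_sales', '(sale_date, region)')"
)
print(cur.fetchone()[0])
conn.close()
```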

Posted 2 weeks ago

Apply

10.0 - 20.0 years

25 - 35 Lacs

Bengaluru

Remote

Role & responsibilities:
- Minimum 5+ years of experience developing, designing, and implementing data engineering solutions
- Collaborate with data engineers and architects to design and optimize data models for Snowflake Data Warehouse
- Optimize query performance and data storage in Snowflake by utilizing clustering, partitioning, and other optimization techniques
- Experience working on projects housed within an Amazon Web Services (AWS) cloud environment
- Experience working on projects using Tableau and DBT
- Work closely with business stakeholders to understand requirements and translate them into technical solutions
- Excellent presentation and communication skills, both written and verbal; ability to problem-solve and design in an environment with unclear requirements
- Certification is preferred

Posted 2 weeks ago

Apply

7.0 - 12.0 years

15 - 30 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Lead Data Engineer with DBT, Snowflake, SQL, and data warehousing expertise. Design, build, and maintain scalable data pipelines, ensure data quality, and solve complex data problems. ETL tool adaptability essential. SAP data and enterprise platform experience.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

35 - 50 Lacs

Bengaluru

Remote

-Design & develop interactive dashboards (Power BI/Tableau) -Familiarity with Azure/AWS/Snowflake -Strong in data modelling, SQL, ETL, and warehousing -Expert-level proficiency in Tableau and Power BI -Drive BI governance and performance standards

Posted 2 weeks ago

Apply

8.0 - 13.0 years

15 - 27 Lacs

Bengaluru

Hybrid

Overview - Our Hosting team manages and supports the infrastructure of all our platforms, from hardware to software to operating systems to PowerSchool products. This collaborative team helps meet the needs of an evolving technology and business model with a specialized focus on protecting customer data and keeping information secure. We work closely with product engineering teams to deliver products into production environments across Azure and AWS.

Description -
- Design, develop, and operate infrastructure-as-code automation for Terraform, K8s, and Snowflake, while also executing customer tenant migrations in Analytics and Insights
- Manage and optimize K8s clusters and workloads using Argo CD, Flux, and Helm
- Configure and support dynamic connector configurations
- Work with the Product Engineering teams on building out specifications, and provide scalable, reliable platforms for the automation/delivery platform
- UI development for internal dashboards and management applications
- CI/CD pipeline engineering and support
- Environment support, including production support
- Participate in on-call schedules

Requirements -
- Minimum of 8+ years of relevant and related work experience
- Bachelor's degree or equivalent, or equivalent years of relevant work experience; additional experience may substitute for an advanced degree
- Advanced knowledge of and experience with Kubernetes, Flux, Terraform, or equivalent technologies
- Advanced knowledge of AWS services, including EKS, EFS, RDS, ECS, etc.
- Working knowledge of monitoring, logging, and alerting tools like Grafana and Prometheus
- Strong Java, Python, and Git experience

Posted 2 weeks ago

Apply

5.0 - 10.0 years

12 - 22 Lacs

Hyderabad

Work from Office

Location: Hyderabad
Work Model (Hybrid / WFO): Hybrid, 3 days per week (Tue, Wed, Thu)
Experience Required: 5+
Employment Type (Full-Time / Part-Time / Contract): Full-time
Mandatory Skills: Advanced SQL, Tableau / Snowflake / Teradata; Python would be a plus

Job Summary & Key Responsibilities:
- Design, develop, and support dashboards and reports
- Provide analysis to support business management and executive decision-making
- Design and develop effective SQL scripts to transform and aggregate large data sets, create derived metrics, and embed them into business solutions
- An effective and good communicator with demonstrated experience handling larger and more complex business analytics projects
- 6+ years of relevant experience in the design, development, and maintenance of Business Intelligence, reporting, and data applications
- Advanced hands-on experience with Tableau or similar BI dashboard visualisation tools
- Expertise in programmatically processing large data sets from multiple source systems: integration, data mining, summarisation, and presentation of results to an exec audience
- Advanced knowledge of SQL in related technologies like Oracle and Teradata, plus performance debugging and tuning activities
- Strong understanding of dev-to-production processes, User Acceptance and Production Validation testing, waterfall and agile development, and code deployment methodologies
- Experience in core data warehousing concepts, dimensional data modeling, RDBMS, OLAP, ROLAP
- Experience with ETL tools used to automate manual processes (SQL scripts, Airflow/NiFi)
- Experience with R, Python, Hadoop

Only immediate joiners.

Thanks & Regards,
Milki Bisht - 9151206474
Email id: milki.bisht@nlbtech.in

Posted 2 weeks ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Gurugram

Work from Office

About the Opportunity
Job Type: Permanent. Application Deadline: 31 July 2025

Title: Senior Analyst - Data Science
Department: Enterprise Data & Analytics
Location: Gurgaon
Reports To: Gaurav Shekhar
Level: Data Scientist 4

About your team: Join the Enterprise Data & Analytics team, collaborating across Fidelity's global functions to empower the business with data-driven insights that unlock business opportunities, enhance client experiences, and drive strategic decision-making.

About your role: As a key contributor within the Enterprise Data & Analytics team, you will lead the development of machine learning and data science solutions for Fidelity Canada. This role is designed to turn advanced analytics into real-world impact: driving growth, enhancing client experiences, and informing high-stakes decisions. You'll design, build, and deploy ML models on cloud and on-prem platforms, leveraging tools like AWS SageMaker, Snowflake, Adobe, Salesforce, etc. Collaborating closely with business stakeholders, data engineers, and technology teams, you'll translate complex challenges into scalable AI solutions. You'll also champion the adoption of cloud-based analytics, contribute to MLOps best practices, and support the team through mentorship and knowledge sharing. This is a high-impact role for a hands-on problem solver who thrives on ownership, innovation, and seeing their work directly influence strategic outcomes.

About you: You have 4-7 years of experience in the data science domain, with a strong track record of delivering advanced machine learning solutions for business. You're skilled in developing models for classification, forecasting, and recommender systems, and hands-on with frameworks like Scikit-learn, TensorFlow, or PyTorch. You bring deep expertise in developing and deploying models on AWS SageMaker, strong business problem-solving abilities, and familiarity with emerging GenAI trends. A background in engineering, mathematics, or economics from a Tier 1 institution is preferred.

For starters, we'll offer you a comprehensive benefits package. We'll value your wellbeing and support your development. And we'll be as flexible as we can about where and when you work, finding a balance that works for all of us. It's all part of our commitment to making you feel motivated by the work you do and happy to be part of our team.
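As a small illustration of the modeling work this role describes, here is a minimal scikit-learn classification sketch (one of the frameworks the posting names). The dataset is synthetic; in the role itself, training and deployment would typically happen on AWS SageMaker against warehouse-sourced data.

```python
# Minimal sketch with synthetic stand-in data; nothing here reflects
# Fidelity's actual features or models.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic placeholder for, e.g., client propensity data.
X, y = make_classification(n_samples=5000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = GradientBoostingClassifier(random_state=42)
model.fit(X_train, y_train)

# AUC is a common evaluation choice for classification/propensity models.
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Test AUC: {auc:.3f}")
```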

Posted 2 weeks ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Bengaluru

Work from Office

About the Opportunity
Job Type: Permanent. Application Deadline: 31 July 2025

Title: Senior Test Analyst
Department: Corporate Enablers Technology
Location: Gurgaon / Bengaluru (Bangalore), India
Reports To: Project Manager
Level: 3

About Fidelity International: We're proud to have been helping our clients build better financial futures for over 50 years. How have we achieved this? By working together, and supporting each other, all over the world. So, join our Corporate Enablers Technology team and feel like you're part of something bigger.

About your team: Corporate Enablers Technology encompasses four distinct portfolios: Chief Finance Officer (CFO), Chief People Officer (CPO), General Counsel (GC), and Corporate Property Services (CPS). We provide the underlying technology for managing, for example, Fidelity's core finance, HR, procurement, legal, risk and compliance, statutory reporting, real estate management, global security, and business continuity processes.

About your role: This is a role for an experienced functional and automation tester who understands the overall architecture and the systems and integration points involved. The candidate will work closely with the test lead and the development team to write functional test scripts and identify the areas that are critical from an end-to-end testing perspective. We need a senior candidate who has worked on medium-scale projects and programs in the past and can therefore add value to the overall program. The candidate will also work extensively on the automation of test solutions, and so should have excellent knowledge of the technologies and tools in the area of test automation.

Key Responsibilities:
- Design, develop, and execute automated test scripts to improve efficiency and coverage
- Perform functional, UI, and database testing to ensure the software meets the specified requirements
- Identify, document, and track software defects and ensure their resolution
- Collaborate with cross-functional teams to ensure quality throughout the software development lifecycle
- Develop and maintain automation frameworks and tools
- Continuously improve QA processes and methodologies to enhance product quality
- Participate in code reviews and provide feedback on testability and quality
- Ensure compliance with industry standards and best practices

Essential Skills:
- Hands-on experience in API automation testing (SOAP and REST) (see the sketch at the end of this listing)
- Experience in mobile testing (iOS and Android) using Appium or a similar tool
- Good hands-on experience in programming languages like Java and Python
- Experience in UI automation using Java-Selenium or equivalent technologies
- Familiarity with continuous integration and continuous deployment (CI/CD) pipelines
- Proven experience as an ETL Tester or Quality Assurance Engineer with a focus on Snowflake
- Solid understanding of SQL for data validation and querying within Snowflake
- Experience in job scheduling with tools like Control-M
- Experience in UNIX, with strong knowledge of Unix commands and tools like PuTTY

Soft Skills:
- Sound analytical and debugging skills
- Innovative and enthusiastic about technology, and using it appropriately to solve problems
- A can-do attitude and a natural curiosity to find better solutions
- Can work as part of a small team in an independent role
- Proven ability to work well under pressure and in a team environment
- Ability to work in a fast-paced, dynamic environment
- Attention to detail and a commitment to quality

For starters, we'll offer you a comprehensive benefits package. We'll value your wellbeing and support your development. And we'll be as flexible as we can about where and when you work, finding a balance that works for all of us. It's all part of our commitment to making you feel motivated by the work you do and happy to be part of our team.
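For context on the API automation skills listed above, here is a minimal hedged sketch of REST API tests written with pytest and requests. The endpoint, resource names, and expected fields are hypothetical placeholders, not a real Fidelity API.

```python
# Minimal pytest + requests sketch; the service under test is imaginary.
import requests

BASE_URL = "https://api.example.com/v1"  # placeholder endpoint


def test_get_account_returns_expected_fields():
    resp = requests.get(f"{BASE_URL}/accounts/12345", timeout=10)
    assert resp.status_code == 200
    body = resp.json()
    # Functional check: the contract fields downstream code relies on exist.
    for field in ("accountId", "currency", "balance"):
        assert field in body


def test_create_account_rejects_empty_payload():
    # Negative test: the API should reject a payload missing required fields.
    resp = requests.post(f"{BASE_URL}/accounts", json={}, timeout=10)
    assert resp.status_code in (400, 422)
```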

Posted 2 weeks ago

Apply

6.0 - 9.0 years

12 - 18 Lacs

Bengaluru

Work from Office

Role & responsibilities: Strong Talend developer with Snowflake and SQL skills.

Posted 2 weeks ago

Apply

1.0 - 4.0 years

2 - 5 Lacs

Gurugram

Work from Office

Location: Bangalore/Hyderabad/Pune
Experience level: 8+ Years

About the Role: We are looking for a technical and hands-on Lead Data Engineer to help drive the modernization of our data transformation workflows. We currently rely on legacy SQL scripts orchestrated via Airflow, and we are transitioning to a modular, scalable, CI/CD-driven DBT-based data platform. The ideal candidate has deep experience with DBT and modern data stack design, and has previously led similar migrations, improving code quality, lineage visibility, performance, and engineering best practices.

Key Responsibilities:
- Lead the migration of legacy SQL-based ETL logic to DBT-based transformations (see the sketch after this listing)
- Design and implement a scalable, modular DBT architecture (models, macros, packages)
- Audit and refactor legacy SQL for clarity, efficiency, and modularity
- Improve CI/CD pipelines for DBT: automated testing, deployment, and code quality enforcement
- Collaborate with data analysts, platform engineers, and business stakeholders to understand current gaps and define future data pipelines
- Own Airflow orchestration redesign where needed (e.g., DBT Cloud/API hooks or airflow-dbt integration)
- Define and enforce coding standards, review processes, and documentation practices
- Coach junior data engineers on DBT and SQL best practices
- Provide lineage and impact analysis improvements using DBT's built-in tools and metadata

Must-Have Qualifications:
- 8+ years of experience in data engineering
- Proven success in migrating legacy SQL to DBT, with visible results
- Deep understanding of DBT best practices, including model layering, Jinja templating, testing, and packages
- Proficiency in SQL performance tuning, modular SQL design, and query optimization
- Experience with Airflow (Composer, MWAA), including DAG refactoring and task orchestration
- Hands-on experience with modern data stacks (e.g., Snowflake, BigQuery)
- Familiarity with data testing and CI/CD for analytics workflows
- Strong communication and leadership skills; comfortable working cross-functionally

Nice-to-Have:
- Experience with DBT Cloud or DBT Core integrations with Airflow
- Familiarity with data governance and lineage tools (e.g., dbt docs, Alation)
- Exposure to Python (for custom Airflow operators/macros or utilities)
- Previous experience mentoring teams through modern data stack transitions
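As referenced in the responsibilities above, here is a hedged sketch of what migrated transformation logic can look like. Most such migrations land on DBT's SQL models with Jinja and ref(); this sketch uses a DBT Python model instead, purely to keep the examples on this page in one language. The project layout, model, and column names are hypothetical.

```python
# models/marts/daily_revenue.py in a hypothetical dbt project targeting
# Snowflake. dbt discovers the model() entry point automatically.

def model(dbt, session):
    # Materialization is configured in code, much as a SQL model would do
    # in its config block.
    dbt.config(materialized="table")

    # ref() gives lineage-tracked access to an upstream staging model,
    # replacing the hard-coded table names of legacy SQL scripts.
    orders = dbt.ref("stg_orders")

    # On Snowflake this is a Snowpark DataFrame, so the transformation is
    # pushed down to the warehouse rather than run locally.
    return (
        orders.filter(orders["status"] == "complete")
              .group_by("order_date")
              .agg({"amount": "sum"})
    )
```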

Posted 2 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office

We're looking for a Product Engineer who thrives at the intersection of engineering and storytelling. In this role, you'll be responsible for helping data engineers and developers deeply understand what makes Firebolt unique and how to use it to build sub-second analytics experiences at scale. You'll bring a strong technical foundation and real-world experience building analytics or data-intensive applications. You'll use that expertise to craft high-quality content and experiences that resonate with a deeply technical audience, across formats like blog posts, demos, videos, documentation, and conference talks. This role is ideal for someone who wants to stay close to the product and technology while shaping how it is experienced and understood by the outside world. You'll work cross-functionally with product, engineering, marketing, and customer-facing teams to translate technical capabilities into clear, compelling narratives.

Requirements:
- 5+ years in engineering, solutions engineering, or solution architect roles
- Proven experience building production-grade analytics systems, data pipelines, or data applications
- Strong understanding of modern data infrastructure, with hands-on experience using cloud data warehouses and/or data lakes
- Fluent in SQL, and comfortable with performance optimization and data modeling
- Excellent written and verbal communication skills, with the ability to translate complex technical topics into engaging content
- Experience creating developer-facing content such as technical blogs, demo apps, product tutorials, or internal enablement
- Self-starter with strong project management skills and the ability to lead initiatives from concept to execution
- Collaborative team player who enjoys working across disciplines and contributing to shared goals
- Curious and connected: knows what's happening in the industry, what users are building, and what tools they love (or hate)

Bonus if you have:
- Prior experience working in startups or fast-paced product organizations
- Background in AI, machine learning, or dev tools
- Experience speaking at industry conferences, running webinars, or building video tutorials
- Contributions to open-source projects or active participation in developer/data communities

Posted 2 weeks ago

Apply

6.0 - 11.0 years

5 - 15 Lacs

Tirupati

Work from Office

About the Role: We are seeking an experienced and driven Technical Project Manager / Technical Delivery Manager to lead complex, high-impact data analytics and data science projects for global clients. This role demands a unique blend of project management expertise, technical depth in cloud and data technologies, and the ability to collaborate across cross-functional teams. You will be responsible for ensuring the successful delivery of data platforms, data products, and enterprise analytics solutions that drive business value.

Key Responsibilities

Project & Delivery Management:
- Lead the full project lifecycle for enterprise-scale data platforms, including requirement gathering, development, testing, deployment, and post-production support
- Own the delivery of Data Warehousing and Data Lakehouse solutions on cloud platforms (Azure, AWS, or GCP)
- Prepare and maintain detailed project plans (Microsoft Project Plan) and align them with the Statement of Work (SOW) and client expectations
- Utilize hybrid project methodologies (Agile + Waterfall) for managing scope, budget, and timelines
- Monitor key project KPIs (e.g., SLA, MTTR, MTTA, MTBF) and ensure adherence using tools like ServiceNow

Data Platform & Architecture Oversight:
- Collaborate with data engineers and architects to guide the implementation of scalable Data Warehouses (e.g., Redshift, Synapse) and Data Lakehouse architectures (e.g., Databricks, Delta Lake)
- Ensure data platform solutions meet performance, security, and governance standards
- Understand and help manage data integration pipelines, ETL/ELT processes, and BI/reporting requirements

Client Engagement & Stakeholder Management:
- Serve as the primary liaison for US/UK clients; manage regular status updates, escalation paths, and expectations across stakeholders
- Conduct WSRs, MSRs, and QBRs with clients and internal teams to drive transparency and performance reviews
- Facilitate team meetings, highlight risks or blockers, and ensure consistent stakeholder alignment

Technical Leadership & Troubleshooting:
- Provide hands-on support and guidance in data infrastructure troubleshooting using tools like Splunk, AppDynamics, and Azure Monitor
- Lead incident, problem, and change management processes with data platform operations in mind
- Identify automation opportunities and propose technical process improvements across data pipelines and workflows

Governance, Documentation & Compliance:
- Create and maintain SOPs, runbooks, implementation documents, and architecture diagrams
- Manage project compliance related to data privacy, security, and internal/external audits
- Initiate and track Change Requests (CRs) and look for revenue expansion opportunities with clients

Continuous Improvement & Innovation:
- Participate in and lead at least three internal process optimization or innovation initiatives annually
- Work with engineering, analytics, and DevOps teams to improve CI/CD pipelines and data delivery workflows
- Monitor production environments to reduce deployment issues and improve time-to-insight

Must-Have Qualifications:
- 10+ years of experience in technical project delivery, with a strong focus on data analytics, BI, and cloud data platforms
- Strong hands-on experience with SQL and data warehouse technologies like Snowflake, Synapse, Redshift, BigQuery, etc.
- Proven experience delivering Data Warehouse and Data Lakehouse solutions
- Familiarity with tools such as Redshift, Synapse, BigQuery, Databricks, and Delta Lake
- Strong cloud knowledge with Azure, AWS, or GCP
- Proficiency in project management tools like Microsoft Project Plan (MPP), JIRA, Confluence, and ServiceNow
- Expertise in Agile project methodologies
- Excellent communication skills, both verbal and written, with no MTI or grammatical errors
- Hands-on experience working with global delivery models (onshore/offshore)

Preferred Qualifications:
- PMP or Scrum Master certification
- Understanding of ITIL processes and DataOps practices
- Experience managing end-to-end cloud data transformation projects
- Experience in project estimation, proposal writing, and RFP handling

Desired Skills & Competencies:
- Deep understanding of SDLC, data architecture, and data governance principles
- Strong leadership, decision-making, and conflict-resolution abilities
- High attention to detail and accuracy in documentation and reporting
- Ability to handle multiple concurrent projects in a fast-paced, data-driven environment
- A passion for data-driven innovation and business impact

Why Join Us?
- Be part of a collaborative and agile team driving cutting-edge AI and data engineering solutions
- Work on impactful projects that make a difference across industries
- Opportunities for professional growth and continuous learning
- Competitive salary and benefits package

Posted 2 weeks ago

Apply

12.0 - 16.0 years

40 - 45 Lacs

Pune

Remote

What You'll Do

Job Summary: We are looking for a Senior Technical Lead with expertise in designing SaaS applications and integrations for scale, and with full-stack development experience, to join our globally distributed Electronic Invoicing & Live Reporting (ELR) team and help us become a global leader in the e-invoicing market, and part of every transaction in the world! We have a phenomenal team working in an open, collaborative environment that makes taxes and compliance less taxing to deal with. It will be up to you and the team to convert the product vision and requirements into a finished product. You will report to a Senior Engineering Manager, work as an individual contributor without managerial responsibilities, and work remotely in India.

What Your Responsibilities Will Be:
- Avalara e-Invoicing and Platforms: dig into our multi-patented, cloud-native Avalara product suite. We are building a flexible platform that can handle any opportunity to create and submit electronic invoices and live reporting processes for any industry in any geography. Work with your team to create that while maximizing performance, scalability, and reliability, and while making it 'oh-so-simple' to operate.
- Design with the vision in mind
- Code, review, commit
- Create industry-leading products
- Automation vs. people power: computers are great for process automation, but there's a limit to what they can do. You and the team will tackle the unique challenges at the intersection of software and automation.
- Provide technical guidance and mentor the engineers in the team

What You'll Need To Be Successful:
- Experience delivering high-quality features to production, with expertise in service-oriented architectures, microservices, and web application development
- Understanding of system performance trade-offs, load balancing, and high-availability engineering
- We're looking for a full-stack developer with expertise in Java, Node.js, and Python, so adaptability is valued
- Experience with Java, React, microservices, web services, and REST APIs; we also use MySQL and PostgresDB as our primary transactional RDBMS
- We're expanding our cloud tech stack with Redis, Snowflake, Prometheus, Kafka, Kinesis, and Grafana
- We use Docker for containerization, Kubernetes for orchestration, and AWS, though Azure and GCP backgrounds are welcomed
- Collaborate with other teams to solve challenges and improve code to increase application efficiency
- Prior experience working in e-invoicing
- A Bachelor's in Computer Science, Engineering, or a related field is desirable
- 12+ years of work experience required

Posted 2 weeks ago

Apply

3.0 - 5.0 years

5 - 8 Lacs

Bengaluru

Remote

As a Senior Azure Data Engineer, your responsibilities will include:
- Building scalable data pipelines using Databricks and PySpark (see the sketch below)
- Transforming raw data into usable business insights
- Integrating Azure services like Blob Storage, Data Lake, and Synapse Analytics
- Deploying and maintaining machine learning models using MLlib or TensorFlow
- Executing large-scale Spark jobs with performance tuning on Spark Pools
- Leveraging Databricks Notebooks and managing workflows with MLflow

Qualifications:
- Bachelor's/Master's in Computer Science, Data Science, or equivalent
- 7+ years in Data Engineering, with 3+ years in Azure Databricks
- Strong hands-on skills in PySpark, Spark SQL, RDDs, Pandas, NumPy, and Delta Lake
- Azure ecosystem: Data Lake, Blob Storage, Synapse Analytics
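As a sketch of the first bullet above (turning raw data into business insights with Databricks and PySpark), here is a minimal hedged example targeting Delta Lake. The storage path, column names, and output table are hypothetical placeholders.

```python
# Minimal PySpark sketch, assuming a Databricks-style environment with
# Delta Lake available. All names and paths are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("raw_to_business_insights").getOrCreate()

# Ingest raw events from an Azure Data Lake landing zone.
raw = spark.read.json("abfss://landing@myaccount.dfs.core.windows.net/events/")

# Cleanse and aggregate into a business-ready daily summary.
daily = (
    raw.filter(F.col("event_type") == "purchase")
       .withColumn("event_date", F.to_date("event_timestamp"))
       .groupBy("event_date", "product_id")
       .agg(
           F.count("*").alias("purchases"),
           F.sum("amount").alias("revenue"),
       )
)

# Write as a Delta table so downstream Synapse/BI consumers get ACID reads.
daily.write.format("delta").mode("overwrite").saveAsTable("gold.daily_sales")
```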

Posted 2 weeks ago

Apply

3.0 - 5.0 years

4 - 6 Lacs

Chennai, Bengaluru

Work from Office

Job Overview: We are seeking a highly skilled Technical Data Analyst for a remote contract position (6 to 12 months) to help build a single source of truth for our high-volume direct-to-consumer accounting and financial data warehouse. You will work closely with Finance & Accounting teams and play a pivotal role in dashboard creation, data transformation, and migration from Snowflake to Databricks.

Key Responsibilities:

1. Data Analysis & Reporting
- Develop month-end accounting and tax dashboards using SQL in Snowflake (Snowsight)
- Migrate and transition reports/dashboards to Databricks
- Gather, analyze, and transform business requirements from finance/accounting stakeholders into data products

2. Data Transformation & Aggregation
- Build transformation pipelines in Databricks to support balance sheet look-forward views
- Maintain data accuracy and consistency throughout the Snowflake-to-Databricks migration
- Partner with Data Engineering to optimize pipeline performance

3. ERP & Data Integration
- Support integration of financial data with NetSuite ERP
- Validate transformed data to ensure correct ingestion and mapping into ERP systems

4. Ingestion & Data Ops
- Work with Fivetran for ingestion and resolve any pipeline or data accuracy issues
- Monitor data workflows and collaborate with engineering teams on troubleshooting

Required Skills & Qualifications:
- 5+ years of experience as a Data Analyst (preferably in the Finance/Accounting domain)
- Strong in SQL, with proven experience in Snowflake and Databricks
- Experience building financial dashboards (month-end close, tax reporting, balance sheets)
- Understanding of financial/accounting data: GL, journal entries, balance sheet, income statements
- Familiarity with Fivetran or similar data ingestion tools
- Experience with data transformation in a cloud environment
- Strong communication and stakeholder management skills

Nice to have: Experience working with NetSuite ERP

Location: Delhi NCR, Bangalore, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad

Posted 2 weeks ago

Apply

5.0 - 10.0 years

22 - 27 Lacs

Navi Mumbai

Work from Office

As an Architect at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Architect, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements (see the sketch below)
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Strong understanding of data lake approaches, industry standards, and industry best practices
- Detail-level understanding of the Hadoop framework and ecosystem, MapReduce, and data on containers (data in OpenShift)
- Applies individual experience and competency and IBM's architectural thinking model to analyzing client IT systems
- Experience with relational SQL, Big Data, etc.
- Experience with cloud-native platforms such as AWS, Azure, Google, and IBM Cloud, or cloud-native data platforms like Snowflake

Preferred technical and professional experience:
- Knowledge of MS Azure Cloud
- Experience in Unix shell scripting and Python
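For the enterprise search responsibility noted above, here is a minimal hedged sketch using the official Python Elasticsearch client (8.x API): index a document, then run a full-text query. The endpoint, index name, and document are hypothetical placeholders.

```python
# Minimal sketch, assuming the elasticsearch 8.x Python client and a local
# cluster; all data shown is made up for illustration.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # placeholder endpoint

# Index a document into a hypothetical "claims" index.
es.index(
    index="claims",
    id="1",
    document={"claim_id": 1, "status": "open",
              "notes": "water damage to roof"},
)

# Full-text search over the notes field.
resp = es.search(index="claims", query={"match": {"notes": "roof damage"}})
for hit in resp["hits"]["hits"]:
    print(hit["_score"], hit["_source"])
```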

Posted 2 weeks ago

Apply