
2379 Snowflake Jobs - Page 25

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

10.0 - 15.0 years

22 - 37 Lacs

Bengaluru

Work from Office

Who We Are
At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward – always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities.

The Role
Are you ready to dive headfirst into the captivating world of data engineering at Kyndryl? As a Data Engineer, you'll be the visionary behind our data platforms, crafting them into powerful tools for decision-makers. Your role? Ensuring a treasure trove of pristine, harmonized data is at everyone's fingertips.

As an AWS Data Engineer at Kyndryl, you will be responsible for designing, building, and maintaining scalable, secure, and high-performing data pipelines using AWS cloud-native services. This role requires extensive hands-on experience with both real-time and batch data processing, expertise in cloud-based ETL/ELT architectures, and a commitment to delivering clean, reliable, and well-modeled datasets.

Key Responsibilities:
- Design and develop scalable, secure, and fault-tolerant data pipelines utilizing AWS services such as Glue, Lambda, Kinesis, S3, EMR, Step Functions, and Athena (see the sketch after this listing).
- Create and maintain ETL/ELT workflows to support both structured and unstructured data ingestion from various sources, including RDBMS, APIs, SFTP, and streaming.
- Optimize data pipelines for performance, scalability, and cost-efficiency.
- Develop and manage data models, data lakes, and data warehouses on AWS platforms (e.g., Redshift, Lake Formation).
- Collaborate with DevOps teams to implement CI/CD and infrastructure as code (IaC) for data pipelines using CloudFormation or Terraform.
- Ensure data quality, validation, lineage, and governance through tools such as AWS Glue Data Catalog and AWS Lake Formation.
- Work in concert with data scientists, analysts, and application teams to deliver data-driven solutions.
- Monitor, troubleshoot, and resolve issues in production pipelines.
- Stay abreast of AWS advancements and recommend improvements where applicable.

Your Future at Kyndryl
Every position at Kyndryl offers a way forward to grow your career. We have opportunities that you won’t find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here.

Who You Are
You’re good at what you do and possess the required experience to prove it. However, equally important: you have a growth mindset and are keen to drive your own personal and professional development. You are customer-focused – someone who prioritizes customer success in their work. And finally, you’re open and borderless – naturally inclusive in how you work with others.

Required Skills and Experience
- Bachelor’s or master’s degree in computer science, engineering, or a related field
- Over 8 years of experience in data engineering
- More than 3 years of experience with the AWS data ecosystem
- Strong experience with Java, PySpark, SQL, and Python
- Proficiency in AWS services: Glue, S3, Redshift, EMR, Lambda, Kinesis, CloudWatch, Athena, Step Functions
- Familiarity with data modelling concepts, dimensional models, and data lake architectures
- Experience with CI/CD, GitHub Actions, CloudFormation/Terraform
- Understanding of data governance, privacy, and security best practices
- Strong problem-solving and communication skills

Preferred Skills and Experience
- Experience working as a Data Engineer and/or in cloud modernization
- Experience with AWS Lake Formation and Data Catalog for metadata management
- Knowledge of Databricks, Snowflake, or BigQuery for data analytics
- AWS Certified Data Engineer or AWS Certified Solutions Architect is a plus
- Strong problem-solving and analytical thinking
- Excellent communication and collaboration abilities
- Ability to work independently and in agile teams
- A proactive approach to identifying and addressing challenges in data workflows

Being You
Diversity is a whole lot more than what we look like or where we come from; it’s how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we’re not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you – and everyone next to you – the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That’s the Kyndryl Way.

What You Can Expect
With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees and support you and your family through the moments that matter – wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you; we want you to succeed so that together, we will all succeed.

Get Referred!
If you know someone who works at Kyndryl, when asked ‘How Did You Hear About Us’ during the application process, select ‘Employee Referral’ and enter your contact's Kyndryl email address.
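
For a flavor of the Glue-based pipeline work this posting describes, here is a minimal, illustrative PySpark sketch of a Glue job. The catalog database ("raw_db"), table ("orders"), and S3 path are hypothetical placeholders, not details from the posting, and the script assumes it runs inside a Glue job environment.

import sys
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from the Glue Data Catalog, drop rows with null keys,
# and write Parquet back to a curated S3 location.
orders = glue_context.create_dynamic_frame.from_catalog(
    database="raw_db", table_name="orders"
)
cleaned = orders.filter(lambda row: row["order_id"] is not None)
glue_context.write_dynamic_frame.from_options(
    frame=cleaned,
    connection_type="s3",
    connection_options={"path": "s3://example-curated-bucket/orders/"},
    format="parquet",
)
job.commit()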

Posted 1 week ago

Apply

7.0 - 11.0 years

15 - 30 Lacs

Noida

Remote

Job Title: IoT Solutions Architect (MQTT/HiveMQ) Consultant
Location: 100% Remote
Notes: Consumer goods and manufacturing experience are highly preferred. Comfortable working in the US time zone.

Job Description
The consultant will be working on a new MQTT/HiveMQ setup for an IoT smart manufacturing project. Cloud platform: Azure. Must have 3-4 years of experience in manufacturing solutions and a minimum of 2 years of experience with HiveMQ, SQL, and cloud/edge integrations.

Required skills:
- Expertise in MQTT Protocols: Deep understanding of MQTT 3.1.1 and MQTT 5.0, including advanced features like QoS levels, retained messages, session expiry, and shared subscriptions (see the sketch after this listing).
- HiveMQ Platform Proficiency: Hands-on experience with HiveMQ broker setup, configuration, clustering, and deployment (on-premises, cloud, or Kubernetes).
- Edge-to-Cloud Integration: Ability to design and implement solutions that bridge OT (Operational Technology) and IT systems using MQTT.
- Sparkplug B Knowledge: Familiarity with Sparkplug B for contextual MQTT data in IIoT environments.
- Enterprise Integration: Experience with HiveMQ Enterprise Extensions (e.g., Kafka, Google Cloud Pub/Sub, AWS Kinesis, PostgreSQL, MongoDB, Snowflake).
- Security Implementation: Knowledge of securing MQTT deployments using the HiveMQ Enterprise Security Extension (authentication, authorization, TLS, etc.).
- Custom Extension Development: Ability to develop and deploy custom HiveMQ extensions using the open-source SDK.

Development & Scripting
- MQTT Client Libraries: Proficiency in using MQTT client libraries (e.g., Eclipse Paho, HiveMQ MQTT Client) in languages like Java, Python, or JavaScript.
- MQTT CLI: Familiarity with the MQTT Command Line Interface for testing and debugging.
- Scripting & Automation: Ability to automate deployment and testing using tools like HiveMQ Swarm.

Soft Skills & Experience
- IoT/IIoT Project Experience: Proven track record in implementing MQTT-based IoT solutions.
- Problem Solving & Debugging: Strong analytical skills to troubleshoot MQTT communication and broker issues.
- Communication & Documentation: Ability to clearly document architecture, configurations, and best practices for clients.

Interested candidates can apply at: dsingh15@fcsltd.com
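
As an illustration of the MQTT features named above (QoS levels, retained messages, shared subscriptions), here is a small sketch using the Eclipse Paho Python client; the broker host and topic names are hypothetical placeholders, and shared subscriptions require an MQTT 5-capable broker such as HiveMQ.

import paho.mqtt.client as mqtt

def on_message(client, userdata, msg):
    print(f"{msg.topic}: {msg.payload.decode()} (qos={msg.qos})")

client = mqtt.Client()
client.on_message = on_message
client.connect("broker.example.com", 1883)

# Retained message: the broker re-delivers the last value to late subscribers.
client.publish("factory/line1/temperature", payload="23.5", qos=1, retain=True)

# Shared subscription ($share/<group>/<filter>): the broker load-balances
# messages across all clients subscribed in the "analytics" group.
client.subscribe("$share/analytics/factory/line1/#", qos=1)
client.loop_forever()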

Posted 1 week ago

Apply

7.0 - 12.0 years

10 - 20 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Skill: Data Engineer
Experience: 7+ Years
Location: Warangal, Bangalore, Chennai, Hyderabad, Mumbai, Pune, Delhi, Noida, Gurgaon, Kolkata, Jaipur, Jodhpur
Notice Period: Immediate - 15 Days

Job Description:
- Design & Build Data Pipelines: Develop scalable ETL/ELT workflows to ingest, transform, and load data into Snowflake using SQL, Python, or data integration tools.
- Data Modeling: Create and optimize Snowflake schemas, tables, views, and materialized views to support business analytics and reporting needs (see the sketch after this listing).
- Performance Optimization: Tune Snowflake compute resources (warehouses), optimize query performance, and manage clustering and partitioning strategies.
- Additional areas: data quality & validation, security & access control, automation & CI/CD, monitoring & troubleshooting, and documentation.
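
To illustrate the Snowflake modeling and clustering work listed above, here is a minimal sketch using the snowflake-connector-python package; the account, credentials, and object names ("SALES_DB", "fact_orders") are placeholders, not details from the listing.

import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="ETL_WH", database="SALES_DB", schema="ANALYTICS",
)
cur = conn.cursor()

# Clustering key so Snowflake can prune micro-partitions on date filters.
cur.execute("""
    CREATE TABLE IF NOT EXISTS fact_orders (
        order_id NUMBER, order_date DATE, region STRING, amount NUMBER(12,2)
    ) CLUSTER BY (order_date)
""")

# Materialized view to pre-aggregate a common reporting query.
cur.execute("""
    CREATE MATERIALIZED VIEW IF NOT EXISTS mv_daily_sales AS
    SELECT order_date, region, SUM(amount) AS total_amount
    FROM fact_orders GROUP BY order_date, region
""")
conn.close()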

Posted 1 week ago

Apply

5.0 - 10.0 years

16 - 20 Lacs

Ahmedabad, Chennai, Bengaluru

Work from Office

Role & responsibilities: Talend, Snowflake

Posted 1 week ago

Apply

7.0 - 12.0 years

15 - 27 Lacs

Pune

Hybrid

Notice Period: Immediate joiner

Responsibilities
- Lead, develop and support analytical pipelines to acquire, ingest and process data from multiple sources
- Debug, profile and optimize integrations and ETL/ELT processes
- Design and build data models to conform to our data architecture
- Collaborate with various teams to deliver effective, high-value reporting solutions by leveraging an established DataOps delivery methodology
- Continually recommend and implement process improvements and tools for data collection, analysis, and visualization
- Address production support issues promptly, keeping stakeholders informed of status and resolutions
- Partner closely with on- and offshore technical resources
- Provide on-call support outside normal business hours as needed
- Provide status updates to stakeholders; identify obstacles and seek assistance with enough lead time to ensure on-time delivery
- Demonstrate technical ability, thoroughness, and accuracy in all assignments
- Document and communicate proper operations, standards, policies, and procedures
- Keep abreast of new tools and technologies related to our enterprise data architecture
- Foster a positive work environment by promoting teamwork and open communication

Skills/Qualifications
- Bachelor's degree in computer science with a focus on data engineering preferred
- 6+ years of experience in data warehouse development, building and managing data pipelines in cloud computing environments
- Strong proficiency in SQL and Python
- Experience with Azure cloud services, including Azure Data Lake Storage, Data Factory, and Databricks
- Expertise in Snowflake or similar cloud warehousing technologies
- Experience with GitHub, including GitHub Actions
- Familiarity with data visualization tools, such as Power BI or Spotfire
- Excellent written and verbal communication skills
- Strong team player with interpersonal skills to interact at all levels
- Ability to translate technical information for both technical and non-technical audiences
- Proactive mindset with a sense of urgency and initiative
- Adaptability to changing priorities and needs

If you are interested, share your updated resume at recruit5@focusonit.com. Please also share this message across your network.

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

As a highly motivated and experienced Data Engineer, you will be responsible for designing, developing, and implementing solutions that enable seamless data integration across multiple cloud platforms. Your expertise in data lake architecture, Iceberg tables, and cloud compute engines like Snowflake, BigQuery, and Athena will ensure efficient and reliable data access for various downstream applications.

Your key responsibilities will include collaborating with stakeholders to understand data needs and define schemas, and designing and implementing data pipelines for ingesting, transforming, and storing data. You will also develop data transformation logic to make Iceberg tables compatible with the data access requirements of Snowflake, BigQuery, and Athena, and design and implement solutions for seamless data transfer and synchronization across different cloud platforms. Ensuring data consistency and quality across the data lake and target cloud environments will be crucial in your role. Additionally, you will analyze data patterns and identify performance bottlenecks in data pipelines, implement data optimization techniques to improve query performance and reduce data storage costs, and monitor data lake health to proactively address potential issues. Collaboration and communication with architects, leads, and other stakeholders to ensure data quality meets specific requirements will also be an essential part of your role.

To be successful in this position, you should have a minimum of 4+ years of experience as a Data Engineer, strong hands-on experience with data lake architectures and technologies, proficiency in SQL and scripting languages, and experience with data governance and security best practices. Excellent problem-solving and analytical skills, strong communication and collaboration skills, and familiarity with cloud-native data tools and services are also required. Additionally, certifications in relevant cloud technologies will be beneficial.

In return, GlobalLogic offers exciting projects in industries like High-Tech, communication, media, healthcare, retail, and telecom. You will have the opportunity to collaborate with a diverse team of highly talented individuals in an open, laidback environment. Work-life balance is prioritized with flexible work schedules, opportunities to work from home, and paid time off and holidays. Professional development opportunities include communication skills training, stress management programs, professional certifications, and technical and soft skill trainings. GlobalLogic provides competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS (National Pension Scheme), extended maternity leave, annual performance bonuses, and referral bonuses. Fun perks such as sports events, cultural activities, food at subsidized rates, corporate parties, dedicated GL Zones, rooftop decks, and discounts for popular stores and restaurants are also part of the vibrant office culture at GlobalLogic.

About GlobalLogic: GlobalLogic is a leader in digital engineering, helping brands design and build innovative products, platforms, and digital experiences for the modern world. By integrating experience design, complex engineering, and data expertise, GlobalLogic helps clients accelerate their transition into tomorrow's digital businesses. Operating under Hitachi, Ltd., GlobalLogic contributes to driving innovation through data and technology for a sustainable society with a higher quality of life.
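
As one concrete illustration of the cross-engine data access this role describes, here is a hedged sketch of querying an Iceberg table through Athena with boto3; the region, database, table, and results bucket are hypothetical placeholders.

import boto3

athena = boto3.client("athena", region_name="us-east-1")

# Athena (engine v3) can query Iceberg tables registered in the Glue catalog.
resp = athena.start_query_execution(
    QueryString="SELECT event_date, COUNT(*) AS n FROM lake_db.events GROUP BY event_date",
    QueryExecutionContext={"Database": "lake_db"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
print("Query execution id:", resp["QueryExecutionId"])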

Posted 1 week ago

Apply

6.0 - 12.0 years

0 Lacs

Karnataka

On-site

Your role as a Supervisor at Koch Global Services India (KGSI) will involve being part of a global team dedicated to creating new solutions and enhancing existing ones for Koch Industries. With over 120,000 employees worldwide, Koch Industries is a privately held organization engaged in manufacturing, trading, and investments. KGSI is being established in India to expand its IT operations and serve as an innovation hub within the IT function. This position offers the chance to join at the inception of KGSI and play a pivotal role in its development over the coming years. You will collaborate closely with international colleagues, providing valuable global exposure to the team.

In this role, you will lead a team responsible for developing innovative solutions for KGS and its customers. You will oversee the performance and growth of data engineers at KGSI, ensuring the delivery of application solutions. Collaboration with global counterparts will be essential for enterprise-wide delivery success. Your responsibilities will include mentoring team members, providing feedback, and coaching them for their professional growth. Additionally, you will focus on understanding individual career aspirations, addressing challenges, and facilitating relevant training opportunities. Ensuring compensation aligns with Koch's philosophy and maintaining effective communication with HR will be key aspects of your role.

Timely delivery of projects is crucial, and you will be responsible for identifying and addressing delays proactively. By fostering knowledge sharing and best practices within the team, you will contribute to the overall success of KGSI. Staying updated on market trends, talent acquisition, and talent retention strategies will be vital for your role. Your ability to lead by example, communicate effectively, and solve problems collaboratively will be essential in driving team success.

To qualify for this role, you should hold a Bachelor's or Master's degree in computer science or information technology with a minimum of 12 years of IT experience, including leadership roles in integration teams. A solid background in data engineering, AWS cloud migration, and team management is required. Strong communication skills, customer focus, and a proactive mindset towards innovation are essential for success in this position. Experience with AWS Lambda, Glue, ETL projects, Python, SQL, and BI tools will be advantageous. Familiarity with manufacturing business processes and exposure to Scrum Master practices would be considered a plus.

Join Koch Global Services (KGS) to be part of a dynamic team that creates solutions to support various business functions worldwide. With a global presence in India, Mexico, Poland, and the United States, KGS empowers employees to make a significant impact on a global scale.

Posted 1 week ago

Apply

1.0 - 5.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Software Engineer at Mastercard, you will play a crucial role in supporting applications software by utilizing your programming, analysis, design, development, and delivery skills to provide innovative software solutions. You will be involved in designing highly scalable, fault-tolerant, and performant systems in the cloud, ensuring that project implementations and technical deliveries align with solution architectural design and best practices.

Your responsibilities will include collaborating with stakeholders to understand business needs, evaluating emerging technologies, providing technical guidance to project teams, and supporting services before and after they go live. You will be expected to analyze ITSM activities, maintain system health, scale systems sustainably through automation, and evolve systems for improved reliability and velocity. Your role will involve explaining technical issues and solution strategies to stakeholders, assisting with project scoping, planning, and estimation, and staying up to date with new technologies through self-study and participation in relevant events.

This position requires a minimum bachelor's degree in information technology, Computer Science, or equivalent work experience, along with at least 2 years of hands-on software development experience and familiarity with software and microservices architecture. The ideal candidate should have a current understanding of best practices in application and system security, experience in data analytics, ETL, data modeling, and pattern analysis, and be willing to learn new technology stacks. Strong domain knowledge of Java 8 (or later), experience with relational and NoSQL databases, and proficiency in user interface development frameworks, particularly Angular, are desired. Excellent communication skills, a collaborative mindset, and the ability to work effectively in a global team across different time zones are essential for success in this role.

At Mastercard, corporate security responsibility is paramount, and it is expected that every individual takes ownership of information security by abiding by security policies, ensuring confidentiality and integrity of accessed information, reporting any security violations or breaches, and completing mandatory security trainings.

If you are a motivated software engineer who thrives in a collaborative and innovative environment, eager to tackle challenging business problems using cutting-edge technologies, and keen on contributing to the growth of a dynamic company, we invite you to be part of our team and drive our solutions to the next level.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

Location: Pune (Hybrid)
Experience: 5+ years

Key Responsibilities:
- Data Pipeline Architecture: Build and optimize large-scale data ingestion pipelines from multiple sources.
- Scalability & Performance: Ensure low-latency, high-throughput data processing for real-time and batch workloads.
- Cloud Infrastructure: Design and implement cost-effective, scalable data storage solutions.
- Automation & Monitoring: Implement observability tools for pipeline health, error handling, and performance tracking.
- Security & Compliance: Ensure data encryption, access control, and regulatory compliance in the data platform.

Ideal Candidate Profile:
- Strong experience in Snowflake, dbt, and AWS for large-scale data processing.
- Expertise in Python, Airflow, and Spark for orchestrating pipelines (see the sketch after this listing).
- Deep understanding of data architecture principles for real-time and batch workloads.
- Hands-on experience with Kafka, Kinesis, or similar streaming technologies.
- Ability to work on cloud cost optimizations and infrastructure-as-code (Terraform, CloudFormation).
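
To illustrate the Airflow-orchestrated pipeline pattern this profile describes, here is a minimal DAG sketch; the dag_id, schedule, and task bodies are hypothetical stubs, not details from the listing.

from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest_from_stream():
    pass  # e.g., read a batch from Kinesis/Kafka and land it in S3

def load_to_snowflake():
    pass  # e.g., COPY INTO a staging table, then trigger dbt transformations

with DAG(
    dag_id="ingest_to_snowflake",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@hourly",
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest", python_callable=ingest_from_stream)
    load = PythonOperator(task_id="load", python_callable=load_to_snowflake)
    ingest >> load  # load runs only after ingest succeeds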

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Delhi

On-site

As a Snowflake DBT Lead at Pyramid Consulting, you will be responsible for overseeing Snowflake data transformation and validation processes in Delhi, India. Your role will include ensuring efficient data handling, maintaining data quality, and collaborating closely with cross-functional teams.

To excel in this role, you should have strong expertise in Snowflake, DBT, and SQL. Your experience in data transformation, modeling, and validation will be crucial for success. Proficiency in ETL processes and data warehousing is essential to meet the job requirements. Your excellent problem-solving and communication skills will enable you to effectively address challenges and work seamlessly with team members.

As a candidate for this position, you should hold a Bachelor's degree in Computer Science or a related field. Your ability to lead and collaborate within a team environment will be key to delivering high-quality solutions and driving impactful results for our clients.

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

Hyderabad, Telangana

On-site

Optum is a global organization dedicated to delivering care using technology to improve the lives of millions of people. Your work with our team will directly enhance health outcomes by providing individuals with access to care, pharmacy benefits, data, and resources necessary for their well-being. Our culture is defined by diversity and inclusion, alongside talented colleagues, comprehensive benefits, and opportunities for career development. Join us in making a positive impact on the communities we serve while contributing to the advancement of global health equity through caring, connecting, and growing together.

In this role, your primary responsibilities will include analyzing client requirements and complex business scenarios, designing innovative and fully automated products and solutions, serving as a BI Developer for key projects, ensuring high-quality execution of products, providing consulting to teammates, leaders, and clients, and offering extensive solutions in ETL strategies.

You should possess an undergraduate degree or equivalent experience, along with expertise in ETL processes and data integration using Azure Data Factory. Proficiency in Power BI semantic model creation, report development, and data visualization is required, with Snowflake and Azure Data Warehouse as primary data sources. Additionally, you should have a strong understanding of data modeling concepts, relational database systems, Snowflake, and Azure Data Warehouse. Familiarity with Databricks for data engineering, advanced analytics, and machine learning tasks is preferred, as well as proficiency in Azure Cloud services such as Azure Data Factory, Azure SQL Data Warehouse, Azure Data Lake Storage, and Azure Analytics. Solid programming skills in Python, SQL, and other scripting languages are essential, along with proven problem-solving abilities, effective communication and collaboration skills, and the capacity to manage multiple tasks simultaneously. Microsoft certifications in Power BI, Azure Cloud, Snowflake, or related fields are a plus.

The role is based in Hyderabad, Telangana, IN.

Posted 1 week ago

Apply

8.0 - 13.0 years

0 Lacs

Hyderabad, Telangana

On-site

At Techwave, we are committed to fostering a culture of growth and inclusivity. We ensure that every individual associated with our brand is challenged at every step and provided with the necessary opportunities to excel in their professional and personal lives. People are at the core of everything we do.

Techwave is a leading global IT and engineering services and solutions company dedicated to revolutionizing digital transformations. Our mission is to enable clients to maximize their potential and achieve a greater market share through a wide array of technology services, including Enterprise Resource Planning, Application Development, Analytics, Digital solutions, and the Internet of Things (IoT). Founded in 2004 and headquartered in Houston, TX, USA, Techwave leverages its expertise in Digital Transformation, Enterprise Applications, and Engineering Services to help businesses accelerate their growth. We are a team of dreamers and doers who constantly push the boundaries of what's possible, and we want YOU to be a part of it.

Job Title: Data Lead
Experience: 10+ Years
Mode of Hire: Full-time

Key Skills:
As a senior-level ETL developer with 10-13 years of experience, you will be responsible for building relational and data warehousing applications. Your primary role will involve supporting the existing EDW, designing and developing various layers of our data, and testing, documenting, and optimizing the ETL process. You will collaborate within a team environment to design and develop frameworks and services according to specifications. Your responsibilities will also include preparing detailed system documentation, performing unit and system tests, coordinating with Operations staff on application deployment, and ensuring that all activities are performed with quality and compliance standards. Additionally, you will design and implement ETL batches that meet SLAs, develop data collection, staging, movement, quality, and archiving strategies, and design automation processes to control data access and movement.

To excel in this role, you must have 8-10 years of ETL/ELT experience, strong SQL skills, and proficiency in stored procedures and database development. Experience in Azure Data Lake, Synapse, Azure Data Factory, and Databricks, as well as Snowflake, is essential. You should possess a good understanding of data warehouse ETL and ELT design best practices, be able to work independently, and have strong database experience with DB2, SQL Server, and Azure. Furthermore, you should be adept at designing relational and dimensional data models, have a good grasp of enterprise reporting (particularly Power BI), and understand Agile practices and methodologies. Your role will also involve assisting in analyzing and extracting relevant information from historical business data to support Business Intelligence initiatives, conducting proofs of concept for new technology selection, and proposing data warehouse architecture enhancements.

If you are a self-starter with the required skills and experience, we invite you to join our dynamic team at Techwave and be a part of our journey towards innovation and excellence.

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Lead Data Engineer with 8-12 years of experience, you will be responsible for handling a variety of tasks for one of our clients in Hyderabad. This is a full-time position with an immediate start date. Your proficiency in Python, Spark, SQL, Snowflake, Airflow, AWS, and DBT will be essential for this role.

In this role, you will be expected to work on a range of data engineering tasks using the specified skill set. Your expertise in these technologies will be crucial in developing efficient data pipelines, ensuring data quality, and optimizing data workflows. Furthermore, you will collaborate with cross-functional teams to understand data requirements, design and implement data solutions, and provide technical guidance on best practices. Your ability to communicate effectively and work well in a team setting will be key to your success in this role.

If you are interested in this opportunity and possess the required skill set, please share your profile with us at srujanat@teizosoft.com. We look forward to potentially having you join our team in Hyderabad.

Posted 1 week ago

Apply

3.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Talend Data Engineer specializing in Talend and Snowflake, you will be responsible for developing and maintaining ETL workflows using Talend and Snowflake. Your role will involve designing and optimizing data pipelines to ensure performance and scalability. Collaboration with various teams to address data integration and reporting requirements will be crucial. Your focus will also include ensuring data quality, governance, and security protocols.

To excel in this role, you should have 3 to 9 years of experience working with Talend ETL and Snowflake. Proficiency in SQL, Python, and working knowledge of cloud platforms such as AWS, Azure, and Google Cloud is essential. Previous experience in constructing end-to-end data pipelines and familiarity with data warehouses are key requirements for this position. Experience in Snowflake performance tuning and an understanding of Agile methodologies are considered advantageous.

This is a full-time position that follows a day shift schedule from Monday to Friday, requiring your presence at the office in Nagpur, Pune, Bangalore, or Chennai. Join us in this dynamic opportunity to leverage your expertise in Talend and Snowflake to drive impactful data solutions while collaborating with cross-functional teams to meet business objectives effectively.

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

Maharashtra

On-site

If you are a software engineering leader ready to take the reins and drive impact, we've got an opportunity just for you.

As a Director of Software Engineering at JPMorgan Chase within the Asset and Wealth Management LOB, you lead a data technology area and drive impact within teams, technologies, and deliveries. Utilize your in-depth knowledge of software, applications, technical processes, and product management to drive multiple complex initiatives, while serving as a primary decision maker for your teams and a driver of engineering innovation and solution delivery. The current role focuses on delivering data solutions for some of the Wealth Management businesses.

Job responsibilities
- Leads engineering and delivery of data and analytics solutions
- Makes decisions that influence teams' resources, budget, tactical operations, and the execution and implementation of processes and procedures
- Carries governance accountability for coding decisions, control obligations, and measures of success such as cost of ownership & maintainability
- Delivers technical solutions that can be leveraged across multiple businesses and domains
- Influences and collaborates with peer leaders and senior stakeholders across the business, product, and technology teams
- Champions the firm's culture of diversity, equity, inclusion, and respect

Required qualifications, capabilities, and skills
- Experience managing data solutions across a large, global consumer community in the Financial Services domain
- Experience hiring, developing and leading cross-functional teams of technologists
- Experience handling multiple, global stakeholders across business, technology and product
- Appreciation of the data product: modeling, sourcing, quality, lineage, discoverability, access management, visibility, purging, etc.
- Experience researching and upgrading to the latest technologies in the continuously evolving data ecosystem
- Practical hybrid cloud-native experience, preferably AWS
- Experience using current technologies, such as GraphQL, Glue, Spark, Snowflake, SNS, SQS, Kinesis, Lambda, ECS, EventBridge, QlikSense, etc.
- Experience with Java and/or Python programming languages
- Expertise in Computer Science, Computer Engineering, Mathematics, or a related technical field

Preferred qualifications, capabilities, and skills
- Comfortable being hands-on as required to drive solutions and solve challenges for the team
- Exposure to and appreciation of the continuously evolving data science space
- Exposure to the Wealth Management business

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

Location: Pune/Nagpur. Immediate joiners only.

Job Description

Key Responsibilities:
- Design, develop, and maintain robust data pipelines and ETL processes using Snowflake and other cloud technologies.
- Work with large datasets, ensuring their availability, quality, and performance across systems.
- Implement data models and optimize storage and query performance on Snowflake.
- Write complex SQL queries for data extraction, transformation, and reporting purposes.
- Develop, test, and implement data solutions leveraging Python scripting and Snowflake's native features.
- Collaborate with data scientists, analysts, and business stakeholders to deliver scalable data solutions.
- Monitor and troubleshoot data pipelines to ensure smooth operation and efficiency.
- Perform data migration, integration, and processing tasks across cloud platforms.
- Stay updated with the latest developments in Snowflake, SQL, and cloud technologies.

Required Skills
- Snowflake: Expertise in building, optimizing, and managing data warehousing solutions on Snowflake.
- SQL: Strong knowledge of SQL for querying and managing relational databases, writing complex queries, stored procedures, and performance tuning.
- Python: Proficiency in Python for scripting, automation, and integration within data pipelines (see the sketch after this listing).
- Experience in developing and managing ETL processes and ensuring data accuracy and performance.
- Hands-on experience with data migration and integration processes across cloud platforms.
- Familiarity with data security and governance best practices.
- Strong problem-solving skills with the ability to troubleshoot and resolve data-related issues.

(ref: hirist.tech)
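
As a small illustration of the Python-scripted Snowflake loading step such a pipeline might use, here is a hedged sketch using the connector's pandas helper; the connection parameters and target table are placeholders, not details from the listing.

import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

df = pd.DataFrame({"id": [1, 2], "status": ["new", "shipped"]})
conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="ETL_WH", database="OPS_DB", schema="STAGING",
)

# write_pandas bulk-loads the frame through an internal stage and COPY INTO;
# auto_create_table creates the target table if it does not exist yet.
success, nchunks, nrows, _ = write_pandas(
    conn, df, table_name="ORDERS_STG", auto_create_table=True
)
print(f"loaded={success} rows={nrows}")
conn.close()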

Posted 1 week ago

Apply

10.0 - 17.0 years

0 Lacs

Hyderabad, Telangana

On-site

We have an exciting opportunity for an ETL Data Architect position with an AI/ML-driven SaaS solution product company in Hyderabad. As an ETL Data Architect, you will play a crucial role in designing and implementing a robust Data Access Layer to provide consistent data access to the underlying heterogeneous storage layer. You will also be responsible for developing and enforcing data governance policies to ensure data security, quality, and compliance across all systems.

In this role, you will lead the architecture and design of data solutions that leverage the latest tech stack and AWS cloud services. Collaboration with product managers, tech leads, and cross-functional teams will be essential to align data strategy with business objectives. Additionally, you will oversee data performance optimization, scalability, and reliability of data systems while guiding and mentoring team members on data architecture, design, and problem-solving.

The ideal candidate should have at least 10 years of experience in data-related roles, with a minimum of 5 years in a senior leadership position overseeing data architecture and infrastructure. A deep background in designing and implementing enterprise-level data infrastructure, preferably in a SaaS environment, is required. Extensive knowledge of data architecture principles, data governance frameworks, security protocols, and performance optimization techniques is essential. Hands-on experience with AWS services such as RDS, Redshift, S3, Glue, and DocumentDB, as well as other services like MongoDB, Snowflake, etc., is highly desirable. Familiarity with big data technologies (e.g., Hadoop, Spark) and modern data warehousing solutions is a plus. Proficiency in at least one programming language (e.g., Node.js, Java, Golang, Python) is a must.

Excellent communication skills are crucial in this role, with the ability to translate complex technical concepts to non-technical stakeholders. Proven leadership experience, including team management and cross-functional collaboration, is also required. A Bachelor's degree in Computer Science, Information Systems, or a related field is necessary, with a Master's degree preferred.

Preferred qualifications include experience with Generative AI and Large Language Models (LLMs) and their applications in data solutions, as well as familiarity with financial back-office operations and the FinTech domain. Stay updated on emerging trends in data technology, particularly in AI/ML applications for finance.

Industry: IT Services and IT Consulting

Posted 1 week ago

Apply

7.0 - 10.0 years

30 - 32 Lacs

Hyderabad

Work from Office

- 6+ years of Java development
- Strong knowledge of SQL
- Agile development methodologies
- Working experience with Snowflake and its native features (Snowpark, data shares; see the sketch after this listing)
- Python
- Understanding of core AWS services and cloud infrastructure
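
The listing names Snowpark among Snowflake's native features; below is a minimal illustrative sketch using Snowpark's Python API (the Java API is analogous), with placeholder connection details and a hypothetical FACT_ORDERS table.

from snowflake.snowpark import Session
from snowflake.snowpark.functions import col

session = Session.builder.configs({
    "account": "my_account", "user": "etl_user", "password": "***",
    "warehouse": "ETL_WH", "database": "SALES_DB", "schema": "ANALYTICS",
}).create()

# Push-down query: the filter and aggregation execute inside Snowflake,
# not on the client.
orders = session.table("FACT_ORDERS")
daily = (orders.filter(col("ORDER_DATE") >= "2024-01-01")
               .group_by("ORDER_DATE")
               .count())
daily.show()
session.close()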

Posted 1 week ago

Apply

10.0 - 20.0 years

20 - 30 Lacs

Pune

Remote

Role & responsibilities
- Minimum 10+ years of experience developing, designing, and implementing data engineering solutions.
- Collaborate with data engineers and architects to design and optimize data models for Snowflake Data Warehouse.
- Optimize query performance and data storage in Snowflake by utilizing clustering, partitioning, and other optimization techniques.
- Experience working on projects housed within an Amazon Web Services (AWS) cloud environment.
- Experience working on projects using Tableau and DBT.
- Work closely with business stakeholders to understand requirements and translate them into technical solutions.
- Excellent presentation and communication skills, both written and verbal; ability to problem-solve and design in an environment with unclear requirements.

Posted 2 weeks ago

Apply

10.0 - 20.0 years

25 - 35 Lacs

Bengaluru

Remote

Role & responsibilities
- Minimum 5+ years of experience developing, designing, and implementing data engineering solutions.
- Collaborate with data engineers and architects to design and optimize data models for Snowflake Data Warehouse.
- Optimize query performance and data storage in Snowflake by utilizing clustering, partitioning, and other optimization techniques.
- Experience working on projects housed within an Amazon Web Services (AWS) cloud environment.
- Experience working on projects using Tableau and DBT.
- Work closely with business stakeholders to understand requirements and translate them into technical solutions.
- Excellent presentation and communication skills, both written and verbal; ability to problem-solve and design in an environment with unclear requirements.
- Certification is preferred.

Posted 2 weeks ago

Apply

7.0 - 12.0 years

15 - 30 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Lead Data Engineer with DBT, Snowflake, SQL, and data warehousing expertise. Design, build, and maintain scalable data pipelines, ensure data quality, and solve complex data problems. ETL tool adaptability is essential. SAP data and enterprise platform experience preferred.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

35 - 50 Lacs

Bengaluru

Remote

- Design & develop interactive dashboards (Power BI/Tableau)
- Familiarity with Azure/AWS/Snowflake
- Strong in data modelling, SQL, ETL, and warehousing
- Expert-level proficiency in Tableau and Power BI
- Drive BI governance and performance standards

Posted 2 weeks ago

Apply

8.0 - 13.0 years

15 - 27 Lacs

Bengaluru

Hybrid

Overview
Our Hosting team manages and supports the infrastructure of all our platforms, from hardware to software, to operating systems to PowerSchool products. This collaborative team helps meet the needs of an evolving technology and business model with a specialized focus on protecting customer data and keeping information secure. We work closely with product engineering teams to deliver products into production environments across Azure and AWS.

Description
- Design, develop and operate infrastructure-as-code automation for Terraform, K8s, and Snowflake, while also executing customer tenant migrations in Analytics and Insights.
- Manage and optimize K8s clusters and workloads using Argo CD, Flux and Helm.
- Configure and support dynamic connector configurations.
- Work with the Product Engineering teams to build out specifications and provide scalable, reliable platforms for the automation/delivery platform.
- UI development for internal dashboards and management applications.
- CI/CD pipeline engineering and support.
- Environment support, including production support.
- Participate in on-call schedules.

Requirements
- Minimum of 8+ years of relevant and related work experience.
- Bachelor's degree or equivalent, or equivalent years of relevant work experience. Additional experience may be substituted for an advanced degree.
- Advanced knowledge and experience with Kubernetes, Flux, Terraform or equivalent technologies.
- Advanced knowledge of AWS services, including EKS, EFS, RDS, ECS, etc.
- Working knowledge of monitoring, logging and alerting tools like Grafana and Prometheus.
- Strong Java, Python, and Git experience.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

12 - 22 Lacs

Hyderabad

Work from Office

Location: Hyderabad
Work Model (Hybrid/WFO): Hybrid, 3 days per week (Tue, Wed, Thu)
Experience Required: 5+ years
Employment Type (Full-Time/Part-Time/Contract): Full-time
Mandatory Skills: Advanced SQL, Tableau/Snowflake/Teradata; Python would be a plus

Job Summary & Key Responsibilities:
- Design, develop, and support dashboards and reports.
- Provide analysis to support business management and executive decision-making.
- Design and develop effective SQL scripts to transform and aggregate large data sets, create derived metrics, and embed them into business solutions.
- An effective communicator with demonstrated experience in handling larger and more complex business analytics projects.
- 6+ years of relevant experience in design, development, and maintenance of Business Intelligence, reporting, and data applications.
- Advanced hands-on experience with Tableau or similar BI dashboard visualisation tools.
- Expertise in programmatically processing large data sets from multiple source systems, integration, data mining, summarisation, and presentation of results to an executive audience.
- Advanced knowledge of SQL in related technologies like Oracle and Teradata, performance debugging, and tuning activities.
- Strong understanding of dev-to-production processes, User Acceptance and Production Validation testing, waterfall and agile development, and code deployment methodologies.
- Experience in core data warehousing concepts, dimensional data modeling, RDBMS, OLAP, ROLAP.
- Experience with ETL tools used to automate manual processes (SQL scripts, Airflow/NiFi).
- Experience with R, Python, Hadoop.

Only immediate joiners.

Thanks & Regards,
Milki Bisht - 9151206474
Email: milki.bisht@nlbtech.in

Posted 2 weeks ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Gurugram

Work from Office

About the Opportunity
Job Type: Permanent
Application Deadline: 31 July 2025

Title: Senior Analyst - Data Science
Department: Enterprise Data & Analytics
Location: Gurgaon
Reports To: Gaurav Shekhar
Level: Data Scientist 4

About your team
Join the Enterprise Data & Analytics team, collaborating across Fidelity's global functions to empower the business with data-driven insights that unlock business opportunities, enhance client experiences, and drive strategic decision-making.

About your role
As a key contributor within the Enterprise Data & Analytics team, you will lead the development of machine learning and data science solutions for Fidelity Canada. This role is designed to turn advanced analytics into real-world impact: driving growth, enhancing client experiences, and informing high-stakes decisions. You'll design, build, and deploy ML models on cloud and on-prem platforms, leveraging tools like AWS SageMaker, Snowflake, Adobe, Salesforce, etc. Collaborating closely with business stakeholders, data engineers, and technology teams, you'll translate complex challenges into scalable AI solutions. You'll also champion the adoption of cloud-based analytics, contribute to MLOps best practices, and support the team through mentorship and knowledge sharing. This is a high-impact role for a hands-on problem solver who thrives on ownership, innovation, and seeing their work directly influence strategic outcomes.

About you
You have 4-7 years of experience working in the data science domain, with a strong track record of delivering advanced machine learning solutions for business. You're skilled in developing models for classification, forecasting, and recommender systems, and hands-on with frameworks like Scikit-learn, TensorFlow, or PyTorch. You bring deep expertise in developing and deploying models on AWS SageMaker, strong business problem-solving abilities, and familiarity with emerging GenAI trends. A background in engineering, mathematics, or economics from a Tier 1 institution is preferred.

For starters, we'll offer you a comprehensive benefits package. We'll value your wellbeing and support your development. And we'll be as flexible as we can about where and when you work, finding a balance that works for all of us. It's all part of our commitment to making you feel motivated by the work you do and happy to be part of our team.
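
As a small illustration of the classification modeling this role describes, here is a hedged scikit-learn sketch; synthetic data stands in for the business datasets, which the posting does not specify.

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic binary-classification data as a stand-in for a real dataset.
X, y = make_classification(n_samples=1_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))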

Posted 2 weeks ago

Apply