
961 Dataflow Jobs - Page 29

Set up a job alert
JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote

When you join Verizon

You want more out of a career. A place to share your ideas freely — even if they’re daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love — driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together — lifting our communities and building trust in how we show up, everywhere and always. Want in? Join the #VTeamLife.

What You’ll Be Doing...

At Verizon, we are building a world-class Verizon Global Services Data Analytics (D&A) Hub organization. We are creating a comprehensive program to streamline processes, improve systems, realign organizational priorities and add opportunities for personal skill-building and professional growth. We’re finding new ways to add value and provide strategic and executive support to our stakeholders. You will work in a data product ownership model with the business, solving real-world problems by compiling and analyzing data and helping tell the story behind the numbers. This position offers opportunities to drive better business partnering and insights while developing your data intelligence skill set and leadership as we continue to grow as a world-class organization. You’ll be involved in, but not limited to, discovery, planning, integration, modeling, analysis and reporting that will impact important decisions around the growth and development of the Verizon business. Responsibilities include: providing subject matter expertise and ad hoc analysis, including identifying new revenue streams, improving operational efficiencies, reducing manual effort, and surfacing new metrics, insights and drivers from supply chain, logistics, transportation and network end-to-end data operations and data products; ensuring timely and accurate delivery of data intelligence applications for planning, reporting and analysis for the business; liaising with cross-functional teams and business partners to build a network and acquire advanced business and technical acumen; identifying improvement opportunities and executing projects, which may include leveraging digital tools for cloud technologies, data workflow creation, system integration, automation tools and dashboards; and playing a crucial role in defining the data architecture framework, standards and principles, including modeling, metadata, security and reference data.

What We’re Looking For...

You’ll need to have: a Bachelor’s degree or four or more years of work experience; six or more years of relevant work experience; experience in data lifecycle management; a proven track record in designing and building infrastructure for data extraction, preparation, and loading from a variety of sources using technologies such as SQL, NoSQL and big data; experience identifying ways to improve data reliability, efficiency and quality through various data solution techniques; experience with Google Cloud Platform technologies such as BigQuery, Composer and Dataflow; experience or transferable skills with digital tools such as Tableau, Qlik, Looker, ThoughtSpot, Alteryx, SQL, Python, or R; expert knowledge of ETL processes and reporting tools; experience in dashboard development using Looker, Tableau or ThoughtSpot; experience analyzing large amounts of information to discover trends and patterns; and experience with Microsoft Office Suite and Google Suite.

Even better if you have one or more of the following: a Master’s degree or direct work experience in data analytics, supply chain or the telecom industry; expertise in writing complex SQL queries and scripts using databases/tools like Oracle, SQL Server, or Google BigQuery, DataStage, Python, Snowflake, and pulling data from SQL/EDW data warehouses; knowledge of common business and cost drivers, operational statement analysis, and storytelling; industry-standard data automation and proactive alerting skills; excellent communication skills and attention to detail; and proficiency with Google Suite. If Verizon and this role sound like a fit for you, we encourage you to apply even if you don’t meet every “even better” qualification listed above. #VGSNONCDIO

Where you’ll be working: In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager.

Scheduled Weekly Hours: 40

Equal Employment Opportunity: Verizon is an equal opportunity employer. We evaluate qualified applicants without regard to race, gender, disability or any other legally protected characteristics.
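For readers unfamiliar with the GCP tooling this posting lists, here is a minimal, illustrative sketch of querying BigQuery from Python with the google-cloud-bigquery client; the project, dataset, table, and column names are placeholders, not details from the role.

```python
# Minimal illustration of querying BigQuery from Python.
# Project, dataset, table, and column names are placeholders.
from google.cloud import bigquery


def top_shipping_lanes(project_id: str = "my-project") -> None:
    client = bigquery.Client(project=project_id)

    query = """
        SELECT origin, destination, COUNT(*) AS shipments
        FROM `my-project.supply_chain.shipments`
        WHERE ship_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
        GROUP BY origin, destination
        ORDER BY shipments DESC
        LIMIT 10
    """
    # Run the query and iterate over the result rows.
    for row in client.query(query).result():
        print(f"{row.origin} -> {row.destination}: {row.shipments}")


if __name__ == "__main__":
    top_shipping_lanes()
```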

Posted 1 month ago

Apply

4.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Title: GCP Data Engineer – Data Migration & Transformation
Location: Chennai
Experience Level: 4+ Years

Key Responsibilities: Design and build robust, scalable data pipelines and architectures on GCP, especially BigQuery. Migrate and transform large-scale data systems and datasets to GCP with a focus on performance, scalability, and reliability. Automate data lineage extraction and ensure data integrity across multiple systems and platforms. Collaborate with architects and stakeholders to implement GCP-native and 3rd-party tools for data ingestion, integration, and transformation. Develop and optimize complex SQL queries in BigQuery for data analysis and transformation. Operationalize data pipelines using tools like Apache Airflow (Cloud Composer), Dataflow, and Pub/Sub. Enable machine learning capabilities through well-structured, ML-friendly data pipelines. Participate in Agile processes and contribute to technical design discussions, code reviews, and documentation.

Required Skills & Experience: 5+ years of experience in Data Warehousing, Data Engineering, or similar roles. Minimum 2 years of hands-on experience working with GCP BigQuery. Proficiency in Python, SQL, Apache Airflow, and GCP services including BigQuery, Dataflow, Cloud Composer, Pub/Sub, and Cloud Functions. Experience with data pipeline automation, data modeling, and building reusable data products. Solid understanding of data lineage, metadata integration, and data cataloging (preferably GCP Data Catalog and Informatica EDC). Proven ability to analyze complex datasets and derive actionable insights. Demonstrated experience building and deploying analytics platforms on cloud environments (GCP preferred).

Preferred Skills: Strong analytical and problem-solving capabilities. Exposure to machine learning pipeline architecture and model deployment workflows. Excellent communication skills and ability to work collaboratively with cross-functional teams. Familiarity with Agile methodologies and DevOps best practices. Self-driven, innovative mindset with a commitment to delivering high-quality solutions. Experience with documenting complex data engineering systems and developing test plans.
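As one illustration of the Cloud Composer / Apache Airflow orchestration this role mentions, here is a minimal Airflow 2.x DAG sketch; the DAG id, schedule, and task bodies are hypothetical rather than taken from the posting.

```python
# Minimal Airflow DAG sketch (Airflow 2.4+). DAG id, schedule, and task
# bodies are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull raw records from a source system.
    print("extracting source data")


def load_to_bigquery(**context):
    # Placeholder: load transformed records into BigQuery.
    print("loading data into BigQuery")


with DAG(
    dag_id="daily_migration_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load_to_bigquery", python_callable=load_to_bigquery)

    extract_task >> load_task
```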

Posted 1 month ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Business Consultant (Grade – Manager), P&C (Property & Casualty – Personal and Commercial Insurance)

The candidate should have experience working in Property & Casualty lines (both Personal and Commercial Insurance) and should be familiar with any one or more functional processes – PC, BC, CC (preferred: Guidewire/Duck Creek).

LOBs (Lines of Business – Personal and Commercial Lines): must have Property, Auto, General Liability. Good to have (Casualty Lines): Professional Liability, Directors & Officers, Errors & Omissions, EPL, etc.; Inland Marine, Cargo; Workers' Compensation; Umbrella, Excess Liability.

Roles and Responsibilities: Experience in creating business process maps for the future-state architecture, creating the WBS for the overall conversion strategy, and refining requirements in multi-vendor engagements. Worked on multiple business transformation, upgrade and modernization programs. Conducted multiple due-diligence and assessment projects as part of transformation roadmaps to evaluate current-state maturity, gaps in functionality and COTS solution features. Requirements gathering and elicitation – writing BRDs and FSDs, conducting JAD sessions and workshops to capture requirements, and working closely with the Product Owner. Work with the client to define the most optimal future-state operational process and related product configuration. Define scope by providing innovative solutions and challenging all new client requirements and change requests, while ensuring that the client gets the required business value. Elaborate and deliver clearly defined requirement documents with relevant dataflow and process flow diagrams. Work closely with the product design and development team to analyse and extract functional enhancements. Provide product consultancy and assist the client with acceptance criteria gathering and support throughout the project life cycle.

Product Experience/Other Skills: Product knowledge – Guidewire, Duck Creek, Exigent, Genius, Sapiens, OneShield, Acquarium, Majesco (preferred: Guidewire/Duck Creek). Strong skills in stakeholder management, communication, and resolving conflict while working with multicultural/global stakeholders. Should have handled international client transitions and end-to-end processes in the P&C insurance domain. Should be ready to work in flexible shifts (a good amount of overlap with US/UK hours). Good organizational and time management skills required. Should have good written and verbal communication skills in English. Industry certifications AINS 21 (Property and Liability Insurance Principles), AINS 22 (Personal Insurance), AINS 23 (Commercial Insurance) and AINS 24 (General Insurance for IT and Support Professionals) will be an added advantage. Additional experience in Life or other insurance domains is an added advantage.

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 month ago

Apply

0 years

0 Lacs

Kolkata, West Bengal, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Business Consultant (Grade – Manager), P&C (Property & Casualty – Personal and Commercial Insurance)

The candidate should have experience working in Property & Casualty lines (both Personal and Commercial Insurance) and should be familiar with any one or more functional processes – PC, BC, CC (preferred: Guidewire/Duck Creek).

LOBs (Lines of Business – Personal and Commercial Lines): must have Property, Auto, General Liability. Good to have (Casualty Lines): Professional Liability, Directors & Officers, Errors & Omissions, EPL, etc.; Inland Marine, Cargo; Workers' Compensation; Umbrella, Excess Liability.

Roles and Responsibilities: Experience in creating business process maps for the future-state architecture, creating the WBS for the overall conversion strategy, and refining requirements in multi-vendor engagements. Worked on multiple business transformation, upgrade and modernization programs. Conducted multiple due-diligence and assessment projects as part of transformation roadmaps to evaluate current-state maturity, gaps in functionality and COTS solution features. Requirements gathering and elicitation – writing BRDs and FSDs, conducting JAD sessions and workshops to capture requirements, and working closely with the Product Owner. Work with the client to define the most optimal future-state operational process and related product configuration. Define scope by providing innovative solutions and challenging all new client requirements and change requests, while ensuring that the client gets the required business value. Elaborate and deliver clearly defined requirement documents with relevant dataflow and process flow diagrams. Work closely with the product design and development team to analyse and extract functional enhancements. Provide product consultancy and assist the client with acceptance criteria gathering and support throughout the project life cycle.

Product Experience/Other Skills: Product knowledge – Guidewire, Duck Creek, Exigent, Genius, Sapiens, OneShield, Acquarium, Majesco (preferred: Guidewire/Duck Creek). Strong skills in stakeholder management, communication, and resolving conflict while working with multicultural/global stakeholders. Should have handled international client transitions and end-to-end processes in the P&C insurance domain. Should be ready to work in flexible shifts (a good amount of overlap with US/UK hours). Good organizational and time management skills required. Should have good written and verbal communication skills in English. Industry certifications AINS 21 (Property and Liability Insurance Principles), AINS 22 (Personal Insurance), AINS 23 (Commercial Insurance) and AINS 24 (General Insurance for IT and Support Professionals) will be an added advantage. Additional experience in Life or other insurance domains is an added advantage.

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 month ago

Apply

0 years

0 Lacs

Kanayannur, Kerala, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Business Consultant (Grade – Manager), P&C (Property & Casualty – Personal and Commercial Insurance)

The candidate should have experience working in Property & Casualty lines (both Personal and Commercial Insurance) and should be familiar with any one or more functional processes – PC, BC, CC (preferred: Guidewire/Duck Creek).

LOBs (Lines of Business – Personal and Commercial Lines): must have Property, Auto, General Liability. Good to have (Casualty Lines): Professional Liability, Directors & Officers, Errors & Omissions, EPL, etc.; Inland Marine, Cargo; Workers' Compensation; Umbrella, Excess Liability.

Roles and Responsibilities: Experience in creating business process maps for the future-state architecture, creating the WBS for the overall conversion strategy, and refining requirements in multi-vendor engagements. Worked on multiple business transformation, upgrade and modernization programs. Conducted multiple due-diligence and assessment projects as part of transformation roadmaps to evaluate current-state maturity, gaps in functionality and COTS solution features. Requirements gathering and elicitation – writing BRDs and FSDs, conducting JAD sessions and workshops to capture requirements, and working closely with the Product Owner. Work with the client to define the most optimal future-state operational process and related product configuration. Define scope by providing innovative solutions and challenging all new client requirements and change requests, while ensuring that the client gets the required business value. Elaborate and deliver clearly defined requirement documents with relevant dataflow and process flow diagrams. Work closely with the product design and development team to analyse and extract functional enhancements. Provide product consultancy and assist the client with acceptance criteria gathering and support throughout the project life cycle.

Product Experience/Other Skills: Product knowledge – Guidewire, Duck Creek, Exigent, Genius, Sapiens, OneShield, Acquarium, Majesco (preferred: Guidewire/Duck Creek). Strong skills in stakeholder management, communication, and resolving conflict while working with multicultural/global stakeholders. Should have handled international client transitions and end-to-end processes in the P&C insurance domain. Should be ready to work in flexible shifts (a good amount of overlap with US/UK hours). Good organizational and time management skills required. Should have good written and verbal communication skills in English. Industry certifications AINS 21 (Property and Liability Insurance Principles), AINS 22 (Personal Insurance), AINS 23 (Commercial Insurance) and AINS 24 (General Insurance for IT and Support Professionals) will be an added advantage. Additional experience in Life or other insurance domains is an added advantage.

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 month ago

Apply

0 years

0 Lacs

Trivandrum, Kerala, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Business Consultant (Grade – Manager), P&C (Property & Casualty – Personal and Commercial Insurance)

The candidate should have experience working in Property & Casualty lines (both Personal and Commercial Insurance) and should be familiar with any one or more functional processes – PC, BC, CC (preferred: Guidewire/Duck Creek).

LOBs (Lines of Business – Personal and Commercial Lines): must have Property, Auto, General Liability. Good to have (Casualty Lines): Professional Liability, Directors & Officers, Errors & Omissions, EPL, etc.; Inland Marine, Cargo; Workers' Compensation; Umbrella, Excess Liability.

Roles and Responsibilities: Experience in creating business process maps for the future-state architecture, creating the WBS for the overall conversion strategy, and refining requirements in multi-vendor engagements. Worked on multiple business transformation, upgrade and modernization programs. Conducted multiple due-diligence and assessment projects as part of transformation roadmaps to evaluate current-state maturity, gaps in functionality and COTS solution features. Requirements gathering and elicitation – writing BRDs and FSDs, conducting JAD sessions and workshops to capture requirements, and working closely with the Product Owner. Work with the client to define the most optimal future-state operational process and related product configuration. Define scope by providing innovative solutions and challenging all new client requirements and change requests, while ensuring that the client gets the required business value. Elaborate and deliver clearly defined requirement documents with relevant dataflow and process flow diagrams. Work closely with the product design and development team to analyse and extract functional enhancements. Provide product consultancy and assist the client with acceptance criteria gathering and support throughout the project life cycle.

Product Experience/Other Skills: Product knowledge – Guidewire, Duck Creek, Exigent, Genius, Sapiens, OneShield, Acquarium, Majesco (preferred: Guidewire/Duck Creek). Strong skills in stakeholder management, communication, and resolving conflict while working with multicultural/global stakeholders. Should have handled international client transitions and end-to-end processes in the P&C insurance domain. Should be ready to work in flexible shifts (a good amount of overlap with US/UK hours). Good organizational and time management skills required. Should have good written and verbal communication skills in English. Industry certifications AINS 21 (Property and Liability Insurance Principles), AINS 22 (Personal Insurance), AINS 23 (Commercial Insurance) and AINS 24 (General Insurance for IT and Support Professionals) will be an added advantage. Additional experience in Life or other insurance domains is an added advantage.

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 month ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Business Consultant (Grade – Manager), P&C (Property & Casualty – Personal and Commercial Insurance)

The candidate should have experience working in Property & Casualty lines (both Personal and Commercial Insurance) and should be familiar with any one or more functional processes – PC, BC, CC (preferred: Guidewire/Duck Creek).

LOBs (Lines of Business – Personal and Commercial Lines): must have Property, Auto, General Liability. Good to have (Casualty Lines): Professional Liability, Directors & Officers, Errors & Omissions, EPL, etc.; Inland Marine, Cargo; Workers' Compensation; Umbrella, Excess Liability.

Roles and Responsibilities: Experience in creating business process maps for the future-state architecture, creating the WBS for the overall conversion strategy, and refining requirements in multi-vendor engagements. Worked on multiple business transformation, upgrade and modernization programs. Conducted multiple due-diligence and assessment projects as part of transformation roadmaps to evaluate current-state maturity, gaps in functionality and COTS solution features. Requirements gathering and elicitation – writing BRDs and FSDs, conducting JAD sessions and workshops to capture requirements, and working closely with the Product Owner. Work with the client to define the most optimal future-state operational process and related product configuration. Define scope by providing innovative solutions and challenging all new client requirements and change requests, while ensuring that the client gets the required business value. Elaborate and deliver clearly defined requirement documents with relevant dataflow and process flow diagrams. Work closely with the product design and development team to analyse and extract functional enhancements. Provide product consultancy and assist the client with acceptance criteria gathering and support throughout the project life cycle.

Product Experience/Other Skills: Product knowledge – Guidewire, Duck Creek, Exigent, Genius, Sapiens, OneShield, Acquarium, Majesco (preferred: Guidewire/Duck Creek). Strong skills in stakeholder management, communication, and resolving conflict while working with multicultural/global stakeholders. Should have handled international client transitions and end-to-end processes in the P&C insurance domain. Should be ready to work in flexible shifts (a good amount of overlap with US/UK hours). Good organizational and time management skills required. Should have good written and verbal communication skills in English. Industry certifications AINS 21 (Property and Liability Insurance Principles), AINS 22 (Personal Insurance), AINS 23 (Commercial Insurance) and AINS 24 (General Insurance for IT and Support Professionals) will be an added advantage. Additional experience in Life or other insurance domains is an added advantage.

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 month ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

At Cotality, we are driven by a single mission—to make the property industry faster, smarter, and more people-centric. Cotality is the trusted source for property intelligence, with unmatched precision, depth, breadth, and insights across the entire ecosystem. Our talented team of 5,000 employees globally uses our network, scale, connectivity and technology to drive the largest asset class in the world. Join us as we work toward our vision of fueling a thriving global property ecosystem and a more resilient society. Cotality is committed to cultivating a diverse and inclusive work culture that inspires innovation and bold thinking; it's a place where you can collaborate, feel valued, develop skills and directly impact the real estate economy. We know our people are our greatest asset. At Cotality, you can be yourself, lift people up and make an impact. By putting clients first and continuously innovating, we're working together to set the pace for unlocking new possibilities that better serve the property industry. Job Description In India, we operate as Next Gear India Private Limited, a fully-owned subsidiary of Cotality with offices in Kolkata, West Bengal, and Noida, Uttar Pradesh. Next Gear India Private Limited plays a vital role in Cotality's Product Development capabilities, focusing on creating and delivering innovative solutions for the Property & Casualty (P&C) Insurance and Property Restoration industries. While Next Gear India Private Limited operates under its own registered name in India, we are seamlessly integrated into the Cotality family, sharing the same commitment to innovation, quality, and client success. When you join Next Gear India Private Limited, you become part of the global Cotality team. Together, we shape the future of property insights and analytics, contributing to a smarter and more resilient property ecosystem through cutting-edge technology and insights. Innovate with Impact: Design and develop software solutions that push the boundaries of what's possible, elevating our capabilities and delighting our customers. Collaborative Brilliance: Consult with product owners and business partners to define requirements and create software designs that hit the sweet spot between feasibility and excellence. Mentorship Magic: Share your expertise by mentoring and guiding less experienced team members through the intricate dance of software development, ensuring they become stars in their own right. Testing Trailblazer: Define scope, develop testing methods, and collaborate with the QA team to enhance our testing efforts. Your goal? Ensure our solutions stand up to the highest standards. Operational Maestro: Provide top-tier operational support, diagnose complex issues in production systems, and resolve incidents with the finesse of a seasoned performer. Tech Explorer: Dive into the world of new and alternate technologies, evaluating, recommending, and applying them. Your mission is to keep our team at the forefront of innovation. Job Qualifications Experience Aplenty: 5+ years of hands-on experience in applicable software development environments, showcasing your prowess and ability to excel. Educational Symphony: A Bachelor's degree is strongly preferred, demonstrating your commitment to continuous learning and growth. Tech Savvy: Should have demonstrated experience in Cloud environments like AWS, GCP, or Azure. Comparable knowledge of tools like Azure Pipelines, BigQuery, MFT, Vault, & DataFlow. Workflow management and orchestration tools such as Airflow. 
Experience with object-oriented and functional scripting languages, including Java and Python. Working knowledge of Snowflake and Dataflow is a definite plus! Business Acumen: Translate business needs into technical requirements with finesse, showcasing your ability to balance technical excellence with customer satisfaction. Team Player: Collaborate seamlessly with the team, responding to requests in a timely manner, meeting individual commitments, and contributing to the collective success. Mentor Extraordinaire: Leverage your coaching and teaching skills to guide and mentor your fellow team members, fostering an environment of continuous improvement.

Cotality's Diversity Commitment: Cotality is fully committed to employing a diverse workforce and creating an inclusive work environment that embraces everyone’s unique contributions, experiences and values. We offer an empowered work environment that encourages creativity, initiative and professional growth and provides a competitive salary and benefits package. We are better together when we support and recognize our differences.

Equal Opportunity Employer Statement: Cotality is an Equal Opportunity employer committed to attracting and retaining the best-qualified people available, without regard to race, ancestry, place of origin, colour, ethnic origin, citizenship, creed, sex, sexual orientation, record of offences, age, marital status, family status or disability. Cotality maintains a Drug-Free Workplace. Please apply on our website for consideration.

Privacy Policy: Global Applicant Privacy Policy. By providing your telephone number, you agree to receive automated (SMS) text messages at that number from Cotality regarding all matters related to your application and, if you are hired, your employment and company business. Message and data rates may apply. You can opt out at any time by responding STOP or UNSUBSCRIBE, and you will automatically be opted out company-wide.
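To make the Dataflow-style pipeline work named above concrete, here is a small, generic Apache Beam batch pipeline sketch; the file paths and transform logic are illustrative assumptions rather than anything from the posting.

```python
# Minimal Apache Beam batch pipeline, runnable locally with the DirectRunner;
# on GCP it would be submitted with the DataflowRunner.
# Paths and transform logic are illustrative placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    options = PipelineOptions()  # add --runner=DataflowRunner etc. for GCP
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadLines" >> beam.io.ReadFromText("input.csv")
            | "ParseCsv" >> beam.Map(lambda line: line.split(","))
            | "KeepValidRows" >> beam.Filter(lambda cols: len(cols) >= 3)
            | "FormatOutput" >> beam.Map(lambda cols: ",".join(cols[:3]))
            | "WriteResults" >> beam.io.WriteToText("output", file_name_suffix=".csv")
        )


if __name__ == "__main__":
    run()
```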

Posted 1 month ago

Apply

3.0 years

4 - 22 Lacs

Chennai

On-site

We are looking for a skilled GCP Data Engineer with 3+ years of experience to design, develop, and optimize scalable data pipelines using Google Cloud Platform (GCP) services. The ideal candidate will have expertise in BigQuery, Dataflow, Dataproc, Data Fusion, SQL, Airflow, Python, PySpark, Terraform, and Tekton.

Responsibilities: Design and build scalable ETL/ELT pipelines on Google Cloud Platform (GCP). Develop and optimize BigQuery queries for high-performance data processing. Implement data workflows using Dataflow, Dataproc, and Data Fusion. Automate and schedule data pipelines using Apache Airflow. Write clean, efficient, and scalable Python and PySpark code for data processing. Use Terraform to define and manage cloud infrastructure as code (IaC). Implement CI/CD pipelines using Tekton for data engineering workflows. Collaborate with data scientists, analysts, and business teams to understand data needs. Monitor, troubleshoot, and optimize data workflows for performance and cost efficiency. Ensure data security, governance, and compliance with industry standards.

Requirements: 3+ years of experience as a Data Engineer working with Google Cloud Platform (GCP). Strong expertise in BigQuery, Dataflow, Dataproc, and Data Fusion. Proficiency in writing and optimizing SQL queries. Experience with Apache Airflow for orchestration and workflow automation. Hands-on experience in Python and PySpark for data processing. Knowledge of Terraform for infrastructure automation. Familiarity with Tekton for CI/CD pipeline automation. Strong analytical and problem-solving skills. Ability to work in an agile and collaborative environment.

Nice to Have: Experience with Kafka, Pub/Sub, or other real-time streaming technologies. Knowledge of machine learning pipelines on GCP. Understanding of DataOps and DevOps best practices.

Job Types: Full-time, Permanent
Pay: ₹477,742.42 - ₹2,200,000.00 per year
Schedule: Monday to Friday
Application Question(s): We are looking for an immediate joiner; please mention your last working date.
Experience: Dataflow: 4 years (Preferred), SQL: 4 years (Preferred), Dataproc: 4 years (Preferred), Data Fusion: 4 years (Preferred), Python: 4 years (Preferred), PySpark: 4 years (Preferred), Terraform: 4 years (Preferred), Tekton: 4 years (Preferred), GCP: 4 years (Preferred), BigQuery: 4 years (Preferred)
Work Location: In person
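As a minimal illustration of the PySpark work this stack implies (for example, a job submitted to Dataproc), consider the sketch below; the bucket paths and column names are assumptions, not details from the posting.

```python
# Minimal PySpark job sketch of the kind typically submitted to Dataproc.
# Bucket paths and column names are assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F


def main():
    spark = SparkSession.builder.appName("orders-daily-aggregate").getOrCreate()

    # Read raw CSV files from Cloud Storage.
    orders = spark.read.option("header", True).csv("gs://example-bucket/raw/orders/*.csv")

    # Aggregate order amounts per day.
    daily_totals = (
        orders.withColumn("amount", F.col("amount").cast("double"))
        .groupBy("order_date")
        .agg(F.sum("amount").alias("total_amount"), F.count("*").alias("order_count"))
    )

    daily_totals.write.mode("overwrite").parquet("gs://example-bucket/curated/daily_totals/")
    spark.stop()


if __name__ == "__main__":
    main()
```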

Posted 1 month ago

Apply

5.0 years

0 Lacs

Jaipur, Rajasthan, India

Remote

Job Summary

Auriga is looking for a Data Engineer to design and maintain cloud-native data pipelines supporting real-time analytics and machine learning. You'll work with cross-functional teams to build scalable, secure data solutions using GCP (BigQuery, Looker), SQL, Python, and orchestration tools like Dagster and DBT. Mentoring junior engineers and ensuring data best practices will also be part of your role.

WHAT YOU'LL DO: Design, build, and maintain scalable data pipelines and architectures to support analytical and operational workloads. Develop and optimize ETL/ELT pipelines, ensuring efficient data extraction, transformation, and loading from various sources. Work closely with backend and platform engineers to integrate data pipelines into cloud-native applications. Manage and optimize cloud data warehouses, primarily BigQuery, ensuring performance, scalability, and cost efficiency. Implement data governance, security, and privacy best practices, ensuring compliance with company policies and regulations. Collaborate with analytics teams to define data models and enable self-service reporting and BI capabilities. Develop and maintain data documentation, including data dictionaries, lineage tracking, and metadata management. Monitor, troubleshoot, and optimize data pipelines, ensuring high availability and reliability. Stay up to date with emerging data engineering technologies and best practices, continuously improving our data infrastructure.

WHAT WE'RE LOOKING FOR: Strong proficiency in English (written and verbal communication) is required. Experience working with remote teams across North America and Latin America, ensuring smooth collaboration across time zones. 5+ years of experience in data engineering, with expertise in building scalable data pipelines and cloud-native data architectures. Strong proficiency in SQL for data modeling, transformation, and performance optimization. Experience with BI and data visualization tools (e.g., Looker, Tableau, or Google Data Studio). Expertise in Python for data processing, automation, and pipeline development. Experience with cloud data platforms, particularly Google Cloud Platform (GCP). Hands-on experience with Google BigQuery, Cloud Storage, and Pub/Sub. Strong knowledge of ETL/ELT frameworks such as DBT, Dataflow, or Apache Beam. Familiarity with workflow orchestration tools like Dagster, Apache Airflow or Google Cloud Workflows. Understanding of data privacy, security, and compliance best practices. Strong problem-solving skills, with the ability to debug and optimize complex data workflows. Excellent communication and collaboration skills.

NICE TO HAVE: Experience with real-time data streaming solutions (e.g., Kafka, Pub/Sub, or Kinesis). Familiarity with machine learning workflows and MLOps best practices. Knowledge of Terraform for Infrastructure as Code (IaC) in data environments. Familiarity with data integrations involving Contentful, Algolia, Segment, and Talon.One.

About Company

Hi there! We are Auriga IT. We power businesses across the globe through digital experiences, data and insights. From the apps we design to the platforms we engineer, we're driven by an ambition to create world-class digital solutions and make an impact. Our team has been part of building solutions for the likes of Zomato, Yes Bank, Tata Motors, Amazon, Snapdeal, Ola, Practo, Vodafone, Meesho, Volkswagen, Droom and many more. We are a group of people who just could not leave our college life behind; Auriga was founded on a desire to keep working together with friends and enjoy an extended college life. Who hasn't dreamt of working with friends for a lifetime? Come join in!
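Since the role references orchestration tools such as Dagster and DBT, the following is a minimal Dagster software-defined-assets sketch; the asset names and logic are hypothetical, and in practice the upstream asset might read from BigQuery rather than returning literals.

```python
# Minimal Dagster asset sketch. Asset names and logic are hypothetical.
from dagster import Definitions, asset


@asset
def raw_orders() -> list[dict]:
    # Placeholder extract step; a real asset might query BigQuery here.
    return [{"order_id": 1, "amount": 120.0}, {"order_id": 2, "amount": 75.5}]


@asset
def daily_revenue(raw_orders: list[dict]) -> float:
    # Downstream asset: Dagster wires the dependency by parameter name.
    return sum(row["amount"] for row in raw_orders)


defs = Definitions(assets=[raw_orders, daily_revenue])
```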

Posted 1 month ago

Apply

8.0 - 13.0 years

0 Lacs

Gurugram, Haryana, India

On-site

We are searching for a skilled Lead Data Engineer to enhance our energetic team. In the role of Lead Data Engineer, you will take charge of creating, building, and sustaining data integration solutions for our clientele. Your leadership will guide a team of engineers towards achieving high-quality, scalable, and efficient data integration solutions. This role presents a thrilling challenge for an experienced data integration expert who is enthusiastic about technology and excels in a rapid, evolving setting.

Responsibilities: Design, build, and sustain data integration solutions for clients. Guide a team of engineers to guarantee high-quality, scalable, and efficient data integration solutions. Collaborate with multidisciplinary teams to grasp business needs and devise appropriate data integration solutions. Ensure the security, reliability, and efficiency of data integration solutions. Create and update documentation, such as technical specifications, data flow diagrams, and data mappings. Continuously update knowledge and skills related to the latest data integration methods and tools.

Requirements: Bachelor's degree in Computer Science, Information Systems, or a related field. 8-13 years of experience in data engineering, data integration, or a related field. Proficiency in cloud-native or Spark-based ETL tools like AWS Glue, Azure Data Factory, or GCP Dataflow. Strong knowledge of SQL for data querying and manipulation. Background in Snowflake for cloud data warehousing. Familiarity with at least one cloud platform such as AWS, Azure, or GCP. Experience in leading a team of engineers on data integration projects. Good verbal and written communication skills in English at a B2 level.

Nice to have: Background in ETL using Python.
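For the Snowflake warehousing piece mentioned in the requirements, a minimal connection-and-query sketch with the official Snowflake Python connector might look like the following; the account, credentials, warehouse, and table names are placeholders.

```python
# Minimal Snowflake query sketch using the official Python connector.
# Account, credentials, warehouse, and table names are placeholders.
import os

import snowflake.connector


def fetch_row_count() -> int:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH",
        database="ANALYTICS",
        schema="PUBLIC",
    )
    try:
        cur = conn.cursor()
        cur.execute("SELECT COUNT(*) FROM ORDERS")
        (count,) = cur.fetchone()
        return count
    finally:
        conn.close()


if __name__ == "__main__":
    print(fetch_row_count())
```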

Posted 1 month ago

Apply

6.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

🚀 We're Hiring: ETL Developer – GCP Dataflow | 6+ Years Experience
📍 Location: Chennai, India
🕒 Experience: 6+ Years

Are you passionate about data engineering and cloud-based ETL solutions? We’re looking for an experienced ETL Developer with expertise in Google Cloud Platform (GCP) – especially Dataflow, BigQuery, and Workflow – to join our growing team in Chennai.

🔍 Key Responsibilities: Design, develop, and support ETL workflows and modules using GCP Dataflow, BigQuery, and Workflow. Participate in and support DevSecOps activities. Execute unit tests, and contribute to code reviews and peer inspections. Perform impact analysis on existing systems for new developments or enhancements. Understand business requirements and deliver scalable, maintainable ETL solutions. Collaborate with cross-functional teams to ensure high-quality delivery.

✅ Requirements: 6+ years of experience in ETL development. Proficiency with GCP Dataflow, BigQuery, and other GCP data services. Strong understanding of data integration, pipelines, and cloud-native architectures. Experience in Agile environments and ability to work in fast-paced development cycles. Strong problem-solving skills and ability to work independently and within a team.

📩 Apply now: DM me or send your resume to samdarshi.singh@mwidm.com. More info: +91 62392 61536.
#ETLDeveloper #ChennaiJobs #GCP #Dataflow #BigQuery #CloudJobs #DataEngineering #NowHiring #TechJobs

Posted 1 month ago

Apply

0.0 - 4.0 years

0 Lacs

Jaipur, Rajasthan

Remote

Senior Data Engineer

Kadel Labs is a leading IT services company that has been delivering top-quality technology solutions since 2017, focused on enhancing business operations and productivity through tailored, scalable, and future-ready solutions. With deep domain expertise and a commitment to innovation, we help businesses stay ahead of technological trends. As a CMMI Level 3 and ISO 27001:2022 certified company, we ensure best-in-class process maturity and information security, enabling organizations to achieve their digital transformation goals with confidence and efficiency.

Role: Senior Data Engineer
Experience: 4-6 Yrs
Location: Udaipur, Jaipur, Kolkata

Job Description: We are looking for a highly skilled and experienced Data Engineer with 4–6 years of hands-on experience in designing and implementing robust, scalable data pipelines and infrastructure. The ideal candidate will be proficient in SQL and Python and have a strong understanding of modern data engineering practices. You will play a key role in building and optimizing data systems, enabling data accessibility and analytics across the organization, and collaborating closely with cross-functional teams including Data Science, Product, and Engineering.

Key Responsibilities:
· Design, develop, and maintain scalable ETL/ELT data pipelines using SQL and Python
· Collaborate with data analysts, data scientists, and product teams to understand data needs
· Optimize queries and data models for performance and reliability
· Integrate data from various sources, including APIs, internal databases, and third-party systems
· Monitor and troubleshoot data pipelines to ensure data quality and integrity
· Document processes, data flows, and system architecture
· Participate in code reviews and contribute to a culture of continuous improvement

Required Skills:
· 4–6 years of experience in data engineering, data architecture, or backend development with a focus on data
· Strong command of SQL for data transformation and performance tuning
· Experience with Python (e.g., pandas, Spark, ADF)
· Solid understanding of ETL/ELT processes and data pipeline orchestration
· Proficiency with RDBMS (e.g., PostgreSQL, MySQL, SQL Server)
· Experience with data warehousing solutions (e.g., Snowflake, Redshift, BigQuery)
· Familiarity with version control (Git), CI/CD workflows, and containerized environments (Docker, Kubernetes)
· Basic programming skills
· Excellent problem-solving skills and a passion for clean, efficient data systems

Preferred Skills:
· Experience with cloud platforms (AWS, Azure, GCP) and services like S3, Glue, Dataflow, etc.
· Exposure to enterprise solutions (e.g., Databricks, Synapse)
· Knowledge of big data technologies (e.g., Spark, Kafka, Hadoop)
· Background in real-time data streaming and event-driven architectures
· Understanding of data governance, security, and compliance best practices
· Prior experience working in an agile development environment

Educational Qualifications:
· Bachelor's degree in Computer Science, Information Technology, or a related field.
Visit us: https://kadellabs.com/ https://in.linkedin.com/company/kadel-labs https://www.glassdoor.co.in/Overview/Working-at-Kadel-Labs-EI_IE4991279.11,21.htm Job Types: Full-time, Permanent Pay: ₹826,249.60 - ₹1,516,502.66 per year Benefits: Flexible schedule Health insurance Leave encashment Paid time off Provident Fund Work from home Schedule: Day shift Monday to Friday Supplemental Pay: Overtime pay Performance bonus Quarterly bonus Yearly bonus Ability to commute/relocate: Jaipur, Rajasthan: Reliably commute or planning to relocate before starting work (Required) Experience: Data Engineer: 4 years (Required) Location: Jaipur, Rajasthan (Required) Work Location: In person
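As a small illustration of the SQL-plus-Python pipeline work described in this role, here is a pandas/SQLAlchemy load sketch; the source file, connection string, and table name are assumptions, not details from the posting.

```python
# Minimal pandas + SQLAlchemy load sketch. Source file, connection string,
# and table name are assumptions.
import pandas as pd
from sqlalchemy import create_engine


def load_orders(csv_path: str = "orders.csv") -> int:
    df = pd.read_csv(csv_path)

    # Light transformation: normalise column names and drop incomplete rows.
    df.columns = [c.strip().lower() for c in df.columns]
    df = df.dropna(subset=["order_id", "amount"])

    # Load into a relational target (placeholder Postgres connection string).
    engine = create_engine("postgresql+psycopg2://user:password@localhost:5432/analytics")
    df.to_sql("orders_clean", engine, if_exists="replace", index=False)
    return len(df)


if __name__ == "__main__":
    print(f"loaded {load_orders()} rows")
```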

Posted 1 month ago

Apply

5.0 years

0 Lacs

India

Remote

Job Title: GCP Data Engineer
Location: Remote
Job Type: Contract / Full-Time
Experience Required: 5+ Years

About the Role: We are seeking a skilled GCP Data Engineer to design and implement robust data pipelines and solutions using Google Cloud Platform. The ideal candidate will have strong experience in data engineering, ETL development, and real-time data processing in GCP.

Key Responsibilities:
- Design and build scalable, efficient data pipelines using GCP services (BigQuery, Dataflow, Pub/Sub, Dataproc, Cloud Composer).
- Develop and manage data models and warehouses using BigQuery.
- Write efficient SQL and Python code for data ingestion, transformation, and validation.
- Implement data quality checks, monitoring, and governance practices.
- Collaborate with data scientists, analysts, and stakeholders to deliver business-critical data solutions.
- Use CI/CD tools and Terraform for infrastructure automation and deployment.
- Optimize queries and pipeline performance for large-scale datasets.

Required Skills:
- Strong experience with GCP services: BigQuery, Dataflow, Pub/Sub, Dataproc, Cloud Storage.
- Proficient in Python and SQL for data engineering tasks.
- Experience with orchestration tools such as Cloud Composer (Apache Airflow).
- Hands-on experience with Spark and PySpark.
- Familiarity with Git, CI/CD workflows, and infrastructure as code (Terraform preferred).
- Solid understanding of data modeling (Star, Snowflake schemas) and ETL/ELT concepts.

Preferred:
- Experience in real-time data streaming and analytics.
- Exposure to Snowflake, Kafka, or other cloud-native data tools.
- Previous experience in Agile environments.
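To illustrate the real-time ingestion pattern this posting references (Pub/Sub into BigQuery via Dataflow), a minimal streaming Apache Beam sketch follows; the subscription, table, and schema are placeholders.

```python
# Minimal streaming Apache Beam sketch: Pub/Sub -> BigQuery.
# Subscription, table, and schema are placeholders; on GCP this would
# run with the DataflowRunner in streaming mode.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions


def run():
    options = PipelineOptions()
    options.view_as(StandardOptions).streaming = True

    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/events-sub"
            )
            | "Decode" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBQ" >> beam.io.WriteToBigQuery(
                table="my-project:analytics.events",
                schema="event_id:STRING,event_ts:TIMESTAMP,payload:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```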

Posted 1 month ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

JOB DESCRIPTION

• Strong in Python with libraries such as polars, pandas, numpy, scikit-learn, matplotlib, tensorflow, torch, transformers
• Must have: Deep understanding of modern recommendation systems including two-tower, multi-tower, and cross-encoder architectures
• Must have: Hands-on experience with deep learning for recommender systems using TensorFlow, Keras, or PyTorch
• Must have: Experience generating and using text and image embeddings (e.g., CLIP, ViT, BERT, Sentence Transformers) for content-based recommendations
• Must have: Experience with semantic similarity search and vector retrieval for matching user-item representations
• Must have: Proficiency in building embedding-based retrieval models, ANN search, and re-ranking strategies
• Must have: Strong understanding of user modeling, item representations, and temporal/contextual personalization
• Must have: Experience with Vertex AI for training, tuning, deployment, and pipeline orchestration
• Must have: Experience designing and deploying machine learning pipelines on Kubernetes (e.g., using Kubeflow Pipelines, Kubeflow on GKE, or custom Kubernetes orchestration)
• Should have experience with Vertex AI Matching Engine or deploying Qdrant, FAISS, or ScaNN on GCP for large-scale retrieval
• Should have experience working with Dataproc (Spark/PySpark) for feature extraction, large-scale data prep, and batch scoring
• Should have a strong grasp of cold-start problem solving using metadata and multi-modal embeddings
• Good to have: Familiarity with multi-modal retrieval models combining text, image, and tabular features
• Good to have: Experience building ranking models (e.g., XGBoost, LightGBM, DLRM) for candidate re-ranking
• Must have: Knowledge of recommender metrics (Recall@K, nDCG, HitRate, MAP) and offline evaluation frameworks
• Must have: Experience running A/B tests and interpreting results for model impact
• Should be familiar with real-time inference using Vertex AI, Cloud Run, or TF Serving
• Should understand feature store concepts, embedding versioning, and serving pipelines
• Good to have: Experience with streaming ingestion (Pub/Sub, Dataflow) for updating models or embeddings in near real time
• Good to have: Exposure to LLM-powered ranking or personalization, or hybrid recommender setups
• Must follow MLOps practices — version control, CI/CD, monitoring, and infrastructure automation

GCP Tools Experience:
ML & AI: Vertex AI, Vertex Pipelines, Vertex AI Matching Engine, Kubeflow on GKE, AI Platform
Embedding & Retrieval: Matching Engine, FAISS, ScaNN, Qdrant, GKE-hosted vector DBs (Milvus)
Storage: BigQuery, Cloud Storage, Firestore
Processing: Dataproc (PySpark), Dataflow (batch & stream)
Ingestion: Pub/Sub, Cloud Functions, Cloud Run
Serving: Vertex AI Online Prediction, TF Serving, Kubernetes-based custom APIs, Cloud Run
CI/CD & IaC: GitHub Actions, GitLab CI
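As a tiny illustration of the embedding-based retrieval and ANN search this role centres on, the sketch below builds a FAISS inner-product index over synthetic item vectors; in a real system the vectors would come from a two-tower model or a text/image encoder such as BERT or CLIP.

```python
# Minimal embedding-retrieval sketch with FAISS. The vectors are random
# stand-ins for embeddings produced by a trained encoder.
import faiss
import numpy as np

dim, n_items = 128, 10_000

# Synthetic item embeddings, L2-normalised so inner product equals cosine similarity.
item_vecs = np.random.rand(n_items, dim).astype("float32")
faiss.normalize_L2(item_vecs)

index = faiss.IndexFlatIP(dim)  # exact inner-product search; swap for IVF/HNSW at scale
index.add(item_vecs)

# A single synthetic user/query embedding.
query = np.random.rand(1, dim).astype("float32")
faiss.normalize_L2(query)

scores, item_ids = index.search(query, 10)  # top-10 candidates for re-ranking
print(item_ids[0], scores[0])
```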

Posted 1 month ago

Apply

15.0 years

0 Lacs

Greater Hyderabad Area

On-site

Compiler Lead – Hyderabad

Founded by highly respected Silicon Valley veterans, with design centers established in Santa Clara, California, as well as Hyderabad and Bangalore. A US-based, well-funded product startup is looking for highly talented engineers for the following role.

We are looking for a highly experienced systems engineer with deep expertise in compilers, machine learning infrastructure, and system-level performance optimization. This role is hands-on and research-driven, ideal for someone who thrives on solving low-level performance challenges and building core infrastructure that powers next-generation AI workloads.

Key Responsibilities:

Compiler Design & Optimization: Develop and enhance compiler toolchains based on LLVM, MLIR, Open64, or Glow. Build and optimize intermediate representations, custom dialects, and code generation flows for AI accelerators. Implement transformations and optimizations for latency, memory usage, and compute efficiency.

AI System Integration: Work closely with hardware teams to co-design compilers targeting custom silicon. Integrate compiler backends with ML frameworks like PyTorch, TensorFlow, or ONNX. Build graph-level and kernel-level transformations for AI training and inference pipelines.

Performance Tuning & System Analysis: Conduct low-level profiling and performance tuning across compiler and runtime layers. Identify and eliminate bottlenecks across CPU/GPU/NPU workloads. Develop parallel programming solutions leveraging SIMD, multi-threading, and heterogeneous computing.

Tooling & Infrastructure: Develop tooling for performance analysis, debug, and test automation. Contribute to internal SDKs and devkits used by AI researchers and system engineers.

Required Skills & Experience: Strong compiler development experience using LLVM, MLIR, Glow, or similar toolchains. Proficiency in C/C++, with solid command of Python for tooling and automation. In-depth understanding of compiler internals, including IR design, lowering, codegen, and scheduling. Deep knowledge of hardware-software co-design, particularly for AI/ML workloads. Experience with runtime systems, memory models, and performance modeling. Solid grasp of parallel and heterogeneous computing paradigms.

Nice to Have: Experience working with custom AI hardware or edge inference platforms. Familiarity with quantization, scheduling for dataflow architectures, or compiler autotuning. Contributions to open-source compiler projects (e.g., LLVM, MLIR, TVM).

Qualifications: Bachelor's or Master's degree in Computer Science, Electrical Engineering, or a related field. 8–15 years of relevant hands-on experience in compilers, systems programming, or AI infrastructure.

Contact: Uday Mulya Technologies, muday_bhaskar@yahoo.com ("Mining The Knowledge Community")

Posted 1 month ago

Apply

8.0 - 10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job description for GCP Developer

Extensive experience with Google data products (Cloud Data Fusion, BigQuery, Dataflow, Dataproc, AI Building Blocks, Looker, Dataprep, etc.). Expertise in Cloud Data Fusion, BigQuery and Dataproc. Experience in MDM, metadata management, data quality and data lineage tools. End-to-end (E2E) data engineering and lifecycle management (including non-functional requirements and operations). Experience with SQL and NoSQL modern data stores. E2E solution design skills: prototyping, usability testing and data visualization literacy. Excellent knowledge of the software development life cycle. Java full stack.

Experience: 8 to 10 years
Cloud: GCP, Dataflow, Dataproc, AI Building Blocks, Looker, Dataprep, Cloud Data Fusion, BigQuery
Database: SQL/NoSQL
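For the Dataproc element of this stack, a minimal sketch of submitting a PySpark job with the google-cloud-dataproc client could look like the following; the project, region, cluster name, and GCS file URI are placeholders.

```python
# Minimal sketch: submit a PySpark job to an existing Dataproc cluster.
# Project, region, cluster name, and the GCS file URI are placeholders.
from google.cloud import dataproc_v1


def submit_pyspark_job(project_id="my-project", region="us-central1", cluster="my-cluster"):
    client = dataproc_v1.JobControllerClient(
        client_options={"api_endpoint": f"{region}-dataproc.googleapis.com:443"}
    )
    job = {
        "placement": {"cluster_name": cluster},
        "pyspark_job": {"main_python_file_uri": "gs://example-bucket/jobs/daily_aggregate.py"},
    }
    operation = client.submit_job_as_operation(
        request={"project_id": project_id, "region": region, "job": job}
    )
    result = operation.result()  # blocks until the job finishes
    print(f"Job finished with state: {result.status.state.name}")


if __name__ == "__main__":
    submit_pyspark_job()
```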

Posted 1 month ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary: At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities: Strong skills in Python and GCP services, including Cloud Composer, BigQuery and Cloud Storage. Strong expertise in writing SQL and PL/SQL in Oracle, MySQL or any other relational database. Good to have: data warehousing and ETL (any tool). Proven experience in using GCP services is preferred. Strong presentation and communication skills. Analytical and problem-solving skills.

Mandatory Skill Sets: GCP Data Engineer
Preferred Skill Sets: GCP Data Engineer
Years of Experience Required: 4-8
Education Qualification: BTech/MBA/MCA

Education (if blank, degree and/or field of study not specified): Degrees/Field of Study required: Master of Business Administration, Bachelor of Engineering, Bachelor of Technology. Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified):
Required Skills: Data Engineering, GCP Dataflow
Optional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis {+ 12 more}
Desired Languages (if blank, desired languages not specified):
Travel Requirements:
Available for Work Visa Sponsorship?
Government Clearance Required?
Job Posting End Date:

Posted 1 month ago

Apply

5.0 years

0 Lacs

Karnataka, India

On-site

Who You'll Work With
You will be part of the Digital Design & Merchandising, Product Creation, Planning, and Manufacturing Technology team at Converse. You will take direction from and work primarily with the Demand and Supply team, supporting the business planning space. You'll work with a talented team of engineers, data architects, and business stakeholders to design and implement scalable data integration solutions on cloud-based platforms to support our planning organization. The successful candidate will be responsible for leading the integration of planning systems, processes, and data across the organization.

Who We Are Looking For
We're looking for a seasoned Cloud Integration Lead with expertise in Databricks, Apache Spark, and cloud-based data integration. You'll have a strong technical background, excellent collaboration skills, and a passion for delivering high-quality solutions.

The Ideal Candidate Will Have
5+ years of experience with Databricks, Apache Spark, and cloud-based data integration.
Strong technical expertise with cloud-based platforms, including AWS and/or Azure.
Strong programming skills in languages like SQL, Python, Java, or Scala.
3+ years of experience with cloud-based data infrastructure and integration, leveraging tools like S3, Airflow, EC2, AWS Glue, DynamoDB, Lambda, Athena, AWS CodeDeploy, Azure Data Factory, or Google Cloud Dataflow.
Experience with Jenkins and other CI/CD tools like GitLab CI/CD, CircleCI, etc.
Experience with containerization using Docker and Kubernetes.
Experience with infrastructure as code using tools like Terraform or CloudFormation.
Experience with Agile development methodologies and version control systems like Git.
Experience with IT service management tools like ServiceNow, JIRA, etc.
Data warehousing solutions, such as Amazon Redshift, Azure Synapse Analytics, or Google BigQuery, will be a plus but not mandatory.
Data science and machine learning concepts, including TensorFlow, PyTorch, or scikit-learn, will be a plus but not mandatory.
Strong technical background in computer science, software engineering, or a related field.
Excellent collaboration, communication, and interpersonal skills.
Experience with data governance, data quality, and data security principles.
Ability to lead and mentor junior team members.
AWS Certified Solutions Architect, AWS Certified Developer Associate, or Azure Certified Solutions Architect certification.

What You'll Work On
Design and implement scalable data integration solutions using Databricks, Apache Spark, and cloud-based platforms.
Develop and implement cloud-based data pipelines using Databricks, NiFi, AWS Glue, Azure Data Factory, or Google Cloud Dataflow.
Collaborate with cross-functional teams to deliver high-quality solutions that meet business requirements.
Develop and maintain technical standards, best practices, and documentation.
Integrate various data sources, including on-premises and cloud-based systems, applications, and databases.
Ensure data quality, integrity, and security throughout the integration process.
Collaborate with data engineering, data science, and business stakeholders to understand requirements and deliver solutions.
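Since the role above leans heavily on Databricks and Apache Spark, here is a minimal, purely illustrative PySpark batch transformation of the kind such a pipeline might contain; the storage paths and column names are invented for the example, not taken from the listing.

```python
# Minimal sketch: a PySpark batch transformation (illustrative paths/columns only).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("planning-demand-rollup").getOrCreate()

# Read raw demand data landed in cloud object storage as Parquet.
raw = spark.read.parquet("s3://example-bucket/raw/demand/")  # hypothetical location

# Clean and aggregate: drop rows without a SKU, then sum units by SKU and week.
rollup = (
    raw.filter(F.col("sku").isNotNull())
       .groupBy("sku", "week_start")
       .agg(F.sum("units").alias("total_units"))
)

# Write the curated table back out, partitioned for downstream planning queries.
rollup.write.mode("overwrite").partitionBy("week_start").parquet(
    "s3://example-bucket/curated/demand_weekly/"
)
```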

Posted 1 month ago

Apply

8.0 - 13.0 years

0 Lacs

Gurugram, Haryana, India

On-site

We are seeking an experienced Lead Data Engineer to join our dynamic team. As a Lead Data Engineer, you will be responsible for designing, developing, and maintaining data integration solutions for our clients. You will lead a team of engineers to ensure the delivery of high-quality, scalable, and performant data integration solutions. This is an exciting opportunity for a seasoned data integration professional who is passionate about technology and thrives in a fast-paced, dynamic environment.

Responsibilities
Design, develop, and maintain data integration solutions for clients
Lead a team of engineers to ensure the delivery of high-quality, scalable, and performant data integration solutions
Collaborate with cross-functional teams to understand business requirements and design data integration solutions that meet those requirements
Ensure data integration solutions are secure, reliable, and performant
Develop and maintain documentation, including technical specifications, data flow diagrams, and data mappings
Continuously learn and stay up-to-date with the latest data integration approaches and tools

Requirements
Bachelor's degree in Computer Science, Information Systems, or a related field
8-13 years of experience in data engineering, data integration, or a related field
Experience with cloud-native or Spark-based ETL tools such as AWS Glue, Azure Data Factory, or GCP Dataflow
Strong knowledge of SQL for querying and manipulating data
Experience with Snowflake for cloud data warehousing
Experience with at least one cloud platform such as AWS, Azure, or GCP
Experience leading a team of engineers on data integration projects
Good verbal and written communication skills in English at a B2 level

Nice to have
Experience with ETL using Python
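The requirements above pair strong SQL with Snowflake for warehousing. As a rough illustration only, with account, credentials, and table names as placeholders rather than details from the listing, running a query against Snowflake from Python might look like this:

```python
# Minimal sketch: running SQL against Snowflake from Python (placeholder credentials).
# Requires: pip install snowflake-connector-python
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",     # hypothetical account identifier
    user="example_user",
    password="example_password",   # in practice, use a secrets manager or key-pair auth
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    cur.execute(
        "SELECT customer_id, SUM(amount) AS total "
        "FROM orders GROUP BY customer_id ORDER BY total DESC LIMIT 10"
    )
    for customer_id, total in cur.fetchall():
        print(customer_id, total)
finally:
    conn.close()
```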

Posted 1 month ago

Apply

5.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Our organization is seeking a skilled Senior Data Engineer to become an integral part of our team. In this role, you will focus on projects related to data integration and ETL on cloud-based platforms. You will take charge of creating and executing sophisticated data solutions, ensuring data accuracy, dependability, and accessibility.

Responsibilities
Create and execute sophisticated data solutions on cloud-based platforms
Build ETL processes utilizing SQL, Python, and other applicable technologies
Maintain data accuracy, reliability, and accessibility for all stakeholders
Work with cross-functional teams to understand data integration needs and specifications
Produce and maintain documentation, including technical specifications, data flow diagrams, and data mappings
Enhance and tune data integration processes for optimal performance and efficiency, guaranteeing data accuracy and integrity

Requirements
Bachelor's degree in Computer Science, Electrical Engineering, or a related field
5-8 years of experience in data engineering
Proficiency in cloud-native or Spark-based ETL tools such as AWS Glue, Azure Data Factory, or GCP Dataflow
Strong knowledge of SQL for data querying and manipulation
Experience with Snowflake for data warehousing
Familiarity with cloud platforms like AWS, GCP, or Azure for data storage and processing
Excellent problem-solving skills and attention to detail
Good verbal and written communication skills in English at a B2 level

Nice to have
Background in ETL using Python
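The responsibilities above centre on building ETL processes with SQL and Python on cloud platforms. As a hedged illustration only, such a process is often orchestrated as an Airflow DAG like the sketch below; the DAG id, schedule, and stubbed helper functions are invented for the example.

```python
# Minimal sketch: a daily ETL DAG for Apache Airflow (illustrative names, stubbed steps).
# The `schedule` argument is the Airflow 2.4+ spelling; earlier 2.x uses schedule_interval.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**_):
    # Pull raw records from a source system (stubbed here).
    return [{"id": 1, "amount": 42.0}]


def transform(**_):
    # Apply business rules and cleansing (stubbed here).
    pass


def load(**_):
    # Write curated rows to the warehouse (stubbed here).
    pass


with DAG(
    dag_id="example_daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```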

Posted 1 month ago

Apply

8.0 - 13.0 years

0 Lacs

Bangalore Urban, Karnataka, India

On-site

We are looking for a skilled Lead Data Engineer to become an integral part of our vibrant team. In this role, you will take charge of designing, developing, and maintaining data integration solutions tailored to our clients' needs. You will oversee a team of engineers, ensuring the delivery of high-quality, scalable, and efficient data integration solutions. This role presents an exciting challenge for a seasoned data integration expert who is passionate about technology and excels in a fast-paced, dynamic setting.

Responsibilities
Design, develop, and maintain client-specific data integration solutions
Oversee a team of engineers to guarantee high-quality, scalable, and efficient delivery of data integration solutions
Work with cross-functional teams to understand business requirements and create suitable data integration solutions
Ensure the security, reliability, and efficiency of data integration solutions
Create and update documentation, including technical specifications, data flow diagrams, and data mappings
Stay informed and up-to-date with the latest data integration methods and tools

Requirements
Bachelor's degree in Computer Science, Information Systems, or a related field
8-13 years of experience in data engineering, data integration, or related fields
Proficiency in cloud-native or Spark-based ETL tools such as AWS Glue, Azure Data Factory, or GCP Dataflow
Strong knowledge of SQL for data querying and manipulation
Background in Snowflake for cloud data warehousing
Familiarity with at least one cloud platform such as AWS, Azure, or GCP
Experience in leading a team of engineers on data integration projects
Good verbal and written communication skills in English at a B2 level

Nice to have
Background in ETL using Python
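AWS Glue is named above among the Spark-based ETL tools. The following is only a rough sketch of the boilerplate a Glue PySpark job typically starts from; the catalog database, table, and output path are placeholders rather than details from the listing.

```python
# Minimal sketch: skeleton of an AWS Glue PySpark job (placeholder catalog/table names).
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a table registered in the Glue Data Catalog into a DynamicFrame.
orders = glue_context.create_dynamic_frame.from_catalog(
    database="example_db", table_name="orders"
)

# Convert to a Spark DataFrame for SQL-style filtering, then write as Parquet.
orders.toDF().filter("amount > 0").write.mode("overwrite").parquet(
    "s3://example-bucket/curated/orders/"
)

job.commit()
```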

Posted 1 month ago

Apply

8.0 years

0 Lacs

Kochi, Kerala, India

On-site

Role Description

Role Proficiency: Act creatively to develop applications by selecting appropriate technical options, optimizing application development, maintenance, and performance by employing design patterns and reusing proven solutions. Account for others' development activities; assist the Project Manager in day-to-day project execution.

Outcomes
Interpret application feature and component designs and develop them in accordance with specifications.
Code, debug, test, document, and communicate product component and feature development stages.
Validate results with user representatives; integrate and commission the overall solution.
Select and create appropriate technical options for development, such as reusing, improving, or reconfiguring existing components, while creating own solutions for new contexts; optimize efficiency, cost, and quality.
Influence and improve customer satisfaction.
Influence and improve employee engagement within the project teams.
Set FAST goals for self/team; provide feedback on the FAST goals of team members.

Measures of Outcomes
Adherence to engineering processes and standards (coding standards)
Adherence to project schedule/timelines
Number of technical issues uncovered during project execution
Number of defects in the code
Number of defects post delivery
Number of non-compliance issues
Percent of voluntary attrition
On-time completion of mandatory compliance trainings

Outputs Expected
Code: Code as per the design; define coding standards, templates, and checklists; review code for team and peers.
Documentation: Create/review templates, checklists, guidelines, and standards for design/process/development; create/review deliverable documents, design documentation, requirements, test cases, and results.
Configure: Define and govern the configuration management plan; ensure compliance from the team.
Test: Review/create unit test cases, scenarios, and execution; review the test plan created by the testing team; provide clarifications to the testing team.
Domain Relevance: Advise software developers on the design and development of features and components with a deeper understanding of the business problem being addressed for the client; learn more about the customer domain and identify opportunities to provide value addition to customers; complete relevant domain certifications.
Manage Project: Support the Project Manager with inputs for the projects; manage delivery of modules; manage complex user stories.
Manage Defects: Perform defect RCA and mitigation; identify defect trends and take proactive measures to improve quality.
Estimate: Create and provide input for effort and size estimation and plan resources for projects.
Manage Knowledge: Consume and contribute to project-related documents, SharePoint libraries, and client universities; review the reusable documents created by the team.
Release: Execute and monitor the release process.
Design: Contribute to the creation of design (HLD, LLD, SAD)/architecture for applications, features, business components, and data models.
Interface with Customer: Clarify requirements and provide guidance to the development team; present design options to customers; conduct product demos; work closely with customer architects on finalizing design.
Manage Team: Set FAST goals and provide feedback; understand the aspirations of team members and provide guidance, opportunities, etc.; ensure team members are upskilled; ensure the team is engaged in the project; proactively identify attrition risks and work with BSE on retention measures.
Certifications: Obtain relevant domain and technology certifications.

Skill Examples
Explain and communicate the design/development to the customer. Perform and evaluate test results against product specifications. Break down complex problems into logical components. Develop user interfaces and business software components. Use data models. Estimate the time, effort, and resources required for developing/debugging features/components. Perform and evaluate tests in the customer or target environments. Make quick decisions on technical/project-related challenges. Manage a team, mentor, and handle people-related issues within the team. Maintain high motivation levels and positive dynamics within the team. Interface with other teams, designers, and other parallel practices. Set goals for self and team; provide feedback for team members. Create and articulate impactful technical presentations. Follow a high level of business etiquette in emails and other business communication. Drive conference calls with customers and answer customer questions. Proactively ask for and offer help. Work under pressure, determine dependencies and risks, facilitate planning, and handle multiple tasks. Build confidence with customers by meeting deliverables on time with a quality product.

Knowledge Examples
Appropriate software programs/modules; functional and technical design; programming languages (proficient in multiple skill clusters); DBMS; operating systems and software platforms; Software Development Life Cycle; Agile (Scrum or Kanban) methods; integrated development environments (IDE); rapid application development (RAD); modelling technologies and languages; interface definition languages (IDL); broad knowledge of the customer domain and deep knowledge of the sub-domain where the problem is solved.

Additional Comments
We are seeking a highly skilled Lead Data Engineer with strong expertise in Google Cloud Platform (GCP), particularly in BigQuery, to join our data engineering team. The ideal candidate will lead the design, development, and optimization of data pipelines and solutions to support advanced analytics and business intelligence efforts.

Key Responsibilities:
Lead end-to-end data engineering projects on GCP, ensuring scalability, performance, and cost-efficiency.
Design and implement ETL/ELT pipelines using tools like Dataflow, BigQuery, Cloud Composer (Airflow), and Pub/Sub.
Collaborate with data architects, analysts, and business teams to translate requirements into robust data solutions.
Develop and optimize complex SQL queries and transformations in BigQuery.
Implement and enforce data governance, security, and best practices across pipelines and storage.
Monitor data pipelines for performance, reliability, and data quality.
Mentor junior engineers and provide technical leadership within the team.

Required Skills & Qualifications:
8+ years of experience in data engineering, with at least 3 years in GCP-based projects.
Expert-level experience with BigQuery, Cloud Storage, and Dataflow.
Strong proficiency in SQL, Python, and Apache Beam.
Experience with workflow orchestration using Cloud Composer (Apache Airflow).
Solid understanding of data warehousing concepts, data modeling, and performance tuning.
Knowledge of DevOps practices, CI/CD for data pipelines, and infrastructure-as-code (e.g., Terraform).
Excellent communication, leadership, and stakeholder management skills.

Skills: Google BigQuery, SQL, Python
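The Additional Comments above call for ETL/ELT pipelines written with Apache Beam in Python and run on Dataflow. A minimal, purely illustrative Beam pipeline is sketched below; the bucket, table, and schema are invented, and a real Dataflow run would also need project, region, and temp-location pipeline options.

```python
# Minimal sketch: an Apache Beam pipeline runnable locally or on Dataflow
# (bucket, table, and schema are hypothetical).
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(line: str) -> dict:
    """Turn one JSON line into the row shape expected by BigQuery."""
    event = json.loads(line)
    return {"user_id": event["user_id"], "amount": float(event["amount"])}


def run() -> None:
    # For Dataflow, options would include --runner=DataflowRunner, --project,
    # --region, and --temp_location; the defaults here use the local DirectRunner.
    options = PipelineOptions()
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadRaw" >> beam.io.ReadFromText("gs://example-bucket/raw/events-*.json")
            | "Parse" >> beam.Map(parse_event)
            | "WriteToBQ" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",
                schema="user_id:STRING,amount:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```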

Posted 1 month ago

Apply

15.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Role Description

Job Title: Google Cloud Platform, Kubernetes, Apigee Architect with React & Node.js
Location: Bangalore, Chennai, Kochi, Trivandrum, Hyderabad, Pune, Noida

Job Description
We are looking for a highly skilled Google Cloud Platform (GCP), Kubernetes, and Apigee Architect with expertise in React and Node.js. The ideal candidate will have a deep understanding of cloud architecture, security, and API management while demonstrating strong leadership and problem-solving capabilities.

Must-Have Skills
Experience: 15+ years of overall work experience, with at least 7 years of hands-on experience with cloud platforms.
Cloud Technologies: Google Cloud Platform (GCP), Compute Engine, Google Kubernetes Engine (GKE), App Engine, Cloud Functions, BigQuery, Dataflow, Dataproc, Spanner.
Security & Networking: IAM, SailPoint, Security Command Center, VPC, Cloud Load Balancing.
Infrastructure as Code: Terraform, Deployment Manager.
API Management: Apigee, API Gateway.
Development Frameworks: React.js, Node.js.
Agile & DevOps Methodologies: CI/CD pipelines, DevOps, Docker.

Key Responsibilities
Architect and implement scalable cloud solutions using Google Cloud Platform.
Design and optimize API management solutions using Apigee and API Gateway.
Develop and deploy containerized applications with Kubernetes (GKE).
Ensure cloud security and compliance best practices.
Drive DevOps adoption, automation, and continuous integration/deployment processes.
Collaborate with cross-functional teams for high-performance application development in React and Node.js.
Provide technical leadership and mentorship to development teams.
Troubleshoot performance and security issues and optimize cloud solutions.
Develop clear technical documentation for architecture, processes, and best practices.

Preferred Qualifications
Professional Cloud Architect certification preferred; candidates must be GCP certified or willing to get certified within 6 months of joining.
Strong problem-solving skills with the ability to work in collaborative environments.
Excellent communication skills to articulate technical solutions to both technical and non-technical audiences.

Skills: Google Cloud Platform, Kubernetes, Apigee, React / Node / C# .Net

Posted 1 month ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

The Opportunity
We are seeking a highly skilled and experienced Senior Data Engineer to join our dynamic data team in Gurgaon. The ideal candidate will have a strong background in designing, building, and maintaining robust, scalable, and efficient data pipelines and data warehousing solutions. You will play a crucial role in transforming raw data into actionable insights, enabling data-driven decision-making across the organization.

Responsibilities:
Data Pipeline Development: Design, develop, construct, test, and maintain highly scalable data pipelines using various ETL/ELT tools and programming languages (e.g., Python, Scala, Java).
Data Warehousing: Build and optimize data warehouse solutions (e.g., Snowflake, Redshift, BigQuery, Databricks) to support reporting, analytics, and machine learning initiatives.
Data Modeling: Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and design optimal data models (dimensional, relational, etc.).
Performance Optimization: Identify and implement solutions for data quality issues, data pipeline performance bottlenecks, and data governance challenges.
Cloud Technologies: Work extensively with cloud-based data platforms (AWS, Azure, GCP) and their respective data services (e.g., S3, EC2, Lambda, Glue, Data Factory, Azure Synapse, GCS, Dataflow, BigQuery).
Automation & Monitoring: Implement automation for data pipeline orchestration, monitoring, and alerting to ensure data reliability and availability.
Mentorship: Mentor junior data engineers, provide technical guidance, and contribute to best practices and architectural decisions within the data team.
Collaboration: Work closely with cross-functional teams, including product, engineering, and business intelligence, to deliver data solutions that meet business needs.
Documentation: Create and maintain comprehensive documentation for data pipelines, data models, and related data processes.

Qualifications:
Bachelor's or Master's degree in Computer Science, Engineering, Information Technology, or a related quantitative field.
5+ years of professional experience in data engineering, with a strong focus on building and optimizing data pipelines and data warehousing solutions.
Proficiency in at least one programming language commonly used in data engineering (e.g., Python, Scala, Java); Python is highly preferred.
Extensive experience with SQL and relational databases.
Demonstrated experience with cloud data platforms (AWS, Azure, or GCP) and their relevant data services.
Strong understanding of data warehousing concepts (e.g., Kimball methodology, OLAP, OLTP) and experience with data modeling techniques.
Experience with big data technologies (e.g., Apache Spark, Hadoop, Kafka).
Familiarity with version control systems (e.g., Git).

Preferred Skills:
Experience with specific data warehousing solutions like Snowflake, Redshift, or Google BigQuery.
Knowledge of containerization technologies (Docker, Kubernetes).
Experience with CI/CD pipelines for data solutions.
Familiarity with data visualization tools (e.g., Tableau, Power BI, Looker).
Understanding of machine learning concepts and how data engineering supports ML workflows.
Excellent problem-solving, analytical, and communication skills.
Ability to work independently and as part of a collaborative team in a fast-paced environment.
(ref:hirist.tech)
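Kafka appears among the big data technologies listed above. Purely as a hedged illustration (the topic, broker address, and message shape are assumptions, not details from the posting), a small Python consumer feeding a pipeline might look like this:

```python
# Minimal sketch: consuming JSON events from Kafka in Python (illustrative topic/broker).
# Requires: pip install kafka-python
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "orders",                                   # hypothetical topic name
    bootstrap_servers=["localhost:9092"],       # hypothetical broker address
    group_id="example-etl",
    auto_offset_reset="earliest",
    value_deserializer=lambda m: json.loads(m.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # A real pipeline would validate, enrich, and write these records to the
    # warehouse; here we simply print them.
    print(event.get("order_id"), event.get("amount"))
```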

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies