259 Data Flow Jobs - Page 8

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

5.0 - 10.0 years

8 - 18 Lacs

Noida

Remote

- Well versed in C++ multithreading programming
- Excellent programming skills using C++, Java, and C#
- Familiarity with the FIX protocol, market data distribution, and order handling is a plus
- Strong command of spoken and written English

Posted 4 weeks ago

Apply

2.0 - 7.0 years

4 - 9 Lacs

Coimbatore

Work from Office

Project Role: AI / ML Engineer
Project Role Description: Develops applications and systems that utilize AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. Must be able to apply GenAI models as part of the solution; the work may also include deep learning, neural networks, chatbots, and image processing.
Must have skills: Google Cloud Machine Learning Services
Good to have skills: GCP Dataflow, Google Pub/Sub, Google Dataproc
Minimum 2 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: We are seeking a skilled GCP Data Engineer to join our dynamic team. The ideal candidate will design, build, and maintain scalable data pipelines and solutions on Google Cloud Platform (GCP). This role requires expertise in cloud-based data engineering and hands-on experience with GCP tools and services, ensuring efficient data integration, transformation, and storage for various business use cases.

Roles & Responsibilities:
- Design, develop, and deploy data pipelines using GCP services such as Dataflow, BigQuery, Pub/Sub, and Cloud Storage.
- Optimize and monitor data workflows for performance, scalability, and reliability.
- Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and implement solutions.
- Implement data security and governance measures, ensuring compliance with industry standards.
- Automate data workflows and processes for operational efficiency.
- Troubleshoot and resolve technical issues related to data pipelines and platforms.
- Document technical designs, processes, and best practices to ensure maintainability and knowledge sharing.

Professional & Technical Skills:
Must have:
- Proficiency in GCP tools such as BigQuery, Dataflow, Pub/Sub, Cloud Composer, and Cloud Storage.
- Expertise in SQL and experience with data modeling and query optimization.
- Solid programming skills in Python for data processing and ETL development.
- Experience with CI/CD pipelines and version control systems (e.g., Git).
- Knowledge of data warehousing concepts, ELT/ETL processes, and real-time streaming.
- Strong understanding of data security, encryption, and IAM policies on GCP.
Good to have:
- Experience with Dialogflow or CCAI tools.
- Knowledge of machine learning pipelines and integration with AI/ML services on GCP.
- Certifications such as Google Professional Data Engineer or Google Cloud Architect.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Google Cloud Machine Learning Services and 3-5 years of overall experience.
- The ideal candidate will possess a strong educational background in computer science, mathematics, or a related field, along with a proven track record of delivering impactful data-driven solutions.

Qualification: 15 years full time education
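
For illustration, here is a minimal sketch of the kind of streaming pipeline this role describes: Pub/Sub into BigQuery with Apache Beam running on Dataflow. The project, subscription, bucket, and table names are placeholders, not details from the posting.

```python
# Hedged sketch: a streaming Beam pipeline (Pub/Sub -> parse -> BigQuery).
# All resource names below are hypothetical.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    options = PipelineOptions(
        streaming=True,
        runner="DataflowRunner",
        project="my-gcp-project",            # placeholder project id
        region="us-central1",
        temp_location="gs://my-bucket/tmp",  # placeholder bucket
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                subscription="projects/my-gcp-project/subscriptions/events-sub")
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBQ" >> beam.io.WriteToBigQuery(
                "my-gcp-project:analytics.events",
                schema="event_id:STRING,ts:TIMESTAMP,payload:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```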

Posted 1 month ago

Apply

1.0 - 4.0 years

3 - 7 Lacs

Kolkata

Work from Office

Project Role: Application Support Engineer
Project Role Description: Act as software detectives, providing a dynamic service that identifies and solves issues within multiple components of critical business systems.
Must have skills: SAP Ariba
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Support Engineer, you will act as a software detective, providing a dynamic service that identifies and solves issues within multiple components of critical business systems. Your typical day will involve collaborating with various teams to troubleshoot and resolve software-related challenges, ensuring that business operations run smoothly and efficiently. You will engage in problem-solving activities, analyze system performance, and contribute to the continuous improvement of application support processes, all while maintaining a focus on delivering exceptional service to stakeholders.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor system performance and proactively address potential issues.

Professional & Technical Skills:
- Must-have skills: Proficiency in SAP Ariba.
- Strong understanding of application support processes and methodologies.
- Experience with troubleshooting and resolving software issues.
- Familiarity with system integration and data flow management.
- Ability to work collaboratively in a team-oriented environment.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP Ariba.
- This position is based at our Kolkata office.
- A 15 years full time education is required.

Qualification: 15 years full time education

Posted 1 month ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must have skills: SAP BW/4HANA Data Modeling & Development
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Designer, you will assist in defining requirements and designing applications to meet business process and application requirements. A typical day involves collaborating with cross-functional teams to gather insights, analyzing user needs, and translating them into functional specifications. You will engage in discussions to refine application designs and ensure alignment with business objectives, while also participating in testing and validation processes to guarantee that the applications meet the defined requirements effectively.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with stakeholders to gather and analyze requirements for application design.
- Participate in the testing and validation of applications to ensure they meet business needs.

Professional & Technical Skills:
- Must-have skills: Proficiency in SAP BW/4HANA Data Modeling & Development.
- Good to have: SAP ABAP, CDS views.
- Strong understanding of data modeling concepts and best practices.
- Experience with application design methodologies and tools.
- Ability to analyze and interpret complex business requirements.
- Familiarity with integration techniques and data flow management.

Additional Information:
- The candidate should have a minimum of 3 years of experience in SAP BW/4HANA Data Modeling & Development.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualification: 15 years full time education

Posted 1 month ago

Apply

7.0 - 12.0 years

13 - 18 Lacs

Pune

Work from Office

Project Role: Data Architect
Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage, and integration.
Must have skills: Microsoft Power Business Intelligence (BI)
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Architect, you will define the data requirements and structure for the application, and model and design the application data structure, storage, and integration. You will play a crucial role in shaping the data architecture of the organization and ensuring seamless data flow.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead data architecture discussions and decisions.
- Develop data models and database designs.
- Implement data governance policies.

Professional & Technical Skills:
- Must-have skills: Proficiency in Microsoft Power Business Intelligence (BI).
- Strong understanding of data modeling and database design.
- Experience with ETL processes and tools.
- Knowledge of data integration and data warehousing concepts.
- Hands-on experience with SQL and database management.
- Good-to-have skills: Experience with data visualization tools.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Microsoft Power Business Intelligence (BI).
- This position is based at our Pune office.
- A 15 years full-time education is required.

Qualification: 15 years full time education

Posted 1 month ago

Apply

7.0 - 12.0 years

13 - 18 Lacs

Bengaluru

Work from Office

Project Role: Data Architect
Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage, and integration.
Must have skills: SAP HCM On Premise ABAP, SAP ABAP BOPF
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Architect, you will define the data requirements and structure for the application, and model and design the application data structure, storage, and integration. You will play a crucial role in shaping the data architecture of the project and ensuring seamless data flow.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead data governance initiatives to ensure data quality and integrity.
- Develop data models and database designs for efficient data storage.
- Implement data security measures to protect sensitive information.

Professional & Technical Skills:
- Must-have skills: Proficiency in SAP HCM On Premise ABAP and SAP ABAP BOPF.
- Strong understanding of data modeling and database design.
- Experience in data integration and ETL processes.
- Knowledge of data governance and data security best practices.
- Hands-on experience with SAP HANA database management.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in SAP HCM On Premise ABAP.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.

Qualification: 15 years full time education

Posted 1 month ago

Apply

5.0 - 9.0 years

14 - 19 Lacs

Chennai

Work from Office

Project description: We are seeking a highly skilled Senior Power BI Developer with strong expertise in Power BI, SQL Server, and data modeling to join our Business Intelligence team. In this role, you will lead the design and development of interactive dashboards, robust data models, and data pipelines that empower business stakeholders to make informed decisions. You will work collaboratively with cross-functional teams and drive the standardization and optimization of our BI architecture.

Responsibilities:
Power BI Dashboard Development (UI Dashboards)
- Design, develop, and maintain visually compelling, interactive Power BI dashboards aligned with business needs.
- Collaborate with business stakeholders to gather requirements, develop mockups, and refine dashboard UX.
- Implement advanced Power BI features like bookmarks, drill-throughs, dynamic tooltips, and DAX calculations.
- Conduct regular UX/UI audits and performance tuning on reports.
Data Modeling in SQL Server & Dataverse
- Build and manage scalable, efficient data models in Power BI, Dataverse, and SQL Server.
- Apply best practices in dimensional modeling (star/snowflake schema) to support analytical use cases.
- Ensure data consistency, accuracy, and alignment across multiple sources and business areas.
- Optimize models and queries for performance and load times.
Power BI Dataflows & ETL Pipelines
- Develop and maintain reusable Power BI Dataflows for centralized data transformations.
- Create ETL processes using Power Query, integrating data from diverse sources including SQL Server, Excel, APIs, and Dataverse.
- Automate data refresh schedules and monitor dependencies across datasets and reports.
- Ensure efficient data pipeline architecture for reuse, scalability, and maintenance.

Skills (must have):
- Experience: 6+ years in Business Intelligence or Data Analytics with a strong focus on Power BI and SQL Server.
- Technical skills: Expert-level Power BI development, including DAX, custom visuals, and report optimization. Strong knowledge of SQL (T-SQL) and relational database design. Experience with Dataverse and Power Platform integration. Proficiency in Power Query, Dataflows, and ETL development.
- Modeling: Proven experience in dimensional modeling, star/snowflake schema, and performance tuning.
- Data integration: Skilled in connecting and transforming data from various sources, including APIs, Excel, and cloud data services.
- Collaboration: Ability to work with stakeholders to define KPIs, business logic, and dashboard UX.
Nice to have: N/A
Other: Languages: English, C1 Advanced. Seniority: Senior.
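
As a rough illustration of the dimensional-modeling side of this role, the sketch below runs a star-schema aggregation (one fact table joined to two dimensions) against SQL Server from Python via pyodbc. The server, database, and table names are invented, not taken from the posting.

```python
# Hedged sketch: a star-schema rollup of the kind a Power BI model sits on.
import pyodbc

# Connection details are placeholders; substitute your own DSN/credentials.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=my-sql-server;DATABASE=sales_dw;Trusted_Connection=yes;"
)

STAR_QUERY = """
SELECT d.calendar_year, p.category, SUM(f.net_amount) AS revenue
FROM dbo.fact_sales AS f
JOIN dbo.dim_date AS d ON f.date_key = d.date_key
JOIN dbo.dim_product AS p ON f.product_key = p.product_key
GROUP BY d.calendar_year, p.category
ORDER BY d.calendar_year, revenue DESC;
"""

# pyodbc rows support attribute access by column name.
for row in conn.execute(STAR_QUERY):
    print(row.calendar_year, row.category, row.revenue)
conn.close()
```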

Posted 1 month ago

Apply

3.0 - 5.0 years

10 - 13 Lacs

Chennai

Work from Office

- 3+ years of experience in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.)
- 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.)
- 1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.)
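
The SQL skills called out above (CTEs and window functions) might look like the following when run through the BigQuery Python client; the project, dataset, and table names are placeholders.

```python
# Hedged sketch: latest order per customer via a CTE + ROW_NUMBER().
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # placeholder project

QUERY = """
WITH ranked AS (
  SELECT
    customer_id,
    order_ts,
    amount,
    ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY order_ts DESC) AS rn
  FROM `my-gcp-project.sales.orders`
)
SELECT customer_id, order_ts, amount
FROM ranked
WHERE rn = 1
"""

for row in client.query(QUERY).result():
    print(row.customer_id, row.order_ts, row.amount)
```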

Posted 1 month ago

Apply

5.0 - 7.0 years

15 - 17 Lacs

Chennai

Work from Office

- Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow
- Programming languages: Java; scripting languages like Python, Shell Script, SQL
- 5+ years of experience in IT application delivery with proven experience in agile development methodologies
- 1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, data processing with Dataflow)

Posted 1 month ago

Apply

5.0 - 10.0 years

15 - 20 Lacs

Chennai

Work from Office

- Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow
- Programming languages: Java; scripting languages like Python, Shell Script, SQL
- 5+ years of experience in IT application delivery with proven experience in agile development methodologies
- 1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, data processing with Dataflow)

Mandatory Key Skills: Google Cloud Platform, GCS, Dataproc, BigQuery, Dataflow, Composer, Data Processing, Java

Posted 1 month ago

Apply

3.0 - 8.0 years

10 - 18 Lacs

Guwahati

Work from Office

- Design and implement scalable data architectures to optimize data flow and analytics capabilities
- Develop ETL pipelines, data warehouses, and real-time data processing systems
- Must have expertise in SQL, Python, and cloud data platforms like AWS Redshift or Google BigQuery
- Work closely with data scientists to enhance machine learning models with structured and unstructured data
- Prior experience in handling large-scale datasets is preferred
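
As a toy illustration of the ETL work this listing describes, the sketch below extracts a CSV export, applies a transformation with pandas, and loads the result into BigQuery (one of the platforms named). The bucket, file, and table names are invented.

```python
# Hedged sketch: a minimal extract -> transform -> load flow.
import pandas as pd
from google.cloud import bigquery

# Extract: read an exported CSV from Cloud Storage (requires gcsfs);
# the path is a placeholder.
orders = pd.read_csv("gs://my-bucket/exports/orders.csv")

# Transform: coerce types and derive a net amount column.
orders["order_ts"] = pd.to_datetime(orders["order_ts"])
orders["net_amount"] = orders["gross_amount"] - orders["discount"]

# Load: write into a BigQuery staging table (requires pyarrow).
client = bigquery.Client(project="my-gcp-project")
job = client.load_table_from_dataframe(orders, "my-gcp-project.staging.orders")
job.result()  # block until the load job completes
```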

Posted 1 month ago

Apply

3.0 - 8.0 years

10 - 18 Lacs

Kochi

Work from Office

- Design and implement scalable data architectures to optimize data flow and analytics capabilities
- Develop ETL pipelines, data warehouses, and real-time data processing systems
- Must have expertise in SQL, Python, and cloud data platforms like AWS Redshift or Google BigQuery
- Work closely with data scientists to enhance machine learning models with structured and unstructured data
- Prior experience in handling large-scale datasets is preferred

Posted 1 month ago

Apply

3.0 - 8.0 years

10 - 18 Lacs

Kanpur

Work from Office

- Design and implement scalable data architectures to optimize data flow and analytics capabilities
- Develop ETL pipelines, data warehouses, and real-time data processing systems
- Must have expertise in SQL, Python, and cloud data platforms like AWS Redshift or Google BigQuery
- Work closely with data scientists to enhance machine learning models with structured and unstructured data
- Prior experience in handling large-scale datasets is preferred

Posted 1 month ago

Apply

5.0 - 8.0 years

17 - 20 Lacs

Kolkata

Work from Office

Key Responsibilities:
- Architect and implement scalable data solutions using GCP (BigQuery, Dataflow, Pub/Sub, Cloud Storage, Composer, etc.) and Snowflake.
- Lead the end-to-end data architecture, including ingestion, transformation, storage, governance, and consumption layers.
- Collaborate with business stakeholders, data scientists, and engineering teams to define and deliver the enterprise data strategy.
- Design robust data pipelines (batch and real-time) ensuring high data quality, security, and availability.
- Define and enforce data governance, data cataloging, and metadata management best practices.
- Evaluate and select appropriate tools and technologies to optimize data architecture and cost efficiency.
- Mentor junior architects and data engineers, guiding them on design best practices and technology standards.
- Collaborate with DevOps teams to ensure smooth CI/CD pipelines and infrastructure automation for data.

Skills & Qualifications:
- 3+ years of experience in data architecture, data engineering, or enterprise data platform roles.
- 3+ years of hands-on experience in Google Cloud Platform (especially BigQuery, Dataflow, Cloud Composer, Data Catalog).
- 3+ years of experience designing and implementing Snowflake-based data solutions.
- Deep understanding of modern data architecture principles (Data Lakehouse, ELT/ETL, Data Mesh, etc.).
- Proficient in Python, SQL, and orchestration tools like Airflow / Cloud Composer.
- Experience in data modeling (3NF, Star, Snowflake schemas) and designing data marts and warehouses.
- Strong understanding of data privacy, compliance (GDPR, HIPAA), and security principles in cloud environments.
- Familiarity with tools like dbt, Apache Beam, Looker, Tableau, or Power BI is a plus.
- Excellent communication and stakeholder management skills.
- GCP or Snowflake certification preferred (e.g., GCP Professional Data Engineer, SnowPro).

Qualifications:
- Experience working with hybrid or multi-cloud data strategies.
- Exposure to ML/AI pipelines and support for data science workflows.
- Prior experience in leading architecture reviews, PoCs, and technology roadmaps.
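
On the Snowflake side of this stack, a single ELT transformation pushed down to the warehouse might look like the hedged sketch below, using the Snowflake Python connector. Credentials and object names are placeholders, not details from the posting.

```python
import snowflake.connector

# Placeholder credentials; use a secrets manager in practice.
conn = snowflake.connector.connect(
    account="my_account",
    user="etl_user",
    password="...",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="MARTS",
)
cur = conn.cursor()
try:
    # One ELT-style transformation executed inside the warehouse.
    cur.execute("""
        CREATE OR REPLACE TABLE DAILY_SALES AS
        SELECT ORDER_DATE, SUM(AMOUNT) AS TOTAL_AMOUNT
        FROM RAW_DB.PUBLIC.ORDERS
        GROUP BY ORDER_DATE
    """)
finally:
    cur.close()
    conn.close()
```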

Posted 1 month ago

Apply

10.0 - 15.0 years

11 - 15 Lacs

Jhagadia

Work from Office

- Develop, implement, and maintain the organization's MIS to ensure accurate and real-time reporting of key business metrics.
- Oversee the preparation and distribution of daily, weekly, and monthly reports to various departments and senior management.
- Ensure data accuracy, integrity, and consistency across all reporting platforms.
- Design and maintain dashboards for business performance monitoring.
- Analyze data trends and provide insights to management for informed decision-making.
- Establish and maintain cost accounting systems and procedures for accurate tracking of material, labor, and overhead costs.
- Review and update cost standards, analyzing variances and taking corrective actions when necessary.
- Collaborate with other departments to monitor and control project costs, ensuring alignment with budget and financial goals.
- Perform cost analysis and prepare cost reports to monitor financial performance and support pricing decisions.
- Conduct regular audits to ensure compliance with costing policies and industry standards.
- Provide regular cost analysis reports, highlighting variances between actual and budgeted figures, and recommend corrective actions.
- Support financial forecasting and budgeting processes by providing relevant data and insights.
- Assist in month-end and year-end closing processes by ensuring accurate costing and reporting entries.
- Review profitability analysis reports and identify areas for cost optimization.

Posted 1 month ago

Apply

5.0 - 7.0 years

15 - 20 Lacs

Chennai

Work from Office

- Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow
- Programming languages: Java; scripting languages like Python, Shell Script, SQL
- 5+ years of experience in IT application delivery with proven experience in agile development methodologies
- 1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, data processing with Dataflow)

Mandatory Key Skills: agile development, Data Processing, Python, Shell Script, SQL, Google Cloud Platform, GCS, Dataproc, BigQuery, Dataflow

Posted 1 month ago

Apply

7.0 - 9.0 years

19 - 22 Lacs

Chennai

Work from Office

This role is for a software engineer with 7+ years of experience, data engineering knowledge, and the following skill set:
1. End-to-end full stack development
2. GCP services such as BigQuery, Astronomer, Terraform, Airflow, and Dataflow; GCP architecture
3. Python full stack, or Java with cloud

Mandatory Key Skills: Software Engineering, BigQuery, Terraform, Airflow, Dataflow, GCP Architecture, Java, Cloud, Data Engineering

Posted 1 month ago

Apply

4.0 - 6.0 years

7 - 8 Lacs

Gurugram

Work from Office

Job Title: Ab Initio Developer
Location: Gurugram
Experience: 4-5 years
Employment Type: Full Time

Job Summary: We are seeking an experienced Ab Initio Developer to design, develop, and maintain high-volume, enterprise-grade ETL solutions for our data warehouse environment. The ideal candidate will have strong technical expertise in Ab Initio components, SQL, and UNIX scripting, and the ability to work collaboratively with both business and technical teams to deliver robust data integration solutions.

Key Responsibilities:
- Analyze, design, implement, and maintain large-scale, multi-terabyte data warehouse ETL applications that operate 24/7 with high performance and reliability.
- Develop logical and physical data models to support data warehousing and business intelligence initiatives.
- Lead and participate in complex ETL development projects using Ab Initio, ensuring quality and efficiency.
- Translate business requirements into system and data flows, mappings, and transformation logic.
- Create detailed design documentation, including high-level design (HLD) and low-level design (LLD) specifications.
- Conduct design reviews, capture feedback, and facilitate additional sessions as required.
- Develop, test, and deploy ETL workflows using Ab Initio components such as Rollup, Scan, Join, Partition, Gather, Merge, Interleave, Lookup, etc.
- Perform SQL database programming and optimize SQL queries for performance.
- Develop and maintain UNIX shell scripts to automate ETL workflows and system processes.
- Collaborate with Release Management, Configuration Management, Quality Assurance, Architecture, Database Support, and other development teams.
- Ensure adherence to source control standards using EME or similar tools.
- Provide ongoing support and maintenance of ETL processes and troubleshoot issues as needed.

Required Skills & Qualifications:
- Hands-on development experience with Ab Initio components (Rollup, Scan, Join, Partition by Key, Round Robin, Gather, Merge, Interleave, Lookup, etc.)
- Strong background in designing and delivering complex, large-volume data warehouse applications
- Experience with source-code control tools such as EME
- Proficient in SQL database programming, including query optimization and performance tuning
- Good working knowledge of UNIX scripting and Oracle SQL/PL-SQL
- Strong technical expertise in preparing detailed design documents (HLD, LLD) and unit testing
- Ability to understand and communicate effectively with both business and technical stakeholders
- Strong problem-solving skills and attention to detail
- Ability to work independently as well as part of a team

Posted 1 month ago

Apply

4.0 - 6.0 years

7 - 8 Lacs

Gurugram

Work from Office

Job Title: Ab Initio Developer
Location: Gurugram
Experience: 4-5 years
Employment Type: Full Time

Job Summary: We are seeking an experienced Ab Initio Developer to design, develop, and maintain high-volume, enterprise-grade ETL solutions for our data warehouse environment. The ideal candidate will have strong technical expertise in Ab Initio components, SQL, and UNIX scripting, and the ability to work collaboratively with both business and technical teams to deliver robust data integration solutions.

Key Responsibilities:
- Analyze, design, implement, and maintain large-scale, multi-terabyte data warehouse ETL applications that operate 24/7 with high performance and reliability.
- Develop logical and physical data models to support data warehousing and business intelligence initiatives.
- Lead and participate in complex ETL development projects using Ab Initio, ensuring quality and efficiency.
- Translate business requirements into system and data flows, mappings, and transformation logic.
- Create detailed design documentation, including high-level design (HLD) and low-level design (LLD) specifications.
- Conduct design reviews, capture feedback, and facilitate additional sessions as required.
- Develop, test, and deploy ETL workflows using Ab Initio components such as Rollup, Scan, Join, Partition, Gather, Merge, Interleave, Lookup, etc.
- Perform SQL database programming and optimize SQL queries for performance.
- Develop and maintain UNIX shell scripts to automate ETL workflows and system processes.
- Collaborate with Release Management, Configuration Management, Quality Assurance, Architecture, Database Support, and other development teams.
- Ensure adherence to source control standards using EME or similar tools.
- Provide ongoing support and maintenance of ETL processes and troubleshoot issues as needed.

Skills and Qualifications:
- 4-5 years of experience in Ab Initio development.
- Ab Initio: Proficient in using Ab Initio tools such as the GDE and the Enterprise Metadata Environment (EME).
- ETL concepts: Understanding of ETL processes, data transformations, and data warehousing.
- SQL: Knowledge of SQL for data retrieval and manipulation.
- Unix/Linux shell scripting: Familiarity with Unix/Linux shell scripting for automation and scripting tasks.
- Problem-solving: Ability to identify and solve technical issues related to Ab Initio applications.

Posted 1 month ago

Apply

0.0 - 1.0 years

0 Lacs

New Delhi, Jammu

Work from Office

Teqtive IT Services Pvt Ltd is looking for a Graphics Design Intern to join our dynamic team and embark on a rewarding career journey.
- Collaborating with clients or team members to determine design requirements and project goals
- Developing and creating visual content
- Selecting and manipulating appropriate images, fonts, and other design elements to enhance the visual impact of designs
- Using graphic design software, such as Adobe Photoshop, Illustrator, and InDesign, to produce final designs
- Presenting design concepts and revisions to clients or team members
- Managing multiple projects and meeting tight deadlines
- Ensuring designs meet brand guidelines and quality standards

Posted 1 month ago

Apply

1.0 - 5.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Job Title: Data Engineer
Experience: 5-8 Years
Location: Delhi, Pune, Bangalore (Hyderabad & Chennai also acceptable)
Time Zone: Aligned with UK time zone
Notice Period: Immediate joiners only

Role Overview: We are seeking experienced Data Engineers to design, develop, and optimize large-scale data processing systems. You will play a key role in building scalable, efficient, and reliable data pipelines in a cloud-native environment, leveraging your expertise in GCP, BigQuery, Dataflow, Dataproc, and more.

Key Responsibilities:
- Design, build, and manage scalable and reliable data pipelines for real-time and batch processing.
- Implement robust data processing solutions using GCP services and open-source technologies.
- Create efficient data models and write high-performance analytics queries.
- Optimize pipelines for performance, scalability, and cost-efficiency.
- Collaborate with data scientists, analysts, and engineering teams to ensure smooth data integration and transformation.
- Maintain high data quality, enforce validation rules, and set up monitoring and alerting.
- Participate in code reviews and deployment activities, and provide production support.

Technical Skills Required:
- Cloud platforms: GCP (Google Cloud Platform) - mandatory
- Key GCP services: Dataproc, BigQuery, Dataflow
- Programming languages: Python, Java, PySpark
- Data engineering concepts: data ingestion, Change Data Capture (CDC), ETL/ELT pipeline design
- Strong understanding of distributed computing, data structures, and performance tuning

Required Qualifications & Attributes:
- 5-8 years of hands-on experience in data engineering roles
- Proficiency in building and optimizing distributed data pipelines
- Solid grasp of data governance and security best practices in cloud environments
- Strong analytical and problem-solving skills
- Effective verbal and written communication skills
- Proven ability to work independently and in cross-functional teams
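
A Dataproc-style PySpark batch job matching this stack, with a simple CDC-flavored de-duplication (keep the latest record per business key), might look like the following sketch; the paths, columns, and table names are invented.

```python
# Hedged sketch: CDC-style dedupe on GCS data, written to BigQuery.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("cdc-dedupe").getOrCreate()

# On Dataproc, the GCS connector makes gs:// paths available out of the box.
raw = spark.read.json("gs://my-bucket/raw/orders/*.json")  # placeholder path

# Keep only the latest change record per order_id, ordered by change timestamp.
w = Window.partitionBy("order_id").orderBy(F.col("change_ts").desc())
latest = (
    raw.withColumn("rn", F.row_number().over(w))
       .filter(F.col("rn") == 1)
       .drop("rn")
)

# Write the current snapshot via the spark-bigquery connector (bundled on Dataproc).
(latest.write.format("bigquery")
       .option("table", "my-project.sales.orders_current")
       .option("temporaryGcsBucket", "my-tmp-bucket")
       .mode("overwrite")
       .save())
```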

Posted 1 month ago

Apply

10.0 - 15.0 years

30 - 40 Lacs

Bhopal, Pune, Gurugram

Hybrid

Job Title: Senior Data Engineer - GCP | Big Data | Airflow | dbt
Company: Xebia
Location: All Xebia locations
Experience: 10+ Years
Employment Type: Full Time
Notice Period: Immediate to max 30 days only

Job Summary: Join the digital transformation journey of one of the world’s most iconic global retail brands! As a Senior Data Engineer, you’ll be part of a dynamic Digital Technology organization, helping build modern, scalable, and reliable data products to power business decisions across the Americas. You'll work in the Operations Data Domain, focused on ingesting, processing, and optimizing high-volume data pipelines using Google Cloud Platform (GCP) and other modern tools.

Key Responsibilities:
- Design, develop, and maintain highly scalable big data pipelines (batch & streaming)
- Collaborate with cross-functional teams to understand data needs and deliver efficient solutions
- Architect robust data solutions using GCP-native services (BigQuery, Pub/Sub, Cloud Functions, etc.)
- Build and manage modern Data Lake/Lakehouse platforms
- Create frameworks and reusable components for scalable ingestion and processing
- Implement data governance and security and ensure regulatory compliance
- Mentor junior engineers and lead an offshore team of 8+ engineers
- Monitor pipeline performance, troubleshoot bottlenecks, and ensure data quality
- Engage in code reviews, CI/CD deployments, and agile product releases
- Contribute to internal best practices and engineering standards

Must-Have Skills & Qualifications:
- 8+ years in data engineering with strong hands-on experience in production-grade pipelines
- Expertise in GCP data services: BigQuery, Vertex AI, Pub/Sub, etc.
- Proficiency in dbt (Data Build Tool) for data transformation
- Strong programming skills in Python, Java, or Scala
- Advanced SQL & NoSQL knowledge
- Experience with Apache Airflow for orchestration
- Hands-on with Git, GitHub Actions, and Jenkins for CI/CD
- Solid understanding of data warehousing (BigQuery, Snowflake, Redshift)
- Exposure to tools like Hadoop, Spark, Kafka, Databricks (nice to have)
- Familiarity with BI tools like Tableau, Power BI, or Looker (optional)
- Strong leadership qualities to manage offshore engineering teams
- Excellent communication skills and stakeholder management experience

Preferred Education: Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, or a related field

Notice Period Requirement: Only immediate joiners or candidates with a maximum 30-day notice period will be considered.

How to Apply: If you are passionate about solving real-world data problems and want to be part of a global data-driven transformation, apply now by sending your resume to vijay.s@xebia.com with the subject line: "Sr Data Engineer Application – [Your Name]". Kindly include the following details in your email:
- Full Name
- Total Experience
- Current CTC
- Expected CTC
- Current Location
- Preferred Location
- Notice Period / Last Working Day
- Key Skills

Please do not apply if you are currently in process with any other role at Xebia or have recently interviewed.
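
The dbt-plus-Airflow pattern this role centers on often reduces to a DAG like the hedged sketch below: an ingestion task followed by a dbt run. The DAG id, paths, schedule, and model selector are assumptions, not details from the posting.

```python
# Hedged sketch: Airflow 2.4+ DAG chaining ingestion and dbt transformation.
# (On older Airflow versions, use schedule_interval instead of schedule.)
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="ops_data_products",      # hypothetical DAG name
    schedule="@hourly",
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    ingest = BashOperator(
        task_id="ingest_raw",
        bash_command="python /opt/pipelines/ingest.py",  # placeholder script
    )
    transform = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/ops_project && dbt run --select marts",
    )
    ingest >> transform  # run dbt only after ingestion succeeds
```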

Posted 1 month ago

Apply

5.0 - 8.0 years

27 - 42 Lacs

Bengaluru

Work from Office

Years of Experience: 5-12 years
Location: PAN India

OFSAA Data Modeler:
- Experience in designing, building, and customizing the OFSAA data model, and in validating the data model.
- Excellent knowledge of data model guidelines for Staging, Processing, and Reporting tables.
- Knowledge of data model support for configuring UDPs and subtype/supertype relationship enhancements.
- Experience on the OFSAA platform (OFSAAI) with one or more of the following OFSAA modules:
  - OFSAA Financial Solution Data Foundation (preferred)
  - OFSAA Data Integrated Hub (optional)
- Good in SQL and PL/SQL.
- Strong in data warehouse principles and ETL/data flow tools.
- Should have excellent analytical and communication skills.

OFSAA Integration SME - DIH/batch run framework:
- Experience in the ETL process; familiar with OFSAA.
- DIH setup in EDS, EDD, T2T, etc.
- Familiar with different seeded tables, SCD, DIM, hierarchies, lookups, etc.
- Has worked with FSDF and knows the STG, CSA, and FACT table structures.
- Has worked with different APIs, out-of-the-box connectors, etc.
- Familiar with Oracle patching and SRs.

Posted 1 month ago

Apply

5.0 - 10.0 years

15 - 27 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Job Title: Data Engineer - GCP
Company: Xebia
Location: Hybrid - any Xebia location
Experience: 5+ Years
Salary: As per industry standards
Job Type: Full Time

About the Role: Xebia is hiring a seasoned Data Engineer (L4) to join a high-impact team building scalable data platforms using GCP, Databricks, and Airflow. If you thrive on architecting future-ready solutions and have strong experience in big data transformations, we’d love to hear from you.

Project Overview: We currently manage 1000+ data pipelines using Databricks clusters for end-to-end data transformation (Raw → Silver → Gold), with orchestration handled via Airflow, all on Google Cloud Platform (GCP). Curated datasets are delivered through BigQuery and Databricks Notebooks. Our roadmap includes migrating to a GCP-native data processing framework optimized for Spark workloads.

Key Responsibilities:
- Design and implement a GCP-native data processing framework
- Analyze and plan migration of existing workloads to a cloud-native architecture
- Ensure data availability, integrity, and consistency
- Build reusable tools and standards for the Data Engineering team
- Collaborate with stakeholders and document processes thoroughly

Required Experience:
- 5+ years in Data Engineering with strong data architecture experience
- Hands-on expertise in Databricks, Airflow, BigQuery, and PySpark
- Deep knowledge of GCP services for data processing (Dataflow, Dataproc, etc.)
- Familiarity with data lake table formats like Delta and Iceberg
- Experience with orchestration tools (Airflow, Dagster, or similar)

Key Skills:
- Python programming
- Strong understanding of data lake architectures and cloud-native best practices
- Excellent problem-solving and communication skills

Notice Period Requirement: Only immediate joiners or candidates with a maximum 30-day notice period will be considered.

How to Apply: Interested candidates can share their details and updated resume with vijay.s@xebia.com in the following format:
- Full Name
- Total Experience (must be 5+ years)
- Current CTC
- Expected CTC
- Current Location
- Preferred Location
- Notice Period / Last Working Day (if serving notice)
- Primary Skill Set

Note: Please apply only if you have not recently applied or interviewed for any open roles at Xebia.
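
One Raw-to-Silver hop in the medallion layout described above might look like this Databricks-style PySpark sketch writing Delta; the mount paths and column names are placeholders, not details from the posting.

```python
# Hedged sketch: Raw (bronze) -> Silver cleansing step on Delta tables.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("raw-to-silver").getOrCreate()

# Read the Raw Delta table (assumes Delta Lake is available, as on Databricks).
raw = spark.read.format("delta").load("/mnt/lake/raw/events")  # placeholder path

# Silver layer: basic filtering, de-duplication, and type derivation.
silver = (
    raw.filter(F.col("event_id").isNotNull())
       .dropDuplicates(["event_id"])
       .withColumn("event_date", F.to_date("event_ts"))
)

silver.write.format("delta").mode("overwrite").save("/mnt/lake/silver/events")
```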

Posted 1 month ago

Apply

1.0 - 3.0 years

10 - 15 Lacs

Kolkata, Gurugram, Bengaluru

Hybrid

Salary: 10 to 16 LPA
Experience: 1 to 3 years
Location: Gurgaon / Bangalore / Kolkata (Hybrid)
Notice: Immediate to 30 days
Key Skills: GCP, Cloud, Pub/Sub, Data Engineer

Posted 1 month ago

Apply