
41 Cloud Composer Jobs

JobPe aggregates these listings for easy access, but you apply directly on the original job portal.

5.0 - 10.0 years

5 - 10 Lacs

Bengaluru, Karnataka, India

On-site

Job Summary: We are seeking a talented GCP Data Engineer to join our team and help us design and implement robust data pipelines and analytics solutions on Google Cloud Platform (GCP). The ideal candidate will have strong expertise in BigQuery, Dataflow, Cloud Composer, and Dataproc, along with experience in AI/ML tools such as Google Vertex AI or Dialogflow.

Key Responsibilities:
- Design, develop, and maintain data pipelines and workflows using Dataflow, Cloud Composer, and Dataproc.
- Develop optimized queries and manage large-scale datasets using BigQuery.
- Collaborate with cross-functional teams to gather requirements and translate business needs into scalable data solutions.
- Implement best practices for data engineering, including version control, CI/CD pipelines, and data governance.
- Work on AI/ML use cases, leveraging Google Vertex AI or Dialogflow to create intelligent solutions.
- Perform data transformations, aggregations, and ETL processes to prepare data for analytics and reporting.
- Monitor and troubleshoot data workflows to ensure reliability, scalability, and performance.
- Document technical processes and provide guidance to junior team members.

Qualifications:
- Experience: 3-5+ years of professional experience in GCP data engineering or related fields.
- Skills: Proficiency in BigQuery, Dataflow, Cloud Composer, and Dataproc; exposure to Google Vertex AI, Dialogflow, or other AI/ML platforms.
- Strong programming skills in Python and SQL, and familiarity with Terraform for GCP infrastructure.
- Experience with distributed data processing frameworks such as Apache Spark is a plus.
- Knowledge of data security, governance, and best practices for cloud platforms.
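To illustrate the kind of work this posting describes, here is a minimal sketch of a Cloud Composer workflow: an Airflow DAG that runs a scheduled BigQuery transformation. It assumes Airflow 2.x with the Google provider package installed; the project, dataset, table, and schedule are hypothetical placeholders.

```python
# Illustrative sketch only: a daily BigQuery aggregation orchestrated by
# Cloud Composer (Airflow). Names and the query are made-up placeholders.
from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator
import pendulum

with DAG(
    dag_id="daily_sales_aggregation",          # hypothetical pipeline name
    schedule_interval="@daily",
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    catchup=False,
) as dag:
    aggregate_sales = BigQueryInsertJobOperator(
        task_id="aggregate_sales",
        configuration={
            "query": {
                "query": """
                    SELECT order_date, SUM(amount) AS total_amount
                    FROM `my-project.raw.orders`
                    GROUP BY order_date
                """,
                "destinationTable": {
                    "projectId": "my-project",
                    "datasetId": "analytics",
                    "tableId": "daily_sales",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )
```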

Posted 1 day ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

We are looking for a GCP Cloud Engineer for a position based in Pune. As a GCP Data Engineer, you will be responsible for designing, implementing, and optimizing data solutions on Google Cloud Platform. Your expertise in GCP services, solution design, and programming will be crucial for developing scalable and efficient cloud solutions.

Your key responsibilities will include designing and implementing GCP-based data solutions following best practices, developing workflows and pipelines using Cloud Composer and Apache Airflow, building and managing data processing clusters using Dataproc, working with GCP services like Cloud Functions, Cloud Run, and Cloud Storage, and integrating multiple data sources through ETL/ELT workflows. You will be expected to write clean, efficient, and scalable code in languages such as Python, Java, or similar, apply logical problem-solving skills to address business challenges, and collaborate with stakeholders to design end-to-end GCP solution architectures.

To be successful in this role, you should have hands-on experience with Dataproc, Cloud Composer, Cloud Functions, and Cloud Run, strong programming skills in Python, Java, or similar languages, a good understanding of GCP architecture, and experience in setting task dependencies in Airflow DAGs (see the sketch below). Logical and analytical thinking, strong communication, and documentation skills are also essential for cross-functional collaboration.

Preferred qualifications include GCP Professional Data Engineer or Architect certification, experience with data lake and data warehouse solutions on GCP (e.g., BigQuery, Dataflow), and familiarity with CI/CD pipelines for GCP-based deployments.
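The sketch below illustrates the "task dependencies in Airflow DAGs" item mentioned above. It assumes Airflow 2.x; the DAG id and task names are hypothetical placeholders.

```python
# Illustrative sketch only: declaring task dependencies in an Airflow DAG.
from airflow import DAG
from airflow.operators.empty import EmptyOperator
import pendulum

with DAG(
    dag_id="etl_with_dependencies",
    schedule_interval="@daily",
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    catchup=False,
) as dag:
    extract = EmptyOperator(task_id="extract")
    transform = EmptyOperator(task_id="transform")
    load_warehouse = EmptyOperator(task_id="load_warehouse")
    load_reporting = EmptyOperator(task_id="load_reporting")

    # extract runs first, then transform, then both load tasks run in parallel
    extract >> transform >> [load_warehouse, load_reporting]
```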

Posted 6 days ago

Apply

5.0 - 9.0 years

0 Lacs

Navi Mumbai, Maharashtra

On-site

As a GCP Data Engineer at our organization, you will be a key member of our growing data team. We are looking for a highly skilled and experienced individual who is passionate about data and has a strong track record of designing, building, and maintaining scalable data solutions on Google Cloud Platform (GCP). Your role will involve transforming raw data into actionable insights, enabling data-driven decision-making throughout the organization.

Your responsibilities will include designing, developing, implementing, and maintaining ETL/ELT data pipelines using various GCP services and programming languages. You will leverage Google BigQuery as a primary data warehouse, design optimal schemas, write efficient SQL queries, integrate data from diverse sources, and build, manage, and optimize ETL/ELT processes. Furthermore, you will design efficient data models in BigQuery, automate data workflows, ensure data quality and governance, optimize performance, collaborate with various teams, and ensure data security and compliance with regulations.

To be successful in this role, you should have 5-7 years of experience in data engineering with a focus on GCP. You must possess hands-on expertise with GCP services such as BigQuery, Dataflow, Cloud Storage, Cloud Composer, Cloud Functions, and Pub/Sub. Strong SQL skills, understanding of ETL/ELT concepts, data modeling experience, and familiarity with version control systems are essential. Problem-solving skills, excellent communication abilities, and a collaborative mindset are also required.

Preferred qualifications include a GCP Professional Data Engineer certification, experience with other cloud platforms, knowledge of Linux, familiarity with CI/CD pipelines and DevOps practices, proficiency in data visualization tools, and experience with data quality frameworks and observability tools.

This role presents an exciting opportunity to work on cutting-edge data solutions in a dynamic and innovative environment. If you are a dedicated and skilled GCP Data Engineer seeking to make a significant impact, we invite you to share your resume with us at navneet@sourcebae.com.
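As a small illustration of the BigQuery-centric work described above, here is a sketch of querying BigQuery from Python with the google-cloud-bigquery client. It assumes application default credentials are configured; the project, dataset, and table names are hypothetical placeholders.

```python
# Illustrative sketch only: run an analytical query against BigQuery from Python.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project id

query = """
    SELECT customer_id, SUM(amount) AS lifetime_value
    FROM `my-project.sales.orders`
    WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 90 DAY)
    GROUP BY customer_id
    ORDER BY lifetime_value DESC
    LIMIT 100
"""

# client.query() submits the job; result() waits and returns an iterable of rows
for row in client.query(query).result():
    print(row.customer_id, row.lifetime_value)
```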

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Data Engineer at Deutsche Bank in Pune, India, you will be responsible for developing and delivering engineering solutions to achieve business objectives. You are expected to have a strong grasp of essential engineering principles and possess root cause analysis skills to address enhancements and fixes in product reliability and resiliency. You should be capable of working independently on medium to large projects with strict deadlines and adapt to a cross-application, mixed technical environment. Your role involves hands-on development experience in ETL, Big Data, Hadoop, Spark, and GCP while following an agile methodology. Collaboration with a geographically dispersed team is essential in this role.

The position is part of the Compliance tech internal development team in India, focusing on delivering improvements in compliance tech capabilities to meet regulatory commitments and mandates. You will be involved in analyzing data sets, designing stable data ingestion workflows, and integrating them into existing workflows. Additionally, you will work closely with team members and stakeholders to provide ETL solutions, develop analytics algorithms, and handle data sourcing in Hadoop and GCP. Your responsibilities include unit testing, UAT deployment, end-user sign-off, and supporting production and release management teams.

To excel in this role, you should have over 10 years of coding experience in reputable organizations, proficiency in technologies such as Hadoop, Python, Spark, SQL, Unix, and Hive, as well as hands-on experience with Bitbucket and CI/CD pipelines. Knowledge of data security in on-prem and GCP environments, cloud services, and data quality dimensions is crucial. Experience in regulatory delivery environments, banking, test-driven development, and data visualization tools like Tableau would be advantageous.

At Deutsche Bank, you will receive support through training, coaching, and a culture of continuous learning to enhance your career progression. The company fosters a collaborative environment where employees are encouraged to act responsibly, think commercially, and take initiative. Together, we strive for excellence and celebrate the achievements of our diverse workforce. Deutsche Bank promotes a positive, fair, and inclusive work environment and welcomes applications from all individuals. For more information about Deutsche Bank and our values, please visit our company website: https://www.db.com/company/company.htm

Posted 1 week ago

Apply

7.0 - 12.0 years

0 Lacs

Pune, Maharashtra

On-site

As a GCP DBT Manager, your primary responsibility will be to collaborate with the team in designing, building, and maintaining data pipelines and transformations using Google Cloud Platform (GCP) and the Data Build Tool (dbt). This role will involve utilizing tools such as BigQuery, Cloud Composer, and Python, requiring strong SQL skills and knowledge of data warehousing concepts. Additionally, you will play a crucial role in ensuring data quality, optimizing performance, and working closely with cross-functional teams.

Your key responsibilities will include:
- Data Pipeline Development: Designing, building, and maintaining ETL/ELT pipelines using dbt and GCP services like BigQuery and Cloud Composer.
- Data Modeling: Creating and managing data models and transformations with dbt to ensure efficient and accurate data consumption for analytics and reporting.
- Data Quality: Developing and maintaining a data quality framework, including automated testing and cross-dataset validation.
- Performance Optimization: Writing and optimizing SQL queries to enhance data processing efficiency within BigQuery.
- Collaboration: Collaborating with data engineers, analysts, scientists, and business stakeholders to deliver effective data solutions.
- Incident Resolution: Providing support for day-to-day incident and ticket resolution related to data pipelines.
- Documentation: Creating and maintaining comprehensive documentation for data pipelines, configurations, and procedures.
- Cloud Platform Expertise: Leveraging GCP services like BigQuery, Cloud Composer, Cloud Functions, etc. for efficient data operations.
- Scripting: Developing and maintaining SQL/Python scripts for data ingestion, transformation, and automation tasks.

Preferred Candidate Profile:
- 7-12 years of experience in data engineering or a related field.
- Strong hands-on experience with Google Cloud Platform (GCP) services, particularly BigQuery.
- Proficiency in using dbt for data transformation, testing, and documentation.
- Advanced SQL skills for data modeling, performance optimization, and querying large datasets.
- Understanding of data warehousing concepts, dimensional modeling, and star schema design.
- Experience with ETL/ELT tools and frameworks, such as Apache Beam, Cloud Dataflow, Data Fusion, or Airflow/Composer.

In this role, you will be at the forefront of data pipeline development and maintenance, ensuring data quality, performance optimization, and effective collaboration across teams to deliver impactful data solutions using GCP and dbt.
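One common pattern for the dbt-on-GCP stack described above is to have Cloud Composer (Airflow) orchestrate dbt runs against BigQuery from the command line. The sketch below assumes Airflow 2.x with the dbt CLI installed in the environment; the DAG id, schedule, and project paths are hypothetical placeholders.

```python
# Illustrative sketch only: orchestrating dbt run/test from Cloud Composer.
from airflow import DAG
from airflow.operators.bash import BashOperator
import pendulum

DBT_DIR = "/home/airflow/gcs/dags/dbt"  # hypothetical dbt project location

with DAG(
    dag_id="dbt_bigquery_daily",
    schedule_interval="@daily",
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command=f"dbt run --project-dir {DBT_DIR} --profiles-dir {DBT_DIR}",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command=f"dbt test --project-dir {DBT_DIR} --profiles-dir {DBT_DIR}",
    )

    dbt_run >> dbt_test  # build models first, then run dbt's automated tests
```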

Posted 2 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

Pune, Maharashtra

On-site

One of our prestigious clients, a top MNC with a global presence, is currently seeking a Lead Enterprise Architect to join their team in Pune, Mumbai, or Bangalore.

**Qualifications and Certifications:**

**Education:**
- Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field.

**Experience:**
- A minimum of 10+ years of experience in data engineering, with at least 4 years of hands-on experience with GCP cloud platforms.
- Proven track record in designing and implementing data workflows using GCP services such as BigQuery, Dataform, Cloud Dataflow, Cloud Pub/Sub, and Cloud Composer.

**Certifications:**
- Google Cloud Professional Data Engineer certification is preferred.

**Key Skills:**

**Mandatory Skills:**
- Advanced proficiency in Python for developing data pipelines and automation.
- Strong SQL skills for querying, transforming, and analyzing large datasets.
- Hands-on experience with various GCP services including Cloud Storage, Dataflow, Cloud Pub/Sub, Cloud SQL, BigQuery, Dataform, Compute Engine, and Kubernetes Engine (GKE).
- Familiarity with CI/CD tools like Jenkins, GitHub, or Bitbucket.
- Proficiency in Docker, Kubernetes, Terraform, or Ansible for containerization, orchestration, and infrastructure as code (IaC).
- Knowledge of workflow orchestration tools such as Apache Airflow or Cloud Composer.
- Strong understanding of Agile/Scrum methodologies.

**Nice-to-Have Skills:**
- Experience with other cloud platforms like AWS or Azure.
- Familiarity with data visualization tools such as Power BI, Looker, or Tableau.
- Understanding of machine learning workflows and their integration with data pipelines.

**Soft Skills:**
- Strong problem-solving and critical-thinking abilities.
- Excellent communication skills to effectively collaborate with both technical and non-technical stakeholders.
- Proactive attitude towards innovation and continuous learning.
- Ability to work independently and as part of a collaborative team.

If you are interested in this opportunity, please reply with your updated CV and provide the following details:
- Total experience:
- Relevant experience in data engineering:
- Relevant experience in GCP cloud platforms:
- Relevant experience as an Enterprise Architect:
- Availability to join ASAP:
- Preferred location (Pune / Mumbai / Bangalore):

We will contact you once we receive your CV along with the above-mentioned details.

Thank you,
Kavita A.

Posted 2 weeks ago

Apply

7.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. We know what it takes to lead UPS into tomorrow: people with a unique combination of skill and passion. If you have the qualities and drive to lead yourself or teams, there are roles ready to cultivate your skills and take you to the next level.

Job Description:
We are looking for an experienced GCP Cloud/DevOps Engineer (and/or OpenShift) to design, implement, and manage cloud infrastructure and services across multiple environments. This role requires deep expertise in Google Cloud Platform (GCP) services, DevOps practices, and Infrastructure as Code (IaC). The candidate will deploy, automate, and maintain high-availability systems, and implement best practices for cloud architecture, security, and DevOps pipelines.

Requirements:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a similar field.
- Must have 7+ years of extensive experience in designing, implementing, and maintaining applications on GCP and OpenShift.
- Comprehensive expertise in GCP services such as GKE, Cloud Run, Cloud Functions, Cloud SQL, Firestore, Firebase, Apigee, App Engine, Gemini Code Assist, Vertex AI, Spanner, Memorystore, Service Mesh, and Cloud Monitoring.
- Solid understanding of cloud security best practices and experience in implementing security controls in GCP.
- Thorough understanding of cloud architecture principles and best practices.
- Experience with automation and configuration management tools like Terraform and a sound understanding of DevOps principles.
- Proven leadership skills and the ability to mentor and guide a technical team.

Key Responsibilities:
- Cloud Infrastructure Design and Deployment: Architect, design, and implement scalable, reliable, and secure solutions on GCP. Deploy and manage GCP services in both development and production environments, ensuring seamless integration with existing infrastructure. Implement and manage core services such as BigQuery, Data Fusion, Cloud Composer (Airflow), Cloud Storage, Compute Engine, App Engine, Cloud Functions, and more.
- Infrastructure as Code (IaC) and Automation: Develop and maintain infrastructure as code using Terraform or CLI scripts to automate provisioning and configuration of GCP resources. Establish and document best practices for IaC to ensure consistent and efficient deployments across environments.
- DevOps and CI/CD Pipeline Development: Create and manage DevOps pipelines for automated build, test, and release management, integrating with tools such as Jenkins, GitLab CI/CD, or equivalent. Work with development and operations teams to optimize deployment workflows, manage application dependencies, and improve delivery speed.
- Security and IAM Management: Handle user and service account management in Google Cloud IAM. Set up and manage Secret Manager and Cloud Key Management for secure storage of credentials and sensitive information. Implement network and data security best practices to ensure compliance and security of cloud resources.
- Performance Monitoring and Optimization: Set up observability tools like Prometheus and Grafana, and integrate security tools (e.g., SonarQube, Trivy). Set up monitoring and logging (e.g., Cloud Monitoring, Cloud Logging, Error Reporting) to ensure systems perform optimally. Troubleshoot and resolve issues related to cloud services and infrastructure as they arise.
- Networking and Storage: Configure DNS, networking, and persistent storage solutions in Kubernetes.
- Workflow Orchestration: Orchestrate complex workflows using the Argo Workflow Engine.
- Containerization: Work extensively with Docker for containerization and image management. Troubleshoot and optimize containerized applications for performance and security.

Technical Skills:
- Expertise with GCP and OpenShift (OCP) services, including but not limited to Compute Engine, Kubernetes Engine (GKE), BigQuery, Cloud Storage, Pub/Sub, Data Fusion, Airflow, Cloud Functions, and Cloud SQL.
- Proficiency in scripting languages like Python, Bash, or PowerShell for automation.
- Familiarity with DevOps tools and CI/CD processes (e.g., GitLab CI, Cloud Build, Azure DevOps, Jenkins).

Employee Type: Permanent

UPS is committed to providing a workplace free of discrimination, harassment, and retaliation.

Posted 2 weeks ago

Apply

6.0 - 10.0 years

1 - 1 Lacs

Chennai

Hybrid

Overview: TekWissen is a global workforce management provider operating throughout India and many other countries in the world. The client below is a global company with shared ideals and a deep sense of family. From our earliest days as a pioneer of modern transportation, we have sought to make the world a better place - one that benefits lives, communities, and the planet.

Job Title: Specialty Development Practitioner
Location: Chennai
Work Type: Hybrid

Position Description:
At the client's Credit Company, we are modernizing our enterprise data warehouse in Google Cloud to enhance data, analytics, and AI/ML capabilities, improve customer experience, ensure regulatory compliance, and boost operational efficiencies. As a GCP Data Engineer, you will integrate data from various sources into novel data products. You will build upon existing analytical data, including merging historical data from legacy platforms with data ingested from new platforms. You will also analyze and manipulate large datasets, activating data assets to enable enterprise platforms and analytics within GCP. You will design and implement the transformation and modernization on GCP, creating scalable data pipelines that land data from source applications, integrate into subject areas, and build data marts and products for analytics solutions. You will also conduct deep-dive analysis of Current State Receivables and Originations data in our data warehouse, performing impact analysis related to the client's Credit North America modernization and providing implementation solutions. Moreover, you will partner closely with our AI, data science, and product teams, developing creative solutions that build the future for the client's Credit business. Experience with large-scale solutions and operationalizing data warehouses, data lakes, and analytics platforms on Google Cloud Platform or other cloud environments is a must. We are looking for candidates with a broad set of analytical and technology skills across these areas who can demonstrate an ability to design the right solutions with the appropriate combination of GCP and third-party technologies for deployment on Google Cloud Platform.

Skills Required: BigQuery, Dataflow, Dataform, Data Fusion, Dataproc, Cloud Composer, Airflow, Cloud SQL, Compute Engine, Google Cloud Platform.

Experience Required:
- GCP Data Engineer certification.
- Successfully designed and implemented data warehouses and ETL processes for over five years, delivering high-quality data solutions.
- 5+ years of complex SQL development experience.
- 2+ years of experience with programming languages such as Python, Java, or Apache Beam.
- Experienced cloud engineer with 3+ years of GCP expertise, specializing in managing cloud infrastructure and applications in production-scale solutions.
- Additional tools and technologies: Terraform, Tekton, Postgres, PySpark, Python, APIs, Cloud Build, App Engine, Apache Kafka, Pub/Sub, AI/ML, Kubernetes.

Experience Preferred:
- In-depth understanding of GCP's underlying architecture and hands-on experience with crucial GCP services, especially those related to data processing (batch/real-time), leveraging Terraform, BigQuery, Dataflow, Pub/Sub, Dataform, Astronomer, Data Fusion, Dataproc, PySpark, Cloud Composer/Airflow, Cloud SQL, Compute Engine, Cloud Functions, Cloud Run, Cloud Build, and App Engine, alongside storage including Cloud Storage.
- DevOps tools such as Tekton, GitHub, Terraform, Docker.
- Expert in designing, optimizing, and troubleshooting complex data pipelines.
- Experience developing with microservice architecture from a container orchestration framework.
- Experience in designing pipelines and architectures for data processing.
- Passion and self-motivation to develop, experiment with, and implement state-of-the-art data engineering methods and techniques.
- Self-directed, works independently with minimal supervision, and adapts to ambiguous environments.
- Evidence of a proactive problem-solving mindset and willingness to take the initiative.
- Strong prioritization, collaboration, and coordination skills, and the ability to simplify and communicate complex ideas with cross-functional teams and all levels of management.
- Proven ability to juggle multiple responsibilities and competing demands while maintaining a high level of productivity.
- Data engineering or development experience gained in a regulated financial environment.
- Experience in coaching and mentoring data engineers.
- Project management tools such as Atlassian Jira.
- Experience working in an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment.
- Experience with data security, governance, and compliance best practices in the cloud.
- Experience with AI solutions or platforms that support AI solutions.
- Experience using data science concepts on production datasets to generate insights.

Experience Range: 5+ years
Education Required: Bachelor's Degree

TekWissen® Group is an equal opportunity employer supporting workforce diversity.
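As an illustration of the batch/real-time processing stack listed above, here is a minimal Apache Beam (Python SDK) streaming sketch that reads messages from Pub/Sub and writes them to BigQuery via the Dataflow runner. The project, bucket, topic, and table names are hypothetical placeholders, and the target table is assumed to already exist.

```python
# Illustrative sketch only: a streaming Beam pipeline (Pub/Sub -> BigQuery).
import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(
    streaming=True,
    project="my-project",
    region="us-central1",
    runner="DataflowRunner",
    temp_location="gs://my-bucket/temp",
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadFromPubSub" >> beam.io.ReadFromPubSub(topic="projects/my-project/topics/orders")
        | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            table="my-project:analytics.orders_raw",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
        )
    )
```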

Posted 3 weeks ago

Apply

1.0 - 5.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a skilled Data Engineer, you will leverage your expertise to contribute to the development of data modeling, ETL processes, and reporting systems. With over 3 years of hands-on experience in areas such as ETL, BigQuery, SQL, Python, or Alteryx, you will play a crucial role in enhancing data engineering processes. Your advanced knowledge of SQL programming and database management will be key in ensuring the efficiency of data operations.

In this role, you will utilize your solid experience with Business Intelligence reporting tools like Power BI, Qlik Sense, Looker, or Tableau to create insightful reports and analytics. Your understanding of data warehousing concepts and best practices will enable you to design robust data solutions. Your problem-solving skills and attention to detail will be instrumental in addressing data quality issues and proposing effective BI solutions.

Collaboration and communication are essential aspects of this role, as you will work closely with stakeholders to define requirements and develop data-driven insights. Your ability to work both independently and as part of a team will be crucial in ensuring the successful delivery of projects. Additionally, your proactive approach to learning new tools and techniques will help you stay ahead in a dynamic environment.

Preferred skills include experience with GCP cloud services, Python, Hive, Spark, Scala, JavaScript, and various BI/reporting tools. Your strong oral, written, and interpersonal communication skills will enable you to effectively convey insights and solutions to stakeholders. A Bachelor's degree in Computer Science, Computer Information Systems, or a related field is required for this role.

Overall, as a Data Engineer, you will play a vital role in developing and maintaining data pipelines, reporting systems, and dashboards. Your expertise in SQL, BI tools, and data validation will contribute to ensuring data accuracy and integrity across all systems. Your analytical mindset and ability to perform root cause analysis will be key in identifying opportunities for improvement and driving data-driven decision-making within the organization.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

It's fun to work in a company where people truly BELIEVE in what they are doing! We're committed to bringing passion and customer focus to the business.

Location: Open
Position: Data Engineer (GCP) - Technology

If you are an extraordinary developer who loves to push the boundaries to solve complex business problems using creative solutions, then we wish to talk with you. As an Analytics Technology Engineer, you will work on the Technology team that helps deliver our Data Engineering offerings at large scale to our Fortune clients worldwide. The role is responsible for innovating, building, and maintaining technology services.

Responsibilities:
- Be an integral part of large-scale client business development and delivery engagements.
- Develop the software and systems needed for end-to-end execution on large projects.
- Work across all phases of the SDLC, and use software engineering principles to build scaled solutions.
- Build the knowledge base required to deliver increasingly complex technology projects.
- Develop data pipelines (batch/streaming) and complex data transformations.
- ETL orchestration and data migration.
- Develop and maintain data warehouses / data lakes.

Qualifications & Experience:
- A bachelor's degree in Computer Science or a related field with 5 to 10 years of technology experience.

Desired Technical Skills:
- Data engineering and analytics on Google Cloud Platform: basic cloud computing concepts; BigQuery, Google Cloud Storage, Cloud SQL, Pub/Sub, Dataflow, Cloud Composer, GCP Data Transfer, gcloud CLI.
- Python, Google Cloud Python SDK, SQL.
- Experience in working with any NoSQL/columnar/MPP database.
- Experience in working with any ETL tool (Informatica, DataStage, Talend, Pentaho, etc.).
- Strong knowledge of database concepts, data modeling in RDBMS vs. NoSQL, OLTP vs. OLAP, MPP architecture.

Other Desired Skills:
- Excellent communication and coordination skills.
- Problem understanding, articulation, and solutioning.
- Quick learner and adaptable with regard to new technologies.
- Ability to research and solve technical issues.

Good to Have:
- Experience in working with Apache Spark / Kafka.
- Machine learning concepts.
- Google Cloud Professional Data Engineer certification.

If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!

Not the right fit? Let us know you're interested in a future opportunity by clicking "Introduce Yourself" in the top-right corner of the page, or create an account to set up email alerts as new job postings become available that meet your interest!
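To illustrate the Google Cloud Python SDK and Pub/Sub items in the desired skills above, here is a minimal sketch of publishing a message with google-cloud-pubsub. It assumes application default credentials; the project and topic names are hypothetical placeholders.

```python
# Illustrative sketch only: publish a JSON event to a Pub/Sub topic.
import json
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "clickstream-events")  # hypothetical names

event = {"user_id": "u-123", "action": "page_view", "page": "/pricing"}

# publish() returns a future; result() blocks until the server acknowledges
future = publisher.publish(topic_path, data=json.dumps(event).encode("utf-8"))
print("Published message id:", future.result())
```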

Posted 3 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Haryana

On-site

You will be working as a Technical Lead Data Engineer for a leading data and AI/ML solutions provider based in Gurgaon. In this role, you will be responsible for designing, developing, and leading complex data projects, primarily on Google Cloud Platform and other modern data stacks.

Your key responsibilities will include leading the design and implementation of robust data pipelines, collaborating with cross-functional teams to deliver end-to-end data solutions, owning project modules, developing technical roadmaps, and implementing data governance frameworks on GCP. You will be required to integrate GCP data services like BigQuery, Dataflow, Dataproc, Cloud Composer, Vertex AI Studio, and GenAI with platforms such as Snowflake. Additionally, you will write efficient code in Python, SQL, and ETL/orchestration tools, utilize containerized solutions for scalable deployments, and apply expertise in PySpark, Kafka, and advanced data querying for high-volume data environments. Monitoring, optimizing, and troubleshooting system performance, reducing job run-times through architecture optimization, developing data warehouses, and mentoring team members will also be part of your role.

To be successful in this position, you should have a Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Extensive hands-on experience with Google Cloud Platform data services, Snowflake integration, strong programming skills in Python and SQL, proficiency in PySpark, Kafka, and data querying tools, and experience with containerized solutions using Google Kubernetes Engine are essential. Strong communication and documentation skills, experience with large distributed datasets, and the ability to balance short-term deliverables with long-term technical sustainability are also required. Prior leadership experience in data engineering teams and exposure to cloud data platforms are desirable.

This role offers you the opportunity to lead high-impact data projects for reputed clients in a fast-growing data consulting environment, work with cutting-edge technologies, and collaborate in an innovative and growth-oriented culture.

Posted 4 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Punjab

On-site

As a GCP Data Engineer in Australia, you will be responsible for leveraging your experience in Google Cloud Platform (GCP) to handle various aspects of data engineering. Your role will involve working on data migration projects from legacy systems such as SQL and Oracle, and designing and building ETL pipelines for data lake and data warehouse solutions on GCP.

In this position, your expertise in GCP data and analytics services will be crucial. You will work with tools like Cloud Dataflow, Cloud Dataprep, Apache Beam/Cloud Composer, BigQuery, Cloud Data Fusion, Cloud Pub/Sub, Cloud Storage, and Cloud Functions. Additionally, you will utilize the cloud-native GCP CLI/gsutil for operations, and scripting languages like Python and SQL to enhance data processing efficiencies.

Furthermore, your experience with data governance practices, metadata management, data masking, and encryption will be essential. You will utilize GCP tools such as Data Catalog and Cloud KMS to ensure data security and compliance. Overall, this role requires a strong foundation in GCP technologies and a proactive approach to data engineering challenges in a dynamic environment.

Posted 4 weeks ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

Hyderabad

Work from Office

Greetings from TechnoGen!

Thank you for taking the time to tell us about your competencies and skills, and for allowing us an opportunity to tell you about TechnoGen; we understand that your experience and expertise are relevant to the current opening with our clients.

About TechnoGen:
LinkedIn: https://www.linkedin.com/company/technogeninc/about/
TechnoGen, Inc. is an ISO 9001:2015, ISO 20000-1:2011, ISO 27001:2013, and CMMI Level 3 global IT services company headquartered in Chantilly, Virginia. TechnoGen, Inc. (TGI) is a Minority & Women-Owned Small Business with over 20 years of experience providing end-to-end IT services and solutions to the public and private sectors. TGI provides highly skilled and certified professionals and has successfully executed more than 345 projects. TechnoGen is committed to helping our clients solve complex problems and achieve their goals, on time and under budget.

Please share the below details for further processing of your profile:
- Total years of experience:
- Relevant years of experience:
- CTC (including variable):
- ECTC:
- Notice period:
- Reason for change:
- Current location:

Job Title: GCP Data Engineer
Required Experience: 5+ years
Work Mode: WFO - 4 days from office
Shift Time: UK shift, 12:00 PM IST to 09:00 PM IST
Location: Hyderabad

Job Summary:
As a GCP Data Engineer, we need someone with strong experience in SQL and Python. The ideal candidate should have hands-on expertise in Google Cloud Platform (GCP) services, especially BigQuery, Composer, and the Airflow framework, and a solid understanding of data engineering best practices. You will work closely with our internal teams and technology partners to deliver comprehensive and scalable marketing data and analytics solutions. This role offers the unique opportunity to engage with many technology platforms in a rapidly evolving marketing technology landscape.

Key Responsibilities:
- Technical oversight and team management of the developers, coordination with US-based Mattel resources, and estimation of work.
- Strong knowledge of cloud computing platforms - Google Cloud.
- Expertise in MySQL & SQL/PL.
- Good experience in IICS.
- Experience in ETL; Ascend.io is an added advantage.
- GCP & BigQuery knowledge is a must; GCP certification is an added advantage.
- Good experience in Google Cloud Storage (GCS), Cloud Composer, DAGs, and Airflow.
- REST API development experience.
- Strong analytical and problem-solving skills and efficient communication.
- Experience in designing, implementing, and managing various ETL job execution flows.
- Utilize Git for source version control.
- Set up and maintain CI/CD pipelines.
- Troubleshoot, debug, and upgrade existing applications and ETL job chains.
- Comprehensive data analysis across complex data sets.
- Ability to collaborate effectively across technical development teams and business departments.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 5+ years of experience in data engineering or related roles.
- Strong understanding of Google Cloud Platform and associated tools.
- Proven experience in delivering consumer marketing data and analytics solutions for enterprise clients.
- Strong knowledge of data management, ETL processes, data warehousing, and analytics platforms.
- Experience with SQL and NoSQL databases.
- Proficiency in the Python programming language.
- Hands-on experience with data warehousing solutions.
- Knowledge of marketing analytics tools and technologies, including but not limited to Google Analytics, BlueConic, Klaviyo, etc.
- Knowledge of performance marketing concepts such as targeting & segmentation, real-time optimization, A/B testing, attribution modeling, etc.
- Excellent communication skills with a track record of collaboration across multiple teams.
- Strong collaboration skills and a team-oriented mindset.
- Strong problem-solving skills, adaptability, and the ability to thrive in a dynamic and rapidly changing environment.
- Experience working in Agile development environments.

Best Regards,
Syam M | Sr. IT Recruiter
syambabu.m@technogenindia.com
www.technogenindia.com | Follow us on LinkedIn
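To illustrate the "REST API development" and Composer/DAG items listed above, here is a minimal sketch of triggering a DAG run through the Airflow stable REST API. It assumes an Airflow 2.x webserver reachable over HTTPS with basic auth enabled (a Cloud Composer environment would instead require a Google-issued OAuth token); the URL, DAG id, and credentials are hypothetical placeholders.

```python
# Illustrative sketch only: trigger an Airflow DAG run via the stable REST API.
import requests

AIRFLOW_URL = "https://airflow.example.com"   # hypothetical webserver endpoint
DAG_ID = "daily_marketing_load"               # hypothetical DAG id

response = requests.post(
    f"{AIRFLOW_URL}/api/v1/dags/{DAG_ID}/dagRuns",
    json={"conf": {"run_mode": "full_refresh"}},   # parameters passed to the DAG run
    auth=("api_user", "api_password"),             # placeholder credentials
    timeout=30,
)
response.raise_for_status()
print("Triggered run:", response.json()["dag_run_id"])
```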

Posted 4 weeks ago

Apply

1.0 - 5.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You should have at least 3 years of hands-on experience in data modeling, ETL processes, developing reporting systems, and data engineering using tools such as ETL, BigQuery, SQL, Python, or Alteryx, along with advanced knowledge of SQL programming and database management.

Moreover, you must have a minimum of 3 years of solid experience working with Business Intelligence reporting tools like Power BI, Qlik Sense, Looker, or Tableau, along with a good understanding of data warehousing concepts and best practices. Excellent problem-solving and analytical skills are essential for this role, as well as being detail-oriented with strong communication and collaboration skills. The ability to work both independently and as part of a team is crucial for success in this position.

Preferred skills include experience with GCP cloud services such as BigQuery, Cloud Composer, Dataflow, Cloud SQL, Looker, LookML, Data Studio, and Qlik Sense. Strong SQL skills and proficiency in various BI/reporting tools to build self-serve reports, analytic dashboards, and ad-hoc packages leveraging enterprise data warehouses are also desired. Moreover, at least 1 year of experience in Python and Hive/Spark/Scala/JavaScript is preferred.

Additionally, you should have a solid understanding of consuming data models, developing SQL, addressing data quality issues, proposing BI solution architecture, and articulating best practices in end-user visualizations, along with development delivery experience. Furthermore, it is important to have a good grasp of BI tools, architectures, and visualization solutions, coupled with an inquisitive and proactive approach to learning new tools and techniques. Strong oral, written, and interpersonal communication skills are necessary, and you should be comfortable working in a dynamic environment where problems are not always well-defined.

Posted 4 weeks ago

Apply

8.0 - 13.0 years

0 Lacs

Hyderabad, Telangana

On-site

You are an experienced GCP Data Engineer with 8+ years of expertise in designing and implementing robust, scalable data architectures on Google Cloud Platform. Your role involves defining and leading the implementation of data architecture strategies using GCP services to meet business and technical requirements.

As a visionary GCP Data Architect, you will be responsible for architecting and optimizing scalable data pipelines using Google Cloud Storage, BigQuery, Dataflow, Cloud Composer, Dataproc, and Pub/Sub. You will design solutions for large-scale batch processing and real-time streaming, leveraging tools like Dataproc for distributed data processing. Your responsibilities also include establishing and enforcing data governance, security frameworks, and best practices for data management. You will conduct architectural reviews and performance tuning for GCP-based data solutions, ensuring cost-efficiency and scalability. Collaborating with cross-functional teams, you will translate business needs into technical requirements and deliver innovative data solutions.

The required skills for this role include strong expertise in GCP services such as Google Cloud Storage, BigQuery, Dataflow, Cloud Composer, Dataproc, and Pub/Sub. Proficiency in designing and implementing data processing frameworks for ETL/ELT, batch, and real-time workloads is essential. You should have an in-depth understanding of data modeling, data warehousing, and distributed data processing using tools like Dataproc and Spark. Hands-on experience with Python, SQL, and modern data engineering practices is required. Your knowledge of data governance, security, and compliance best practices on GCP will be crucial in this role. Strong problem-solving, leadership, and communication skills are necessary for guiding teams and engaging stakeholders effectively.

Posted 4 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

At PwC, the focus in data and analytics is on leveraging data to drive insights and make informed business decisions. Utilizing advanced analytics techniques to help clients optimize their operations and achieve strategic goals is key. In data analysis at PwC, the emphasis is on utilizing advanced analytical techniques to extract insights from large datasets and drive data-driven decision-making. Skills in data manipulation, visualization, and statistical modeling play a crucial role in supporting clients in solving complex business problems.

Candidates with 4+ years of hands-on experience are sought for the position of Senior Associate in supply chain analytics. Successful candidates should possess proven expertise in supply chain analytics across domains such as demand forecasting, inventory optimization, logistics, segmentation, and network design. Additionally, hands-on experience with optimization methods such as linear programming, mixed integer programming, and scheduling optimization is required (a small illustration follows below). Proficiency in forecasting and machine learning techniques, along with a strong command of statistical modeling, testing, and inference, is essential. Familiarity with GCP tools like BigQuery, Vertex AI, Dataflow, and Looker is also necessary.

Required skills include building data pipelines and models for forecasting, optimization, and scenario planning; strong SQL and Python programming skills; experience deploying models in a GCP environment; and knowledge of orchestration tools like Cloud Composer (Airflow). Nice-to-have skills consist of familiarity with MLOps, containerization (Docker, Kubernetes), and orchestration tools, as well as strong communication and stakeholder engagement skills at the executive level.

The roles and responsibilities of the Senior Associate involve assisting analytics projects within the supply chain domain and driving the design, development, and delivery of data science solutions. They are expected to interact with and advise consultants/clients as subject matter experts, conduct analysis using advanced analytics tools, and implement quality control measures to ensure deliverable integrity. Validating analysis outcomes, making presentations, and contributing to knowledge and firm-building activities are also part of the role. The ideal candidate should hold a degree in BE / B.Tech / MCA / M.Sc / M.E / M.Tech / Master's Degree / MBA from a reputed institute.
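As a small, hedged illustration of the linear programming skill referenced above, here is a tiny production-mix example solved with SciPy. The products, profits, and capacities are made-up numbers for demonstration only.

```python
# Illustrative sketch only: maximize profit 40*x1 + 30*x2 subject to capacity limits.
from scipy.optimize import linprog

# linprog minimizes, so negate the objective to maximize profit.
c = [-40, -30]

# Constraints:
#   machine hours: 2*x1 + 1*x2 <= 100
#   labour hours:  1*x1 + 1*x2 <= 80
A_ub = [[2, 1],
        [1, 1]]
b_ub = [100, 80]

result = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)], method="highs")
print("Optimal units:", result.x)    # -> [20. 60.]
print("Max profit:", -result.fun)    # objective value, sign flipped back (2600.0)
```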

Posted 1 month ago

Apply

7.0 - 12.0 years

22 - 30 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Role & responsibilities:
- BigQuery for building and optimizing data warehouses.
- Implement both batch and real-time (streaming) data processing solutions using Java.
- Cloud Composer (Airflow) for workflow orchestration and pipeline management.
- Dataproc for managing Apache Spark jobs in the cloud.
- Google Cloud Storage (GCS) for data storage and management.

Posted 1 month ago

Apply

10.0 - 14.0 years

0 Lacs

Karnataka

On-site

As a Data Engineer (ETL, Big Data, Hadoop, Spark, GCP) at Assistant Vice President level, located in Pune, India, you will be responsible for developing and delivering engineering solutions to achieve business objectives. You are expected to have a strong understanding of crucial engineering principles within the bank, and to be skilled in root cause analysis while addressing enhancements and fixes in product reliability and resiliency. Working independently on medium to large projects with strict deadlines, you will collaborate in a cross-application technical environment, demonstrating a solid hands-on development track record within an agile methodology. Furthermore, this role involves collaborating with a globally dispersed team and is integral to the development of the Compliance tech internal team in India, delivering enhancements in compliance tech capabilities to meet regulatory commitments.

Your key responsibilities will include analyzing data sets, designing and coding stable and scalable data ingestion workflows, integrating them with existing workflows, and developing analytics algorithms on ingested data. You will also be working on data sourcing in Hadoop and GCP, owning unit testing, UAT deployment, end-user sign-off, and production go-live. Root cause analysis skills will be essential for identifying bugs and issues, and for supporting production support and release management teams. You will operate in an agile scrum team and ensure that new code is thoroughly tested at both unit and system levels.

To excel in this role, you should have over 10 years of coding experience with reputable organizations, hands-on experience with Bitbucket and CI/CD pipelines, and proficiency in Hadoop, Python, Spark, SQL, Unix, and Hive. A basic understanding of on-prem and GCP data security, as well as hands-on development experience with large ETL/big data systems (with GCP experience being a plus), are required. Familiarity with cloud services such as Cloud Build, Artifact Registry, Cloud DNS, and Cloud Load Balancing, along with Dataflow, Cloud Composer, Cloud Storage, and Dataproc, is essential. Additionally, knowledge of data quality dimensions and data visualization is beneficial.

You will receive comprehensive support, including training and development opportunities, coaching from experts in your team, and a culture of continuous learning to facilitate your career progression. The company fosters a collaborative and inclusive work environment, empowering employees to excel together every day. As part of Deutsche Bank Group, we encourage applications from all individuals and promote a positive and fair workplace culture. For further details about our company and teams, please visit our website: https://www.db.com/company/company.htm

Posted 1 month ago

Apply

0.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Ready to shape the future of work? At Genpact, we don't just adapt to change - we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Our industry-first accelerator is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. Our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us on our website and social channels.

Inviting applications for the role of Principal Consultant - GCP Sr. Data Engineer.

We are seeking a highly accomplished and strategic Google Cloud Data Engineer with deep experience in data engineering and a significant, demonstrable focus on the Google Cloud Platform (GCP). In this leadership role, you will be instrumental in defining and driving our overall data strategy on GCP, architecting transformative data solutions, and providing expert guidance to engineering teams. You will be a thought leader in leveraging GCP's advanced data services to solve complex business challenges, optimize our data infrastructure at scale, and foster a culture of data excellence.

Responsibilities:
- Define and champion the strategic direction for our data architecture and infrastructure on Google Cloud Platform, aligning with business objectives and future growth.
- Architect and oversee the development of highly scalable, resilient, and cost-effective data platforms and pipelines on GCP, leveraging services like BigQuery, Dataflow, Cloud Composer, Dataproc, and more.
- Provide expert-level guidance and technical leadership to senior data engineers and development teams on best practices for data modeling, ETL/ELT processes, and data warehousing within GCP.
- Drive the adoption of cutting-edge GCP data technologies and methodologies to enhance our data capabilities and efficiency.
- Lead the design and implementation of comprehensive data governance frameworks, security protocols, and compliance measures within the Google Cloud environment.
- Collaborate closely with executive leadership, product management, data science, and analytics teams to translate business vision into robust and scalable data solutions on GCP.
- Identify and mitigate critical technical risks and challenges related to our data infrastructure and architecture on GCP.
- Establish and enforce data quality standards, monitoring systems, and incident response processes within the GCP data landscape.
- Mentor and develop senior data engineers, fostering their technical expertise and leadership skills within the Google Cloud context.
- Evaluate and recommend new GCP services and third-party tools to optimize our data ecosystem.
- Represent the data engineering team in strategic technical discussions and contribute to the overall technology roadmap.

Qualifications we seek in you!

Minimum Qualifications / Skills:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Progressive and impactful experience in data engineering roles, with a significant and deep focus on the Google Cloud Platform.
- Expert-level knowledge of GCP's core data engineering services and best practices for building scalable and reliable solutions.
- Proven ability to architect and implement complex data warehousing and data lake solutions on GCP (BigQuery, Cloud Storage).
- Mastery of SQL and extensive experience with programming languages relevant to data engineering on GCP (e.g., Python, Scala, Java).
- Deep understanding of data governance principles, security best practices within GCP (IAM, Security Command Center), and compliance frameworks (e.g., GDPR, HIPAA).
- Exceptional problem-solving, strategic thinking, and analytical skills, with the ability to navigate complex technical and business challenges.
- Outstanding communication, presentation, and influencing skills, with the ability to articulate complex technical visions to both technical and non-technical audiences, including executive leadership.
- Proven track record of leading and mentoring high-performing data engineering teams within a cloud-first environment.

Preferred Qualifications / Skills:
- Google Cloud Certified Professional Data Engineer.
- Extensive experience with infrastructure-as-code tools for GCP (e.g., Terraform, Deployment Manager).
- Deep expertise in data streaming technologies on GCP (e.g., Dataflow, Pub/Sub, Apache Beam).
- Proven experience in integrating machine learning workflows and MLOps on GCP (e.g., Vertex AI).
- Significant contributions to open-source data projects or active participation in the GCP data engineering community.
- Experience in defining and implementing data mesh or data fabric architectures on GCP.
- Strong understanding of enterprise architecture principles and their application within the Google Cloud ecosystem.
- Experience in [mention specific industry or domain relevant to your company].
- Demonstrated ability to drive significant technical initiatives and influence organizational data strategy.

Why join Genpact?
- Be a transformation leader - work at the cutting edge of AI, automation, and digital innovation.
- Make an impact - drive change for global enterprises and solve business challenges that matter.
- Accelerate your career - get hands-on experience, mentorship, and continuous learning opportunities.
- Work with the best - join 140,000+ bold thinkers and problem-solvers who push boundaries every day.
- Thrive in a values-driven culture - our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress.

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation.

Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Posted 1 month ago

Apply

18.0 - 22.0 years

25 - 30 Lacs

Pune

Work from Office

Treasury Technology is responsible for the design, build, and operation of Deutsche Bank's Treasury trading, balance-sheet management, and liquidity reporting ecosystem. In partnership with the Treasury business, we look to deliver innovative technology solutions that will enable the business to gain competitive edge and operational efficiency. This is a global role leading the engineering function for the Treasury Engineering product portfolio. The aim is to develop a best-in-class portfolio consisting of the following products:
- Liquidity Measurement and Management
- Issuance and Securitization
- Risk in Banking Book
- Funds Transfer Pricing

Treasury is about managing the money and financial risks in a business. This involves making sure the business has the capital it needs to manage its day-to-day business obligations, while helping develop its long-term financial strategy and policies. Economic factors such as interest rate rises, changes in regulations, and volatile foreign exchange rates can have a serious impact on any business. Treasury is responsible for monitoring and assessing market conditions and putting strategies in place to mitigate any potential financial risks to the business.

As a senior leader in Software Engineering, you will lead a highly inspired and inquisitive team of technologists to develop applications to the highest standards. You will be expected to solve complex business and technical challenges while managing large and senior groups of business stakeholders. You will build an effective and trusted global engineering capability that can deliver consistently against the business ambitions. You are expected to take ownership of the quality of the platform, dev automation, agile processes, and production resiliency.

Position-Specific Responsibilities and Accountabilities:
- Lead the global engineering function across our strategic locations in Pune, Bucharest, London, and New York.
- Communicate with senior business stakeholders with regard to the vision and business goals; provide transparency on program status, and manage risks and issues.
- Lead a culture of innovation and experimentation; support a full software development lifecycle that incorporates the best of technology approaches and delivery methodologies.
- Ensure on-time product releases that are of high quality, enabling the core vision of next-generation trade processing systems compliant with regulatory requirements.
- Lead development of the next generation of cloud-enabled platforms, including modern web frameworks and complex transaction processing systems leveraging a broad set of technology stacks.
- Experience in building fault-tolerant, low-latency, scalable solutions that perform at a global enterprise scale.
- Implement the talent strategy for engineering aligned to the broader Treasury Technology talent strategy and operating model.
- Develop applications using industry best practice, with DevOps and automated deployment and testing frameworks.

Skills Matrix:

Education Qualifications: Degree from an accredited college or university (or equivalent certification and/or relevant work experience).

Business Analysis and SME Experience - 18+ years of experience in the following areas:
- Well-developed requirements analysis skills, including good communication abilities (both speaking and listening) and stakeholder management (all levels up to Managing Director).
- Experience working with Front Office business teams is highly desirable.
- Experience in IT delivery or architecture, including experience as an application developer and people manager.
- Strong object-oriented design skills.
- Previous experience hiring, motivating, and managing internal and vendor teams.

Technical Experience - Mandatory Skills:
- Java, ideally Spark and Scala.
- Oracle, Postgres, and other database technologies.
- Experience developing microservices-based architectures.
- UI design and implementation.
- Business process management tools (e.g., jBPM, IBM BPM).
- Experience with a range of BI technologies, including Tableau.
- Experience with DevOps best practices (DORA) and CI/CD.
- Experience in application security, scalability, performance tuning, and optimization (NFRs).
- Experience in API design, sound knowledge of microservices, containerization (Docker), and exposure to federated and NoSQL databases.
- Experience in database query tuning and optimization.
- Experience in implementing DevOps best practices, including CI/CD and API testing automation.
- Experience working in an Agile team, ideally Scrum.

Desirable Skills:
- Experience with cloud services platforms, in particular Google Cloud, and internal cloud-based development (Cloud Run, Cloud Composer, Cloud SQL, Docker, Kubernetes).

Industry Domain Experience:
- Hands-on knowledge of enterprise technology platforms supporting Front Office, Finance, and/or Risk domains would be a significant advantage, as would experience or interest in Sustainable Finance. For example: knowledge of the finance/controlling domain and end-to-end workflow for banking and trading businesses.
- High-level understanding of financial products across investment, corporate, and private/retail banking, in particular loans.
- Knowledge of the investment banking, sales & trading, asset management, and similar industries is a strong advantage.

Clear Thought & Leadership:
- A mindset built on simplicity.
- A clear understanding of the concept of re-use in software development, and the drive to apply it relentlessly.
- Proficiency to talk in functional and data terms to clients, embedded architects, and senior managers.

Technical Leadership Skills:
- Ability to work in a fast-paced environment with competing and alternating priorities, with a constant focus on delivery.
- Proven ability to balance business demands and IT fulfilment in terms of standardisation, reducing risk, and increasing IT flexibility.
- Logical and structured approach to problem-solving in both near-term (tactical) and mid-to-long-term (strategic) horizons.

Communication:
- Good verbal as well as written communication and presentation capabilities.
- Good team player, facilitator, negotiator, and networker.
- Able to lead senior managers towards common goals and build consensus across a diverse group.
- Able to lead and influence a diverse team from a range of technical and non-technical backgrounds.

Posted 1 month ago

Apply

0.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a GCP Python PySpark Engineer to join our team in Hyderabad, Telangana (IN-TG), India (IN). Strong hands-on experience in designing and building data pipelines using Google Cloud Platform (GCP) services like BigQuery, Dataflow, and Cloud Composer. Proficient in Python for data processing, scripting, and automation in cloud and distributed environments. Solid working knowledge of Apache Spark / PySpark, with experience in large-scale data transformation and performance tuning (a minimal PySpark sketch follows this posting). Familiar with CI/CD processes, version control (Git), and workflow orchestration tools such as Airflow or Composer. Ability to work independently in fast-paced Agile environments with strong problem-solving and communication skills. Exposure to modern data architectures and real-time/streaming data solutions is an added advantage. About NTT DATA NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. NTT DATA endeavors to make its website accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us; this contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status.
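As context for the PySpark skills referenced above, the sketch below shows a minimal batch transformation of the kind such pipelines run on Dataproc or other Spark clusters. The bucket names, columns and file layout are hypothetical assumptions, not NTT DATA specifics:

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Read raw JSON events from Cloud Storage (requires the GCS connector, preinstalled on Dataproc).
spark = SparkSession.builder.appName("example-events-cleanup").getOrCreate()
events = spark.read.json("gs://example-raw-bucket/events/*.json")

# Basic cleanup: drop rows without a type, derive a partition date, and de-duplicate by key.
cleaned = (
    events
    .filter(F.col("event_type").isNotNull())
    .withColumn("event_date", F.to_date("event_ts"))
    .dropDuplicates(["event_id"])
)

# Write partitioned Parquet back to a curated bucket for downstream analytics or BigQuery external tables.
cleaned.write.mode("overwrite").partitionBy("event_date").parquet("gs://example-curated-bucket/events/")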

Posted 1 month ago

Apply

0.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Ready to shape the future of work? At Genpact, we don't just adapt to change - we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's industry-first accelerator is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models onward, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Inviting applications for the role of Consultant - GCP Sr Data Engineer. We are seeking a highly accomplished and strategic Google Cloud Data Engineer with deep experience in data engineering and a significant, demonstrable focus on the Google Cloud Platform (GCP). In this leadership role, you will be instrumental in defining and driving our overall data strategy on GCP, architecting transformative data solutions, and providing expert guidance to engineering teams. You will be a thought leader in leveraging GCP's advanced data services to solve complex business challenges, optimize our data infrastructure at scale, and foster a culture of data excellence. Responsibilities Define and champion the strategic direction for our data architecture and infrastructure on Google Cloud Platform, aligning with business objectives and future growth. Architect and oversee the development of highly scalable, resilient, and cost-effective data platforms and pipelines on GCP, leveraging services like BigQuery, Dataflow, Cloud Composer, Dataproc, and more (a minimal Dataflow pipeline sketch follows this posting). Provide expert-level guidance and technical leadership to senior data engineers and development teams on best practices for data modeling, ETL/ELT processes, and data warehousing within GCP. Drive the adoption of cutting-edge GCP data technologies and methodologies to enhance our data capabilities and efficiency. Lead the design and implementation of comprehensive data governance frameworks, security protocols, and compliance measures within the Google Cloud environment. Collaborate closely with executive leadership, product management, data science, and analytics teams to translate business vision into robust and scalable data solutions on GCP. Identify and mitigate critical technical risks and challenges related to our data infrastructure and architecture on GCP. Establish and enforce data quality standards, monitoring systems, and incident response processes within the GCP data landscape. Mentor and develop senior data engineers, fostering their technical expertise and leadership skills within the Google Cloud context. Evaluate and recommend new GCP services and third-party tools to optimize our data ecosystem. Represent the data engineering team in strategic technical discussions and contribute to the overall technology roadmap. Qualifications we seek in you!
Minimum Qualifications / Skills Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Extensive experience in data engineering roles, with a significant and deep focus on the Google Cloud Platform. Expert-level knowledge of GCP's core data engineering services and best practices for building scalable and reliable solutions. Proven ability to architect and implement complex data warehousing and data lake solutions on GCP (BigQuery, Cloud Storage). Mastery of SQL and extensive experience with programming languages relevant to data engineering on GCP (e.g., Python, Scala, Java). Deep understanding of data governance principles, security best practices within GCP (IAM, Security Command Center), and compliance frameworks (e.g., GDPR, HIPAA). Exceptional problem-solving, strategic thinking, and analytical skills, with the ability to navigate complex technical and business challenges. Outstanding communication, presentation, and influencing skills, with the ability to articulate complex technical visions to both technical and non-technical audiences, including executive leadership. Proven track record of leading and mentoring high-performing data engineering teams within a cloud-first environment. Preferred Qualifications / Skills Google Cloud Certified Professional Data Engineer. Extensive experience with infrastructure-as-code tools for GCP (e.g., Terraform, Deployment Manager). Deep expertise in data streaming technologies on GCP (e.g., Dataflow, Pub/Sub, Apache Beam). Proven experience in integrating machine learning workflows and MLOps on GCP (e.g., Vertex AI). Significant contributions to open-source data projects or active participation in the GCP data engineering community. Experience in defining and implementing data mesh or data fabric architectures on GCP. Strong understanding of enterprise architecture principles and their application within the Google Cloud ecosystem. Demonstrated ability to drive significant technical initiatives and influence organizational data strategy. Why join Genpact? Be a transformation leader - Work at the cutting edge of AI, automation, and digital innovation. Make an impact - Drive change for global enterprises and solve business challenges that matter. Accelerate your career - Get hands-on experience, mentorship, and continuous learning opportunities. Work with the best - Join 140,000+ bold thinkers and problem-solvers who push boundaries every day. Thrive in a values-driven culture - Our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress. Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together. Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way.
Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
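As a companion to the Dataflow and BigQuery services listed in this posting, here is a minimal Apache Beam pipeline sketch of the kind that runs on the Dataflow runner. All project, bucket, table and column names are hypothetical assumptions, not Genpact specifics:

import csv
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_line(line):
    # Turn "order_id,amount" CSV rows into BigQuery-ready dictionaries.
    order_id, amount = next(csv.reader([line]))
    return {"order_id": order_id, "amount": float(amount)}

# Pass --runner=DataflowRunner, --project, --region and --temp_location when submitting to Dataflow.
options = PipelineOptions()
with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "Read" >> beam.io.ReadFromText("gs://example-bucket/orders.csv", skip_header_lines=1)
        | "Parse" >> beam.Map(parse_line)
        | "Write" >> beam.io.WriteToBigQuery(
            "example-project:analytics.orders",
            schema="order_id:STRING,amount:FLOAT",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )

The same pipeline code runs locally with the DirectRunner, which is a common way to test transforms before scheduling them from Cloud Composer.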

Posted 1 month ago

Apply

5.0 - 10.0 years

20 - 35 Lacs

Pune

Work from Office

Responsibilities Lead and mentor a team of data engineers, providing technical guidance, setting best practices, and overseeing task execution for the migration project. Design, develop, and architect scalable ETL processes to extract, transform, and load petabytes of data from on-premises SQL Server to GCP Cloud SQL PostgreSQL. Oversee the comprehensive analysis of existing SQL Server schemas, data types, stored procedures, and complex data models, defining strategies for their optimal conversion and refactoring for PostgreSQL. Establish and enforce rigorous data validation, quality, and integrity frameworks throughout the migration lifecycle, ensuring accuracy and consistency. Collaborate strategically with Database Administrators, application architects, business stakeholders, and security teams to define migration scope, requirements, and cutover plans. Lead the development and maintenance of advanced scripts (primarily Python) for automating large-scale migration tasks, complex data transformations, and reconciliation processes. Proactively identify, troubleshoot, and lead the resolution of complex data discrepancies, performance bottlenecks, and technical challenges during migration. Define and maintain comprehensive documentation standards for migration strategies, data mapping, transformation rules, and post-migration validation procedures. Ensure data governance, security, and compliance standards are meticulously applied throughout the migration process, including data encryption and access controls within GCP. Implement a schema conversion or custom schema mapping strategy for the SQL Server to PostgreSQL shift. Refactor and translate complex stored procedures and T-SQL logic to PostgreSQL-compatible constructs while preserving functional equivalence. Develop and execute comprehensive data reconciliation strategies to ensure consistency and parity between legacy and migrated datasets post-cutover (a minimal reconciliation sketch follows this posting). Design fallback procedures and lead post-migration verification and support to ensure business continuity. Ensure metadata cataloging and data lineage tracking using GCP-native or integrated tools. Must-Have Skills Expertise in data engineering, specifically for Google Cloud Platform (GCP). Deep understanding of relational database architecture, advanced schema design, data modeling, and performance tuning. Expert-level SQL proficiency, with extensive hands-on experience in both T-SQL (SQL Server) and PostgreSQL. Hands-on experience with data migration processes, including moving datasets from on-premises databases to cloud storage solutions. Proficiency in designing, implementing, and optimizing complex ETL/ELT pipelines for high-volume data movement, leveraging tools and custom scripting. Strong knowledge of GCP services: Cloud SQL, Dataflow, Pub/Sub, Cloud Storage, Dataproc, Cloud Composer, Cloud Functions, and BigQuery. Solid understanding of data governance, security, and compliance practices in the cloud, including the management of sensitive data during migration. Strong programming skills in Python or Java for building data pipelines and automating processes. Experience with real-time data processing using Pub/Sub, Dataflow, or similar GCP services. Experience with CI/CD practices and tools like Jenkins, GitLab, or Cloud Build for automating the data engineering pipeline. Knowledge of data modeling and best practices for structuring cloud data storage for optimal query performance and analytics in GCP.
Familiarity with observability and monitoring tools in GCP (e.g., Stackdriver, Prometheus) for real-time data pipeline visibility and alerting. Good-to-Have Skills Direct experience with GCP Database Migration Service, Storage Transfer Service, or similar cloud-native migration tools. Familiarity with data orchestration using tools like Cloud Composer (based on Apache Airflow) for managing workflows. Experience with containerization tools like Docker and Kubernetes for deploying data pipelines in a scalable manner. Exposure to DataOps tools and methodologies for managing data workflows. Experience with machine learning platforms like AI Platform in GCP to integrate with data pipelines. Familiarity with data lake architecture and the integration of BigQuery with Google Cloud Storage or Dataproc.
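The data reconciliation responsibility above lends itself to simple automation; below is a minimal, hypothetical Python sketch that compares row counts between the legacy SQL Server source and the migrated Cloud SQL PostgreSQL target. Connection details, table names and the count-only check are illustrative placeholders (real reconciliation would also compare checksums and sampled values):

import pyodbc
import psycopg2

# Source tables to reconcile; the target is assumed to use the same names without the dbo schema.
TABLES = ["dbo.orders", "dbo.customers"]

def row_count(cursor, table):
    cursor.execute(f"SELECT COUNT(*) FROM {table}")
    return cursor.fetchone()[0]

source = pyodbc.connect("DRIVER={ODBC Driver 18 for SQL Server};SERVER=legacy-host;DATABASE=sales;UID=user;PWD=secret")
target = psycopg2.connect(host="cloudsql-host", dbname="sales", user="user", password="secret")

src_cur = source.cursor()
tgt_cur = target.cursor()
for table in TABLES:
    src_count = row_count(src_cur, table)
    tgt_count = row_count(tgt_cur, table.split(".")[-1])  # e.g. dbo.orders -> orders
    status = "OK" if src_count == tgt_count else "MISMATCH"
    print(f"{table}: source={src_count} target={tgt_count} {status}")

src_cur.close()
tgt_cur.close()
source.close()
target.close()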

Posted 1 month ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Bengaluru

Hybrid

Shift: (GMT+05:30) Asia/Kolkata (IST) What do you need for this opportunity? Must-have skills required: Data Engineering, Big Data Technologies, Hadoop, Spark, Hive, Presto, Airflow, Data Modeling, ETL Development, Data Lake Architecture, Python, Scala, GCP BigQuery, Dataproc, Dataflow, Cloud Composer, AWS, Big Data Stack, Azure, GCP. Wayfair is looking for: About the job The Data Engineering team within the SMART org supports development of large-scale data pipelines for machine learning and analytical solutions related to unstructured and structured data. You'll have the opportunity to gain hands-on experience on all kinds of systems in the data platform ecosystem. Your work will have a direct impact on all applications that our millions of customers interact with every day: search results, homepage content, emails, auto-complete searches, browse pages and product carousels. You will also build and scale the data platforms that measure the effectiveness of Wayfair's ad costs and media attribution, which helps decide on day-to-day and major marketing spend. About the Role: As a Data Engineer, you will be part of the Data Engineering team, with this role being inherently multi-functional; the ideal candidate will work with Data Scientists, Analysts and application teams across the company, as well as all other Data Engineering squads at Wayfair. We are looking for someone with a love for data, the ability to understand requirements clearly, and the ability to iterate quickly. Successful candidates will have strong engineering and communication skills and a belief that data-driven processes lead to phenomenal products. What you'll do: Build and launch data pipelines and data products focused on the SMART org (an illustrative Cloud Composer/Airflow DAG sketch follows this posting). Help teams push the boundaries of insights, creating new product features using data, and powering machine learning models. Build cross-functional relationships to understand data needs, build key metrics and standardize their usage across the organization. Utilize current and leading-edge technologies in software engineering, big data, streaming, and cloud infrastructure. What You'll Need: Bachelor's/Master's degree in Computer Science or a related technical subject area, or an equivalent combination of education and experience. 3+ years of relevant work experience in the Data Engineering field with web-scale data sets. Demonstrated strength in data modeling, ETL development and data lake architecture. Data warehousing experience with big data technologies (Hadoop, Spark, Hive, Presto, Airflow, etc.). Coding proficiency in at least one modern programming language (Python, Scala, etc.). Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing, along with query performance tuning skills for large data sets. Industry experience as a Big Data Engineer working alongside cross-functional teams such as Software Engineering, Analytics and Data Science, with a track record of manipulating, processing, and extracting value from large datasets. Strong business acumen. Experience leading large-scale data warehousing and analytics projects, including using GCP technologies (BigQuery, Dataproc, GCS, Cloud Composer, Dataflow) or related big data technologies in other cloud platforms such as AWS or Azure. Be a team player and introduce/follow best practices in the data engineering space. Ability to effectively communicate (both written and verbally) technical information and the results of engineering design at all levels of the organization.
Good to have: Exposure to NoSQL databases and Pub/Sub architecture setup. Familiarity with BI tools like Looker, Tableau, AtScale, Power BI, or similar.
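As referenced in the posting above, here is a minimal Cloud Composer (Airflow 2.x) DAG sketch illustrating how such pipelines express task dependencies. The DAG id, schedule, operators and SQL are hypothetical placeholders, not Wayfair specifics:

from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="example_marketing_attribution",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = BashOperator(
        task_id="extract_ad_spend",
        bash_command="echo 'pull ad-spend extracts to GCS'",  # stands in for the real extraction step
    )
    load = BigQueryInsertJobOperator(
        task_id="load_attribution_table",
        configuration={
            "query": {
                "query": "SELECT 1 AS placeholder",  # stands in for the real attribution SQL
                "useLegacySql": False,
            }
        },
    )
    extract >> load  # the load task runs only after extraction succeeds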

Posted 1 month ago

Apply

4.0 - 8.0 years

10 - 18 Lacs

Hyderabad

Hybrid

About the Role: We are looking for a skilled and motivated Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining robust and scalable ETL (Extract, Transform, Load) data pipelines. The role involves working with various GCP services, implementing data ingestion and transformation logic, and ensuring data quality and consistency across systems. Key Responsibilities: Design, develop, test, and maintain scalable ETL data pipelines using Python. Work extensively on Google Cloud Platform (GCP) services such as: Dataflow for real-time and batch data processing; Cloud Functions for lightweight serverless compute; BigQuery for data warehousing and analytics; Cloud Composer for orchestration of data workflows (based on Apache Airflow); Google Cloud Storage (GCS) for managing data at scale; IAM for access control and security; and Cloud Run for containerized applications. Perform data ingestion from various sources and apply transformation and cleansing logic to ensure high-quality data delivery. Implement and enforce data quality checks, validation rules, and monitoring. Collaborate with data scientists, analysts, and other engineering teams to understand data needs and deliver efficient data solutions. Manage version control using GitHub and participate in CI/CD pipeline deployments for data projects. Write complex SQL queries for data extraction and validation from relational databases such as SQL Server, Oracle, or PostgreSQL. Document pipeline designs, data flow diagrams, and operational support procedures. Required Skills: 4-6 years of hands-on experience in Python for backend or data engineering projects. Strong understanding and working experience with GCP cloud services (especially Dataflow, BigQuery, Cloud Functions, Cloud Composer, etc.). Solid understanding of data pipeline architecture, data integration, and transformation techniques. Experience in working with version control systems like GitHub and knowledge of CI/CD practices. Strong experience in SQL with at least one enterprise database (SQL Server, Oracle, PostgreSQL, etc.). Good to Have (Optional Skills): Experience working with Snowflake cloud data platform. Hands-on knowledge of Databricks for big data processing and analytics. Familiarity with Azure Data Factory (ADF) and other Azure data engineering tools.
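The data-quality responsibility described above is often implemented as a small validation step scheduled from Cloud Composer; below is a minimal, hypothetical sketch using the BigQuery Python client. The project, dataset, table and check definitions are illustrative assumptions:

from google.cloud import bigquery

client = bigquery.Client(project="example-project")

# Each check is a query that returns the number of offending rows; zero means the check passes.
CHECKS = {
    "no_null_keys": "SELECT COUNT(*) AS bad FROM analytics.orders WHERE order_id IS NULL",
    "no_duplicate_keys": (
        "SELECT COUNT(*) AS bad FROM ("
        " SELECT order_id FROM analytics.orders GROUP BY order_id HAVING COUNT(*) > 1)"
    ),
}

failures = []
for name, sql in CHECKS.items():
    bad_rows = list(client.query(sql).result())[0]["bad"]
    if bad_rows:
        failures.append(f"{name}: {bad_rows} offending rows")

if failures:
    # Raising here would fail the surrounding Cloud Composer task and trigger alerting.
    raise RuntimeError("Data quality checks failed: " + "; ".join(failures))
print("All data quality checks passed.")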

Posted 2 months ago

Apply