
1095 BigQuery Jobs - Page 13

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

8.0 - 13.0 years

14 - 24 Lacs

Chennai

Hybrid

Greetings from Getronics! We are looking for candidates with solid experience designing, building, and maintaining cloud-based data platforms and infrastructure.
• Deep proficiency in GCP cloud services, including significant experience with BigQuery, Cloud Storage, Dataproc, Apigee, Cloud Run, Google Kubernetes Engine (GKE), Postgres, Artifact Registry, Secret Manager, and Identity and Access Management (IAM).
• Hands-on experience implementing and managing CI/CD pipelines using tools like Tekton and potentially Astronomer.
• Strong experience with job scheduling and workflow orchestration using Airflow.
• Proficiency with version control systems, specifically Git.
• Strong programming skills in Python.
• Expertise in SQL and experience with relational databases like SQL Server, MySQL, and PostgreSQL.
• Experience with or knowledge of data visualization tools like Power BI.
• Familiarity with code quality and security scanning tools such as FOSSA and SonarQube.
• Foundational knowledge of Artificial Intelligence and Machine Learning concepts and workflows.
• Strong problem-solving skills and the ability to troubleshoot complex distributed systems.
• Strong communication and collaboration skills.
• Knowledge of other cloud providers (AWS, Azure) is a plus.
Skills Required: GCP, BigQuery, AI/ML
Company: Getronics (permanent role)
Client: Automobile industry
Experience Required: 4+ years in IT and a minimum of 3+ years in GCP Data Engineering/AI-ML
Location: Chennai (ELCOT - Sholinganallur)
Work Mode: Hybrid
LOOKING FOR IMMEDIATE TO 30 DAYS NOTICE CANDIDATES ONLY.
Thanks, Durga.
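For candidates preparing for this kind of role, here is a minimal sketch of the Airflow-orchestrated BigQuery work the listing describes. It assumes the apache-airflow-providers-google package and Airflow 2.4+; every project, dataset, and table name is a hypothetical placeholder, not part of the listing.

```python
# Minimal Airflow DAG sketch: run a daily BigQuery aggregation job.
# Project, dataset, and table names are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_sales_rollup",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+ scheduling argument
    catchup=False,
) as dag:
    rollup = BigQueryInsertJobOperator(
        task_id="rollup_sales",
        configuration={
            "query": {
                "query": """
                    SELECT order_date, SUM(amount) AS total
                    FROM `my-project.sales.orders`
                    GROUP BY order_date
                """,
                "destinationTable": {
                    "projectId": "my-project",
                    "datasetId": "sales",
                    "tableId": "daily_totals",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )
```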

Posted 2 weeks ago

Apply

8.0 - 13.0 years

14 - 24 Lacs

Chennai

Hybrid

Greetings from Getronics! We have permanent opportunities for GCP Data Engineers in Chennai.
Company: Getronics (permanent role)
Client: Automobile industry
Experience Required: 8+ years in IT and a minimum of 4+ years in GCP Data Engineering
Location: Chennai (ELCOT - Sholinganallur)
Work Mode: Hybrid
Position Description: We are currently seeking a seasoned GCP Cloud Data Engineer with 4+ years of experience leading/implementing GCP data projects, preferably implementing a complete data-centric model. This position will design and deploy a Data Centric Architecture in GCP for a Materials Management platform that exchanges data with multiple modern and legacy applications across Product Development, Manufacturing, Finance, Purchasing, N-Tier Supply Chain, and Supplier Collaboration.
• Design and implement data-centric solutions on Google Cloud Platform (GCP) using GCP tools such as Storage Transfer Service, Cloud Data Fusion, Pub/Sub, Dataflow, Cloud Scheduler, gsutil, FTP/SFTP, Dataproc, Bigtable, etc.
• Build ETL pipelines to ingest data from heterogeneous sources into our system.
• Develop data processing pipelines using programming languages like Java and Python to extract, transform, and load (ETL) data.
• Create and maintain data models, ensuring efficient storage, retrieval, and analysis of large datasets.
• Deploy and manage databases, both SQL and NoSQL (such as Bigtable, Firestore, or Cloud SQL), based on project requirements.
• Collaborate with cross-functional teams to understand data requirements and design scalable solutions that meet business needs.
• Implement security measures and data governance policies to ensure the integrity and confidentiality of data.
• Optimize data workflows for performance, reliability, and cost-effectiveness on the GCP infrastructure.
Skills Required:
- GCP Data Engineer, Hadoop, Spark/PySpark; Google Cloud Platform services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine.
- 8+ years of professional experience in data engineering, data product development, and software product launches.
- 4+ years of cloud data engineering experience building scalable, reliable, and cost-effective production batch and streaming data pipelines using: data warehouses like Google BigQuery; workflow orchestration tools like Airflow; relational database management systems like MySQL, PostgreSQL, and SQL Server; and real-time data streaming platforms like Apache Kafka and GCP Pub/Sub.
Education Required: Any Bachelor's degree
LOOKING FOR IMMEDIATE TO 30 DAYS NOTICE CANDIDATES ONLY.
Thanks, Durga.
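As a study aid for the streaming side of this role, here is a minimal sketch of an Apache Beam pipeline of the kind Dataflow runs, reading from Pub/Sub and writing to BigQuery. The subscription, table, and schema are hypothetical placeholders under the assumption of JSON-encoded messages.

```python
# Sketch of a streaming Beam/Dataflow pipeline: Pub/Sub -> BigQuery.
# Subscription, table, and schema are hypothetical placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadEvents" >> beam.io.ReadFromPubSub(
            subscription="projects/my-project/subscriptions/orders-sub")
        # Pub/Sub delivers raw bytes; decode and parse each message.
        | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "WriteToBQ" >> beam.io.WriteToBigQuery(
            "my-project:sales.orders",
            schema="order_id:STRING,amount:FLOAT,order_date:DATE",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```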

Posted 2 weeks ago

Apply

4.0 - 8.0 years

10 - 19 Lacs

Chennai

Hybrid

Greetings from Getronics! We have permanent opportunities for GCP Data Engineers in Chennai.
Company: Getronics (permanent role)
Client: Automobile industry
Experience Required: 4+ years in IT and a minimum of 3+ years in GCP Data Engineering
Location: Chennai (ELCOT - Sholinganallur)
Work Mode: Hybrid
Position Description: We are currently seeking a seasoned GCP Cloud Data Engineer with 3 to 5 years of experience leading/implementing GCP data projects, preferably implementing a complete data-centric model. This position will design and deploy a Data Centric Architecture in GCP for a Materials Management platform that exchanges data with multiple modern and legacy applications across Product Development, Manufacturing, Finance, Purchasing, N-Tier Supply Chain, and Supplier Collaboration.
• Design and implement data-centric solutions on Google Cloud Platform (GCP) using GCP tools such as Storage Transfer Service, Cloud Data Fusion, Pub/Sub, Dataflow, Cloud Scheduler, gsutil, FTP/SFTP, Dataproc, Bigtable, etc.
• Build ETL pipelines to ingest data from heterogeneous sources into our system.
• Develop data processing pipelines using programming languages like Java and Python to extract, transform, and load (ETL) data.
• Create and maintain data models, ensuring efficient storage, retrieval, and analysis of large datasets.
• Deploy and manage databases, both SQL and NoSQL (such as Bigtable, Firestore, or Cloud SQL), based on project requirements.
Skills Required:
- GCP Data Engineer, Hadoop, Spark/PySpark; Google Cloud Platform services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine.
- 4+ years of professional experience in data engineering, data product development, and software product launches.
- 3+ years of cloud data/software engineering experience building scalable, reliable, and cost-effective production batch and streaming data pipelines using: data warehouses like Google BigQuery; workflow orchestration tools like Airflow; relational database management systems like MySQL, PostgreSQL, and SQL Server; and real-time data streaming platforms like Apache Kafka and GCP Pub/Sub.
Education Required: Any Bachelor's degree
Candidate should be willing to take a GCP assessment (1-hour online video test).
LOOKING FOR IMMEDIATE TO 30 DAYS NOTICE CANDIDATES ONLY.
Thanks, Durga.
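On the batch-ingestion side the listing mentions, here is a minimal sketch of loading files from Cloud Storage into BigQuery with the official google-cloud-bigquery client; the bucket, dataset, and table names are hypothetical placeholders.

```python
# Sketch: batch-ingest CSV files from Cloud Storage into BigQuery.
# Bucket, dataset, and table names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,      # skip the header row
    autodetect=True,          # infer the schema from the files
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://my-bucket/materials/*.csv",
    "my-project.materials.receipts",
    job_config=job_config,
)
load_job.result()  # block until the load job completes
print(f"Loaded {load_job.output_rows} rows.")
```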

Posted 2 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Hyderabad

Work from Office

About the Role: We are seeking a skilled and detail-oriented Data Engineer with deep expertise in PostgreSQL and SQL to design, maintain, and optimize our database systems. As a key member of our data infrastructure team, you will work closely with developers, DevOps, and analysts to ensure the data integrity, performance, and scalability of our applications.
Key Responsibilities:
• Design, implement, and maintain PostgreSQL database systems for high availability and performance.
• Write efficient, well-documented SQL queries, stored procedures, and database functions.
• Analyze and optimize slow-performing queries and database structures.
• Collaborate with software engineers to support schema design, indexing, and query optimization.
• Perform database migrations, backup strategies, and disaster recovery planning.
• Ensure data security and compliance with internal and regulatory standards.
• Monitor database performance and proactively address bottlenecks and anomalies.
• Automate routine database tasks using scripts and monitoring tools.
• Contribute to data modeling and architecture discussions for new and existing systems.
• Support ETL pipelines and data integration processes as needed.
Required Qualifications:
• Bachelor's degree in Computer Science, Information Systems, or a related field.
• 5 years of professional experience in a database engineering role.
• Proven expertise with PostgreSQL (version 12+ preferred).
• Strong SQL skills with the ability to write complex queries and optimize them.
• Experience with performance tuning, indexing, query plans, and execution analysis.
• Familiarity with database design best practices and normalization techniques.
• Solid understanding of ACID principles and transaction management.
Preferred Qualifications:
• Experience with cloud platforms (e.g., AWS RDS, GCP Cloud SQL, or Azure Database for PostgreSQL).
• Familiarity with other database technologies (e.g., MySQL, or NoSQL stores such as MongoDB and Redis).
• Knowledge of scripting languages (e.g., Python, Bash) for automation.
• Experience with monitoring tools (e.g., pgBadger, pg_stat_statements, Prometheus/Grafana).
• Understanding of CI/CD processes and infrastructure as code (e.g., Terraform).
• Exposure to data warehousing or analytics platforms (e.g., Redshift, BigQuery).
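For the query-tuning work this role centers on, here is a minimal sketch of inspecting a slow PostgreSQL plan and adding a covering index via psycopg2; the connection string, table, and column names are hypothetical placeholders.

```python
# Sketch: inspect a slow PostgreSQL query plan, then add an index.
# Connection details and table/column names are hypothetical placeholders.
import psycopg2

conn = psycopg2.connect("dbname=app user=dba host=localhost")
# CREATE INDEX CONCURRENTLY cannot run inside a transaction block.
conn.autocommit = True

with conn.cursor() as cur:
    # Look at the actual execution plan to spot sequential scans.
    cur.execute("""
        EXPLAIN (ANALYZE, BUFFERS)
        SELECT id, status FROM orders
        WHERE customer_id = %s
          AND created_at > now() - interval '30 days'
    """, (42,))
    for (line,) in cur.fetchall():
        print(line)

    # If the plan shows a Seq Scan on orders, a composite index
    # on the filter columns is a common fix.
    cur.execute("""
        CREATE INDEX CONCURRENTLY IF NOT EXISTS idx_orders_cust_created
        ON orders (customer_id, created_at)
    """)
```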

Posted 2 weeks ago

Apply

5.0 - 9.0 years

18 - 22 Lacs

Chennai

Hybrid

Position Description: Experience using tools in BI, ETL, and reporting/visualization/dashboards (Power BI / QlikSense). Should have a good understanding of visualization techniques and the ability to solve business problems through visualization. Ability to derive insights from data, provide visualization, and tell the story. Experience with data handling using R or Python is an added advantage. Exposure to big-data-based analytical solutions and hands-on experience with data lakes and data cleansing.
Skills Required: Power BI, Postgres, Alteryx, BigQuery, data/analytics dashboards
Skills Preferred: Data analysis, GCP, Python
Experience Required: 5+ years
Education Required: Bachelor's degree
Education Preferred: Master's degree
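As an illustration of the "insights from data" workflow this listing describes, here is a minimal sketch of pulling a BigQuery result into pandas to feed a dashboard; the project, table, and column names are hypothetical placeholders, and it assumes the google-cloud-bigquery client with DataFrame support installed.

```python
# Sketch: pull a BigQuery result set into pandas to feed a dashboard.
# Project, table, and column names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

sql = """
    SELECT region,
           DATE_TRUNC(order_date, MONTH) AS month,
           SUM(amount) AS revenue
    FROM `my-project.sales.orders`
    GROUP BY region, month
    ORDER BY month
"""
df = client.query(sql).to_dataframe()

# A quick pivot gives the region-by-month matrix most BI tools expect.
summary = df.pivot(index="month", columns="region", values="revenue")
print(summary.tail())
```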

Posted 2 weeks ago

Apply

4.0 - 8.0 years

9 - 13 Lacs

Kolkata

Work from Office

Proven experience as a Business Analyst with a focus on client engagement and P2P processes. Expertise in GenAI and advanced AI/ML concepts. Strong analytical and problem-solving skills. Excellent communication and interpersonal abilities. Familiarity with Agile methodologies.
Primary Skills: Artificial Intelligence, Gen AI; Business Analyst, FRD, BRD, User Stories; BFSI domain knowledge
Secondary Skills: Analytical thinking, verbal communication, stakeholder management

Posted 2 weeks ago

Apply

4.0 - 9.0 years

7 - 17 Lacs

Hyderabad

Work from Office

About this role: Wells Fargo is seeking a Senior Cloud Platform Engineer with skills in IaC (Infrastructure as Code) tools such as Terraform, Docker/OCI image creation, Kubernetes, Helm charts, and Python.
In this role, you will need:
• Understanding of cloud platform technologies (GCP preferred) in the big data and data warehousing space (BigQuery, Dataproc, Dataflow, Data Catalog, Cloud Composer/Airflow, GKE/Anthos).
• Hands-on experience with IaC (Infrastructure as Code) tools such as Terraform, Docker/OCI image creation, Kubernetes, and Helm charts, plus self-healing mechanisms, load balancing, and API gateways.
• In-depth knowledge of cloud tools/solutions such as Cloud Pub/Sub, GKE, and IAM, and of scalability, fault-tolerant design, availability, and BCP.
• Ability to quickly learn and adapt to new cloud platform technologies.
• Strong development experience in Python.
• Extensive experience with Python API-based solution design and integration.
Required Qualifications, International: 4+ years of software engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education. Bachelor's or Master's degree in Computer Science or equivalent.
Desired Qualifications: GCP DevOps, Terraform, and K8s certification.
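For the "Python API-based solution design" the listing calls out, here is a minimal sketch of publishing an event to Cloud Pub/Sub with the official google-cloud-pubsub client; the project and topic names are hypothetical placeholders.

```python
# Sketch: publish a message to Cloud Pub/Sub from Python.
# Project and topic names are hypothetical placeholders.
import json

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "platform-events")

event = {"service": "ingest", "status": "ok"}
future = publisher.publish(
    topic_path,
    data=json.dumps(event).encode("utf-8"),
    origin="platform-engineering",  # attributes ride alongside the payload
)
print(f"Published message {future.result()}")  # blocks until the server acks
```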

Posted 2 weeks ago

Apply

0.0 years

9 - 14 Lacs

Noida

Work from Office

Required Skills:
GCP Proficiency: Strong expertise in Google Cloud Platform (GCP) services and tools, including Compute Engine, Google Kubernetes Engine (GKE), Cloud Storage, Cloud SQL, Cloud Load Balancing, IAM, Google Workflows, Google Cloud Pub/Sub, App Engine, Cloud Functions, Cloud Run, API Gateway, Cloud Build, Cloud Source Repositories, Artifact Registry, Google Cloud Monitoring, Logging, and Error Reporting.
Cloud-Native Applications: Experience in designing and implementing cloud-native applications, preferably on GCP.
Workload Migration: Proven expertise in migrating workloads to GCP.
CI/CD Tools and Practices: Experience with CI/CD tools and practices.
Python and IaC: Proficiency in Python and Infrastructure as Code (IaC) tools such as Terraform.
Responsibilities:
Cloud Architecture and Design: Design and implement scalable, secure, and highly available cloud infrastructure solutions using GCP services and tools such as Compute Engine, Kubernetes Engine, Cloud Storage, Cloud SQL, and Cloud Load Balancing.
Cloud-Native Application Design: Develop high-level architecture designs and guidelines for the development, deployment, and lifecycle management of cloud-native applications on GCP, ensuring they are optimized for security, performance, and scalability using services like App Engine, Cloud Functions, and Cloud Run.
API Management: Develop and implement guidelines for securely exposing interfaces of the workloads running on GCP, with granular access control using the IAM platform, RBAC platforms, and API Gateway.
Workload Migration: Lead the design and migration of on-premises workloads to GCP, ensuring minimal downtime and data integrity.

Posted 2 weeks ago

Apply

6.0 - 10.0 years

6 - 11 Lacs

Mumbai

Work from Office

Primary Skills
• Google Cloud Platform (GCP): Expertise in Compute (VMs, GKE, Cloud Run), Networking (VPC, load balancers, firewall rules), IAM (service accounts, Workload Identity, policies), Storage (Cloud Storage, Cloud SQL, BigQuery), and Serverless (Cloud Functions, Eventarc, Pub/Sub). Strong experience with Cloud Build for CI/CD, automating deployments and managing artifacts efficiently.
• Terraform: Skilled in Infrastructure as Code (IaC) with Terraform for provisioning and managing GCP resources. Proficient in modules for reusable infrastructure, state management (remote state, locking), and provider configuration. Experience with CI/CD integration via Terraform Cloud and automation pipelines.
• YAML: Proficient in writing Kubernetes manifests for deployments, services, and configurations. Experience with Cloud Build pipelines, automating builds and deployments. Strong understanding of configuration management using YAML in GitOps workflows.
• PowerShell: Expert in scripting for automation, managing GCP resources, and interacting with APIs. Skilled in cloud resource management, automating deployments, and optimizing cloud operations.
Secondary Skills
• CI/CD Pipelines: GitHub Actions, GitLab CI/CD, Jenkins, Cloud Build
• Kubernetes (K8s): Helm, Ingress, RBAC, cluster administration
• Monitoring & Logging: Stackdriver (Cloud Logging & Monitoring), Prometheus, Grafana
• Security & IAM: GCP IAM policies, service accounts, Workload Identity
• Networking: VPC, firewall rules, load balancers, Cloud DNS
• Linux & Shell Scripting: Bash scripting, system administration
• Version Control: Git, GitHub, GitLab, Bitbucket

Posted 2 weeks ago

Apply

4.0 - 9.0 years

2 - 5 Lacs

Gurugram

Work from Office

A Content BDM is responsible for global end-to-end ownership of Architecture and Technology content. The BDM will be accountable for aligning, capturing, and creating the relevant content for the role-based learning maps, enabling Partners to deliver a perfect pitch to the customer while understanding how to deploy and support the solution effectively. The BDM is also responsible for working with the key Solution Plus, Strategic, ISV, and Cloud partners to deliver their joint solution and technology training to Partners, Distributors, and Sales Staff. In addition, the BDM will be responsible for monitoring Partners' usage of the educational framework and increasing traction and adoption as much as possible.
Roles & Responsibilities in Detail:
• Collate, curate, and design the training and education curriculum for the entire Enterprise Networking Architecture portfolio.
• Create and evaluate Quiz and COLT content.
• Work with the lab team to build the relevant labs and demos required for the partner enablement learning maps.
• Work with the BU and relevant Architecture stakeholders in each GEO and the Partner org to drive the Architecture curriculum on the platform.
Specialized experience requirements:
a. 4-9 years of experience in EN Architecture, RNS, Wireless, or SD-WAN, with stakeholder management.
b. Understanding and hands-on experience preferred in the detailed sub-technologies of that particular architecture.

Posted 2 weeks ago

Apply

3.0 - 6.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Capgemini Invent
Capgemini Invent is the digital innovation, consulting and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science and creative design, to help CxOs envision and build what's next for their businesses.
Your Role
• Should have developed or worked on at least one Gen AI project.
• Data pipeline implementation experience with any of these cloud providers: AWS, Azure, GCP.
• Experience with cloud storage, cloud databases, cloud data warehousing, and data lake solutions like Snowflake, BigQuery, AWS Redshift, ADLS, and S3.
• Good knowledge of cloud compute services and load balancing.
• Good knowledge of cloud identity management, authentication, and authorization.
• Proficiency in using cloud utility functions such as AWS Lambda, AWS Step Functions, Cloud Run, Cloud Functions, and Azure Functions.
• Experience in using cloud data integration services for structured, semi-structured, and unstructured data, such as Azure Databricks, Azure Data Factory, Azure Synapse Analytics, AWS Glue, AWS EMR, Dataflow, and Dataproc.
Your Profile
• Good knowledge of infra capacity sizing and costing of cloud services to drive optimized solution architecture, leading to optimal infra investment vs. performance and scaling.
• Able to contribute to making architectural choices using various cloud services and solution methodologies.
• Expertise in programming using Python.
• Very good knowledge of cloud DevOps practices such as infrastructure as code, CI/CD components, and automated deployments on cloud.
• Must understand networking, security, design principles, and best practices in cloud.
What you will love about working here
We recognize the significance of flexible work arrangements to provide support. Be it remote work or flexible work hours, you will get an environment to maintain a healthy work-life balance. At the heart of our mission is your career growth. Our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies such as Generative AI.
About Capgemini
Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, cloud and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.
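For the "cloud utility functions" this role mentions, here is a minimal sketch of an HTTP-triggered Cloud Function written against the functions-framework package; the endpoint logic and field names are hypothetical placeholders.

```python
# Sketch: an HTTP-triggered Cloud Function using functions-framework.
# The payload fields and enrichment logic are hypothetical placeholders.
import functions_framework


@functions_framework.http
def enrich(request):
    """Validate a JSON payload and return an enriched copy."""
    payload = request.get_json(silent=True)
    if not payload or "customer_id" not in payload:
        # Dict responses are serialized to JSON by the framework.
        return ({"error": "customer_id is required"}, 400)
    payload["processed"] = True
    return (payload, 200)
```

Locally this can be exercised with `functions-framework --target=enrich` and a POST of a small JSON body, before deploying the same handler to Cloud Functions.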

Posted 2 weeks ago

Apply

9.0 - 14.0 years

17 - 22 Lacs

Noida, Gurugram, Coimbatore

Work from Office

Your role
• Strong experience working on ServiceNow implementations/enhancements.
• Hands-on experience in the ServiceNow ITOM module.
• Must have experience implementing CMDB/Discovery/Service Mapping/Event Management solutions, event rules, alert rules, sub-flow creation, assignment rules, and RegEx.
• Must have PowerShell/Shell and JavaScript knowledge for use-case development.
• Working knowledge of ServiceNow IT Operations Management solutions; able to build or modify custom patterns and troubleshoot.
• Experienced in working with third-party integrations.
Your profile
• ServiceNow ITOM module
• Third-party integrations
• CMDB/Discovery/Service Mapping/Event Management
• Certifications: ServiceNow Certified System Administrator, CIS Event Management
• Should understand CI binding and correlation logic in ServiceNow.
What you'll love about working here
You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group. You will also get personalized career guidance from our leaders. You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, and new-parent support via flexible work. At Capgemini, you can work on cutting-edge projects in tech and engineering with industry leaders or create solutions to overcome societal and environmental challenges.
Location: Gurugram, Coimbatore, Noida, Chennai, Mumbai, Pune, Hyderabad, Bengaluru

Posted 2 weeks ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Pune

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Google BigQuery
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years or more of full-time education
Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using Google BigQuery. Your typical day will involve collaborating with cross-functional teams, analyzing business requirements, and developing scalable solutions to meet the needs of our clients.
Roles & Responsibilities:
• Design, build, and configure applications to meet business process and application requirements using Google BigQuery.
• Collaborate with cross-functional teams to analyze business requirements and develop scalable solutions to meet the needs of our clients.
• Develop and maintain technical documentation, including design documents, test plans, and user manuals.
• Ensure the quality of deliverables by conducting thorough testing and debugging of applications.
Professional & Technical Skills:
• Must-have: Proficiency in Google BigQuery.
• Good-to-have: Experience with other cloud-based data warehousing solutions such as Amazon Redshift or Snowflake.
• Strong understanding of SQL and database design principles.
• Experience with ETL tools and processes.
• Experience with programming languages such as Python or Java.
Additional Information:
• The candidate should have a minimum of 5 years of experience in Google BigQuery.
• The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful data-driven solutions.
• This position is based at our Bengaluru office.
Qualification: 15 years or more of full-time education

Posted 2 weeks ago

Apply

16.0 - 25.0 years

15 - 20 Lacs

Bengaluru

Work from Office

Skill required: Tech for Operations - Technological Innovation
Designation: Program & Project Mgmt Senior Manager
Qualifications: Any Graduation
Years of Experience: 16 to 25 years
About Accenture: Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com.
What would you do: You will be part of the Technology for Operations team that acts as a trusted advisor and partner to Accenture Operations. The team provides innovative and secure technologies to help clients build an intelligent operating model, driving exceptional results, and works closely with the sales, offering, and delivery teams to identify and build innovative solutions. Major sub-deals include AHO (Application Hosting Operations), ISMT (Infrastructure Management), and Intelligent Automation. In Technology Innovation, you will be working in the scientific field of innovation studies, which serves to explain the nature and rate of technological change. You will have to understand new products, processes, and significant technological changes of products and processes.
What are we looking for / Roles and Responsibilities:
• Identify and assess complex problems for your area(s) of responsibility.
• Create solutions in situations where the analysis requires in-depth knowledge of organizational objectives.
• Be involved in setting strategic direction to establish near-term goals for your area(s) of responsibility.
• Interact with senior management levels at a client and/or within Accenture, involving negotiating or influencing on significant matters.
• Exercise latitude in decision-making and in determining objectives and approaches to critical assignments; your decisions have a lasting impact on your area of responsibility, with the potential to impact areas outside it.
• Manage large teams and/or work efforts (if in an individual contributor role) at a client or within Accenture.
• Please note that this role may require you to work in rotational shifts.
Qualification: Any Graduation

Posted 2 weeks ago

Apply

5.0 - 9.0 years

20 - 30 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

- Design, develop & maintain data pipelines using GCP services: Dataflow, BigQuery, and Pub/Sub
- Provision infrastructure on GCP using IaC with Terraform
- Implement & manage data warehouse solutions
- Monitor and resolve issues in data workflows
Required Candidate profile
- Expertise in GCP, Apache Beam, Dataflow, & BigQuery
- Proficient in Python, SQL, PySpark
- Worked with Cloud Composer for orchestration
- Solid understanding of DWH, ETL pipelines, and real-time data streaming
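Since this listing asks for PySpark alongside the GCP stack, here is a minimal sketch of the kind of batch job typically submitted to Dataproc; the input and output paths are hypothetical placeholders.

```python
# Sketch of a PySpark batch job of the kind run on Dataproc.
# Input/output paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-rollup").getOrCreate()

# Read raw JSON events from Cloud Storage.
events = spark.read.json("gs://my-bucket/raw/events/*.json")

# Aggregate to one row per day and event type.
daily = (
    events
    .withColumn("day", F.to_date("event_ts"))
    .groupBy("day", "event_type")
    .agg(F.count("*").alias("events"), F.sum("amount").alias("amount"))
)

daily.write.mode("overwrite").parquet("gs://my-bucket/curated/daily/")
spark.stop()
```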

Posted 2 weeks ago

Apply

5.0 - 10.0 years

20 - 27 Lacs

Hyderabad, Chennai

Hybrid

Role: Python Developer with GCP
Location: Chennai / Hyderabad
Employment Type: Full time
Experience Required: 5+ years
Shift: 1 PM to 9 PM or 2 PM to 10 PM
Primary Skills:
- 5+ years of hands-on experience as a Python developer
- 2+ years of experience with GCP and BigQuery
- Good communication skills
- Team player

Posted 2 weeks ago

Apply

7.0 - 10.0 years

18 - 25 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Data Engineer: You will be responsible for designing, building, and maintaining our data infrastructure, ensuring data quality, and enabling data-driven decision-making across the organization. The ideal candidate will have a strong background in data engineering, excellent problem-solving skills, and a passion for working with data.
Responsibilities:
• Design, build, and maintain our data infrastructure, including data pipelines, warehouses, and databases.
• Ensure data quality and integrity by implementing data validation, testing, and monitoring processes.
• Collaborate with cross-functional teams to understand data needs and translate them into technical requirements.
• Develop and implement data security and privacy policies and procedures.
• Optimize data processing and storage performance, ensuring scalability and reliability.
• Stay up-to-date with the latest data engineering trends and technologies.
• Provide mentorship and guidance to junior data engineers and analysts.
• Contribute to the development of data-driven solutions and products.
Requirements:
• 3+ years of experience in data engineering, with a Bachelor's degree in Computer Science, Engineering, or a related field.
• Strong knowledge of data engineering tools and technologies, including SQL and GCP.
• Experience with big data processing frameworks such as Spark or Hadoop.
• Experience with data warehousing solutions: BigQuery.
• Strong problem-solving skills, with the ability to analyze complex data sets and identify trends and insights.
• Excellent communication and collaboration skills, with the ability to work with cross-functional teams and stakeholders.
• Strong data security and privacy knowledge and experience.
• Experience with agile development methodologies is a plus.

Posted 2 weeks ago

Apply

3.0 - 6.0 years

15 - 20 Lacs

Hyderabad

Hybrid

Hello,
Urgent job openings for a Data Engineer role @ GlobalData (Hyd). The job description is given below; please go through it to understand the requirement. If the requirement matches your profile and you are interested in applying, please share your updated resume to m.salim@globaldata.com with the subject line: "Applying for Data Engineer @ GlobalData (Hyd)".
Share your details in the mail:
Full Name:
Mobile #:
Qualification:
Company Name:
Designation:
Total Work Experience (Years):
Years of experience working on Snowflake/Google BigQuery:
Current CTC:
Expected CTC:
Notice Period:
Current Location / willing to relocate to Hyd?:
Office Address: 3rd Floor, Jyoti Pinnacle Building, Opp. Prestige IVY League Apartments, Kondapur Road, Hyderabad, Telangana - 500081.
Job Description:
We are looking for a skilled and experienced Data Delivery Specification (DDS) Engineer to join our data team. The DDS Engineer will be responsible for designing, developing, and maintaining robust data pipelines and delivery mechanisms, ensuring timely and accurate data delivery to various stakeholders. This role requires strong expertise in cloud data platforms such as AWS, Snowflake, and Google BigQuery, along with a deep understanding of data warehousing concepts.
Key Responsibilities:
• Design, develop, and optimize data pipelines for efficient data ingestion, transformation, and delivery from various sources to target systems.
• Implement and manage data delivery solutions using cloud platforms like AWS (S3, Glue, Lambda, Redshift), Snowflake, and Google BigQuery.
• Collaborate with data architects, data scientists, and business analysts to understand data requirements and translate them into technical specifications.
• Develop and maintain DDS documents outlining data sources, transformations, quality checks, and delivery schedules.
• Ensure data quality, integrity, and security throughout the data lifecycle.
• Monitor data pipelines, troubleshoot issues, and implement solutions to ensure continuous data flow.
• Optimize data storage and query performance on cloud data warehouses.
• Implement automation for data delivery processes and monitoring.
• Stay current with new data technologies and best practices in data engineering and cloud platforms.
Required Skills & Qualifications:
• Bachelor's or Master's degree in Computer Science, Data Engineering, or a related quantitative field.
• 4+ years of experience in data engineering, with a focus on data delivery and warehousing.
• Proven experience with cloud data platforms, specifically: AWS (S3, Glue, Lambda, Redshift, or other relevant data services); Snowflake (strong experience with data warehousing, SQL, and performance optimization); Google BigQuery (experience with data warehousing, SQL, and data manipulation).
• Proficient in SQL for complex data querying, manipulation, and optimization.
• Experience with scripting languages (e.g., Python) for data pipeline automation.
• Solid understanding of data warehousing concepts, ETL/ELT processes, and data modeling.
• Experience with version control systems (e.g., Git).
• Excellent problem-solving skills and attention to detail.
• Strong communication and collaboration skills, with the ability to work effectively with cross-functional teams.
Thanks & Regards,
Salim (Human Resources)

Posted 2 weeks ago

Apply

5.0 - 10.0 years

12 - 20 Lacs

Noida, Gurugram, Delhi / NCR

Hybrid

Responsibilities:
• Build and manage data infrastructure on AWS, including S3, Glue, Lambda, OpenSearch, Athena, and CloudWatch, using an IaC tool like Terraform.
• Design and implement scalable ETL pipelines with integrated validation and monitoring.
• Set up data quality frameworks using tools like Great Expectations, integrated with PostgreSQL or AWS Glue jobs.
• Implement automated validation checks at key points in the data flow: post-ingest, post-transform, and pre-load (see the sketch after this list).
• Build centralized logging and alerting pipelines (e.g., using CloudWatch Logs, Fluent Bit, SNS, Filebeat, Logstash, or third-party tools).
• Define CI/CD processes for deploying and testing data pipelines (e.g., using Jenkins or GitHub Actions).
• Collaborate with developers and data engineers to enforce schema versioning, rollback strategies, and data contract enforcement.
Preferred candidate profile:
• 5+ years of experience in DataOps, DevOps, or data infrastructure roles.
• Proven experience with infrastructure as code (e.g., Terraform, CloudFormation).
• Proven experience with real-time data streaming platforms (e.g., Kinesis, Kafka).
• Proven experience building production-grade data pipelines and monitoring systems in AWS.
• Hands-on experience with tools like AWS Glue, S3, Lambda, Athena, and CloudWatch.
• Strong knowledge of Python and scripting for automation and orchestration.
• Familiarity with data validation frameworks such as Great Expectations, Deequ, or dbt tests.
• Experience with SQL-based data systems (e.g., PostgreSQL).
• Understanding of security, IAM, and compliance best practices in cloud data environments.
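Here is a minimal sketch of a post-ingest validation gate in the spirit of the frameworks named above (Great Expectations, Deequ, dbt tests), written as plain pandas checks rather than any one framework's API; the column names, thresholds, and staging path are hypothetical placeholders, and reading from S3 assumes s3fs is installed.

```python
# Sketch: a post-ingest data quality gate, framework-agnostic.
# Column names, thresholds, and the staging path are placeholders.
import pandas as pd


def validate_batch(df: pd.DataFrame) -> list[str]:
    """Return human-readable failures; an empty list means the batch passes."""
    failures = []
    if df.empty:
        failures.append("batch is empty")
    if df["order_id"].duplicated().any():
        failures.append("duplicate order_id values")
    if df["amount"].lt(0).any():
        failures.append("negative amounts present")
    null_rate = df["customer_id"].isna().mean()
    if null_rate > 0.01:
        failures.append(f"customer_id null rate {null_rate:.1%} exceeds 1%")
    return failures


batch = pd.read_parquet("s3://my-bucket/staging/orders.parquet")
problems = validate_batch(batch)
if problems:
    # Failing loudly here keeps bad data out of the downstream load step.
    raise ValueError("validation failed: " + "; ".join(problems))
```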

Posted 2 weeks ago

Apply

5.0 - 10.0 years

15 - 27 Lacs

Hyderabad, Bengaluru

Work from Office

Role & responsibilities
• Strong proficiency in Java, with hands-on experience developing data processing components.
• Practical experience with Google Cloud Pub/Sub or similar streaming platforms such as Kafka.
• Proficient in JSON schema design and data serialization techniques.
• Skilled in designing BigQuery views optimized for performance and ease of consumption (a sketch follows below).
• Excellent communication and teamwork abilities.
Experience: 5+ years. Notice period: immediate to 15 days. Location: Bengaluru / Hyderabad. Contact: Mok@teksystems.com
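As an illustration of the "consumption-friendly BigQuery view" work this role describes, here is a minimal sketch that creates a latest-record-per-key view with the google-cloud-bigquery client (Python used here for brevity, although the listing is Java-centric); the project, dataset, and table names are hypothetical placeholders.

```python
# Sketch: create a consumption-friendly BigQuery view over a raw table.
# Project, dataset, and table names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

view = bigquery.Table("my-project.analytics.orders_latest")
# Keep only the newest event per order_id so consumers never dedupe.
view.view_query = """
    SELECT order_id, amount, status, event_ts
    FROM `my-project.raw.order_events`
    QUALIFY ROW_NUMBER() OVER (
        PARTITION BY order_id ORDER BY event_ts DESC) = 1
"""
client.create_table(view, exists_ok=True)
```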

Posted 2 weeks ago

Apply

7.0 - 10.0 years

32 - 40 Lacs

Bengaluru

Work from Office

Job Title: Project & Change Lead, AVP
Location: Bangalore, India
Role Description
We are looking for an experienced Business Implementation Change Manager to lead a variety of regional/global change initiatives. Utilizing the tenets of PMI, you will lead and/or support cross-functional initiatives that transform the way we run our operations. If you like to solve complex problems, have a "gets things done" attitude, and are looking for a highly visible, dynamic role where your voice is heard and your experience is appreciated, come talk to us!
What we'll offer you
• 100% reimbursement under childcare assistance benefit (gender neutral)
• Sponsorship for industry-relevant certifications and education
• Accident and term life insurance
Your key responsibilities
• Responsible for Business Implementation change management planning, execution, and reporting, adhering to governance standards and ensuring transparency around progress status.
• Use data to tell the implementation story; maintain risk management controls; monitor, resolve as appropriate, and communicate initiative risks; collaborate with other departments as required to execute on timelines to meet the strategic goals.
• As part of the larger team, accountable for the delivery and adoption of the global change portfolio, including but not limited to business case development/analysis, reporting, measurement and reporting of adoption success measures, and continuous improvement.
• As required, using data to tell the story, participate in Working Group and Steering Committee meetings to achieve the right level of decision-making and progress/transparency, establishing strong partnerships and collaborative relationships with various stakeholder groups to remove constraints to adoption success and carry learnings forward to future projects.
• As required, develop and document end-to-end roles and responsibilities, including process flows, operating procedures, required controls, gathering and documenting business requirements (user stories), liaising with end-users and performing analysis of gathered data, training on new features/functions, and supporting hypercare and adoption constraints.
• Heavily involved in the product development journey.
Your skills and experience
• Overall experience of at least 7-10 years providing business implementation management to complex change programs/projects, communicating and driving transformation initiatives using the tenets of PMI in a highly matrixed environment.
• Banking/Finance/regulated industry experience, of which at least 2 years should be in the change/transformation space or associated with change/transformation initiatives, is a plus.
• Knowledge of client lifecycle processes and procedures and experience with KYC data structures/data flows is preferred.
• Experience working with management reporting is preferred.
• Bachelor's degree.
How we'll support you
About us and our teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 2 weeks ago

Apply

15.0 - 20.0 years

35 - 40 Lacs

Pune

Work from Office

Job Title: Lead Engineer
Location: Pune, India
Role Description
The Engineer is responsible for managing or performing work across multiple areas of the bank's overall IT platform/infrastructure, including analysis, development, and administration. It may also involve taking functional oversight of engineering delivery for specific departments. Work includes:
• Planning and developing entire engineering solutions to accomplish business goals
• Building reliability and resiliency into solutions with appropriate testing and reviewing throughout the delivery lifecycle
• Ensuring maintainability and reusability of engineering solutions
• Ensuring solutions are well architected and can be integrated successfully into the end-to-end business process flow
• Reviewing engineering plans and quality to drive re-use and improve engineering capability
• Participating in industry forums to drive adoption of innovative technologies, tools and solutions in the Bank
What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
• 100% reimbursement under childcare assistance benefit (gender neutral)
• Sponsorship for industry-relevant certifications and education
• Accident and term life insurance
Your key responsibilities
The candidate is expected to:
• Be a hands-on engineering lead involved in analysis, design, design/code reviews, coding, and release activities
• Champion engineering best practices and guide/mentor the team to achieve high performance
• Work closely with business stakeholders, the Tribe Lead, the Product Owner, and the Lead Architect to successfully deliver the business outcomes
• Acquire functional knowledge of the business capability being digitized/re-engineered
• Demonstrate ownership, inspire others, think innovatively, keep a growth mindset, and collaborate for success
Your skills and experience
• Minimum 15 years of IT industry experience in full-stack development
• Expert in Java, Spring Boot, NodeJS, SQL/PLSQL, ReactJS
• Strong experience in big data processing: Apache Spark, Hadoop, BigQuery, Dataproc, Dataflow, etc.
• Strong experience in Kubernetes and the OpenShift container platform
• Experience with databases (Oracle, PostgreSQL, MongoDB, Redis/Hazelcast); should understand data modeling, normalization, and performance optimization
• Experience with message queues (RabbitMQ/IBM MQ, JMS) and data streaming, i.e. Kafka, Pub/Sub, etc.
• Experience working on public cloud: GCP preferred, AWS or Azure
• Knowledge of various distributed/multi-tiered architecture styles: microservices, data mesh, integration patterns, etc.
• Experience with modern software product delivery practices, processes, and tooling, and BizDevOps skills such as CI/CD pipelines using Jenkins, Git Actions, etc.
• Experience designing solutions based on DDD and implementing Clean/Hexagonal Architecture for efficient systems that can handle large-scale operation
• Experience leading teams and mentoring developers
• Focus on quality: experience with TDD, BDD, stress and contract tests
• Proficient in working with APIs (Application Programming Interfaces) and familiar with data formats like JSON, XML, YAML, Parquet, etc.
Key Skills: Java, Spring Boot, NodeJS, SQL/PLSQL, ReactJS
Advantageous:
• Prior experience in the Banking/Finance domain
• Experience with hybrid cloud solutions, preferably using GCP
• Experience with product development
How we'll support you
About us and our teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 2 weeks ago

Apply

16.0 - 21.0 years

37 - 45 Lacs

Bengaluru

Work from Office

Job Title: Technology Operations Lead
Location: Bangalore, India
Corporate Title: VP
Role Description
You will be operating within Corporate Bank Production as Production Support in a manager capacity in the Core Banking Services subdomain. The Core Banking Services domain under Corporate Bank serves critical corporate customers' message transactions, categorized under Liquidity, Payments Orchestration, Messaging and Surveillance, and Data. We ensure safe passage and clearing of payments and trade settlements with regulatory filtering and SWIFT connectivity.
What we'll offer you
• 100% reimbursement under childcare assistance benefit (gender neutral)
• Sponsorship for industry-relevant certifications and education
• Accident and term life insurance
Your key responsibilities
• Partner with, and influence, stakeholders globally from development, infrastructure, and production on risk identification, remediation solutions, and managing change conflicts to build momentum in optimizing processes and platforms across Production.
• Work as Regional Functional Lead for a suite of Messaging and Surveillance applications in corporate banking technology.
• Lead the team in driving a culture of proactive continual improvement in the Production environment through automation of manual work, monitoring improvements, and platform hygiene.
• Provide thought leadership with a Continuous Service Improvement approach to resolve IT failings, drive efficiencies, and remove repetition to streamline support activities, reduce risk, and improve system availability by understanding emerging trends and proactively addressing them.
• Carry out technical analysis of the Production platform to identify and remediate performance and resiliency issues.
• Understand business workflows and make recommendations for improvements (directly in workflows and/or analytics).
• Assist in the development of long-term organizational strategy to improve production.
• Track progress of strategic goals, monitor key performance indicators (KPIs), and provide periodic updates to senior management.
• Collaborate with internal and external stakeholders to ensure alignment of tactical initiatives in production with business goals.
• Provide data-driven insights and reports to support decision-making.
• Promote a culture of continuous improvement and foster innovation within the organization.
• Define and identify SLOs and measure application services; invest in and drive SRE culture.
Your skills and experience
• University degree with a technological or scientific focus, or equivalent working experience, ideally in the Financial Services/Banking industry.
• Extensive working experience (~16+ years) in the financial services industry and a clear understanding of Finance's key processes and systems.
• Leadership and people management experience working in a global matrix structure.
• Highly qualified, hands-on experience with Production Application Support and ITIL practices, with SRE knowledge and mindset.
• Proactive service management of all services provided across Businesses and Functions, ensuring services are delivered in accordance with the agreed SLA.
• Banking domain knowledge with a deep understanding of application support and/or development and complex IT infrastructure (UNIX, database, middleware, cloud, MQ, etc.).
• Good understanding of the most recent technologies, be it cloud (GCP, AWS, Azure), programming languages (Java, JavaScript, Python), databases (Postgres, BigQuery), or other solutions.
• Must be able to constantly improve processes and mechanisms based on learning and feedback from various stakeholders.
• Excellent partnering and communication skills as well as stakeholder management, combined with the ability to successfully navigate a complex organization, build strong relationships, and work collaboratively with other teams.
• Analytical aptitude and strong attention to detail, combined with a high level of commitment and the ability to deliver high-quality results within tight deadlines.
• Data analysis and visualization experience, with the ability to translate data analysis into meaningful commercial insights and visualize data to support decision-making processes.
• Excellent communication and interpersonal skills, together with the ability to explain complex concepts for non-technical stakeholders to understand.
• Strong analytical and problem-solving skills.
• Experience in project management and change management.
• High degree of emotional intelligence and cultural awareness.
• Result-oriented with a focus on strategic outcomes.
• Guides and drives customers, suppliers, and partners; makes decisions which influence the success of projects and team objectives; collaborates regularly with team members, users, cross-functional teams, and customers; engages to ensure that customers' and clients' needs are being met throughout.
• Works under general direction within a clear framework of accountability; plans own work to meet given objectives; able to work independently and manage multiple priorities.
• Communicates fluently, orally and in writing, and can present complex information to both technical and non-technical audiences; plans, schedules, and monitors work to meet time and quality targets; facilitates collaboration between stakeholders who share common objectives; fully understands the importance of security to own work and the operation of the organization.
Nice to have:
• Cloud services: GCP
• Experience with automation solutions (Ansible, Jenkins/Groovy, Python, Java)
• DevOps & Continuous Integration / Agile oriented
How we'll support you

Posted 2 weeks ago

Apply

7.0 - 10.0 years

32 - 40 Lacs

Jaipur

Work from Office

Job Title: Project & Change Lead, AVP
Location: Jaipur, India
Role Description
We are looking for an experienced Business Implementation Change Manager to lead a variety of regional/global change initiatives. Utilizing the tenets of PMI, you will lead and/or support cross-functional initiatives that transform the way we run our operations. If you like to solve complex problems, have a "gets things done" attitude, and are looking for a highly visible, dynamic role where your voice is heard and your experience is appreciated, come talk to us!
What we'll offer you
• 100% reimbursement under childcare assistance benefit (gender neutral)
• Sponsorship for industry-relevant certifications and education
• Accident and term life insurance
Your key responsibilities
• Responsible for Business Implementation change management planning, execution, and reporting, adhering to governance standards and ensuring transparency around progress status.
• Use data to tell the implementation story; maintain risk management controls; monitor, resolve as appropriate, and communicate initiative risks; collaborate with other departments as required to execute on timelines to meet the strategic goals.
• As part of the larger team, accountable for the delivery and adoption of the global change portfolio, including but not limited to business case development/analysis, reporting, measurement and reporting of adoption success measures, and continuous improvement.
• As required, using data to tell the story, participate in Working Group and Steering Committee meetings to achieve the right level of decision-making and progress/transparency, establishing strong partnerships and collaborative relationships with various stakeholder groups to remove constraints to adoption success and carry learnings forward to future projects.
• As required, develop and document end-to-end roles and responsibilities, including process flows, operating procedures, required controls, gathering and documenting business requirements (user stories), liaising with end-users and performing analysis of gathered data, training on new features/functions, and supporting hypercare and adoption constraints.
• Heavily involved in the product development journey.
Your skills and experience
• Overall experience of at least 7-10 years providing business implementation management to complex change programs/projects, communicating and driving transformation initiatives using the tenets of PMI in a highly matrixed environment.
• Banking/Finance/regulated industry experience, of which at least 2 years should be in the change/transformation space or associated with change/transformation initiatives, is a plus.
• Knowledge of client lifecycle processes and procedures and experience with KYC data structures/data flows is preferred.
• Experience working with management reporting is preferred.
• Bachelor's degree.
How we'll support you

Posted 2 weeks ago

Apply

5.0 - 10.0 years

11 - 16 Lacs

Bengaluru

Work from Office

Job Title: Senior GCP Data Engineer
Corporate Title: Associate
Location: Bangalore, India
Role Description
Deutsche Bank has set itself ambitious goals in the areas of Sustainable Finance, ESG Risk Mitigation, and Corporate Sustainability. As climate change presents new challenges and opportunities, the Bank has set out to invest in developing a Sustainability Technology Platform, sustainability data products, and various sustainability applications which will aid the Bank's goals. As part of this initiative, we are building an exciting global team of technologists who are passionate about climate change and want to contribute to the greater good by leveraging their technology skill set in cloud/hybrid architecture. In this role, we are seeking a highly motivated and experienced Senior GCP Data Engineer to join our team. You will play a critical part in designing, developing, and maintaining robust data pipelines that transform raw data into valuable insights for our organization.
What we'll offer you
• 100% reimbursement under childcare assistance benefit (gender neutral)
• Sponsorship for industry-relevant certifications and education
• Accident and term life insurance
Your key responsibilities
• Design, develop, and maintain data pipelines using GCP services like Dataflow, Dataproc, and Pub/Sub.
• Develop and implement data ingestion and transformation processes using tools like Apache Beam and Apache Spark.
• Manage and optimize data storage solutions on GCP, including BigQuery, Cloud Storage, and Cloud SQL.
• Implement data security and access controls using GCP's Identity and Access Management (IAM) and Cloud Security Command Center.
• Monitor and troubleshoot data pipelines and storage solutions using GCP's Stackdriver and Cloud Monitoring tools.
• Collaborate with data experts, analysts, and product teams to understand data needs and deliver effective solutions.
• Automate data processing tasks using scripting languages like Python.
• Participate in code reviews and contribute to establishing best practices for data engineering on GCP.
• Stay up to date on the latest advancements and innovations in GCP services and technologies.
Your skills and experience
• 5+ years of experience as a Data Engineer or in a similar role.
• Proven expertise in designing, developing, and deploying data pipelines.
• In-depth knowledge of Google Cloud Platform (GCP) and its core data services (GCS, BigQuery, Cloud Storage, Dataflow, etc.).
• Strong proficiency in Python and SQL for data manipulation and querying.
• Experience with distributed data processing frameworks like Apache Beam or Apache Spark (a plus).
• Familiarity with data security and access control principles.
• Excellent communication, collaboration, and problem-solving skills.
• Ability to work independently, manage multiple projects, and meet deadlines.
• Knowledge of Sustainable Finance / ESG Risk / CSRD / Regulatory Reporting will be a plus.
• Knowledge of cloud infrastructure and data governance best practices will be a plus.
• Knowledge of Terraform will be a plus.
How we'll support you
About us and our teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 2 weeks ago

Apply