660 BigQuery Jobs - Page 8

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

5.0 - 7.0 years

7 - 9 Lacs

Bengaluru

Work from Office

Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Ability to analyse data for functional business requirements and interface directly with customers. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform; skilled in multiple GCP services (GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer). You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies. Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work. Preferred technical and professional experience: intuitive individual with an ability to manage change and proven time management; proven interpersonal skills while contributing to team effort by accomplishing related results as needed; keeps technical knowledge up to date by attending educational workshops and reviewing publications.
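For orientation, this is roughly what the "Python and SQL work experience with BigQuery" requirement looks like in practice. A minimal sketch using the google-cloud-bigquery Python client; the project ID, table, and query are hypothetical placeholders, not taken from the posting:

```python
# Minimal BigQuery query from Python -- an illustrative sketch, not any employer's code.
# Assumes `pip install google-cloud-bigquery` and application-default credentials.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project ID

query = """
    SELECT customer_id, COUNT(*) AS order_count
    FROM `my-project.sales.orders`              -- hypothetical table
    WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
    GROUP BY customer_id
    ORDER BY order_count DESC
    LIMIT 10
"""

for row in client.query(query).result():  # submits the job and waits for results
    print(f"{row.customer_id}: {row.order_count}")
```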

Posted 1 week ago

Apply

5.0 - 7.0 years

7 - 9 Lacs

Gurugram

Work from Office

Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Ability to analyse data for functional business requirements and interface directly with customers. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform; skilled in multiple GCP services (GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer). You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies. Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work. Preferred technical and professional experience: intuitive individual with an ability to manage change and proven time management; proven interpersonal skills while contributing to team effort by accomplishing related results as needed; keeps technical knowledge up to date by attending educational workshops and reviewing publications.

Posted 1 week ago

Apply

4.0 - 6.0 years

5 - 15 Lacs

Bengaluru

Work from Office

Immediate joiners only.
- 5+ years of software development experience.
- 5+ years of experience in Python or Java.
- Hands-on experience writing and understanding complex SQL (Hive/PySpark data frames), optimizing joins while processing huge amounts of data.
- 3+ years of hands-on experience working with MapReduce, Hive, and Spark (core, SQL, and PySpark).
- Hands-on experience with Google Cloud Platform (BigQuery, Dataproc, Cloud Composer).
- 3+ years of experience in UNIX shell scripting.
- Experience in analysis, design, development, testing, and implementation of system applications.
- Ability to effectively communicate with internal and external business partners.
Additional good-to-have requirements:
- Understanding of distributed ecosystems.
- Experience designing and building solutions using Kafka streams or queues.
- Experience with NoSQL stores, i.e., HBase, Cassandra, Couchbase, or MongoDB.
- Experience with data visualization tools like Tableau, Sisense, Looker.
- Ability to learn and apply new programming concepts.
- Knowledge of the financial reporting ecosystem is a plus.
- Experience leading teams of engineers and scrum teams.
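The "optimizing joins while processing huge amounts of data" line above is the kind of problem a broadcast join addresses in PySpark. A minimal sketch; the storage paths, column names, and table shapes are hypothetical:

```python
# Broadcast-join optimization in PySpark -- illustrative only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("join-opt-sketch").getOrCreate()

orders = spark.read.parquet("gs://bucket/orders")        # large fact table (hypothetical path)
countries = spark.read.parquet("gs://bucket/countries")  # small dimension table

# Broadcasting the small table ships it to every executor,
# avoiding a shuffle of the large table across the cluster.
enriched = orders.join(F.broadcast(countries), on="country_code", how="left")
enriched.groupBy("country_name").agg(F.sum("amount").alias("revenue")).show()
```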

Posted 1 week ago

Apply

5.0 - 10.0 years

20 - 35 Lacs

Kolkata

Hybrid

Role: GCP Data Engineer. Experience: 4+ years. Preferred: Data Engineering background. Location: Kolkata (face-to-face interview). Required skills: GCP data engineering experience, BigQuery, SQL, Cloud Composer/Python, Cloud Functions, Dataproc + PySpark, Python-based data ingestion, Dataflow + Pub/Sub.

Job description:
- Have implemented and architected solutions on Google Cloud Platform using the components of GCP.
- Experience with Apache Beam/Google Dataflow/Apache Spark in creating end-to-end data pipelines.
- Experience in some of the following: Python, Hadoop, Spark, SQL, BigQuery, Bigtable, Cloud Storage, Datastore, Spanner, Cloud SQL, Machine Learning.
- Experience programming in Java, Python, etc.
- Expertise in at least two of these technologies: relational databases, analytical databases, NoSQL databases.
- Certification as a Google Professional Data Engineer/Solution Architect is a major advantage.

Skills required: 3+ years of experience in IT, or professional services experience in IT delivery or large-scale IT analytics projects. Candidates must have expert knowledge of Google Cloud Platform; other cloud platforms are nice to have. Expert knowledge in SQL development. Expertise in building data integration and preparation tools using cloud technologies (such as SnapLogic, Google Dataflow, Cloud Dataprep, Python, etc.). Experience with Apache Beam/Google Dataflow/Apache Spark in creating end-to-end data pipelines. Experience in some of the following: Python, Hadoop, Spark, SQL, BigQuery, Bigtable, Cloud Storage, Datastore, Spanner, Cloud SQL, Machine Learning. Experience programming in Java, Python, etc. Identify downstream implications of data loads/migration (e.g., data quality, regulatory, etc.). Implement data pipelines to automate the ingestion, transformation, and augmentation of data sources, and provide best practices for pipeline operations. Capability to work in a rapidly changing business environment and to enable simplified user access to massive data by building scalable data solutions. Advanced SQL writing and experience in data mining (SQL, ETL, data warehouse, etc.) and using databases in a business environment with complex datasets. Interested candidates, please revert with your updated CV to aruna.b@tredence.com
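As an illustration of the "Dataflow + Pub/Sub" end-to-end pipeline work this posting describes, here is a minimal Apache Beam sketch in Python. The topic, table, and schema are hypothetical, and DataflowRunner options are omitted for brevity:

```python
# Minimal streaming pipeline sketch: Pub/Sub -> parse -> BigQuery.
# Illustrative only; assumes `pip install apache-beam[gcp]`.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)  # add runner/project flags for Dataflow

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromPubSub(topic="projects/my-project/topics/events")
        | "Parse" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "Write" >> beam.io.WriteToBigQuery(
            "my-project:analytics.events",  # hypothetical table
            schema="user_id:STRING,event:STRING,ts:TIMESTAMP",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```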

Posted 1 week ago

Apply

5.0 - 9.0 years

20 - 35 Lacs

Noida, Gurugram, Delhi / NCR

Hybrid

Salary: 20 to 35 LPA. Exp: 5 to 8 years. Location: Gurgaon (Hybrid). Notice: Immediate to 30 days.

Roles and Responsibilities: Design, develop, test, deploy, and maintain large-scale data pipelines using GCP services such as BigQuery, Dataflow, Pub/Sub, Dataproc, and Cloud Storage. Collaborate with cross-functional teams to identify business requirements and design solutions that meet those needs. Develop complex SQL queries to extract insights from large datasets stored in Google Cloud SQL databases. Troubleshoot issues related to data processing workflows and provide timely resolutions.

Desired Candidate Profile: 5-9 years of experience in Data Engineering with expertise in GCP and BigQuery data engineering. Strong understanding of GCP platform administration, including Compute Engine (Dataproc), Kubernetes Engine (K8s), Cloud Storage, Cloud SQL, etc. Experience working on big data analytics projects involving ETL processes using tools like Airflow or similar technologies.
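The Airflow-based ETL experience this profile asks for usually means authoring scheduled DAGs. A minimal sketch; the DAG ID, schedule, and task bodies are hypothetical placeholders:

```python
# Minimal Airflow 2.x DAG sketch for a daily ETL -- illustrative only.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw data from source")   # placeholder task body

def transform():
    print("clean and aggregate")

def load():
    print("load into BigQuery")

with DAG(
    dag_id="daily_etl_sketch",
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3  # linear dependency chain
```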

Posted 1 week ago

Apply

4.0 - 6.0 years

5 - 8 Lacs

Gurugram

Work from Office

Required Skills:
- Strong expertise in the NestJS framework.
- Proficiency in building and managing a microservices architecture.
- Hands-on experience with Apache Kafka for real-time data streaming and messaging.
- Experience with Google Cloud Platform (GCP) services, including but not limited to Cloud Functions, Cloud Run, Pub/Sub, BigQuery, and Kubernetes Engine.
- Familiarity with RESTful APIs, database systems (SQL/NoSQL), and performance optimization.
- Solid understanding of version control systems, particularly Git.

Preferred Skills:
- Knowledge of containerization using Docker.
- Experience with automated testing frameworks and methodologies.
- Understanding of monitoring, logging, and observability tools and practices.

Responsibilities:
- Design, develop, and maintain backend services using NestJS within a microservices architecture.
- Implement robust messaging and event-driven architectures using Kafka.
- Deploy, manage, and optimize applications and services on Google Cloud Platform.
- Ensure high performance, scalability, reliability, and security of backend services.
- Collaborate closely with front-end developers, product managers, and DevOps teams.
- Write clean, efficient, and maintainable code, adhering to best practices and coding standards.
- Perform comprehensive testing and debugging, addressing production issues promptly.

The job location is in office, based out of Gurgaon. The selected candidate needs to have their own laptop.

Posted 1 week ago

Apply

4.0 - 8.0 years

20 - 35 Lacs

Noida, Gurugram, Delhi / NCR

Hybrid

Salary: 20 to 35 LPA. Exp: 4 to 8 years. Location: Gurgaon. Notice: Immediate to 30 days. Key Skills: GCP, Cloud, Pub/Sub, Data Engineer.

Posted 1 week ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

Bengaluru

Remote

Hiring for a US-based multinational company (MNC). We are seeking a skilled and detail-oriented Data Engineer to join our team. In this role, you will design, build, and maintain scalable data pipelines and infrastructure to support business intelligence, analytics, and machine learning initiatives. You will work closely with data scientists, analysts, and software engineers to ensure that high-quality data is readily available and usable.

Responsibilities:
- Design and implement scalable, reliable, and efficient data pipelines for processing and transforming large volumes of structured and unstructured data.
- Build and maintain data architectures, including databases, data warehouses, and data lakes.
- Collaborate with data analysts and scientists to support their data needs and ensure data integrity and consistency.
- Optimize data systems for performance, cost, and scalability.
- Implement data quality checks, validation, and monitoring processes.
- Develop ETL/ELT workflows using modern tools and platforms.
- Ensure data security and compliance with relevant data protection regulations.
- Monitor and troubleshoot production data systems and pipelines.

Requirements:
- Proven experience as a Data Engineer or in a similar role.
- Strong proficiency in SQL and at least one programming language such as Python, Scala, or Java.
- Experience with data pipeline tools such as Apache Airflow, Luigi, or similar.
- Familiarity with modern data platforms and tools: big data (Hadoop, Spark); data warehousing (Snowflake, Redshift, BigQuery, Azure Synapse); databases (PostgreSQL, MySQL, MongoDB).
- Experience with cloud platforms (AWS, Azure, or GCP).
- Knowledge of data modeling, schema design, and ETL best practices.
- Strong analytical and problem-solving skills.

Posted 1 week ago

Apply

6.0 - 11.0 years

17 - 30 Lacs

Hyderabad/Secunderabad, Bangalore/Bengaluru, Delhi / NCR

Hybrid

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose, the relentless pursuit of a world that works better for people, we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of GCP Sr. Data Engineer. We are seeking a highly experienced and visionary Senior Google Cloud Data Engineer to spearhead the design, development, and optimization of our data infrastructure and pipelines on the Google Cloud Platform (GCP). With over 10 years of hands-on experience in data engineering, you will be instrumental in building scalable, reliable, and performant data solutions that power our advanced analytics, machine learning initiatives, and real-time reporting. You will provide technical leadership, mentor team members, and champion best practices for data engineering within a GCP environment.

Responsibilities:
- Architect, design, and implement end-to-end data pipelines on GCP using services like Dataflow, Cloud Composer (Airflow), Pub/Sub, and BigQuery.
- Build and optimize data warehousing solutions leveraging BigQuery's capabilities for large-scale data analysis.
- Design and implement data lakes on Google Cloud Storage, ensuring efficient data organization and accessibility.
- Develop and maintain scalable ETL/ELT processes to ingest, transform, and load data from diverse sources into GCP.
- Implement robust data quality checks, monitoring, and alerting mechanisms within the GCP data ecosystem.
- Collaborate closely with data scientists, analysts, and business stakeholders to understand their data requirements and deliver high-impact solutions on GCP.
- Lead the evaluation and adoption of new GCP data engineering services and technologies.
- Implement and enforce data governance policies, security best practices, and compliance requirements within the Google Cloud environment.
- Provide technical guidance and mentorship to other data engineers on the team, promoting knowledge sharing and skill development within the GCP context.
- Troubleshoot and resolve complex data-related issues within the GCP infrastructure.
- Contribute to the development of data engineering standards, best practices, and comprehensive documentation specific to GCP.

Qualifications we seek in you!

Minimum qualifications / skills:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 10+ years of progressive experience in data engineering roles, with a strong focus on cloud technologies.
- Deep and demonstrable expertise with the Google Cloud Platform (GCP) and its core data engineering services (e.g., BigQuery, Dataflow, Cloud Composer, Cloud Storage, Pub/Sub, Cloud Functions).
- Extensive experience designing, building, and managing large-scale data pipelines and ETL/ELT workflows specifically on GCP.
- Strong proficiency in SQL and at least one programming language relevant to data engineering on GCP (e.g., Python).
- Comprehensive understanding of data warehousing concepts, data modeling techniques optimized for BigQuery, and NoSQL database options on GCP (e.g., Cloud Bigtable, Firestore).
- Solid grasp of data governance principles, data security best practices within GCP (IAM, KMS), and compliance frameworks.
- Excellent problem-solving, analytical, and debugging skills within a cloud environment.
- Exceptional communication, collaboration, and presentation skills, with the ability to articulate technical concepts clearly to various audiences.

Preferred qualifications / skills:
- Google Cloud certifications relevant to data engineering (e.g., Professional Data Engineer).
- Experience with infrastructure-as-code tools for GCP (e.g., Terraform, Deployment Manager).
- Familiarity with data streaming technologies on GCP (e.g., Dataflow, Pub/Sub).
- Experience with machine learning workflows and MLOps on GCP (e.g., Vertex AI).
- Knowledge of containerization technologies (Docker, Kubernetes) and their application within GCP data pipelines (e.g., Dataflow FlexRS).
- Experience with data visualization tools that integrate well with GCP (e.g., Looker).
- Familiarity with data cataloging and data lineage tools on GCP (e.g., Data Catalog).
- Proven experience in leading technical teams and mentoring junior engineers in a GCP environment.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Posted 1 week ago

Apply

5.0 - 9.0 years

10 - 17 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Job Summary: Skill: Data Engineering. Experience: 5-8 years. Location: Hyderabad, Chennai, Bangalore, Noida, Gurgaon, Mumbai, Pune, or Kochi (Bangalore highly preferred). Notice period: 15 days, 30 days, or currently serving notice period. Key skills: GCP + BigQuery + Cube. Candidates should have experience with Cube Cloud or Cube. Senior level, able to work independently on given tasks. Required cloud certification: GCP professional engineer certification.

Posted 1 week ago

Apply

7.0 - 10.0 years

15 - 30 Lacs

Pune

Hybrid

Looking for 7–10 yrs exp (4+ in data modeling, 2–3 in Data Vault 2.0). Must know DBT, Dagster/Airflow, GCP (BigQuery, CloudSQL), and data modeling. DV 2.0 hands-on is a must. Docker is a plus.

Posted 1 week ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Bengaluru

Remote

Job Title: Offshore Data Engineer. Base Location: Bangalore. Work Mode: Remote. Experience: 5+ Years.

Job Description: We are looking for a skilled Offshore Data Engineer with strong experience in Python, SQL, and Apache Beam. Familiarity with Java is a plus. The ideal candidate should be self-driven, collaborative, and able to work in a fast-paced environment.

Key Responsibilities:
- Design and implement reusable, scalable ETL frameworks using Apache Beam and GCP Dataflow.
- Develop robust data ingestion and transformation pipelines using Python and SQL.
- Integrate Kafka for real-time data streams alongside batch workloads.
- Optimize pipeline performance and manage costs within GCP services.
- Work closely with data analysts, data architects, and product teams to gather and understand data requirements.
- Manage and monitor BigQuery datasets, tables, and partitioning strategies.
- Implement error handling, resiliency, and observability mechanisms across pipeline components.
- Collaborate with DevOps teams to enable automated delivery (CI/CD) for data pipeline components.

Required Skills:
- 5+ years of hands-on experience in Data Engineering or Software Engineering.
- Proficiency in Python and SQL.
- Good understanding of Java (for reading or modifying codebases).
- Experience building ETL pipelines with Apache Beam and Google Cloud Dataflow.
- Hands-on experience with Apache Kafka for stream processing.
- Solid understanding of BigQuery and data modeling on GCP.
- Experience with GCP services (Cloud Storage, Pub/Sub, Cloud Composer, etc.).

Good to Have:
- Experience building reusable ETL libraries or framework components.
- Knowledge of data governance, data quality checks, and pipeline observability.
- Familiarity with Apache Airflow or Cloud Composer for orchestration.
- Exposure to CI/CD practices in a cloud-native environment (Docker, Terraform, etc.).

Tech stack: Python, SQL, Java, GCP (BigQuery, Pub/Sub, Cloud Storage, Cloud Composer, Dataflow), Apache Beam, Apache Kafka, Apache Airflow, CI/CD (Docker, Terraform).
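On the "BigQuery datasets, tables, and partitioning strategies" responsibility above, a minimal sketch of creating a date-partitioned, clustered table with the google-cloud-bigquery Python client; the project, table ID, and schema are hypothetical:

```python
# Sketch: create a time-partitioned, clustered BigQuery table -- illustrative only.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project

table = bigquery.Table(
    "my-project.analytics.events",  # hypothetical table ID
    schema=[
        bigquery.SchemaField("user_id", "STRING"),
        bigquery.SchemaField("event", "STRING"),
        bigquery.SchemaField("ts", "TIMESTAMP"),
    ],
)
# Partition by day on the timestamp column; cluster on user_id so point
# lookups scan fewer blocks, which cuts query cost.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY, field="ts"
)
table.clustering_fields = ["user_id"]

client.create_table(table)  # raises Conflict if the table already exists
```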

Posted 1 week ago

Apply

2.0 - 4.0 years

3 - 5 Lacs

Mumbai Suburban, Navi Mumbai, Mumbai (All Areas)

Work from Office

Dear Candidates,

We are pleased to announce a walk-in interview for the role of Business Intelligence Analyst - Goregaon West at Rentokil PCI, a leading organization committed to delivering excellence. This is a great opportunity to join a dynamic team and grow your career in a fast-paced, technology-driven environment.

Walk-in Interview Details: Position: Business Intelligence Analyst. Experience Required: 2 to 4 years of proven experience in a Business Intelligence Analyst role. Date: Monday, 16th June 2025. Time: 11:00 AM to 2:00 PM. Venue: Rentokil PCI Pest Control Pvt. Ltd., 3rd Floor, Narayani, Ambabai Temple Compound, Aarey Rd, near Bank of Maharashtra, Goregaon West, Mumbai, Maharashtra 400062.

Important Information: Candidates with strong English communication skills will be preferred, especially those currently based on the Western line of Mumbai. A minimum of 2 to 4 years of experience as a Business Intelligence Analyst is required. We are looking for immediate joiners or those with a short notice period. Please carry your updated resume and attend the interview in formal attire.

About the Role: The Business Intelligence Analyst is responsible for working within the BI team to deliver reporting and dashboard solutions that meet the needs of the organisation. The developer must work well in a team setting and have excellent organisational, prioritisation, communication, and time management skills. The successful candidate will demonstrate accountability, flexibility, and adaptability to handle multiple and changing priorities, and be able to successfully collaborate with development teams, technology groups, consultants, and key stakeholders. The person will report to the Manager - Application Support. The incumbent will have to work as part of a multi-functional team, which involves collaboration with the internal team and external stakeholders.

Job Responsibilities: Develop and manage BI solutions. Analyse business processes and requirements. Create and maintain documentation, including requirements, design, and user manuals. Conduct unit testing and troubleshooting. Develop and execute database queries and conduct analyses. Identify development needs in order to improve and streamline operations. Identify opportunities to improve processes and strategies with technology solutions.

Key Result Areas: Ensure quality and accuracy of data assets and analytic deliverables. Troubleshoot business intelligence modelling issues and develop solutions within the timelines. Query resolution. Enhance application knowledge to implement new solutions. On-time deployment of different projects as per the business requirements. On-time creation and analysis of visualisations and reports.

Competencies (skills essential to the role): Strong analytical skills. Ability to solve practical problems and deal with a variety of concrete variables in situations where only limited standardisation exists. Ability to think logically and troubleshoot issues. Excellent interpersonal (verbal and written) communication skills to support working in project environments that include internal, external, and customer teams.

Requirements / Educational Qualification: Graduate degree in Computer Science or Information Technology. 1 to 2 years of experience working on a BI platform, Data Studio, any cloud platform, or Qlik. Strong SQL development skills with in-depth knowledge of complex SQL queries and a good understanding of QlikSense. Good working knowledge of SSIS, SSRS, SSAS, and proper workflow design and exception management. Experience in data warehouse, ETL, cube, and report design and development.

Role Type / Key Working Relationships: Individual contributor; internal team; external stakeholders.

Benefits: Attractive base salary. Annual performance-based bonus. Group mediclaim insurance policy. Travel reimbursement. Equal opportunities.

What can you expect from RPCI? Our values lie at the core of our mission and vision. We believe that it's our people who make our company what it is. We believe in: Safety, Integrity, Innovation, Learning & Development, Open & Transparent, Performance Orientation.

Why Join Rentokil PCI? Rentokil PCI is a recognized leader in the pest control and hygiene industry, committed to delivering excellence and ensuring customer satisfaction. By joining our team, you will have the opportunity to advance your career in a dynamic, fast-paced environment, with continuous learning and development at the forefront of our culture.

If you meet the requirements and are interested, we would be delighted to meet you at the walk-in interview. For any questions or further details, please feel free to contact us. Contact Person: Hitesha Patel. Contact Number: 8828018709. Email ID: hiteshav.patel@rentokil-pci.com. We look forward to meeting you at the interview! Visit us: www.rentokil-pestcontrolindia.com

Posted 1 week ago

Apply

5.0 - 8.0 years

20 - 30 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

We are seeking a highly skilled and experienced Senior Cloud Native Developer to join our team and drive the design, development, and delivery of cutting-edge cloud-based solutions on Google Cloud Platform (GCP). This role emphasizes technical expertise, best practices in cloud-native development, and a proactive approach to implementing scalable and secure cloud solutions.

Responsibilities:
- Design, develop, and deploy cloud-based solutions using GCP, adhering to architecture standards and best practices.
- Code and implement Java applications using GCP native services like GKE, Cloud Run, Cloud Functions, Firestore, Cloud SQL, and Pub/Sub.
- Select appropriate GCP services to address functional and non-functional requirements.
- Demonstrate deep expertise in GCP PaaS, serverless, and database services.
- Ensure compliance with security and regulatory standards across all cloud solutions.
- Optimize cloud-based solutions to enhance performance, scalability, and cost-efficiency.
- Stay updated on emerging cloud technologies and trends in the industry.
- Collaborate with cross-functional teams to architect and deliver successful cloud implementations.
- Leverage foundational knowledge of GCP AI services, including Vertex AI, Code Bison, and Gemini models, when applicable.

Requirements:
- 5+ years of extensive experience in designing, implementing, and maintaining applications on GCP.
- Comprehensive expertise in using GCP services, including GKE, Cloud Run, Cloud Functions, Firestore, Firebase, and Cloud SQL.
- Knowledge of advanced GCP services, such as Apigee, Spanner, Memorystore, Service Mesh, Gemini Code Assist, Vertex AI, and Cloud Monitoring.
- Solid understanding of cloud security best practices and expertise in implementing security controls in GCP.
- Proficiency in cloud architecture principles and best practices, with a focus on scalable and reliable solutions.
- Experience with automation and configuration management tools, particularly Terraform, along with a strong grasp of DevOps principles.
- Familiarity with front-end technologies like Angular or React.

Nice to have:
- Familiarity with GCP GenAI solutions and models, including Vertex AI, Codebison, and Gemini models.
- Background in working with front-end frameworks and technologies to complement back-end cloud development.
- Capability to design end-to-end solutions integrating modern AI and cloud technologies.

Posted 1 week ago

Apply

7.0 - 10.0 years

0 Lacs

Pune, Chennai, Bengaluru

Work from Office

As a GCP data engineer, the colleague should be able to design scalable data architectures on Google Cloud Platform, using services like BigQuery and Dataflow. They write and maintain code (Python, Java), ensuring efficient data models and seamless ETL processes. Quality checks and governance are implemented to maintain accurate and reliable data. Security is a priority, enforcing measures for storage, transmission, and processing, while ensuring compliance with data protection standards. Collaboration with cross-functional teams is key for understanding diverse data requirements. Comprehensive documentation is maintained for data processes, pipelines, and architectures. Responsibilities extend to optimizing data pipelines and queries for performance, troubleshooting issues, and proactively monitoring data accuracy. Continuous learning is emphasized to stay updated on GCP features and industry best practices, ensuring a current and effective data engineering approach.

Experience:
- Proficiency in programming languages: Python, PySpark.
- Expertise in data processing frameworks: Apache Beam (Dataflow).
- Active experience with GCP tools and technologies like BigQuery, Dataflow, Cloud Composer, Cloud Spanner, GCS, DBT, etc.
- Data engineering skill set using Python and SQL.
- Experience in ETL (Extract, Transform, Load) processes.
- Knowledge of DevOps tools like Jenkins, GitHub, and Terraform is desirable; should have good knowledge of Kafka (batch/streaming).
- Understanding of data models and experience performing ETL design and build, and database replication using message-based CDC.
- Familiarity with cloud storage solutions.
- Strong problem-solving abilities in data engineering challenges.
- Understanding of data security and scalability.
- Proficiency in relevant tools like Apache Airflow.

Desirable:
- Knowledge of data modelling and database design.
- Good understanding of cloud security.
- Proven practical experience of using the Google Cloud SDK to deliver APIs and automation.
- Crafting continuous integration and continuous delivery/deployment tooling pipelines (Jenkins/Spinnaker).

Posted 1 week ago

Apply

3.0 - 8.0 years

15 - 20 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

We are seeking a skilled and innovative Systems Engineer with a strong focus on the Google Cloud Platform (GCP) to join our team. The ideal candidate will be responsible for designing, implementing, and maintaining cloud-based infrastructure solutions, ensuring optimal performance and scalability for our ongoing projects.

Responsibilities:
- Design, configure, and maintain the GCP environment for the data mesh architecture project.
- Develop infrastructure using an Infrastructure-as-Code approach on GCP.
- Create CI/CD pipelines and automation with deployment models using GitHub Actions.
- Collaborate with cross-functional teams to define cloud infrastructure requirements and ensure scalability, security, and reliability.
- Implement continuous integration and deployment pipelines aligned with DevOps standards.
- Document all aspects of GCP infrastructure and deployment processes.
- Troubleshoot and resolve technical issues or performance inefficiencies on the GCP platform.
- Optimize costs and consistently evaluate GCP resources for better performance.
- Ensure compliance with security policies and recommend improvements where needed.
- Perform regular monitoring, maintenance, and upgrades for cloud infrastructure.

Requirements:
- 3-5 years of experience working with Google Cloud Platform services, including compute, storage, networking, and security.
- Demonstrated background in designing and implementing scalable cloud infrastructure on GCP.
- Proficiency in DevOps practices, CI/CD workflows, and automation using tools such as GitHub Actions.
- Understanding of Infrastructure-as-Code frameworks such as Terraform or similar tools.
- Strong analytical and problem-solving skills to address complex cloud-related challenges effectively.
- Familiarity with cloud performance monitoring, security best practices, and cost optimization techniques.

Posted 1 week ago

Apply

12.0 - 22.0 years

30 - 45 Lacs

Chennai

Work from Office

Project Role Description: Design, build, and configure applications to meet business process and application requirements.

Must-have skills: Apache Spark. Good-to-have skills: Google BigQuery, PySpark.

Professional & Technical Skills:
- Proficiency in Apache Spark, PySpark, and Google BigQuery.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity.

Additional Information: The candidate should have a minimum of 12 years of experience in Apache Spark.
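Since the role pairs Spark with classic ML algorithms such as linear regression, a minimal Spark MLlib sketch may be useful context; the tiny inline dataset and column names are hypothetical:

```python
# Sketch: fitting a linear regression with Spark MLlib -- illustrative only.
from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.regression import LinearRegression

spark = SparkSession.builder.appName("lr-sketch").getOrCreate()

df = spark.createDataFrame(
    [(1.0, 2.0, 5.1), (2.0, 1.0, 6.9), (3.0, 4.0, 13.2)],  # toy data
    ["x1", "x2", "y"],
)

# MLlib estimators expect features packed into a single vector column.
assembler = VectorAssembler(inputCols=["x1", "x2"], outputCol="features")
train = assembler.transform(df)

model = LinearRegression(featuresCol="features", labelCol="y").fit(train)
print(model.coefficients, model.intercept)  # fitted weights and bias
```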

Posted 1 week ago

Apply

1.0 years

4 - 6 Lacs

Thiruvananthapuram

Work from Office

We are looking for a proactive and highly skilled Technical Support Engineer to provide exceptional technical support to overseas projects, working in rotational shifts to ensure 24/7 availability. This role involves troubleshooting complex issues across cloud platforms, networking, application architectures, and DevOps toolchains. The ideal candidate should be self-motivated, a collaborator, agile, and a continuous learner.

Key Responsibilities:
- Provide technical support and troubleshoot issues related to cloud platforms and services such as Fargate, ECS, DynamoDB, BigQuery, SNS, etc.
- Understand problems by consuming logs and metrics from various sources using services such as CloudWatch, Prometheus, Grafana, Loki, Alertmanager, and Splunk.
- Analyze and resolve networking challenges, including load balancers, API gateways, reverse proxies, ingress controllers, and service-to-service communications.
- Work on issues related to client-server communications, firewalls, and virtual machines.
- Collaborate with DevOps teams to manage and troubleshoot toolchains like Docker, Kubernetes, Jenkins, and ingress controllers.
- Act as the first point of contact for technical queries and escalate issues when necessary.
- Liaise with development and operations teams to identify root causes and resolve incidents effectively.
- Document troubleshooting steps and solutions, and maintain a knowledge base for recurring issues.
- Collaborate with cross-functional teams to implement best practices for monitoring and incident response.
- Participate in shift handovers and provide timely updates on ongoing issues.

Preferred Skills

Cloud platforms and services:
- Hands-on knowledge of Fargate and ECS for managing and troubleshooting containerized workloads.
- Proficiency with DynamoDB and BigQuery for analyzing data and making decisions based on the analysis.
- Hands-on knowledge of SNS for debugging message delivery issues and integration workflows.

Monitoring and logging tools:
- Proficiency in CloudWatch Logs, Loki, and Splunk for consuming and analyzing logs to identify and resolve issues.
- Hands-on knowledge of Prometheus and Grafana for analyzing metrics using dashboards and monitoring system health.
- Knowledge of Alertmanager for configuring and managing alert escalation.
- Ability to interpret metrics from various sources and create actionable insights.

Networking and security:
- Understanding of load balancers (e.g., ALB, NLB) for distributing traffic and troubleshooting connectivity issues.
- Knowledge of API gateways like AWS API Gateway or NGINX for managing API traffic.
- Knowledge of reverse proxies and ingress controllers (e.g., NGINX Ingress, Traefik) for managing internal/external traffic.
- Understanding of service-to-service communications, including DNS, HTTP/HTTPS, and gRPC protocols.
- Hands-on knowledge of firewalls, security groups, and IAM roles for secure communications.
- Troubleshooting skills for VM-related issues on platforms like AWS EC2 or equivalent.

DevOps toolchains:
- Proficiency with Docker for managing container images and runtime debugging.
- Understanding of Kubernetes concepts (deployments, ingress setups, pod-level issues) and the related troubleshooting commands and mechanisms.
- Knowledge of CI/CD pipeline tools such as Jenkins, GitHub Actions, and ArgoCD for building, deploying, and managing automated pipelines.
- Understanding of ingress controllers (e.g., NGINX, Traefik) and SSL termination for secure routing.

Troubleshooting and incident management:
- Strong problem-solving skills to identify root causes using logs, metrics, and system-level debugging.
- Ability to document detailed troubleshooting steps and solutions for recurring issues.

Collaboration and communication:
- Ability to work with cross-functional teams (DevOps, development, and operations) to resolve incidents.
- Effective and proactive communication to escalate issues and provide updates during shift handovers.
- Proficiency with tools like Slack, JIRA, Confluence, or Google Workspace for collaboration and issue tracking.

Salary Package: 4-6 LPA.
Experience Required: Technical Support Engineer with 0.5 years of experience.
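On the Prometheus/Grafana monitoring skills listed above, a minimal sketch of how a Python service exposes custom metrics for Prometheus to scrape; the metric names, port, and workload are hypothetical:

```python
# Sketch: exposing Prometheus metrics from a Python service -- illustrative only.
# Assumes `pip install prometheus-client`.
import random
import time

from prometheus_client import Counter, Histogram, start_http_server

REQUESTS = Counter("app_requests_total", "Total requests handled")
LATENCY = Histogram("app_request_seconds", "Request latency in seconds")

def handle_request():
    REQUESTS.inc()
    with LATENCY.time():  # records how long the block takes
        time.sleep(random.random() / 10)  # stand-in for real work

if __name__ == "__main__":
    start_http_server(8000)  # metrics served at :8000/metrics for scraping
    while True:
        handle_request()
```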

Posted 1 week ago

Apply

3.0 - 5.0 years

37 - 40 Lacs

Chennai

Work from Office

About us: One team. Global challenges. Infinite opportunities. At Viasat, we're on a mission to deliver connections with the capacity to change the world. For more than 35 years, Viasat has helped shape how consumers, businesses, governments and militaries around the globe communicate. We're looking for people who think big, act fearlessly, and create an inclusive environment that drives positive impact to join our team.

What you'll do: The Customer Engineering team is a group of highly technical engineers who are tasked with maintaining and developing the reliability, scalability, and performance of the service for different enterprise customers. The Customer Engineering team is empowered to drive technical resolutions across the technology stack, from hardware through to application and all stops in between. The team is also responsible for building and maintaining alerts to proactively monitor the service, and acts as the technical liaison between customer-facing teams and the engineering teams.

The day-to-day: As a Site Reliability Engineer, you will:
- Identify and investigate potential and actual customer performance problems, recommend and prioritize remediation, and assess the effectiveness of remediation actions.
- Participate in and provide feedback on product design, especially regarding reliability and availability.
- Drive initiatives with partner teams to improve the reliability and performance of the service through improved system design.
- Drive a culture of intolerance to manual activity, resulting in a highly automated environment delivering scalable solutions.
- Work closely with customer-facing teams (Technical Account Managers and program teams) to understand and prioritize customer issues.
- Drive monitoring and automation initiatives.
- Create and present performance reports for technical and management stakeholders.
- Work closely with engineering teams to communicate and prioritize service-impacting issues.
- Reproduce and test customer issues in the lab.
- Develop automated scripts and tools to enable monitoring of the service.
- Be part of on-call rotations.

What you'll need:
- 5+ years of experience in troubleshooting and triage of network issues in a fast-paced environment, to support customers.
- 5+ years of experience in network operations or product support.
- Advanced knowledge of modern programming languages, especially Python.
- An ability to understand large, complex systems and a passion to constantly improve environments.
- Strong networking knowledge: TCP/IP, IPSEC, VPN, NAT, routing protocols, AAA.
- L2/L3 protocols: OSPF, BGP, ISIS, STP, VLAN, NAT, firewall, DHCP.
- Ability to set priorities and work efficiently in a fast-paced environment.
- Demonstrated ability to deliver results on time with high quality and attention to detail.
- Demonstrated ability to work with ambiguous requirements, adapt, and learn.
- Experience with data analytics tools (Splunk, Kibana).
- Keen (data-driven) decision-making skills under incomplete information.
- Excellent face-to-face and remote customer rapport.
- Bachelor's degree in Electrical Engineering, Computer Science, or Computer Engineering.
- Up to 10% travel.

What will help you on the job (preferences):
- Experience analyzing data and trending to gain operational efficiencies.
- Telecom or related operational service experience, especially wireless networks.
- Previous technical role in a DevOps/SRE workflow.
- Experience with Satcom technology.
- Experience/knowledge of GCP, AWS, BigQuery.

EEO Statement: Viasat is proud to be an equal opportunity employer, seeking to create a welcoming and diverse environment. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, ancestry, physical or mental disability, medical condition, marital status, genetics, age, or veteran status or any other applicable legally protected status or characteristic. If you would like to request an accommodation on the basis of disability for completing this online application, please click

Posted 1 week ago

Apply

9.0 - 14.0 years

11 - 16 Lacs

Hyderabad

Work from Office

Overview: We are seeking an experienced and strategic Business Analyst / Functional Lead to drive solution definition, business alignment, and successful delivery of Real-Time Decisioning initiatives, and to play a critical role in translating complex business needs into actionable functional requirements, guiding cross-functional teams, and shaping customer-centric decisioning strategies across digital channels.

Responsibilities:
- Gather, analyze, and document business and functional requirements for decisioning use cases (e.g., next-best-action, personalized offers, customer journeys).
- Act as the primary liaison between business stakeholders, product owners, and technical teams for real-time decisioning solutions.
- Define and maintain decision logic, business rules, and outcome scenarios in alignment with marketing and CX goals.
- Facilitate all Agile ceremonies, including sprint planning, daily stand-ups, reviews, and retrospectives.
- Guide the team in Agile practices, track sprint progress, and manage delivery risks.
- Remove blockers and coordinate across business, design, tech, QA, and operations teams.
- Maintain the ADO board, backlog grooming, sprint metrics, and continuous improvement initiatives.
- Collaborate with solution architects to design customer-centric, scalable real-time decisioning frameworks.
- Lead discovery and requirement workshops with marketing, data, and technology stakeholders.
- Own the functional design documents, user stories, and solution blueprints; ensure clarity, accuracy, and traceability.
- Work with engineering teams to define test scenarios and validate decisioning outputs.
- Support rollout, training, and adoption of decisioning platforms across business units.
- Continuously monitor and optimize decisioning logic and KPIs in partnership with analytics teams.

Qualifications:
- 9-14 years of total IT experience, with at least 3+ years of relevant work experience as an RTD functional lead, in business analysis, functional consulting, or similar roles in MarTech, AdTech, or CX platforms.
- Bachelor's or Master's degree in computer science, information technology, or a related field.
- Strong understanding of real-time decisioning platforms such as Salesforce Marketing Cloud Personalization / Interaction Studio and CleverTap.
- Proven ability to map customer journeys and define decision strategies based on personas, behavior, and context.
- Skilled in requirement gathering, functional documentation, user story writing, and backlog management.
- Excellent understanding of data flows, business rules, segmentation, and targeting.
- Ability to translate business needs into logical rules, decision tables, and KPIs.
- Strong communication and stakeholder management skills across business and technical audiences.

Posted 1 week ago

Apply

9.0 - 14.0 years

11 - 16 Lacs

Hyderabad

Work from Office

Overview: We are looking for a strategic and hands-on Architect specializing in Real-Time Decisioning (RTD) to lead the design and implementation of intelligent, data-driven customer engagement solutions. With over 9 years of experience, the ideal candidate will bring deep technical expertise in real-time decision-making platforms and marketing technologies to drive personalization, automation, and optimized customer experiences across digital channels. The main purpose of the role is to provide architectural design governance and technical leadership, and to develop and deliver customized solutions within Real-Time Decisioning (RTD) platforms to support critical business functions and meet project objectives.

Responsibilities:
- Design and implement Real-Time Decisioning initiatives to support next-best-action strategies across web, mobile, email, and contact center touchpoints.
- Translate business goals into decisioning logic, data models, and integration strategies.
- Work with the RTD Business Analyst, Sector Product Owner, Sector IT, and Marketing teams to transform new requirements into best-practice-led technical designs.
- Design and implement real-time personalization use cases using Salesforce Marketing Cloud Personalization (MCP) or Interaction Studio, and CleverTap capabilities (triggers, campaigns, decisions, and Einstein).
- Work directly with stakeholders to design and govern highly usable, scalable, extensible, and maintainable Salesforce solutions.
- Define and implement data integrations between Salesforce Marketing Cloud, CDP, CRM, and other platforms.
- Deal with ambiguous problems, take responsibility for finding solutions, and drive towards simple solutions to complex problems.
- Troubleshoot key implementation issues and demonstrate the ability to drive to a successful resolution.
- Use deep business knowledge of RTD to assist with estimation for major new initiatives.
- Provide oversight to the development team (up to 5 resources) and ensure sound technical delivery of the product.
- Design and implement complex solutions for the personalization of mobile apps, websites, etc.
- Implement integrations with other systems using SDKs and APIs.
- Contribute to RTD CoE-building activities by creating reference architectures, common patterns, data models, and re-usable assets that empower our stakeholders to maximize business value using the breadth of the Salesforce solutions available, also harvesting knowledge from existing implementations.
- Evangelize and educate internal stakeholders about RTD technologies.

Qualifications:
- Bachelor's degree in IT, Computer Science, or equivalent.
- 9-14+ years of IT experience, with at least 5+ years of experience with real-time decisioning and personalization tools like Marketing Cloud Personalization (MCP) or Interaction Studio, CleverTap, etc.
- Strong understanding of customer data models, behavioural analytics, segmentation, and machine learning models.
- Experience with APIs, real-time event processing, and data pipelines.
- Familiarity with cloud environments (AWS, Azure, GCP) and data platforms (e.g., Snowflake, BigQuery).
- Hands-on experience with rules engines, decision tables, and AI-based recommendations.
- Excellent problem-solving, communication, and stakeholder management skills.
- Experience developing customer-facing user interfaces with Lightning Components.
- Agile delivery experience; self-motivated and creative, with good communication and interpersonal skills.
- Experience providing technical governance on architectural design and leading a development team in a technical capacity.
- Motivated self-starter, able to change direction quickly when priorities shift and to think through problems quickly to design and deliver solutions.
- Passion for technology and for learning.

Posted 1 week ago

Apply

10.0 - 15.0 years

12 - 18 Lacs

Hyderabad, Gurugram, Bengaluru

Work from Office

Location: Bangalore/Gurgaon/Hyderabad/Mumbai. Must-have skills: Data Scientist / Transformation Leader, with at least 5 years in Telecom Analytics. Good-to-have skills: GenAI, Agentic AI.

Job Summary

About Global Network Data & AI: Accenture Strategy & Consulting's Global Network Data & AI practice helps our clients grow their business in entirely new ways. Analytics enables our clients to achieve high performance through insights from data, insights that inform better decisions and strengthen customer relationships. From strategy to execution, Accenture works with organizations to develop analytic capabilities, from accessing and reporting on data to predictive modelling, to outperform the competition.

About the Comms & Media practice: Comms & Media (C&M) is one of the industry practices within Accenture's S&C Global Network team. It focuses on serving clients across specific industries: Communications and Media & Entertainment. Communications focuses primarily on industries related to telecommunications and information & communication technology (ICT); this team serves most of the world's leading wireline, wireless, cable, and satellite communications and service providers. Media & Entertainment focuses on industries like broadcast, entertainment, print, and publishing. Globally, the Accenture Comms & Media practice works to develop value growth strategies for its clients and infuses AI & GenAI to help deliver on their top business imperatives, i.e., revenue growth and cost reduction. From multi-year Data & AI transformation projects to shorter, more agile engagements, we have a rapidly expanding portfolio of hyper-growth clients and an increasing footprint with next-gen solutions and industry practices.

Roles & Responsibilities: A Telco-domain-experienced data science consultant is responsible for helping clients design and deliver AI solutions. He/she should be strong in the Telco domain and AI fundamentals, and should have good hands-on experience with the following:
- Ability to work with large data sets and present conclusions to key stakeholders; data management using SQL.
- Propose solutions to the client, based on gap analysis of existing Telco platforms, that can generate long-term and sustainable value for the client.
- Gather business requirements from client stakeholders via interactions like interviews and workshops with all stakeholders.
- Track down and read all previous information on the problem or issue in question; explore obvious and known avenues thoroughly; ask a series of probing questions to get to the root of a problem.
- Ability to understand the as-is process, understand issues with the processes that can be resolved either through Data & AI or process solutions, and design the detailed to-be state.
- Understand customer needs and identify/translate them into business requirements (business requirement definition), business process flows, and functional requirements, and be able to inform the best approach to the problem.
- Adopt a clear and systematic approach to complex issues (i.e., A leads to B leads to C); analyze relationships between several parts of a problem or situation; anticipate obstacles and identify a critical path for a project.
- Independently deliver products and services that empower clients to implement effective solutions; make specific changes and improvements to processes or own work to achieve more.
- Work with other team members and make deliberate efforts to keep others up to date.
- Establish a consistent and collaborative presence with clients and act as the primary point of contact for assigned clients; escalate, track, and solve client issues.
- Partner with clients to understand end clients' business goals, marketing objectives, and competitive constraints.
- Storytelling: crunch the data and numbers to craft a story to be presented to senior client stakeholders.

Professional & Technical Skills:
- Overall 10+ years of experience in Data Science and at least 5 years in Telecom Analytics.
- Master's (MBA/MSc/MTech) from a Tier 1/Tier 2 school and Engineering from a Tier 1 school.
- Demonstrated experience in solving real-world data problems through Data & AI.
- Direct onsite experience (i.e., experience facing clients inside client offices in India or abroad) is mandatory; please note we are looking for client-facing roles.
- Proficiency with data mining, mathematics, and statistical analysis.
- Advanced pattern recognition and predictive modeling experience; knowledge of advanced analytical fields such as text mining, image recognition, video analytics, IoT, etc.
- Execution-level understanding of econometric/statistical modeling packages: traditional techniques like linear/logistic regression, multivariate statistical analysis, time series techniques, and fixed/random effect modelling; machine learning techniques like random forest, gradient boosting, XGBoost, decision trees, and clustering; knowledge of deep learning modeling techniques like RNNs and CNNs.
- Experience using digital and statistical modeling software (one or more): Python, R, PySpark, SQL, BigQuery, Vertex AI.
- Proficient in Excel, MS Word, PowerPoint, and corporate soft skills.
- Knowledge of dashboard creation platforms: Excel, Tableau, Power BI, etc.
- Excellent written and oral communication skills, with the ability to clearly communicate ideas and results to non-technical stakeholders.
- Strong analytical and problem-solving skills, and good communication skills.
- Self-starter with the ability to work independently across multiple projects and set priorities; strong team player; proactive and solution-oriented, able to guide junior team members.
- Execution knowledge of optimization techniques is good to have: exact optimization (linear and non-linear optimization techniques) and evolutionary optimization (both population- and search-based algorithms).
- Cloud platform certification and experience in computer vision are good to have.

Qualification and Experience: Overall 10+ years of experience in Data Science and at least 5 years in Telecom. Educational qualification: Master's (MBA/MSc/MTech) from a Tier 1/Tier 2 school and Engineering from a Tier 1 school.

Posted 1 week ago

Apply

10.0 - 15.0 years

22 - 37 Lacs

Bengaluru

Work from Office

Who We Are At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward – always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities. The Role At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward – always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities. As GCP Data Engineer at Kyndryl, you will be responsible for designing and developing data pipelines, participating in architectural discussions, and implementing data solutions in a cloud environment using GCP data services. You will collaborate with global architects and business teams to design and deploy innovative solutions, supporting data analytics, automation, and transformation needs. Responsibilities: Design, develop, and maintain scalable data pipelines using GCP services such as BigQuery, Dataflow, Pub/Sub, and Cloud Storage. Participate in architectural discussions, conduct system analysis, and suggest optimal solutions that are scalable, future-proof, and aligned with business requirements. Collaborate with stakeholders to gather requirements and create high-level and detailed technical designs. Design data models suitable for both transactional and big data environments, supporting Machine Learning workflows. Build and optimize ETL/ELT infrastructure using a variety of data sources and GCP services. Develop and maintain Python / PySpark for data processing and integrate with GCP services for seamless data operations. Develop and optimize SQL queries for data analysis and reporting. Monitor and troubleshoot data pipeline issues to ensure timely resolution. Implement data governance and security best practices within GCP. Perform data quality checks and validation to ensure accuracy and consistency. Support DevOps automation efforts to ensure smooth integration and deployment of data pipelines. Provide design expertise in Master Data Management (MDM), Data Quality, and Metadata Management. Provide technical support and guidance to junior data engineers and other team members. Participate in code reviews and contribute to continuous improvement of data engineering practices. Implement best practices for cost management and resource utilization within GCP. If you're ready to embrace the power of data to transform our business and embark on an epic data adventure, then join us at Kyndryl. Together, let's redefine what's possible and unleash your potential. Your Future at Kyndryl Every position at Kyndryl offers a way forward to grow your career. We have opportunities that you won’t find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here. Who You Are You’re good at what you do and possess the required experience to prove it. However, equally as important – you have a growth mindset; keen to drive your own personal and professional development. You are customer-focused – someone who prioritizes customer success in their work. 
If you're ready to embrace the power of data to transform our business and embark on an epic data adventure, then join us at Kyndryl. Together, let's redefine what's possible and unleash your potential.

Your Future at Kyndryl

Every position at Kyndryl offers a way forward to grow your career. We have opportunities that you won't find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here.

Who You Are

You're good at what you do and possess the required experience to prove it. However, equally as important – you have a growth mindset; keen to drive your own personal and professional development. You are customer-focused – someone who prioritizes customer success in their work. And finally, you're open and borderless – naturally inclusive in how you work with others.

Required Technical and Professional Experience:

Bachelor's or master's degree in Computer Science, Engineering, or a related field, with over 8 years of experience in data engineering.
More than 3 years of experience with the GCP data ecosystem.
Hands-on experience and strong proficiency in GCP components such as Dataflow, Dataproc, BigQuery, Cloud Functions, Composer, and Data Fusion.
Excellent command of SQL, with the ability to write complex queries and perform advanced data transformations (a short example follows this list).
Strong programming skills in PySpark and/or Python, specifically for building cloud-native data pipelines.
Familiarity with GCP tools such as Looker, Airflow DAGs, Data Studio, and App Maker.
Hands-on experience implementing enterprise-wide cloud data lake and data warehouse solutions on GCP.
Knowledge of data governance, security, and compliance best practices.
Experience with private and public cloud architectures, their trade-offs, and migration considerations.
Excellent problem-solving, analytical, and critical thinking skills.
Ability to manage multiple projects simultaneously while maintaining a high level of attention to detail.
Communication skills: able to communicate with both technical and non-technical audiences and to derive technical requirements from discussions with stakeholders.
Ability to work independently and in agile teams.
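Again for illustration only: a minimal sketch of the kind of analytical SQL named above, run through the official google-cloud-bigquery Python client. The project, dataset, table, and column names are hypothetical.

    # A minimal sketch of the kind of analytical SQL called for above, run
    # through the official google-cloud-bigquery Python client. The project,
    # dataset, table, and column names are hypothetical.
    from google.cloud import bigquery

    client = bigquery.Client()  # authenticates via application-default credentials

    # Window function: keep only the most recent order per customer,
    # a common deduplication / latest-record transformation pattern.
    query = """
        SELECT customer_id, order_id, order_ts
        FROM (
            SELECT
                customer_id,
                order_id,
                order_ts,
                ROW_NUMBER() OVER (
                    PARTITION BY customer_id ORDER BY order_ts DESC) AS rn
            FROM `my-project.sales.orders`
        )
        WHERE rn = 1
    """

    for row in client.query(query).result():
        print(row.customer_id, row.order_id, row.order_ts)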
Preferred Technical and Professional Experience

GCP Data Engineer certification is highly preferred.
Professional certification, e.g., Open Certified Technical Specialist with Data Engineering Specialization.
Experience working as a Data Engineer and/or in cloud modernization.
Knowledge of Databricks and/or Snowflake for data analytics.
Experience with NoSQL databases.
Familiarity with containerization and orchestration tools (e.g., Docker, Kubernetes).
Familiarity with BI dashboards and Google Data Studio is a plus.

Being You

Diversity is a whole lot more than what we look like or where we come from, it's how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we're not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you – and everyone next to you – the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That's the Kyndryl Way.

What You Can Expect

With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees, and support you and your family through the moments that matter – wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you; we want you to succeed so that together, we will all succeed.

Get Referred!

If you know someone who works at Kyndryl, when asked ‘How Did You Hear About Us’ during the application process, select ‘Employee Referral’ and enter your contact's Kyndryl email address.

Posted 2 weeks ago

Apply