
83 Pubsub Jobs

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

6.0 - 9.0 years

9 - 15 Lacs

Bengaluru

Hybrid

Job Description: We are hiring a Java Developer with strong GCP experience, or a GCP Engineer proficient in Java. The candidate should be capable of developing scalable cloud-native applications using Google Cloud services. Key Skills: Java, Spring Boot, RESTful APIs; Google Cloud Platform (GCP); Cloud Functions, Pub/Sub, BigQuery (preferred); CI/CD, Docker, Kubernetes.

Posted 11 hours ago

Apply

5.0 - 7.0 years

13 - 17 Lacs

Hyderabad

Work from Office

Skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Log Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Ability to analyse data for functional business requirements and interface directly with customers. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform. Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Log Explorer. You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies. Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work. Preferred technical and professional experience: Intuitive individual with an ability to manage change and proven time management. Proven interpersonal skills while contributing to team effort by accomplishing related results as needed. Keeps technical knowledge up to date by attending educational workshops and reviewing publications.

Posted 13 hours ago

Apply

5.0 - 7.0 years

13 - 17 Lacs

Bengaluru

Work from Office

Skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Log Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Ability to analyse data for functional business requirements and interface directly with customers. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform. Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Log Explorer. You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies. Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work. Preferred technical and professional experience: Intuitive individual with an ability to manage change and proven time management. Proven interpersonal skills while contributing to team effort by accomplishing related results as needed. Keeps technical knowledge up to date by attending educational workshops and reviewing publications.

Posted 13 hours ago

Apply

4.0 - 9.0 years

10 - 17 Lacs

Chennai

Work from Office

Roles and Responsibilities: Design, develop, test, deploy, and maintain large-scale data pipelines using BigQuery, Dataproc, Pub/Sub, and Cloud Storage on Google Cloud Platform (GCP). Collaborate with cross-functional teams to identify business requirements and design solutions that meet those needs. Troubleshoot data pipeline failures or errors in real time using log analysis and debugging techniques. Develop automation scripts using Python to streamline data processing tasks and improve efficiency. Ensure compliance with security standards by implementing access controls, encryption, and monitoring mechanisms.
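The responsibilities above describe a typical ingest-transform-load stage with error troubleshooting. As a rough, library-free sketch of that shape (all field names are hypothetical; a real GCP pipeline would read from Pub/Sub or Cloud Storage and write to BigQuery):

```python
# Minimal sketch of one ETL stage: transform raw records and route
# failures aside for troubleshooting instead of crashing the pipeline.

def transform(record):
    """Normalize one raw record into the (hypothetical) target schema."""
    return {
        "user_id": int(record["id"]),
        "amount_inr": round(float(record["amount"]), 2),
        "city": record["city"].strip().title(),
    }

def run_pipeline(raw_records):
    """Transform every record, collecting failures for later analysis."""
    loaded, errors = [], []
    for rec in raw_records:
        try:
            loaded.append(transform(rec))
        except (KeyError, ValueError) as exc:
            errors.append((rec, str(exc)))  # would go to logs / a dead-letter topic
    return loaded, errors

raw = [
    {"id": "1", "amount": "99.5", "city": " chennai "},
    {"id": "x", "amount": "10", "city": "Pune"},  # bad id -> routed to errors
]
loaded, errors = run_pipeline(raw)
print(loaded)       # [{'user_id': 1, 'amount_inr': 99.5, 'city': 'Chennai'}]
print(len(errors))  # 1
```

Keeping bad records in an error channel rather than raising mirrors how production pipelines isolate failures for real-time debugging.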

Posted 1 day ago

Apply

3.0 - 6.0 years

6 - 16 Lacs

Gurugram

Work from Office

Salesforce Development. Experience range: 3+ years. Roles & responsibilities: Design, code, unit test, system test, performance test, debug, implement, and support Salesforce.com applications and integrations. Excellent hands-on knowledge of LWC (Lightning Web Components): events, the PubSub model, the Aura PubSub model, the component life cycle, public properties, wire methods, and use of Apex classes with wire and imperative calls; LDS in both LWC and Aura components. Well experienced in Lightning Aura Components, Visualforce pages, Apex classes, Apex triggers, Batch Apex, test classes, components, custom settings, Workflow Rules, Approval Processes, Validation Rules, Roles, Profiles, Organization-Wide Defaults and Sharing Rules, Reports, Dashboards, and other Salesforce.com standard features. Creation of mock test classes for HttpCallout and web service callout classes. Worked on inbound and outbound services in both web services and REST APIs, using Postman and SOAP tools for integration. Good working knowledge of querying the Salesforce database with SOQL and SOSL queries via Force.com Explorer. Maintain user roles and security controls; create sharing security rules, profiles, and workflow rules; create new users and custom fields; modify standard layouts. Skills and Qualifications: At least 3 years of Salesforce development experience, as well as experience in systems-integration environments with large, complex third-party solutions, coupled with proven expertise in integrating solutions with other applications within the overall technology environment. Demonstrated success with at least one large Salesforce.com implementation/integration project. Experience in Agile development methodologies. Detail-oriented, organized, and possesses good writing and communication skills. Ability to operate effectively, and with a sense of possibility, in a fast-paced, deadline-driven environment.
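The PubSub model this listing asks for is a publish/subscribe event bus between components. A tiny in-memory dispatcher, sketched here in Python purely to illustrate the pattern (in LWC the real mechanism is the pubsub module or Lightning Message Service, in JavaScript):

```python
# Minimal publish/subscribe dispatcher: subscribers register callbacks
# on a named channel; publishing fans the payload out to all of them.

from collections import defaultdict

class PubSub:
    def __init__(self):
        self._subs = defaultdict(list)  # channel name -> list of callbacks

    def subscribe(self, channel, callback):
        self._subs[channel].append(callback)

    def publish(self, channel, payload):
        # Deliver the payload to every subscriber on this channel.
        for cb in self._subs[channel]:
            cb(payload)

bus = PubSub()
received = []
bus.subscribe("recordSelected", received.append)
bus.publish("recordSelected", {"recordId": "001xx0000001"})
print(received)  # [{'recordId': '001xx0000001'}]
```

The point of the pattern is that publisher and subscriber never reference each other directly, which is why it suits communication between sibling components.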

Posted 1 day ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Bengaluru

Work from Office

Project description: You will join an Integration Centre of Excellence within a financial organisation. Our IT teams design and develop modern systems and customer tools, handle transactions, and provide integration support to core banking and online channel applications. The team is responsible for the operation and continuous development of an integration layer built on the IBM API Connect platform. The systems implement business processes spanning different in-house applications, third-party providers, and regulatory authorities.

Responsibilities: Develop integration applications based on user requirements on our existing IIB (IBM Integration Bus) and ACE (App Connect Enterprise) platforms. Analyse and design existing processes and systems. Monitor and maintain the technical environment. Contribute collaboratively to the team and appreciate the needs of the user to recommend and develop sound technical solutions. Offer good communication skills, as you will be expected to build strong and effective relationships with other teams.

Skills - Must have: 4+ years of experience in software development, with proficiency in designing, modelling, and developing enterprise applications in the integration layer using IBM Integration Bus (IIB), IBM App Connect Enterprise (ACE), WebSphere MQ, WebSphere DataPower, IBM API Connect, and IBM Cloud Pak for Integration. Good knowledge of Core Java debugging and Java-based IIB APIs. Strong communication skills and high motivation to learn new technology, with excellent problem-solving skills. Excellent knowledge of REST, JSON, and XML. Extensive experience with software development methodologies such as Waterfall, Scrum, and Agile. Experience in developing microservices and RESTful web services that interact with a wide range of systems. Extensive experience in migrating on-prem services developed on the IBM Integration Bus, IBM DataPower, and IBM API Connect platforms. Implemented robust security patterns by adopting various authentication protocols such as two-way SSL, OAuth, SAML, Kerberos, LDAP, TLS, JWT, etc. Extensive experience working with SOA, web services, SOAP, WSDL, WS-Security, XSLT stylesheets, XML Schema, and LDAP. Conversant with all phases of the SDLC: requirement gathering, analysis, design, development, implementation, and testing. Extensive in-depth knowledge of designing, developing, and testing EAI applications. Extensive experience with point-to-point and pub/sub messaging features. Experience in performance tuning of applications. Able to work collaboratively with developers, testers, technical support engineers, and other team members in the overall enhancement of software product quality.

Nice to have: ITIL knowledge and experience with ticketing systems (ServiceNow, Tivoli, etc.). Banking domain knowledge. Knowledge of Microsoft Azure and cloud concepts.

Languages: English (C1 Advanced). Seniority: Regular.

Posted 1 day ago

Apply

3.0 - 6.0 years

30 - 39 Lacs

Bengaluru

Work from Office

Responsibilities: Design, develop, test, and maintain full-stack applications using Python, React.js, Node.js, NestJS, TypeScript, and Next.js, with REST APIs on AWS/Azure/GCP cloud platforms. Perks: annual bonus.

Posted 2 days ago

Apply

5.0 - 8.0 years

10 - 20 Lacs

Noida, Bhubaneswar, Greater Noida

Work from Office

GA4, Firebase Analytics; BigQuery (SQL, optimization, partitioning, clustering); Looker / Looker Studio (dashboarding, modeling); GCP tools: Cloud Storage, Pub/Sub, Dataflow, Functions. GCP certifications, A/B testing, and product analytics preferred.

Posted 3 days ago

Apply

5.0 - 10.0 years

25 - 30 Lacs

Pune

Work from Office

Equifax is seeking creative, high-energy, and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.

What you'll do: Design, develop, and operate high-scale applications across the full engineering stack. Design, develop, test, deploy, maintain, and improve software. Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.). Work across teams to integrate our systems with existing internal systems, Data Fabric, and the CSA Toolset. Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality. Participate in a tight-knit, globally distributed engineering team. Triage product or system issues and debug/track/resolve them by analyzing the sources of issues and their impact on network or service operations and quality. Manage sole project priorities, deadlines, and deliverables. Research, create, and develop software applications to extend and improve on Equifax solutions. Collaborate on scalability issues involving access to data and information. Actively participate in Sprint planning, Sprint retrospectives, and other team activities.

What experience you need: Bachelor's degree or equivalent experience. 5+ years of software engineering experience. 5+ years of experience writing, debugging, and troubleshooting code in mainstream Java, Spring Boot, TypeScript/JavaScript, HTML, CSS. 5+ years of experience with cloud technology: GCP, AWS, or Azure. 5+ years of experience designing and developing cloud-native solutions. 5+ years of experience designing and developing microservices using Java, Spring Boot, GCP SDKs, GKE/Kubernetes. 5+ years of experience deploying and releasing software using Jenkins CI/CD pipelines, with an understanding of infrastructure-as-code concepts, Helm charts, and Terraform constructs.

What could set you apart: Self-starter who identifies and responds to priority shifts with minimal supervision. Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, Pub/Sub, GCS, Composer/Airflow, and others. UI development (e.g., HTML, JavaScript, Angular, and Bootstrap). Experience with backend technologies such as Java/J2EE, Spring Boot, SOA, and microservices. Source code control management systems (e.g., SVN/Git, GitHub) and build tools like Maven and Gradle. Agile environments (e.g., Scrum, XP). Relational databases (e.g., SQL Server, MySQL). Atlassian tooling (e.g., JIRA, Confluence, and GitHub). Developing with a modern JDK (v1.7+). Automated testing: JUnit, Selenium, LoadRunner, SoapUI.

Primary Location: IND-Pune-Equifax Analytics-PEC. Function: Tech Dev and Client Services. Schedule: Full time.

Posted 3 days ago

Apply

5.0 - 10.0 years

9 - 10 Lacs

Thiruvananthapuram

Work from Office

Equifax is seeking creative, high-energy, and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.

What you'll do: Design, develop, and operate high-scale applications across the full engineering stack. Design, develop, test, deploy, maintain, and improve software. Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.). Work across teams to integrate our systems with existing internal systems, Data Fabric, and the CSA Toolset. Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality. Participate in a tight-knit, globally distributed engineering team. Triage product or system issues and debug/track/resolve them by analyzing the sources of issues and their impact on network or service operations and quality. Manage sole project priorities, deadlines, and deliverables. Research, create, and develop software applications to extend and improve on Equifax solutions. Collaborate on scalability issues involving access to data and information. Actively participate in Sprint planning, Sprint retrospectives, and other team activities.

What experience you need: Bachelor's degree or equivalent experience. 5+ years of software engineering experience. 5+ years of experience writing, debugging, and troubleshooting code in mainstream Java, Spring Boot, TypeScript/JavaScript, HTML, CSS. 5+ years of experience with cloud technology: GCP, AWS, or Azure. 5+ years of experience designing and developing cloud-native solutions. 5+ years of experience designing and developing microservices using Java, Spring Boot, GCP SDKs, GKE/Kubernetes. 5+ years of experience deploying and releasing software using Jenkins CI/CD pipelines, with an understanding of infrastructure-as-code concepts, Helm charts, and Terraform constructs.

What could set you apart: Self-starter who identifies and responds to priority shifts with minimal supervision. Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, Pub/Sub, GCS, Composer/Airflow, and others. UI development (e.g., HTML, JavaScript, Angular, and Bootstrap). Experience with backend technologies such as Java/J2EE, Spring Boot, SOA, and microservices. Source code control management systems (e.g., SVN/Git, GitHub) and build tools like Maven and Gradle. Agile environments (e.g., Scrum, XP). Relational databases (e.g., SQL Server, MySQL). Atlassian tooling (e.g., JIRA, Confluence, and GitHub). Developing with a modern JDK (v1.7+). Automated testing: JUnit, Selenium, LoadRunner, SoapUI.

Primary Location: IND-Trivandrum-Equifax Analytics-PEC. Function: Tech Dev and Client Services. Schedule: Full time.

Posted 3 days ago

Apply

5.0 - 10.0 years

9 - 10 Lacs

Thiruvananthapuram

Work from Office

Equifax is seeking creative, high-energy, and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.

What you'll do: Design, develop, and operate high-scale applications across the full engineering stack. Design, develop, test, deploy, maintain, and improve software. Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.). Work across teams to integrate our systems with existing internal systems, Data Fabric, and the CSA Toolset. Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality. Participate in a tight-knit, globally distributed engineering team. Triage product or system issues and debug/track/resolve them by analyzing the sources of issues and their impact on network or service operations and quality. Manage sole project priorities, deadlines, and deliverables. Research, create, and develop software applications to extend and improve on Equifax solutions. Collaborate on scalability issues involving access to data and information. Actively participate in Sprint planning, Sprint retrospectives, and other team activities.

What experience you need: Bachelor's degree or equivalent experience. 5+ years of software engineering experience. 5+ years of experience writing, debugging, and troubleshooting code in mainstream Java, Spring Boot, TypeScript/JavaScript, HTML, CSS, REST APIs. 5+ years of experience with cloud technology: GCP, AWS, or Azure. 5+ years of experience designing and developing cloud-native solutions. 5+ years of experience designing and developing microservices using Java, Spring Boot, GCP SDKs, GKE/Kubernetes. 5+ years of experience deploying and releasing software using Jenkins CI/CD pipelines, with an understanding of infrastructure-as-code concepts, Helm charts, and Terraform constructs.

What could set you apart: Self-starter who identifies and responds to priority shifts with minimal supervision. Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, Pub/Sub, GCS, Composer/Airflow, and others. UI development (e.g., HTML, JavaScript, Angular, and Bootstrap). Experience with backend technologies such as Java/J2EE, Spring Boot, SOA, and microservices. Source code control management systems (e.g., SVN/Git, GitHub) and build tools like Maven and Gradle. Agile environments (e.g., Scrum, XP). Relational databases (e.g., SQL Server, MySQL). Atlassian tooling (e.g., JIRA, Confluence, and GitHub). Developing with a modern JDK (v1.7+). Automated testing: JUnit, Selenium, LoadRunner, SoapUI.

Primary Location: IND-Trivandrum-Equifax Analytics-PEC. Function: Tech Dev and Client Services. Schedule: Full time.

Posted 3 days ago

Apply

3.0 - 6.0 years

30 - 39 Lacs

Gurugram

Work from Office

Responsibilities: Design, develop, test, and maintain full-stack applications using Python, React.js, Node.js, NestJS, TypeScript, and Next.js, with REST APIs on AWS/Azure/GCP cloud platforms. Perks: annual bonus.

Posted 3 days ago

Apply

10.0 - 14.0 years

12 - 17 Lacs

Bengaluru

Work from Office

Skill required: Tech for Operations - Artificial Intelligence (AI). Designation: AI/ML Computational Science Assoc Mgr. Qualifications: Any Graduation. Years of Experience: 10 to 14 years.

About Accenture: Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com.

What would you do: You will be part of the Technology for Operations (TFO) team, which acts as a trusted advisor and partner to Accenture Operations. The team provides innovative and secure technologies to help clients build an intelligent operating model, driving exceptional results, and works closely with the sales, offering, and delivery teams to identify and build innovative solutions. Major sub-deals include AHO (Application Hosting Operations), ISMT (Infrastructure Management), and Intelligent Automation. In Artificial Intelligence, you will enhance business results by using AI tools and techniques to perform tasks such as visual perception, speech recognition, decision-making, and translation between languages that otherwise require human intelligence.

What are we looking for: Artificial Neural Networks (ANNs), machine learning, results orientation, problem-solving skills, ability to perform under pressure, strong analytical skills, written and verbal communication.

Roles and Responsibilities: In this role you are required to analyse and solve moderately complex problems. You will typically create new solutions, leveraging and, where needed, adapting existing methods and procedures. The role requires understanding of the strategic direction set by senior management as it relates to team goals. Primary upward interaction is with your direct supervisor or team leads; you will generally interact with peers and/or management levels at a client and/or within Accenture, and should require minimal guidance when determining methods and procedures on new assignments. Decisions often impact the team in which they reside and occasionally impact other teams. You would manage medium-to-small-sized teams and/or work efforts (if in an individual contributor role) at a client or within Accenture. Please note that this role may require you to work in rotational shifts. Qualification: Any Graduation.

Posted 4 days ago

Apply

4.0 - 8.0 years

10 - 18 Lacs

Hyderabad

Hybrid

About the Role: We are looking for a skilled and motivated Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining robust and scalable ETL (Extract, Transform, Load) data pipelines. The role involves working with various GCP services, implementing data ingestion and transformation logic, and ensuring data quality and consistency across systems. Key Responsibilities: Design, develop, test, and maintain scalable ETL data pipelines using Python. Work extensively on Google Cloud Platform (GCP) services such as: Dataflow for real-time and batch data processing Cloud Functions for lightweight serverless compute BigQuery for data warehousing and analytics Cloud Composer for orchestration of data workflows (based on Apache Airflow) Google Cloud Storage (GCS) for managing data at scale IAM for access control and security Cloud Run for containerized applications Perform data ingestion from various sources and apply transformation and cleansing logic to ensure high-quality data delivery. Implement and enforce data quality checks, validation rules, and monitoring. Collaborate with data scientists, analysts, and other engineering teams to understand data needs and deliver efficient data solutions. Manage version control using GitHub and participate in CI/CD pipeline deployments for data projects. Write complex SQL queries for data extraction and validation from relational databases such as SQL Server, Oracle, or PostgreSQL. Document pipeline designs, data flow diagrams, and operational support procedures. Required Skills: 4-6 years of hands-on experience in Python for backend or data engineering projects. Strong understanding and working experience with GCP cloud services (especially Dataflow, BigQuery, Cloud Functions, Cloud Composer, etc.). 
Solid understanding of data pipeline architecture, data integration, and transformation techniques. Experience in working with version control systems like GitHub and knowledge of CI/CD practices. Strong experience in SQL with at least one enterprise database (SQL Server, Oracle, PostgreSQL, etc.). Good to Have (Optional Skills): Experience working with Snowflake cloud data platform. Hands-on knowledge of Databricks for big data processing and analytics. Familiarity with Azure Data Factory (ADF) and other Azure data engineering tools.
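The listing above calls for implementing and enforcing data-quality checks and validation rules. A hedged sketch of how such rules might be declared and applied (rule names, fields, and thresholds are invented for illustration; a real pipeline might run these in Dataflow or Cloud Composer tasks):

```python
# Declarative data-quality checks: each named rule is a predicate over a
# row; validate() returns passing rows plus a per-rule failure count for
# monitoring dashboards or alerts.

RULES = {
    "non_null_id": lambda row: row.get("id") is not None,
    "positive_amount": lambda row: row.get("amount", 0) > 0,
    "known_source": lambda row: row.get("source") in {"oracle", "postgres", "sqlserver"},
}

def validate(rows):
    """Return rows passing every rule, and how often each rule failed."""
    failures = {name: 0 for name in RULES}
    good = []
    for row in rows:
        failed = [name for name, rule in RULES.items() if not rule(row)]
        for name in failed:
            failures[name] += 1
        if not failed:
            good.append(row)
    return good, failures

rows = [
    {"id": 1, "amount": 10.0, "source": "oracle"},
    {"id": None, "amount": 5.0, "source": "postgres"},   # fails non_null_id
    {"id": 2, "amount": -1.0, "source": "mysql"},        # fails two rules
]
good, failures = validate(rows)
print(len(good), failures["positive_amount"])  # 1 1
```

Counting failures per rule, rather than just dropping bad rows, is what makes the checks useful for the monitoring side of the role.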

Posted 4 days ago

Apply

7.0 - 12.0 years

15 - 30 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Warm greetings from SP Staffing Services Private Limited! We have an urgent opening with our CMMI Level 5 client for the below position. Please send your updated profile if you are interested. Relevant Experience: 5 - 15 Yrs. Location: Pan India. Job Description: Minimum 2 years' hands-on experience in GCP Development (Data Engineering). Position: Developer / Tech Lead / Architect. Interested candidates can share their resume to sankarspstaffings@gmail.com with the below details inline: Overall Exp, Relevant Exp, Current CTC, Expected CTC, Notice Period.

Posted 1 week ago

Apply

10.0 - 15.0 years

10 - 15 Lacs

Bengaluru

Work from Office

1. Candidate must be good at writing automation software using Python
2. Good understanding of networking concepts and TCP/IP
3. Must be good at software fundamentals
4. Comfortable with CI/CD
5. Willingness to learn new programming languages and technologies
6. Comfortable with REST and microservices development
7. Comfortable writing software on GCP, including Google Pub/Sub
8. Strong hands-on experience in Linux
9. Exposure to PostgreSQL, Google Datastore SQL, Redis is preferable
10. Strong design and programming skills
11. Experience in cloud technologies
12. Works effectively with geographically distributed teams

Posted 1 week ago

Apply

5.0 - 10.0 years

20 - 35 Lacs

Hyderabad

Remote

Canterr is looking for talented and passionate professionals for exciting opportunities with a US-based MNC product company! You will be employed permanently by Canterr and deployed to a top-tier global tech client.

Key Responsibilities: Design and develop data pipelines and ETL processes to ingest, process, and store large volumes of data. Implement and manage big data technologies such as Kafka, Dataflow, BigQuery, CloudSQL, and Pub/Sub. Collaborate with stakeholders to understand data requirements and deliver high-quality data solutions. Monitor and troubleshoot data pipeline issues and implement solutions to prevent future occurrences.

Required Skills and Experience: Generally, Google Cloud Platform (GCP) is used for all software deployed at Wayfair. Data storage and processing: BigQuery, CloudSQL, PostgreSQL, DataProc, Pub/Sub. Data modeling: breaking business requirements (KPIs) down into data points and building a scalable data model. ETL tools: DBT, SQL. Data orchestration and ETL: Dataflow, Cloud Composer. Infrastructure and deployment: Kubernetes, Helm. Data access and management: Looker, Terraform.

Ideal Business Domain Experience: Supply chain or warehousing experience. The project is focused on building a normalized data layer which ingests information from multiple Warehouse Management Systems (WMS) and projects it for back-office analysis.

Posted 1 week ago

Apply

4.0 - 9.0 years

0 - 2 Lacs

Chennai

Hybrid

Job Description: We are seeking a skilled and proactive GCP Data Engineer with strong experience in Python and SQL to build and manage scalable data pipelines on Google Cloud Platform (GCP). The ideal candidate will work closely with data analysts, architects, and business teams to enable data-driven decision-making. Key Responsibilities: Design and develop robust data pipelines and ETL/ELT processes using GCP services. Write efficient Python scripts for data processing, transformation, and automation. Develop complex SQL queries for data extraction, aggregation, and analysis. Work with tools like BigQuery, Cloud Storage, Cloud Functions, and Pub/Sub. Ensure high data quality, integrity, and governance across datasets. Optimize data workflows for performance and scalability. Collaborate with cross-functional teams to define and deliver data solutions. Monitor, troubleshoot, and resolve issues in data workflows and pipelines. Required Skills: Hands-on experience with Google Cloud Platform (GCP). Strong programming skills in Python for data engineering tasks. Advanced proficiency in SQL for working with large datasets. Experience with BigQuery, Cloud Storage, and Cloud Functions. Familiarity with streaming and batch processing (e.g., Pub/Sub, Dataflow, or Dataproc).
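"Complex SQL queries for data extraction, aggregation, and analysis" of the kind this role describes look roughly like the following, shown here against an in-memory SQLite database rather than BigQuery, with invented table and column names:

```python
# Self-contained GROUP BY / HAVING aggregation example using sqlite3
# (standing in for BigQuery; the SQL shape carries over).

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (city TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [("Chennai", 100.0), ("Chennai", 50.0), ("Pune", 75.0)],
)

# Aggregate per city, keeping only cities above a spend threshold.
rows = conn.execute(
    """
    SELECT city, COUNT(*) AS n, SUM(amount) AS total
    FROM events
    GROUP BY city
    HAVING total > 60
    ORDER BY total DESC
    """
).fetchall()
print(rows)  # [('Chennai', 2, 150.0), ('Pune', 1, 75.0)]
```

In BigQuery the same query would typically add partition filters and window functions; the aggregation logic itself is unchanged.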

Posted 1 week ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Mumbai

Work from Office

Job Summary: This position provides input and support for, and performs, full systems life cycle management activities (e.g., analyses, technical requirements, design, coding, testing, implementation of systems and applications software, etc.). He/She participates in component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements, and provides input to applications development project plans and integrations. He/She collaborates with teams and supports emerging technologies to ensure effective communication and achievement of objectives, provides knowledge and support for applications development, integration, and maintenance, and provides input to department and project teams on decisions supporting projects.

Technical Skills: Strong proficiency in .NET, .NET Core, C#, REST APIs. Strong expertise in PostgreSQL. Additional preferred skills: Docker, Kubernetes. Cloud: GCP, with services such as Google Cloud Storage and Pub/Sub. Monitoring tools: Dynatrace, Grafana. API security and tooling (SonarQube). Agile methodology.

Key Responsibilities: Design, develop, and maintain scalable C# applications and microservice implementations. Implement RESTful APIs for efficient communication between client and server applications. Collaborate with product owners to understand requirements and create technical specifications. Build robust database solutions and ensure efficient data retrieval. Write clean, maintainable, and efficient code. Conduct unit testing, integration testing, and code reviews to maintain code quality. Work on implementation of industry-standard protocols related to API security, including OAuth. Implement scalable and high-performance solutions that integrate with Pub/Sub messaging systems and other GCP services (BigQuery, Dataflow, Spanner, etc.). Collaborate with cross-functional teams to define, design, and deliver new features. Integrate and manage data flows between different systems using Kafka, Pub/Sub, and other middleware technologies.

Qualifications: Bachelor's Degree or international equivalent. 8+ years of IT experience in .NET. Bachelor's Degree or international equivalent in Computer Science, Information Systems, Mathematics, Statistics, or a related field - preferred.

Posted 1 week ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Mumbai

Work from Office

Job Summary: This position provides input and support for, and performs, full systems life cycle management activities (e.g., analysis, technical requirements, design, coding, testing, and implementation of systems and applications software). He/She participates in component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements, and provides input to applications development project plans and integrations. He/She collaborates with teams and supports emerging technologies to ensure effective communication and achievement of objectives, provides knowledge and support for applications development, integration, and maintenance, and provides input to department and project teams on decisions supporting projects.

Technical Skills: Strong proficiency in .NET, .NET Core, C#, and REST APIs. Strong expertise in PostgreSQL.
Additional Preferred Skills: Docker, Kubernetes.
Cloud: GCP, with services such as Google Cloud Storage and Pub/Sub.
Monitoring Tools: Dynatrace, Grafana. API security and tooling (SonarQube). Agile methodology.

Key Responsibilities:
Design, develop, and maintain scalable C# applications and microservice implementations.
Implement RESTful APIs for efficient communication between client and server applications.
Collaborate with product owners to understand requirements and create technical specifications.
Build robust database solutions and ensure efficient data retrieval.
Write clean, maintainable, and efficient code.
Conduct unit testing, integration testing, and code reviews to maintain code quality.
Implement industry-standard API security protocols, including OAuth.
Implement scalable, high-performance solutions that integrate with Pub/Sub messaging systems and other GCP services (BigQuery, Dataflow, Spanner, etc.).
Collaborate with cross-functional teams to define, design, and deliver new features.
Integrate and manage data flows between different systems using Kafka, Pub/Sub, and other middleware technologies.

Qualifications:
8+ years of IT experience in .NET/C#.
Bachelor's Degree or international equivalent; a degree in Computer Science, Information Systems, Mathematics, Statistics, or a related field is preferred.
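The publish/subscribe decoupling this role centres on (services exchanging messages through GCP Pub/Sub or Kafka topics rather than calling each other directly) can be sketched with a minimal in-memory broker. This is an illustrative toy, not the google-cloud-pubsub client API; all names here are hypothetical.

```python
from collections import defaultdict
from typing import Callable, DefaultDict, List

class InMemoryBroker:
    """Toy stand-in for a Pub/Sub topic: publishers and subscribers
    never reference each other directly, only the topic name."""

    def __init__(self) -> None:
        self._subs: DefaultDict[str, List[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subs[topic].append(handler)

    def publish(self, topic: str, message: dict) -> int:
        # Deliver to every subscriber of the topic; a real broker would do
        # this asynchronously, with acks, retries, and dead-lettering.
        for handler in self._subs[topic]:
            handler(message)
        return len(self._subs[topic])

broker = InMemoryBroker()
received = []
broker.subscribe("orders", received.append)
delivered = broker.publish("orders", {"order_id": 42, "status": "created"})
```

The point of the pattern is that the publishing microservice only knows the topic name, so new consumers can be added without touching the producer.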

Posted 1 week ago

Apply

4.0 - 8.0 years

10 - 19 Lacs

Chennai

Hybrid


Greetings from Getronics! We have permanent opportunities for GCP Data Engineers in Chennai. Hope you are doing well! This is Abirami from the Getronics Talent Acquisition team. We have multiple opportunities for GCP Data Engineers for our automotive client at the Chennai Sholinganallur location. Please find the company profile and job description below. If interested, please share your updated resume, a recent professional photograph, and Aadhaar proof at the earliest to abirami.rsk@getronics.com.

Company: Getronics (permanent role)
Client: Automobile industry
Experience Required: 4+ years in IT and a minimum of 3+ years in GCP Data Engineering
Location: Chennai (ELCOT - Sholinganallur)
Work Mode: Hybrid

Position Description: We are seeking a seasoned GCP Cloud Data Engineer with 3 to 5 years of experience leading and implementing GCP data projects, preferably having implemented a complete data-centric model. This position will design and deploy a data-centric architecture in GCP for a Materials Management platform that exchanges data with multiple modern and legacy applications across Product Development, Manufacturing, Finance, Purchasing, N-tier Supply Chain, and Supplier Collaboration.
• Design and implement data-centric solutions on Google Cloud Platform (GCP) using GCP tools such as Storage Transfer Service, Cloud Data Fusion, Pub/Sub, Dataflow, Cloud Scheduler, gsutil, FTP/SFTP, Dataproc, Bigtable, etc.
• Build ETL pipelines to ingest data from heterogeneous sources into our systems.
• Develop data processing pipelines in programming languages such as Java and Python to extract, transform, and load (ETL) data.
• Create and maintain data models, ensuring efficient storage, retrieval, and analysis of large datasets.
• Deploy and manage SQL and NoSQL databases, such as Bigtable, Firestore, or Cloud SQL, based on project requirements.

Skills Required:
• GCP Data Engineering, Hadoop, Spark/PySpark.
• Google Cloud Platform services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine.
• 4+ years of professional experience in data engineering, data product development, and software product launches.
• 3+ years of cloud data/software engineering experience building scalable, reliable, and cost-effective production batch and streaming data pipelines using: data warehouses such as Google BigQuery; workflow orchestration tools such as Airflow; relational database management systems such as MySQL, PostgreSQL, and SQL Server; real-time data streaming platforms such as Apache Kafka and GCP Pub/Sub.

Education Required: Any Bachelor's degree. Candidates should be willing to take a GCP assessment (1-hour online video test).

LOOKING FOR IMMEDIATE TO 30 DAYS NOTICE CANDIDATES ONLY.

Regards,
Abirami
Getronics Recruitment Team
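The batch-pipeline shape this posting describes (extract from heterogeneous sources, transform, load into a warehouse) can be sketched as three pure Python steps. This is a minimal illustration of the ETL pattern only; field names are made up, and a real pipeline here would run on Dataflow or PySpark against GCS/BigQuery rather than in-memory lists.

```python
def extract(rows):
    """Pretend source read (in the posting's stack: GCS, Cloud SQL, Kafka, ...)."""
    return list(rows)

def transform(rows):
    """Normalise and filter records, as a Dataflow/PySpark step would."""
    return [
        {"part": r["part"].strip().upper(), "qty": int(r["qty"])}
        for r in rows
        if r.get("qty") is not None  # drop records with no quantity
    ]

def load(rows, sink):
    """Pretend warehouse write (BigQuery in the posting's stack)."""
    sink.extend(rows)
    return len(rows)

source = [{"part": " ax-100 ", "qty": "3"}, {"part": "bx-2", "qty": None}]
warehouse = []
loaded = load(transform(extract(source)), warehouse)
```

Keeping each stage a side-effect-free function of its input is what makes such pipelines testable and easy to port between orchestrators like Airflow or Cloud Composer.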

Posted 1 week ago

Apply

5.0 - 7.0 years

7 - 9 Lacs

Bengaluru

Work from Office


Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer, etc. Must have hands-on Python and SQL experience, be proactive and collaborative, and be able to respond to critical situations. Able to analyse data against functional business requirements and interface directly with customers.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer.
You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.

Preferred technical and professional experience:
Intuitive individual with an ability to manage change and proven time management.
Proven interpersonal skills, contributing to team effort by accomplishing related results as needed.
Keeps technical knowledge up to date by attending educational workshops and reviewing publications, with a focus on the required skills.
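The Python-plus-SQL analysis loop this analyst role calls for can be sketched with stdlib sqlite3 standing in for the BigQuery client; the table and column names below are invented for illustration, and against BigQuery the same GROUP BY query would go through the google-cloud-bigquery client instead.

```python
import sqlite3

# In-memory database standing in for a warehouse table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (service TEXT, latency_ms INTEGER)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [("cloud-run", 120), ("cloud-run", 80), ("dataflow", 300)],
)

# Aggregate per service, the typical shape of a functional-requirements query.
rows = conn.execute(
    "SELECT service, AVG(latency_ms) FROM events "
    "GROUP BY service ORDER BY service"
).fetchall()
conn.close()
```

The habit being tested in such roles is pushing aggregation into SQL and keeping Python for orchestration and presentation of the results.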

Posted 1 week ago

Apply

5.0 - 7.0 years

7 - 9 Lacs

Bengaluru

Work from Office


Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer, etc. Must have hands-on Python and SQL experience, be proactive and collaborative, and be able to respond to critical situations. Able to analyse data against functional business requirements and interface directly with customers.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer.
You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.

Preferred technical and professional experience:
Intuitive individual with an ability to manage change and proven time management.
Proven interpersonal skills, contributing to team effort by accomplishing related results as needed.
Keeps technical knowledge up to date by attending educational workshops and reviewing publications, with a focus on the required skills.

Posted 1 week ago

Apply

5.0 - 7.0 years

7 - 9 Lacs

Gurugram

Work from Office


Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer, etc. Must have hands-on Python and SQL experience, be proactive and collaborative, and be able to respond to critical situations. Able to analyse data against functional business requirements and interface directly with customers.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer.
You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.

Preferred technical and professional experience:
Intuitive individual with an ability to manage change and proven time management.
Proven interpersonal skills, contributing to team effort by accomplishing related results as needed.
Keeps technical knowledge up to date by attending educational workshops and reviewing publications, with a focus on the required skills.

Posted 1 week ago

Apply

5.0 - 7.0 years

7 - 9 Lacs

Gurugram

Work from Office


Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer, etc. Must have hands-on Python and SQL experience, be proactive and collaborative, and be able to respond to critical situations. Able to analyse data against functional business requirements and interface directly with customers.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer.
You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.

Preferred technical and professional experience:
Intuitive individual with an ability to manage change and proven time management.
Proven interpersonal skills, contributing to team effort by accomplishing related results as needed.
Keeps technical knowledge up to date by attending educational workshops and reviewing publications, with a focus on the required skills.

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.


Featured Companies