
207 Dataproc Jobs - Page 3

JobPe aggregates results for easy application access, but you apply directly on the job portal itself.

5.0 - 10.0 years

15 - 30 Lacs

Pune, Bengaluru, Delhi / NCR

Hybrid

Mandatory Skills: Apache Beam, BigQuery, Dataflow, Dataproc, Composer, Airflow, PySpark, Python, SQL.
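For illustration, here is a minimal, hedged sketch of the stack this listing names: an Apache Beam pipeline (runnable on Dataflow) that reads text from Cloud Storage and appends rows to BigQuery. The project, bucket, and table names are placeholders, not details from the posting, and the target table is assumed to already exist.

```python
# A minimal Beam pipeline sketch: GCS text -> parsed dicts -> BigQuery.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(
    runner="DataflowRunner",        # use "DirectRunner" to test locally
    project="my-project",           # hypothetical project id
    region="us-central1",
    temp_location="gs://my-bucket/tmp",
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://my-bucket/input/*.csv")
        | "Parse" >> beam.Map(lambda line: dict(zip(["id", "value"], line.split(","))))
        | "Write" >> beam.io.WriteToBigQuery(
            "my-project:analytics.events",  # assumes this table already exists
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```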

Posted 2 weeks ago

Apply

6.0 - 11.0 years

12 - 22 Lacs

Chennai

Hybrid

Greetings from Getronics! We have permanent opportunities for GCP Data Engineers in Chennai. This is Jogeshwari from the Getronics Talent Acquisition team. Please find the company profile and job description below; if interested, please share your updated resume, a recent professional photograph, and Aadhaar proof at the earliest to jogeshwari.k@getronics.com.
Company: Getronics (permanent role). Client: Automobile industry. Experience required: 6+ years in IT and a minimum of 4+ years in GCP data engineering. Location: Chennai.
Skills Required: GCP Data Engineer, Hadoop, Spark/PySpark, and Google Cloud Platform services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine. 6+ years of professional experience in data engineering, data product development, and software product launches. 4+ years of cloud data engineering experience building scalable, reliable, and cost-effective production batch and streaming data pipelines using: data warehouses like Google BigQuery; workflow orchestration tools like Airflow; relational database management systems like MySQL, PostgreSQL, and SQL Server; real-time data streaming platforms like Apache Kafka and GCP Pub/Sub.
LOOKING FOR IMMEDIATE TO 30 DAYS NOTICE CANDIDATES ONLY. Regards, Jogeshwari, Senior Specialist
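As a hedged sketch of the batch orchestration pattern this role describes, the Airflow DAG below submits a PySpark job to Dataproc and then loads the output into BigQuery. All identifiers (project, cluster, bucket, dataset, file URIs) are illustrative placeholders, not details from the posting.

```python
# Airflow DAG sketch: Dataproc PySpark transform, then GCS -> BigQuery load.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitJobOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

PYSPARK_JOB = {
    "reference": {"project_id": "my-project"},          # placeholder project
    "placement": {"cluster_name": "etl-cluster"},       # placeholder cluster
    "pyspark_job": {"main_python_file_uri": "gs://my-bucket/jobs/transform.py"},
}

with DAG("gcp_batch_pipeline", start_date=datetime(2024, 1, 1),
         schedule="@daily", catchup=False) as dag:
    transform = DataprocSubmitJobOperator(
        task_id="run_pyspark",
        job=PYSPARK_JOB,
        region="us-central1",
        project_id="my-project",
    )
    load = GCSToBigQueryOperator(
        task_id="load_to_bq",
        bucket="my-bucket",
        source_objects=["output/part-*.csv"],
        destination_project_dataset_table="my-project.analytics.daily_events",
        write_disposition="WRITE_TRUNCATE",
    )
    transform >> load  # run the load only after the Spark job succeeds
```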

Posted 2 weeks ago

Apply

1.0 - 2.0 years

3 - 6 Lacs

Dhule

Work from Office

Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow. Programming languages: Java, plus scripting languages like Python, shell script, and SQL. 5+ years of experience in IT application delivery with proven experience in agile development methodologies. 1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, and data processing with Dataflow).
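The GCS-plus-BigQuery combination in this skill list can be sketched in a few lines with the official Python client; this is a hedged example only, and the bucket, dataset, and table names are hypothetical.

```python
# Load a CSV from Cloud Storage into a BigQuery table.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,  # infer the schema from the file
)
load_job = client.load_table_from_uri(
    "gs://my-bucket/exports/orders.csv",   # placeholder source file
    "my-project.sales.orders",             # placeholder destination table
    job_config=job_config,
)
load_job.result()  # block until the load job completes
print(f"Loaded {client.get_table('my-project.sales.orders').num_rows} rows")
```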

Posted 2 weeks ago

Apply

4.0 - 6.0 years

6 - 16 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

What you'll be doing: We are looking for data engineers who can work with world-class team members to help drive our telecom business to its full potential. We are building data products and assets for the telecom wireless and wireline business, including consumer analytics, telecom network performance, and service assurance analytics. We are working on cutting-edge technologies like digital twin to build these analytical platforms and provide data support for varied AI/ML implementations. As a data engineer you will collaborate with business product owners, coaches, industry-renowned data scientists, and system architects to develop strategic data solutions from sources that include batch, file, and data streams. As a Data Engineer with ETL/ELT expertise for our growing data platform and analytics teams, you will understand and enable the required data sets from different sources, both structured and unstructured, into our data warehouse and data lake, with real-time streaming and/or batch processing, to generate insights and perform analytics for business teams within the company. Responsibilities include: understanding the business requirements and converting them to technical design; working on data ingestion, preparation, and transformation; developing data streaming applications; debugging production failures and identifying solutions; working on ETL/ELT development; and understanding the DevOps process and contributing to DevOps pipelines.
What we're looking for: You're curious about new technologies and the game-changing possibilities they create. You like to stay up to date with the latest trends and apply your technical expertise to solving business problems.
You'll need to have: a Bachelor's degree or four or more years of work experience; experience with data warehouse concepts and the data management life cycle; experience with the GCP cloud platform (BigQuery, Cloud Composer, Dataproc (or Hadoop + Spark), Cloud Functions); experience in a programming language, preferably Python; proficiency in graph data modeling, including experience with graph data models and a graph query language; exposure to GenAI use cases; experience troubleshooting data issues; experience writing complex SQL and performance tuning; experience in DevOps; experience with GraphDB and Core Java; and experience with real-time streaming and lambda architecture.
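The real-time streaming requirement above can be illustrated with a small, hedged Pub/Sub example: a streaming pull from a subscription using the official Python client. The project and subscription ids are placeholders.

```python
# Streaming pull from a GCP Pub/Sub subscription.
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path("my-project", "events-sub")

def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    print(f"Received: {message.data!r}")
    message.ack()  # acknowledge so the message is not redelivered

future = subscriber.subscribe(subscription_path, callback=callback)
try:
    future.result(timeout=30)  # stream for 30 seconds, then stop
except TimeoutError:
    future.cancel()
```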

Posted 2 weeks ago

Apply

1.0 - 2.0 years

3 - 5 Lacs

Ahmedabad

Work from Office

Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow. Programming languages: Java, plus scripting languages like Python, shell script, and SQL. 5+ years of experience in IT application delivery with proven experience in agile development methodologies. 1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, and data processing with Dataflow).

Posted 2 weeks ago

Apply

13.0 - 17.0 years

32 - 35 Lacs

Noida, Gurugram

Work from Office

Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow. Programming languages: Java, plus scripting languages like Python, shell script, and SQL. 5+ years of experience in IT application delivery with proven experience in agile development methodologies. 1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, and data processing with Dataflow).

Posted 2 weeks ago

Apply

3.0 - 5.0 years

10 - 15 Lacs

Bengaluru

Work from Office

Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow. Programming languages: Java, plus scripting languages like Python, shell script, and SQL. 5+ years of experience in IT application delivery with proven experience in agile development methodologies. 1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, and data processing with Dataflow).

Posted 2 weeks ago

Apply

12.0 - 18.0 years

50 - 90 Lacs

Bengaluru

Hybrid

What you'll be doing: As a Sr Manager for the Data Engineering team, you will manage data platforms and implement new technologies and tools to further enhance and enable data science/analytics, with a focus on driving scalable data management and governance practices. Leading the team of data engineers and solutions architects to deliver solutions to business teams. Driving the vision with the leadership team for data platform enrichment, covering areas like data warehousing, data lake, and BI across the portfolio, and defining and executing a plan to achieve that vision. Building a high-quality data engineering team and continuing to scale it up. Ensuring the team adheres to standard methodologies for data engineering practices. Building cross-functional relationships with data scientists, data analysts, and business teams to understand data needs and deliver data for insight solutions. Driving the design, building, and launching of new data models and data pipelines. Driving data quality across all data pipelines and related business areas.
Where you'll be working: In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager.
What we're looking for: You are curious and passionate about data and highly scalable data platforms. People count on you for your expertise in data management in all phases of the software development cycle. You create environments where teams thrive and feel valued, respected, and supported. You enjoy the challenge of managing resources and competing priorities in a dynamic, complex, and deadline-oriented environment. Building effective working relationships with other managers across the organization comes naturally to you.
You'll need to have: a Bachelor's degree or four or more years of work experience; six or more years of relevant work experience; two or more years of experience leading a team and tracking end-to-end deliverables; experience in end-to-end delivery of data platform solutions and working on large-scale data transformation; experience working with Google Cloud Platform, BigQuery, and Dataproc; experience working with big data technologies and utilities (Hadoop, Spark, Scala, Kafka, NiFi); experience with relational SQL and NoSQL databases; experience working with globally distributed teams; good communication and presentation skills; knowledge of data governance and data quality; experience building and mentoring a team; and the ability to meet tight deadlines, multi-task, and prioritize workload.

Posted 2 weeks ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Mumbai

Work from Office

Project Role: Data Engineer. Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems. Must-have skills: Google BigQuery. Good-to-have skills: Microsoft SQL Server, Google Cloud Data Services. Minimum 3 years of experience is required. Educational qualification: 15 years of full-time education.
Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages. Your role will be pivotal in enhancing the overall data infrastructure and ensuring that data is accessible and reliable for decision-making purposes.
Project Role: Analytics and Modeler. Project Role Description: Analyze and model client, market, and key performance data. Use analytical tools and techniques to develop business insights and improve decision-making.
Key Responsibilities: Dataproc, Pub/Sub, Dataflow, Kafka streaming, Looker, SQL (No FLEX). 1: Proven track record of delivering data integration and data warehousing solutions. 2: Strong SQL and hands-on experience (No FLEX); experience with data integration and migration projects. 3: Proficient in the BigQuery SQL language (No FLEX). 4: Understanding of cloud-native services: bucket storage, GBQ, Cloud Functions, Pub/Sub, Composer, and Kubernetes; experience in cloud solutions, mainly data platform services; GCP certifications. 5: Experience in shell scripting, Python (No FLEX), Oracle, and SQL.
Technical Experience: 1: Expert in Python (No FLEX); strong hands-on skills and strong knowledge of SQL (No FLEX); Python programming using Pandas and NumPy; deep understanding of data structures (dictionary, array, list, tree, etc.); experience with pytest and code coverage skills preferred. 2: Strong hands-on experience building solutions using cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, Kubernetes, etc. (No FLEX). 3: Proficiency with tools to automate AzDO CI/CD pipelines, such as Control-M, GitHub, JIRA, and Confluence. 4: Open mindset and ability to quickly adapt to new technologies. 5: Performance tuning of BigQuery SQL scripts. 6: GCP certification preferred. 7: Working in an agile environment.
Professional Attributes: 1: Good communication skills. 2: Ability to collaborate with different teams and suggest solutions. 3: Ability to work independently with little supervision, or as part of a team. 4: Good analytical and problem-solving skills. 5: Good team-handling skills.
Educational Qualification: 15 years of full-time education.
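The "BigQuery SQL plus Python" emphasis above can be sketched with a parameterized query run through the official client, with results pulled into pandas. This is an illustration only; the dataset, table, and column names are invented.

```python
# Parameterized BigQuery query from Python, results as a pandas DataFrame.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project

query = """
    SELECT customer_id, SUM(amount) AS total
    FROM `my-project.sales.orders`
    WHERE order_date >= @start_date
    GROUP BY customer_id
    ORDER BY total DESC
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("start_date", "DATE", "2024-01-01"),
    ]
)
df = client.query(query, job_config=job_config).to_dataframe()
print(df.head())
```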

Posted 3 weeks ago

Apply

5.0 - 10.0 years

20 - 35 Lacs

Pune

Work from Office

Description: Hiring a Data Engineer with AWS or GCP cloud experience.
Role Summary: The Data Engineer will be responsible for designing, implementing, and maintaining the data infrastructure and pipelines necessary for AI/ML model training and deployment. They will work closely with data scientists and engineers to ensure data is clean, accessible, and efficiently processed.
Required Experience: • 6-8 years of experience in data engineering, ideally in financial services. • Strong proficiency in SQL, Python, and big data technologies (e.g., Hadoop, Spark). • Experience with cloud platforms (e.g., AWS, Azure, GCP) and data warehousing solutions. • Familiarity with ETL processes and tools. • Knowledge of data governance, security, and compliance best practices.
Key Responsibilities: • Build and maintain scalable data pipelines for data collection, processing, and analysis. • Ensure data quality and consistency for training and testing AI models. • Collaborate with data scientists and AI engineers to provide the required data for model development. • Optimize data storage and retrieval to support AI-driven applications. • Implement data governance practices to ensure compliance and security.
What We Offer: Exciting Projects: We focus on industries like high-tech, communication, media, healthcare, retail, and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them. Collaborative Environment: You can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment, or even abroad in one of our global centers or client facilities! Work-Life Balance: GlobalLogic prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays. Professional Development: Our dedicated Learning & Development team regularly organizes communication skills training (GL Vantage, Toastmasters), stress management programs, professional certifications, and technical and soft-skill trainings. Excellent Benefits: We provide our employees with competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS (National Pension Scheme), periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses. Fun Perks: We want you to love where you work, which is why we host sports events and cultural activities, offer food at subsidised rates, and throw corporate parties. Our vibrant offices also include dedicated GL Zones, rooftop decks, and the GL Club, where you can drink coffee or tea with your colleagues over a game of table tennis, and we offer discounts at popular stores and restaurants!

Posted 3 weeks ago

Apply

2.0 - 6.0 years

7 - 11 Lacs

Hyderabad

Work from Office

As an Application Developer, you will lead IBM into the future by translating system requirements into the design and development of customized systems in an agile environment. The success of IBM is in your hands as you transform vital business needs into code and drive innovation. Your work will power IBM and its clients globally, collaborating and integrating code into enterprise systems. You will have access to the latest education, tools, and technology, and a limitless career path with the world's technology leader. Come to IBM and make a global impact!
IBM's Cloud Services are focused on supporting clients on their cloud journey across any platform to achieve their business goals. This encompasses Cloud Advisory, Architecture, Cloud Native Development, Application Portfolio Migration, Modernization, and Rationalization, as well as Cloud Operations. Cloud Services supports all public/private/hybrid cloud deployments: IBM Bluemix/IBM Cloud, Red Hat, AWS, Azure, Google, and client private environments. Cloud Services has the best cloud developer, architect, complex SI, SysOps, and delivery talent, delivered through our GEO CIC Factory model.
As a member of our Cloud Practice, you will be responsible for defining and implementing application cloud migration, modernisation, and rationalisation solutions for clients across all sectors. You will support mobilisation and help to lead the quality of our programmes and services, liaise with clients, and provide consulting services, including: creating cloud migration strategies, defining delivery architecture, creating migration plans, designing orchestration plans, and more. You will assist in creating and executing migration run books, and evaluate source (physical, virtual, and cloud) and target workloads.
Required education: Bachelor's Degree. Preferred education: Master's Degree.
Required technical and professional expertise: GCP using Pub/Sub, BigQuery, Dataflow, Cloud Workflows/Cloud Scheduler, Cloud Run, Dataproc, and Cloud Functions. Cloud data engineers with GCP PDE certification and working experience with GCP, building end-to-end data pipelines using the services above. Experience in logging and monitoring of GCP services, and experience in Terraform and infrastructure automation. Expertise in the Python coding language. Develops, supports, and maintains data engineering solutions on the Google Cloud ecosystem.
Preferred technical and professional experience: Stay updated with the latest trends and advancements in cloud technologies, frameworks, and tools. Conduct code reviews and provide constructive feedback to maintain code quality and ensure adherence to best practices. Troubleshoot and debug issues, and deploy applications to the cloud platform.
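Since the posting calls out logging and monitoring of GCP services, here is a small, hedged example using the Cloud Logging Python client; the project id, logger name, and payload are illustrative placeholders.

```python
# Write and read structured log entries with Cloud Logging.
from google.cloud import logging as cloud_logging

client = cloud_logging.Client(project="my-project")  # placeholder project
logger = client.logger("pipeline-events")             # placeholder logger name

# Write a structured entry that can be filtered in Logs Explorer.
logger.log_struct(
    {"stage": "ingest", "records": 1200, "status": "ok"},
    severity="INFO",
)

# Read back a few recent entries for this logger.
for entry in client.list_entries(filter_='logName:"pipeline-events"', max_results=5):
    print(entry.timestamp, entry.payload)
```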

Posted 3 weeks ago

Apply

2.0 - 6.0 years

7 - 11 Lacs

Bengaluru

Work from Office

As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys. Your primary responsibilities include:
Comprehensive feature development and issue resolution: working on end-to-end feature development and solving challenges faced in the implementation.
Stakeholder collaboration and issue resolution: collaborating with key stakeholders, internal and external, to understand problems and issues with the product and its features, and solving those issues per the defined SLAs.
Continuous learning and technology integration: being eager to learn new technologies and implementing them in feature development.
Required education: Bachelor's Degree. Preferred education: Master's Degree.
Required technical and professional expertise: GCP using Pub/Sub, BigQuery, Dataflow, Cloud Workflows/Cloud Scheduler, Cloud Run, Dataproc, and Cloud Functions. Cloud data engineers with GCP PDE certification and working experience with GCP, building end-to-end data pipelines using the services above. Experience in logging and monitoring of GCP services, and experience in Terraform and infrastructure automation. Expertise in the Python coding language. Develops, supports, and maintains data engineering solutions on the Google Cloud ecosystem.
Preferred technical and professional experience: Stay updated with the latest trends and advancements in cloud technologies, frameworks, and tools. Conduct code reviews and provide constructive feedback to maintain code quality and ensure adherence to best practices. Troubleshoot and debug issues, and deploy applications to the cloud platform.

Posted 3 weeks ago

Apply

4.0 - 9.0 years

20 - 35 Lacs

Pune, Gurugram, Bengaluru

Hybrid

Salary: 20 to 35 LPA. Experience: 5 to 8 years. Location: Gurgaon (Hybrid). Notice: Immediate to 30 days.
Roles and Responsibilities: Design, develop, test, deploy, and maintain large-scale data pipelines using GCP services such as BigQuery, Dataflow, Pub/Sub, Dataproc, and Cloud Storage. Collaborate with cross-functional teams to identify business requirements and design solutions that meet those needs. Develop complex SQL queries to extract insights from large datasets stored in Google Cloud SQL databases. Troubleshoot issues related to data processing workflows and provide timely resolutions.
Desired Candidate Profile: 5-9 years of experience in data engineering with expertise in GCP and BigQuery data engineering. Strong understanding of GCP platform administration, including Compute Engine (Dataproc), Google Kubernetes Engine (GKE), Cloud Storage, Cloud SQL, etc. Experience working on big data analytics projects involving ETL processes using tools like Airflow or similar technologies.
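For the Dataproc piece of this stack, here is a hedged sketch that submits a PySpark job to an existing cluster with the google-cloud-dataproc client. The project, region, cluster, and file URIs are placeholders.

```python
# Submit a PySpark job to an existing Dataproc cluster and wait for it.
from google.cloud import dataproc_v1

region = "us-central1"
client = dataproc_v1.JobControllerClient(
    client_options={"api_endpoint": f"{region}-dataproc.googleapis.com:443"}
)

job = {
    "placement": {"cluster_name": "analytics-cluster"},  # placeholder cluster
    "pyspark_job": {"main_python_file_uri": "gs://my-bucket/jobs/aggregate.py"},
}
operation = client.submit_job_as_operation(
    request={"project_id": "my-project", "region": region, "job": job}
)
response = operation.result()  # block until the job finishes
print(f"Job finished with state: {response.status.state.name}")
```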

Posted 3 weeks ago

Apply

3.0 - 8.0 years

14 - 24 Lacs

Chennai

Hybrid

Greetings! We have permanent opportunities for GCP Data Engineers in Chennai. Experience required: 3 years and above. Location: Chennai (Elcot - Sholinganallur). Work mode: Hybrid. Skills required: GCP Data Engineer, advanced SQL, ETL data pipelines, BigQuery, Dataflow, Bigtable, Data Fusion, Cloud Spanner, Python, Java, JavaScript. If interested, kindly share the below details along with your updated CV to Narmadha.baskar@getronics.com. Regards, Narmadha, Getronics Recruitment team

Posted 3 weeks ago

Apply

12.0 - 19.0 years

30 - 40 Lacs

Pune, Chennai, Bengaluru

Work from Office

Strong understanding of data warehousing and data modeling. Proficient understanding of distributed computing principles: Hadoop v2, MapReduce, HDFS. Strong data engineering skills on GCP cloud platforms: Airflow, Cloud Composer, Data Fusion, Dataflow, Dataproc, BigQuery. Experience building stream-processing systems using solutions such as Storm or Spark Streaming. Good knowledge of big data querying tools, such as Pig, Hive, and Impala. Experience with Spark and SQL. Knowledge of various ETL techniques and frameworks, such as Flume or Apache NiFi. Experience with various messaging systems, such as Kafka. Good understanding of Lambda Architecture, along with its advantages and drawbacks.
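The stream-processing requirement above can be sketched with Spark Structured Streaming reading from Kafka; this is illustrative only, and the broker address, topic, and checkpoint path are hypothetical (the Spark-Kafka connector package must also be on the classpath).

```python
# Windowed event counts over a Kafka topic with Spark Structured Streaming.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, window

spark = SparkSession.builder.appName("kafka-stream-demo").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "clickstream")                # placeholder topic
    .load()
    .selectExpr("CAST(value AS STRING) AS value", "timestamp")
)

# Count events per 1-minute window as a simple streaming aggregation.
counts = events.groupBy(window(col("timestamp"), "1 minute")).count()

query = (
    counts.writeStream.outputMode("complete")
    .format("console")
    .option("checkpointLocation", "/tmp/checkpoints/clickstream")
    .start()
)
query.awaitTermination()
```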

Posted 3 weeks ago

Apply

3.0 - 6.0 years

5 - 8 Lacs

Bengaluru

Work from Office

Why this job matters: We are searching for a proficient AI/ML engineer who can help us extract value from our data. The resource will be responsible for end-to-end processes, including data collection, cleaning and pre-processing, training of models, and deployment in all production and non-production environments.
What you'll be doing: Understanding business objectives and developing models that help to achieve them, along with metrics to track their progress. Analysing the ML algorithms that could be used to solve a given problem and ranking them by their success probability. Verifying data quality, and/or ensuring it via data cleaning. Supervising the data acquisition process if more data is needed. Defining validation strategies. Defining the pre-processing or feature engineering to be done on given data. Defining data augmentation pipelines. Training models and tuning their hyperparameters. Analysing the errors of the model and designing strategies to overcome them. Performing statistical analysis and fine-tuning using test results. Training and retraining systems when necessary. Strong knowledge of the model deployment pipeline (MLOps) and knowledge of AWS/GCP deployment.
Skills Required: Proven experience (4 or more years) as a Machine Learning Engineer / Artificial Intelligence Engineer or similar role. Solving business problems using machine learning algorithms, deep learning / neural network algorithms, sequential model development, and time series data modelling. Experience with computer vision techniques, convolutional neural networks (CNNs), generative AI, and large language models (LLMs). Experience with deploying models using MLOps pipelines. Proficiency in handling both structured and unstructured data, including SQL, BigQuery, and Dataproc. Hands-on experience with API development using frameworks like Flask, Django, and FastAPI. Automating business and functional operations using AIOps. Experience with cloud platforms such as GCP and AWS, and tools like Qlik (an added advantage). Understanding of data structures, data modelling, and software architecture. Expertise in visualizing and manipulating big datasets. Deep knowledge of math, probability, statistics, and algorithms. Proficiency with Python and basic machine learning libraries such as scikit-learn and pandas. Knowledge of R or Java is a plus. Proficiency in TensorFlow or Keras and OpenCV is a plus. Excellent communication skills. Team player. Outstanding analytical and problem-solving skills. Familiarity with the Linux environment. Low-to-medium familiarity with JIRA, Git, Nexus, Jenkins, etc. is a plus. Minimum educational qualification: BE/B.Tech or a similar degree in a relevant field.
The skills you'll need: Troubleshooting, Agile Development, Database Design/Development, Debugging, Programming/Scripting, Microservices/Service-Oriented Architecture, Version Control, IT Security, Cloud Computing, Continuous Integration/Continuous Deployment, Automation & Orchestration, Software Testing, Application Development, Algorithm Design, Software Development Lifecycle, Decision Making, Growth Mindset, Inclusive Leadership
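A compact, hedged illustration of the "training models and tuning their hyperparameters" responsibility, using scikit-learn on a toy dataset; the model and parameter grid are arbitrary choices for demonstration.

```python
# Hyperparameter tuning with cross-validated grid search in scikit-learn.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Grid-search two hyperparameters with 5-fold cross-validation.
search = GridSearchCV(
    RandomForestClassifier(random_state=42),
    param_grid={"n_estimators": [50, 100], "max_depth": [3, 5, None]},
    cv=5,
)
search.fit(X_train, y_train)

print("Best params:", search.best_params_)
print("Test accuracy:", search.score(X_test, y_test))
```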

Posted 3 weeks ago

Apply

5.0 - 10.0 years

18 - 25 Lacs

Sholinganallur

Hybrid

Skills Required: BigQuery, Bigtable, Dataflow, Pub/Sub, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, App Engine, Airflow, Cloud Storage, Cloud Spanner. Skills Preferred: ETL.
Experience Required: • 5+ years of experience in data engineering, with a focus on data warehousing and ETL development (including data modelling, ETL processes, and data warehousing principles). • 5+ years of SQL development experience. • 3+ years of cloud experience (GCP preferred) with solutions designed and implemented at production scale. • Strong understanding and experience of key GCP services, especially those related to data processing (batch/real-time), leveraging Terraform, BigQuery, Dataflow, Data Fusion, Dataproc, Cloud Build, Airflow, and Pub/Sub, alongside storage including Cloud Storage, Bigtable, and Cloud Spanner. • Experience developing with microservice architecture on a container orchestration framework. • Designing pipelines and architectures for data processing. • Excellent problem-solving skills, with the ability to design and optimize complex data pipelines. • Strong communication and collaboration skills, capable of working effectively with both technical and non-technical stakeholders as part of a large, global, and diverse team. • Strong evidence of self-motivation to continuously develop your own engineering skills and those of the team. • Proven record of working autonomously in areas of high ambiguity, without day-to-day supervisory support. • Evidence of a proactive mindset to problem solving and willingness to take the initiative. • Strong prioritization, coordination, organizational, and communication skills, and a proven ability to balance workload and competing demands to meet deadlines. Thanks & Regards, Varalakshmi V, 9019163564

Posted 3 weeks ago

Apply

3.0 - 5.0 years

5 - 5 Lacs

Kochi, Thiruvananthapuram

Work from Office

Role Proficiency: Independently develops error-free code with high-quality validation of applications; guides other developers and assists Lead 1 - Software Engineering.
Outcomes: Understand and provide input to the application/feature/component designs, developing the same in accordance with user stories/requirements. Code, debug, test, document, and communicate product/component/features at development stages. Select appropriate technical options for development, such as reusing, improving, or reconfiguring existing components. Optimise efficiency, cost, and quality by identifying opportunities for automation/process improvements and agile delivery models. Mentor Developer 1 - Software Engineering and Developer 2 - Software Engineering to effectively perform in their roles. Identify problem patterns and improve the technical design of the application/system. Proactively identify issues/defects/flaws in module/requirement implementation. Assist Lead 1 - Software Engineering on technical design. Review activities and begin demonstrating Lead 1 capabilities in making technical decisions.
Measures of Outcomes: Adherence to engineering process and standards (coding standards). Adherence to schedule/timelines. Adherence to SLAs where applicable. Number of defects post delivery. Number of non-compliance issues. Reduction of reoccurrence of known defects. Quick turnaround of production bugs. Meeting the defined productivity standards for the project. Number of reusable components created. Completion of applicable technical/domain certifications. Completion of all mandatory training requirements.
Outputs Expected: Code: develop code independently for the above. Configure: implement and monitor the configuration process. Test: create and review unit test cases, scenarios, and execution. Domain relevance: develop features and components with a good understanding of the business problem being addressed for the client. Manage Project: manage module-level activities. Manage Defects: perform defect RCA and mitigation. Estimate: estimate time, effort, and resource dependence for one's own work and others' work, including modules. Document: create documentation for one's own work and perform peer review of documentation of others' work. Manage knowledge: consume and contribute to project-related documents, SharePoint libraries, and client universities. Status Reporting: report the status of assigned tasks and comply with project-related reporting standards/processes. Release: execute the release process. Design: LLD for multiple components. Mentoring: mentor juniors on the team; set FAST goals and provide feedback on the FAST goals of mentees.
Skill Examples: Explain and communicate the design/development to the customer. Perform and evaluate test results against product specifications. Develop user interfaces, business software components, and embedded software components. Manage and guarantee high levels of cohesion and quality. Use data models. Estimate effort and resources required for developing/debugging features/components. Perform and evaluate tests in the customer or target environment. Team player. Good written and verbal communication abilities. Proactively ask for help and offer help.
Knowledge Examples: Appropriate software programs/modules. Technical designing. Programming languages. DBMS. Operating systems and software platforms. Integrated development environments (IDE). Agile methods. Knowledge of the customer domain and the sub-domain where the problem is solved.
Additional Comments: UST is looking for Java senior developers to build end-to-end business solutions and to work with one of the leading financial services organizations in the UK. The ideal candidate must possess a strong background in frontend and backend development technologies, along with excellent written and verbal communication skills and the ability to collaborate effectively with domain experts and technical experts in the team.
Responsibilities: As a Java developer, you will: maintain active relationships with the Product Owner to understand business requirements, lead requirement-gathering meetings, and review designs with the product owner; own backlog items and coordinate with other team members to develop the features planned for each sprint; perform technical design reviews and code reviews; mentor, lead, and guide the team on technical skills; be responsible for prototyping, developing, and troubleshooting software in the user interface or service layers; perform peer reviews on source code to ensure reuse, scalability, and the use of best practices; participate in collaborative technical discussions that focus on software user experience, design, architecture, and development; and perform demonstrations for client stakeholders on project features and sub-features, utilizing the latest frontend and backend development technologies.
Requirements: 5+ years of experience in Java/JEE development. Skills in developing applications using multi-tier architecture. 2+ years of experience in GCP service development is preferred, as are skills in developing applications on GCP. Should be an expert in Cloud Composer, Dataflow, Dataproc, Cloud Pub/Sub, and DAG creation. Python scripting knowledge is preferred. Apache Beam knowledge is mandatory. Java/JEE, Spring, Spring Boot, REST/SOAP web services, Hibernate, SQL, Tomcat, application servers (WebSphere), SONAR, Agile, AJAX, Jenkins. Skills in UML, application design/architecture, and design patterns. Skills in unit testing applications using JUnit or similar technologies. Capability to support QA teams with test plans, root cause analysis, and defect fixing. Strong experience in responsive design and cross-browser web applications. Strong knowledge of web service models. Strong knowledge in creating and working with APIs. Experience with cloud services, specifically Google Cloud. Strong exposure to Agile and Scaled Agile based development models. Familiar with interfaces such as REST web services, Swagger profiles, and JSON payloads. Familiar with tools/utilities such as Bitbucket, Jira, and Confluence.
Required Skills: Java, Spring, Spring Boot, Microservices

Posted 3 weeks ago

Apply

8.0 - 13.0 years

14 - 24 Lacs

Chennai

Hybrid

Greetings from Getronics! Solid experience designing, building, and maintaining cloud-based data platforms and infrastructure. Deep proficiency in GCP cloud services, including significant experience with BigQuery, Cloud Storage, Dataproc, Apigee, Cloud Run, Google Kubernetes Engine (GKE), Postgres, Artifact Registry, Secret Manager, and access management (IAM). Hands-on experience implementing and managing CI/CD pipelines using tools like Tekton and potentially Astronomer. Strong experience with job scheduling and workflow orchestration using Airflow. Proficiency with version control systems, specifically Git. Strong programming skills in Python. Expertise in SQL and experience with relational databases like SQL Server, MySQL, and PostgreSQL. Experience with, or knowledge of, data visualization tools like Power BI. Familiarity with code quality and security scanning tools such as FOSSA and SonarQube. Foundational knowledge of artificial intelligence and machine learning concepts and workflows. Problem-solving skills and the ability to troubleshoot complex distributed systems. Strong communication and collaboration skills. Knowledge of other cloud providers (AWS, Azure).
Skills Required: GCP, BigQuery, AI/ML. Company: Getronics (permanent role). Client: Automobile industry. Experience required: 4+ years in IT and a minimum of 3+ years in GCP data engineering/AIML. Location: Chennai (Elcot - Sholinganallur). Work mode: Hybrid. LOOKING FOR IMMEDIATE TO 30 DAYS NOTICE CANDIDATES ONLY. Thanks, Durga.

Posted 3 weeks ago

Apply

8.0 - 13.0 years

14 - 24 Lacs

Chennai

Hybrid

Greetings from Getronics! We have permanent opportunities for GCP Data Engineers in Chennai. Company: Getronics (permanent role). Client: Automobile industry. Experience required: 8+ years in IT and a minimum of 4+ years in GCP data engineering. Location: Chennai (Elcot - Sholinganallur). Work mode: Hybrid.
Position Description: We are currently seeking a seasoned GCP Cloud Data Engineer with 4+ years of experience in leading/implementing GCP data projects, preferably having implemented a complete data-centric model. This position is to design and deploy a data-centric architecture in GCP for a Materials Management platform, which would get/give data from multiple modern and legacy applications in Product Development, Manufacturing, Finance, Purchasing, N-Tier Supply Chain, and Supplier Collaboration. Design and implement data-centric solutions on Google Cloud Platform (GCP) using various GCP tools like Storage Transfer Service, Cloud Data Fusion, Pub/Sub, Dataflow, cloud compression, Cloud Scheduler, gsutil, FTP/SFTP, Dataproc, Bigtable, etc. • Build ETL pipelines to ingest data from heterogeneous sources into our system. • Develop data processing pipelines using programming languages like Java and Python to extract, transform, and load (ETL) data. • Create and maintain data models, ensuring efficient storage, retrieval, and analysis of large datasets. • Deploy and manage databases, both SQL and NoSQL, such as Bigtable, Firestore, or Cloud SQL, based on project requirements. • Collaborate with cross-functional teams to understand data requirements and design scalable solutions that meet business needs. • Implement security measures and data governance policies to ensure the integrity and confidentiality of data. • Optimize data workflows for performance, reliability, and cost-effectiveness on the GCP infrastructure.
Skills Required: GCP Data Engineer, Hadoop, Spark/PySpark, and Google Cloud Platform services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine. 8+ years of professional experience in data engineering, data product development, and software product launches. 4+ years of cloud data engineering experience building scalable, reliable, and cost-effective production batch and streaming data pipelines using: data warehouses like Google BigQuery; workflow orchestration tools like Airflow; relational database management systems like MySQL, PostgreSQL, and SQL Server; real-time data streaming platforms like Apache Kafka and GCP Pub/Sub.
Education Required: Any Bachelor's degree. LOOKING FOR IMMEDIATE TO 30 DAYS NOTICE CANDIDATES ONLY. Thanks, Durga.
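For the Bigtable piece of this stack, a hedged sketch with the google-cloud-bigtable client follows; the instance, table, column family, and row key are invented for illustration, and the table is assumed to already exist.

```python
# Write a row to Bigtable and read it back.
from google.cloud import bigtable

client = bigtable.Client(project="my-project")       # placeholder project
instance = client.instance("materials-instance")      # hypothetical instance
table = instance.table("parts")                       # hypothetical table

row = table.direct_row(b"part#1001")
row.set_cell("attrs", b"description", b"bracket, steel")
row.set_cell("attrs", b"qty_on_hand", b"42")
row.commit()  # send the mutation to Bigtable

# Read the row back to verify the write.
result = table.read_row(b"part#1001")
print(result.cells["attrs"][b"description"][0].value)
```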

Posted 3 weeks ago

Apply

4.0 - 8.0 years

10 - 19 Lacs

Chennai

Hybrid

Greetings from Getronics! We have permanent opportunities for GCP Data Engineers in Chennai. Company: Getronics (permanent role). Client: Automobile industry. Experience required: 4+ years in IT and a minimum of 3+ years in GCP data engineering. Location: Chennai (Elcot - Sholinganallur). Work mode: Hybrid.
Position Description: We are currently seeking a seasoned GCP Cloud Data Engineer with 3 to 5 years of experience in leading/implementing GCP data projects, preferably having implemented a complete data-centric model. This position is to design and deploy a data-centric architecture in GCP for a Materials Management platform, which would get/give data from multiple modern and legacy applications in Product Development, Manufacturing, Finance, Purchasing, N-Tier Supply Chain, and Supplier Collaboration. Design and implement data-centric solutions on Google Cloud Platform (GCP) using various GCP tools like Storage Transfer Service, Cloud Data Fusion, Pub/Sub, Dataflow, cloud compression, Cloud Scheduler, gsutil, FTP/SFTP, Dataproc, Bigtable, etc. • Build ETL pipelines to ingest data from heterogeneous sources into our system. • Develop data processing pipelines using programming languages like Java and Python to extract, transform, and load (ETL) data. • Create and maintain data models, ensuring efficient storage, retrieval, and analysis of large datasets. • Deploy and manage databases, both SQL and NoSQL, such as Bigtable, Firestore, or Cloud SQL, based on project requirements and infrastructure.
Skills Required: GCP Data Engineer, Hadoop, Spark/PySpark, and Google Cloud Platform services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine. 4+ years of professional experience in data engineering, data product development, and software product launches. 3+ years of cloud data/software engineering experience building scalable, reliable, and cost-effective production batch and streaming data pipelines using: data warehouses like Google BigQuery; workflow orchestration tools like Airflow; relational database management systems like MySQL, PostgreSQL, and SQL Server; real-time data streaming platforms like Apache Kafka and GCP Pub/Sub.
Education Required: Any Bachelor's degree. The candidate should be willing to take a GCP assessment (1-hour online video test). LOOKING FOR IMMEDIATE TO 30 DAYS NOTICE CANDIDATES ONLY. Thanks, Durga.

Posted 3 weeks ago

Apply

4.0 - 9.0 years

7 - 17 Lacs

Hyderabad

Work from Office

About this role: Wells Fargo is seeking a Senior Cloud Platform Engineer with skills in IaC (Infrastructure as Code) tools such as Terraform, Docker/OCI image creation, Kubernetes, and Helm charts, plus Python.
In this role, you will need: Understanding of cloud platform technologies (GCP preferred) in the big data and data warehousing space (BigQuery, Dataproc, Dataflow, Data Catalog, Cloud Composer/Airflow, GKE/Anthos). Hands-on experience in IaC tools such as Terraform, Docker/OCI image creation, Kubernetes, Helm charts, self-healing mechanisms, load balancing, and API gateways. In-depth knowledge of cloud tools/solutions such as Cloud Pub/Sub, GKE, and IAM, plus scalability, fault-tolerant design, availability, and BCP. Ability to quickly learn and adapt to new cloud platform technologies. Strong development experience in Python. Extensive experience working with Python API-based solution design and integration.
Required Qualifications, International: 4+ years of software engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education. Bachelor's or Master's degree in Computer Science or equivalent.
Desired Qualifications: GCP DevOps, Terraform, and Kubernetes certifications.
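Since this role pairs Python with Kubernetes, here is a small, hedged sketch that lists pods in a namespace using the official kubernetes Python client; the namespace is a placeholder and the code assumes a local kubeconfig is available.

```python
# List pods in a namespace with the kubernetes Python client.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() inside a pod

v1 = client.CoreV1Api()
pods = v1.list_namespaced_pod(namespace="data-platform")  # hypothetical namespace
for pod in pods.items:
    print(pod.metadata.name, pod.status.phase)
```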

Posted 3 weeks ago

Apply

3.0 - 6.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Capgemini Invent: Capgemini Invent is the digital innovation, consulting, and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science, and creative design to help CxOs envision and build what's next for their businesses.
Your Role: Should have developed or worked on at least one Gen AI project. Has data pipeline implementation experience with any of these cloud providers: AWS, Azure, GCP. Experience with cloud storage, cloud databases, cloud data warehousing, and data lake solutions like Snowflake, BigQuery, AWS Redshift, ADLS, and S3. Good knowledge of cloud compute services and load balancing. Good knowledge of cloud identity management, authentication, and authorization. Proficiency in using cloud utility functions such as AWS Lambda, AWS Step Functions, Cloud Run, Cloud Functions, and Azure Functions. Experience in using cloud data integration services for structured, semi-structured, and unstructured data, such as Azure Databricks, Azure Data Factory, Azure Synapse Analytics, AWS Glue, AWS EMR, Dataflow, and Dataproc.
Your Profile: Good knowledge of infra capacity sizing and costing of cloud services to drive optimized solution architecture, leading to optimal infra investment versus performance and scaling. Able to contribute to making architectural choices using various cloud services and solution methodologies. Expertise in programming using Python. Very good knowledge of cloud DevOps practices such as infrastructure as code, CI/CD components, and automated deployments on the cloud. Must understand networking, security, design principles, and best practices in the cloud.
What you will love about working here: We recognize the significance of flexible work arrangements to provide support. Be it remote work or flexible work hours, you will get an environment to maintain a healthy work-life balance. At the heart of our mission is your career growth. Our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies such as Generative AI.
About Capgemini: Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, cloud, and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.
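As a minimal, hedged example of the "cloud utility functions" mentioned above, the sketch below is an HTTP-triggered Google Cloud Function written with the functions-framework library; the function name and response shape are illustrative.

```python
# HTTP-triggered Cloud Function using functions-framework.
import functions_framework


@functions_framework.http
def handle_request(request):
    """Respond to an HTTP request; `request` is a Flask Request object."""
    name = request.args.get("name", "world")
    return {"message": f"hello, {name}"}, 200

# Local test (hypothetical workflow):
#   functions-framework --target handle_request
#   curl "http://localhost:8080/?name=data-engineer"
```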

Posted 3 weeks ago

Apply

7.0 - 12.0 years

0 - 0 Lacs

Chennai

Work from Office

Who we are looking for: The data engineering team's mission is to provide high availability and high resiliency as a core service to our ACV applications. The team is responsible for ETLs using different ingestion and transformation techniques. We are responsible for a range of critical tasks aimed at ensuring smooth and efficient functioning and high availability of ACV's data platforms. We are a crucial bridge between Infrastructure Operations, Data Infrastructure, Analytics, and Development teams, providing valuable feedback and insights to continuously improve platform reliability, functionality, and overall performance. We are seeking a talented data professional as a Senior Data Engineer to join our Data Engineering team. This role requires a strong focus and experience in software development, multi-cloud based technologies, and in-memory data stores, plus a strong desire to learn complex systems and new technologies. It requires a sound foundation in database and infrastructure architecture, deep technical knowledge, software development, excellent communication skills, and an action-based philosophy for solving hard software engineering problems.
What you will do: As a Data Engineer at ACV Auctions you HAVE FUN!! You will design, develop, write, and modify code. You will be responsible for the development of ETLs, application architecture, and optimizing databases and SQL queries. You will work alongside other data engineers and data scientists in the design and development of solutions to ACV's most complex software problems. It is expected that you will be able to operate in a high-performing team, that you can balance high-quality delivery with customer focus, and that you will have a record of delivering and guiding team members in a fast-paced environment. Design, develop, and maintain scalable ETL pipelines using Python and SQL to ingest, process, and transform data from diverse sources. Write clean, efficient, and well-documented code in Python and SQL. Utilize Git for version control and collaborate effectively with other engineers. Implement and manage data orchestration workflows using industry-standard orchestration tools (e.g., Apache Airflow, Prefect), as in the sketch below. Apply a strong understanding of major data structures (arrays, dictionaries, strings, trees, nodes, graphs, linked lists) to optimize data processing and storage. Support multi-cloud application development. Contribute to, influence, and set standards for all technical aspects of a product or service, including but not limited to testing, debugging, performance, and languages. Support development stages for application development and data science teams, with an emphasis on MySQL and Postgres database development. Influence company-wide engineering standards for tooling, languages, and build systems. Leverage monitoring tools to ensure high performance and availability, and work with operations and engineering to improve as required. Ensure that data development meets company standards for readability, reliability, and performance. Collaborate with internal teams on transactional and analytical schema design. Conduct code reviews, develop high-quality documentation, and build robust test suites. Respond to and troubleshoot highly complex problems quickly, efficiently, and effectively. Mentor junior data engineers. Assist with or lead technical discussions and innovation, including engineering tech talks. Assist in engineering innovations, including discovery of new technologies, implementation strategies, and architectural improvements. Participate in the on-call rotation.
What you will need: Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent work experience). Ability to read, write, speak, and understand English. 4+ years of experience programming in Python. 3+ years of experience with ETL workflow implementation (Airflow, Python). 3+ years of work with continuous integration and build tools. 3+ years of experience with cloud platforms, preferably AWS or GCP. Knowledge of database architecture, infrastructure, performance tuning, and optimization techniques. Deep knowledge of day-to-day tools and how they work, including deployments, k8s, monitoring systems, and testing tools. Proficient in databases (RDB) and SQL, and able to contribute to schema definitions. Self-sufficient debugger who can identify and solve complex problems in code. Deep understanding of major data structures (arrays, dictionaries, strings). Experience with Domain-Driven Design. Experience with containers and Kubernetes. Experience with database monitoring and diagnostic tools, preferably Datadog. Hands-on skills and the ability to drill deep into complex system design and implementation. Proficiency in SQL query writing and optimization. Experience with database security principles and best practices. Experience with in-memory data processing. Experience working with data warehousing concepts and technologies, including dimensional modeling and ETL frameworks. Strong communication and collaboration skills, with the ability to work effectively in a fast-paced global team environment. Experience working with: SQL data-layer development and OLTP schema design; using and integrating with cloud services, specifically AWS RDS, Aurora, S3, and GCP; GitHub, Jenkins, Python, Docker, and Kubernetes.
Nice to Have Qualifications: Experience with Airflow, Docker, Visual Studio, PyCharm, Redis, Kubernetes, Fivetran, Spark, Dataflow, Dataproc, and EMR. Experience with database monitoring and diagnostic tools, preferably Datadog. Hands-on experience with Kafka or other event-streaming technologies. Hands-on experience with microservice architecture.
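Since the role names Apache Airflow or Prefect for orchestration, here is a hedged Prefect 2 sketch of a tiny extract-transform-load flow; the data source and destination are stand-ins, not systems from the posting.

```python
# A tiny ETL flow with Prefect 2: extract -> transform -> load.
from prefect import flow, task


@task(retries=2)
def extract() -> list[dict]:
    # Stand-in for pulling rows from an API or database.
    return [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 12.5}]


@task
def transform(rows: list[dict]) -> list[dict]:
    # Derive an integer cents column from the float amount.
    return [{**r, "amount_cents": int(r["amount"] * 100)} for r in rows]


@task
def load(rows: list[dict]) -> None:
    # Stand-in for an insert into MySQL/Postgres.
    for r in rows:
        print("loading", r)


@flow(log_prints=True)
def etl():
    load(transform(extract()))


if __name__ == "__main__":
    etl()
```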

Posted 3 weeks ago

Apply

4.0 - 9.0 years

11 - 19 Lacs

Chennai

Work from Office

Role & responsibilities: Python, Dataproc, Airflow, PySpark, Cloud Storage, DBT, Dataform, NAS, Pub/Sub, Terraform, API, BigQuery, Data Fusion, GCP, Tekton. Preferred candidate profile: Data Engineer in Python - GCP. Location: Chennai only. 4+ years of experience.

Posted 3 weeks ago

Apply