16 - 25 years
18 - 27 Lacs
Bengaluru
Work from Office
Skill required: Tech for Operations - Artificial Intelligence (AI)
Designation: AI/ML Computational Science Sr Manager
Qualifications: Any Graduation
Years of Experience: 16 to 25 years
What would you do?
You will be part of the Technology for Operations (TFO) team, which acts as a trusted advisor and partner to Accenture Operations. The team provides innovative and secure technologies that help clients build an intelligent operating model and drive exceptional results, working closely with the sales, offering, and delivery teams to identify and build innovative solutions. Major sub-deals include AHO (Application Hosting Operations), ISMT (Infrastructure Management), and Intelligent Automation.
In Artificial Intelligence, you will enhance business results by using AI tools and techniques to perform tasks that normally require human intelligence, such as visual perception, speech recognition, decision-making, and translation between languages.
What are we looking for?
Machine Learning
Process-orientation
Thought leadership
Commitment to quality
Roles and Responsibilities:
In this role you are required to identify and assess complex problems for your area(s) of responsibility. You should create solutions in situations where analysis requires in-depth knowledge of organizational objectives. The role involves setting strategic direction to establish near-term goals for your area(s) of responsibility. Interaction is with senior management levels at a client and/or within Accenture, involving negotiating or influencing on significant matters. You should have latitude in decision-making and in determining objectives and approaches to critical assignments. Your decisions have a lasting impact on your area of responsibility, with the potential to affect areas outside it. You will manage large teams and/or work efforts (if in an individual contributor role) at a client or within Accenture. Please note that this role may require you to work in rotational shifts.
Qualifications: Any Graduation
Posted 1 month ago
4 - 9 years
16 - 31 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Role & responsibilities:
Execute project-specific development activities in accordance with applicable standards and quality parameters.
Develop and review code, and set up the right environment for projects.
Ensure delivery within schedule by adhering to engineering and quality standards.
Own and deliver end-to-end projects within GCP for the Payments Data Platform.
Be available on the support rota one week per month for GCP 24x7 on-call.
Basic knowledge of Payments ISO standards, message types, etc.
Able to work under pressure on deliverables (P1 violations, incidents).
Fluent and clear in written and verbal communication.
Able to follow Agile ways of working.
Must have hands-on experience with Java, GCP, and shell scripting; Python knowledge is a plus.
In-depth knowledge of Java and Spring Boot.
Experience with GCP Dataflow, Bigtable, BigQuery, etc.
Experience managing large databases.
Experience with requirements, design, and development of event-driven and near-real-time data patterns (ingress/egress).
Preferred candidate profile
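The "event-driven and near-real-time" ingestion pattern this posting asks about can be illustrated with a minimal, hypothetical Python sketch (the function and parameter names are illustrative, not part of any GCP API): grouping a stream of events into micro-batches by size or elapsed time, the same idea a Dataflow job applies when moving Pub/Sub messages into BigQuery.

```python
import time

def micro_batch(events, max_batch_size=3, max_wait_s=1.0, now=time.monotonic):
    """Group an event stream into micro-batches, flushing when a batch
    reaches max_batch_size or has been open for max_wait_s seconds."""
    batches, current, started = [], [], now()
    for event in events:
        if not current:
            started = now()  # a new batch starts its own timer
        current.append(event)
        if len(current) >= max_batch_size or now() - started >= max_wait_s:
            batches.append(current)
            current = []
    if current:  # flush whatever is left when the stream ends
        batches.append(current)
    return batches
```

In a real pipeline the size and wait thresholds trade latency against per-insert overhead; the `now` parameter is injected here only so the timing logic can be tested deterministically.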
Posted 1 month ago
7 - 12 years
13 - 17 Lacs
Gurugram
Work from Office
Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Ability to analyse data for functional business requirements and front-face the customer.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform. Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer. You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies. Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.
Preferred technical and professional experience:
Intuitive individual with an ability to manage change and proven time management. Proven interpersonal skills while contributing to team effort by accomplishing related results as needed. Keeps technical knowledge up to date by attending educational workshops and reviewing publications.
Posted 1 month ago
7 - 10 years
16 - 21 Lacs
Mumbai
Work from Office
Position Overview:
The Google Cloud Data Engineering Lead role is ideal for an experienced Google Cloud Data Engineer who will drive the design, development, and optimization of data solutions on the Google Cloud Platform (GCP). The candidate will lead a team of data engineers and collaborate with data scientists, analysts, and business stakeholders to enable scalable, secure, and high-performance data pipelines and analytics platforms.
Key Responsibilities:
Lead and manage a team of data engineers delivering end-to-end data pipelines and platforms on GCP.
Design and implement robust, scalable, and secure data architectures using services such as BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud Storage.
Develop and maintain batch and real-time ETL/ELT workflows using tools such as Apache Beam, Dataflow, or Composer (Airflow).
Collaborate with data scientists, analysts, and application teams to gather requirements and ensure data availability and quality.
Define and enforce data engineering best practices, including version control, testing, code reviews, and documentation.
Drive automation and infrastructure-as-code approaches using Terraform or Deployment Manager for provisioning GCP resources.
Implement and monitor data quality, lineage, and governance frameworks across the data platform.
Optimize query performance and storage strategies, particularly within BigQuery and other GCP analytics tools.
Mentor team members and contribute to the growth of technical capabilities across the organization.
Qualifications:
Education: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
Experience: 7+ years of experience in data engineering, including 3+ years working with GCP data services. Proven leadership experience in managing and mentoring data engineering teams.
Skills: Expert-level understanding of BigQuery, Dataflow (Apache Beam), Cloud Storage, and Pub/Sub. Strong SQL and Python skills for data processing and orchestration. Experience with workflow orchestration tools (Airflow/Composer). Hands-on experience with CI/CD, Git, and infrastructure-as-code tools (e.g., Terraform). Familiarity with data security, governance, and compliance practices in cloud environments.
Certifications: GCP Professional Data Engineer certification.
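The ETL/ELT workflows described above chain independent transform stages, much as Apache Beam chains PTransforms in a Dataflow job. A hedged, framework-free Python sketch of that composition idea (all stage and field names here are hypothetical, not from any posting or the Beam API):

```python
def run_pipeline(records, *stages):
    """Apply each stage (an iterable -> iterable callable) in order,
    mimicking how a Beam pipeline chains transforms."""
    for stage in stages:
        records = stage(records)
    return list(records)

# Hypothetical stages: parse raw CSV rows, drop invalid records,
# and shape the survivors for a warehouse load.
def parse(rows):
    for row in rows:
        cols = row.split(",")
        yield {"id": cols[0], "amount": float(cols[1])}

def valid(recs):
    return (r for r in recs if r["amount"] > 0)

def to_warehouse_row(recs):
    return ({"id": r["id"], "amount_cents": int(r["amount"] * 100)} for r in recs)
```

Because each stage is a generator, records stream through without materializing intermediate lists, which is the same property that lets Beam scale the identical pipeline shape from batch to streaming.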
Posted 1 month ago
11 - 21 years
25 - 40 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Warm greetings from SP Staffing Services Private Limited! We have an urgent opening with our CMMI Level 5 client for the position below. Please send your updated profile if you are interested.
Relevant Experience: 3 - 20 Yrs
Location: Pan India
Job Description:
Skills: GCP, BigQuery, Cloud Composer, Cloud Data Fusion, Python, SQL
5-20 years of overall experience, mainly in the data engineering space. 2+ years of hands-on experience in GCP cloud data implementation. Experience working in client-facing roles in a technical capacity as an Architect. Must have implementation experience of a GCP-based cloud data project/program as a solution architect. Proficiency in using the Google Cloud Architecture Framework in a data context. Expert knowledge and experience of the core GCP data stack, including BigQuery, Dataproc, Dataflow, Cloud Composer, etc. Exposure to the overall Google tech stack of Looker/Vertex AI/Dataplex, etc. Expert-level knowledge of Spark. Extensive hands-on experience working with data using SQL and Python. Strong experience and understanding of very large-scale data architecture, solutioning, and operationalization of data warehouses, data lakes, and analytics platforms (both cloud and on-premise). Excellent communication skills with the ability to clearly present ideas, concepts, and solutions.
If interested, please forward your updated resume to sankarspstaffings@gmail.com / Sankar@spstaffing.in or reach me @ 8939853050.
With Regards,
Sankar G
Sr. Executive - IT Recruitment
Posted 1 month ago
5 - 10 years
9 - 13 Lacs
Chennai
Work from Office
Overview, Responsibilities, and Requirements: all GCP services (Pub/Sub, BigQuery, Airflow, Dataproc, Cloud Composer, GCS) plus Teradata.
Posted 1 month ago
4 - 8 years
15 - 30 Lacs
Bengaluru
Remote
Job Title: Senior GCP Data DevOps Engineer
Job Type: Remote
Exp: 4+ years
Position Overview:
As a Senior DevOps Engineer specializing in Google Cloud Platform (GCP), you will play a crucial role in designing, implementing, and managing our cloud infrastructure to ensure optimal performance, scalability, and reliability. You will collaborate closely with cross-functional teams to streamline development processes, automate deployment pipelines, and enhance overall system efficiency.
Responsibilities:
Design, implement, and manage scalable and highly available cloud infrastructure on Google Cloud Platform (GCP) to support our applications and services.
Develop and maintain CI/CD pipelines to automate the deployment, testing, and monitoring of applications and microservices.
Collaborate with software engineering teams to optimize application performance, troubleshoot issues, and ensure smooth deployment processes.
Implement and maintain infrastructure as code (IaC) using tools such as Terraform, Ansible, or Google Deployment Manager.
Monitor system health, performance, and security metrics, and implement proactive measures to ensure reliability and availability.
Implement best practices for security, compliance, and data protection in cloud environments.
Continuously evaluate emerging technologies and industry trends to drive innovation and improve infrastructure efficiency.
Mentor junior team members and provide technical guidance and support as needed.
Qualifications:
Bachelor's degree in Computer Science, Engineering, or a related field.
4-8 years of experience in a DevOps role, with a focus on Google Cloud Platform (GCP).
In-depth knowledge of GCP services such as Compute Engine, Kubernetes Engine, Cloud Storage, Cloud SQL, Pub/Sub, and BigQuery.
Proficiency in scripting languages such as Python, Bash, or PowerShell.
Experience with containerization technologies such as Docker and container orchestration platforms like Kubernetes.
Strong understanding of CI/CD concepts and experience with CI/CD tools such as Jenkins, GitLab CI/CD, or CircleCI.
Solid understanding of infrastructure as code (IaC) principles and experience with tools such as Terraform, Ansible, or Google Deployment Manager.
Experience with monitoring and logging tools such as Prometheus, Grafana, Stackdriver, or the ELK Stack.
Knowledge of security best practices and experience implementing security controls in cloud environments.
Excellent problem-solving skills and ability to troubleshoot complex issues in distributed systems.
Strong communication skills and ability to collaborate effectively with cross-functional teams.
Preferred Qualifications:
Google Cloud certification (e.g., Professional Cloud DevOps Engineer, Professional Cloud Architect).
Experience with other cloud platforms such as AWS or Azure.
Familiarity with agile methodologies and DevOps practices.
Experience with software development using languages such as Java, Node.js, or Go.
Knowledge of networking concepts and experience with configuring network services in cloud environments.
Skills: GCP, Cloud SQL, BigQuery, Kubernetes, IaC tools, CI/CD pipelines, Terraform, Python, Airflow, Snowflake, Power BI, Dataflow, Pub/Sub, Cloud Storage, Cloud Computing
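The health-monitoring-with-proactive-measures responsibility above usually boils down to a probe loop with exponential backoff before an alert fires. A minimal, hypothetical Python sketch (function and parameter names are illustrative; the probe is injected so the same logic could wrap an HTTP, TCP, or gRPC check):

```python
import time

def check_with_backoff(probe, attempts=4, base_delay=0.5, sleep=time.sleep):
    """Call probe() until it returns True, doubling the delay between
    tries. Returns (healthy, tries_used) so callers can alert on the
    pair rather than on a single failed probe."""
    delay = base_delay
    for attempt in range(1, attempts + 1):
        if probe():
            return True, attempt
        if attempt < attempts:
            sleep(delay)   # back off before retrying
            delay *= 2
    return False, attempts
```

Injecting `sleep` keeps the backoff schedule testable without real waiting; in production the defaults would be tuned to the service's SLO, and the `(False, attempts)` result would feed an alerting tool such as those the posting lists.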
Posted 1 month ago
5 - 10 years
13 - 23 Lacs
Mangalore, Bengaluru
Work from Office
Position Overview
We are looking for a Technical Lead with hands-on experience in React, Node.js, and cloud platforms like AWS or Azure. You'll drive the development of scalable, high-performance systems using modern architectures, collaborate on migration strategies, and build robust APIs. Strong knowledge of cloud services, containerization, and IoT technologies is essential.
Job Role: Technical Lead
Job Type: Full Time
Experience: Minimum 5+ years
Job Location: Bangalore/Mangalore
Technical Skills: AWS Cloud, Azure Cloud, TypeScript, Node, React
About Us: We are a multi-award-winning creative engineering company. Since 2011, we have worked with our customers as a design and technology enablement partner, helping them on their digital transformation journey.
Roles and Responsibilities:
Evaluate existing systems and propose enhancements to improve efficiency, security, and scalability.
Create technical documentation and architectural guidelines for the development team.
Experience in developing software platforms using event-driven architecture.
Develop high-performance, high-throughput systems.
Ability to define, track, and deliver items to schedule.
Collaborate with cross-functional teams to define migration strategies, timelines, and milestones.
Technical Skills:
Hands-on experience in React and Node.
Hands-on experience with at least one cloud provider such as AWS, GCP, or Azure.
Proficiency with multiple databases, including SQL and NoSQL.
Highly skilled at facilitating and documenting requirements.
Experience developing REST APIs with JSON and XML for data transfer.
Ability to develop both internal-facing and external-facing APIs using JWT and OAuth 2.0.
Good understanding of cloud technologies such as Docker, Kubernetes, MQTT, EKS, Lambda, IoT Core, and Kafka.
Good understanding of messaging systems like SQS and Pub/Sub.
Ability to establish priorities and proceed with objectives without supervision.
Familiar with HA/DR, scalability, performance, and code optimization.
Good organizational skills and the ability to work on more than one project at a time.
Exceptional attention to detail and good communication skills.
Experience with Amazon Web Services, JIRA, Confluence, Git, and Bitbucket.
Other Skills:
Experience working with Go and Python.
Good understanding of IoT systems.
Exposure to or knowledge of the energy industry.
What we offer:
A competitive salary and comprehensive benefits package.
The opportunity to work on international projects and cutting-edge technology.
A dynamic work environment that promotes professional growth, continuous learning, and mentorship.
If you are passionate about working in a collaborative and challenging environment, apply now!
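The JWT-secured APIs this posting mentions rest on a simple mechanism: sign `header.payload` with a shared secret, and verify by recomputing the signature. A stdlib-only Python sketch of HS256 signing as defined in RFC 7519 (a learning aid, not a substitute for a vetted JWT library, and it omits standard claims like `exp`):

```python
import base64
import hashlib
import hmac
import json

def _b64url(data: bytes) -> str:
    # JWT uses URL-safe base64 with padding stripped
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_jwt(payload: dict, secret: str) -> str:
    """Produce a compact HS256 JWT: header.payload.signature."""
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = _b64url(json.dumps(payload).encode())
    sig = hmac.new(secret.encode(), f"{header}.{body}".encode(), hashlib.sha256).digest()
    return f"{header}.{body}.{_b64url(sig)}"

def verify_jwt(token: str, secret: str):
    """Return the payload if the signature checks out, else None."""
    header, body, sig = token.split(".")
    expected = hmac.new(secret.encode(), f"{header}.{body}".encode(), hashlib.sha256).digest()
    if not hmac.compare_digest(_b64url(expected), sig):
        return None
    padded = body + "=" * (-len(body) % 4)  # restore stripped padding
    return json.loads(base64.urlsafe_b64decode(padded))
```

`hmac.compare_digest` is used instead of `==` to avoid timing side channels; that detail is exactly the kind of thing an external-facing API reviewer would look for.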
Posted 1 month ago