3.0 - 5.0 years
5 - 7 Lacs
Pune, Ahmedabad
Work from Office
We are seeking a skilled and motivated Google / AWS Cloud DevOps Engineer with over 3 years of hands-on experience in building and maintaining scalable, reliable, and secure cloud infrastructure. You will be part of a dynamic team focused on delivering robust DevOps solutions on Google Cloud Platform (GCP) and AWS, helping streamline CI/CD pipelines, automate infrastructure provisioning, and optimize cloud-based deployments.

Key Responsibilities:
- Design, implement, and manage scalable and secure infrastructure on Google Cloud Platform / AWS.
- Develop and maintain CI/CD pipelines using tools such as Cloud Build, Jenkins, GitLab CI/CD, or similar.
- Implement infrastructure as code (IaC) using Terraform or Pulumi.
- Monitor system health and performance using AWS monitoring tools and Google Cloud's operations suite (formerly Stackdriver).
- Automate manual processes to improve system reliability and deployment frequency (a minimal sketch of this kind of automation follows the listing).
- Collaborate with software engineers to ensure DevOps best practices are followed in application development and deployment.
- Handle incident response and root cause analysis for production issues.
- Ensure compliance with security and governance policies on AWS / GCP.
- Optimize cost and resource utilization across cloud services.

Required Qualifications:
- 3+ years of hands-on experience with DevOps tools and practices in a cloud environment.
- Strong experience with Google Cloud Platform (GCP) / AWS services (Compute Engine, Kubernetes Engine, Cloud Functions, Cloud Storage, VPC, etc.).
- Google / AWS Professional Cloud DevOps Engineer certification is mandatory.
- Proficiency with CI/CD tools and version control systems (e.g., Git, GitHub/GitLab, Cloud Build).
- Solid scripting skills in Bash, Python, or similar languages.
- Experience with Docker and Kubernetes.
- Familiarity with monitoring/logging tools such as Prometheus, Grafana, and Cloud Monitoring.
- Knowledge of networking, security best practices, and IAM on GCP / AWS.

Preferred Qualifications:
- Experience with multi-cloud or hybrid cloud environments.
- Familiarity with Agile and DevOps culture and practices.
- Experience with serverless architectures and event-driven design patterns.
- Knowledge of cost optimization and GCP/AWS billing.
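As a hedged illustration of the kind of automation and governance checks described above, the following Python sketch lists Compute Engine instances and flags any running instance missing a required cost-center label. The project, zone, and label key are placeholder assumptions, and the google-cloud-compute client library is assumed to be installed; this is a sketch, not the employer's actual tooling.

from google.cloud import compute_v1

# Assumed placeholders: substitute your own project, zone, and label policy.
PROJECT_ID = "example-project"
ZONE = "us-central1-a"
REQUIRED_LABEL = "cost-center"

def find_unlabeled_instances(project_id: str, zone: str, label_key: str) -> list[str]:
    """Return names of running instances missing the required governance label."""
    client = compute_v1.InstancesClient()
    missing = []
    for instance in client.list(project=project_id, zone=zone):
        if instance.status == "RUNNING" and label_key not in dict(instance.labels):
            missing.append(instance.name)
    return missing

if __name__ == "__main__":
    for name in find_unlabeled_instances(PROJECT_ID, ZONE, REQUIRED_LABEL):
        print(f"Instance without '{REQUIRED_LABEL}' label: {name}")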
Posted 13 hours ago
1.0 - 3.0 years
2 - 4 Lacs
Kolkata
Hybrid
Required Skills:
- Strong proficiency in Python (3.x) and Django (2.x/3.x/4.x)
- Hands-on experience with Django REST Framework (DRF); a minimal API sketch follows this listing
- Expertise in relational databases like PostgreSQL or MySQL
- Proficiency with Git and Bitbucket
- Solid understanding of RESTful API design and integration
- Experience in domain pointing and hosting setup on AWS or GCP
- Deployment knowledge on EC2, GCP Compute Engine, etc.
- SSL certificate installation and configuration
- Familiarity with CI/CD pipelines (GitHub Actions, Bitbucket Pipelines, GitLab CI)
- Basic usage of Docker for development and containerization
- Ability to independently troubleshoot server/deployment issues
- Experience managing cloud resources like S3, Load Balancers, and IAM roles

Preferred Skills:
- Experience with Celery and Redis / RabbitMQ for asynchronous task handling
- Familiarity with front-end frameworks like React or Vue.js
- Exposure to Cloudflare or similar CDN/DNS tools
- Experience with monitoring tools: Prometheus, Grafana, Sentry, or CloudWatch

Why Join Us?
- Work on impactful and modern web solutions
- Growth opportunities across technologies and cloud platforms
- Collaborative, inclusive, and innovation-friendly work environment
- Exposure to challenging and rewarding projects
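By way of illustration of the DRF skills listed above, here is a minimal, hedged sketch of a model serializer and viewset wired to a router. The Article model, its fields, and the URL prefix are hypothetical placeholders, not anything specified in the posting.

# Hypothetical example assuming a Django project with an `Article` model
# (title, body, created_at) already defined in articles/models.py.
from rest_framework import routers, serializers, viewsets

from articles.models import Article


class ArticleSerializer(serializers.ModelSerializer):
    class Meta:
        model = Article
        fields = ["id", "title", "body", "created_at"]


class ArticleViewSet(viewsets.ModelViewSet):
    """Provides list/retrieve/create/update/delete endpoints for articles."""
    queryset = Article.objects.all()
    serializer_class = ArticleSerializer


# urls.py: register the viewset so DRF generates the RESTful routes.
router = routers.DefaultRouter()
router.register(r"articles", ArticleViewSet)
urlpatterns = router.urls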
Posted 4 days ago
5.0 - 10.0 years
7 - 12 Lacs
Mumbai
Work from Office
We are looking for an experienced and motivated Senior GCP Data Engineer to join our dynamic data team. In this role, you will be responsible for designing, building, and optimizing data pipelines, implementing advanced analytics solutions, and maintaining robust data infrastructure using Google Cloud Platform (GCP) services. You will play a key role in enabling data-driven decision-making and enhancing the performance and scalability of our data ecosystem.

Key Responsibilities:
- Design, implement, and optimize data pipelines using Google Cloud Platform (GCP) services, including Compute Engine, BigQuery, Cloud Pub/Sub, Dataflow, Cloud Storage, and AlloyDB.
- Lead the design and optimization of schemas for large-scale data systems, ensuring data consistency, integrity, and scalability.
- Work closely with cross-functional teams to understand data requirements and deliver efficient, high-performance solutions.
- Design and execute complex SQL queries for BigQuery and other databases, ensuring optimal performance and efficiency.
- Implement efficient data processing workflows and streaming data solutions using Cloud Pub/Sub and Dataflow.
- Develop and maintain data models, schemas, and data marts to ensure consistency and scalability across datasets.
- Ensure the scalability, reliability, and security of cloud-based data architectures.
- Optimize cloud storage, compute, and query performance, driving cost-effective solutions.
- Collaborate with data scientists, analysts, and software engineers to create actionable insights and drive business outcomes.
- Implement best practices for data management, including governance, quality, and monitoring of data pipelines.
- Provide mentorship and guidance to junior data engineers and collaborate with them to achieve team goals.

Required Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent work experience).
- 5+ years of experience in data engineering, with a strong focus on Google Cloud Platform (GCP).
- Extensive hands-on experience with GCP Compute Engine, BigQuery, Cloud Pub/Sub, Dataflow, Cloud Storage, and AlloyDB.
- Strong expertise in SQL for query optimization and performance tuning in large-scale datasets.
- Solid experience in designing data schemas, data pipelines, and ETL processes.
- Strong understanding of data modeling techniques, and experience with schema design for both transactional and analytical systems.
- Proven experience optimizing BigQuery performance, including partitioning, clustering, and cost optimization strategies (see the sketch after this listing).
- Experience with managing and processing streaming data and batch data processing workflows.
- Knowledge of AlloyDB for managing transactional databases in the cloud and integrating them into data pipelines.
- Familiarity with data security, governance, and compliance best practices on GCP.
- Excellent problem-solving skills, with the ability to troubleshoot complex data issues and find efficient solutions.
- Strong communication and collaboration skills, with the ability to work with both technical and non-technical stakeholders.

Preferred Qualifications:
- Bachelor's/Master's degree in Computer Science, Data Engineering, or a related field.
- Familiarity with infrastructure as code tools like Terraform or Cloud Deployment Manager.
- GCP certifications (e.g., Google Cloud Professional Data Engineer or Cloud Architect).
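To illustrate the BigQuery partitioning and clustering practices mentioned above, here is a hedged Python sketch that creates a date-partitioned, clustered table with the google-cloud-bigquery client. The project, dataset, table, and column names are assumptions for demonstration only.

from google.cloud import bigquery

# Assumed placeholders: replace with your own project, dataset, and schema.
TABLE_ID = "example-project.analytics.events"

client = bigquery.Client()

schema = [
    bigquery.SchemaField("event_date", "DATE", mode="REQUIRED"),
    bigquery.SchemaField("customer_id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("event_type", "STRING"),
    bigquery.SchemaField("amount", "NUMERIC"),
]

table = bigquery.Table(TABLE_ID, schema=schema)
# Partition by date and cluster by customer to prune scanned bytes and cut cost.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="event_date",
)
table.clustering_fields = ["customer_id", "event_type"]

table = client.create_table(table, exists_ok=True)
print(f"Created {table.full_table_id} partitioned on event_date")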
Posted 1 week ago
6.0 - 10.0 years
12 - 18 Lacs
Hyderabad
Hybrid
Role & Responsibilities

Role Overview: We are seeking a talented and forward-thinking DevOps Engineer for one of the large financial services GCCs based in Hyderabad. Responsibilities include designing, implementing, and maintaining CI/CD pipelines, monitoring system performance, automating deployments, ensuring infrastructure scalability and security, collaborating with development and IT teams, and optimizing workflow efficiency.

Technical Requirements:
- Experienced in setting and delivering DevOps strategy
- Proficient in collaborating with engineering teams to understand their needs
- Skilled in setting up, maintaining, optimizing, and evolving DevOps tooling and infrastructure
- Strong knowledge of automating development, quality engineering, deployment, and release processes
- Familiarity with Agile and Waterfall methodologies and supporting toolchains
- Ability to identify technical problems and develop effective solutions
- Hands-on experience with a variety of technologies including Git, Kubernetes, Docker, Jenkins, and scripting/programming languages (a minimal Kubernetes sketch follows this listing)
- Competence in implementing DevOps and Agile patterns such as CI/CD pipelines, source code management, automation, and infrastructure as code
- Understanding of IT management practices, software currency, and security measures
- Experience with GCP infrastructure, Terraform, and Harness for CI/CD automation and deployments
- Proficiency in team leadership, communication, and problem-solving

Functional Requirements:
- Demonstrated team leadership and DevOps experience
- Exposure to GCP infrastructure including Compute Engine, VPC, IAM, Cloud Functions, and GKE
- Hands-on experience with DevOps technologies such as Git, Kubernetes, Docker, Jenkins, SonarQube, and scripting/programming languages
- Strong organizational, time management, and multitasking skills
- Ability to work collaboratively, build relationships, and adapt to various domains and disciplines
- Passion for developing new technologies and optimizing software delivery processes
- Understanding of security compliance, networking, and firewalls
- Willingness to learn, grow, and develop within a supportive and inclusive environment
- Ability to propose new technologies and methodologies for software delivery optimization

This role offers a compelling opportunity for a seasoned DevOps Engineer to drive transformative cloud initiatives within the financial sector, leveraging extensive experience and expertise to deliver innovative cloud solutions that align with business imperatives and regulatory requirements.

Qualification: Engineering Graduate / Postgraduate

Criteria:
- Helm experience
- Networking and security (firewalls, IAM roles) experience
- Security compliance understanding
- Relevant experience: 6-9 years
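As a hedged illustration of the kind of GKE/Kubernetes operational scripting implied above, the following Python sketch uses the official kubernetes client to report pods that are not in a Running or Succeeded phase. The namespace is a placeholder assumption, and kubeconfig access to a cluster is assumed.

from kubernetes import client, config

# Assumed placeholder: namespace to inspect; kubeconfig credentials are assumed.
NAMESPACE = "production"

def report_unhealthy_pods(namespace: str) -> None:
    """Print pods whose phase is neither Running nor Succeeded."""
    config.load_kube_config()  # inside a GKE pod, use config.load_incluster_config() instead
    core = client.CoreV1Api()
    for pod in core.list_namespaced_pod(namespace).items:
        if pod.status.phase not in ("Running", "Succeeded"):
            print(f"{pod.metadata.name}: {pod.status.phase}")

if __name__ == "__main__":
    report_unhealthy_pods(NAMESPACE)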
Posted 2 weeks ago
3.0 - 5.0 years
5 - 7 Lacs
Chennai, Tamil Nadu
Work from Office
Duration: 12 Months
Work Type: Onsite

Position Description: We are seeking an experienced GCP Data Engineer who can build a cloud analytics platform to meet ever-expanding business requirements with speed and quality using lean Agile practices. You will analyze and manipulate large datasets supporting the enterprise, activating data assets to support Enabling Platforms and Analytics on the Google Cloud Platform (GCP). You will be responsible for designing the transformation and modernization on GCP, as well as landing data from source applications to GCP. Experience with large-scale solution delivery and operationalization of data warehouses, data lakes, and analytics platforms on Google Cloud Platform or another cloud environment is a must. We are looking for candidates with a broad set of technology skills across these areas who can demonstrate an ability to design the right solutions with an appropriate combination of GCP and third-party technologies for deployment on Google Cloud Platform.

Skills Required:
- Experience working in an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment.
- Implement methods for automation of all parts of the pipeline to minimize labor in development and production.
- Experience analyzing complex data, organizing raw data, and integrating massive datasets from multiple data sources to build subject areas and reusable data products.
- Experience working with architects to evaluate and productionalize appropriate GCP tools for data ingestion, integration, presentation, and reporting.
- Experience working with all stakeholders to formulate business problems as technical data requirements, identifying and implementing technical solutions while ensuring key business drivers are captured in collaboration with product management.
- Proficient in Machine Learning model architecture, data pipeline interaction, and metrics interpretation, including designing and deploying a pipeline with automated data lineage.
- Identify, develop, evaluate, and summarize Proofs of Concept to prove out solutions; test and compare competing solutions and report a point of view on the best solution.
- Integration between GCP Data Catalog and Informatica EDC.
- Design and build production data engineering solutions to deliver pipeline patterns using Google Cloud Platform (GCP) services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine (a minimal streaming-pipeline sketch follows this listing).

Skills Preferred:
- Strong drive for results and ability to multi-task and work independently
- Self-starter with proven innovation skills
- Ability to communicate and work with cross-functional teams and all levels of management
- Demonstrated commitment to quality and project timing
- Demonstrated ability to document complex systems
- Experience in creating and executing detailed test plans

Experience Required: 3 to 5 Yrs
Education Required: BE or Equivalent
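As a hedged sketch of the Pub/Sub to BigQuery pipeline pattern named above, the following Apache Beam (Dataflow) snippet reads JSON messages from a subscription and streams them into a table. The subscription path, table, and schema are illustrative assumptions only.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Assumed placeholders: swap in your own subscription, table, and schema.
SUBSCRIPTION = "projects/example-project/subscriptions/events-sub"
TABLE = "example-project:analytics.events"

def run() -> None:
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(subscription=SUBSCRIPTION)
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                TABLE,
                schema="event_date:DATE,customer_id:STRING,event_type:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )

if __name__ == "__main__":
    run()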
Posted 1 month ago
4.0 - 7.0 years
8 - 14 Lacs
Noida
Hybrid
Data Engineer (L3) || GCP Certified
Employment Type: Full-Time
Work Mode: In-office / Hybrid
Notice: Immediate joiners

As a Data Engineer, you will design, develop, and support data pipelines and related data products and platforms. Your primary responsibilities include designing and building data extraction, loading, and transformation pipelines across on-prem and cloud platforms. You will perform application impact assessments, requirements reviews, and work estimates. Additionally, you will develop test strategies and site reliability engineering measures for data products and solutions, participate in agile development "scrums" and solution reviews, mentor junior Data Engineering Specialists, lead the resolution of critical operations issues, and perform technical data stewardship tasks, including metadata management, security, and privacy by design.

Required Skills:
- Design, develop, and support data pipelines and related data products and platforms.
- Design and build data extraction, loading, and transformation pipelines and data products across on-prem and cloud platforms.
- Perform application impact assessments, requirements reviews, and work estimates.
- Develop test strategies and site reliability engineering measures for data products and solutions.
- Participate in agile development "scrums" and solution reviews.
- Mentor junior Data Engineers.
- Lead the resolution of critical operations issues, including post-implementation reviews.
- Perform technical data stewardship tasks, including metadata management, security, and privacy by design.
- Design and build data extraction, loading, and transformation pipelines using Python and other GCP data technologies.
- Demonstrate SQL and database proficiency in various data engineering tasks.
- Automate data workflows by setting up DAGs in tools like Control-M, Apache Airflow, and Prefect (a minimal Airflow sketch follows this listing).
- Develop Unix scripts to support various data operations.
- Model data to support business intelligence and analytics initiatives.
- Utilize infrastructure-as-code tools such as Terraform, Puppet, and Ansible for deployment automation.
- Expertise in GCP data warehousing technologies, including BigQuery, Cloud SQL, Dataflow, Data Catalog, Cloud Composer, Google Cloud Storage, IAM, Compute Engine, Cloud Data Fusion, and Dataproc (good to have).

Key skills: data pipelines, agile development, scrums, GCP data technologies, Python, DAGs, Control-M, Apache Airflow, data solution architecture

Qualifications:
- Bachelor's degree in Software Engineering, Computer Science, Business, Mathematics, or a related field.
- 4+ years of data engineering experience.
- 2 years of data solution architecture and design experience.
- GCP Certified Data Engineer (preferred).

Job Type: Full-time
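As a hedged illustration of the Airflow DAG work described above, here is a minimal daily DAG with a single Python task. The DAG id, schedule, and task logic are illustrative assumptions rather than anything specified in the posting.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_and_load() -> None:
    """Placeholder task body; a real DAG would call extraction/load logic here."""
    print("extract, transform, and load step would run here")

# Assumed placeholders: dag_id, start_date, and schedule are illustrative only.
with DAG(
    dag_id="example_daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    etl = PythonOperator(
        task_id="extract_and_load",
        python_callable=extract_and_load,
    )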
Posted 1 month ago