
1912 Azure Cloud Jobs - Page 39

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 8.0 years

0 Lacs

Hyderabad

Work from Office

8+ years of hands-on experience in .NET development. Strong expertise in React for front-end development. Proficiency with Microsoft Azure cloud services. Full-stack development experience with modern web technologies. Benefits: health insurance, provident fund.

Posted 1 month ago

Apply

3.0 - 8.0 years

11 - 21 Lacs

Bengaluru

Work from Office

Experience with cloud platforms (AWS, Azure, GCP). Familiarity with CI/CD pipelines and DevOps practices. Exposure to front-end technologies (HTML, CSS, JavaScript). Knowledge of containerization tools like Docker and Kubernetes. Key Responsibilities: Design, develop, and maintain Java-based applications. Write clean, efficient, and well-documented code. Collaborate with cross-functional teams to define, design, and ship new features. Troubleshoot and resolve software defects and issues. Participate in code reviews and contribute to team knowledge sharing. Ensure the performance, quality, and responsiveness of applications. Stay up to date with emerging technologies and industry trends. Required Skills and Qualifications: Bachelor's or master's degree. Proven experience in Java development (e.g., Java SE, Java EE, Java versions 8-17). Strong understanding of object-oriented programming principles. Experience with frameworks such as Spring Boot and related libraries. Familiarity with RESTful APIs and web services. Knowledge of databases (e.g., MySQL, PostgreSQL, Oracle). Proficient in version control tools like Git. Good problem-solving and analytical skills.

Posted 1 month ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Pune, Chennai

Work from Office

Job Summary: This position performs duties and tasks to support full systems life cycle management (e.g., analysis, technical requirements, design, coding, testing, and implementation of systems and applications software). He/she performs tasks within planned durations and established deadlines, collaborates with teams to ensure effective communication and support the achievement of objectives, and provides development, maintenance, and support for applications. Responsibilities: Assists with Information Systems projects. Assists in system analysis and design. Designs and develops low- to moderately complex applications. Generates application documentation. Supports integration builds. Assists with maintenance and support. Primary Skills: Azure Cloud, .NET Core, Angular 14/16, SQL Server. Secondary Skills: SAFe framework, Scrum Master. Qualifications: Bachelor's degree or international equivalent; a degree in Computer Science, Information Systems, Mathematics, Statistics, or a related field is preferred.

Posted 1 month ago

Apply

8.0 - 12.0 years

22 - 27 Lacs

Indore, Chennai

Work from Office

We are hiring a Senior Python DevOps Engineer to develop scalable apps using Flask/FastAPI, automate CI/CD, manage cloud and ML workflows, and support containerized deployments in OpenShift environments. Required Candidate profile 8+ years in Python DevOps with expertise in Flask, FastAPI, CI/CD, cloud, ML workflows, and OpenShift. Skilled in automation, backend optimization, and global team collaboration.

Posted 1 month ago

Apply

4.0 - 9.0 years

6 - 10 Lacs

Mumbai

Work from Office

Job Summary: This position performs duties and tasks to support full systems life cycle management (e.g., analysis, technical requirements, design, coding, testing, and implementation of systems and applications software). He/she performs tasks within planned durations and established deadlines, collaborates with teams to ensure effective communication and support the achievement of objectives, and provides development, maintenance, and support for applications. Responsibilities: Assists with Information Systems projects. Assists in system analysis and design. Designs and develops low- to moderately complex applications. Generates application documentation. Supports integration builds. Assists with maintenance and support. Primary Skills: Azure Cloud, .NET Core, AngularJS, SQL Server. Secondary Skills: Python, CI/CD technologies, Kubernetes. Qualifications: Bachelor's degree or international equivalent; a degree in Computer Science, Information Systems, Mathematics, Statistics, or a related field is preferred.

Posted 1 month ago

Apply

3.0 - 6.0 years

12 - 20 Lacs

Bengaluru

Hybrid

WHAT YOU'LL DO As an Application Expert in Warehouse 3PL Global, you will be instrumental in driving initiatives from concept to completion. Your role will focus on ensuring smooth integration between our partners' Warehouse Management System (WMS) and H&M's internal systems. You'll collaborate closely on gathering requirements, supporting the design phase, and testing warehouse processes to ensure seamless operations. Key Responsibilities: Collaborate with warehouses to gather requirements, conduct tests, and document processes. Coordinate and perform system integration tests with WMS and related systems. Contribute to improvement projects in partnership with the 3PL Global product team. Work closely with WMS/3PL partners to ensure smooth integration within the H&M ecosystem. Foster engagement with both internal and external stakeholders across global locations. Partner with the product manager and other cross-functional teams to share best practices, discuss design solutions, and drive continuous improvement. WHO YOU'LL WORK WITH In the Warehouse 3PL Global product team, you will collaborate with logistics warehouse partners (3PL) on various projects and continuous improvement initiatives. This could include tasks such as setting up new warehouses with either new or existing 3PL partners, or integrating new sales channels, like a new marketplace. The 3PL Global team oversees and collaborates with our 3PL partners across the Americas and APAC regions. You'll work closely with the Product Manager, Service Owner, Business Expert, Software Engineers, and other Application Experts within the team, as well as with cross-functional teams. WHO YOU ARE We are looking for people with: 3-6 years of experience in application support for SaaS applications. A proven track record in stakeholder management and coordination. Hands-on experience working with WMS solutions. Familiarity with eCommerce operations and processes. A solid understanding of the Azure cloud platform.
And people who are: Strong team players with excellent interpersonal skills. Proactive, self-driven, and motivated to take initiative. Detail-oriented, organized, and able to manage multiple tasks efficiently. Adaptable and open to change in dynamic environments. Problem solvers with a strategic, solution-focused mindset. Capable of thriving in a fast-paced, ever-changing environment. WHO WE ARE H&M is a global company of strong fashion brands and ventures. Our goal is to prove that there is no compromise between exceptional design, affordable prices, and sustainable solutions. We want to liberate fashion for the many, and our customers are at the heart of every decision we make. We are made up of thousands of passionate and talented colleagues united by our shared culture and values. Together, we want to use our power, our scale, and our knowledge to push the fashion industry towards a more inclusive and sustainable future. Help us re-imagine fashion, and together we will re-shape our industry. Learn more about H&M here. WHY YOU'LL LOVE WORKING HERE At H&M, we are proud to be a vibrant and welcoming company. We offer all our employees attractive benefits with extensive development opportunities around the globe. All our employees receive a staff discount card, usable on all our H&M brands in stores and online. Brands covered by the discount are H&M (Beauty and Move included), COS, Weekday, Monki, H&M HOME, & Other Stories, ARKET, and Afound. In addition to our staff discount, all our employees are included in our H&M Incentive Program (HIP). You can read more about our H&M Incentive Program here. In addition to our global benefits, all our local markets offer different competitive perks and benefits. Please note that they may differ between employment types and countries.
JOIN US Our uniqueness comes from a combination of many things: our inclusive and collaborative culture, our strong values, and opportunities for growth. But most of all, it's our people who make us who we are. Take the next step in your career together with us. The journey starts here. *We are committed to a recruitment process that is fair, equitable, and based on competency. We therefore kindly ask you not to attach a cover letter to your application.

Posted 1 month ago

Apply

2.0 - 4.0 years

7 - 9 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

POSITION: Senior Data Engineer / Data Engineer. LOCATION: Bangalore/Mumbai/Kolkata/Gurugram/Hyderabad/Pune/Chennai. EXPERIENCE: 2+ years. JOB TITLE: Senior Data Engineer / Data Engineer. OVERVIEW OF THE ROLE: As a Data Engineer or Senior Data Engineer, you will be hands-on in architecting, building, and optimizing robust, efficient, and secure data pipelines and platforms that power business-critical analytics and applications. You will play a central role in the implementation and automation of scalable batch and streaming data workflows using modern big data and cloud technologies. Working within cross-functional teams, you will deliver well-engineered, high-quality code and data models, and drive best practices for data reliability, lineage, quality, and security. HASHEDIN BY DELOITTE 2025. Mandatory Skills: Hands-on software coding or scripting for a minimum of 3 years. Experience in product management for at least 2 years. Stakeholder management experience for at least 3 years. Experience with at least one of the GCP, AWS, or Azure cloud platforms. Key Responsibilities: Design, build, and optimize scalable data pipelines and ETL/ELT workflows using Spark (Scala/Python), SQL, and orchestration tools (e.g., Apache Airflow, Prefect, Luigi). Implement efficient solutions for high-volume, batch, real-time streaming, and event-driven data processing, leveraging best-in-class patterns and frameworks. Build and maintain data warehouse and lakehouse architectures (e.g., Snowflake, Databricks, Delta Lake, BigQuery, Redshift) to support analytics, data science, and BI workloads. Develop, automate, and monitor Airflow DAGs/jobs on cloud or Kubernetes, following robust deployment and operational practices (CI/CD, containerization, infrastructure-as-code). Write performant, production-grade SQL for complex data aggregation, transformation, and analytics tasks. Ensure data quality, consistency, and governance across the stack, implementing processes for validation, cleansing, anomaly detection, and reconciliation.
Collaborate with Data Scientists, Analysts, and DevOps engineers to ingest, structure, and expose structured, semi-structured, and unstructured data for diverse use cases. Contribute to data modeling, schema design, and data partitioning strategies, and ensure adherence to best practices for performance and cost optimization. Implement, document, and extend data lineage, cataloging, and observability through tools such as AWS Glue, Azure Purview, Amundsen, or open-source technologies. Apply and enforce data security, privacy, and compliance requirements (e.g., access control, data masking, retention policies, GDPR/CCPA). Take ownership of the end-to-end data pipeline lifecycle: design, development, code reviews, testing, deployment, operational monitoring, and maintenance/troubleshooting. Contribute to frameworks, reusable modules, and automation to improve development efficiency and maintainability of the codebase. Stay abreast of industry trends and emerging technologies, participating in code reviews, technical discussions, and peer mentoring as needed. Skills & Experience: Proficiency with Spark (Python or Scala), SQL, and data pipeline orchestration (Airflow, Prefect, Luigi, or similar). Experience with cloud data ecosystems (AWS, GCP, Azure) and cloud-native services for data processing (Glue, Dataflow, Dataproc, EMR, HDInsight, Synapse, etc.). Hands-on development skills in at least one programming language (Python, Scala, or Java preferred); solid knowledge of software engineering best practices (version control, testing, modularity). Deep understanding of batch and streaming architectures (Kafka, Kinesis, Pub/Sub, Flink, Structured Streaming, Spark Streaming). Expertise in data warehouse/lakehouse solutions (Snowflake, Databricks, Delta Lake, BigQuery, Redshift, Synapse) and storage formats (Parquet, ORC, Delta, Iceberg, Avro). Strong SQL development skills for ETL, analytics, and performance optimization.
Familiarity with Kubernetes (K8s), containerization (Docker), and deploying data pipelines in distributed/cloud-native environments. Experience with data quality frameworks (Great Expectations, Deequ, or custom validation), monitoring/observability tools, and automated testing. Working knowledge of data modeling (star/snowflake, normalized, denormalized) and metadata/catalog management. Understanding of data security, privacy, and regulatory compliance (access management, PII masking, auditing, GDPR/CCPA/HIPAA). Familiarity with BI or visualization tools (Power BI, Tableau, Looker, etc.) is an advantage but not core. Previous experience with data migrations, modernization, or refactoring legacy ETL processes to modern cloud architectures is a strong plus. Bonus: Exposure to open-source data tools (dbt, Delta Lake, Apache Iceberg, Amundsen, Great Expectations, etc.) and knowledge of DevOps/MLOps processes. Professional Attributes: Strong analytical and problem-solving skills; attention to detail and commitment to code quality and documentation. Ability to communicate technical designs and issues effectively with team members and stakeholders. Proven self-starter, fast learner, and collaborative team player who thrives in dynamic, fast-paced environments. Passion for mentoring, sharing knowledge, and raising the technical bar for data engineering practices. Desirable Experience: Contributions to open-source data engineering/tools communities. Implementing data cataloging, stewardship, and data democratization initiatives. Hands-on work with DataOps/DevOps pipelines for code and data. Knowledge of ML pipeline integration (feature stores, model serving, lineage/monitoring integration) is beneficial. EDUCATIONAL QUALIFICATIONS: Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field (or equivalent experience).
Certifications in cloud platforms (AWS, GCP, Azure) and/or data engineering (AWS Data Analytics, GCP Data Engineer, Databricks). Experience working in an Agile environment with exposure to CI/CD, Git, Jira, Confluence, and code review processes. Prior work in highly regulated or large-scale enterprise data environments (finance, healthcare, or similar) is a plus.
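The data-quality responsibilities above (validation, cleansing, anomaly detection) can be sketched in plain Python. This is a minimal illustration only; the field names and threshold are hypothetical, and a production pipeline would typically use a framework such as Great Expectations or Deequ:

```python
# Minimal data-quality gate for a batch of records, of the kind a pipeline
# might run before loading to a warehouse. Fields and thresholds are
# illustrative, not from any specific framework.

def validate_batch(records, required_fields, max_null_ratio=0.1):
    """Return (passed, issues) for a list of dict records."""
    issues = []
    if not records:
        return False, ["empty batch"]
    for field in required_fields:
        nulls = sum(1 for r in records if r.get(field) is None)
        ratio = nulls / len(records)
        if ratio > max_null_ratio:
            issues.append(f"{field}: {ratio:.0%} null exceeds {max_null_ratio:.0%}")
    return (not issues), issues

batch = [
    {"order_id": 1, "amount": 120.0},
    {"order_id": 2, "amount": None},
    {"order_id": 3, "amount": 80.5},
]
passed, issues = validate_batch(batch, ["order_id", "amount"], max_null_ratio=0.25)
print(passed, issues)
```

A failing batch like this one would typically route records to a quarantine table rather than abort the whole load.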

Posted 1 month ago

Apply

5.0 - 7.0 years

12 - 16 Lacs

Pune

Work from Office

Roles and Responsibilities: Design, develop, test, deploy, and maintain cloud-based applications on the Microsoft Azure platform. Collaborate with cross-functional teams to identify requirements and implement solutions that meet business needs. Develop a microservices architecture for scalability, reliability, and maintainability of applications. Implement event-driven architectures using Azure Event Hubs and Azure Functions. Ensure high availability of services by implementing load-balancing strategies.
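The event-driven pattern named above can be illustrated with a small, framework-free sketch: a dispatcher routes events to registered handlers and retries transient failures with exponential backoff, roughly what an Event Hubs consumer or Azure Function host does for you. The event types and handlers here are made up for illustration:

```python
import time

# Toy event dispatcher with retry and exponential backoff, illustrating the
# event-driven pattern; event types and handlers are hypothetical.
HANDLERS = {}

def on(event_type):
    def register(fn):
        HANDLERS[event_type] = fn
        return fn
    return register

def dispatch(event, retries=3, base_delay=0.01):
    handler = HANDLERS[event["type"]]
    for attempt in range(retries):
        try:
            return handler(event)
        except RuntimeError:
            if attempt == retries - 1:
                raise  # exhausted retries: surface the failure
            time.sleep(base_delay * 2 ** attempt)  # exponential backoff

calls = {"n": 0}

@on("order.created")
def handle_order(event):
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")  # fails twice, then succeeds
    return f"processed {event['id']}"

print(dispatch({"type": "order.created", "id": 42}))
```

In a real Azure Functions deployment the retry policy and trigger binding replace the hand-rolled loop, but the handler-per-event-type shape carries over.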

Posted 1 month ago

Apply

0.0 - 1.0 years

0 Lacs

Pune

Work from Office

Company Name: Quick Heal Technologies Ltd. Work Location: Pune, Maharashtra, India. Job Profile: Trainee Software Engineer / Trainee System Admin. Experience Required: Freshers. Degree Required: BE/BTech/BCA/MCA. Job Description for Campus Freshers Hiring: Quick Heal Technologies Limited is one of the leading providers of IT security and data protection solutions, with a strong footprint in India and an evolving global presence. Quick Heal, a leading cybersecurity firm with product lines across Antivirus, EPS, Zero Trust, data privacy, etc., is looking for freshers for various technology areas in its IT department. Candidates who want to build a career in the cybersecurity world and are open to learning new technologies should apply to this position. To be successful as a trainee software engineer or trainee sysadmin, you should always be expanding your engineering knowledge and sharpening your communication skills. Outstanding candidates learn from their mentors but also feel confident sharing their own ideas with the team. The ideal candidate should possess a high degree of initiative and flexibility, and should have excellent spoken and written communication skills. Selected candidates will be given training on the required technologies and then placed in on-the-job training. After successful completion of the 6-month trainee program, you will be offered a permanent position, subject to your performance during the training period. The candidate should have knowledge in one or more of the areas below: Operating system concepts (Linux/Windows). Knowledge of cloud technologies (AWS/Azure, etc.). Knowledge of any database. Knowledge of web services, microservices, and APIs. Programming languages/tools like C/C++, Java, Python, Ansible, etc. Basic IT infrastructure knowledge. The available IT roles are as below.
Network Engineer. IT Application Support Engineer (SAP, CRM/CTI). System Administrator (Linux/Windows/VMware). Monitoring and Observability. DevOps. SRE. Backup and Storage. If interested, please share your resume at banushree.c@quickheal.com. Thanks & Regards, Banushree

Posted 1 month ago

Apply

0.0 - 1.0 years

4 - 4 Lacs

Thiruvananthapuram

Work from Office

Roles and Responsibilities : Must be a graduate in BE/BTech/MCA in IT/CS or a related field. Excellent technical and interpersonal communication skills. Basic understanding of programming languages such as JavaScript, C#, Python, SQL, and TypeScript. Familiarity with web development frameworks like React, Angular, Blazor, or Vue.js. Knowledge of cloud platforms such as Azure, AWS, or Google Cloud Platform. Understanding of containerization technologies like Docker. Basic knowledge of database management systems such as MSSQL, MySQL, PostgreSQL, or MongoDB. Good understanding of SDLC, STLC, Agile methodologies, and business process analysis. Familiarity with RESTful APIs, microservices concepts, and DevOps practices. Basic understanding of security best practices in software development. Familiarity with version control systems like Git. Preferred/Good to Have Internship or project experience in software development, DevOps, cloud, or related fields. Familiarity with business intelligence tools (e.g., Power BI, Tableau). Knowledge of identity and access management solutions. Interest or coursework in artificial intelligence, machine learning, or data science. Experience with scripting languages (e.g., Bash, PowerShell). Active participation in technical forums (e.g., Stack Overflow) or GitHub contributions. Exposure to UI/UX design principles. Experience with mobile app development (Android/iOS/Flutter/React Native) is a plus. Awareness of CI/CD pipelines and infrastructure as code tools (e.g., Terraform, Ansible). Experience working with orchestration technologies like Kubernetes. Strong logical, analytical, and problem-solving skills; experience in hackathons, coding challenges, or open-source contributions is a plus. Any certifications (e.g., Microsoft, AWS, Google, Scrum) are an added advantage.

Posted 1 month ago

Apply

7.0 - 12.0 years

12 - 18 Lacs

Pune, Chennai, Coimbatore

Hybrid

Hiring "Azure & DevOps" for Pune/Chennai/Coimbatore locations. Overall Experience: 6-12 years. If you are interested in the below-mentioned position, please share your updated CV to sandhya_allam@epam.com along with the following details (shortlisted applicants will be contacted directly): 1. Have you applied for a role at EPAM in recent times? 2. Years of experience in Azure Cloud and DevOps solutions. 3. Years of experience in Docker & Kubernetes. 4. Years of experience in Terraform. 5. Experience in Python/Bash/PowerShell. 6. Current salary. 7. Expected salary. 8. Notice period (negotiable or mandatory). Responsibilities: Responsible for fault tolerance, high availability, scalability, and security on Azure infrastructure and platform. Responsible for implementation of CI/CD pipelines with automated build and test systems. Responsible for production deployment using multiple deployment strategies. Responsible for automating Azure infrastructure and platform deployment with IaC. Responsible for automating system configurations using configuration management tools. Hands-on production experience with Azure compute services: VM management, VMSS, AKS, Container Instances, autoscaling, load balancers, spot instances, App Service. Hands-on production experience with Azure network services: VNET, subnets, ExpressRoute, Azure Gateway, VPN, Load Balancer, DNS, Traffic Manager, CDN, Front Door, Private Link, Network Watcher. Good automation skills using Azure orchestration tools: Terraform, Ansible, ARM, and CLI. Hands-on production experience in Docker and container orchestration using AKS and ACR. Ability to write scripts (Linux shell/Python/PowerShell/Bash/CLI) to automate cloud tasks.

Posted 1 month ago

Apply

8.0 - 13.0 years

25 - 40 Lacs

Bengaluru

Work from Office

Role & responsibilities: Extensive knowledge and proven experience as a software developer, with exposure to elements of our back-end technology stack (C#, .NET). • Front-end JavaScript frameworks, especially Angular. • Good experience working with AWS/Azure, microservices, and API development. • Knowledge and application of software engineering practices (e.g., unit testing, TDD, CI/CD, SOLID, BDD). • Atlassian Jira, Confluence & JFrog Artifactory. • Software security best practices and implementation (e.g., OWASP, PKI, X.509 certificates, TLS). • Software development for regulated environments (e.g., IVD / medical devices).

Posted 1 month ago

Apply

5.0 - 8.0 years

12 - 20 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Key Skills: DevOps, Google, AWS Cloud, Azure Cloud, Cloud, Google Cloud Platform, Azure Devops Roles and Responsibilities: Design, implement, and manage cloud-based infrastructure using AWS, Azure, or Google Cloud Platform (GCP). Automate infrastructure provisioning and configuration using tools like Terraform and Ansible. Build and maintain CI/CD pipelines using Jenkins, Azure DevOps, or similar tools. Work with Docker and Kubernetes for containerization and orchestration of services. Collaborate with development and operations teams to ensure smooth and efficient software delivery. Monitor system performance and reliability; implement tools and processes for incident response and problem resolution. Ensure infrastructure security and compliance best practices are followed. Develop and maintain scripts for automation and system management using Bash, Python, or other scripting languages. Skills Required: Proven experience (3 to 18 years) in DevOps Engineering or Site Reliability Engineering (SRE). Hands-on experience with one or more cloud platforms: AWS, Azure, or GCP. Proficiency in scripting languages such as Bash, Python, or PowerShell. Expertise in Docker, Kubernetes, Terraform, Jenkins, CI/CD, and Ansible. Strong understanding of networking, monitoring, and system security best practices. Excellent troubleshooting, communication, and collaboration skills. Education : Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent practical experience).
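The scripting responsibilities above often boil down to small pre-flight checks run as pipeline steps. The sketch below is a hypothetical example of such a check in Python; the variable names and allowed environments are invented for illustration, not taken from any particular toolchain:

```python
# Pre-deployment sanity check of the kind a CI/CD pipeline step might run
# before rolling out; variable names and rules are illustrative.
REQUIRED = ["DEPLOY_ENV", "IMAGE_TAG"]
ALLOWED_ENVS = {"dev", "staging", "prod"}

def check_environment(env):
    """Return a list of error strings; an empty list means deploy may proceed."""
    errors = []
    for key in REQUIRED:
        if not env.get(key):
            errors.append(f"missing {key}")
    if env.get("DEPLOY_ENV") and env["DEPLOY_ENV"] not in ALLOWED_ENVS:
        errors.append(f"DEPLOY_ENV must be one of {sorted(ALLOWED_ENVS)}")
    return errors

print(check_environment({"DEPLOY_ENV": "prod", "IMAGE_TAG": "v1.2.3"}))  # ok
print(check_environment({"DEPLOY_ENV": "qa"}))  # two problems reported
```

Failing fast like this in the pipeline, rather than mid-deployment, is the usual motivation for such scripts.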

Posted 1 month ago

Apply

6.0 - 9.0 years

10 - 18 Lacs

Bengaluru

Hybrid

Experience: 6 to 9 years. Location: Bangalore. Notice Period: immediate or 15 days. Senior DevOps Engineer.

Posted 1 month ago

Apply

12.0 - 20.0 years

35 - 40 Lacs

Navi Mumbai

Work from Office

Position Overview: We are seeking a skilled Big Data Developer to join our growing delivery team, with a dual focus on hands-on project support and mentoring junior engineers. This role is ideal for a developer who not only thrives in a technical, fast-paced environment but is also passionate about coaching and developing the next generation of talent. You will work on live client projects, provide technical support, contribute to solution delivery, and serve as a go-to technical mentor for less experienced team members. Key Responsibilities: Perform hands-on Big Data development work, including coding, testing, troubleshooting, and deploying solutions. Support ongoing client projects, addressing technical challenges and ensuring smooth delivery. Collaborate with junior engineers to guide them on coding standards, best practices, debugging, and project execution. Review code and provide feedback to junior engineers to maintain high quality and scalable solutions. Assist in designing and implementing solutions using Hadoop, Spark, Hive, HDFS, and Kafka. Lead by example in object-oriented development, particularly using Scala and Java. Translate complex requirements into clear, actionable technical tasks for the team. Contribute to the development of ETL processes for integrating data from various sources. Document technical approaches, best practices, and workflows for knowledge sharing within the team. Required Skills and Qualifications: 8+ years of professional experience in Big Data development and engineering. Strong hands-on expertise with Hadoop, Hive, HDFS, Apache Spark, and Kafka. Solid object-oriented development experience with Scala and Java. Strong SQL skills with experience working with large data sets. Practical experience designing, installing, configuring, and supporting Big Data clusters. Deep understanding of ETL processes and data integration strategies. Proven experience mentoring or supporting junior engineers in a team setting. 
Strong problem-solving, troubleshooting, and analytical skills. Excellent communication and interpersonal skills. Preferred Qualifications: Professional certifications in Big Data technologies (Cloudera, Databricks, AWS Big Data Specialty, etc.). Experience with cloud Big Data platforms (AWS EMR, Azure HDInsight, or GCP Dataproc). Exposure to Agile or DevOps practices in Big Data project environments. What We Offer: Opportunity to work on challenging, high-impact Big Data projects. Leadership role in shaping and mentoring the next generation of engineers. Supportive and collaborative team culture. Flexible working environment. Competitive compensation and professional growth opportunities.
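The ETL work described above centers on transforming and aggregating large data sets. As a language-agnostic illustration, the group-by aggregation at the heart of many Spark jobs looks like this in plain Python; on a real cluster this would be a distributed Spark stage in Scala or PySpark, and the schema below is invented:

```python
from collections import defaultdict

# Group-by aggregation over (key, value) records, mirroring what a Spark
# reduceByKey/groupBy stage computes; the event schema is illustrative.
events = [
    ("clicks", 3), ("views", 10), ("clicks", 2), ("views", 5),
]

def aggregate(pairs):
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value  # equivalent to reduceByKey(_ + _)
    return dict(totals)

print(aggregate(events))  # totals per key
```

Spark distributes exactly this computation: local partial sums per partition, then a shuffle to combine per key.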

Posted 1 month ago

Apply

4.0 - 9.0 years

11 - 21 Lacs

Noida, Hyderabad, Pune

Hybrid

Experienced .NET Senior Engineer with over 4 years of expertise in software development, specializing in Microsoft technologies, cloud solutions, and modern development frameworks. Proficient in Angular, Azure services, backend and frontend development, and agile methodologies like Scrum and Kanban. TECHNICAL SUMMARY: Experience in building .NET Core 6.0 Web API services for robust and scalable communication. HTML, CSS, React JS, Angular, Node.js, Ext JS, JavaScript, Struts, CI/CD, Jenkins, Git, JUnit, etc. Optimized database interactions with Azure SQL Managed Instance and Entity Framework Code First. Designed and deployed secure Azure Functions for background processing and API integrations. Leveraged Azure Key Vault for secure storage and management. Programming Languages: C#, VB.NET, TypeScript, JavaScript, SQL, LINQ. Frameworks: .NET Core, MVC, Entity Framework, Angular, React, Ionic. Azure Services: Azure App Services, Azure Functions, Azure SQL Managed Instance, Docker. Tools: Visual Studio, Git, Docker, Kendo UI, RxJS, Redis. Development Practices: CQRS, TDD, SOLID, Microservices, REST APIs, GraphQL, Clean Architecture. Certifications: Microsoft Azure Certified Developer. Microsoft Certified Professional, SQL Server 6.5. Experience: 4 to 10 years.

Posted 1 month ago

Apply

4.0 - 7.0 years

8 - 15 Lacs

Pune

Work from Office

Design, implement, and optimize secure deployment, automation, and maintenance of cloud applications and services. Strong scripting skills in Python, Bash, or PowerShell. AWS/Azure: EC2, S3, RDS, Lambda, VPC, IAM. CI/CD tools (GitLab CI/CD, Jenkins). Required Candidate profile: AWS/GCP/Azure Certified DevOps Engineer, AWS/GCP/Azure Certified SysOps Administrator, or AWS/GCP/Azure Certified Solutions Architect.

Posted 1 month ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

Noida, Hyderabad, Pune

Work from Office

Key Responsibilities: • Design and architect cloud-native Java applications using microservices principles. • Lead development teams to ensure best practices for coding, architecture, and cloud deployment. • Develop and implement cloud migration and optimization strategies. • Ensure system performance, security, and scalability. Required Skills: • Strong experience in Java, Spring Boot, and microservices architecture. • Proficiency in cloud platforms (Azure or GCP). • Hands-on experience with CI/CD pipelines, containers (Docker), and Kubernetes. • Solid understanding of cloud-native design patterns and security practices. Preferred: • Cloud certifications (AWS Solutions Architect, Azure Architect, or similar). • Experience with distributed systems and hybrid cloud environments.

Posted 1 month ago

Apply

3.0 - 4.0 years

1 - 6 Lacs

Pune

Work from Office

About Invezza Technologies: Invezza is a technology consulting and outsourced product development company. We believe in growing together and creating long-term relationships with the common purpose of delivering innovative solutions and cutting-edge digital experiences. We are technically creative innovators with a deep passion for technology. Work Location: Baner, Pune. Job Description. Responsibilities: Collaborate with software engineering teams to understand application requirements and architecture, and integrate DevOps practices into the development lifecycle. Manage and maintain cloud infrastructure, ensuring scalability, performance, and security. Design, implement, and maintain automated CI/CD pipelines to facilitate rapid and reliable software delivery. Monitor system health, performance, and security using tools like Prometheus, Grafana, and other monitoring solutions. Implement and manage containerization technologies like Docker and orchestration tools like Kubernetes for deploying and managing applications. Collaborate with security teams to implement and maintain best practices for security and compliance in DevOps processes. Troubleshoot and resolve infrastructure and application-related issues in development, testing, and production environments. Stay up to date with the latest trends and technologies in DevOps, cloud computing, and automation. Qualifications: 2-4 years of professional experience in DevOps administration or software development roles. Strong experience with Linux commands, shell scripting, Docker, MySQL configuration, and cloud platforms such as AWS, Azure, or GCP. Proficiency in scripting languages such as Bash and Python for automation tasks. Hands-on experience with CI/CD tools such as Jenkins, GitLab CI/CD, or CircleCI. Solid understanding of networking concepts and security best practices. Experience with version control systems such as Git. Strong problem-solving skills and the ability to troubleshoot complex issues.
Excellent communication and collaboration skills to work effectively in cross-functional teams. Relevant certifications like AWS Certified DevOps Engineer, or similar, are a plus. Job Type: Full-time.

Posted 1 month ago

Apply

10.0 - 15.0 years

12 - 18 Lacs

Pune

Work from Office

Responsibilities: * Design and deliver corporate training programs using Python * Ensure proficiency in Python, PySpark, data structures, NumPy, Pandas, AWS, Azure, and GCP cloud, data visualization, and Big Data tools * Demonstrate strong core Python skills

Benefits: Food allowance, travel allowance, house rent allowance

Posted 1 month ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Pune

Work from Office

As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include: Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing enterprise search applications such as Elasticsearch and Splunk to client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results.

Your primary responsibilities include: Developing and maintaining data pipelines for batch and stream processing using Informatica PowerCenter or cloud ETL/ELT tools. Liaising with business teams and technical leads to gather requirements, identify data sources and data quality issues, design target data structures, develop pipelines and data processing routines, perform unit testing, and support UAT. Working with data scientists and the business analytics team to assist with data ingestion and data-related technical issues.
Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Expertise in data warehousing, information management, data integration, and business intelligence using the ETL tool Informatica PowerCenter. Knowledge of cloud, Power BI, and data migration to the cloud. Experience in Unix shell scripting and Python. Experience with relational SQL, Big Data, etc. Preferred technical and professional experience: Knowledge of MS Azure Cloud. Experience in Informatica PowerCenter. Experience in Unix shell scripting and Python.
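The cleansing-and-integration work this listing describes can be sketched as a tiny batch routine: deduplicate on a key, drop incomplete records, and normalize fields. This is a toy stand-in for an ETL step, not an Informatica or IBM routine; the field names (`id`, `name`) are illustrative.

```python
def cleanse_rows(rows):
    """Deduplicate by id, drop rows with missing required fields,
    and trim whitespace -- a toy stand-in for a cleansing ETL step."""
    seen = set()
    out = []
    for row in rows:
        rid = row.get("id")
        name = (row.get("name") or "").strip()
        if rid is None or not name or rid in seen:
            continue  # skip incomplete or duplicate records
        seen.add(rid)
        out.append({"id": rid, "name": name})
    return out

raw = [
    {"id": 1, "name": "  Ada "},
    {"id": 1, "name": "Ada"},   # duplicate id, dropped
    {"id": 2, "name": ""},      # missing name, dropped
    {"id": 3, "name": "Grace"},
]
print(cleanse_rows(raw))  # -> [{'id': 1, 'name': 'Ada'}, {'id': 3, 'name': 'Grace'}]
```

In a production pipeline the same rules would be expressed as ETL transformations; keeping them in a pure function makes them easy to unit test, which the listing calls out explicitly.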

Posted 1 month ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Bengaluru

Work from Office

As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys.

Your primary responsibilities include: Comprehensive feature development and issue resolution: working on end-to-end feature development and solving challenges faced during implementation. Stakeholder collaboration and issue resolution: collaborating with key internal and external stakeholders to understand problems and issues with the product and its features, and resolving them within defined SLAs. Continuous learning and technology integration: being eager to learn new technologies and applying them in feature development.

Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Relevant experience with APIM (Azure API Management). Proficient in .NET; candidates should be from a .NET background. Proficient with Azure platform development (Azure Functions, Azure services, etc.). Experience with Azure services such as Azure Functions, API integration, Logic Apps, APIM, Azure Storage (Blob, Table), and Cosmos DB. Preferred technical and professional experience: .NET Azure full stack; proficient in .NET Core with hands-on coding in .NET Core.

Posted 1 month ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Kochi

Work from Office

Create solution outlines and macro designs describing end-to-end product implementation in data platforms, including system integration, data ingestion, data processing, the serving layer, design patterns, and platform architecture principles. Contribute to pre-sales and sales support through RFP responses, solution architecture, planning, and estimation. Contribute to reusable component/asset/accelerator development to support capability building. Participate in customer presentations as a platform architect / subject matter expert on Big Data, Azure Cloud, and related technologies. Participate in customer PoCs to deliver the outcomes. Participate in delivery reviews, product reviews, and quality assurance, and act as a design authority.

Required education: Bachelor's Degree. Preferred education: Non-Degree Program. Required technical and professional expertise: Experience designing data products that provide descriptive, prescriptive, and predictive analytics to end users or other systems. Experience in data engineering and architecting data platforms. Experience architecting and implementing data platforms on Azure is mandatory (ADLS Gen1/Gen2, Data Factory, Databricks, Synapse Analytics, Azure SQL, Cosmos DB, Event Hub, Snowflake, Azure Purview, Microsoft Fabric, Kubernetes, Terraform, Airflow). Experience with the Big Data stack (Hadoop ecosystem: Hive, HBase, Kafka, Spark, Scala, PySpark, Python, etc.) with Cloudera or Hortonworks. Preferred technical and professional experience: Experience architecting complex data platforms on Azure and on-prem. Experience with and exposure to Data Fabric and Data Mesh concepts and solutions such as Microsoft Fabric, Starburst, Denodo, IBM Data Virtualization, Talend, or Tibco Data Fabric. Exposure to data cataloging and governance solutions such as Collibra, Alation, Watson Knowledge Catalog, Databricks Unity Catalog, Apache Atlas, and Snowflake data glossary.
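The Spark/PySpark processing this listing expects centers on keyed aggregation. As a minimal, dependency-free illustration (not actual PySpark code), the core of Spark's `reduceByKey` can be sketched in plain Python:

```python
def reduce_by_key(pairs, combine):
    """Pure-Python analogue of Spark's reduceByKey: fold the values
    of each key in a sequence of (key, value) pairs with `combine`."""
    acc = {}
    for k, v in pairs:
        acc[k] = combine(acc[k], v) if k in acc else v
    return acc

# Count events per type, the way a PySpark job would over an RDD.
events = [("page_view", 1), ("click", 1), ("page_view", 1)]
print(reduce_by_key(events, lambda a, b: a + b))  # -> {'page_view': 2, 'click': 1}
```

In real PySpark the same shape is `rdd.reduceByKey(lambda a, b: a + b)`, with the fold distributed across partitions instead of a single dict.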

Posted 1 month ago

Apply

2.0 - 7.0 years

2 - 6 Lacs

New Delhi, Gurugram

Work from Office

Customer support role for an international voice process. Contact: Kajal - 8860800235. Domains: travel/banking/technical. Graduates/undergraduates; freshers and experienced candidates. Salary depending on last take-home (up to 7 LPA). Location: Gurugram. Work from office, 5 days, 24x7 shifts. Cab + incentives. Immediate joiners.

Posted 1 month ago

Apply

2.0 - 4.0 years

3 - 6 Lacs

Ahmedabad

Work from Office

Job Summary: We are urgently looking for a skilled Azure Cloud Engineer with strong expertise in Microsoft technologies to join our team. The ideal candidate will take ownership of Azure infrastructure, Office 365 administration, and Intune implementation. The goal is to ensure smooth operations, security compliance, and end-user satisfaction.

Key Responsibilities:

Azure Cloud Engineering: Design, deploy, and manage Azure-based infrastructure and services (IaaS, PaaS). Implement and maintain Azure virtual machines, networking, storage, and security components. Automate deployments using PowerShell, ARM templates, or Terraform. Monitor and optimize cloud performance, cost, scalability, and availability. Ensure alignment with cloud security best practices and compliance requirements.

Office 365 Administration: Manage Exchange Online, SharePoint Online, Teams, OneDrive, and other Microsoft 365 services. Administer the O365 tenant, including licensing, user/group management, and security policies. Monitor service health and usage through the Microsoft 365 Admin Center. Implement security protocols including DKIM, SPF, DMARC, and threat protection tools.

Intune & Endpoint Management: Lead deployment and configuration of Microsoft Intune for MDM and MAM. Configure security and compliance policies for Windows, Android, and iOS devices. Manage Conditional Access and App Protection policies. Provide support for Intune-related issues during device onboarding.

Required Skills & Qualifications: Minimum 2 years of hands-on experience in cloud engineering, preferably in Microsoft Azure. Proven experience with Azure services (VMs, VNets, Azure AD, Storage, NSGs, etc.). In-depth knowledge of the Microsoft 365 admin center, Exchange Online, Teams, and SharePoint. Experience with Microsoft Intune and Endpoint Manager. Strong PowerShell scripting skills. Understanding of IAM concepts (Azure AD, MFA, Conditional Access). Familiarity with cloud security and compliance standards (ISO preferred).
Preferred Certifications: Microsoft Certified: Azure Administrator Associate (AZ-104). Microsoft Certified: Azure Fundamentals (AZ-900).

Soft Skills: Strong troubleshooting and problem-solving abilities. Excellent communication and documentation skills. Proactive, team-oriented, and ownership-driven mindset. Ability to handle multiple projects and changing priorities.

Nice to Have: Experience with Microsoft Defender, Sentinel, or other security tools. Familiarity with hybrid Exchange environments.
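The email-security protocols this listing names (SPF, alongside DKIM and DMARC) are published as DNS TXT records with a small, well-defined syntax. A minimal illustrative parser, far from a full RFC 7208 implementation, shows the shape of an SPF record and its final "all" policy:

```python
def parse_spf(record):
    """Split a minimal SPF TXT record into its mechanisms and the
    policy attached to the final 'all' term (illustrative only)."""
    parts = record.split()
    if not parts or parts[0] != "v=spf1":
        raise ValueError("not an SPF record")
    qualifier_names = {"-": "fail", "~": "softfail", "?": "neutral", "+": "pass"}
    mechanisms, policy = [], None
    for term in parts[1:]:
        qual = term[0] if term[0] in qualifier_names else "+"  # default qualifier is '+'
        mech = term.lstrip("-~?+")
        if mech == "all":
            policy = qualifier_names[qual]  # e.g. '-all' means hard fail
        else:
            mechanisms.append(mech)
    return {"mechanisms": mechanisms, "all": policy}

record = "v=spf1 include:spf.protection.outlook.com -all"
print(parse_spf(record))
# -> {'mechanisms': ['include:spf.protection.outlook.com'], 'all': 'fail'}
```

The `include:spf.protection.outlook.com` mechanism is the standard entry Microsoft documents for Exchange Online tenants; the trailing `-all` instructs receivers to hard-fail mail from unlisted senders.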

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
