5.0 - 10.0 years
14 - 17 Lacs
Pune
Work from Office
As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities such as creating source-to-target pipelines/workflows and implementing solutions that address the client's needs. Your primary responsibilities include:
- Design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements.
- Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing, data-driven organization.
- Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- Must have 5+ years of experience in Big Data: Hadoop, Spark (Scala, Python), HBase, Hive.
- Good to have: AWS (S3, Athena, DynamoDB, Lambda), Jenkins, Git.
- Developed Python and PySpark programs for data analysis.
- Good working experience with Python, developing a custom framework for generating rules (similar to a rules engine).
- Developed Python code to gather data from HBase and designed the solution for implementation using PySpark.
- Used Apache Spark DataFrames/RDDs to apply business transformations and Hive context objects to perform read/write operations (see the PySpark sketch below).
Preferred technical and professional experience:
- Understanding of DevOps.
- Experience in building scalable end-to-end data ingestion and processing solutions.
- Experience with object-oriented and/or functional programming languages, such as Python, Java, and Scala.
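The Spark/Hive workflow mentioned in the listing can be pictured with a short PySpark sketch. This is only an illustration of the read/transform/write pattern, not the employer's code: the database, table, and column names are invented, and it uses a Hive-enabled SparkSession (the modern replacement for the HiveContext object the listing mentions).

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hive-enabled session; table and column names below are illustrative only.
spark = (
    SparkSession.builder
    .appName("order-enrichment")
    .enableHiveSupport()
    .getOrCreate()
)

# Read a Hive table into a DataFrame.
orders = spark.table("raw_db.orders")

# Apply a business transformation: keep completed orders and derive a revenue column.
enriched = (
    orders
    .filter(F.col("status") == "COMPLETED")
    .withColumn("revenue", F.col("quantity") * F.col("unit_price"))
)

# Write the result back to a curated Hive table.
enriched.write.mode("overwrite").saveAsTable("curated_db.orders_enriched")
```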
Posted 1 month ago
5.0 - 10.0 years
14 - 17 Lacs
Mumbai
Work from Office
As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities such as creating source-to-target pipelines/workflows and implementing solutions that address the client's needs. Your primary responsibilities include:
- Design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements.
- Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing, data-driven organization.
- Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- Must have 5+ years of experience in Big Data: Hadoop, Spark (Scala, Python), HBase, Hive.
- Good to have: AWS (S3, Athena, DynamoDB, Lambda), Jenkins, Git.
- Developed Python and PySpark programs for data analysis.
- Good working experience with Python, developing a custom framework for generating rules (similar to a rules engine).
- Developed Python code to gather data from HBase and designed the solution for implementation using PySpark.
- Used Apache Spark DataFrames/RDDs to apply business transformations and Hive context objects to perform read/write operations.
Preferred technical and professional experience:
- Understanding of DevOps.
- Experience in building scalable end-to-end data ingestion and processing solutions.
- Experience with object-oriented and/or functional programming languages, such as Python, Java, and Scala.
Posted 1 month ago
8.0 - 13.0 years
10 - 15 Lacs
Mumbai
Work from Office
Skill: Java, AWS. Experience: 6-9 years. Role: T2.
Requirements:
- Strong proficiency in Java (8 or higher) and the Spring Boot framework.
- Hands-on experience with AWS services such as EC2, Lambda, API Gateway, S3, CloudFormation, DynamoDB, and RDS.
- Experience developing microservices and RESTful APIs.
- Understanding of cloud architecture and deployment strategies.
- Familiarity with CI/CD pipelines and tools such as Jenkins, GitHub Actions, or AWS CodePipeline.
- Knowledge of containerization (Docker) and orchestration tools (ECS/Kubernetes) is a plus.
- Experience with monitoring/logging tools like CloudWatch, ELK Stack, or Prometheus is desirable.
- Familiarity with security best practices for cloud-native apps (IAM roles, encryption, etc.).
Responsibilities:
- Develop and maintain robust backend services and RESTful APIs using Java and Spring Boot.
- Design and implement microservices that are scalable, maintainable, and deployable in AWS.
- Integrate backend systems with AWS services including but not limited to Lambda, S3, DynamoDB, RDS, SNS/SQS, and CloudFormation.
- Collaborate with product managers, architects, and other developers to deliver end-to-end features.
- Participate in code reviews, design discussions, and agile development processes.
Posted 1 month ago
8.0 - 13.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Experience:
- 8 years of experience in data engineering, specifically in cloud environments like AWS.
- Proficiency in PySpark for distributed data processing and transformation.
- Solid experience with AWS Glue for ETL jobs and managing data workflows.
- Hands-on experience with AWS Data Pipeline (DPL) for workflow orchestration.
- Strong experience with AWS services such as S3, Lambda, Redshift, RDS, and EC2.
Technical Skills:
- Proficiency in Python and PySpark for data processing and transformation tasks.
- Deep understanding of ETL concepts and best practices.
- Familiarity with AWS Glue (ETL jobs, Data Catalog, and Crawlers).
- Experience building and maintaining data pipelines with AWS Data Pipeline or similar orchestration tools.
- Familiarity with AWS S3 for data storage and management, including file formats (CSV, Parquet, Avro).
- Strong knowledge of SQL for querying and manipulating relational and semi-structured data.
- Experience with Data Warehousing and Big Data technologies, specifically within AWS.
Additional Skills:
- Experience with AWS Lambda for serverless data processing and orchestration.
- Understanding of AWS Redshift for data warehousing and analytics.
- Familiarity with Data Lakes, Amazon EMR, and Kinesis for streaming data processing.
- Knowledge of data governance practices, including data lineage and auditing.
- Familiarity with CI/CD pipelines and Git for version control.
- Experience with Docker and containerization for building and deploying applications.
- Design and Build Data Pipelines: Design, implement, and optimize data pipelines on AWS using PySpark, AWS Glue, and AWS Data Pipeline to automate data integration, transformation, and storage processes (see the Glue job sketch below).
- ETL Development: Develop and maintain Extract, Transform, and Load (ETL) processes using AWS Glue and PySpark to efficiently process large datasets.
- Data Workflow Automation: Build and manage automated data workflows using AWS Data Pipeline, ensuring seamless scheduling, monitoring, and management of data jobs.
- Data Integration: Work with different AWS data storage services (e.g., S3, Redshift, RDS) to ensure smooth integration and movement of data across platforms.
- Optimization and Scaling: Optimize and scale data pipelines for high performance and cost efficiency, utilizing AWS services like Lambda, S3, and EC2.
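As a rough picture of the Glue-based ETL work described above, here is a minimal PySpark Glue job script. The catalog database, table, and S3 path are placeholders, and the transformation is deliberately trivial; a real job would carry the project's own business logic.

```python
import sys
from awsglue.transforms import Filter
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

# Standard Glue job bootstrap; database, table, and bucket names are placeholders.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from the Glue Data Catalog (populated by a crawler).
source = glue_context.create_dynamic_frame.from_catalog(
    database="raw_db", table_name="events"
)

# Simple transformation: drop records without an event id.
cleaned = Filter.apply(frame=source, f=lambda rec: rec["event_id"] is not None)

# Write Parquet output to S3 for downstream consumption (e.g. Athena or Redshift Spectrum).
glue_context.write_dynamic_frame.from_options(
    frame=cleaned,
    connection_type="s3",
    connection_options={"path": "s3://example-curated-bucket/events/"},
    format="parquet",
)

job.commit()
```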
Posted 1 month ago
9.0 - 14.0 years
12 - 16 Lacs
Gurugram
Work from Office
1. AWS design experience to architect and implement AWS solutions.
2. Required proficiency in deploying, managing, and optimizing Amazon EKS.
3. Advanced skills in implementing infrastructure as code using Terraform, ensuring consistent and repeatable deployments.
4. Experience in implementing security best practices, including developer solutions, IAM roles, policies, and encryption.
5. Proficiency in setting up efficient logging, monitoring, and alerting solutions.
6. Experience in cost management, with the ability to optimize AWS costs and manage budgets effectively.
7. Experience in developing and managing scalable solutions using EKS, Istio, API Gateway, Lambda, and serverless computing.
Posted 1 month ago
14.0 - 19.0 years
8 - 12 Lacs
Hyderabad
Work from Office
- Strong proficiency in Java (8 or higher) and the Spring Boot framework.
- Hands-on experience with AWS services such as EC2, Lambda, API Gateway, S3, CloudFormation, DynamoDB, and RDS.
- Experience developing microservices and RESTful APIs.
- Understanding of cloud architecture and deployment strategies.
- Familiarity with CI/CD pipelines and tools such as Jenkins, GitHub Actions, or AWS CodePipeline.
- Knowledge of containerization (Docker) and orchestration tools (ECS/Kubernetes) is a plus.
- Experience with monitoring/logging tools like CloudWatch, ELK Stack, or Prometheus is desirable.
- Familiarity with security best practices for cloud-native apps (IAM roles, encryption, etc.).
Posted 1 month ago
10.0 - 15.0 years
12 - 17 Lacs
Hyderabad
Work from Office
8 years of hands-on experience in AWS, Kubernetes, Prometheus, CloudWatch, Splunk, Datadog, Terraform, scripting (Python/Go), and incident management.
- Architect and manage enterprise-level databases with 24/7 availability.
- Lead efforts on optimization, backup, and disaster recovery planning.
- Design and manage scalable CI/CD pipelines for cloud-native apps.
- Automate infrastructure using Terraform/CloudFormation.
- Implement container orchestration using Kubernetes and ECS.
- Ensure cloud security, compliance, and cost optimization.
- Monitor performance and implement high-availability setups (see the monitoring sketch below).
- Collaborate with dev, QA, and security teams; drive architecture decisions.
- Troubleshoot and resolve issues in a timely manner, ensuring optimal system performance.
- Keep up to date with industry trends and advancements, incorporating best practices into our development processes.
- Bachelor's or Master's degree in Computer Science or a related field.
- Solid understanding of AWS services, including but not limited to EC2, Lambda, S3, and RDS.
- Experience with Kafka for building event-driven architectures.
- Strong database skills, including SQL and NoSQL databases.
- Familiarity with containerization and orchestration tools (Docker, Kubernetes).
- Excellent problem-solving and troubleshooting skills.
- Good to have: TM Vault core banking knowledge.
- Strong communication and collaboration skills.
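To make the monitoring/high-availability duties concrete, the sketch below creates a CloudWatch CPU alarm with boto3. It is a minimal example under assumed names: the instance ID and SNS topic ARN are placeholders, and region and credentials are taken from the environment.

```python
import boto3

# Hypothetical instance id and SNS topic; region/credentials come from the environment.
cloudwatch = boto3.client("cloudwatch", region_name="ap-south-1")

cloudwatch.put_metric_alarm(
    AlarmName="high-cpu-web-01",
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
    Statistic="Average",
    Period=300,                # 5-minute datapoints
    EvaluationPeriods=3,       # alarm after 15 minutes of sustained breach
    Threshold=80.0,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:ap-south-1:123456789012:ops-alerts"],
)
```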
Posted 1 month ago
10.0 - 15.0 years
12 - 16 Lacs
Gurugram
Work from Office
1. Experience of working with AWS cloud services (e.g., S3, AWS Glue, Glue Data Catalog, Step Functions, Lambda, EventBridge, etc.).
2. Must have hands-on experience with DQ libraries for data quality checks (see the sketch below).
3. Proficiency in data modelling and database management.
4. Strong programming skills in Python, Unix, and ETL technologies like Informatica.
5. Experience of DevOps and Agile methodology and associated toolsets (including working with code repositories).
6. Knowledge of big data technologies like Hadoop and Spark.
7. Must have hands-on experience with reporting tools: Tableau, QuickSight, and MS Power BI.
8. Must have hands-on experience with databases like Postgres and MongoDB.
9. Experience of using industry-recognised frameworks; experience with StreamSets and Kafka is preferred.
10. Experience in data sourcing, including real-time data integration.
11. Proficiency in Snowflake Cloud and associated data migration from on-premise to cloud, with knowledge of databases like Snowflake, Azure Data Lake, and Postgres.
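For item 2, here is a hedged illustration of the kind of data-quality check such a pipeline might run, written as plain PySpark rather than any specific DQ library; the function, column name, and the assertion at the end are assumptions for the sketch.

```python
from pyspark.sql import DataFrame
from pyspark.sql import functions as F


def run_basic_dq_checks(df: DataFrame, key_column: str) -> dict:
    """Return simple data-quality metrics: row count, null keys, duplicate keys."""
    total = df.count()
    null_keys = df.filter(F.col(key_column).isNull()).count()
    duplicate_keys = total - df.dropDuplicates([key_column]).count()
    return {
        "row_count": total,
        "null_key_count": null_keys,
        "duplicate_key_count": duplicate_keys,
    }


# Example usage: fail the pipeline run if any keys are missing.
# metrics = run_basic_dq_checks(spark.table("staging.customers"), "customer_id")
# assert metrics["null_key_count"] == 0, f"DQ failure: {metrics}"
```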
Posted 1 month ago
5.0 - 10.0 years
7 - 12 Lacs
Hyderabad
Work from Office
8+ years of Java, J2EE, and microservices experience, working as a Tech Lead for at least 3 years.
- Hands-on experience in developing Java microservices applications.
- Should be able to demonstrate expertise in J2EE design patterns, with the ability to provide design solutions and implement reusable Java APIs.
- Guide the team in implementing solutions; review solutions and code.
- Very strong hands-on development experience in Java 8 features such as generics, exception handling, the Collections API, functional interfaces, multithreading, lambda expressions, the Stream API, etc.
- Expertise in implementing solutions using Spring, Spring Boot, and microservices for at least 2-3 years.
- Mandatory to have knowledge of deploying Java and microservices components in an ECS environment.
- Experience in writing basic Oracle PL/SQL queries.
- Good to have: Angular, CSS, banking domain, capital markets.
Posted 1 month ago
8.0 - 13.0 years
9 - 13 Lacs
Hyderabad
Work from Office
Design, develop, test, and deploy scalable and resilient microservices using Java and Spring Boot. Collaborate with cross-functional teams to define, design, and ship new features. Work on the entire software development lifecycle, from concept and design to testing and deployment. Implement and maintain AWS cloud-based solutions, ensuring high performance, security, and scalability. Integrate microservices with Kafka for real-time data streaming and event-driven architecture. Troubleshoot and resolve issues in a timely manner, ensuring optimal system performance. Keep up to date with industry trends and advancements, incorporating best practices into our development processes.
- Should be a Java full-stack developer.
- Bachelor's or Master's degree in Computer Science or a related field.
- 6+ years of hands-on experience in Java full stack (Angular + Java Spring Boot).
- Proficiency in Spring Boot and other Spring Framework components.
- Extensive experience in designing and developing RESTful APIs.
- Solid understanding of AWS services, including but not limited to EC2, Lambda, S3, and RDS.
- Experience with Kafka for building event-driven architectures.
- Strong database skills, including SQL and NoSQL databases.
- Familiarity with containerization and orchestration tools (Docker, Kubernetes).
- Excellent problem-solving and troubleshooting skills.
- Strong communication and collaboration skills.
Posted 1 month ago
8.0 - 13.0 years
10 - 15 Lacs
Bengaluru
Work from Office
- Strong proficiency in Java (8 or higher) and the Spring Boot framework.
- Hands-on experience with AWS services such as EC2, Lambda, API Gateway, S3, CloudFormation, DynamoDB, and RDS.
- Experience developing microservices and RESTful APIs.
- Understanding of cloud architecture and deployment strategies.
- Familiarity with CI/CD pipelines and tools such as Jenkins, GitHub Actions, or AWS CodePipeline.
- Knowledge of containerization (Docker) and orchestration tools (ECS/Kubernetes) is a plus.
- Experience with monitoring/logging tools like CloudWatch, ELK Stack, or Prometheus is desirable.
- Familiarity with security best practices for cloud-native apps (IAM roles, encryption, etc.).
Responsibilities:
- Develop and maintain robust backend services and RESTful APIs using Java and Spring Boot.
- Design and implement microservices that are scalable, maintainable, and deployable in AWS.
- Integrate backend systems with AWS services including but not limited to Lambda, S3, DynamoDB, RDS, SNS/SQS, and CloudFormation.
- Collaborate with product managers, architects, and other developers to deliver end-to-end features.
- Participate in code reviews, design discussions, and agile development processes.
Posted 1 month ago
7.0 - 12.0 years
9 - 14 Lacs
Bengaluru
Work from Office
P2-C2-STS
You are passionate about driving an SRE/DevSecOps mindset and culture in a fast-paced, challenging environment, where you get the opportunity to work with a spectrum of the latest tools and technologies to drive forward automation, observability, and CI/CD automation. You are actively looking to improve implemented solutions, understand the efficacy of collaboration, and work with cross-functional teams to build and improve the CI/CD pipeline and improve automation (reduce toil). As a member of this team, you possess the ability to inspire and leverage your experience to inject new knowledge and skills into an already high-performing team.
- Help identify areas of improvement, especially when it comes to observability, proactiveness, automation, and toil management.
- Strategic approach with clear objectives to improve system availability, performance optimization, and incident MTTR.
- Build and maintain reliable engineering systems using SRE and DevSecOps models, with special focus on event management (monitoring/alerts), self-healing, and reliability testing (see the self-healing sketch below).
- Strong programming skills with experience in API and webhook development using Dynatrace, GitHub workflows, Ansible, CDK, TypeScript/JavaScript, Python, Node.js, Ruby, PowerShell, and shell scripting languages.
- Strong understanding of cloud computing (AWS).
- Strong understanding of the SDLC and DevSecOps.
- Experience in CI/CD pipeline tools such as JIRA, GitHub, Bitbucket, Artifactory, Ansible, or equivalent.
- Working knowledge of Lambda, Glue, and CDK.
- Knowledge of cloud services: application integration, functions, cloud databases, data warehouse and analytics, machine learning, developer tools, security, and identity management.
- Knowledge of software development practices, concepts, and technology obtained through formal training and/or work experience.
- Knowledge of required programming languages; can code with minimum guidance.
- Understand functional aspects and technical behavior of the underlying operating system, development environment, and deployment practices.
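A minimal sketch of the self-healing idea referenced above: a Python Lambda function subscribed to an SNS topic that receives CloudWatch alarm notifications and reboots the offending EC2 instance. The alarm-with-InstanceId-dimension setup is an assumption for illustration, not a statement of this team's actual tooling.

```python
import json
import boto3

ec2 = boto3.client("ec2")


def lambda_handler(event, context):
    """Self-healing hook: reboot the instance named in a CloudWatch alarm delivered via SNS.

    Assumes the alarm was created with an InstanceId dimension, as in the
    monitoring example shown earlier on this page.
    """
    message = json.loads(event["Records"][0]["Sns"]["Message"])
    dimensions = message["Trigger"]["Dimensions"]
    instance_ids = [d["value"] for d in dimensions if d["name"] == "InstanceId"]

    if instance_ids:
        ec2.reboot_instances(InstanceIds=instance_ids)

    return {"rebooted": instance_ids}
```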
Posted 1 month ago
8.0 - 13.0 years
25 - 30 Lacs
Hyderabad
Work from Office
Immediate job openings for VMware - Pan India - Contract. Experience: 9+ years. Skill: VMware. Location: Pan India. Notice period: Immediate. Employment type: Contract.
- Expert in IaC using CloudFormation and Terraform (2-3 yrs).
- Experience with container services (K8s, EKS, Fargate).
- Very good understanding of cloud observability using CloudWatch.
- Experience in automation techniques and architectures using EventBridge, Systems Manager, and Lambda with VMware (3-4 yrs).
- Understanding of enterprise integration patterns and practices (like event-driven architectures) (4-5 yrs).
- Expert in software development practices, including source code management (Git), Agile principles, Scrum, and Kanban (6-7 yrs).
- Good knowledge of CI/CD practices and tools with VMware and Git (4-5 yrs).
- Experience in working with Ansible and Jenkins.
Posted 1 month ago
7.0 - 12.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Roles and Responsibilities:
- Collaborate with cross-functional teams to design and implement SAP SD solutions.
- Analyze business requirements and provide technical expertise to meet customer needs.
- Develop and maintain technical documentation for SAP SD projects.
- Provide training and support to end-users on SAP SD modules.
- Troubleshoot and resolve issues related to SAP SD implementation.
- Work closely with stakeholders to identify process improvements and optimize business operations.
Job Requirements:
- Strong knowledge of the SAP SD module, including sales order processing, delivery, and billing.
- Experience working with customers and understanding their business needs.
- Excellent communication and interpersonal skills.
- Ability to work independently and as part of a team.
- Strong analytical and problem-solving skills.
- Familiarity with industry-specific regulations and standards.
Posted 1 month ago
4.0 - 9.0 years
7 - 11 Lacs
Pune
Work from Office
Roles and Responsibilities:
- Design, implement, and manage scalable and secure cloud infrastructure on AWS.
- Collaborate with cross-functional teams to identify and prioritize project requirements.
- Develop and maintain technical documentation for cloud infrastructure and applications.
- Troubleshoot and resolve complex technical issues related to cloud infrastructure.
- Ensure compliance with industry standards and best practices for cloud security and governance.
- Participate in code reviews and contribute to the improvement of overall code quality.
Job Requirements:
- Strong understanding of cloud computing concepts, including IaaS, PaaS, and SaaS.
- Experience with AWS services such as EC2, RDS, S3, Lambda, and CloudFormation.
- Excellent problem-solving skills and ability to work in a fast-paced environment.
- Strong communication and collaboration skills, with experience working with distributed teams.
- Ability to design and implement scalable and secure cloud infrastructure.
- Experience with version control systems such as Git.
Posted 1 month ago
9.0 - 12.0 years
4 - 8 Lacs
Hyderabad
Work from Office
We have immediate openings for an AWS IAM Engineer.
JD:
Primary skills:
- Extensive experience with AWS services: IAM, S3, Glue, CloudFormation, and CloudWatch.
- In-depth understanding of AWS IAM policy evaluation for permissions and access control (see the policy-simulation sketch below).
- Proficient in using Bitbucket, Confluence, GitHub, and Visual Studio Code.
- Proficient in policy languages, particularly Rego scripting.
Good to have skills:
- Experience with the Wiz tool for security and compliance.
- Good programming skills in Python.
- Advanced knowledge of additional AWS services: ECS, EKS, Lambda, SNS, and SQS.
Roles & Responsibilities: Senior Developer on the Wiz team specializing in Rego and AWS.
Project Manager - One to Three Years; AWS CloudFormation - Four to Six Years; AWS IAM - Four to Six Years. PSP Defined SCU in Data Engineer.
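The IAM policy-evaluation skill can be exercised programmatically with the IAM policy simulator, as in this small boto3 sketch; the policy document, bucket, and object key are hypothetical, and a Rego/Wiz-based check would look different.

```python
import json
import boto3

iam = boto3.client("iam")

# Hypothetical policy: allow read-only access to a single bucket prefix.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:GetObject"],
        "Resource": "arn:aws:s3:::example-data-bucket/reports/*",
    }],
}

# Ask IAM how this policy would evaluate for specific actions and resources.
response = iam.simulate_custom_policy(
    PolicyInputList=[json.dumps(policy)],
    ActionNames=["s3:GetObject", "s3:PutObject"],
    ResourceArns=["arn:aws:s3:::example-data-bucket/reports/q1.csv"],
)

for result in response["EvaluationResults"]:
    # EvalDecision is one of: allowed, implicitDeny, explicitDeny.
    print(result["EvalActionName"], "->", result["EvalDecision"])
```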
Posted 1 month ago
6.0 - 9.0 years
4 - 8 Lacs
Hyderabad
Work from Office
- Excellent knowledge of C# with multithreading and async.
- Good command of OOP concepts, SOLID principles, and design patterns.
- Experience in REST APIs/microservices using ASP.NET Web API (ASP.NET MVC alone is not acceptable).
- Securing web applications using authentication tokens, certificates, OAuth, etc.
- Caching, distributed caching.
- Pub/Sub, queues/topics, message brokers (any one queuing system).
- SQL Server knowledge: joins, stored procedures, functions, writing complex queries.
- SSIS/SSRS.
- Experience with Lambda, CloudWatch, API Gateway, S3 buckets, EC2, SNS, SQS, ELB, Docker/Kubernetes, Kafka MQ, IAM, authorization and access control, SaaS, etc.
Posted 1 month ago
3.0 - 7.0 years
6 - 11 Lacs
Gurugram
Work from Office
Primary skills: .NET Core, C#, SQL, MVC, Angular, client-side scripting (HTML, CHTML, JavaScript, etc.), architecture experience in AWS, .NET Core, and serverless architecture; Domain-Driven Architecture, microservices, AWS Lambda, and cloud development.
Required candidate profile: 3+ years of .NET Core, Angular, SQL, Azure; application design and solutions (on-prem and cloud); development experience with AWS.
Posted 1 month ago
4.0 - 8.0 years
3 - 6 Lacs
Pune
Work from Office
Primary skills: Python, CloudFormation, Lambda, Storage Gateway, file shares, SQS, S3, event rules, API Gateway, CloudWatch, Aurora DB (a minimal Lambda/SQS/S3 sketch follows below).
Roles and Responsibilities:
- Gathering and understanding requirements.
- Creation and documentation of a design encompassing a multi-layered process, infrastructure, and IT services management system.
- CI/CD automation.
- Production support.
Python - Four to Six Years; Developer / Software Engineer - One to Three Years; AWS CloudFormation - Four to Six Years. PSP Defined SCU in Cloud_PE_ServiceNow Engineer.
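A minimal sketch of the serverless pattern these skills combine into: a Python Lambda handler that drains an SQS-triggered batch and lands each message in S3. The bucket name and key layout are assumptions; error handling, CloudWatch metrics, and the CloudFormation wiring are omitted for brevity.

```python
import json
import boto3

s3 = boto3.client("s3")
BUCKET = "example-landing-bucket"  # placeholder bucket name


def lambda_handler(event, context):
    """Persist each SQS message body to S3 as a JSON object."""
    for record in event["Records"]:
        body = json.loads(record["body"])
        key = f"incoming/{record['messageId']}.json"
        s3.put_object(Bucket=BUCKET, Key=key, Body=json.dumps(body))
    return {"processed": len(event["Records"])}
```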
Posted 1 month ago
6.0 - 10.0 years
15 - 25 Lacs
Bengaluru
Work from Office
Who We Are
At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward – always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities.
The Role
Are you ready to dive headfirst into the captivating world of data engineering at Kyndryl? As a Data Engineer, you'll be the visionary behind our data platforms, crafting them into powerful tools for decision-makers. Your role? Ensuring a treasure trove of pristine, harmonized data is at everyone's fingertips. As an AWS Data Engineer at Kyndryl, you'll be at the forefront of the data revolution, crafting and shaping data platforms that power our organization's success. This role is not just about code and databases; it's about transforming raw data into actionable insights that drive strategic decisions and innovation. In this role, you'll be engineering the backbone of our data infrastructure, ensuring the availability of pristine, refined data sets. With a well-defined methodology, critical thinking, and a rich blend of domain expertise, consulting finesse, and software engineering prowess, you'll be the mastermind of data transformation. Your journey begins by understanding project objectives and requirements from a business perspective, converting this knowledge into a data puzzle. You'll be delving into the depths of information to uncover quality issues and initial insights, setting the stage for data excellence. But it doesn't stop there. You'll be the architect of data pipelines, using your expertise to cleanse, normalize, and transform raw data into the final dataset: a true data alchemist. Armed with a keen eye for detail, you'll scrutinize data solutions, ensuring they align with business and technical requirements. Your work isn't just a means to an end; it's the foundation upon which data-driven decisions are made – and your lifecycle management expertise will ensure our data remains fresh and impactful. So, if you're a technical enthusiast with a passion for data, we invite you to join us in the exhilarating world of data engineering at Kyndryl. Let's transform data into a compelling story of innovation and growth.
Your Future at Kyndryl
Every position at Kyndryl offers a way forward to grow your career. We have opportunities that you won't find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here.
Who You Are
You're good at what you do and possess the required experience to prove it. However, equally as important – you have a growth mindset; keen to drive your own personal and professional development. You are customer-focused – someone who prioritizes customer success in their work. And finally, you're open and borderless – naturally inclusive in how you work with others.
Required Skills and Experience
• 10+ years of experience in data engineering with a minimum of 6 years on AWS.
• Proficiency in AWS data services, including S3, Redshift, DynamoDB, Glue, Lambda, and EMR.
• Strong SQL skills and experience with NoSQL databases on AWS.
• Programming skills in Python, Java, or Scala for data processing and ETL tasks.
• Solid understanding of data warehousing concepts, data modeling, and ETL best practices.
• Experience with machine learning model deployment on AWS SageMaker.
• Familiarity with data orchestration tools, such as Apache Airflow, AWS Step Functions, or AWS Data Pipeline.
• Excellent problem-solving and analytical skills with attention to detail.
• Strong communication skills and ability to collaborate effectively with both technical and non-technical stakeholders.
• Experience with advanced AWS analytics services such as Athena, Kinesis, QuickSight, and Elasticsearch.
• Hands-on experience with Amazon Bedrock and generative AI tools for exploring and implementing AI-based solutions.
• AWS Certifications, such as AWS Certified Big Data – Specialty, AWS Certified Machine Learning – Specialty, or AWS Certified Solutions Architect.
• Familiarity with CI/CD pipelines, containerization (Docker), and serverless computing concepts on AWS.
Preferred Skills and Experience
• Experience working as a Data Engineer and/or in cloud modernization.
• Experience in data modelling, to create a conceptual model of how data is connected and how it will be used in business processes.
• Professional certification, e.g., Open Certified Technical Specialist with Data Engineering Specialization.
• Cloud platform certification, e.g., AWS Certified Data Analytics – Specialty, Elastic Certified Engineer, Google Cloud Professional Data Engineer, or Microsoft Certified: Azure Data Engineer Associate.
• Understanding of social coding and Integrated Development Environments, e.g., GitHub and Visual Studio.
• Degree in a scientific discipline, such as Computer Science, Software Engineering, or Information Technology.
Being You
Diversity is a whole lot more than what we look like or where we come from; it's how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we're not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you – and everyone next to you – the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That's the Kyndryl Way.
What You Can Expect
With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees and support you and your family through the moments that matter – wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you; we want you to succeed so that together, we will all succeed.
Get Referred!
If you know someone that works at Kyndryl, when asked 'How Did You Hear About Us' during the application process, select 'Employee Referral' and enter your contact's Kyndryl email address.
Posted 1 month ago
5.0 - 9.0 years
9 - 13 Lacs
Gurugram, Bengaluru
Work from Office
About the Role: Grade Level (for internal use): 11
The Role: Lead Cloud Engineer
The Team: We are looking for a dynamic AWS Cloud Support Engineer to join our team, working across multiple AWS accounts to ensure seamless cloud operations. This is a varied role that requires deep technical expertise, strategic planning, and strong stakeholder communication. Collaboration is at the core of our team, so if you thrive in a fast-paced, problem-solving environment, we'd love to hear from you.
The Impact: Contribute significantly to the growth of the firm by developing innovative functionality in existing and new products and supporting and maintaining high-revenue products.
What's in it for you:
- A collaborative team culture that values innovation and problem-solving.
- Opportunity to work on diverse projects spanning multiple AWS accounts.
- A chance to shape cloud strategy and architecture in a growing organizational division.
- Actively supported in taking learning opportunities.
- Exciting open-door collaboration within the EDO Agentic AI experience.
Key Responsibilities:
- Architecture Planning: Design and refine AWS architectures to meet business needs, ensuring security, scalability, and cost-effectiveness.
- Cost Management: Keep an eye on infrastructure costs and recommendations; propose changes to stakeholders to reduce cloud spend and waste.
- Multi-Account Management: Oversee cloud environments across numerous AWS accounts, maintaining best practices for governance and security.
- Troubleshooting & Incident Response: Diagnose and resolve complex technical issues related to AWS services, infrastructure, and networking.
- Stakeholder Collaboration: Communicate effectively with teams across the organization, providing insights, technical recommendations, and status updates.
- Automation & Optimization: Develop scripts and tools to automate deployments, monitoring, and management processes.
- Security & Compliance: Ensure adherence to security policies and regulatory requirements within AWS environments.
- Continuous Improvement: Stay updated with AWS advancements and recommend improvements for existing cloud strategies.
Proven experience in AWS cloud infrastructure and services. Strong understanding of networking, security, and cloud architecture best practices. Proficiency in Terraform, CloudFormation, or other infrastructure-as-code tools is a plus. Hands-on experience with EC2, S3, RDS, Lambda, VPC, Bedrock, and other AWS services preferred. Ability to troubleshoot complex system and network issues across cloud environments. Excellent communication skills and the ability to work collaboratively in a team-oriented environment. AWS certifications (Solutions Architect, SysOps, or Developer) are preferred but not mandatory.
What's In It For You
Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress.
Our People:
Our Values: Integrity, Discovery, Partnership
At S&P Global, we focus on Powering Global Markets.
Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.
Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global.
- Health & Wellness: Health care coverage designed for the mind and body.
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country visit https://spgbenefits.com/benefit-summaries
Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.
Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, pre-employment training or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here.
-----------------------------------------------------------
Equal Opportunity Employer
S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.
US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf
-----------------------------------------------------------
IFTECH202.2 - Middle Professional Tier II (EEO Job Group)
Posted 1 month ago
10.0 - 15.0 years
27 - 32 Lacs
Noida, Gurugram
Work from Office
About the Role: Grade Level (for internal use): 13
What we are looking for: We are looking for a Lead Full-stack Developer with expertise in both backend and frontend development to play a crucial role in the planning, preparation, and development of software for a key platform. This position requires an experienced engineer who can work effectively both independently and collaboratively within a team setting. The ideal candidate should be focused on delivery and possess experience and skills in financial applications.
What's in it for you: The selected candidate will assist in the design, development, and maintenance of solutions across multiple technology platforms supported by the technology team.
Responsibilities:
- Design and develop a Java full-stack platform.
- Migrate existing applications to Java-based microservices, deploying them using Docker and containers.
- Engage in various software development processes, including requirement analysis, design, coding, testing, and documentation.
- Develop software applications based on clear business specifications.
- Work on new initiatives while supporting existing Index applications.
- Conduct application and system performance tuning and troubleshoot performance issues.
- Build applications using object-oriented concepts and apply design patterns.
- Set up development environments/sandboxes for application development.
- Perform unit testing of application code and resolve errors.
- Interface with databases to extract information and create reports.
- Effectively communicate with customers, business users, and IT staff.
Qualifications:
- A Bachelor's degree in Computer Science, Information Systems, or Engineering is required, or demonstrated equivalence in work experience.
- Over 10 years of experience in designing, developing, testing, and successfully deploying critical and complex projects.
- Strong Java skills with experience in developing concurrent and distributed systems.
- Hands-on experience with HTML, CSS, Bootstrap, JavaScript, jQuery, and Angular or React.
- Advanced experience with Spring-based technologies (Spring Boot, Spring Cloud, etc.) and caching frameworks like Hazelcast.
- Experience in designing and implementing microservices-based solutions.
- Proficiency in writing unit and integration tests.
- Experience in writing SQL queries and a solid understanding of data models.
- Good understanding of AWS cloud services (EC2, ECS, Load Balancer, Security Group, Lambda, S3, etc.).
- Experience in DevOps development and deployment using Docker and containers.
- Strong knowledge of infrastructure and exposure to CI/CD.
- Strong analytical and problem-solving skills.
About S&P Global Market Intelligence
At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence.
What's In It For You
Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow.
At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress.
Our People:
Our Values: Integrity, Discovery, Partnership
At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.
Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global.
- Health & Wellness: Health care coverage designed for the mind and body.
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country visit https://spgbenefits.com/benefit-summaries
Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.
Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, pre-employment training or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here.
-----------------------------------------------------------
Equal Opportunity Employer
S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.
US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law.
Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf ----------------------------------------------------------- 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.2 - Middle Professional Tier II (EEO Job Group), SWP Priority Ratings - (Strategic Workforce Planning)
Posted 1 month ago
6.0 - 11.0 years
3 - 7 Lacs
Hyderabad
Work from Office
- Develop automation scripts and infrastructure as code to deploy and manage cloud-based services.
- Create technical artifacts related to the assets built.
- Work with the TEST/SRE team to resolve identified bugs and issues based on criticality.
- Ensure code is written with a security-first principle and adherence to cloud design patterns.
- Very good knowledge of the AWS platform (2-3 yrs).
- Experience in IaC using Terraform and AWS CloudFormation (2-3 yrs).
- Experience with scripting in PowerShell, Python, and Bash (1-2 yrs).
- Experience in building cloud automation using Lambda and EventBridge (2-3 yrs); see the sketch below.
- Proficient, hands-on experience with the Atlassian stack (Jira, Confluence).
- Experience with container services (K8s, EKS, Fargate).
- Knowledge of Agile/Lean methodologies (3-4 yrs).
- Good knowledge of CI/CD practices and tools (AWS/GitLab/GitHub) (1-2 yrs).
- Knowledge of working with Ansible and Jenkins.
- Must have AWS Developer certification.
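As an example of "cloud automation using Lambda and EventBridge", the sketch below shows a Python Lambda that an EventBridge schedule could invoke to stop running EC2 instances carrying an auto-stop tag. The tag convention and schedule are assumptions for illustration, not part of the listing.

```python
import boto3

ec2 = boto3.client("ec2")


def lambda_handler(event, context):
    """Scheduled by an EventBridge rule (e.g. cron(0 19 * * ? *)) to stop
    non-production instances overnight. The auto-stop tag is a convention
    assumed for this sketch."""
    reservations = ec2.describe_instances(
        Filters=[
            {"Name": "tag:auto-stop", "Values": ["true"]},
            {"Name": "instance-state-name", "Values": ["running"]},
        ]
    )["Reservations"]

    instance_ids = [
        inst["InstanceId"] for res in reservations for inst in res["Instances"]
    ]
    if instance_ids:
        ec2.stop_instances(InstanceIds=instance_ids)
    return {"stopped": instance_ids}
```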
Posted 1 month ago
8.0 - 12.0 years
7 - 10 Lacs
Hyderabad
Work from Office
Responsible for leading the team, with good experience in developing and architecting applications in Angular and Java Spring Boot.
- Responsible for the design of the overall solution and holistic ecosystem of the Manufacturing portfolio.
- Collaborates with all stakeholders to ensure the architecture is aligned with business requirements and FCA security and architecture requirements.
- Acts as a consultant to both business and the feature architects on theoretical and actual solutions for projects.
- Designs the end-to-end solution for assigned features.
- Develops high-level sequence flows and data flow diagrams with respect to the overall solution in terms of patterns and guidelines.
- Develops sequence flows and data flow diagrams at the feature level, supporting the FA during the design.
- Provides technical leadership to the application development team.
- Responsible for developing DFMEAs.
- Responsible for infrastructure design and deployment.
- Leads the feature feasibility discussion from the architecture perspective.
- Scrutinizes project constraints to analyze alternatives, mitigate risks, and conduct process re-engineering as necessary.
- Selects the technology stack and performs a resource evaluation.
- Complies with strategic guidelines and architecture constraints.
- Works with all the stakeholder teams in an Agile way to ensure timely progress and remove blockers as needed.
- Responsible for the detailed design (API and parameter level) of the feature.
- Works with the requirements team (Feature Product Owners) in assimilating feature requirements.
- Leads interface design in collaboration with all the vendor stakeholders and the Solution Architect, for all components.
Bachelor's degree in a computer-related field such as Computer Science or a related degree.
Essential:
- 8-10 years of relevant work experience in designing and building applications/platforms, APIs, and/or large/complex systems.
- 8-10 years of experience in Java/J2EE.
- Experience in Angular and Spring Boot.
- Operating systems (Unix/Linux/Windows).
- Infrastructure and engineering design.
- DevOps (Jenkins/CI-CD/Artifactory/AWS tools).
- Basic security knowledge of networks and software best practices.
- System analysis and evaluation.
- Database management.
- Cloud development (AWS preferred).
- Java with experience in Spring, Spring Boot, and other frameworks.
- Strong analytical and problem-solving skills to solve technical challenges and collect, organize, and analyze data.
- Strong communication and interpersonal skills to effectively guide and mentor employees.
Desirable:
- Certified (CA) architect preferred.
- Kubernetes experience preferred.
- Expertise in project monitoring software such as JIRA, Confluence, etc.
- Strong presentation skills preferred.
- System architecture, including UML diagrams and sequence diagrams.
- Experience with Java, Linux, AWS (EC2, Lambda, Dynamo, networking), databases, cloud architecture, basic knowledge of PKI and security schemes, and network systems including IP addressing, firewalls, and other basic infrastructure.
English. Overlap with US up to 12 noon EST; collaborating with North American stakeholders.
Posted 1 month ago
0.0 - 5.0 years
2 - 7 Lacs
Gurugram
Work from Office
Company: Oliver Wyman
Description:
Role: Data Engineer
Who We Are
Oliver Wyman is a global leader in management consulting. With offices in 50+ cities across 30 countries, Oliver Wyman combines deep industry knowledge with specialized expertise in strategy, finance, operations, technology, risk management, and organizational transformation. Our 4000+ professionals help clients optimize their business, improve their IT, operations, and risk profile, and accelerate their organizational performance to seize the most attractive opportunities. Our professionals see what others don't, challenge conventional thinking, and consistently deliver innovative, customized solutions. As a result, we have a tangible impact on clients' top and bottom lines. Our clients are the CEOs and executive teams of the top global 1000 companies. Oliver Wyman is a business of Marsh McLennan [NYSE: MMC]. For more information, visit www.oliverwyman.com. Follow Oliver Wyman on Twitter @OliverWyman.
Practice Overview
Practice: Data and Analytics (DNA) - Analytics Consulting
Location: Gurugram, India
At Oliver Wyman DNA, we partner with clients to solve tough strategic business challenges with the power of analytics, technology, and industry expertise. We drive digital transformation, create customer-focused solutions, and optimize operations for the future. Our goal is to achieve lasting results in collaboration with our clients and stakeholders. We value and offer opportunities for personal and professional growth. Join our entrepreneurial team focused on delivering impact globally.
Our Mission and Purpose
Mission: Leverage India's high-quality talent to provide exceptional analytics-driven management consulting services that empower clients globally to achieve their business goals and drive sustainable growth, by working alongside Oliver Wyman consulting teams.
Purpose: Our purpose is to bring together a diverse team of the highest-quality talent, equipped with innovative analytical tools and techniques to deliver insights that drive meaningful impact for our global client base. We strive to build long-lasting partnerships with clients based on trust, mutual respect, and a commitment to deliver results. We aim to build a dynamic and inclusive organization that attracts and retains the top analytics talent in India and provides opportunities for professional growth and development. Our goal is to provide a sustainable work environment while fostering a culture of innovation and continuous learning for our team members.
The Role and Responsibilities
We have open positions ranging from Associate Data Engineer to Lead Data Engineer, providing talented and motivated professionals with excellent career and growth opportunities. We seek individuals with relevant prior experience in quantitatively intense areas to join our team. You'll be working with varied and diverse teams to deliver unique and unprecedented solutions across all industries. In the data engineering track, you will be primarily responsible for developing and monitoring high-performance applications that can rapidly deploy the latest machine learning frameworks and other advanced analytical techniques at scale. This role requires you to be a proactive learner and quickly pick up new technologies whenever required. Most of the projects require handling big data, so you will be required to work on related technologies extensively. You will work closely with other team members to support project delivery and ensure client satisfaction.
Your responsibilities will include:
- Working alongside Oliver Wyman consulting teams and partners, engaging directly with clients to understand their business challenges.
- Exploring large-scale data and designing, developing, and maintaining data/software pipelines and ETL processes for internal and external stakeholders.
- Explaining, refining, and developing the necessary architecture to guide stakeholders through the journey of model building.
- Advocating the application of best practices in data engineering, code hygiene, and code reviews.
- Leading the development of proprietary data engineering assets, ML algorithms, and analytical tools on varied projects.
- Creating and maintaining documentation to support stakeholders and runbooks for operational excellence.
- Working with partners and principals to shape proposals that showcase our data engineering and analytics capabilities.
- Travelling to clients' locations across the globe, when required, understanding their problems, and delivering appropriate solutions in collaboration with them.
- Keeping up with emerging state-of-the-art data engineering techniques in your domain.
Your Attributes, Experience & Qualifications
- Bachelor's or master's degree in a computational or quantitative discipline from a top academic program (Computer Science, Informatics, Data Science, or related).
- Exposure to building cloud-ready applications.
- Exposure to test-driven development and integration.
- Pragmatic and methodical approach to solutions and delivery with a focus on impact.
- Independent worker with the ability to manage workload and meet deadlines in a fast-paced environment.
- Collaborative team player.
- Excellent verbal and written communication skills and command of English.
- Willingness to travel.
- Respect for confidentiality.
Technical Background
- Prior experience in designing and deploying large-scale technical solutions.
- Fluency in modern programming languages (Python is mandatory; R, SAS desired).
- Experience with AWS/Azure/Google Cloud, including familiarity with services such as S3, EC2, Lambda, Glue.
- Strong SQL skills and experience with relational databases such as MySQL, PostgreSQL, or Oracle.
- Experience with big data tools like Hadoop, Spark, Kafka.
- Demonstrated knowledge of data structures and algorithms.
- Familiarity with version control systems like GitHub or Bitbucket.
- Familiarity with modern storage and computational frameworks.
- Basic understanding of agile methodologies such as CI/CD, Applicant Resiliency, and Security.
Valued but not required
- Compelling side projects or contributions to the Open-Source community.
- Prior experience with machine learning frameworks (e.g., Scikit-Learn, TensorFlow, Keras/Theano, Torch, Caffe, MxNet).
- Familiarity with containerization technologies, such as Docker and Kubernetes.
- Experience with UI development using frameworks such as Angular, Vue, or React.
- Experience with NoSQL databases such as MongoDB or Cassandra.
- Experience presenting at data science conferences and connections within the data science community.
- Interest/background in Financial Services in particular, as well as other sectors where Oliver Wyman has a strategic presence.
Interview Process
The application process will include testing technical proficiency, a case study, and team-fit interviews. Please include a brief note introducing yourself, what you're looking for when applying for the role, and your potential value-add to our team.
Roles and levels
We are hiring for engineering roles across levels, from Associate Data Engineer to Lead Data Engineer, for experience ranging from 0-8 years. In addition to the base salary, this position may be eligible for performance-based incentives. We offer a competitive total rewards package that includes comprehensive health and welfare benefits as well as employee assistance programs. Oliver Wyman is an equal-opportunity employer. Our commitment to diversity is genuine, deep, and growing. We're not perfect, but we're working hard right now to make our teams balanced, representative, and diverse. Marsh McLennan and its Affiliates are EOE Minority/Female/Disability/Vet/Sexual Orientation/Gender Identity employers.
Posted 1 month ago