5.0 - 7.0 years
7 - 9 Lacs
Gurugram
Work from Office
Practice Overview
Practice: Data and Analytics (DNA) - Analytics Consulting
Role: Associate Director - Data & Analytics
Location: Gurugram, India
At Oliver Wyman DNA, we partner with clients to solve tough strategic business challenges with the power of analytics, technology, and industry expertise. We drive digital transformation, create customer-focused solutions, and optimize operations for the future. Our goal is to achieve lasting results in collaboration with our clients and stakeholders. We value and offer opportunities for personal and professional growth. Join our entrepreneurial team focused on delivering impact globally.
Our Mission and Purpose
Mission: Leverage India's high-quality talent to provide exceptional analytics-driven management consulting services that empower clients globally to achieve their business goals and drive sustainable growth, by working alongside Oliver Wyman consulting teams.
Purpose: Our purpose is to bring together a diverse team of the highest-quality talent, equipped with innovative analytical tools and techniques to deliver insights that drive meaningful impact for our global client base. We strive to build long-lasting partnerships with clients based on trust, mutual respect, and a commitment to deliver results. We aim to build a dynamic and inclusive organization that attracts and retains the top analytics talent in India and provides opportunities for professional growth and development. Our goal is to provide a sustainable work environment while fostering a culture of innovation and continuous learning for our team members.
The Role and Responsibilities
We are looking to hire an Associate Director in the Data Science & Data Engineering track. We seek individuals with relevant prior experience in quantitatively intense areas to join our team. You'll be working with varied and diverse teams to deliver unique and unprecedented solutions across all industries.
In the data scientist track, you will be primarily responsible for managing and delivering analytics projects and helping teams design analytics solutions and models that consistently drive scalable, high-quality outcomes. In the data engineering track, you will be primarily responsible for developing and monitoring high-performance applications that can rapidly deploy the latest machine learning frameworks and other advanced analytical techniques at scale. This role requires you to be a proactive learner and to pick up new technologies quickly whenever required. Most projects involve handling big data, so you will work extensively with related technologies. You will work closely with other team members to support project delivery and ensure client satisfaction.
Your responsibilities will include:
- Working alongside Oliver Wyman consulting teams and partners, engaging directly with global clients to understand their business challenges
- Exploring large-scale data and crafting models to answer core business problems
- Working with partners and principals to shape proposals that showcase our data science and analytics capabilities
- Explaining, refining, and crafting model insights and architecture to guide stakeholders through the journey of model building
- Advocating best practices in modelling and code hygiene
- Leading the development of proprietary statistical techniques, ML algorithms, assets, and analytical tools on varied projects
- Travelling to clients' locations across the globe, when required, understanding their problems, and delivering appropriate solutions in collaboration with them
- Keeping up with emerging state-of-the-art modelling and data science techniques in your domain
Your Attributes, Experience & Qualifications
- Bachelor's or Master's degree in a quantitative discipline from a top academic program (Data Science, Mathematics, Statistics, Computer Science, Informatics, or Engineering)
- Prior experience in data science, machine learning, and analytics
- Passion for problem-solving through big data and analytics
- Pragmatic and methodical approach to solutions and delivery, with a focus on impact
- Independent worker with the ability to manage workload and meet deadlines in a fast-paced environment
- Impactful presentation skills that succinctly and efficiently convey findings, results, strategic insights, and implications
- Excellent verbal and written communication skills and complete command of English
- Willingness to travel
- Collaborative team player
- Respect for confidentiality
Technical Background (Data Science)
- Proficiency in modern programming languages (Python is mandatory; SQL, R, SAS desired) and machine learning frameworks (e.g., Scikit-Learn, TensorFlow, Keras/Theano, Torch, Caffe, MXNet)
- Prior experience in designing and deploying large-scale technical solutions leveraging analytics
- Solid foundational knowledge of the mathematical and statistical principles of data science
- Familiarity with cloud storage, handling big data, and computational frameworks
Valued but not required:
- Compelling side projects or contributions to the open-source community
- Experience presenting at data science conferences and connections within the data science community
- Interest/background in Financial Services in particular, as well as other sectors where Oliver Wyman has a strategic presence
Technical Background (Data Engineering)
- Prior experience in designing and deploying large-scale technical solutions
- Fluency in modern programming languages (Python is mandatory; R, SAS desired)
- Experience with AWS/Azure/Google Cloud, including familiarity with services such as S3, EC2, Lambda, Glue
- Strong SQL skills and experience with relational databases such as MySQL, PostgreSQL, or Oracle
- Experience with big data tools like Hadoop, Spark, Kafka
- Demonstrated knowledge of data structures and algorithms
- Familiarity with version control systems like GitHub or Bitbucket
- Familiarity with modern storage and computational frameworks
- Basic understanding of agile methodologies such as CI/CD, application resiliency, and security
Valued but not required:
- Compelling side projects or contributions to the open-source community
- Prior experience with machine learning frameworks (e.g., Scikit-Learn, TensorFlow, Keras/Theano, Torch, Caffe, MXNet)
- Familiarity with containerization technologies, such as Docker and Kubernetes
- Experience with UI development using frameworks such as Angular, Vue, or React
- Experience with NoSQL databases such as MongoDB or Cassandra
- Experience presenting at data science conferences and connections within the data science community
- Interest/background in Financial Services in particular, as well as other sectors where Oliver Wyman has a strategic presence
Interview Process
The application process will include testing technical proficiency, a case study, and team-fit interviews. Please include a brief note introducing yourself, what you're looking for when applying for the role, and your potential value-add to our team.
Roles and levels
In addition to the base salary, this position may be eligible for performance-based incentives. We offer a competitive total rewards package that includes comprehensive health and welfare benefits as well as employee assistance programs.
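A role like this leans on disciplined model-evaluation habits. As a hedged, library-free illustration of the train/test hygiene implied above (all names, data, and the majority-class baseline are invented for the sketch; in practice scikit-learn provides these utilities):

```python
import random

def train_test_split(rows, test_frac=0.25, seed=42):
    """Shuffle and split rows into train/test partitions (toy sketch)."""
    rng = random.Random(seed)
    shuffled = rows[:]
    rng.shuffle(shuffled)
    n_test = int(len(shuffled) * test_frac)
    return shuffled[n_test:], shuffled[:n_test]

def accuracy(y_true, y_pred):
    """Fraction of matching labels."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

# Tiny illustration: evaluate a majority-class baseline on held-out data.
data = [(x, x >= 5) for x in range(20)]          # (feature, label) pairs
train, test = train_test_split(data)
majority = max({True, False}, key=[y for _, y in train].count)
preds = [majority for _ in test]
acc = accuracy([y for _, y in test], preds)
```

Keeping the split deterministic (a fixed seed) is part of the "code hygiene" the posting mentions: it makes model comparisons reproducible.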
Posted 1 month ago
6.0 - 11.0 years
9 - 13 Lacs
Chennai
Work from Office
About the Team: We are a motivated team in central R&D at CVS helping to change the game through product digitalization and vehicle intelligence. Our focus is on building solutions for truck, bus, and trailer OEMs, considering both onboard and offboard (SaaS & PaaS) needs and requirements.
Purpose:
- Connect the vehicle
- (Cyber)secure the vehicle
- Master the vehicle architecture
- Diagnose the vehicle
- Gain intelligence from the vehicle
What you can look forward to as Fullstack Developer:
- Design, develop, and deploy scalable applications using AWS Serverless (Lambda, API Gateway, DynamoDB, etc.) and container technologies (ECS, EKS, Fargate).
- Build and maintain RESTful APIs and microservices architectures in .NET Core (Entity Framework).
- Write clean, maintainable code in Node.js, JavaScript, C#, React JS, or React Native.
- Work with both SQL and NoSQL databases to design efficient data models.
- Apply Object-Oriented Analysis (OOA) and Object-Oriented Design (OOD) principles in software development.
- Utilize multi-threading and messaging patterns to build robust distributed systems.
- Collaborate using Git and follow Agile methodologies and Lean principles.
- Participate in code reviews and architecture discussions, and contribute to continuous improvement.
Your profile as Tech Lead:
- Bachelor's or Master's degree in Computer Science or a related field.
- Minimum 6+ years of hands-on software development experience.
- Strong understanding of AWS cloud hosting technologies and best practices.
- Proficiency in at least one of the following: Node.js, JavaScript, C#, React (JS / Native).
- Experience with REST APIs, microservices, and cloud-native application development.
- Familiarity with design patterns, messaging systems, and distributed architectures.
- Strong problem-solving skills and a passion for optimizing business solutions.
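The serverless stack described above (Lambda behind API Gateway) can be sketched as a minimal Python handler. The event/response shape follows API Gateway's Lambda proxy integration; the DynamoDB lookup is deliberately stubbed out (boto3 omitted so the sketch stays self-contained), and the path parameter name is invented:

```python
import json

def lambda_handler(event, context):
    """Minimal API Gateway (proxy integration) handler sketch.

    A real service would fetch the item from DynamoDB via boto3 here;
    this stub just echoes the path parameter to show the request/response shape.
    """
    item_id = (event.get("pathParameters") or {}).get("id", "unknown")
    body = {"id": item_id, "status": "ok"}
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(body),
    }

# Local invocation with a fabricated proxy event:
resp = lambda_handler({"pathParameters": {"id": "42"}}, None)
```

Returning `statusCode`/`headers`/`body` (with `body` as a JSON string) is what the proxy integration expects, which is why handlers like this are easy to unit-test without any AWS infrastructure.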
Posted 1 month ago
8.0 - 10.0 years
27 - 42 Lacs
Bengaluru
Work from Office
Job Summary:
Experience: 4 - 8 years
Location: Bangalore
The Data Engineer will contribute to building state-of-the-art data Lakehouse platforms in AWS, leveraging Python and Spark. You will be part of a dynamic team, building innovative and scalable data solutions in a supportive and hybrid work environment. You will design, implement, and optimize workflows using Python and Spark, contributing to our robust data Lakehouse architecture on AWS. Success in this role requires previous experience building data products using AWS services, familiarity with Python and Spark, problem-solving skills, and the ability to collaborate effectively within an agile team.
Must Have Tech Skills:
- Demonstrable previous experience as a data engineer.
- Technical knowledge of data engineering solutions and practices.
- Implementation of data pipelines using tools like EMR, AWS Glue, AWS Lambda, AWS Step Functions, API Gateway, and Athena.
- Proficient in Python and Spark, with a focus on ETL data processing and data engineering practices.
Nice To Have Tech Skills:
- Familiarity with data services in a Lakehouse architecture.
- Familiarity with technical design practices, allowing for the creation of scalable, reliable data products that meet both technical and business requirements.
- A master's degree or relevant certifications (e.g., AWS Certified Solutions Architect, Certified Data Analytics) is advantageous.
Key Accountabilities:
- Writes high-quality code, ensuring solutions meet business requirements and technical standards.
- Works with architects, Product Owners, and Development leads to decompose solutions into Epics, assisting the design and planning of these components.
- Creates clear, comprehensive technical documentation that supports knowledge sharing and compliance.
- Experience in decomposing solutions into components (Epics, stories) to streamline development.
- Actively contributes to technical discussions, supporting a culture of continuous learning and innovation.
Key Skills:
- Proficient in Python and familiar with a variety of development technologies.
- Previous experience implementing data pipelines, including use of ETL tools to streamline data ingestion, transformation, and loading.
- Solid understanding of AWS services and cloud solutions, particularly as they pertain to data engineering practices.
- Familiar with AWS solutions including IAM, Step Functions, Glue, Lambda, RDS, SQS, API Gateway, and Athena.
- Proficient in quality assurance practices, including code reviews, automated testing, and best practices for data validation.
- Experienced in Agile development, including sprint planning, reviews, and retrospectives.
Educational Background: Bachelor's degree in Computer Science, Software Engineering, or a related field is essential.
Bonus Skills:
- Financial Services expertise preferred, working with Equity and Fixed Income asset classes and a working knowledge of Indices.
- Familiar with implementing and optimizing CI/CD pipelines.
- Understands the processes that enable rapid, reliable releases, minimizing manual effort and supporting agile development cycles.
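As a rough, dependency-free sketch of the "transformation" stage in an ETL pipeline like the ones described above (a real job would run equivalent logic in Glue or Spark; the field names and cleaning rules are invented for illustration):

```python
from datetime import datetime, timezone

def transform(records):
    """Toy ETL step: drop malformed rows, normalise fields, stamp load time."""
    out = []
    for rec in records:
        if not rec.get("id") or rec.get("price") is None:
            continue  # data-quality gate: skip incomplete rows
        out.append({
            "id": str(rec["id"]).strip(),
            "price": round(float(rec["price"]), 2),
            "loaded_at": datetime.now(timezone.utc).isoformat(),
        })
    return out

raw = [
    {"id": " A1 ", "price": "10.5"},
    {"id": None, "price": 3},        # rejected: missing id
    {"id": "B2", "price": None},     # rejected: missing price
]
clean = transform(raw)
```

Keeping the transform a pure function over plain records is what makes the "automated testing and data validation" practices in the skills list cheap to apply.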
Posted 1 month ago
5.0 - 8.0 years
7 - 11 Lacs
Noida
Work from Office
Tech Stack: Java + Spring Boot; AWS (ECS, Lambda, EKS); Drools (preferred but optional); APIGEE; API observability, traceability, and security.
Skills Required:
- Strong ability to understand existing codebases, reengineer them, and build domain knowledge to some extent.
- Capability to analyze and integrate the new system's various interfaces with the existing APIs.
- Hands-on experience with Java, Spring, Spring Boot, AWS, and APIGEE.
- Familiarity with Drools is an added advantage.
- Ability to write and maintain JIRA stories (10-15% of the time) and keep existing technical specifications updated.
- Should take end-to-end ownership of the project, create designs, guide the team, and work independently on iterative tasks.
- Should proactively identify and highlight risks during daily scrum calls and provide regular updates.
Mandatory Competencies:
- Java - Core Java
- Others - Microservices (Java)
- Others - Spring Boot
- Cloud - AWS Lambda
- Cloud - Apigee
- Beh - Communication and collaboration
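Drools itself expresses business rules in DRL on the JVM; purely as a language-agnostic sketch of the condition/action evaluation a rules engine performs (rule names, facts, and thresholds are all invented here):

```python
# Each rule is (name, condition, action) over a mutable fact dictionary.
RULES = [
    ("high_value", lambda f: f["amount"] > 10_000,
     lambda f: f.setdefault("flags", []).append("manual_review")),
    ("foreign",    lambda f: f["country"] != "IN",
     lambda f: f.setdefault("flags", []).append("fx_check")),
]

def evaluate(fact):
    """Fire every rule whose condition matches the fact (single pass)."""
    for name, cond, action in RULES:
        if cond(fact):
            action(fact)
    return fact

result = evaluate({"amount": 25_000, "country": "US"})
```

The appeal of externalised rules, in Drools or any engine, is that conditions like these can change without touching the service code that calls `evaluate`.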
Posted 1 month ago
4.0 - 8.0 years
8 - 12 Lacs
Noida
Work from Office
Required Skills & Qualifications:
- 5-7 years of industry experience building and deploying machine learning models.
- Strong proficiency with machine learning algorithms, including XGBoost, linear regression, and classification models.
- Hands-on experience with AWS SageMaker for model development, training, and deployment.
- Solid programming skills in Python (and relevant libraries such as scikit-learn, pandas, NumPy, etc.).
- Strong understanding of model evaluation metrics, cross-validation, hyperparameter tuning, and performance optimization.
- Experience working with structured and unstructured datasets.
- Knowledge of best practices in model deployment and monitoring in a production environment (MLOps desirable).
- Familiarity with tools like Docker, Git, CI/CD pipelines, and AWS ML services.
- Excellent problem-solving skills, critical thinking, and attention to detail.
- Strong communication and collaboration skills.
Nice to Have:
- Experience with additional AWS services like Lambda, S3, Step Functions, CloudWatch.
- Exposure to deep learning frameworks like TensorFlow or PyTorch.
- Familiarity with DataOps practices and agile methodologies.
Mandatory Competencies:
- Data Science - Machine Learning
- Data Science - Python
- Python - NumPy
- Python - pandas
- Data Science - AWS SageMaker
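The evaluation concepts listed above (metrics, cross-validation) can be sketched without any ML library. In practice scikit-learn's `KFold` and metrics module do this work, but a toy stdlib version makes the mechanics explicit (the sample numbers are invented):

```python
import math

def rmse(y_true, y_pred):
    """Root mean squared error over paired observations."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

def kfold_indices(n, k):
    """Yield (train_idx, val_idx) pairs for k roughly equal contiguous folds."""
    fold = math.ceil(n / k)
    idx = list(range(n))
    for i in range(k):
        val = idx[i * fold:(i + 1) * fold]
        train = idx[:i * fold] + idx[(i + 1) * fold:]
        yield train, val

err = rmse([1, 2, 3], [1, 2, 5])          # one large miss dominates
folds = list(kfold_indices(6, 3))          # 3 folds over 6 samples
```

Averaging a metric like `rmse` across the folds is the cross-validation score that hyperparameter tuning then optimises.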
Posted 1 month ago
3.0 - 5.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Hello Visionary! We know that the only way a business thrives is if our people are growing. That's why we always put our people first. Our global, diverse team would be happy to support you and challenge you to grow in new ways. Who knows where our shared journey will take you?
We are looking for a Software Engineer - Python + AWS.
You'll make a difference by:
- Establishing, maintaining, and evolving concepts in continuous integration and deployment (CI/CD) pipelines for existing and new services.
- Collaborating with Engineering and Operations teams to improve automation of workflows, effective monitoring, infrastructure, code testing, scripting capabilities, and deployment with lower costs and reduced non-conformance costs.
- System troubleshooting and problem resolution across various applications.
- Participating in the on-call rotation.
- Conducting root cause analysis of incidents.
- Implementing enhancements to the monitoring solution to minimize false positives and identify service health regressions.
- Communicating findings in verbal and written format to the application team.
- Generating weekly data reports summarizing the health of the application.
You'll win us over by:
- A BE / B.Tech / MCA / ME / M.Tech qualification with 3-5 years of proven experience.
- Experience in Windows & Linux and networking & security topics (e.g., IAM, authorization).
- Awareness of DevOps principles, design patterns, enterprise architecture patterns, and microservice architecture, plus the ability to learn/use a wide variety of open-source technologies and tools.
- Expertise in, and enthusiasm for, working on large projects in an agile way (SAFe framework).
- Experience in AWS services: serverless services (Lambda, DynamoDB, API Gateway), container services (ECS, ECR), monitoring services (CloudWatch, X-Ray), orchestration tools (Kubernetes, Docker), security services (IAM, Secrets Manager), network services (VPC), EC2, Backup, S3, CDK, CloudFormation, Step Functions.
- Experience in scripting languages: Python, Bash.
Create a better #TomorrowWithUs! This role, based in Bangalore, is an individual contributor position. You may be required to visit other locations within India and internationally. In return, you'll have the opportunity to work with teams shaping the future. At Siemens, we are a collection of over 312,000 minds building the future, one day at a time, worldwide. We are dedicated to equality and welcome applications that reflect the diversity of the communities we serve. All employment decisions at Siemens are based on qualifications, merit, and business need. Bring your curiosity and imagination, and help us shape tomorrow. Find out more about Siemens careers at
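One common way to "minimize false positives" in monitoring, as the responsibilities above mention, is to alert only when N of the last M health probes fail rather than on a single blip. A minimal sketch (window and threshold sizes are arbitrary; a real setup would encode this as a CloudWatch alarm's datapoints-to-alarm setting):

```python
from collections import deque

class HealthMonitor:
    """Alert only when `threshold` of the last `window` probes fail."""

    def __init__(self, window=5, threshold=3):
        self.window = deque(maxlen=window)  # rolling record of probe results
        self.threshold = threshold

    def record(self, ok):
        """Record one probe result; return current alerting state."""
        self.window.append(ok)
        return self.failing()

    def failing(self):
        return list(self.window).count(False) >= self.threshold

mon = HealthMonitor(window=5, threshold=3)
states = [mon.record(ok) for ok in [True, False, True, False, False]]
```

A single failed probe leaves the monitor quiet; only the third failure inside the window trips the alert.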
Posted 1 month ago
5.0 - 10.0 years
10 - 14 Lacs
Pune
Work from Office
Hello Visionary! We know that the only way a business thrives is if our people are growing. That's why we always put our people first. Our global, diverse team would be happy to support you and challenge you to grow in new ways. Who knows where our shared journey will take you?
We are looking for a Golang + Angular Developer.
You'll make a difference by:
- Designing, developing, and maintaining robust backend services using Go, including RESTful APIs and microservices.
- Building and maintaining smaller frontend applications in Angular, supporting full-stack feature delivery.
- Operating, monitoring, and troubleshooting existing applications to ensure performance, scalability, and reliability.
- Contributing to the development of complex, composite applications in a distributed system.
- Leading and maintaining CI/CD pipelines and ensuring high code quality through Test-Driven Development (TDD).
- Utilizing container technologies like Docker and orchestration tools like Kubernetes (GitOps experience is a plus).
- Driving innovation by contributing new ideas and PoCs, or participating in internal hackathons.
You'll win us over by:
- Holding a BE / B.Tech / MCA / M.Tech / M.Sc degree with a good academic record.
- 5+ years of experience in software development with a strong focus on Go (Golang).
- Working experience in building and maintaining production-grade microservices and APIs.
- Strong grasp of cloud platforms (AWS), including services like Lambda, ECS, and S3.
- Hands-on experience with CI/CD, Git, and containerization (Docker).
- Working knowledge of Angular (intermediate or above) and full-stack technologies.
- Familiarity with distributed systems, message queues, and API design best practices.
- Experience with observability tools for logging, monitoring, and tracing.
- Passion for innovation and building quick PoCs in a startup-like environment.
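Robust distributed services of the kind described above typically wrap outbound calls in retries with exponential backoff. A minimal, self-contained sketch in Python (the role's backend language is Go, so this is only illustrative; the `flaky` function simulates a transient fault):

```python
import time

def retry(fn, attempts=4, base_delay=0.01, sleep=time.sleep):
    """Call fn, retrying with exponential backoff; re-raise after the last attempt."""
    for i in range(attempts):
        try:
            return fn()
        except Exception:
            if i == attempts - 1:
                raise
            sleep(base_delay * (2 ** i))  # 1x, 2x, 4x, ... the base delay

calls = {"n": 0}
def flaky():
    """Fails twice, then succeeds, mimicking a transient network error."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient")
    return "ok"

result = retry(flaky, sleep=lambda _: None)  # no real sleeping in the demo
```

Injecting `sleep` as a parameter keeps the helper unit-testable, which fits the TDD emphasis in the posting.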
Personal Attributes:
- Excellent problem-solving and communication skills; able to articulate technical ideas clearly to stakeholders.
- Adaptable to fast-paced environments, with a solution-oriented, startup mindset.
- Proactive and self-driven, with a strong sense of ownership and accountability.
- Actively seeks clarification and asks questions rather than waiting for instructions.
Create a better #TomorrowWithUs! This role, based in Pune, is an individual contributor position. You may be required to visit other locations within India and internationally. In return, you'll have the opportunity to work with teams shaping the future. At Siemens, we are a collection of over 312,000 minds building the future, one day at a time, worldwide. We are dedicated to equality and welcome applications that reflect the diversity of the communities we serve. All employment decisions at Siemens are based on qualifications, merit, and business need. Bring your curiosity and imagination, and help us shape tomorrow. Find out more about Siemens careers at
Posted 1 month ago
8.0 - 13.0 years
8 - 12 Lacs
Bengaluru
Work from Office
Hello Talented Techie! We provide support in Project Services and Transformation, Digital Solutions and Delivery Management. We offer joint operations and digitalization services for Global Business Services and work closely alongside the entire Shared Services organization. We make optimal use of the possibilities of new technologies such as Business Process Management (BPM) and Robotics as enablers for efficient and effective processes.
We are looking for a Sr. AWS Cloud Architect.
- Architect and Design: Develop scalable and efficient data solutions using AWS services such as AWS Glue, Amazon Redshift, S3, Kinesis (Apache Kafka), DynamoDB, Lambda, AWS Glue (Streaming ETL), and EMR.
- Integration: Integrate real-time data from various Siemens organizations into our data lake, ensuring seamless data flow and processing.
- Data Lake Management: Design and manage a large-scale data lake using AWS services like S3, Glue, and Lake Formation.
- Data Transformation: Apply various data transformations to prepare data for analysis and reporting, ensuring data quality and consistency.
- Snowflake Integration: Implement and manage data pipelines to load data into Snowflake, utilizing Iceberg tables for optimal performance and flexibility.
- Performance Optimization: Optimize data processing pipelines for performance, scalability, and cost-efficiency.
- Security and Compliance: Ensure that all solutions adhere to security best practices and compliance requirements.
- Collaboration: Work closely with cross-functional teams, including data engineers, data scientists, and application developers, to deliver end-to-end solutions.
- Monitoring and Troubleshooting: Implement monitoring solutions to ensure the reliability and performance of data pipelines; troubleshoot and resolve any issues that arise.
You'd describe yourself as:
- Experience: 8+ years of experience in data engineering or cloud solutioning, with a focus on AWS services.
- Technical Skills: Proficiency in AWS services such as AWS API, AWS Glue, Amazon Redshift, S3, Apache Kafka, and Lake Formation; experience with real-time data processing and streaming architectures.
- Big Data Querying Tools: Strong knowledge of big data querying tools (e.g., Hive, PySpark).
- Programming: Strong programming skills in languages such as Python, Java, or Scala for building and maintaining scalable systems.
- Problem-Solving: Excellent problem-solving skills and the ability to troubleshoot complex issues.
- Communication: Strong communication skills, with the ability to work effectively with both technical and non-technical stakeholders.
- Certifications: AWS certifications are a plus.
Create a better #TomorrowWithUs! This role, based in Bangalore, is an individual contributor position. You may be required to visit other locations within India and internationally. In return, you'll have the opportunity to work with teams shaping the future. At Siemens, we are a collection of over 312,000 minds building the future, one day at a time, worldwide. Find out more about Siemens careers at
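Real-time ingestion via Kinesis/Kafka, as described above, usually ends in windowed aggregation. A toy tumbling-window count in plain Python shows the idea (event timestamps and keys are invented; a production job would express this in Spark Structured Streaming or Glue streaming ETL rather than a dict):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs=60):
    """Aggregate (epoch_ts, key) events into per-window, per-key counts."""
    counts = defaultdict(int)
    for ts, key in events:
        bucket = ts - (ts % window_secs)   # floor timestamp to its window start
        counts[(bucket, key)] += 1
    return dict(counts)

events = [(0, "click"), (30, "click"), (61, "click"), (62, "view")]
agg = tumbling_window_counts(events)
```

Because each event maps to exactly one window, tumbling windows are the simplest aggregation to reason about when checking a pipeline's data quality and consistency.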
Posted 1 month ago
4.0 - 6.0 years
17 - 22 Lacs
Bengaluru
Work from Office
Hello talented techie! We know that the only way a business thrives is if our people are growing. That's why we always put our people first. Our global, diverse team would be happy to support you and challenge you to grow in new ways. Who knows where our shared journey will take you?
We are looking for a Senior DevOps Engineer.
You'll make a difference by:
- Being an SRE L1 Commander, responsible for ensuring the stability, availability, and performance of critical systems and services. As the first line of defense in incident management and monitoring, the role requires real-time response, proactive problem solving, and strong coordination skills to address production issues efficiently.
- Monitoring and Alerting: Proactively supervising system health, performance, and uptime using monitoring tools like Datadog and Prometheus.
- Serving as the primary responder for incidents, tackling and resolving issues quickly to ensure minimal impact on end-users.
- Accurately categorizing incidents, prioritizing them based on severity, and escalating to L2/L3 teams when necessary.
- Ensuring systems meet Service Level Objectives (SLOs) and maintain uptime as per SLAs.
- Collaborating with DevOps and L2 teams to automate manual processes for incident response and operational tasks.
- Performing root cause analysis (RCA) of incidents using log aggregators and observability tools to identify patterns and recurring issues.
- Following predefined runbooks/playbooks to resolve known issues and documenting fixes for new problems.
You'd describe yourself as:
- An experienced professional with 4 to 6 years of proven experience in SRE, DevOps, or Production Support with monitoring tools (e.g., Prometheus, Datadog).
- Having a proven understanding of Linux/Unix operating systems, basic scripting skills (Python, GitLab Actions), and cloud platforms (AWS, Azure, or GCP).
- Familiar with container orchestration (Kubernetes, Docker, Helm charts) and CI/CD pipelines.
- Having exposure to ArgoCD for implementing GitOps workflows and automated deployments for containerized applications.
- Experienced in Monitoring (Datadog), Infrastructure (AWS EC2, Lambda, ECS/EKS, RDS), Networking (VPC, Route 53, ELB), and Storage (S3, EFS, Glacier).
- Having strong analytical skills to resolve production incidents efficiently.
- Having a basic understanding of networking concepts (DNS, load balancers, firewalls).
- Having good communication and interpersonal skills for incident communication and issue resolution.
- Holding preferred certifications: AWS Certified SysOps Administrator - Associate, AWS Certified Solutions Architect - Associate, or AWS Certified DevOps Engineer - Professional.
Create a better #TomorrowWithUs! This role, based in Bangalore, is an individual contributor position. You may be required to visit other locations within India and internationally. In return, you'll have the opportunity to work with teams shaping the future. At Siemens, we are a collection of over 312,000 minds building the future, one day at a time, worldwide. We are dedicated to equality and welcome applications that reflect the diversity of the communities we serve. All employment decisions at Siemens are based on qualifications, merit, and business need. Bring your curiosity and imagination, and help us shape tomorrow. Find out more about Siemens careers at
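The SLO/SLA duties above boil down to error-budget arithmetic: a 99.9% availability SLO over a 30-day period allows roughly 43.2 minutes of downtime. A small sketch of that calculation (the 10-minute incident figure is invented for the example):

```python
def error_budget_minutes(slo, period_days=30):
    """Total allowed downtime (minutes) for an availability SLO over the period."""
    total_min = period_days * 24 * 60
    return total_min * (1 - slo)

def budget_remaining(slo, downtime_min, period_days=30):
    """Minutes of error budget left after the downtime already consumed."""
    return error_budget_minutes(slo, period_days) - downtime_min

budget = error_budget_minutes(0.999)               # 99.9% over 30 days
left = budget_remaining(0.999, downtime_min=10)    # after a 10-minute incident
```

Tracking `left` against the calendar is what tells an L1 responder whether an incident merely dents the budget or threatens the SLA.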
Posted 1 month ago
4.0 - 9.0 years
8 - 12 Lacs
Mumbai
Work from Office
Skill: Java Microservices, Spring Boot
Experience: 4-6 Yrs
Role: T3
Responsibilities:
- Strong proficiency in Java (8 or higher) and the Spring Boot framework.
- Basic foundation in AWS services such as EC2, Lambda, API Gateway, S3, CloudFormation, DynamoDB, and RDS.
- Experience developing microservices and RESTful APIs.
- Understanding of cloud architecture and deployment strategies.
- Familiarity with CI/CD pipelines and tools such as Jenkins, GitHub Actions, or AWS CodePipeline.
- Experience with monitoring/logging tools like CloudWatch, ELK Stack, or Prometheus is desirable.
- Familiarity with security best practices for cloud-native apps (IAM roles, encryption, etc.).
Posted 1 month ago
2.0 - 5.0 years
6 - 9 Lacs
Pune
Work from Office
Educational Qualification: Bachelor of Engineering, Bachelor of Technology, Bachelor of Comp. Applications, Master of Comp. Applications, Master of Technology, Master of Engineering, Bachelor of Science, Master of Science
Service Line: Engineering Services
Responsibilities: A day in the life of an Infoscion - As part of the Infosys consulting team, your primary role would be to actively aid the consulting team in different phases of the project, including problem definition, effort estimation, diagnosis, solution generation and design, and deployment. You will explore alternatives to the recommended solutions based on research that includes literature surveys, information available in public domains, vendor evaluation information, etc., and build POCs. You will create requirement specifications from the business needs, define the to-be processes, and produce detailed functional designs based on requirements. You will support configuring solution requirements on the products; understand any issues, diagnose their root cause, seek clarifications, and then identify and shortlist solution alternatives. You will also contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Additional Responsibilities:
- Ability to work with clients to identify business challenges and contribute to client deliverables by refining, analyzing, and structuring relevant data
- Awareness of the latest technologies and trends
- Logical thinking and problem-solving skills, along with an ability to collaborate
- Ability to assess current processes, identify improvement areas, and suggest technology solutions
- Knowledge of one or two industry domains
Technical and Professional Requirements:
Primary skills: Amazon Connect, Lex, Lambda, Python; Technology-Communication-IVR, CCT; Technology-Functional Testing-IVR Testing; Technology-Infrastructure-Contact Center-Contact Center model; Technology-Infrastructure-Contact Center-IVR Concepts
Preferred Skills:
- Technology-Infrastructure-Contact Center-Contact Center model
- Technology-Infrastructure-Contact Center-IVR Concepts
- Technology-Functional Testing-IVR Testing
- Technology-Communication-IVR/CCT
Posted 1 month ago
3.0 - 5.0 years
5 - 7 Lacs
Gurugram
Work from Office
About the Opportunity
Title: Senior Analyst Programmer - Java/AWS Cloud
Department: FIL India Technology - GPS
Location: Gurgaon, India
Reports To: Project Manager
Level: Level 3
We're proud to have been helping our clients build better financial futures for over 50 years. How have we achieved this? By working together - and supporting each other - all over the world. So, join our Reconciliation & Product Management Capability in the GPS business and feel like you're part of something bigger.
About your team
The UK Retail Group is charged with the development and operating model for Fidelity's Wrap platforms for the Personal Investing business and FundsNetworkTM, including implementation of new tax wrappers, web front end and service layer, interfaces, document composition and production, STP/automation, and migrations from legacy systems. Our objective is to be the supplier of choice for all IP services to our businesses, as well as having the clear mandate to identify the future opportunities that will maintain and extend Fidelity's lead in online financial services. The department is currently expanding in order to satisfy the high level of demand for both short-term tactical project delivery and the execution of a programme of strategic development which will see the complete reshaping of our platform capability and customer experience, through the replacement and enhancement of our existing platform capability which supports our growing direct and intermediated businesses in the region.
About your role
This role is for an experienced Java and AWS Cloud developer (minimum of 4 years' experience) to work within Retail Technology. The successful candidate will be expected to work on Java utilities of the Feed Processing Layer (that source the funds data from PODS/PHUB). This places the candidate in a front-line position with considerable business user interaction. As such, the role requires a high level of flexibility and the ability to work under pressure.
Excellent written and verbal communication skills are essential to communicate effectively and appropriately with both business users and systems colleagues at all levels. The role requires close working with colleagues in the UK offices, so the candidate may sometimes be required to work UK business hours. On occasion the candidate may also be required to provide on-call support outside normal business hours, and to undertake additional out-of-hours work to cover changes and release implementations. This demanding role would suit a dynamic individual looking to work in a fast-paced environment to ensure the smooth running of business-critical systems. About you The ideal candidate has 4 to 6 years of experience. Essential skills: Java, React, Spring Boot, JAXB; AWS services such as CloudFront, S3, Route 53, Cognito, ECS, Lambda, API Gateway; deep knowledge and experience of low/high-level design and writing unit-testable code; experience with source-control tools such as GitHub or Bitbucket; strong interpersonal, communication, and client-facing skills; ability to work closely with cross-functional teams. Desirable skills: understanding of XML; understanding of Unix commands. Business domain: exposure to the finance industry would be an advantage, as would a strong interest and willingness to understand the mutual funds business. Behavioural: strong interest in technology and its applications; self-motivation; team player; good interpersonal skills.
Posted 1 month ago
10.0 - 13.0 years
12 - 15 Lacs
Bengaluru
Work from Office
About the Opportunity Job Type: Application. Closing date: 31 July 2025. Title: Principal Data Engineer (Associate Director). Department: ISS. Location: Bangalore. Reports To: Head of Data Platform - ISS. Grade: 7. Department Description The ISS Data Engineering Chapter is an engineering group comprised of three sub-chapters - Data Engineers, Data Platform and Data Visualisation - that supports the ISS Department. Fidelity is embarking on several strategic programmes of work that will create a data platform to support the next evolutionary stage of our investment process. These programmes span asset classes and include Portfolio and Risk Management, Fundamental and Quantitative Research, and Trading. Purpose of your role This role sits within the ISS Data Platform Team, which is responsible for building and maintaining the platform that enables the ISS business to operate. The role suits a Lead Data Engineer capable of taking ownership of, and delivering, a subsection of the wider data platform. Key Responsibilities Design, develop and maintain scalable data pipelines and architectures to support data ingestion, integration and analytics. Be accountable for technical delivery and take ownership of solutions. Lead a team of senior and junior developers, providing mentorship and guidance. Collaborate with enterprise architects, business analysts and stakeholders to understand data requirements, validate designs and communicate progress. Drive technical innovation within the department to increase code reusability, code quality and developer productivity. Challenge the status quo by bringing the very latest data engineering practices and techniques.
Essential Skills and Experience Core Technical Skills: Expert in leveraging cloud-based data platform capabilities (Snowflake, Databricks) to create an enterprise lakehouse. Advanced expertise with the AWS ecosystem and experience using core AWS data services such as Lambda, EMR, MSK, Glue and S3. Experience designing event-based or streaming data architectures using Kafka. Advanced expertise in Python and SQL; open to expertise in Java/Scala, but enterprise experience of Python is required. Expert in designing, building and using CI/CD pipelines to deploy infrastructure (Terraform) and pipelines with test automation. Data Security & Performance Optimization: Experience implementing data access controls to meet regulatory requirements. Experience using both RDBMS (Oracle, Postgres, MSSQL) and NoSQL (DynamoDB, OpenSearch, Redis) offerings. Experience implementing CDC ingestion. Experience using orchestration tools (Airflow, Control-M, etc.). Bonus Technical Skills: Strong experience in containerisation and deploying applications to Kubernetes. Strong experience in API development using Python-based frameworks such as FastAPI. Key Soft Skills: Problem-Solving: Leadership experience in problem-solving and technical decision-making. Communication: Strong in strategic communication and stakeholder engagement. Project Management: Experienced in overseeing project lifecycles, working with project managers to manage resources. Feel rewarded For starters, we'll offer you a comprehensive benefits package. We'll value your wellbeing and support your development. And we'll be as flexible as we can about where and when you work, finding a balance that works for all of us. It's all part of our commitment to making you feel motivated by the work you do and happy to be part of our team. For more about our work, our approach to dynamic working and how you could build your future here, visit careers.fidelityinternational.com.
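The CDC (change data capture) ingestion experience asked for above can be sketched as applying an ordered stream of change events to a keyed target table. This plain-Python example is purely illustrative: the event shape, field names, and the in-memory "table" are assumptions standing in for a real CDC feed and warehouse merge.

```python
# Minimal sketch of CDC merge logic: apply an ordered stream of
# insert/update/delete events to a target table keyed by "id".
# Event and row shapes here are illustrative assumptions.

def apply_cdc_events(target, events):
    """Apply change events (dicts with 'op' and 'row') to `target`."""
    for ev in events:
        op, row = ev["op"], ev["row"]
        key = row["id"]
        if op in ("insert", "update"):
            # Upsert: merge changed fields over any existing record.
            target[key] = {**target.get(key, {}), **row}
        elif op == "delete":
            # Idempotent delete: a missing key is not an error.
            target.pop(key, None)
    return target

table = {1: {"id": 1, "name": "fund-a", "nav": 100.0}}
events = [
    {"op": "update", "row": {"id": 1, "nav": 101.5}},
    {"op": "insert", "row": {"id": 2, "name": "fund-b", "nav": 50.0}},
    {"op": "delete", "row": {"id": 3}},
]
result = apply_cdc_events(table, events)
```

In a production pipeline the same upsert/delete semantics would typically be expressed as a `MERGE` statement in Snowflake or Databricks rather than in application code.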
Posted 1 month ago
10.0 - 15.0 years
15 - 19 Lacs
Bengaluru
Work from Office
Experience: 8+ years of experience in data engineering, specifically in cloud environments like AWS. Proficiency in PySpark for distributed data processing and transformation. Solid experience with AWS Glue for ETL jobs and managing data workflows. Hands-on experience with AWS Data Pipeline (DPL) for workflow orchestration. Strong experience with AWS services such as S3, Lambda, Redshift, RDS, and EC2. Technical Skills: Proficiency in Python and PySpark for data processing and transformation tasks. Deep understanding of ETL concepts and best practices. Familiarity with AWS Glue (ETL jobs, Data Catalog, and Crawlers). Experience building and maintaining data pipelines with AWS Data Pipeline or similar orchestration tools. Familiarity with AWS S3 for data storage and management, including file formats (CSV, Parquet, Avro). Strong knowledge of SQL for querying and manipulating relational and semi-structured data. Experience with data warehousing and big data technologies, specifically within AWS. Additional Skills: Experience with AWS Lambda for serverless data processing and orchestration. Understanding of AWS Redshift for data warehousing and analytics. Familiarity with data lakes, Amazon EMR, and Kinesis for streaming data processing. Knowledge of data governance practices, including data lineage and auditing. Familiarity with CI/CD pipelines and Git for version control. Experience with Docker and containerization for building and deploying applications. Design and Build Data Pipelines: Design, implement, and optimize data pipelines on AWS using PySpark, AWS Glue, and AWS Data Pipeline to automate data integration, transformation, and storage processes. ETL Development: Develop and maintain extract, transform, and load (ETL) processes using AWS Glue and PySpark to efficiently process large datasets. Data Workflow Automation: Build and manage automated data workflows using AWS Data Pipeline, ensuring seamless scheduling, monitoring, and management of data jobs.
Data Integration: Work with different AWS data storage services (e.g., S3, Redshift, RDS) to ensure smooth integration and movement of data across platforms. Optimization and Scaling: Optimize and scale data pipelines for high performance and cost efficiency, utilizing AWS services like Lambda, S3, and EC2.
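The extract/transform/load cycle this role centres on can be sketched in a few lines. The example below uses plain Python lists in place of PySpark DataFrames so it stays self-contained; the CSV columns, the "completed" filter rule, and the dict-based "warehouse" are illustrative assumptions, with comments noting where S3, Glue, and Redshift would sit in a real pipeline.

```python
# Sketch of the extract -> transform -> load steps described above,
# using plain Python in place of PySpark. Column names and the filter
# rule are illustrative assumptions.
import csv
import io

def extract(csv_text):
    """Extract: parse CSV rows (standing in for reading from S3)."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform: cast types and keep only completed orders."""
    return [
        {"order_id": r["order_id"], "amount": float(r["amount"])}
        for r in rows
        if r["status"] == "completed"
    ]

def load(rows):
    """Load: stand-in for writing Parquet to S3 or COPYing into Redshift."""
    return {r["order_id"]: r["amount"] for r in rows}

raw = "order_id,status,amount\nA1,completed,10.5\nA2,pending,3.0\nA3,completed,7.25\n"
warehouse = load(transform(extract(raw)))
```

In AWS Glue the same shape appears as a job script: the extract step becomes a DynamicFrame read from the Data Catalog, and the transform step becomes DataFrame operations executed across the Spark cluster.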
Posted 1 month ago
12.0 - 17.0 years
16 - 20 Lacs
Hyderabad
Work from Office
Excellent understanding of AWS service components, with experience across multiple projects. Strong Terraform scripting skills. Experience creating CI/CD pipelines. Good hands-on experience provisioning containers in AWS (ECS/EKS). AWS ECS, Postgres, Lambda, S3, Route53, SNS, SQS. Python (for Lambda functions). Strong Java knowledge required. People skills: Ability to quickly absorb knowledge as it relates to our application (existing recorded KT sessions will be provided, and the core SME team will be available for any questions or additional guidance). Good communication and partnership with others. Motivated in their work; if something is unclear or uncertain, raises it immediately. Ability to problem-solve issues that arise.
Posted 1 month ago
6.0 - 11.0 years
8 - 12 Lacs
Mumbai
Work from Office
Strong proficiency in Java (8 or higher) and the Spring Boot framework. Basic foundation in AWS services such as EC2, Lambda, API Gateway, S3, CloudFormation, DynamoDB, RDS. Experience developing microservices and RESTful APIs. Understanding of cloud architecture and deployment strategies. Familiarity with CI/CD pipelines and tools such as Jenkins, GitHub Actions, or AWS CodePipeline. Experience with monitoring/logging tools like CloudWatch, ELK Stack, or Prometheus is desirable. Familiarity with security best practices for cloud-native apps (IAM roles, encryption, etc.).
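The Lambda + API Gateway combination these roles keep listing can be sketched as a minimal handler. Python is used here (several of these postings call out "Python (for Lambda functions)"); the Java programming model with a `RequestHandler` class is analogous. The `/health` route and response payload are illustrative assumptions; the response shape follows API Gateway's proxy integration contract.

```python
import json

# Minimal sketch of an AWS Lambda handler behind API Gateway's proxy
# integration. The /health route and payload are illustrative.

def handler(event, context=None):
    path = event.get("path", "/")
    if path == "/health":
        code, body = 200, {"status": "ok"}
    else:
        code, body = 404, {"error": "not found"}
    # Proxy integration expects statusCode/headers/body, with body
    # serialised as a JSON string.
    return {
        "statusCode": code,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(body),
    }

resp = handler({"path": "/health"})
```

In the Spring Boot services these roles describe, the equivalent endpoint would be a `@RestController` method; the Lambda form trades the always-on container for per-request invocation.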
Posted 1 month ago
5.0 - 10.0 years
8 - 12 Lacs
Mumbai
Work from Office
Skill: Java Microservices, Spring Boot. Experience: 4-6 Yrs. Role: T3. Responsibilities: Strong proficiency in Java (8 or higher) and the Spring Boot framework. Basic foundation in AWS services such as EC2, Lambda, API Gateway, S3, CloudFormation, DynamoDB, RDS. Experience developing microservices and RESTful APIs. Understanding of cloud architecture and deployment strategies. Familiarity with CI/CD pipelines and tools such as Jenkins, GitHub Actions, or AWS CodePipeline. Experience with monitoring/logging tools like CloudWatch, ELK Stack, or Prometheus is desirable. Familiarity with security best practices for cloud-native apps (IAM roles, encryption, etc.).
Posted 1 month ago
14.0 - 19.0 years
11 - 16 Lacs
Hyderabad
Work from Office
10 years of hands-on experience in Thought Machine Vault, Kubernetes, Terraform, GCP/AWS, PostgreSQL, CI/CD, REST APIs, Docker, and microservices. Architect and manage enterprise-level databases with 24/7 availability. Lead efforts on optimization, backup, and disaster recovery planning. Ensure compliance; implement monitoring and automation. Guide developers on schema design and query optimization. Conduct DB health audits and capacity planning. Collaborate with cross-functional teams to define, design, and ship new features. Work on the entire software development lifecycle, from concept and design to testing and deployment. Implement and maintain AWS cloud-based solutions, ensuring high performance, security, and scalability. Integrate microservices with Kafka for real-time data streaming and event-driven architecture. Troubleshoot and resolve issues in a timely manner, ensuring optimal system performance. Keep up to date with industry trends and advancements, incorporating best practices into our development processes. Bachelor's or Master's degree in Computer Science or a related field. Solid understanding of AWS services, including but not limited to EC2, Lambda, S3, and RDS. Experience with Kafka for building event-driven architectures. Strong database skills, including SQL and NoSQL databases. Familiarity with containerization and orchestration tools (Docker, Kubernetes). Excellent problem-solving and troubleshooting skills. Good to have: TM Vault core banking knowledge. Strong communication and collaboration skills.
Posted 1 month ago
8.0 - 13.0 years
8 - 12 Lacs
Hyderabad
Work from Office
Overall 8+ years of experience designing, developing, testing, and deploying scalable and resilient microservices using Java and Spring Boot. Collaborate with cross-functional teams to define, design, and ship new features. Work on the entire software development lifecycle, from concept and design to testing and deployment. Implement and maintain AWS cloud-based solutions, ensuring high performance, security, and scalability. Integrate microservices with Kafka for real-time data streaming and event-driven architecture. Troubleshoot and resolve issues in a timely manner, ensuring optimal system performance. Keep up to date with industry trends and advancements, incorporating best practices into our development processes. Must be a Java full-stack developer. Bachelor's or Master's degree in Computer Science or a related field. 6+ years of hands-on experience in Java development, with a focus on microservices architecture. Proficiency in Spring Boot and other Spring Framework components. Extensive experience in designing and developing RESTful APIs. Solid understanding of AWS services, including but not limited to EC2, Lambda, S3, and RDS. Experience with Kafka for building event-driven architectures. Strong database skills, including SQL and NoSQL databases. Familiarity with containerization and orchestration tools (Docker, Kubernetes). Excellent problem-solving and troubleshooting skills. Strong communication and collaboration skills.
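The "integrate microservices with Kafka for event-driven architecture" responsibility that recurs across these postings can be sketched with an in-memory publish/subscribe bus. This is a deliberately simplified stand-in: a real system would use Kafka topics, partitions, and consumer groups, and the topic and event names below are illustrative assumptions.

```python
from collections import defaultdict

# In-memory sketch of the event-driven pattern described above:
# producers publish to named topics, subscribers react independently.
# Kafka would replace this bus with durable, partitioned topics.

class EventBus:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        """Register a consumer callback for a topic."""
        self._subscribers[topic].append(callback)

    def publish(self, topic, event):
        """Deliver an event to every subscriber of the topic, in order."""
        for callback in self._subscribers[topic]:
            callback(event)

bus = EventBus()
log = []
# Two independent services reacting to the same "orders" event stream.
bus.subscribe("orders", lambda e: log.append(("audit", e["id"])))
bus.subscribe("orders", lambda e: log.append(("ship", e["id"])))
bus.publish("orders", {"id": 42})
```

The design point the postings are after is the decoupling: the publisher knows nothing about how many consumers exist, so new services can subscribe to the stream without changing the producer.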
Posted 1 month ago
9.0 - 14.0 years
11 - 16 Lacs
Hyderabad
Work from Office
8 years of hands-on experience in Thought Machine Vault, Kubernetes, Terraform, GCP/AWS, PostgreSQL, CI/CD, REST APIs, Docker, and microservices. Architect and manage enterprise-level databases with 24/7 availability. Lead efforts on optimization, backup, and disaster recovery planning. Ensure compliance; implement monitoring and automation. Guide developers on schema design and query optimization. Conduct DB health audits and capacity planning. Collaborate with cross-functional teams to define, design, and ship new features. Work on the entire software development lifecycle, from concept and design to testing and deployment. Implement and maintain AWS cloud-based solutions, ensuring high performance, security, and scalability. Integrate microservices with Kafka for real-time data streaming and event-driven architecture. Troubleshoot and resolve issues in a timely manner, ensuring optimal system performance. Keep up to date with industry trends and advancements, incorporating best practices into our development processes. Bachelor's or Master's degree in Computer Science or a related field. Solid understanding of AWS services, including but not limited to EC2, Lambda, S3, and RDS. Experience with Kafka for building event-driven architectures. Strong database skills, including SQL and NoSQL databases. Familiarity with containerization and orchestration tools (Docker, Kubernetes). Excellent problem-solving and troubleshooting skills. Good to have: TM Vault core banking knowledge. Strong communication and collaboration skills.
Posted 1 month ago
8.0 - 13.0 years
12 - 16 Lacs
Hyderabad
Work from Office
8 years of hands-on experience in Thought Machine Vault, Kubernetes, Terraform, GCP/AWS, PostgreSQL, CI/CD, REST APIs, Docker, and microservices. Architect and manage enterprise-level databases with 24/7 availability. Lead efforts on optimization, backup, and disaster recovery planning. Design and manage scalable CI/CD pipelines for cloud-native apps. Automate infrastructure using Terraform/CloudFormation. Implement container orchestration using Kubernetes and ECS. Ensure cloud security, compliance, and cost optimization. Monitor performance and implement high-availability setups. Collaborate with dev, QA, and security teams; drive architecture decisions. Mentor team members and contribute to DevOps best practices. Integrate microservices with Kafka for real-time data streaming and event-driven architecture. Troubleshoot and resolve issues in a timely manner, ensuring optimal system performance. Keep up to date with industry trends and advancements, incorporating best practices into our development processes. Bachelor's or Master's degree in Computer Science or a related field. Solid understanding of AWS services, including but not limited to EC2, Lambda, S3, and RDS. Experience with Kafka for building event-driven architectures. Strong database skills, including SQL and NoSQL databases. Familiarity with containerization and orchestration tools (Docker, Kubernetes). Excellent problem-solving and troubleshooting skills. Good to have: TM Vault core banking knowledge. Strong communication and collaboration skills.
Posted 1 month ago
8.0 - 13.0 years
9 - 14 Lacs
Hyderabad
Work from Office
Bachelor's degree in Computer Science, Engineering, or related field (or equivalent work experience). 8+ years of professional experience in Java backend development. 2+ years of hands-on experience working with AWS cloud services. Expertise in Spring Boot, REST APIs, and microservice design. Strong understanding of cloud-native development, containers (Docker), and modern deployment techniques. Familiarity with relational and NoSQL databases such as PostgreSQL, MySQL, or DynamoDB. Experience with logging, monitoring, and performance tuning in a cloud environment. Proficiency in Git and experience with CI/CD tools (e.g., Jenkins, GitHub Actions, AWS CodePipeline). Develop and maintain robust backend services and APIs using Java (8+) and Spring Boot. Build and deploy applications in AWS using services such as Lambda, API Gateway, S3, DynamoDB, RDS, SQS/SNS, and CloudWatch. Contribute to the design of microservices and serverless architectures. Write efficient, maintainable, and testable code following software engineering best practices. Collaborate with product managers, architects, and other developers to deliver high-quality solutions. Participate in code reviews, unit testing, integration testing, and deployment processes. Help improve development processes and DevOps practices using tools such as Git, Jenkins, Docker, Terraform, or AWS CloudFormation. Troubleshoot and resolve application and system issues in a timely manner.
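The "performance tuning in a cloud environment" and resilient-backend requirements above usually come down to a handful of standard patterns; one of the most common is retry with exponential backoff and jitter when calling AWS services. AWS SDKs such as boto3 build this in, so the standalone version below is only an illustration of the mechanism. The flaky function, delay constants, and injected sleep are illustrative assumptions (the sleep is injected so the sketch is testable; production code would use time.sleep).

```python
import random

# Sketch of retry with exponential backoff and jitter, a common
# resilience pattern for transient cloud-service failures.

def retry_with_backoff(fn, max_attempts=5, base_delay=0.1, sleep=lambda s: None):
    for attempt in range(max_attempts):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # out of attempts: surface the error
            # Double the delay each attempt; jitter avoids thundering herds.
            delay = base_delay * (2 ** attempt) * random.uniform(0.5, 1.0)
            sleep(delay)

calls = {"n": 0}

def flaky():
    """Fails twice with a transient error, then succeeds."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient")
    return "ok"

result = retry_with_backoff(flaky)
```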
Posted 1 month ago
10.0 - 15.0 years
10 - 15 Lacs
Hyderabad
Work from Office
Design, develop, test, and deploy scalable and resilient microservices using Java and Spring Boot. Collaborate with cross-functional teams to define, design, and ship new features. Work on the entire software development lifecycle, from concept and design to testing and deployment. Implement and maintain AWS cloud-based solutions, ensuring high performance, security, and scalability. Integrate microservices with Kafka for real-time data streaming and event-driven architecture. Troubleshoot and resolve issues in a timely manner, ensuring optimal system performance. Keep up to date with industry trends and advancements, incorporating best practices into our development processes. Must be a Java full-stack developer. Bachelor's or Master's degree in Computer Science or a related field. 8+ years of hands-on experience in Java full stack with Spring Boot: Java 11+, Spring Boot, Angular/React, REST APIs, Docker, Kubernetes, microservices. Proficiency in Spring Boot and other Spring Framework components. Extensive experience in designing and developing RESTful APIs. Solid understanding of AWS services, including but not limited to EC2, Lambda, S3, and RDS. Experience with Kafka for building event-driven architectures. Strong database skills, including SQL and NoSQL databases. Familiarity with containerization and orchestration tools (Docker, Kubernetes). Excellent problem-solving and troubleshooting skills. Good to have: TM Vault core banking knowledge. Strong communication and collaboration skills.
Posted 1 month ago
8.0 - 13.0 years
3 - 7 Lacs
Mumbai
Work from Office
Skill: Java, AWS. Experience: 6-9 Yrs. Role: T2. Responsibilities: Strong proficiency in Java (8 or higher) and the Spring Boot framework. Hands-on experience with AWS services such as EC2, Lambda, API Gateway, S3, CloudFormation, DynamoDB, RDS. Experience developing microservices and RESTful APIs. Understanding of cloud architecture and deployment strategies. Familiarity with CI/CD pipelines and tools such as Jenkins, GitHub Actions, or AWS CodePipeline. Knowledge of containerization (Docker) and orchestration tools (ECS/Kubernetes) is a plus. Experience with monitoring/logging tools like CloudWatch, ELK Stack, or Prometheus is desirable. Familiarity with security best practices for cloud-native apps (IAM roles, encryption, etc.). Develop and maintain robust backend services and RESTful APIs using Java and Spring Boot. Design and implement microservices that are scalable, maintainable, and deployable in AWS. Integrate backend systems with AWS services including but not limited to Lambda, S3, DynamoDB, RDS, SNS/SQS, and CloudFormation. Collaborate with product managers, architects, and other developers to deliver end-to-end features. Participate in code reviews, design discussions, and agile development processes.
Posted 1 month ago
10.0 - 15.0 years
12 - 17 Lacs
Hyderabad
Work from Office
8 to 12 years of experience in information technology with an emphasis on application development, with demonstrated experience throughout the entire development lifecycle. In-depth knowledge of the services industry and its IT systems. Practical cloud-native experience. Background in Computer Science, Engineering, Mathematics, or a related field, with expertise in technology disciplines. Java Full Stack Development: ability to create medium-to-large Java web applications from start to finish independently. This includes, but is not limited to, the following: client interaction, validating requirements, system design, frontend/UI development, interaction with a Java EE application server, web services, experience with the various Java EE APIs, development builds, application deployments, integration/enterprise testing, and support of applications within a production environment. Experience with Java/J2EE, with a deep understanding of the language and core APIs, web services, multi-threaded or concurrent programming, XML, design patterns, and Service-Oriented Architecture. Experience implementing microservices using Spring Boot and event-driven architecture. Work with a team that develops smart and scalable solutions and provides a solid experience for our users. Develop an understanding of our products and the problems we are attempting to solve. Analyze infrastructure problems/constraints, inefficiencies, process gaps, and risk and regulatory issues, and engineer software or automation solutions. Work in partnership with infrastructure engineers and architects to understand and identify operational improvements.
Tech skills: Java (APIs, microservices); UI (React, JavaScript); AWS (ECS, Postgres, Lambda, S3, Route53, SNS, SQS); Infrastructure as Code concepts; testing (JUnit; AFT with Selenium/Cucumber/Gherkin; Blazemeter performance testing); Python (for Lambda functions). People skills: Ability to quickly absorb knowledge as it relates to our application (existing recorded KT sessions will be provided and the core SME team will be available for any questions or additional guidance). Good communication and partnership with others. Motivated in their work; if something is unclear or uncertain, raises it immediately. Ability to problem-solve issues that arise.
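The "Python (for Lambda functions)" plus SQS items above typically meet in a Lambda that consumes an SQS batch. The sketch below follows the Records/body event shape AWS delivers for SQS triggers; the message fields and the empty `batchItemFailures` list (the partial-batch-failure response format) wrap a stand-in for real processing, and the order IDs are illustrative assumptions.

```python
import json

# Sketch of a Python Lambda consuming an SQS batch event. The event
# follows AWS's Records/body shape; message contents are illustrative.

def sqs_handler(event, context=None):
    processed = []
    for record in event.get("Records", []):
        msg = json.loads(record["body"])  # SQS delivers the body as a string
        processed.append(msg["order_id"])  # stand-in for real processing
    # Empty batchItemFailures tells SQS every message succeeded.
    return {"batchItemFailures": [], "processed": processed}

event = {
    "Records": [
        {"body": json.dumps({"order_id": "A1"})},
        {"body": json.dumps({"order_id": "A2"})},
    ]
}
out = sqs_handler(event)
```

Listing a failed message's ID in `batchItemFailures` (rather than raising) lets SQS redrive only that message instead of the whole batch.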
Posted 1 month ago