3.0 - 5.0 years
14 - 19 Lacs
Mumbai, Pune
Work from Office
Company: Marsh McLennan Agency
Description: Marsh McLennan is seeking candidates for the following position based in the Pune office: Senior Engineer/Principal Engineer.

What can you expect? We are seeking a skilled Data Engineer with 3 to 5 years of hands-on experience in building and optimizing data pipelines and architectures. The ideal candidate will have expertise in Spark, AWS Glue, AWS S3, Python, complex SQL, and AWS EMR.

What is in it for you? Holidays (as per the location). Medical and insurance benefits (as per the location). Shared transport (provided the address falls in the service zone). Hybrid way of working. Diversify your experience and learn new skills. Opportunity to work with stakeholders globally to learn and grow.

We will count on you to: Design and implement scalable data solutions that support our data-driven decision-making processes.

What you need to have: SQL and RDBMS knowledge (5/5), preferably Postgres; extensive hands-on experience with database systems covering tables, schemas, views, and materialized views. AWS knowledge: core and data engineering services, with Glue, Lambda, EMR, DMS, and S3 in focus. ETL knowledge: any ETL tool, preferably Informatica. Data warehousing. Big data: Hadoop concepts; Spark (3/5); Hive (5/5); Python/Java. Interpersonal skills: excellent communication and team-lead capabilities, a good understanding of data systems in large-organization setups, and a passion for deep diving into data and delivering value from it.

What makes you stand out? Databricks knowledge. Any reporting-tool experience, preferably MicroStrategy.
Marsh McLennan (NYSE: MMC) is a global leader in risk, strategy and people, advising clients in 130 countries across four businesses: Marsh, Guy Carpenter, Mercer and Oliver Wyman. With annual revenue of $24 billion and more than 90,000 colleagues, Marsh McLennan helps build the confidence to thrive through the power of perspective. For more information, visit marshmclennan.com, or follow on LinkedIn and X. Marsh McLennan is committed to embracing a diverse, inclusive and flexible work environment. We aim to attract and retain the best people and embrace diversity of age, background, caste, disability, ethnic origin, family duties, gender orientation or expression, gender reassignment, marital status, nationality, parental status, personal or social status, political affiliation, race, religion and beliefs, sex/gender, sexual orientation or expression, skin color, or any other characteristic protected by applicable law. Marsh McLennan is committed to hybrid work, which includes the flexibility of working remotely and the collaboration, connections and professional development benefits of working together in the office. All Marsh McLennan colleagues are expected to be in their local office or working onsite with clients at least three days per week. Office-based teams will identify at least one anchor day per week on which their full team will be together in person.
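For illustration, a minimal PySpark sketch of the S3-to-Parquet pipeline work this role describes. The bucket paths and column names are hypothetical, not taken from the posting:

# Minimal PySpark ETL sketch: read raw CSV from S3, clean it, aggregate,
# and write partitioned Parquet. All paths and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("policy-etl").getOrCreate()

raw = (spark.read
       .option("header", "true")
       .csv("s3://example-raw-bucket/policies/"))  # hypothetical path

cleaned = (raw
           .dropDuplicates(["policy_id"])
           .withColumn("premium", F.col("premium").cast("double"))
           .withColumn("issued_at", F.to_timestamp("issued_at"))
           .filter(F.col("premium").isNotNull()))

monthly = (cleaned
           .groupBy("region",
                    F.date_trunc("month", F.col("issued_at")).alias("month"))
           .agg(F.sum("premium").alias("total_premium"),
                F.count("*").alias("policy_count")))

(monthly.write
 .mode("overwrite")
 .partitionBy("region")
 .parquet("s3://example-curated-bucket/policy_monthly/"))

Partitioning the curated output by region keeps downstream EMR or Athena scans cheap, which is the usual motivation for this layout.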
Posted 1 month ago
5.0 - 10.0 years
11 - 15 Lacs
Hyderabad
Work from Office
Stellantis is seeking a passionate, innovative, results-oriented Information Communication Technology (ICT) Manufacturing AWS Cloud Architect to join the team. As a Cloud Architect, the selected candidate will leverage business analysis, data management, and data engineering skills to develop sustainable data tools supporting Stellantis's Manufacturing Portfolio Planning. This role will collaborate closely with data analysts and business intelligence developers within the Product Development IT Data Insights team.

Job responsibilities include but are not limited to: Having deep expertise in the design, creation, management, and business use of large datasets across a variety of data platforms. Assembling large, complex sets of data that meet non-functional and functional business requirements. Identifying, designing and implementing internal process improvements, including re-designing infrastructure for greater scalability, optimizing data delivery, and automating manual processes. Building the infrastructure required for optimal extraction, transformation and loading of data from various data sources using AWS, cloud and other SQL technologies. Working with stakeholders to support their data infrastructure needs while assisting with data-related technical issues. Maintaining high-quality ontology and metadata of data systems. Establishing a strong relationship with the central BI/data engineering COE to ensure alignment in leveraging corporate-standard technologies, processes, and reusable data models. Ensuring data security and developing traceable procedures for user access to data systems.

Qualifications, Experience and Competency. Education: Bachelor's or Master's degree in Computer Science, or a related IT-focused degree.

Experience (Essential): Overall 10-15 years of IT experience. Develop, automate and maintain the build of AWS components and operating systems. Work with application and architecture teams to conduct proofs of concept (POC) and implement the design in a production environment in AWS. Migrate and transform existing workloads from on-premises to AWS. Minimum 5 years of experience in data engineering or data architecture: concepts, approach, data lakes, data extraction, data transformation. Proficient in ETL optimization and in designing, coding, and tuning big data processes using Apache Spark or similar technologies. Experience operating very large data warehouses or data lakes. Investigate and develop new microservices and features using the latest technology stacks from AWS. Self-starter with the desire and ability to quickly learn new technologies. Strong interpersonal skills with the ability to communicate and build relationships at all levels. Hands-on experience with AWS cloud technologies like S3, AWS Glue, Glue Catalog, Athena, AWS Lambda, AWS DMS, PySpark, and Snowflake. Experience building data pipelines and applications to stream and process large datasets at low latencies.

Desirable: Familiarity with data analytics, engineering processes and technologies. Ability to work successfully within a global and cross-functional team. A passion for technology. We are looking for someone who is keen to leverage their existing skills while trying new approaches, and to share that knowledge with others to help grow the data and analytics teams at Stellantis to their full potential!
Specific Skill Requirement: AWS services (Glue, DMS, EC2, RDS, S3, VPCs and all core services, Lambda, API Gateway, CloudFormation, CloudWatch, Route 53, Athena, IAM) and SQL, Qlik Sense, Python/Spark, ETL optimization. If you are interested, please share the below details along with your updated resume: First Name, Last Name, Date of Birth, Passport No. and Expiry Date, Alternate Contact Number, Total Experience, Relevant Experience, Current CTC, Expected CTC, Current Location, Preferred Location, Current Organization, Payroll Company, Notice Period, Holding any offer.
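As a hedged sketch of the Glue-centric automation this posting emphasizes, the snippet below starts an AWS Glue job with boto3 and polls it to completion. The job name and argument are hypothetical:

# Hedged sketch: trigger and poll an AWS Glue job with boto3.
# The job name and argument are hypothetical; error handling is minimal.
import time
import boto3

glue = boto3.client("glue", region_name="us-east-1")

run = glue.start_job_run(JobName="example-etl-job",
                         Arguments={"--TARGET_DATE": "2024-01-31"})
run_id = run["JobRunId"]

while True:
    state = glue.get_job_run(JobName="example-etl-job",
                             RunId=run_id)["JobRun"]["JobRunState"]
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        print("Job finished with state:", state)
        break
    time.sleep(30)  # poll every 30 seconds until a terminal state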
Posted 1 month ago
12.0 - 16.0 years
14 - 20 Lacs
Pune
Work from Office
AI/ML/GenAI AWS SME

Job Description. Role Overview: An AWS SME with a data science background is responsible for leveraging Amazon Web Services (AWS) to design, implement, and manage data-driven solutions. This role involves a combination of cloud computing expertise and data science skills to optimize and innovate business processes.

Key Responsibilities: Data Analysis and Modelling: Analyzing large datasets to derive actionable insights and building predictive models using AWS services like SageMaker, Bedrock, Textract, etc. Cloud Infrastructure Management: Designing, deploying, and maintaining scalable cloud infrastructure on AWS to support data science workflows. Machine Learning Implementation: Developing and deploying machine learning models using AWS ML services. Security and Compliance: Ensuring data security and compliance with industry standards and best practices. Collaboration: Working closely with cross-functional teams, including data engineers, analysts, DevOps and business stakeholders, to deliver data-driven solutions. Performance Optimization: Monitoring and optimizing the performance of data science applications and cloud infrastructure. Documentation and Reporting: Documenting processes, models, and results, and presenting findings to stakeholders.

Skills & Qualifications. Technical Skills: Proficiency in AWS services (e.g., EC2, S3, RDS, Lambda, SageMaker). Strong programming skills in Python. Experience with AI/ML project life-cycle steps. Knowledge of machine learning algorithms and frameworks (e.g., TensorFlow, Scikit-learn). Familiarity with data pipeline tools (e.g., AWS Glue, Apache Airflow). Excellent communication and collaboration abilities.
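For a concrete flavor of the SageMaker work described above, here is a minimal boto3 sketch that invokes an already-deployed model endpoint. The endpoint name and payload schema are hypothetical:

# Hedged sketch: invoke a deployed SageMaker model endpoint with boto3.
# The endpoint name and feature payload are hypothetical examples.
import json
import boto3

runtime = boto3.client("sagemaker-runtime", region_name="us-east-1")

payload = {"features": [0.4, 1.2, 3.3]}  # hypothetical feature vector

response = runtime.invoke_endpoint(
    EndpointName="example-churn-model",
    ContentType="application/json",
    Body=json.dumps(payload),
)

prediction = json.loads(response["Body"].read())
print(prediction)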
Posted 1 month ago
7.0 - 12.0 years
19 - 22 Lacs
Bengaluru
Work from Office
Project description: Luxoft is one of the leading service providers for banking and capital markets customers. Luxoft has been engaged by a large Australian bank to provide L1/L2 application monitoring and production support services for business-critical applications and interfaces on a 24/5, managed-outcome basis in the Global Markets business area. We are looking for motivated individuals who have the relevant skills and experience and are willing to work in shifts.

Responsibilities: Develop and maintain Unix shell scripts for automation tasks. Write and optimize Python scripts for process automation and data handling. Design, implement, and maintain scalable cloud infrastructure using AWS services (EC2, S3, Lambda, etc.). Monitor and troubleshoot cloud environments for optimal performance. Monitor and optimize system resources and automate routine administrative and BAU tasks. Production environment monitoring and issue resolution. Control SLAs and notify management or the client in case of unexpected behavior. Support end-to-end data flows and health and sanity checks of the systems and applications. Escalate issues with environment and application health (internally to the Group Lead/PM). Review logs and perform data discovery in database tables to investigate workflow failures. Investigate and supply analysis to fix application/configuration issues in the production environment. Contact and chase responsible support/upstream/downstream/cross teams and request root cause analysis from them on issues preventing end-to-end flows from working as designed. Provide regular updates on issue status until addressed, notifying the client of status changes and the expected time to resolution. Participate in ad-hoc/regular status calls on application health with the client to discuss critical defects and health-check status. Work on business users' service requests, which includes investigation of business logic and application behavior. Work with different data format transformation processes (XML, pipeline). Work with source control tools (Git/SVN) to investigate configuration or data-transformation-related issues. Work with middleware and schedulers on data flow and batch process control. Focus on continuous proactive service improvement and continuous learning. Ensure customer service excellence and guaranteed response within SLA timelines by actively monitoring support emails/tickets and working on them until the issue is fully remediated. Ensure all incident tickets are resolved in a timely and comprehensive manner. Track and identify frequently occurring, high-impact support issues as candidates for permanent resolution.

Education: Bachelor's degree from a reputed university with good passing scores.

Skills. Must have: 7 to 12 years in L2/L3 production support or as a Site Reliability Engineer, with strong knowledge of Unix shell scripting. Develop and maintain Unix shell scripts for automation tasks. Write and optimize Python or shell scripts for process automation and data handling; good knowledge of any scripting language is acceptable. Basic knowledge of AWS services (EC2, S3, etc.). Monitor and optimize system resources and automate routine administrative and BAU tasks. Good understanding of the Incident/Change/Problem Management process.

Required Skills: Strong experience with Unix shell scripting. Proficiency in Python scripting for automation. Proficiency in any scripting language, with hands-on automation experience.
Strong knowledge of databases. Basic understanding of AWS services and cloud. Basic knowledge and experience supporting cloud applications. Ability to troubleshoot and resolve technical issues in a production environment.

Nice to have (Preferred Skills, Optional): Experience with containers (Docker, Kubernetes). Familiarity with CI/CD pipelines and version control systems (e.g., Git). Knowledge of Infrastructure-as-Code tools like Terraform. Strong problem-solving and communication skills.

Other Languages: Hindi B1 Intermediate, English C1 Advanced. Seniority: Senior
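A hedged sketch of the kind of Python automation this support role calls for: a small health check that scans a log for errors and flags an SLA-relevant breach. The path and threshold are hypothetical:

# Hedged sketch: a minimal production health check. Scan a log file for
# ERROR/FATAL lines and alert past a threshold. Path and threshold are
# hypothetical, not from the posting.
import re
from pathlib import Path

LOG_PATH = Path("/var/log/example-app/batch.log")  # hypothetical path
ERROR_THRESHOLD = 5

pattern = re.compile(r"\b(ERROR|FATAL)\b")
errors = [line for line in LOG_PATH.read_text().splitlines()
          if pattern.search(line)]

if len(errors) > ERROR_THRESHOLD:
    print(f"ALERT: {len(errors)} errors found; escalating per SLA.")
    for line in errors[-5:]:  # show the most recent offenders
        print("  ", line)
else:
    print(f"OK: {len(errors)} errors, below threshold.")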
Posted 1 month ago
4.0 - 8.0 years
12 - 17 Lacs
Pune
Work from Office
Project description: We're seeking a Senior React Developer with strong experience in TypeScript to build and maintain high-quality, performant user interfaces. The ideal candidate is passionate about clean code, UI/UX best practices, and collaborating in a modern, agile development environment. Experience with Node.js is a plus.

Responsibilities: Develop Scalable UIs: Build responsive, accessible, and maintainable web interfaces using React and TypeScript. Component Architecture: Design and implement reusable, modular components that follow best practices. State Management: Manage complex application state with tools like Redux, MobX, or Context API. API Integration: Collaborate with backend teams to consume RESTful and/or GraphQL APIs. Performance Optimization: Profile and tune components to ensure optimal performance across devices and browsers. Testing & Quality: Write and maintain unit/integration tests using Jest, React Testing Library, or similar tools. Cross-functional Collaboration: Work closely with designers, product managers, and fellow developers in an agile environment. Version Control: Use Git effectively in collaborative workflows (e.g., GitHub Flow). AI Tools (Optional): Leverage AI-assisted development tools like GitHub Copilot to improve productivity and code quality.

Skills. Must have: React.js Expertise: Deep understanding of React's core concepts (hooks, lifecycle, reconciliation). TypeScript & JavaScript: Proficiency in modern JavaScript (ES6+) and strong TypeScript typing practices. HTML/CSS Mastery: Ability to craft responsive, semantic, and accessible front-end code. State Libraries: Experience with Redux, MobX, Zustand, or similar state management tools. Version Control: Strong command of Git, branching strategies, and pull request best practices. Testing: Experience with frontend testing tools such as Jest, Enzyme, or React Testing Library. Build Tools: Familiarity with Webpack, Vite, Babel, or other front-end tooling systems. UI/UX Awareness: Understanding of usability principles and pixel-perfect implementation of designs. Problem-Solving: Strong debugging skills and ability to propose practical solutions.

Nice to have: Node.js: Experience building or integrating with Node.js APIs or services. AWS: Familiarity with AWS services (e.g., S3, EC2, ECS, R53, Lambda, CloudFront). CI/CD Pipelines: Exposure to modern deployment practices and automation tools. GraphQL: Familiarity with GraphQL clients (e.g., Apollo Client). Design Systems: Experience working with component libraries or design systems (e.g., MUI, Chakra UI, Storybook).

Other Languages: English C1 Advanced. Seniority: Senior
Posted 1 month ago
3.0 - 8.0 years
16 - 20 Lacs
Bengaluru
Work from Office
Project description: With a strong working knowledge of both private and public cloud, you will be the expert in these areas and work closely with clients and other parts of the business to design, implement and support innovative solutions. As the Cloud SME you will be expected to have either an AWS Solutions Architect Associate-level certification or the Microsoft Certified: Azure Solutions Architect accreditation. You'll have expert knowledge of either the AWS or Azure cloud platform and a good working knowledge of cloud solutions.

Responsibilities: Provide direct analysis and recommendations associated with the implementation or migration of specific applications to cloud services. Implement and maintain cloud management solutions, including initial and ongoing configuration of related automation, notifications, and reporting capabilities. Work with a variety of legacy applications and platforms, and work with application teams to implement or migrate associated components to cloud services. Identify and troubleshoot cloud service events and issues, and work with cloud service providers to efficiently solve issues or implement workarounds.

Skills. Must have: Deep experience with the targeted cloud environment (AWS, Azure or GCP), Infrastructure as a Service (IaaS), Platform as a Service (PaaS) and other native capabilities. Hands-on experience architecting, designing, implementing, and supporting targeted cloud-based applications and solutions. Hands-on experience in all aspects of targeted cloud computing (compute, CI/CD, containers, storage, platforms, data, networking and security). Knowledge of networking and web standards such as DNS, DHCP, TCP/IP, HTTP, web security, switches, routers, load balancers and firewalls is desired. Minimum 3+ years of hands-on experience in provisioning and a high-level understanding of AWS-related services (especially ECS, SQS, MQ, Lambda). Deep understanding of AWS/Azure/GCP network components. High-level understanding of Terraform. CI/CD tools: any industry-standard tool is acceptable, but knowledge of YAML scripting is a must. Microservices provisioning and application deployments using EKS or ECS. Experience with serverless components, especially Lambda. Experience with configuration management: Ansible, Puppet, Chef or SaltStack. Programming experience (Python with boto3 preferred).

Nice to have: API experience (Kong / API Gateway). Experience with Kafka / Solace message brokers.

Other Languages: English C1 Advanced. Seniority: Regular
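To illustrate the "Python with boto3" preference, here is a minimal provisioning sketch that creates an SQS queue and sends a test message; the queue name and attribute values are illustrative assumptions:

# Hedged sketch: basic AWS provisioning/automation with boto3.
# Queue name and attributes are hypothetical examples.
import boto3

sqs = boto3.client("sqs", region_name="us-east-1")

queue = sqs.create_queue(
    QueueName="example-events",
    Attributes={"VisibilityTimeout": "60",
                "MessageRetentionPeriod": "86400"},
)

sqs.send_message(QueueUrl=queue["QueueUrl"],
                 MessageBody='{"event": "deploy", "status": "ok"}')
print("Queue ready:", queue["QueueUrl"])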
Posted 1 month ago
5.0 - 9.0 years
5 - 9 Lacs
Bengaluru
Hybrid
PF Detection is mandatory. Managing data storage solutions on AWS, such as Amazon S3, Amazon Redshift, and Amazon DynamoDB. Implementing and optimizing data processing workflows using AWS services like AWS Glue, Amazon EMR, and AWS Lambda. Working with Spotfire engineers and business analysts to ensure data is accessible and usable for analysis and visualization. Collaborating with other engineers and business stakeholders to understand requirements and deliver solutions. Writing code in languages like SQL, Python, or Scala to build and maintain data pipelines and applications. Using Infrastructure as Code (IaC) tools to automate the deployment and management of data infrastructure. A strong understanding of core AWS services, cloud concepts, and the AWS Well-Architected Framework. Conducting an extensive inventory/evaluation of existing environments and workflows. Designing and developing scalable data pipelines using AWS services to ensure efficient data flow and processing. Integrating/combining diverse data sources to maintain data consistency and reliability. Working closely with data engineers and other stakeholders to understand data requirements and ensure seamless data integration. Building and maintaining CI/CD pipelines. Kindly acknowledge back to this mail with an updated resume.
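A minimal sketch of the S3/Lambda/DynamoDB flow described above: a Lambda handler that records incoming S3 objects in an audit table. The table name and event wiring are hypothetical:

# Hedged sketch: an AWS Lambda handler reacting to S3 object-created
# events and writing an audit record to DynamoDB. Table and bucket
# names are hypothetical.
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("example-ingest-audit")  # hypothetical table

def handler(event, context):
    records = event.get("Records", [])
    for record in records:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        table.put_item(Item={"object_key": key,
                             "bucket": bucket,
                             "status": "RECEIVED"})
    return {"processed": len(records)}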
Posted 1 month ago
4.0 - 9.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Extensive experience with AWS services: IAM, S3, Glue, CloudFormation and CloudWatch. In-depth understanding of AWS IAM policy evaluation for permissions and access control. Proficient in using Bitbucket, Confluence, GitHub, and Visual Studio Code. Proficient in policy languages, particularly Rego scripting.

Good to Have Skills: Experience with the Wiz tool for security and compliance. Good programming skills in Python. Advanced knowledge of additional AWS services: ECS, EKS, Lambda, SNS and SQS.

Roles & Responsibilities: Senior Developer on the Wiz team specializing in Rego and AWS. Project Manager: One to Three Years. AWS CloudFormation: Four to Six Years. AWS IAM: Four to Six Years. PSP Defined SCU in Data Engineer.
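As a hedged illustration of IAM policy evaluation, the sketch below runs a custom policy through the IAM policy simulator via boto3. The posting's Rego work is policy evaluation of a similar kind in Open Policy Agent; the policy document here is a hypothetical example:

# Hedged sketch: evaluate a custom IAM policy with the IAM policy
# simulator. The policy, actions, and resource ARNs are hypothetical.
import json
import boto3

iam = boto3.client("iam")

policy = {
    "Version": "2012-10-17",
    "Statement": [{"Effect": "Allow",
                   "Action": ["s3:GetObject"],
                   "Resource": "arn:aws:s3:::example-bucket/*"}],
}

result = iam.simulate_custom_policy(
    PolicyInputList=[json.dumps(policy)],
    ActionNames=["s3:GetObject", "s3:PutObject"],
    ResourceArns=["arn:aws:s3:::example-bucket/report.csv"],
)

for r in result["EvaluationResults"]:
    print(r["EvalActionName"], "->", r["EvalDecision"])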
Posted 1 month ago
5.0 - 10.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Role Purpose. Mandatory Skills: Cloud (Azure/AWS/GCP), DevOps, Kubernetes, UNIX, certified professional, Docker containers.

Preferred Skills: Cloud services, Ansible/Chef, Puppet, CI/CD implementation. Hands-on experience with cloud services, e.g. AWS services like VPC, EC2, S3, ELB, RDS, ECS/EKS, IAM, CloudFront, CloudWatch, SQS/SNS, Lambda. Experience with Infrastructure as Code tooling: Terraform, CloudFormation. Hands-on skills in scripting or coding with modern languages: Python, Unix bash scripting. Experience with configuration management tooling (Ansible/Chef; Puppet good to have). Strong Docker and Kubernetes skills desirable. Some experience with Windows or Unix administration. Experience with Continuous Integration and Continuous Deployment pipelines and tooling (GitLab, Jenkins, GitHub, Jira, or related). Should be able to propose new solutions for CI/CD implementation. Should be able to develop the overall strategy for build and release management.
Posted 1 month ago
1.0 - 5.0 years
12 - 16 Lacs
Bengaluru
Work from Office
Role Purpose. Mandatory Skills: GenAI, LLM, RAG, LangChain, Mistral, Llama, vector DBs, Azure/GCP/Lambda, Python, TensorFlow, PyTorch.

Preferred Skills: GPT-4, NumPy, Pandas, Keras, Databricks, Pinecone/Chroma/Weaviate, Scale/Labelbox.

We are looking for a good Python developer with knowledge of machine learning and deep learning frameworks. Take care of the entire prompt life cycle: prompt design, prompt template creation, and prompt tuning/optimization for various GenAI base models. Design and develop prompts suiting project needs. Stakeholder management across business and domains as required for the projects. Evaluating base models and benchmarking performance. Implement prompt guardrails to prevent attacks like prompt injection, jailbreaking and prompt leaking. Develop, deploy and maintain auto-prompt solutions. Design and implement minimum design standards for every use case involving prompt engineering. You will be responsible for training machine learning and deep learning models. Writing reusable, testable, and efficient code using Python. Design and implementation of low-latency, high-availability, performant applications. Implementation of security and data protection. Integration of data storage solutions and API gateways. Production change deployment and related support.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention: of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA: as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
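A minimal sketch of a prompt guardrail of the kind described above: a naive pattern screen applied before a prompt template is filled. The patterns and template are illustrative only, not a production-grade defense:

# Hedged sketch: a naive prompt-injection screen run before templating.
# Patterns and template are illustrative assumptions.
import re

INJECTION_PATTERNS = [
    r"ignore (all|previous) instructions",
    r"reveal (your|the) system prompt",
    r"you are now",
]

TEMPLATE = ("You are a support assistant. Answer using only the context.\n\n"
            "Context: {context}\nQuestion: {question}")

def is_suspicious(user_text: str) -> bool:
    # Flag inputs matching known injection phrasings (case-insensitive).
    return any(re.search(p, user_text, re.IGNORECASE)
               for p in INJECTION_PATTERNS)

def build_prompt(context: str, question: str) -> str:
    if is_suspicious(question):
        raise ValueError("Potential prompt injection; request blocked.")
    return TEMPLATE.format(context=context, question=question)

print(build_prompt("Refund policy: 30 days.", "What is the refund window?"))

Real guardrails layer classifier-based screening and output filtering on top of pattern checks; this only shows where such a screen sits in the prompt life cycle.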
Posted 1 month ago
5.0 - 7.0 years
4 - 8 Lacs
Bengaluru
Work from Office
We are seeking a highly skilled Data Engineer with expertise in Java, PySpark, and big data technologies. The ideal candidate will have in-depth knowledge of Apache Spark, Python, and Java programming (Java 8 and above, including Lambda, Streams, Exception Handling, Collections, etc.). Responsibilities include developing data processing pipelines using PySpark, creating Spark jobs for data transformation and aggregation, and optimizing query performance using file formats like ORC, Parquet, and AVRO. Candidates must also have hands-on experience with Spring Core, Spring MVC, Spring Boot, REST APIs, and cloud services like AWS. This role involves designing scalable pipelines for batch and real-time analytics, performing data enrichment, and integrating with SQL databases.
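For illustration, a short PySpark sketch of an aggregation job that writes both Parquet and ORC, the columnar formats the posting calls out. Paths and columns are hypothetical:

# Hedged sketch: a PySpark aggregation job writing Parquet and ORC.
# All paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("txn-aggregate").getOrCreate()

txns = spark.read.parquet("s3://example-bucket/transactions/")

daily = (txns
         .groupBy("account_id", F.to_date("txn_ts").alias("txn_date"))
         .agg(F.sum("amount").alias("daily_total")))

# Same data, two columnar formats; both are splittable and compress well.
daily.write.mode("overwrite").parquet("s3://example-bucket/daily_parquet/")
daily.write.mode("overwrite").orc("s3://example-bucket/daily_orc/")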
Posted 2 months ago
0.0 - 1.0 years
10 - 14 Lacs
Pune
Work from Office
Role: AWS Cloud Engineer. We are looking for an AWS Cloud Engineer with a strong DevOps and scripting background who can support and optimize cloud infrastructure, automate deployments, and collaborate cross-functionally in a fast-paced fintech environment.

Core Responsibilities: Design and implement secure, scalable network and infrastructure solutions using AWS. Deploy applications using EC2, Lambda, Fargate, ECS, and ECR. Automate infrastructure using CloudFormation, scripting (Python/Bash), and the AWS SDK. Manage and optimize relational (PostgreSQL, SQL) and NoSQL (MongoDB) databases. Set up and maintain monitoring using Grafana, Prometheus, and AWS CloudWatch. Perform cost optimization across the AWS infrastructure and execute savings strategies. Maintain high availability, security, and disaster recovery (DR) mechanisms. Implement Kubernetes (EKS), containers, and CI/CD pipelines. Proactively monitor system performance and troubleshoot production issues. Coordinate across product and development teams for deployments and DevOps planning.

Must-Have Skills: 4+ years of hands-on experience with the AWS cloud platform. Strong proficiency in Linux/Unix systems, Nginx, Apache, and Tomcat. Proficient in Python and Shell/Bash scripting for automation. Strong knowledge of SQL, PostgreSQL, and MongoDB. Familiarity with CloudFormation, IAM, VPC, S3, ECS/EKS/ECR. Monitoring experience with Prometheus, Grafana, and CloudWatch. Previous exposure to AWS cost-optimization strategies. Excellent communication skills, self-driven, and a proactive attitude.

Nice-to-Have Skills: Experience with Google Cloud Platform (GCP). Experience in container orchestration with Kubernetes (EKS preferred). Background in working with startups or fast-growing product environments. Knowledge of disaster recovery strategies and high-availability setups.
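A hedged sketch of the CloudWatch monitoring duties above: creating a CPU alarm with boto3. The instance ID and SNS topic ARN are placeholders:

# Hedged sketch: create a CloudWatch CPU alarm for an EC2 instance.
# Instance ID, account number, and SNS topic are placeholder values.
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

cloudwatch.put_metric_alarm(
    AlarmName="example-ec2-high-cpu",
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
    Statistic="Average",
    Period=300,                 # 5-minute datapoints
    EvaluationPeriods=3,        # 3 consecutive breaches
    Threshold=80.0,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:example-alerts"],
)
print("Alarm created.")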
Posted 2 months ago
8.0 - 13.0 years
16 - 20 Lacs
Pune
Work from Office
AWS Solution Architect / DevOps: We are seeking a highly skilled AWS DevOps Engineer / Solution Architect with a strong background in designing and implementing data-driven and API-based solutions. The ideal candidate will have deep expertise in AWS architecture, a passion for creating scalable, secure, and high-performance systems, and the ability to align technology solutions with business goals.

Key Responsibilities: Design and Architect Solutions: Develop and architect scalable, secure, and efficient cloud-based solutions on AWS for data and API-related projects. Infrastructure as Code: Implement infrastructure automation using tools such as Terraform, CloudFormation, or AWS CDK. API Development and Integration: Architect and implement RESTful APIs, ensuring high availability, scalability, and security using AWS API Gateway, Lambda, and related services. Data Solutions: Design and optimize data pipelines, data lakes, and storage solutions using AWS services like S3, Redshift, RDS, and DynamoDB. CI/CD Pipelines: Build, manage, and optimize CI/CD pipelines to automate deployments, testing, and infrastructure provisioning (Jenkins, CodePipeline, etc.). Monitoring and Optimization: Ensure robust monitoring, logging, and alerting mechanisms are in place using tools like CloudWatch, Prometheus, and Grafana. Collaboration and Best Practices: Work closely with cross-functional teams (development, data engineering, security) to implement DevOps best practices and deliver innovative cloud solutions. Security and Compliance: Implement AWS security best practices, including IAM, encryption, VPC, and security monitoring, to ensure solutions meet security and compliance standards. Cost Optimization: Continuously optimize AWS environments for performance, scalability, and cost-effectiveness.

Qualifications: 8+ years of experience in AWS cloud architecture, with a focus on data and API solutions. Expertise in AWS core services such as EC2, S3, Lambda, API Gateway, RDS, DynamoDB, Redshift, and CloudFormation. Hands-on experience with infrastructure-as-code (IaC) tools like Terraform, AWS CDK, or CloudFormation. Proficiency in API design and development, particularly RESTful APIs and serverless architectures. Strong understanding of CI/CD pipelines, version control (Git), and automation tools. Knowledge of networking, security best practices, and the AWS Well-Architected Framework. Experience with containerization technologies such as Docker and orchestration tools like Kubernetes or AWS ECS/EKS. Excellent problem-solving skills and ability to work independently and in a team environment. AWS certifications such as AWS Certified Solutions Architect (Associate/Professional) or AWS Certified DevOps Engineer are highly preferred.
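To make the API Gateway + Lambda pattern concrete, here is a minimal Python handler for a proxy integration; the route and response schema are assumptions for illustration:

# Hedged sketch: a Lambda handler behind API Gateway (proxy integration).
# The /health route and response shape are hypothetical.
import json

def handler(event, context):
    if event.get("httpMethod") == "GET" and event.get("path") == "/health":
        return {"statusCode": 200,
                "headers": {"Content-Type": "application/json"},
                "body": json.dumps({"status": "ok"})}
    return {"statusCode": 404,
            "headers": {"Content-Type": "application/json"},
            "body": json.dumps({"error": "not found"})}

With proxy integration, API Gateway passes the raw request through and expects this statusCode/headers/body shape back, which is what keeps the handler this small.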
Posted 2 months ago
8.0 - 13.0 years
8 - 12 Lacs
Pune
Work from Office
Full Stack .NET Core + Angular with AWS Experience.

Education: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Experience: Minimum of 8 years of experience in .NET Core development with AWS.

.NET Core Development: Design, develop, and maintain high-quality, scalable, and efficient .NET Core applications. Implement best practices for coding and ensure code quality through unit testing and code reviews. Hands-on experience with containerization technologies like Docker and Kubernetes. Knowledge of database technologies like SQL Server, MongoDB, or Cosmos DB.

AWS Experience: Design and implement serverless solutions using AWS Lambda functions. Strong experience with AWS services, particularly Lambda and Step Functions. Deploy and manage applications on the AWS cloud. Optimise application performance and scalability in the cloud environment.

Containerization: Design and implement containerized applications using Docker and Kubernetes. Ensure proper orchestration and management of containers for high availability and resilience. Experience with serverless architecture and Azure Functions.

Angular Development: Develop and maintain front-end applications using Angular. Collaborate with UI/UX designers to implement responsive and user-friendly interfaces. Integrate front-end components with back-end services.

Leadership and Mentorship: Lead and mentor a team of developers, providing technical guidance and support. Conduct code reviews and ensure adherence to best practices and coding standards.
Posted 2 months ago
3.0 - 8.0 years
14 - 20 Lacs
Hyderabad
Work from Office
Job Area: Information Technology Group, Information Technology Group > IT Software Developer

General Summary: The Qualcomm OneIT team is looking for a talented senior Full-Stack Developer to join our dynamic team and contribute to our exciting projects. The ideal candidate will have a strong understanding of Java, Spring Boot, Angular/React and AWS technologies, as well as experience in designing, managing and deploying applications to the cloud.

Key Responsibilities: Design, develop and maintain web applications using Java, Spring Boot, and Angular/React. Collaborate with cross-functional teams to define, design, and ship new features. Write clean, maintainable, and efficient code. Ensure the performance, quality, and responsiveness of applications. Identify and correct bottlenecks and fix bugs. Help maintain code quality, organization, and automation. Stay up to date with the latest industry trends and technologies.

Minimum Qualifications: 3+ years of IT-relevant work experience with a Bachelor's degree in a technical field (e.g., Computer Engineering, Computer Science, Information Systems), OR 5+ years of IT-relevant work experience without a Bachelor's degree. 3+ years of any combination of academic or work experience with full-stack application development (e.g., Java, Python, JavaScript, etc.). 1+ year of any combination of academic or work experience with data structures, algorithms, and data stores.

The candidate should have: Bachelor's degree in Computer Science, Engineering, or a related field. 5-7 years of experience, with a minimum of 3 years as a full-stack developer using Java, Spring Boot and Angular/React. Strong proficiency in Java and Spring Boot. Experience with front-end frameworks such as Angular or React. Familiarity with RESTful APIs and web services. Knowledge of database systems like Oracle, MySQL, PostgreSQL, or MongoDB. Experience with AWS services such as EC2, S3, RDS, Lambda, and API Gateway. Understanding of version control systems, preferably Git. Excellent problem-solving skills and attention to detail. Strong communication and teamwork abilities. Experience with any other programming language like C# or Python. Knowledge of containerization technologies like Docker and Kubernetes. Familiarity with CI/CD pipelines and DevOps practices. Experience with Agile/Scrum/SAFe methodologies. Bachelor's or Master's degree in Information Technology, Computer Science or equivalent.
Posted 2 months ago
3.0 - 5.0 years
27 - 32 Lacs
Bengaluru
Work from Office
Job Title: Data Engineer (DE) / SDE – Data. Location: Bangalore. Experience range: 3-15 years.

What we offer: Our mission is simple – building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team: DEX is the central data org for Kotak Bank and manages the entire data experience of the bank. DEX stands for Kotak's Data Exchange. The org comprises the Data Platform, Data Engineering and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform from on-premises solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which provides great opportunities for technology fellows to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills this team should encompass are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to be a 100+ member team, primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters. As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member of Kotak's digital transformation journey; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch and analytics solutions, in a programmatic way; and be futuristic in building systems that can be operated by machines using AI technologies.

The data platform org is divided into 3 key verticals:

Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank; a centralized data lake; managed compute and orchestration frameworks, including concepts of serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions like EMR optimizers; automations; and observability capabilities for Kotak's data platform. The team will also be the center for data engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, be skilled at sourcing data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based and programmatic way, and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank, which cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers and all analytics use cases.

Data Governance: The team will be the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship and the data quality platform. If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple systems, then this is the team for you.

Your day-to-day role will include: Drive business decisions with technical input and lead the team. Design, implement, and support a data infrastructure from scratch. Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA. Extract, transform, and load data from various sources using SQL and AWS big data technologies. Explore and learn the latest AWS technologies to enhance capabilities and efficiency. Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis. Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers. Build data platforms, data pipelines, or data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer / SDE in Data: Bachelor's degree in Computer Science, Engineering, or a related field. 3-5 years of experience in data engineering. Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR. Experience with data pipeline tools such as Airflow and Spark. Experience with data modeling and data quality best practices. Excellent problem-solving and analytical skills. Strong communication and teamwork skills. Experience in at least one modern scripting or programming language, such as Python, Java, or Scala. Strong advanced SQL skills.

PREFERRED QUALIFICATIONS: AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow. Prior experience in the Indian banking segment and/or fintech is desired. Experience with non-relational databases and data stores. Building and operating highly available, distributed data processing systems for large datasets. Professional software engineering and best practices for the full software development life cycle. Designing, developing, and implementing different types of data warehousing layers. Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions. Building scalable data infrastructure and understanding distributed systems concepts. SQL, ETL, and data modelling. Ensuring the accuracy and availability of data to customers. Proficiency in at least one scripting or programming language for handling large-volume data processing. Strong presentation and communication skills. For managers: customer centricity and obsession for the customer; the ability to manage stakeholders (product owners, business stakeholders, cross-functional teams) and coach agile ways of working; the ability to structure and organize teams and streamline communication; and prior work experience executing large-scale data engineering projects.
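For illustration, a minimal Airflow DAG of the kind run on MWAA in this stack: one scheduled task standing in for a Glue/EMR load step. The DAG id, schedule, and callable are hypothetical:

# Hedged sketch: a minimal Airflow DAG as deployed on MWAA.
# DAG id, schedule, and task body are hypothetical examples.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def run_daily_load():
    # Stand-in for kicking off a Glue/EMR lakehouse load.
    print("Kick off daily lakehouse load here.")

with DAG(
    dag_id="example_daily_lakehouse_load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="daily_load", python_callable=run_daily_load)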
Posted 2 months ago
0.0 - 2.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Data Engineer - 1 (Experience: 0-2 years)

What we offer: Our mission is simple – building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team: DEX is the central data org for Kotak Bank and manages the entire data experience of the bank. DEX stands for Kotak's Data Exchange. The org comprises the Data Platform, Data Engineering and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform from on-premises solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which provides great opportunities for technology fellows to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills this team should encompass are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to be a 100+ member team, primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters. As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member of Kotak's digital transformation journey; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch and analytics solutions, in a programmatic way; and be futuristic in building systems that can be operated by machines using AI technologies.

The data platform org is divided into 3 key verticals:

Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank; a centralized data lake; managed compute and orchestration frameworks, including concepts of serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions like EMR optimizers; automations; and observability capabilities for Kotak's data platform. The team will also be the center for data engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, be skilled at sourcing data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based and programmatic way, and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank, which cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers and all analytics use cases.

Data Governance: The team will be the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship and the data quality platform. If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple systems, then this is the team for you.

Your day-to-day role will include: Drive business decisions with technical input and lead the team. Design, implement, and support a data infrastructure from scratch. Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA. Extract, transform, and load data from various sources using SQL and AWS big data technologies. Explore and learn the latest AWS technologies to enhance capabilities and efficiency. Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis. Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers. Build data platforms, data pipelines, or data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer / SDE in Data: Bachelor's degree in Computer Science, Engineering, or a related field. Experience in data engineering. Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR. Experience with data pipeline tools such as Airflow and Spark. Experience with data modeling and data quality best practices. Excellent problem-solving and analytical skills. Strong communication and teamwork skills. Experience in at least one modern scripting or programming language, such as Python, Java, or Scala. Strong advanced SQL skills.

PREFERRED QUALIFICATIONS: AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow. Prior experience in the Indian banking segment and/or fintech is desired. Experience with non-relational databases and data stores. Building and operating highly available, distributed data processing systems for large datasets. Professional software engineering and best practices for the full software development life cycle. Designing, developing, and implementing different types of data warehousing layers. Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions. Building scalable data infrastructure and understanding distributed systems concepts. SQL, ETL, and data modelling. Ensuring the accuracy and availability of data to customers. Proficiency in at least one scripting or programming language for handling large-volume data processing. Strong presentation and communication skills.
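As a hedged sketch of the S3-to-Redshift consumption layer described above, the snippet below issues a COPY through the Redshift Data API; the cluster, database, role, and table are hypothetical:

# Hedged sketch: load curated S3 data into Redshift via the Data API.
# Cluster, database, user, IAM role, and table names are hypothetical.
import boto3

rsd = boto3.client("redshift-data", region_name="ap-south-1")

copy_sql = """
COPY analytics.daily_balances
FROM 's3://example-lake/curated/daily_balances/'
IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-copy'
FORMAT AS PARQUET;
"""

resp = rsd.execute_statement(
    ClusterIdentifier="example-dex-cluster",
    Database="analytics",
    DbUser="etl_user",
    Sql=copy_sql,
)
print("Statement id:", resp["Id"])

The Data API runs the statement asynchronously, so a production job would follow up with a status check rather than assuming the COPY succeeded.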
Posted 2 months ago
3.0 - 5.0 years
3 - 5 Lacs
Bengaluru
Work from Office
What we offer: Our mission is simple – building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team: DEX is the central data org for Kotak Bank and manages the entire data experience of the bank. DEX stands for Kotak's Data Exchange. The org comprises the Data Platform, Data Engineering and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform from on-premises solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which provides great opportunities for technology fellows to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills this team should encompass are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to be a 100+ member team, primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters. As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member of Kotak's digital transformation journey; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch and analytics solutions, in a programmatic way; and be futuristic in building systems that can be operated by machines using AI technologies.

The data platform org is divided into 3 key verticals:

Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank; a centralized data lake; managed compute and orchestration frameworks, including concepts of serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions like EMR optimizers; automations; and observability capabilities for Kotak's data platform. The team will also be the center for data engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, be skilled at sourcing data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based and programmatic way, and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank, which cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers and all analytics use cases.

Data Governance: The team will be the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship and the data quality platform. If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple systems, then this is the team for you.

Your day-to-day role will include: Drive business decisions with technical input and lead the team. Design, implement, and support a data infrastructure from scratch. Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA. Extract, transform, and load data from various sources using SQL and AWS big data technologies. Explore and learn the latest AWS technologies to enhance capabilities and efficiency. Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis. Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers. Build data platforms, data pipelines, or data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer / SDE in Data: Bachelor's degree in Computer Science, Engineering, or a related field. 3-5 years of experience in data engineering. Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR. Experience with data pipeline tools such as Airflow and Spark. Experience with data modeling and data quality best practices. Excellent problem-solving and analytical skills. Strong communication and teamwork skills. Experience in at least one modern scripting or programming language, such as Python, Java, or Scala. Strong advanced SQL skills.

BASIC QUALIFICATIONS for Data Engineering Manager / Software Development Manager: 10+ years of engineering experience, most of it in the data domain. 5+ years of engineering team management experience. 10+ years of experience planning, designing, developing and delivering consumer software. Experience partnering with product or program management teams. 5+ years of experience managing data engineers, business intelligence engineers and/or data scientists. Experience designing or architecting (design patterns, reliability and scaling) new and existing systems. Experience managing multiple concurrent programs, projects and development teams in an Agile environment. Strong understanding of Data Platform, Data Engineering and Data Governance. Experience designing and developing large-scale, high-traffic applications.

PREFERRED QUALIFICATIONS: AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow. Prior experience in the Indian banking segment and/or fintech is desired. Experience with non-relational databases and data stores. Building and operating highly available, distributed data processing systems for large datasets. Professional software engineering and best practices for the full software development life cycle. Designing, developing, and implementing different types of data warehousing layers. Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions. Building scalable data infrastructure and understanding distributed systems concepts. SQL, ETL, and data modelling. Ensuring the accuracy and availability of data to customers. Proficiency in at least one scripting or programming language for handling large-volume data processing. Strong presentation and communication skills.
Posted 2 months ago
2.0 - 5.0 years
30 - 32 Lacs
Bengaluru
Work from Office
Data Engineer -2 (Experience – 2-5 years) What we offer Our mission is simple – Building trust. Our customer's trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That’s why, we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to come join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many. About our team DEX is a central data org for Kotak Bank which manages entire data experience of Kotak Bank. DEX stands for Kotak’s Data Exchange. This org comprises of Data Platform, Data Engineering and Data Governance charter. The org sits closely with Analytics org. DEX is primarily working on greenfield project to revamp entire data platform which is on premise solutions to scalable AWS cloud-based platform. The team is being built ground up which provides great opportunities to technology fellows to build things from scratch and build one of the best-in-class data lake house solutions. The primary skills this team should encompass are Software development skills preferably Python for platform building on AWS; Data engineering Spark (pyspark, sparksql, scala) for ETL development, Advanced SQL and Data modelling for Analytics. The org size is expected to be around 100+ member team primarily based out of Bangalore comprising of ~10 sub teams independently driving their charter. As a member of this team, you get opportunity to learn fintech space which is most sought-after domain in current world, be a early member in digital transformation journey of Kotak, learn and leverage technology to build complex data data platform solutions including, real time, micro batch, batch and analytics solutions in a programmatic way and also be futuristic to build systems which can be operated by machines using AI technologies. The data platform org is divided into 3 key verticals: Data Platform This Vertical is responsible for building data platform which includes optimized storage for entire bank and building centralized data lake, managed compute and orchestrations framework including concepts of serverless data solutions, managing central data warehouse for extremely high concurrency use cases, building connectors for different sources, building customer feature repository, build cost optimization solutions like EMR optimizers, perform automations and build observability capabilities for Kotak’s data platform. The team will also be center for Data Engineering excellence driving trainings and knowledge sharing sessions with large data consumer base within Kotak. Data Engineering This team will own data pipelines for thousands of datasets, be skilled to source data from 100+ source systems and enable data consumptions for 30+ data analytics products. The team will learn and built data models in a config based and programmatic and think big to build one of the most leveraged data model for financial orgs. This team will also enable centralized reporting for Kotak Bank which cuts across multiple products and dimensions. Additionally, the data build by this team will be consumed by 20K + branch consumers, RMs, Branch Managers and all analytics usecases. 
Data Governance
This team will be the central data governance team for Kotak Bank, managing the Metadata platforms, Data Privacy, Data Security, Data Stewardship, and Data Quality platforms. If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, this is the team for you.
Your day-to-day role will include:
Driving business decisions with technical input and leading the team.
Designing, implementing, and supporting data infrastructure from scratch.
Managing AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA (an orchestration sketch follows below).
Extracting, transforming, and loading data from various sources using SQL and AWS big data technologies.
Exploring and learning the latest AWS technologies to enhance capabilities and efficiency.
Collaborating with data scientists and BI engineers to adopt best practices in reporting and analysis.
Improving ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
Building data platforms, data pipelines, and data management and governance tools.
BASIC QUALIFICATIONS for Data Engineer / SDE in Data
Bachelor's degree in Computer Science, Engineering, or a related field
Experience in data engineering
Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
Experience with data pipeline tools such as Airflow and Spark
Experience with data modeling and data quality best practices
Excellent problem-solving and analytical skills
Strong communication and teamwork skills
Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
Strong advanced SQL skills
PREFERRED QUALIFICATIONS
AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
Prior experience in the Indian banking segment and/or fintech is desired
Experience with non-relational databases and data stores
Building and operating highly available, distributed data processing systems for large datasets
Professional software engineering and best practices for the full software development life cycle
Designing, developing, and implementing different types of data warehousing layers
Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
Building scalable data infrastructure and understanding distributed systems concepts
SQL, ETL, and data modelling
Ensuring the accuracy and availability of data to customers
Proficiency in at least one scripting or programming language for handling large-volume data processing
Strong presentation and communication skills
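Since the role names MWAA (managed Airflow) alongside Glue and Redshift, here is a minimal sketch of how such a pipeline might be orchestrated. The DAG, job, and connection names are hypothetical, and the operators assume the Amazon and common-SQL Airflow provider packages are installed.

```python
# Minimal Airflow DAG sketch for MWAA: trigger a Glue job, then refresh a
# Redshift materialized view. All names are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.operators.glue import GlueJobOperator
from airflow.providers.common.sql.operators.sql import SQLExecuteQueryOperator

with DAG(
    dag_id="txn_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Run the Spark ETL packaged as a Glue job (job name is hypothetical).
    run_etl = GlueJobOperator(
        task_id="run_txn_etl",
        job_name="txn-daily-etl",
        region_name="ap-south-1",
    )

    # Refresh the reporting layer in Redshift once the curated data lands.
    refresh_mart = SQLExecuteQueryOperator(
        task_id="refresh_txn_mart",
        conn_id="redshift_default",  # assumed pre-configured Airflow connection
        sql="REFRESH MATERIALIZED VIEW reporting.txn_daily_mv;",
    )

    run_etl >> refresh_mart
```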
Posted 2 months ago
3.0 - 5.0 years
30 - 35 Lacs
Bengaluru
Work from Office
What we offer
Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That is why we at Kotak Group are dedicated to transforming banking with a technology-first approach in everything we do, aiming to enhance customer experience through superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.
About our team
DEX is the central data org for Kotak Bank and manages the bank's entire data experience. DEX stands for Kotak's Data Exchange. The org comprises the Data Platform, Data Engineering, and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technologists a great opportunity to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills for this team are software development (preferably Python) for platform building on AWS; Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to grow to a 100+ member team, primarily based in Bangalore, comprising ~10 sub-teams independently driving their charters. As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; to be an early member of Kotak's digital transformation journey; to learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions, in a programmatic way; and to build forward-looking systems that can be operated by machines using AI technologies.
The data platform org is divided into three key verticals:
Data Platform
This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and a centralized data lake; a managed compute and orchestration framework, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost optimization solutions such as EMR optimizers; automation; and observability capabilities for Kotak's data platform. The team will also be the center of data engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.
Data Engineering
This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based, programmatic way and think big to build one of the most leveraged data models for a financial org. This team will also enable centralized reporting for Kotak Bank that cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers, and all analytics use cases.
Data Governance
This team will be the central data governance team for Kotak Bank, managing the Metadata platforms, Data Privacy, Data Security, Data Stewardship, and Data Quality platforms. If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, this is the team for you.
Your day-to-day role will include:
Driving business decisions with technical input and leading the team.
Designing, implementing, and supporting data infrastructure from scratch.
Managing AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
Extracting, transforming, and loading data from various sources using SQL and AWS big data technologies.
Exploring and learning the latest AWS technologies to enhance capabilities and efficiency.
Collaborating with data scientists and BI engineers to adopt best practices in reporting and analysis.
Improving ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
Building data platforms, data pipelines, and data management and governance tools.
BASIC QUALIFICATIONS for Data Engineer / SDE in Data
Bachelor's degree in Computer Science, Engineering, or a related field
3-5 years of experience in data engineering
Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
Experience with data pipeline tools such as Airflow and Spark
Experience with data modeling and data quality best practices (a data-quality gate sketch follows below)
Excellent problem-solving and analytical skills
Strong communication and teamwork skills
Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
Strong advanced SQL skills
BASIC QUALIFICATIONS for Data Engineering Manager / Software Development Manager
10+ years of engineering experience, most of it in the data domain
5+ years of engineering team management experience
10+ years of experience planning, designing, developing, and delivering consumer software
5+ years of experience managing data engineers, business intelligence engineers, and/or data scientists
Experience partnering with product and program management teams
Experience designing or architecting (design patterns, reliability, and scaling) new and existing systems
Experience managing multiple concurrent programs, projects, and development teams in an Agile environment
Strong understanding of Data Platform, Data Engineering, and Data Governance
Experience designing and developing large-scale, high-traffic applications
PREFERRED QUALIFICATIONS
AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
Prior experience in the Indian banking segment and/or fintech is desired
Experience with non-relational databases and data stores
Building and operating highly available, distributed data processing systems for large datasets
Professional software engineering and best practices for the full software development life cycle
Designing, developing, and implementing different types of data warehousing layers
Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
Building scalable data infrastructure and understanding distributed systems concepts
SQL, ETL, and data modelling
Ensuring the accuracy and availability of data to customers
Proficiency in at least one scripting or programming language for handling large-volume data processing
Strong presentation and communication skills
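Because this team also owns the Data Quality platform, a simple illustration of a data-quality gate is sketched below in PySpark. The dataset path, key columns, and expectations are hypothetical placeholders.

```python
# Minimal data-quality gate sketch in PySpark: fail the pipeline when basic
# row-count, null, and uniqueness expectations are violated. Illustrative only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()
df = spark.read.parquet("s3://curated-zone/txn_daily/")  # hypothetical dataset

errors = []

# Expectation 1: the dataset is not empty.
if df.count() == 0:
    errors.append("dataset is empty")

# Expectation 2: key columns carry no nulls.
null_keys = df.filter(F.col("account_id").isNull()).count()
if null_keys > 0:
    errors.append(f"{null_keys} rows with null account_id")

# Expectation 3: one row per (account_id, txn_date).
dupes = df.groupBy("account_id", "txn_date").count().filter("count > 1").count()
if dupes > 0:
    errors.append(f"{dupes} duplicate account/date pairs")

# Surface all violations at once so the orchestrator marks the task failed.
if errors:
    raise ValueError("Data-quality checks failed: " + "; ".join(errors))
```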
Posted 2 months ago
3.0 - 5.0 years
4 - 8 Lacs
Bengaluru
Work from Office
What we offer
Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That is why we at Kotak Group are dedicated to transforming banking with a technology-first approach in everything we do, aiming to enhance customer experience through superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.
About our team
DEX is the central data org for Kotak Bank and manages the bank's entire data experience. DEX stands for Kotak's Data Exchange. The org comprises the Data Platform, Data Engineering, and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technologists a great opportunity to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills for this team are software development (preferably Python) for platform building on AWS; Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to grow to a 100+ member team, primarily based in Bangalore, comprising ~10 sub-teams independently driving their charters. As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; to be an early member of Kotak's digital transformation journey; to learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions, in a programmatic way; and to build forward-looking systems that can be operated by machines using AI technologies.
The data platform org is divided into three key verticals:
Data Platform
This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and a centralized data lake; a managed compute and orchestration framework, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost optimization solutions such as EMR optimizers; automation; and observability capabilities for Kotak's data platform. The team will also be the center of data engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.
Data Engineering
This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based, programmatic way and think big to build one of the most leveraged data models for a financial org. This team will also enable centralized reporting for Kotak Bank that cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers, and all analytics use cases.
Data Governance
This team will be the central data governance team for Kotak Bank, managing the Metadata platforms, Data Privacy, Data Security, Data Stewardship, and Data Quality platforms. If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, this is the team for you.
Your day-to-day role will include:
Driving business decisions with technical input and leading the team.
Designing, implementing, and supporting data infrastructure from scratch.
Managing AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
Extracting, transforming, and loading data from various sources using SQL and AWS big data technologies.
Exploring and learning the latest AWS technologies to enhance capabilities and efficiency.
Collaborating with data scientists and BI engineers to adopt best practices in reporting and analysis.
Improving ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
Building data platforms, data pipelines, and data management and governance tools.
BASIC QUALIFICATIONS for Data Engineer / SDE in Data
Bachelor's degree in Computer Science, Engineering, or a related field
3-5 years of experience in data engineering
Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
Experience with data pipeline tools such as Airflow and Spark
Experience with data modeling and data quality best practices
Excellent problem-solving and analytical skills
Strong communication and teamwork skills
Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
Strong advanced SQL skills
BASIC QUALIFICATIONS for Data Engineering Manager / Software Development Manager
10+ years of engineering experience, most of it in the data domain
5+ years of engineering team management experience
10+ years of experience planning, designing, developing, and delivering consumer software
5+ years of experience managing data engineers, business intelligence engineers, and/or data scientists
Experience partnering with product and program management teams
Experience designing or architecting (design patterns, reliability, and scaling) new and existing systems
Experience managing multiple concurrent programs, projects, and development teams in an Agile environment
Strong understanding of Data Platform, Data Engineering, and Data Governance
Experience designing and developing large-scale, high-traffic applications
PREFERRED QUALIFICATIONS
AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow (a Redshift load sketch follows below)
Prior experience in the Indian banking segment and/or fintech is desired
Experience with non-relational databases and data stores
Building and operating highly available, distributed data processing systems for large datasets
Professional software engineering and best practices for the full software development life cycle
Designing, developing, and implementing different types of data warehousing layers
Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
Building scalable data infrastructure and understanding distributed systems concepts
SQL, ETL, and data modelling
Ensuring the accuracy and availability of data to customers
Proficiency in at least one scripting or programming language for handling large-volume data processing
Strong presentation and communication skills
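As one concrete illustration of the Redshift skills listed above, the sketch below loads curated Parquet from S3 into Redshift using boto3's Redshift Data API. The cluster, database, IAM role ARN, and table names are hypothetical placeholders.

```python
# Minimal sketch of loading curated S3 data into Redshift via the Redshift
# Data API. All identifiers are illustrative placeholders.
import time

import boto3

client = boto3.client("redshift-data", region_name="ap-south-1")

copy_sql = """
    COPY reporting.txn_daily
    FROM 's3://curated-zone/txn_daily/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy-role'
    FORMAT AS PARQUET;
"""

resp = client.execute_statement(
    ClusterIdentifier="analytics-cluster",  # placeholder cluster
    Database="analytics",
    DbUser="etl_user",
    Sql=copy_sql,
)

# Poll until the asynchronous statement finishes (simplified; no backoff).
while True:
    status = client.describe_statement(Id=resp["Id"])["Status"]
    if status in ("FINISHED", "FAILED", "ABORTED"):
        break
    time.sleep(2)

print("COPY status:", status)
```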
Posted 2 months ago
15.0 - 20.0 years
15 - 19 Lacs
Ahmedabad
Work from Office
Project Role: Technology Architect
Project Role Description: Review and integrate all application requirements, including functional, security, integration, performance, quality, and operations requirements. Review and integrate the technical architecture requirements. Provide input into final decisions regarding hardware, network products, system software, and security.
Must-have skills: Amazon Web Services (AWS)
Good-to-have skills: Java Full Stack Development
Minimum 7.5 years of experience is required.
Educational Qualification: 15 years of full-time education.
Key Responsibilities:
1. Experience designing multiple cloud-native application architectures.
2. Experience developing and deploying cloud-native applications, including serverless environments such as Lambda.
3. Optimize applications for the AWS environment.
4. Design, build, and configure applications on AWS to meet business process and application requirements.
5. Understanding of security, performance, and cost optimization on AWS.
6. Understanding of AWS Well-Architected best practices.
Technical Experience:
1. 8-15 years of experience in the industry, with at least 5 years on AWS.
2. Strong development background with exposure to the majority of AWS services.
3. AWS Certified Developer - Professional and/or an AWS specialty-level certification (DevOps/Security).
4. Application development skills on the AWS platform with the Java SDK, Python SDK, or React.js.
5. Strong coding skills in any of Python, Node.js, Java, or .NET, with an understanding of AWS architectures across containerization, microservices, and serverless.
6. Preferred: knowledge of Cost Explorer, budgeting, and tagging in AWS.
7. Experience with DevOps tools, including AWS-native tools such as CodeDeploy.
Professional Attributes:
a. Ability to harvest solutions and promote reusability across implementations.
b. Self-motivated experts who can work under their own direction with the right design-thinking expertise.
c. Proven interpersonal skills, contributing to team effort by accomplishing related results as needed.
Additional Info:
1. Application development skills on the AWS platform with the Java SDK, Python SDK, Node.js, or React.js.
2. AWS services: Lambda, AWS Amplify, AWS App Runner, AWS CodePipeline, AWS Cloud9, EBS, Fargate.
Additional comments: Only Bangalore; no location flex and no level flex.
Qualification: 15 years of full-time education.
Posted 2 months ago
15.0 - 20.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: Microservices and Lightweight Architecture
Good-to-have skills: NA
Minimum 15 years of experience is required.
Educational Qualification: 15 years full-time education.
Modernization Lead:
Lead modernization initiatives by re-architecting legacy systems using Java, applying modern software design principles and AWS-based architecture patterns.
Drive end-to-end modernization efforts, including re-architecture, refactoring of legacy systems, and cloud migration strategies.
Provide architectural guidance and mentorship to engineering teams, fostering best practices in code quality, design, testing, and deployment.
Apply Domain-Driven Design (DDD) principles to structure systems aligned with core business domains, ensuring modular and maintainable solutions.
Design and implement scalable, decoupled services leveraging AWS services such as EKS, Lambda, API Gateway, SQS/SNS, and Oracle/RDS (see the event-publishing sketch below).
Drive system decomposition, refactoring, and migration planning with a clear understanding of system interdependencies and data flows.
Promote infrastructure-as-code, CI/CD automation, and observability practices to ensure system reliability, performance, and operational readiness.
Proficient in architecting applications with the Java and AWS technology stack, microservices, and containers.
Qualification: 15 years full-time education
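To make the decoupled-services pattern above concrete, here is a minimal event-publishing sketch: a service emits a domain event to SNS, and downstream consumers subscribe via SQS. The topic ARN and event shape are hypothetical; the role's stack is Java, but the same publish API exists across the AWS SDKs, and boto3 is used here only for brevity.

```python
# Minimal sketch of publishing a domain event to SNS for SQS fan-out.
# Topic ARN, event fields, and values are illustrative placeholders.
import json

import boto3

sns = boto3.client("sns", region_name="ap-south-1")

event = {
    "eventType": "AccountMigrated",  # hypothetical domain event
    "accountId": "ACC-1042",
    "migratedAt": "2024-01-01T10:00:00Z",
}

sns.publish(
    TopicArn="arn:aws:sns:ap-south-1:123456789012:account-events",  # placeholder
    Message=json.dumps(event),
    # Message attributes let SQS subscriptions filter by event type.
    MessageAttributes={
        "eventType": {"DataType": "String", "StringValue": event["eventType"]},
    },
)
```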
Posted 2 months ago
3.0 - 8.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum 5 years of experience is required.
Educational Qualification: 15 years full-time education
Summary:
The Data Engineering Sr. Advisor demonstrates expertise in data engineering technologies with a focus on engineering, innovation, strategic influence, and a product mindset. This individual will be a key contributor on the team, designing, building, testing, and delivering large-scale software applications, systems, platforms, services, or technologies in the data engineering space, with the opportunity to work directly with partner IT and business teams, owning and driving major deliverables across all aspects of software delivery.
The candidate will play a key role in automating processes on Databricks and AWS, collaborating with business and technology partners to gather requirements, develop, and implement solutions. The individual must have strong analytical and technical skills coupled with the ability to positively influence the delivery of data engineering products. The applicant will work in a team that demands an innovative, cloud-first, self-service-first, and automation-first mindset coupled with technical excellence, and will work with internal and external stakeholders and customers to build solutions as part of Enterprise Data Engineering, which requires very strong technical and communication skills.
Delivery: Intermediate delivery skills, including the ability to deliver work at a steady, predictable pace to achieve commitments, decompose work assignments into small-batch releases, and contribute to trade-off and negotiation discussions.
Domain Expertise: A demonstrated track record of domain expertise, including the ability to understand the technical concepts necessary to do the job effectively, demonstrate willingness, cooperation, and concern for business issues, and possess in-depth knowledge of the systems worked on.
Problem Solving: Proven problem-solving and debugging skills, allowing you to determine the source of issues in unfamiliar code or systems; the ability to recognize and solve repetitive problems rather than working around them; to recognize mistakes and use them as learning opportunities; and to break large problems down into smaller, more manageable ones.
Responsibilities:
The candidate will be responsible for delivering business needs end to end, from requirements to development into production.
Through a hands-on engineering approach in the Databricks environment, this individual will deliver data engineering toolchains, platform capabilities, and reusable patterns.
The applicant will follow software engineering best practices with an automation-first approach and a continuous learning and improvement mindset.
The applicant will ensure adherence to enterprise architecture direction and architectural standards.
The applicant should be able to collaborate in a high-performing team environment and have the ability to influence and be influenced by others.
Experience Required:
More than 12 years of experience in software engineering, building data engineering pipelines, middleware and API development, and automation
More than 3 years of experience in Databricks within an AWS environment
Data engineering experience
Experience Desired:
Expertise in Agile software development principles and patterns
Expertise in building streaming, batch, and event-driven architectures and data pipelines
Primary Skills:
Cloud-based security principles and protocols such as OAuth2, JWT, data encryption, hashing, and secret management
Expertise in big data technologies such as Spark, Hadoop, Databricks, Snowflake, EMR, and Glue
Good understanding of Kafka, Kafka Streams, Spark Structured Streaming, and configuration-driven data transformation and curation (see the streaming sketch below)
Expertise in building cloud-native microservices, containers, Kubernetes, and platform-as-a-service technologies such as OpenShift and Cloud Foundry
Experience with multi-cloud software-as-a-service products such as Databricks and Snowflake
Experience with Infrastructure-as-Code (IaC) tools such as Terraform and AWS CloudFormation
Experience with messaging systems such as Apache ActiveMQ, WebSphere MQ, Apache Artemis, Kafka, and AWS SNS
Experience with API and microservices stacks such as Spring Boot and Quarkus
Expertise in cloud technologies such as AWS Glue, Lambda, S3, Elasticsearch, API Gateway, and CloudFront
Experience with one or more of the following programming and scripting languages: Python, Scala, JVM-based languages, or JavaScript, and the ability to pick up new languages
Experience building CI/CD pipelines using Jenkins and GitHub Actions
Strong expertise with source code management and its best practices
Proficiency in self-testing of applications, unit testing, use of mock frameworks, and test-driven development (TDD)
Knowledge of the Behavior-Driven Development (BDD) approach
Additional Skills:
Ability to perform detailed analysis of business problems and technical environments
Strong oral and written communication skills
Ability to think strategically, implement iteratively, and estimate the financial impact of design/architecture alternatives
Continuous focus on ongoing learning and development
Qualification: 15 years full-time education
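As an illustration of the Kafka and Spark Structured Streaming skills listed above, the sketch below consumes JSON events from Kafka and appends them to a Delta table, as is common on Databricks. Brokers, topic, schema, and paths are hypothetical placeholders, and the Kafka source assumes the spark-sql-kafka connector is on the classpath.

```python
# Minimal Spark Structured Streaming sketch: Kafka JSON events -> Delta table.
# All connection details and columns are illustrative placeholders.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("event-stream").getOrCreate()

schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_ts", TimestampType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")  # placeholder brokers
    .option("subscribe", "app-events")                  # placeholder topic
    .load()
    # Kafka values arrive as bytes; cast to string and parse the JSON payload.
    .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "s3://checkpoints/app-events/")  # placeholder
    .outputMode("append")
    .start("s3://curated-zone/app_events/")
)
query.awaitTermination()
```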
Posted 2 months ago
7.0 - 12.0 years
7 - 11 Lacs
Bengaluru
Work from Office
Project Role: Cloud Native Engineer
Project Role Description: Select and deploy appropriate cloud-native tools to accelerate application development. Knowledge of the target cloud-native tools is necessary, and this role can specialize in one specific cloud, e.g., Azure, AWS, or GCP.
Must-have skills: Amazon Connect
Good-to-have skills: Java, Python (Programming Language), Node.js
Minimum 7.5 years of experience is required.
Educational Qualification: 15 years full-time education
Summary:
As a Cloud Native Engineer, you will be responsible for selecting and deploying suitable cloud-native tools to expedite application development. Expertise in the target cloud-native tools is crucial, and this role may focus on a specific cloud platform such as Azure, AWS, or GCP.
Roles & Responsibilities:
Manage and guide a team of engineers to design, build, and deploy cloud-native solutions using AWS services.
Own project planning, resource allocation, and delivery management while ensuring adherence to timelines and quality standards.
Architect scalable and secure solutions using AWS Connect, Amazon Lex, Lambda, and API Gateway (a minimal Lambda handler sketch follows below).
Act as the primary point of contact for technical and delivery matters, liaising with cross-functional teams and business stakeholders.
Promote best practices in cloud architecture, including the AWS Well-Architected Framework.
Drive team performance through regular coaching, performance reviews, and mentorship.
Ensure continuous integration and delivery pipelines are in place, and champion DevOps culture.
Oversee the implementation of Infrastructure as Code using Terraform or similar tools.
Must have:
Deep experience with AWS Connect, Amazon Lex, AWS Lambda, and API Gateway
Strong programming knowledge in at least one language: Python, Node.js, or Java
Proven leadership experience managing cloud/DevOps teams
Solid grasp of serverless architecture, microservices, and cloud-native design patterns
Excellent communication, planning, and stakeholder management skills
Professional & Technical Skills:
Must-have skills: Proficiency in Amazon Connect.
Good-to-have skills: Experience with Python (Programming Language), Node.js, Java.
Strong understanding of cloud-native architecture principles.
Hands-on experience in deploying cloud-native applications.
Knowledge of containerization technologies like Docker and Kubernetes.
Familiarity with DevOps tools and practices.
Hands-on experience with Terraform for managing infrastructure.
In-depth understanding of the AWS Well-Architected Framework.
Certifications: AWS Certified Solutions Architect Associate/Professional; any Associate-, Professional-, or Specialty-level certification is mandatory.
Additional Information:
The candidate should have a minimum of 7.5 years of experience in Amazon Connect.
This position is based at our Bengaluru office.
A 15-year full-time education is required.
Qualification: 15 years full-time education
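To illustrate the serverless pattern named above, here is a minimal AWS Lambda handler for an API Gateway proxy-integration route. The route, query parameter, and response shape are illustrative only, not part of the role description.

```python
# Minimal AWS Lambda handler sketch for an API Gateway proxy integration.
# The parameter name and response body are hypothetical placeholders.
import json


def handler(event, context):
    # Proxy-integration events carry query parameters under this key
    # (the value is None when the request has no query string).
    name = (event.get("queryStringParameters") or {}).get("name", "world")

    # API Gateway expects statusCode/headers/body in the returned dict.
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```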
Posted 2 months ago