
864 Lambda Expressions Jobs - Page 7

Set up a job alert
JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

6.0 - 9.0 years

11 - 16 Lacs

Gurugram

Work from Office

Role Description
As a Java Developer, you will be responsible for developing, maintaining, and enhancing Java-based applications, with a particular focus on integrating MQ (message queue) systems, deploying applications in AWS, and working with MongoDB for database integration. You will collaborate with cross-functional teams to deliver reliable and efficient solutions for both internal and external applications, ensuring seamless performance and scalability.

Key Responsibilities:
- Java Development: Design, develop, test, and maintain Java-based applications with a focus on performance, scalability, and reliability.
- MQ Integration: Integrate and maintain MQ queues (IBM MQ, ActiveMQ, RabbitMQ, etc.) within Java applications, ensuring reliable message delivery and seamless data exchange.
- AWS Deployment: Deploy Java applications to AWS cloud infrastructure using services such as EC2, S3, Lambda, and RDS. Manage and monitor application performance and resources in AWS.
- MongoDB Integration: Design, integrate, and optimize NoSQL database solutions within Java applications. Ensure data is stored, retrieved, and updated efficiently.
- Messaging Systems: Work with messaging protocols such as JMS (Java Message Service) to implement and manage reliable message delivery between distributed systems.
- Troubleshooting & Optimization: Identify and resolve issues related to MQ messaging, AWS deployments, MongoDB integration, and overall application performance.
- Collaboration: Work closely with backend developers, QA teams, and DevOps teams to ensure smooth integration and deployment of applications.
- Quality Assurance: Write unit tests and integration tests to ensure code quality and application reliability.
- Documentation: Create and maintain comprehensive documentation for application configurations, deployment processes, and integrations.
Technical Skills
- Programming Languages: Strong proficiency in Java (J2EE), with experience designing and developing scalable applications.
- MQ Integration: Hands-on experience configuring, managing, and troubleshooting MQ queues (IBM MQ, ActiveMQ, RabbitMQ, or similar) within Java applications.
- AWS Knowledge: Experience deploying, managing, and monitoring Java applications in AWS using services such as EC2, S3, Lambda, and RDS.
- MongoDB Integration: Experience integrating MongoDB into Java applications, including the MongoDB Java driver, queries, indexing, and performance optimization.
- Messaging Protocols: Familiarity with messaging protocols such as JMS (Java Message Service) for message-driven applications.
- Frameworks & Tools: Experience with Java frameworks such as Spring, Spring Boot, Hibernate, and JPA, along with tools like Docker for containerization and CI/CD pipelines.
- Database Knowledge: Strong experience with MongoDB (NoSQL) as well as relational (SQL) databases.
- Problem-Solving: Strong debugging and problem-solving skills for issues related to MQ integration, AWS deployment, MongoDB integration, and application performance.
- Version Control: Familiarity with version control systems such as Git or SVN.
- Soft Skills: Strong communication skills and the ability to work collaboratively within a team.

Qualifications
- Experience: 6 to 9 years of hands-on Java development experience, with at least 3-5 years focused on MQ integration, MongoDB integration, and deploying applications in AWS.
- Certifications: Relevant certifications in Java, AWS, MongoDB, or MQ technologies are a plus.
- Cloud Knowledge: Experience with AWS CloudFormation, Lambda, or other AWS services related to Java application deployment.
- DevOps Knowledge: Familiarity with CI/CD pipelines, automation tools, and best practices for deploying applications in the cloud.

Posted 1 month ago

Apply

7.0 - 11.0 years

0 Lacs

Guwahati, Assam

On-site

The Developer role entails overseeing the development, implementation, and technical aspects of software projects to ensure the successful realization of the technical vision and strategy. This includes upholding technical standards, ensuring code quality, and maintaining the overall technical integrity of the project. The position requires a minimum of 7 years of experience and a B.E./B.Tech in any specialization or an MCA. The job location is Guwahati, Assam.

The ideal candidate should possess expertise in core Java concepts, object-oriented programming principles, and Java features such as lambda expressions and streams, as well as experience developing enterprise-level applications using Java EE technologies. Proficiency in the Spring framework for building scalable applications, Spring Boot for rapid microservices development, and ORM concepts with frameworks like Hibernate is essential. Additionally, skills in web development using HTML, CSS, and JavaScript, along with experience analyzing and optimizing Java applications for performance, are required. Experience working in Agile/Scrum environments, with relational databases such as MariaDB, MySQL, PostgreSQL, or Oracle, and with version control systems is crucial. Proficiency in implementing CI/CD pipelines using tools like Jenkins, GitLab CI, or Travis CI, along with automated testing and deployment processes and familiarity with containerization technologies like Docker, is preferred. Knowledge of building microservices-based architectures and an understanding of service discovery, load balancing, and API gateways are advantageous.

Responsibilities include collaborating with stakeholders to understand requirements and technical challenges, designing system architecture, writing and optimizing front-end and back-end code, integrating third-party services, implementing performance optimizations, and setting up CI/CD pipelines. Monitoring system health, providing maintenance, documenting code, working closely with the team, ensuring security best practices, and suggesting process improvements are also core duties. The Developer will be responsible for staying current with new technologies, monitoring application response times, maintaining software documentation, recording support activities, conducting feasibility studies, writing efficient code, executing tests, debugging and resolving issues, participating in team meetings and code reviews, and identifying areas for process improvement. Compliance with ISO 9001, ISO 20000, ISO 27001, and CMMI Level 5 standards is essential. Fluency in English and Hindi (speaking, reading, and writing) is required; fluency in Assamese is preferred. The position was posted on June 30, 2025, and the last date for submission is July 31, 2025.
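The lambda expressions and streams called out above are this page's namesake skill; as a minimal sketch of the filter/map/reduce pipeline style they enable, shown in Python for brevity (a Java version would use Stream.filter/map with a collector), with hypothetical data:

```python
# Sketch of a lambda-expression / stream-style pipeline: filter records,
# project a field, then aggregate. The order records are hypothetical.
orders = [
    {"id": 1, "amount": 250.0, "status": "PAID"},
    {"id": 2, "amount": 75.5,  "status": "PENDING"},
    {"id": 3, "amount": 410.0, "status": "PAID"},
]

# filter -> map -> reduce, built entirely from lambda expressions
paid_total = sum(
    map(lambda o: o["amount"],
        filter(lambda o: o["status"] == "PAID", orders))
)
print(paid_total)  # 660.0
```

The same shape appears in Java as `orders.stream().filter(...).mapToDouble(...).sum()`.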

Posted 1 month ago

Apply

6.0 - 10.0 years

15 - 30 Lacs

Pune

Hybrid

5+ years' experience in the AWS ecosystem (EKS, EC2, DynamoDB, Lambda). Should have worked with an observability team and have Dynatrace experience: site monitoring, trend analysis, log analysis, and implementing capacity-planning strategies. Good DevOps practices.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Thiruvananthapuram, Kerala

On-site

As an experienced .NET Developer with 5-8 years of overall experience, you will be responsible for developing web and Windows applications using .NET programming components. In this role, you must have a minimum of 4 years of experience in ASP.NET/C# development, with the ability to customize current applications according to changing business requirements. Additionally, you will be expected to mentor a team of 2-3 members and ensure timely completion of the software development life cycle phases. Your expertise should include proficiency in ASP.NET, C#, VB.NET, JavaScript, jQuery, VBScript, HTML, XML, and Ajax. Knowledge of ADO.NET/SQL Server is essential, and experience in Angular is advantageous. Familiarity with MVC/WCF/WPF, design patterns, LINQ, lambda expressions, and SQL scripting (stored procedures, functions) is desirable. Experience in source and configuration management using Git/SVN, API integrations, and cloud development will be beneficial for this role. Located at Technopark Phase-1, Trivandrum, Kerala, this is a permanent position that requires you to work from the office. If you are ready to take the next step in your career, please send your CV and details to career.mpt@muthoot.com and join our team to contribute to our success.

Posted 1 month ago

Apply

7.0 - 12.0 years

10 - 14 Lacs

Gurugram

Work from Office

Company Overview
Incedo is a US-based consulting, data science and technology services firm with over 3,000 people helping clients from our six offices across the US, Mexico and India. We help our clients achieve competitive advantage through end-to-end digital transformation. Our uniqueness lies in bringing together strong engineering, data science, and design capabilities coupled with deep domain understanding. We combine services and products to maximize business impact for our clients in the telecom, banking, wealth management, product engineering, and life science & healthcare industries. Working at Incedo will provide you an opportunity to work with industry-leading client organizations, deep technology and domain experts, and global teams. Incedo University, our learning platform, provides ample learning opportunities, starting with a structured onboarding program and carrying through various stages of your career. A variety of fun activities is also an integral part of our friendly work environment. Our flexible career paths allow you to grow into a program manager, a technical architect or a domain expert based on your skills and interests. Our mission is to enable our clients to maximize business impact from technology by harnessing the transformational impact of emerging technologies and bridging the gap between business and technology.

Role Description
As an AWS Data Engineer, your role will be to design, develop, and maintain scalable data pipelines on AWS. You will work closely with technical analysts, client stakeholders, data scientists, and other team members to ensure data quality and integrity while optimizing data storage solutions for performance and cost-efficiency. This role requires leveraging AWS native technologies and Databricks for data transformations and scalable data processing.

Responsibilities
Lead and support the delivery of data platform modernization projects.
Design and develop robust, scalable data pipelines leveraging AWS native services. Optimize ETL processes, ensuring efficient data transformation. Migrate workflows from on-premise to the AWS cloud, ensuring data quality and consistency. Design automations and integrations to resolve data inconsistencies and quality issues. Perform system testing and validation to ensure successful integration and functionality. Implement security and compliance controls in the cloud environment. Ensure data quality pre- and post-migration through validation checks, addressing issues of completeness, consistency, and accuracy. Collaborate with data architects and lead developers to identify and document manual data movement workflows and design automation strategies.

Qualifications
- 7+ years' experience with a core data engineering skillset leveraging AWS native technologies (AWS Glue, Python, Snowflake, S3, Redshift).
- Experience designing and developing robust, scalable data pipelines leveraging AWS native services.
- Proficiency in leveraging Snowflake for data transformations, ETL pipeline optimization, and scalable data processing.
- Experience with streaming and batch data pipeline/engineering architectures.
- Familiarity with DataOps concepts and tooling for source control and setting up CI/CD pipelines on AWS.
- Hands-on experience with Databricks and a willingness to grow capabilities.
- Experience with data engineering and storage solutions (AWS Glue, EMR, Lambda, Redshift, S3).
- Strong problem-solving and analytical skills.
- Knowledge of Dataiku is needed.
- Graduate/Post-Graduate degree in Computer Science or a related field.
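The pre- and post-migration validation checks described above (completeness, consistency, accuracy of data sets) can be sketched as a small comparison routine; a minimal illustration, where the field names and records are hypothetical:

```python
# Minimal sketch of pre/post-migration data-quality checks: row-count
# completeness plus per-column null consistency between source and target.
# Dataset shape and field names are hypothetical.
def validate_migration(source_rows, target_rows, required_fields):
    """Return a dict of data-quality findings comparing source vs. target."""
    findings = {
        "row_count_match": len(source_rows) == len(target_rows),
        "missing_fields": [],  # (field, null_count) pairs found in the target
    }
    for field in required_fields:
        nulls = sum(1 for row in target_rows if row.get(field) is None)
        if nulls:
            findings["missing_fields"].append((field, nulls))
    return findings

source = [{"id": 1, "name": "a"}, {"id": 2, "name": "b"}]
target = [{"id": 1, "name": "a"}, {"id": 2, "name": None}]
report = validate_migration(source, target, ["id", "name"])
print(report)  # row counts match, but 'name' has one null in the target
```

In practice checks like these would run inside Glue or Databricks jobs against full tables rather than in-memory lists.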

Posted 1 month ago

Apply

3.0 - 8.0 years

6 - 10 Lacs

Gurugram

Work from Office

Role Description
Understands the process flow and its impact on the project module outcome. Works on coding assignments for specific technologies based on the project requirements and available documentation. Debugs basic software components and identifies code defects. Focuses on building depth in project-specific technologies and is expected to develop domain knowledge along with technical skills. Effectively communicates with team members, project managers and clients, as required. A proven high performer and team player, with the ability to take the lead on projects.

Responsibilities:
- Design and create S3 buckets and folder structures (raw, cleansed_data, output, script, temp-dir, spark-ui)
- Develop AWS Lambda functions (Python/Boto3) to download Bhav Copy via REST API and ingest it into S3
- Author and maintain AWS Glue Spark jobs to partition data by scrip, year and month, and to convert CSV to Parquet with Snappy compression
- Configure and run AWS Glue Crawlers to populate the Glue Data Catalog
- Write and optimize AWS Athena SQL queries to generate business-ready datasets
- Monitor, troubleshoot and tune data workflows for cost and performance
- Document architecture, code and operational runbooks
- Collaborate with analytics and downstream teams to understand requirements and deliver SLAs

Technical Skills
- 3+ years hands-on experience with AWS data services (S3, Lambda, Glue, Athena)
- PostgreSQL basics
- Proficiency in SQL and data partitioning strategies
- Experience with the Parquet file format and compression techniques (Snappy)
- Ability to configure Glue Crawlers and manage the AWS Glue Data Catalog
- Understanding of serverless architecture and best practices in security, encryption and cost control
- Good documentation, communication and problem-solving skills

Qualifications
- 3-5 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred
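The Glue job duties above (partition by scrip, year and month; CSV to Parquet with Snappy compression) imply Hive-style S3 keys that Glue Crawlers and Athena can discover as partitions. A minimal sketch of that key layout, where the prefix and file-naming convention are assumptions:

```python
# Sketch of building Hive-style partitioned S3 keys for the cleansed_data
# layer described above (partitioned by scrip, year and month). The prefix
# and part-file naming are hypothetical; Glue Crawlers infer partition
# columns from key=value path segments shaped like this.
from datetime import date

def partition_key(prefix: str, scrip: str, trade_date: date, part: int = 0) -> str:
    """Build an S3 object key with scrip/year/month partition segments."""
    return (
        f"{prefix}/scrip={scrip}"
        f"/year={trade_date.year}"
        f"/month={trade_date.month:02d}"
        f"/part-{part:05d}.snappy.parquet"
    )

key = partition_key("cleansed_data", "RELIANCE", date(2024, 7, 1))
print(key)
# cleansed_data/scrip=RELIANCE/year=2024/month=07/part-00000.snappy.parquet
```

Athena queries that filter on `scrip`, `year` and `month` then prune to only the matching key prefixes, which is the cost/performance benefit of this layout.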

Posted 1 month ago

Apply

2.0 - 4.0 years

6 - 11 Lacs

Bengaluru

Work from Office

Zeta Global is looking for an experienced Machine Learning Engineer with industry-proven, hands-on experience delivering machine learning models to production to solve business problems. To be a good fit for our AI/ML team, you should ideally:
- Be a thought leader who can work with cross-functional partners to foster a data-driven organisation.
- Be a strong team player, with experience contributing to a large project as part of a collaborative team effort.
- Have extensive knowledge of, and expertise in, machine learning engineering best practices and industry standards.
- Empower the product and engineering teams to make data-driven decisions.

What you need to succeed:
- 2 to 4 years of proven experience as a Machine Learning Engineer in a professional setting.
- Proficiency in a programming language (Python preferred).
- Prior experience building and deploying machine learning systems.
- Experience with containerization: Docker and Kubernetes.
- Experience with AWS cloud services such as EKS, ECS, EMR, and Lambda.
- Fluency with workflow management tools like Airflow or dbt.
- Familiarity with distributed batch compute technologies such as Spark.
- Experience with modern data warehouses like Snowflake or BigQuery.
- Knowledge of MLflow, Feast, and Terraform is a plus.

Posted 1 month ago

Apply

3.0 - 6.0 years

5 - 9 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Experience required: 3-6 years
Tech stack: Java 17/21; hands-on AWS experience with services such as ECS, Lambda, SQS, IoT Core, Kinesis, ECR, S3, Secrets Manager, and CloudFormation templates; JUnit; Cucumber; Python 3.9 or above; CI/CD
Soft skills: Good team collaborator, quick learner

Posted 1 month ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Mumbai

Work from Office

Role Purpose
The purpose of this role is to design, test and maintain software programs for operating systems or applications to be deployed at a client end, and to ensure they meet 100% of quality assurance parameters.

Responsibilities:
- Design and implement data modeling, data ingestion and data processing for various datasets
- Design, develop and maintain an ETL framework for various new data sources
- Develop data ingestion using AWS Glue/EMR, and data pipelines using PySpark, Python and Databricks
- Build orchestration workflows using Airflow and Databricks job workflows
- Develop and execute ad hoc data ingestion to support business analytics
- Proactively interact with vendors on any questions and report status accordingly
- Explore and evaluate tools/services to support business requirements
- Help create a data-driven culture and impactful data strategies
- Show aptitude for learning new technologies and solving complex problems

Qualifications:
- Minimum of a bachelor's degree, preferably in Computer Science, Information Systems, or Information Technology
- Minimum 5 years of experience on cloud platforms such as AWS, Azure, GCP
- Minimum 5 years of experience with Amazon Web Services such as VPC, S3, EC2, Redshift, RDS, EMR, Athena, IAM, Glue, DMS, Data Pipeline & API, Lambda, etc.
- Minimum 5 years of experience in ETL and data engineering using Python, AWS Glue, AWS EMR/PySpark and Airflow for orchestration
- Minimum 2 years of experience in Databricks, including Unity Catalog, data engineering, job workflow orchestration and dashboard generation based on business requirements
- Minimum 5 years of experience in SQL and Python, with source control such as Bitbucket and CI/CD for code deployment
- Experience with PostgreSQL, SQL Server, MySQL and Oracle databases
- Experience with MPP systems such as AWS Redshift, AWS EMR, and Databricks SQL warehouse and compute clusters
- Experience in distributed programming with Python, Unix scripting, MPP and RDBMS databases for data integration
- Experience building distributed, high-performance systems using Spark/PySpark and AWS Glue, and developing applications for loading/streaming data into Databricks SQL warehouse and Redshift
- Experience with Agile methodology
- Proven ability to write technical specifications for data extraction and good quality code
- Experience with big data processing techniques using Sqoop, Spark and Hive is an additional plus
- Experience with data visualization tools including Power BI and Tableau
- Nice to have: UI experience using the Python Flask framework and Angular

Mandatory Skills: Python for Insights
Experience: 5-8 years.
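The Airflow and Databricks job workflows mentioned above are, at bottom, dependency graphs of tasks that must run in topological order; a minimal standard-library sketch of that ordering, with hypothetical task names:

```python
# Sketch of the task-dependency ordering that underlies Airflow-style
# orchestration: each task runs only after all of its upstream tasks.
# Task names are hypothetical.
from graphlib import TopologicalSorter

# ingest -> transform -> {load_warehouse, refresh_dashboard}
dag = {
    "transform": {"ingest"},
    "load_warehouse": {"transform"},
    "refresh_dashboard": {"transform"},
}

order = list(TopologicalSorter(dag).static_order())
print(order)  # 'ingest' first; downstream tasks follow their dependencies
```

Airflow expresses the same structure with operators and `>>` dependencies, and its scheduler additionally handles retries, backfills and parallelism.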

Posted 1 month ago

Apply

3.0 - 5.0 years

5 - 9 Lacs

Bengaluru

Work from Office

About the Opportunity
Job Type: Application. Closing date: 31 July 2025
Title: Technical Specialist
Department: Technology - Corporate Enablers (CFO Technology)
Location: Gurgaon / Bengaluru (Bangalore), India
Reports To: Senior Manager, Level 4

About your team
The Corporate Enablers technology function provides IT services globally to multiple business functions such as Finance, HR and General Counsel. CFO Technology collaborates with Finance and Procurement stakeholders globally to develop and support the business applications that underpin all core finance processes across FIL. This includes on-premises and SaaS solutions, both in-house built and vendor provided. There is a strong focus on data analytics, workflow and automation tools to bring greater efficiency to these functions, together with a continued move towards greater use of Agile and DevOps practices. CFO Technology is a global team with people based in the UK, China, and India.

About your role
Join our team of enthusiastic technologists as we build the future of our cloud-based integration platform. We are seeking a skilled and experienced full stack developer to join our team. The ideal candidate will have a strong background in API development and PL/SQL stored procedures, along with a good understanding of Kubernetes, AWS, and SnapLogic cloud-native technologies. This role requires deep technical expertise and the ability to work in a dynamic and fast-paced environment.

About you
Essential skills:
- Minimum 7 years of overall full stack (Python, Oracle/PL/SQL) hands-on experience in system software development, testing and maintenance
- Knowledge of current Python frameworks and technologies (e.g., Django, Flask, FastAPI)
- Experience with Python libraries and tools (e.g., Pandas, NumPy, SQLAlchemy)
- Strong experience designing, developing, and maintaining RESTful APIs
- Familiarity with API security, authentication, and authorization mechanisms (e.g., OAuth, JWT)
- Good hands-on knowledge of PL/SQL (packages, functions, ref cursors)
- Experience in the development and low-level design of warehouse solutions
- Familiarity with data warehouse, data mart and ODS concepts
- Knowledge of data normalisation and Oracle performance optimisation techniques
- Hands-on development experience with AWS (S3, Lambda, API Gateway, EC2, CloudFront, Route 53, DynamoDB, VPC, subnets)
- Hands-on experience with Kubernetes for container orchestration, including deploying, managing, and scaling applications on Kubernetes clusters
- Ability to provide technical design and architecture independently for business solutions
- Experience with cloud architecture and design principles, and microservices
- Good understanding of the infrastructure aspects of technical solutions, such as storage, platform and middleware
- Clear understanding of continuous integration, build, release and code quality
- Good understanding of the load balancing and disaster recovery aspects of solutions
- Good knowledge of security aspects such as authentication and authorization using open standards like OAuth
- Hands-on coding and debugging; able to write high-quality code optimized for performance and scale
- Good analytical and problem-solving skills; good with algorithms

Nice-to-have skills:
- Experience with infrastructure-as-code tools (e.g., Terraform, CloudFormation)
- Experience with the SnapLogic cloud-native integration platform, including designing and implementing integration pipelines using SnapLogic
- Experience in AI prompt engineering, generative AI, LLM models and agents
- Experience in CI/CD, TDD and DevOps, with CI/CD tools such as Jenkins, UrbanCode, SonarQube or Bamboo

Key responsibilities:
- Lead and guide a team of developers/senior developers
- Architect the technical design of the application, document it, and present it to senior stakeholders
- Interact with senior architects and other consultants to understand and review the technical solution and direction
- Communicate with business analysts to discuss various business requirements
- Proactively refactor code/solutions; be aggressive about tech debt identification and reduction
- Develop, maintain and troubleshoot issues, and take a leading role in the ongoing support and enhancement of the applications
- Help maintain the standards, procedures and best practices in the team, and help the team follow them
- Prioritise requirements in the pipeline with stakeholders

Experience and qualifications:
- B.E./B.Tech. or M.C.A. in Computer Science from a reputed university
- 8-10 years of total experience in application development with Python and API development, along with Oracle RDBMS, SQL and PL/SQL
- Must have led a team of developers

Feel rewarded
For starters, we will offer you a comprehensive benefits package. We will value your wellbeing and support your development. And we will be as flexible as we can about where and when you work, finding a balance that works for all of us. It is all part of our commitment to making you feel motivated by the work you do and happy to be part of our team. For more about our work, our approach to dynamic working and how you could build your future here, visit careers.fidelityinternational.com.

Our Values
Integrity: Doing the right thing, every time, and putting the client first.
Trust: Empowering each other to take the initiative and make good decisions.

Our Behaviours
Brave: Challenging the status quo, being accountable and speaking up.
Bold: Acting with conviction, encouraging diverse thinking, and keeping things simple.
Curious: Learning to do new things in better ways and encouraging fresh thinking.
Compassionate: Having empathy, caring for colleagues and clients.
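The API-security mechanisms named in this listing's essential skills (OAuth, JWT) rest on signed tokens. A minimal standard-library sketch of HMAC-signed, JWT-style token issuance and verification; the secret and claims are hypothetical, and real services should use a vetted JWT library rather than this:

```python
# Illustrative sketch of JWT-style token handling: an HMAC-SHA256
# signature over a base64url-encoded JSON payload, stdlib only.
# Secret and claims are hypothetical.
import base64
import hashlib
import hmac
import json

SECRET = b"demo-secret"  # hypothetical signing key

def sign(claims):
    """Encode claims and append an HMAC-SHA256 signature."""
    payload = base64.urlsafe_b64encode(
        json.dumps(claims, sort_keys=True).encode()
    ).decode()
    sig = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return payload + "." + sig

def verify(token):
    """Return the claims if the signature checks out, else None."""
    payload, sig = token.rsplit(".", 1)
    expected = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return None  # tampered or wrongly signed token
    return json.loads(base64.urlsafe_b64decode(payload.encode()))

token = sign({"sub": "user-42", "scope": "read"})
claims = verify(token)                                        # round-trips
tampered = verify(token[:-1] + ("0" if token[-1] != "0" else "1"))  # rejected
```

A real JWT adds a header segment declaring the algorithm and typically carries expiry (`exp`) and issuer (`iss`) claims, as specified in RFC 7519.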

Posted 1 month ago

Apply

5.0 - 10.0 years

20 - 30 Lacs

Delhi / NCR, Bengaluru

Work from Office

Description
Job Title: Senior Python Developer (AWS, SQL, Django/Flask)
Experience: 6-8 years
Location: Noida / Bangalore (Hybrid)
Notice Period: Immediate joiners preferred

Requirements (must-have skills):
- Strong programming experience in Python (6+ years)
- Hands-on expertise with the Django and/or Flask frameworks
- Proven experience with AWS services: Lambda, EC2, S3, RDS, etc.
- Strong understanding of SQL and relational database systems
- Experience with RESTful API development and integration
- Good understanding of software engineering best practices (CI/CD, version control, testing)

Key responsibilities:
- Develop, maintain, and optimize scalable backend services using Python with the Django and/or Flask frameworks
- Design and implement APIs and integration solutions with a strong focus on performance and reliability
- Work on cloud-based architecture and deployment using AWS (EC2, Lambda, S3, RDS, etc.)
- Develop robust data models and queries using SQL, and work with relational databases like PostgreSQL or MySQL
- Participate in code reviews, unit testing, and application monitoring to ensure quality deliverables
- Collaborate with cross-functional teams to understand requirements and deliver efficient solutions

What we offer:
- Exciting projects: We focus on industries like high-tech, communication, media, healthcare, retail and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them.
- Collaborative environment: You can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment, or even abroad in one of our global centers or client facilities.
- Work-life balance: GlobalLogic prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays.
- Professional development: Our dedicated Learning & Development team regularly organizes communication skills training (GL Vantage, Toastmasters), a stress management program, professional certifications, and technical and soft skill trainings.
- Excellent benefits: We provide our employees with competitive salaries, family medical insurance, group term life insurance, group personal accident insurance, NPS (National Pension Scheme), periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses.
- Fun perks: We want you to love where you work, which is why we host sports events and cultural activities, offer food at subsidised rates, and hold corporate parties. Our vibrant offices also include dedicated GL Zones, rooftop decks and a GL Club where you can have coffee or tea with your colleagues over a game, and we offer discounts at popular stores and restaurants.

Posted 1 month ago

Apply

5.0 - 9.0 years

7 - 11 Lacs

Noida

Work from Office

About the Role: Grade Level (for internal use): 10 The RoleSenior DevOps Engineer The Team The Technology team works in partnership with other functions in the business to deliver quality products by providing software engineering and services along with quality assurance, that continuously improve our customers ability to succeed. The DevOps/Platform Engineering function in the Technology team is independent in driving all technical decisions and is responsible for ensuring platform stability and efficiency while keeping our infrastructure costs under control. The technology team is located globally, but practices close collaboration with colleagues from multiple regions across the globe. The Impact We are seeking a highly skilled Senior DevOps/Platform Engineer to join our dynamic team. The ideal candidate will have extensive experience in managing and automating infrastructure, improving deployment processes, and fostering a culture of collaboration between development and operations teams. As a proactive team player, you focus on maintaining the cloud infrastructure free from vulnerabilities and implementing infrastructure-as-code and automation, while being responsible and autonomous in your tasks. You are open to collaboration, eager to listen, and ready to engage with both technical and non-technical stakeholders. In this role, you will work in an enthusiastic team dedicated to supporting our critical technology systems, guiding business partners and end users with industry best practices, solution design, and creating long-term value for our customers. Whats in it for you Do you love developing and maintaining enterprise-scale infrastructure that serve a large customer base with growing demand and usageBe the part of a successful team which works on delivering top priority projects which will directly contribute to the companys strategy. You will use a wide range of technologies and have the opportunity to interact with different teams internally. 
You will also get plenty of learning and skill-building opportunities with participation in innovation projects, training and knowledge sharing. You will have the opportunity to own and drive a projects end to end and collaborate with developers, business analysts and product managers who are experts in their domain which can help you to build multiple skillsets. Responsibilities Manage cloud infrastructure (AWS) and optimize resource utilization on cloud services such as EC2, S3, Lambda, ECS, SNS, SQS, RDS Design, implement, and manage CI/CD pipelines (GitLab, GitHub) to automate software builds, tests, and deployments Collaborate with development teams to ensure smooth integration of applications into the production environment Monitor system performance and troubleshoot issues to ensure high availability and reliability Develop and maintain infrastructure as code (IaC) using tools such as Cloudformation and Terraform Implement security best practices in the development and deployment processes Mentor junior team members and promote best practices in DevOps Work with a fantastic group of people in a supportive environment where training, learning and growth are embraced Enhance application security measures and implement best practices to safeguard sensitive data Take ownership of infrastructure projects, from concept to delivery, ensuring adherence to project timelines and objectives from platform engineering perspective Support and enhance existing platform, ensuring it meets ongoing business requirements What Were Looking For: 5 to 9 years of hands-on experience in DevOps, Site Reliability Engineering, or related roles Excellent hands-on proficiency on AWS cloud services (EC2, S3, Lambda, ECS, SNS, SQS, RDS) and Infrastructure templating using Terraform Experience with maintaining database servers SQL-Server & Postgres Experience working with Windows as well as Linux OS and in containerization strategies Experience with monitoring and logging tools (Grafana, 
PRTG, Cloudwatch) Hand-on familiarity with test-driven CI/CD frameworks Proficiency in scripting languages (Python, Bash, PowerShell etc.) Excellent problem-solving skills and the ability to work independently or as part of a team. Exceptional communication skills, with the ability to articulate technical concepts to non-technical stakeholders. Preferred Qualifications Bachelor's degree in Computer Science, Engineering, or related field. Soft Skills Strong problem solver abilities Effective communication skills Fluent in English (required) Proven leadership abilities A self-motivated and proactive contributor Ability to work with local and remote teams across multiple time zones About S&P Global Market Intelligence At S&P Global Market Intelligence, a division of S&P Global we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, andmake decisions with conviction.For more information, visit www.spglobal.com/marketintelligence . Whats In It For You Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technologythe right combination can unlock possibility and change the world.Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. 
We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global. Health & Wellness: Health care coverage designed for the mind and body. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference. For more information on benefits by country visit https://spgbenefits.com/benefit-summaries Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. Recruitment Fraud Alert If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, pre-employment training or for equipment/delivery of equipment. 
Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here. ---- Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf
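The IaC and security-best-practice responsibilities described in this DevOps role often include guardrails such as checking that provisioned resources carry mandatory tags. Below is a minimal sketch of such a check in Python; the required-tag policy, resource dicts, and field names are hypothetical examples, not any specific employer's tooling.

```python
# Minimal sketch of a resource tag-compliance audit, the kind of guardrail a
# DevOps/SRE team might run over an AWS resource inventory. Policy is invented.

REQUIRED_TAGS = {"Owner", "Environment", "CostCenter"}

def missing_tags(resource: dict) -> set:
    """Return the set of required tag keys absent from a resource's tags."""
    return REQUIRED_TAGS - set(resource.get("Tags", {}))

def audit(resources: list) -> dict:
    """Map each non-compliant resource id to its sorted missing tag keys."""
    report = {}
    for res in resources:
        gaps = missing_tags(res)
        if gaps:
            report[res["Id"]] = sorted(gaps)
    return report

if __name__ == "__main__":
    inventory = [
        {"Id": "i-0abc", "Tags": {"Owner": "sre", "Environment": "prod", "CostCenter": "42"}},
        {"Id": "bucket-logs", "Tags": {"Owner": "data"}},
    ]
    print(audit(inventory))  # only bucket-logs is flagged
```

In practice the inventory would come from an AWS API listing rather than a hard-coded list; the pure function keeps the policy itself easy to test.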

Posted 1 month ago

Apply

4.0 - 9.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Role Purpose The purpose of this role is to design, test and maintain software programs for operating systems or applications which need to be deployed at a client end, and ensure they meet 100% quality assurance parameters Must have technical skills: 4 years+ on Snowflake advanced SQL expertise 4 years+ of data warehouse experience hands-on knowledge of the methods to identify, collect, manipulate, transform, normalize, clean, and validate data, star schema, normalization / denormalization, dimensions, aggregations etc, 4+ years' experience working in reporting and analytics environments development, data profiling, metric development, CI/CD, production deployment, troubleshooting, query tuning etc, 3 years+ on Python advanced Python expertise 3 years+ on any cloud platform AWS preferred hands-on experience on AWS with Lambda, S3, SNS / SQS, EC2 is the bare minimum, 3 years+ on any ETL / ELT tool Informatica, Pentaho, Fivetran, DBT etc. 3+ years developing functional metrics in any specific business vertical (finance, retail, telecom etc), Must have soft skills: Clear communication written and verbal communication, especially around time off, delays in delivery etc. Team Player Works in the team and works with the team, Enterprise Experience Understands and follows enterprise guidelines for CI/CD, security, change management, RCA, on-call rotation etc, Nice to have: Technical certifications from AWS, Microsoft, Azure, GCP or any other recognized software vendor, 4 years+ on any ETL / ELT tool Informatica, Pentaho, Fivetran, DBT etc. 4 years+ developing functional metrics in any specific business vertical (finance, retail, telecom etc), 4 years+ with team lead experience, 3 years+ in a large-scale support organization supporting thousands of users
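The "manipulate, transform, normalize, clean, and validate data" work this posting describes can be sketched as a small Python step that normalizes raw records before a warehouse load. The field names and cleaning rules below are invented for illustration only.

```python
# Sketch of a pre-load cleaning step: trim strings, coerce types, and
# de-duplicate on a business key. Fields and rules are hypothetical.

def clean_records(rows):
    """Normalize raw rows and de-duplicate them on customer_id."""
    seen, cleaned = set(), []
    for row in rows:
        cid = str(row.get("customer_id", "")).strip()
        if not cid or cid in seen:
            continue  # drop rows with no key, and any later duplicates
        seen.add(cid)
        cleaned.append({
            "customer_id": cid,
            "country": str(row.get("country", "")).strip().upper() or None,
            "revenue": round(float(row.get("revenue", 0) or 0), 2),
        })
    return cleaned
```

A real pipeline would push this output to a staging table and let the warehouse handle star-schema dimension lookups; the point here is only the shape of the validate-and-normalize step.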

Posted 1 month ago

Apply

10.0 - 15.0 years

12 - 17 Lacs

Hyderabad

Work from Office

The Team: We are looking for a highly motivated, enthusiastic, and skilled engineering lead for Commodity Insights. We strive to deliver solutions that are sector-specific, data-rich, and hyper-targeted for evolving business needs. Our software development leaders are involved in the full product life cycle, from design through release. You would be joining a strong, innovative team working on the content management platforms which support a large revenue stream for S&P Commodity Insights. Working very closely with the Product Owner and Development Manager, teams are responsible for the development of user enhancements and maintaining good technical hygiene. The successful candidate will assist in the design, development, release and support of content platforms. Skills required include ReactJS, Spring Boot, RESTful microservices, AWS services (S3, ECS, Fargate, Lambda, etc.), CSS / HTML, AJAX / JSON, XML and SQL (PostgreSQL/Oracle). The candidate should be aware of GenAI or LLM models like OpenAI and Claude. The candidate should be enthusiastic about working on prompt building related to GenAI and business-related prompts, and should be able to develop and optimize prompts for AI models to improve accuracy and relevance. The candidate must be able to work well with a distributed team, demonstrate an ability to articulate technical solutions for business requirements, have experience with content management/packaging solutions, and embrace a collaborative approach for the implementation of solutions. Responsibilities: Lead and mentor a team through all phases of the software development lifecycle, adhering to agile methodologies (analyze, design, develop, test, debug, and deploy). Ensure high-quality deliverables and foster a collaborative environment. 
Be proficient with the use of developer tools supporting the CI/CD process, including configuring and executing automated pipelines to build and deploy software components Actively contribute to team planning and ceremonies and commit to team agreements and goals Ensure code quality and security by understanding vulnerability patterns, running code scans, and being able to remediate issues. Mentor the junior developers. Make sure that code review tasks on all user stories are added and completed in a timely manner. Perform reviews and integration testing to assure quality of project development efforts Design database schemas, conceptual data models, UI workflows and application architectures that fit into the enterprise architecture Support the user base, assisting with tracking down issues and analyzing feedback to identify product improvements Understand and commit to the culture of S&P Global: the vision, purpose and values of the organization Basic Qualifications: 10+ years' experience in an agile team development role, delivering software solutions using Scrum Java, J2EE, JavaScript, CSS/HTML, AJAX ReactJS, Spring Boot, Microservices, RESTful services, OAuth XML, JSON, data transformation SQL and NoSQL databases (Oracle, PostgreSQL) Working knowledge of Amazon Web Services (Lambda, Fargate, ECS, S3, etc.) Experience with GenAI or LLM models like OpenAI and Claude is preferred. Experience with agile workflow tools (e.g. VSTS, JIRA) Experience with source code management tools (e.g. git), build management tools (e.g. Maven) and continuous integration/delivery processes and tools (e.g. 
Jenkins, Ansible) Self-starter able to work to achieve objectives with minimum direction Comfortable working independently as well as in a team Excellent verbal and written communication skills Preferred Qualifications: Analysis of business information patterns, data analysis and data modeling Working with user experience designers to deliver end-user focused benefits realization Familiar with containerization (Docker, Kubernetes) Messaging/queuing solutions (Kafka, etc.) Familiar with application security development/operations best practices (including static/dynamic code analysis tools)
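The prompt-building work this posting mentions (developing and optimizing prompts for GenAI models) can be sketched as a deterministic template function. The template text, task wording, and field names below are hypothetical; real prompts would be tuned per model and use case.

```python
# Sketch of business-prompt assembly for an LLM: a fixed template rendered
# from structured inputs, so prompts stay reviewable and reproducible.
# Template and fields are invented for illustration.

PROMPT_TEMPLATE = (
    "You are an assistant for a commodities content platform.\n"
    "Task: {task}\n"
    "Context:\n{context}\n"
    "Answer concisely and cite the context lines you used."
)

def build_prompt(task: str, context_items: list) -> str:
    """Render a prompt; numbering the context lines supports citation."""
    context = "\n".join(f"{i + 1}. {item}" for i, item in enumerate(context_items))
    return PROMPT_TEMPLATE.format(task=task, context=context)
```

Keeping prompt assembly in plain code like this makes it easy to unit-test and to A/B different template wordings when optimizing for accuracy and relevance.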

Posted 1 month ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Bengaluru, Karnataka

Work from Office

Data Governance The team will be the central data governance team for Kotak bank, managing metadata platforms, data privacy, data security, data stewardship and the data quality platform. If you've got the right data skills and are ready for building data lake solutions from scratch for high-concurrency systems involving multiple systems, then this is the team for you. Your day-to-day role will include: Drive business decisions with technical input and lead the team. Design, implement, and support a data infrastructure from scratch. Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA. Extract, transform, and load data from various sources using SQL and AWS big data technologies. Explore and learn the latest AWS technologies to enhance capabilities and efficiency. Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis. Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers. Build data platforms, data pipelines, or data management and governance tools. BASIC QUALIFICATIONS for Data Engineer Bachelor's degree in Computer Science, Engineering, or a related field 3-5 years of experience in data engineering Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR Experience with data pipeline tools such as Airflow and Spark Experience with data modeling and data quality best practices Excellent problem-solving and analytical skills Strong communication and teamwork skills Experience in at least one modern scripting or programming language, such as Python, Java, or Scala Strong advanced SQL skills PREFERRED QUALIFICATIONS AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow Prior experience in the Indian banking segment and/or fintech is desired. 
Experience with non-relational databases and data stores Building and operating highly available, distributed data processing systems for large datasets Professional software engineering and best practices for the full software development life cycle Designing, developing, and implementing different types of data warehousing layers Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions Building scalable data infrastructure and understanding distributed systems concepts SQL, ETL, and data modelling Ensuring the accuracy and availability of data to customers Proficient in at least one scripting or programming language for handling large-volume data processing Strong presentation and communication skills. For Managers: customer centricity and obsession for the customer Ability to manage stakeholders (product owners, business stakeholders, cross-functional teams) and to coach agile ways of working. Ability to structure, organize teams, and streamline communication. Prior work experience executing large-scale Data Engineering projects
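The data-quality platform responsibilities in this governance role typically boil down to column-level metrics such as null rate and uniqueness. A minimal Python sketch, with illustrative thresholds and column names only:

```python
# Sketch of two simple column-level data-quality checks a governance team
# might compute per dataset: null rate and key uniqueness. Columns invented.

def null_rate(rows, column):
    """Fraction of rows where the column is missing or None."""
    if not rows:
        return 0.0
    nulls = sum(1 for r in rows if r.get(column) is None)
    return nulls / len(rows)

def is_unique(rows, column):
    """True when every non-null value in the column occurs exactly once."""
    values = [r.get(column) for r in rows if r.get(column) is not None]
    return len(values) == len(set(values))
```

In a production platform these checks would run inside the pipeline (e.g. as Airflow tasks) and feed dashboards or alerts; the pure functions keep the metric definitions auditable.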

Posted 1 month ago

Apply

10.0 - 15.0 years

12 - 17 Lacs

Bengaluru

Work from Office

Experience: Minimum of 10+ years in database development and management roles. SQL Mastery: Advanced expertise in crafting and optimizing complex SQL queries and scripts. AWS Redshift: Proven experience in managing, tuning, and optimizing large-scale Redshift clusters. PostgreSQL: Deep understanding of PostgreSQL, including query planning, indexing strategies, and advanced tuning techniques. Data Pipelines: Extensive experience in ETL development and integrating data from multiple sources into cloud environments. Cloud Proficiency: Strong experience with AWS services like ECS, S3, KMS, Lambda, Glue, and IAM. Data Modeling: Comprehensive knowledge of data modeling techniques for both OLAP and OLTP systems. Scripting: Proficiency in Python, C#, or other scripting languages for automation and data manipulation. Preferred Qualifications Leadership: Prior experience in leading database or data engineering teams. Data Visualization: Familiarity with reporting and visualization tools like Tableau, Power BI, or Looker. DevOps: Knowledge of CI/CD pipelines, infrastructure as code (e.g., Terraform), and version control (Git). Certifications: Any relevant certifications (e.g., AWS Certified Solutions Architect, AWS Certified Database - Specialty, PostgreSQL Certified Professional) will be a plus. Azure Databricks: Familiarity with Azure Databricks for data engineering and analytics workflows will be a significant advantage. Soft Skills Strong problem-solving and analytical capabilities. Exceptional communication skills for collaboration with technical and non-technical stakeholders. A results-driven mindset with the ability to work independently or lead within a team. Qualification: Bachelor's or master's degree in Computer Science, Information Systems, Engineering or equivalent. 10+ years of experience

Posted 1 month ago

Apply

2.0 - 3.0 years

5 - 9 Lacs

Kochi

Work from Office

Job Title - Data Engineer Sr. Analyst ACS Song Management Level: Level 10 - Sr. Analyst Location: Kochi, Coimbatore, Trivandrum Must have skills: Python/Scala, PySpark/PyTorch Good to have skills: Redshift Job Summary You'll capture user requirements and translate them into business and digitally enabled solutions across a range of industries. Your responsibilities will include: Roles and Responsibilities Designing, developing, optimizing, and maintaining data pipelines that adhere to ETL principles and business goals Solving complex data problems to deliver insights that help our business achieve its goals. Source data (structured and unstructured) from various touchpoints, format and organize them into an analyzable format. Creating data products for analytics team members to improve productivity Calling AI services like vision, translation etc. to generate an outcome that can be used in further steps along the pipeline. Fostering a culture of sharing, re-use, design and operational efficiency of data and analytical solutions Preparing data to create a unified database and build tracking solutions ensuring data quality Create production-grade analytical assets deployed using the guiding principles of CI/CD. Professional and Technical Skills Expert in Python, Scala, PySpark, PyTorch, JavaScript (at least any 2) Extensive experience in data analysis (big data - Apache Spark environments), data libraries (e.g. Pandas, SciPy, TensorFlow, Keras etc.), and SQL. 2-3 years of hands-on experience working on these technologies. Experience in one of the many BI tools such as Tableau, Power BI, Looker. Good working knowledge of key concepts in data analytics, such as dimensional modeling, ETL, reporting/dashboarding, data governance, dealing with structured and unstructured data, and corresponding infrastructure needs. 
Worked extensively in Microsoft Azure (ADF, Function Apps, ADLS, Azure SQL), AWS (Lambda, Glue, S3), Databricks analytical platforms/tools, Snowflake Cloud Datawarehouse. Additional Information Experience working in cloud data warehouses like Redshift or Synapse Certification in any one of the following or equivalent: AWS - AWS Certified Data Analytics - Specialty Azure - Microsoft Certified Azure Data Scientist Associate Snowflake - SnowPro Core Data Engineer Databricks Data Engineering About Our Company | Accenture Qualification Experience: 3.5 - 5 years of experience is required Educational Qualification: Graduation

Posted 1 month ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role : Application Designer Project Role Description : Assist in defining requirements and designing applications to meet business process and application requirements. Must have skills : Node.js Good to have skills : Amazon Web Services (AWS), React.js Minimum 3 year(s) of experience is required Educational Qualification : 15 years full time education Summary: As an Application Designer, you will assist in defining requirements and designing applications to meet business process and application requirements. Your typical day will involve collaborating with cross-functional teams to gather insights, analyzing user needs, and translating them into functional specifications. You will engage in brainstorming sessions to explore innovative solutions, ensuring that the applications align with both technical and business objectives. Additionally, you will participate in code reviews and contribute to the continuous improvement of application design processes, fostering a culture of collaboration and excellence within the team. Roles & Responsibilities:- Expected to perform independently and become an SME.- Required active participation/contribution in team discussions.- Contribute to providing solutions for work-related problems.- Engage in the development and implementation of application design strategies.- Collaborate with stakeholders to ensure that application designs meet user requirements and business goals. 
Professional & Technical Skills: - Must To Have Skills: Development experience in Node.js and React JS. - Must To Have Skills: Development experience with AWS. - Design and build robust, maintainable codebases using Node.js and React JS - Develop RESTful APIs and backend services using AWS Lambda, API Gateway, and DynamoDB. - Implement serverless and containerized applications using AWS services such as Lambda, Fargate, ECS, and EKS. - Automate infrastructure using CloudFormation, Terraform, or AWS CDK. - Integrate third-party APIs, SDKs, and AWS services for business logic automation. - Participate in code reviews, unit/integration testing, and debugging sessions. - Monitor application health using CloudWatch and X-Ray, and implement alerts. - Write well-documented, reusable, and testable code following best coding practices. - Collaborate with other developers and architects to deliver end-to-end solutions. - Strong understanding of application architecture and design principles. - Experience with RESTful API and SOAP API development and integration. - Familiarity with version control systems such as Git. Preferred Qualifications: AWS Developer Associate or AWS Solutions Architect Associate certification Experience with microservices and event-driven architectures Understanding of security practices: IAM roles/policies, encryption, least privilege Familiarity with agile development and code collaboration tools (JIRA, Confluence) Additional Information: - The candidate should have minimum 5 years of development experience in Node.js, React JS and AWS. - This position is based at our Bengaluru office. - A 15 years full time education is required. Qualification 15 years full time education
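The Lambda + API Gateway pattern this role centers on is, at its core, a handler function that maps a proxy-style request event to a JSON response. The posting targets Node.js, but the handler shape is the same idea in any Lambda runtime; the sketch below uses Python, and the route and payload are hypothetical (the DynamoDB read is stubbed out).

```python
# Sketch of an API Gateway proxy-integration Lambda handler for a
# hypothetical GET /items/{id} route. Data access is stubbed for the example.
import json

def handler(event, context=None):
    """Return an API Gateway proxy-style response for an item lookup."""
    item_id = (event.get("pathParameters") or {}).get("id")
    if not item_id:
        return {"statusCode": 400, "body": json.dumps({"error": "missing id"})}
    # A real service would fetch the item from DynamoDB here.
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"id": item_id, "status": "ok"}),
    }
```

Because the handler is a plain function taking an event dict, it can be unit-tested locally with fake events before any deployment.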

Posted 1 month ago

Apply

3.0 - 8.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role : Application Lead Project Role Description : Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills : Amazon Web Services (AWS) Good to have skills : NA Minimum 3 year(s) of experience is required Educational Qualification : 15 years full time education Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various stakeholders to gather requirements, overseeing the development process, and ensuring that the applications meet the specified needs. You will also engage in problem-solving discussions with your team, providing guidance and support to ensure successful project outcomes. Additionally, you will monitor project progress, address any challenges that arise, and facilitate communication among team members to foster a productive work environment. Roles & Responsibilities:- Expected to perform independently and become an SME.- Required active participation/contribution in team discussions.- Contribute to providing solutions for work-related problems.- Facilitate knowledge sharing sessions to enhance team capabilities.- Mentor junior team members to support their professional growth. 
Professional & Technical Skills: Primary: AWS + Python Secondary: DevOps, Terraform Good To Have: AWS CDK 3-4 years of overall software development experience with strong hands-on in AWS and Python. Hands-on experience with AWS services EC2, Lambda, SNS, SQS, Glue, Step Functions, CloudWatch, API Gateway, EMR, S3, DynamoDB, RDS, Athena. Hands-on experience in writing Python code for AWS services like Glue jobs, Lambda and AWS CDK. Strong technical and debugging hands-on skills. Strong DevOps experience in Terraform, Git and CI/CD. Experience working in Agile development environments. Strong verbal and written communication skills, with the ability to engage directly with clients. Additional Information: - The candidate should have minimum 5 years of experience in Amazon Web Services (AWS). - This position is based at our Bengaluru office. - A 15-year full time education is required. - Shift Timing: 12:30 PM to 9:30 PM IST [Weekdays] Additional Information: - The candidate should have minimum 3 years of experience in Amazon Web Services (AWS). - This position is based at our Bengaluru office. - A 15 years full time education is required. Qualification 15 years full time education
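Writing Python for Lambda alongside SQS, as this role requires, usually means an event-driven handler that consumes message batches. Below is a sketch of one; the message schema (an `order_id` field) is invented, and the per-record failure reporting follows the shape AWS uses for partial-batch responses (`batchItemFailures`).

```python
# Sketch of a Python Lambda consuming an SQS batch event. Bad records are
# reported individually so only they are retried. Message schema is invented.
import json

def handler(event, context=None):
    """Process SQS records; report failures per record for partial retry."""
    failures = []
    processed = 0
    for record in event.get("Records", []):
        try:
            body = json.loads(record["body"])
            if "order_id" not in body:  # hypothetical required field
                raise ValueError("missing order_id")
            processed += 1
        except Exception:
            failures.append({"itemIdentifier": record.get("messageId")})
    return {"processed": processed, "batchItemFailures": failures}
```

The `processed` count is just for the sketch; in a real deployment the function's return value matters mainly for the `batchItemFailures` list, which requires the partial-batch-response setting on the event source mapping.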

Posted 1 month ago

Apply

2.0 - 7.0 years

13 - 18 Lacs

Pune

Work from Office

The Customer, Sales & Service Practice | Cloud Job Title - Amazon Connect + Level 11 (Analyst) + Entity (S&C GN) Management Level: Level 11 - Analyst Location: Delhi, Mumbai, Bangalore, Gurgaon, Pune, Hyderabad and Chennai Must have skills: AWS contact center, Amazon Connect flows, AWS Lambda and Lex bots, Amazon Connect Contact Center Good to have skills: AWS Lambda and Lex bots, Amazon Connect Experience: Minimum 2 year(s) of experience is required Educational Qualification: Engineering Degree or MBA from a Tier 1 or Tier 2 institute Join our team of Customer Sales & Service consultants who solve customer-facing challenges at clients spanning sales, service and marketing to accelerate business change. Practice: Customer Sales & Service Sales I Areas of Work: Cloud AWS Cloud Contact Center Transformation, Analysis and Implementation | Level: Analyst | Location: Delhi, Mumbai, Bangalore, Gurgaon, Pune, Hyderabad and Chennai | Years of Exp: 2-5 years Explore an Exciting Career at Accenture Are you passionate about scaling businesses using in-depth frameworks and techniques to solve customer-facing challenges? Do you want to design, build and implement strategies to enhance business performance? Does working in an inclusive and collaborative environment spark your interest? Then this is the right place for you! Welcome to a host of exciting global opportunities within Accenture Strategy & Consulting's Customer, Sales & Service practice. The Practice A Brief Sketch The Customer Sales & Service Consulting practice is aligned to the Capability Network Practice of Accenture and works with clients across their marketing, sales and services functions. As part of the team, you will work on transformation services driven by key offerings like Living Marketing, Connected Commerce and Next-Generation Customer Care. 
These services help our clients become living businesses by optimizing their marketing, sales and customer service strategy, thereby driving cost reduction, revenue enhancement, customer satisfaction and impacting front-end business metrics in a positive manner. You will work closely with our clients as consulting professionals who design, build and implement initiatives that can help enhance business performance. As part of these, you will drive the following: Work on creating business cases for journey to cloud, cloud strategy, cloud contact center vendor assessment activities Work on creating a cloud transformation approach for contact center transformations Work along with Solution Architects on architecting cloud contact center technology with the AWS platform Work on enabling cloud contact center technology platforms for global clients, specifically on Amazon Connect Work on innovative assets, proofs of concept, sales demos for AWS cloud contact center Support AWS offering leads in responding to RFIs and RFPs Bring your best skills forward to excel at the role: Good understanding of the contact center technology landscape. An understanding of AWS Cloud platform and services with solution architect skills. Deep expertise in AWS contact center relevant services. Sound experience in developing Amazon Connect flows, AWS Lambda and Lex bots Deep functional and technical understanding of APIs and related integration experience Functional and technical understanding of building API-based integrations with Salesforce, ServiceNow and bot platforms Ability to understand customer challenges and requirements, and to address these challenges/requirements in a differentiated manner. Ability to help the team implement the solution, and sell and deliver cloud contact center solutions to clients. 
Excellent communication skills Ability to develop requirements based on leadership input Ability to work effectively in a remote, virtual, global environment Ability to take on new challenges and to be a passionate learner Read about us. Blogs What's in it for you? An opportunity to work on transformative projects with key G2000 clients Potential to co-create with leaders in strategy, industry experts, enterprise function practitioners and business intelligence professionals to shape and recommend innovative solutions that leverage emerging technologies. Ability to embed responsible business into everything: from how you service your clients to how you operate as a responsible professional. Personalized training modules to develop your strategy & consulting acumen to grow your skills, industry knowledge and capabilities Opportunity to thrive in a culture that is committed to accelerating equality for all. Engage in boundaryless collaboration across the entire organization. About Accenture: Accenture is a leading global professional services company, providing a broad range of services and solutions in strategy, consulting, digital, technology and operations. Combining unmatched experience and specialized skills across more than 40 industries and all business functions, underpinned by the world's largest delivery network, Accenture works at the intersection of business and technology to help clients improve their performance and create sustainable value for their stakeholders. With 569,000 people serving clients in more than 120 countries, Accenture drives innovation to improve the way the world works and lives. Visit us at www.accenture.com About Accenture Strategy & Consulting: Accenture Strategy shapes our clients' future, combining deep business insight with the understanding of how technology will impact industry and business models. 
Our focus on issues such as digital disruption, redefining competitiveness, operating and business models as well as the workforce of the future helps our clients find future value and growth in a digital world. Today, digital is changing the way organizations engage with their employees, business partners, customers and communities. This is our unique differentiator. Capability Network is a distributed management consulting organization that provides management consulting and strategy expertise across the client lifecycle. Our Capability Network teams complement our in-country teams to deliver cutting-edge expertise and measurable value to clients all around the world. For more information visit https://www.accenture.com/us-en/Careers/capability-network Accenture Capability Network | Accenture in One Word come and be a part of our team. Qualification Your experience counts! Bachelor's degree in a related field or equivalent experience; Post-Graduation in business management would be added value. Minimum 2 years of experience in delivering software as a service or platform as a service projects related to cloud CC service providers such as the Amazon Connect Contact Center cloud solution Hands-on experience working on the design, development and deployment of contact center solutions at scale. Hands-on development experience with cognitive services such as Amazon Connect, Amazon Lex, Lambda, Kinesis, Athena, Pinpoint, Comprehend, Transcribe Working knowledge of one of the programming/scripting languages such as Node.js, Python, Java
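The "Amazon Connect flows, AWS Lambda and Lex bots" combination this posting describes usually involves a fulfillment Lambda that a Lex bot invokes from within a Connect contact flow. A minimal Python sketch follows; the response dict mirrors the Lex V2 "Close" dialog-action shape as commonly documented, and the event shape assumed in `handler` is a simplification, so verify both against the current AWS documentation.

```python
# Sketch of a Lex bot fulfillment Lambda, as used behind Amazon Connect
# flows. The Lex V2 response shape here should be checked against AWS docs.

def close_intent(intent_name: str, message: str) -> dict:
    """Build a Lex V2 'Close' dialog action fulfilling the given intent."""
    return {
        "sessionState": {
            "dialogAction": {"type": "Close"},
            "intent": {"name": intent_name, "state": "Fulfilled"},
        },
        "messages": [{"contentType": "PlainText", "content": message}],
    }

def handler(event, context=None):
    # Simplified event shape: Lex V2 passes the intent under sessionState.
    name = event["sessionState"]["intent"]["name"]
    return close_intent(name, f"Your {name} request is confirmed.")
```

Separating the response-building helper from the handler keeps the Lex wire format testable independently of any bot configuration.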

Posted 1 month ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role : Application Lead Project Role Description : Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills : ASP.NET MVC Good to have skills : Amazon Web Services (AWS) Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team in implementing effective solutions. You will also engage in strategic planning sessions to align project goals with organizational objectives, ensuring that all stakeholders are informed and involved in the development process. Your role will be pivotal in driving innovation and efficiency within the application development lifecycle, fostering a collaborative environment that encourages creativity and problem-solving. Roles & Responsibilities:- Expected to be an SME.- Collaborate and manage the team to perform.- Responsible for team decisions.- Engage with multiple teams and contribute on key decisions.- Provide solutions to problems for their immediate team and across multiple teams.- Facilitate knowledge sharing sessions to enhance team capabilities.- Monitor project progress and implement necessary adjustments to meet deadlines. 
Professional & Technical Skills: - Must To Have Skills: Proficiency in ASP.NET MVC. - Must To Have Skills: Experience with Amazon Web Services (AWS). - Strong understanding of web application architecture and design patterns. - Experience with front-end technologies such as HTML, CSS, and JavaScript. - Familiarity with database management systems and SQL. - Ability to troubleshoot and optimize application performance. Must - AWS: Lambda, DynamoDB, CloudWatch, ... - GitHub Must/Nice - NodeJS Nice - React - Azure DevOps - NewRelic knowledge - Fintech knowledge Additional Information: - The candidate should have minimum 5 years of experience in ASP.NET MVC. - This position is based at our Bengaluru office. - A 15 years full time education is required. Qualification 15 years full time education

Posted 1 month ago

Apply

5.0 - 10.0 years

14 - 17 Lacs

Pune

Work from Office

As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows for source to target and implementing solutions that tackle the client's needs. Your primary responsibilities include: Design, build, optimize and support new and existing data models and ETL processes based on our clients' business requirements. Build, deploy and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization. Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Must have 5+ years exp in Big Data - Hadoop, Spark - Scala, Python, HBase, Hive Good to have AWS - S3, Athena, DynamoDB, Lambda, Jenkins, Git Developed Python and PySpark programs for data analysis. Good working experience with Python to develop a custom framework for generating rules (much like a rules engine). Developed Python code to gather data from HBase and designed the solution to implement it using PySpark. Apache Spark DataFrames/RDDs were used to apply business transformations, and Hive Context objects were utilized to perform read/write operations. Preferred technical and professional experience Understanding of DevOps. Experience in building scalable end-to-end data ingestion and processing solutions Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala
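The "custom framework for generating rules (much like a rules engine)" this posting mentions can be sketched in Python as an ordered list of predicate/action pairs applied to each record. The rule names, fields, and thresholds below are invented for illustration; a Spark version would apply the same logic inside a map over a DataFrame/RDD.

```python
# Sketch of a tiny rules engine: each rule is (name, predicate, action),
# applied in order to a record dict. Rules and fields are hypothetical.

RULES = [
    ("flag_high_value", lambda r: r.get("amount", 0) > 10_000,
     lambda r: {**r, "flag": "high_value"}),
    ("default_currency", lambda r: "currency" not in r,
     lambda r: {**r, "currency": "INR"}),
]

def apply_rules(record: dict) -> dict:
    """Run every matching rule's action over the record, in order."""
    for _name, predicate, action in RULES:
        if predicate(record):
            record = action(record)
    return record
```

Declaring rules as data rather than hard-coded branches is what makes such a framework extensible: new business rules become new list entries, not new code paths.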

Posted 1 month ago

Apply

10.0 - 15.0 years

13 - 18 Lacs

Bengaluru

Work from Office

In your role as a technical architect, you will be primarily responsible for designing customer solutions that will be deployed on IBM Cloud. You will work closely with project managers, architects, product managers, and customers to architect the solution. Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise 10+ years of experience in building, deploying, and managing large-scale services/platforms for cloud platforms like AWS, Azure, IBM Cloud, or Google Cloud. Python, Golang, Terraform, and Schematics experience. Hands-on experience in the design, build, and management of storage, servers, network, backup, monitoring, etc. Knowledgeable in software engineering, including API and microservice development. Performing a technical cloud architect role. Management of large hybrid cloud end-to-end deals. Hands-on expertise in IaaS and SaaS services. Ability to solution across all hyperscalers (AWS/Azure/Google). Hands-on experience with integration of infrastructure tools. Experience in automation and infrastructure cost optimization. Works closely with project management to ensure alignment of plans with what is being delivered. Challenges conventional thinking and traditional ways of operating, and invites stakeholders to identify issues and opportunities. Helps others overcome resistance to change. Preferred technical and professional experience Experience with Packer, Kubernetes, and/or Tekton. Experience with OO programming languages (Golang, Python, C#). Experience with serverless computing (FaaS, Lambda). Experience with Ansible, Puppet, Chef, or another configuration management tool. Knowledge of VPC and networking fundamentals/components. MS Windows and/or SQL administration experience. Designed & delivered large complex projects on cloud infrastructure. Client management, having acted as an advisor for the client.

Posted 1 month ago

Apply

2.0 - 5.0 years

30 - 32 Lacs

Bengaluru

Work from Office

Data Engineer - 2 (Experience: 2-5 years) What we offer Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to come join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many. About our team DEX is the central data org for Kotak Bank, which manages the entire data experience of Kotak Bank. DEX stands for Kotak's Data Exchange. This org comprises the Data Platform, Data Engineering, and Data Governance charters. The org sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which provides great opportunities for technology fellows to build things from scratch and build one of the best-in-class data lakehouse solutions. The primary skills this team should encompass are software development skills, preferably Python, for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org size is expected to be around a 100+ member team, primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters.
As a member of this team, you get the opportunity to learn the fintech space, which is the most sought-after domain in the current world; be an early member in the digital transformation journey of Kotak; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions, in a programmatic way; and also be futuristic enough to build systems which can be operated by machines using AI technologies. The data platform org is divided into 3 key verticals: Data Platform This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and building a centralized data lake, a managed compute and orchestration framework including concepts of serverless data solutions, managing a central data warehouse for extremely high concurrency use cases, building connectors for different sources, building a customer feature repository, building cost optimization solutions like EMR optimizers, performing automations, and building observability capabilities for Kotak's data platform. The team will also be the center for Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak. Data Engineering This team will own data pipelines for thousands of datasets, be skilled to source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based and programmatic way, and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank which cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, Branch Managers, and all analytics use cases. Data Governance The team will be the central data governance team for Kotak Bank, managing metadata platforms, Data Privacy, Data Security, Data Stewardship, and the Data Quality platform.
If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple systems, then this is the team for you. Your day-to-day role will include: Drive business decisions with technical input and lead the team. Design, implement, and support a data infrastructure from scratch. Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA. Extract, transform, and load data from various sources using SQL and AWS big data technologies. Explore and learn the latest AWS technologies to enhance capabilities and efficiency. Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis. Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers. Build data platforms, data pipelines, or data management and governance tools. BASIC QUALIFICATIONS for Data Engineer/SDE in Data Bachelor's degree in Computer Science, Engineering, or a related field Experience in data engineering Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR Experience with data pipeline tools such as Airflow and Spark Experience with data modeling and data quality best practices Excellent problem-solving and analytical skills Strong communication and teamwork skills Experience in at least one modern scripting or programming language, such as Python, Java, or Scala Strong advanced SQL skills PREFERRED QUALIFICATIONS AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow Prior experience in the Indian Banking segment and/or Fintech is desired.
Experience with non-relational databases and data stores Building and operating highly available, distributed data processing systems for large datasets Professional software engineering and best practices for the full software development life cycle Designing, developing, and implementing different types of data warehousing layers Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions Building scalable data infrastructure and understanding distributed systems concepts SQL, ETL, and data modelling Ensuring the accuracy and availability of data to customers Proficiency in at least one scripting or programming language for handling large-volume data processing Strong presentation and communication skills.
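The SQL, ETL, and data modelling qualifications above boil down to extract-transform-load steps like the following minimal sketch. It uses Python's built-in sqlite3 as a stand-in for a warehouse such as Redshift; the table and column names are hypothetical.

```python
import sqlite3

# Minimal ETL sketch: extract raw rows, transform in Python, load into a
# reporting table. sqlite3 stands in for a warehouse like Redshift here.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_txns (account TEXT, amount_paise INTEGER)")
conn.executemany("INSERT INTO raw_txns VALUES (?, ?)",
                 [("A001", 150000), ("A001", 50000), ("A002", 99900)])

# Extract
rows = conn.execute("SELECT account, amount_paise FROM raw_txns").fetchall()

# Transform: convert paise to rupees and aggregate per account
totals = {}
for account, paise in rows:
    totals[account] = totals.get(account, 0) + paise / 100

# Load
conn.execute("CREATE TABLE account_totals (account TEXT, total_rupees REAL)")
conn.executemany("INSERT INTO account_totals VALUES (?, ?)", totals.items())

print(conn.execute(
    "SELECT account, total_rupees FROM account_totals ORDER BY account"
).fetchall())   # [('A001', 2000.0), ('A002', 999.0)]
```

In a production pipeline the same extract/transform/load shape would be expressed in Spark or Glue jobs orchestrated by Airflow, with the transform pushed down to SQL where possible.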

Posted 1 month ago

Apply

5.0 - 9.0 years

30 - 32 Lacs

Bengaluru

Work from Office

Data Engineering Manager: 5-9 years; Software Development Manager: 9+ years. Kotak Mahindra Bank, Bengaluru, Karnataka, India (On-site) What we offer Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to come join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many. About our team DEX is the central data org for Kotak Bank, which manages the entire data experience of Kotak Bank. DEX stands for Kotak's Data Exchange. This org comprises the Data Platform, Data Engineering, and Data Governance charters. The org sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which provides great opportunities for technology fellows to build things from scratch and build one of the best-in-class data lakehouse solutions. The primary skills this team should encompass are software development skills, preferably Python, for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org size is expected to be around a 100+ member team, primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters.
As a member of this team, you get the opportunity to learn the fintech space, which is the most sought-after domain in the current world; be an early member in the digital transformation journey of Kotak; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions, in a programmatic way; and also be futuristic enough to build systems which can be operated by machines using AI technologies. The data platform org is divided into 3 key verticals: Data Platform This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and building a centralized data lake, a managed compute and orchestration framework including concepts of serverless data solutions, managing a central data warehouse for extremely high concurrency use cases, building connectors for different sources, building a customer feature repository, building cost optimization solutions like EMR optimizers, performing automations, and building observability capabilities for Kotak's data platform. The team will also be the center for Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak. Data Engineering This team will own data pipelines for thousands of datasets, be skilled to source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based and programmatic way, and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank which cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, Branch Managers, and all analytics use cases. Data Governance The team will be the central data governance team for Kotak Bank, managing metadata platforms, Data Privacy, Data Security, Data Stewardship, and the Data Quality platform.
If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple systems, then this is the team for you. Your day-to-day role will include: Drive business decisions with technical input and lead the team. Design, implement, and support a data infrastructure from scratch. Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA. Extract, transform, and load data from various sources using SQL and AWS big data technologies. Explore and learn the latest AWS technologies to enhance capabilities and efficiency. Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis. Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers. Build data platforms, data pipelines, or data management and governance tools. BASIC QUALIFICATIONS for Data Engineering Manager / Software Development Manager 10+ years of engineering experience, most of which is in the data domain 5+ years of engineering team management experience 10+ years of experience planning, designing, developing, and delivering consumer software Experience partnering with product and program management teams 5+ years of experience in managing data engineers, business intelligence engineers, and/or data scientists Experience designing or architecting (design patterns, reliability, and scaling) new and existing systems Experience managing multiple concurrent programs, projects, and development teams in an Agile environment Strong understanding of Data Platform, Data Engineering, and Data Governance Experience designing and developing large-scale, high-traffic applications PREFERRED QUALIFICATIONS AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow Prior experience in the Indian Banking segment and/or Fintech is desired.
Experience with non-relational databases and data stores Building and operating highly available, distributed data processing systems for large datasets Professional software engineering and best practices for the full software development life cycle Designing, developing, and implementing different types of data warehousing layers Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions Building scalable data infrastructure and understanding distributed systems concepts SQL, ETL, and data modelling Ensuring the accuracy and availability of data to customers Proficiency in at least one scripting or programming language for handling large-volume data processing Strong presentation and communication skills. For Managers: customer centricity and obsession for the customer; ability to manage stakeholders (product owners, business stakeholders, cross-functional teams) and coach agile ways of working; ability to structure and organize teams and streamline communication; prior work experience executing large-scale Data Engineering projects.

Posted 1 month ago

Apply