7.0 - 10.0 years
45 - 50 Lacs
Pune
Work from Office
Requirements: Our client is seeking a highly skilled Technical Project Manager (TPM) with strong hands-on experience in full-stack development and cloud infrastructure to lead the planning, execution, and delivery of technical projects. The ideal candidate will have a strong background in React, Java, Spring Boot, Python, and AWS, and will work closely with cross-functional teams including developers, QA, DevOps, and product stakeholders. As a TPM, you will play a critical role in bridging technical and business objectives, ensuring timelines, quality, and scalability across complex software projects.

Responsibilities:
- Own and drive the end-to-end lifecycle of technical projects, from initiation to deployment and post-launch support.
- Collaborate with development teams and stakeholders to define project scope, goals, deliverables, and timelines.
- Act as a hands-on contributor when needed, with the ability to guide and review code and architecture decisions.
- Coordinate cross-functional teams across front-end (React), back-end (Java/Spring Boot, Python), and AWS cloud infrastructure.
- Manage risk, change, and issue resolution in a fast-paced agile environment.
- Ensure projects follow best practices for version control, CI/CD, testing, deployment, and monitoring.
- Deliver detailed status updates, sprint reports, and retrospectives to leadership and stakeholders.

Required Qualifications:
- IIT/NIT graduate with 5+ years of experience in software engineering, including at least 2 years in a technical project management role.
- Hands-on expertise in: React; Java & Spring Boot; Python; AWS (EC2, S3, Lambda, CloudWatch, etc.).
- Experience leading agile/Scrum teams with a strong understanding of software development lifecycles.
- Excellent communication, organizational, and interpersonal skills.

Desired Profile:
- Experience designing and managing microservices architectures.
- Familiarity with Kafka or other messaging systems.
- Knowledge of CI/CD pipelines, deployment strategies, and application monitoring tools (e.g., Prometheus, Grafana, CloudWatch).
- Experience with containerization tools like Docker and orchestration platforms like Kubernetes.
Posted 3 weeks ago
2.0 - 5.0 years
2 - 7 Lacs
Bengaluru
Work from Office
- Deploy applications on AWS using services such as EC2, ECS, S3, RDS, or Lambda
- Implement CI/CD pipelines using GitHub Actions, Jenkins, or CodePipeline
- Apply DevSecOps best practices including container security (Docker, ECR), infrastructure as code (Terraform), and runtime monitoring

Team Collaboration & Agility
- Participate in Agile ceremonies (stand-ups, sprint planning, retros)
- Work closely with product, design, and AI engineers to build secure and intelligent systems
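The CI/CD bullet above can be made concrete: a pipeline stage often validates a build artifact before deploying it. A minimal sketch in Python (the checksum-before-deploy step and the function names are illustrative assumptions, not something this posting specifies):

```python
import hashlib

def sha256_of(path, chunk_size=1 << 16):
    """Hash a file in chunks so large build artifacts never load fully into memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_artifact(path, expected_hex):
    """Return True when the artifact on disk matches its published checksum."""
    return sha256_of(path) == expected_hex
```

A CI job would call `verify_artifact` against the checksum published by the build stage and abort the deploy on mismatch.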
Posted 3 weeks ago
8.0 - 12.0 years
15 - 30 Lacs
Hyderabad
Work from Office
Duration: Full Time

Job Description: We are seeking a highly experienced, hands-on PHP Developer with leadership or managerial experience to join our growing team. The ideal candidate will be proficient in Laravel, CodeIgniter, React.js, Ajax, AWS, and SQL, with a proven track record of leading development teams and delivering robust, scalable web applications.

Key Responsibilities:
- Lead and manage a team of developers, ensuring timely, quality delivery of projects.
- Architect, design, and develop high-performance web applications using PHP frameworks (Laravel & CodeIgniter).
- Integrate and manage front-end components using React.js.
- Work with Ajax for seamless asynchronous user experiences.
- Design and maintain SQL databases for high availability and performance.
- Deploy, manage, and troubleshoot applications hosted on AWS.
- Ensure coding standards, best practices, and secure programming techniques.
- Collaborate with cross-functional teams, including product managers and designers.
- Perform code reviews, mentorship, and performance evaluations.

Required Skills & Experience:
- 8+ years of experience in PHP development.
- Strong hands-on experience with the Laravel and CodeIgniter frameworks.
- Proficiency with React.js for front-end integration.
- Experience with Ajax for dynamic web functionality.
- Solid understanding of AWS services like EC2, S3, RDS, etc.
- Proficient in MySQL/SQL database design and optimization.
- Previous experience leading a team or managing developers (must-have).
- Strong problem-solving, debugging, and analytical skills.
- Excellent communication and leadership skills.

Preferred Qualifications:
- Familiarity with CI/CD pipelines and DevOps practices.
- Experience with RESTful APIs and third-party integrations.
- Knowledge of version control tools like Git.
- Bachelor's/Master's degree in Computer Science or a related field.
Posted 3 weeks ago
7.0 - 12.0 years
20 - 35 Lacs
Pune
Work from Office
- 7+ years in software engineering, with 4+ years using AWS.
- Programming languages: C# and Python, along with SQL and Spark.
- The position requires a minimum three-hour overlap with team members in the US-Pacific time zone.
- Strong experience with some (or all) of the following: Lambda and Step Functions, API Gateway, Fargate, ECS, S3, SQS, Kinesis, Firehose, DynamoDB, RDS, Athena, and Glue.
- Solid foundation in data structures and algorithms, with in-depth knowledge of and passion for coding standards and proven design patterns (e.g., RESTful and GraphQL APIs).

You might also have:
- DevOps experience (GitHub, GitHub Actions, Docker).
- Experience building CI/CD and server/deployment automation solutions, and container orchestration technologies.
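Several postings here ask for hands-on Lambda experience. A minimal sketch of a Lambda-style handler in Python (the event shape mimics an API Gateway proxy integration; the `order_id` field and response format are illustrative assumptions, not from the posting):

```python
import json

def handler(event, context=None):
    """Hypothetical Lambda entry point: Lambda calls handler(event, context),
    where event is a dict. Returning statusCode/body matches the API Gateway
    proxy-integration contract."""
    try:
        body = json.loads(event.get("body") or "{}")
        order_id = body["order_id"]
    except (json.JSONDecodeError, KeyError):
        # Malformed or missing input maps to a client error, not a crash.
        return {"statusCode": 400,
                "body": json.dumps({"error": "order_id required"})}
    return {"statusCode": 200,
            "body": json.dumps({"order_id": order_id, "status": "queued"})}
```

Because the handler is a plain function of a dict, it can be unit-tested locally without any AWS infrastructure.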
Posted 3 weeks ago
4.0 - 9.0 years
10 - 20 Lacs
Bengaluru
Work from Office
Qualifications:
- 4+ years of software development experience, including work with cloud technologies.
- Bachelor's or Master's degree in Computer Science, Engineering, or equivalent experience.
- Proficiency in one or more modern programming languages (e.g., Python, Java, Go, Node.js).
- Experience with cloud platforms such as AWS, Azure, or Google Cloud.
- Experience with microservices architecture, distributed systems, and event-driven design.
- Expertise in designing and consuming RESTful APIs, and familiarity with GraphQL.
- Hands-on experience with CI/CD pipelines, infrastructure as code (e.g., Terraform, CloudFormation), and automated deployments.
- Strong understanding of relational and NoSQL databases.
- Knowledge of SaaS-specific security practices (e.g., OWASP, data encryption, identity management).
- Strong understanding of software development methodologies and tools.
- Familiarity with containerization (Docker) and orchestration (Kubernetes).
- Knowledge of monitoring and logging tools.
- Experience with distributed systems and data-intensive applications.
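Consuming APIs in a distributed system, as described above, usually means tolerating transient failures. A common pattern is retry with exponential backoff and jitter; a stdlib-only sketch (the attempt counts and delays are arbitrary example values):

```python
import random
import time

def retry(fn, attempts=5, base_delay=0.1, max_delay=2.0, sleep=time.sleep):
    """Call fn(), retrying on exception with exponential backoff plus jitter.
    The injectable `sleep` makes the policy unit-testable without waiting."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # budget exhausted: surface the last error
            delay = min(max_delay, base_delay * (2 ** attempt))
            sleep(delay * random.uniform(0.5, 1.0))  # jitter avoids thundering herds
```

Wrapping an HTTP call in `retry` smooths over brief outages while the exponential growth keeps a hard-down dependency from being hammered.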
Posted 3 weeks ago
6.0 - 11.0 years
25 - 37 Lacs
Hyderabad, Bengaluru, Delhi / NCR
Work from Office
Azure Expertise:
- Proven experience with Azure cloud services, especially Azure Data Factory, Azure SQL Database, and Azure Databricks
- Expert in PySpark data processing and analytics
- Strong background in building and optimizing data pipelines and workflows

Required Candidate Profile:
- Solid experience with data modeling, ETL processes, and data warehousing
- Performance tuning: ability to optimize data pipelines and jobs for scalability and performance, troubleshooting and resolving performance issues
Posted 4 weeks ago
6.0 - 9.0 years
14 - 22 Lacs
Pune, Chennai
Work from Office
Hiring for a top IT company
Designation: Python Developer
Skills: AWS SDK + AI services integration
Location: Pune/Chennai
Exp: 6-8 yrs
Best CTC
Contact: Surbhi: 9887580624, Anchal: 9772061749, Gitika: 8696868124, Shivani: 7375861257
Team Converse
Posted 4 weeks ago
9.0 - 13.0 years
32 - 40 Lacs
Ahmedabad
Remote
About the Role: We are looking for a hands-on AWS Data Architect or Lead Engineer to design and implement scalable, secure, and high-performing data solutions. This is an individual contributor role where you will work closely with data engineers, analysts, and stakeholders to build modern, cloud-native data architectures across real-time and batch pipelines.

Experience: 7-15 Years
Location: Fully Remote
Company: Armakuni India

Key Responsibilities:
- Data Architecture Design: Develop and maintain a comprehensive data architecture strategy aligned with business objectives and the technology landscape.
- Data Modeling: Create and manage logical, physical, and conceptual data models to support various business applications and analytics.
- Database Design: Design and implement database solutions, including data warehouses, data lakes, and operational databases.
- Data Integration: Oversee the integration of data from disparate sources into unified, accessible systems using ETL/ELT processes.
- Data Governance: Implement and enforce data governance policies and procedures to ensure data quality, consistency, and security.
- Technology Evaluation: Evaluate and recommend data management tools, technologies, and best practices to improve data infrastructure and processes.
- Collaboration: Work closely with data engineers, data scientists, business analysts, and other stakeholders to understand data requirements and deliver effective solutions.
- Documentation: Create and maintain documentation covering data architecture, data flows, data dictionaries, and system interfaces.
- Performance Tuning: Optimize database performance through tuning, indexing, and query optimization.
- Security: Ensure data security and privacy by implementing best practices for data encryption, access controls, and compliance with relevant regulations (e.g., GDPR, CCPA).

Required Skills:
- Helping project teams with solutions architecture, troubleshooting, and technical implementation assistance.
- Proficiency in SQL and database management systems (e.g., MySQL, PostgreSQL, Oracle, SQL Server).
- Minimum 7 to 15 years of experience in data architecture or related roles.
- Experience with big data technologies (e.g., Hadoop, Spark, Kafka, Airflow).
- Expertise with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services.
- Knowledge of data integration tools (e.g., Informatica, Talend, Fivetran, Meltano).
- Understanding of data warehousing concepts and tools (e.g., Snowflake, Redshift, Synapse, BigQuery).
- Experience with data governance frameworks and tools.
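Performance tuning "through tuning, indexing, and query optimization" can be demonstrated end to end with SQLite from the Python standard library (the table, column, and index names are invented for the demo):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, payload TEXT)")
conn.executemany("INSERT INTO events VALUES (?, ?)",
                 [(i % 100, "x") for i in range(1000)])

def plan(sql, params=()):
    """Return SQLite's query plan as one string; the detail text is the last column."""
    rows = conn.execute("EXPLAIN QUERY PLAN " + sql, params).fetchall()
    return " ".join(r[-1] for r in rows)

before = plan("SELECT * FROM events WHERE user_id = ?", (7,))
conn.execute("CREATE INDEX idx_events_user ON events(user_id)")
after = plan("SELECT * FROM events WHERE user_id = ?", (7,))
# `before` reports a full table scan; `after` reports USING INDEX idx_events_user
```

The same workflow, reading the planner's output before and after adding an index, carries over to `EXPLAIN` in PostgreSQL, MySQL, or a warehouse engine.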
Posted 4 weeks ago
6.0 - 10.0 years
25 - 30 Lacs
Noida, Remote
Work from Office
Job Title: Full Stack Software Developer
Experience Required: 6+ Years
Location: Noida / Remote
Employment Type: Full-Time

Job Summary: We are seeking a talented and motivated Full Stack Software Developer with 6+ years of experience to join our dynamic team. The ideal candidate should be highly skilled in React and Node.js, with a solid grasp of GraphQL and AWS being a significant advantage. You will be instrumental in designing, developing, and maintaining scalable, efficient, and user-centric applications across the entire technology stack.

Key Responsibilities:
- Design & Development: Build, deploy, and maintain robust front-end and back-end applications using React and Node.js.
- API Integration: Create and consume RESTful and GraphQL APIs to support dynamic client-server interactions.
- System Architecture: Contribute to the design of scalable and maintainable software systems.
- Cloud Integration: Leverage AWS services (e.g., Lambda, S3, EC2) to host and scale applications efficiently.
- Collaboration: Work closely with cross-functional teams including product managers, designers, and other developers.
- Code Quality: Maintain clean, testable, and maintainable code following best practices.
- Troubleshooting: Diagnose and resolve issues across the stack to ensure high performance and reliability.

Skills and Qualifications

Required:
- Strong proficiency in JavaScript/TypeScript, React, and Node.js.
- Solid understanding of front-end development concepts (state management, component lifecycle, performance tuning).
- Experience working with REST and/or GraphQL APIs.
- Familiarity with relational databases like PostgreSQL or similar.
- Excellent problem-solving abilities and experience in Agile development environments.

Preferred:
- Hands-on experience with GraphQL and tools like Apollo.
- Working knowledge of AWS services such as EC2, S3, Lambda, API Gateway, and DynamoDB.
- Experience with CI/CD tools (e.g., GitHub Actions, Jenkins).
- Understanding of automated testing using frameworks like Jest and Cypress.
Posted 4 weeks ago
5.0 - 10.0 years
20 - 25 Lacs
Chennai, Bengaluru
Work from Office
Role & Responsibilities:
• Application Development: Design, develop, and maintain dealership applications using Java, Spring Boot, and a microservices architecture.
• Cloud Deployment: Build and manage cloud-native applications on AWS, leveraging services such as Lambda, ECS, RDS, S3, DynamoDB, and API Gateway.
• System Architecture: Develop and maintain scalable, high-availability architectures to support large-scale dealership operations.
• Database Management: Work with relational and NoSQL databases such as PostgreSQL and DynamoDB for efficient data storage and retrieval.
• Integration & APIs: Develop and manage RESTful APIs and integrate with third-party services, including OEMs and dealership management systems (DMS).
• DevOps & CI/CD: Implement CI/CD pipelines using tools like Jenkins, GitHub Actions, or AWS CodePipeline for automated testing and deployments.
• Security & Compliance: Ensure application security, data privacy, and compliance with industry regulations.
• Collaboration & Mentorship: Work closely with product managers, designers, and other engineers. Mentor junior developers to maintain best coding practices.

Technical Skills Required:
• Programming Languages: Java (8+), Spring Boot, Hibernate
• Cloud Platforms: AWS (Lambda, S3, EC2, ECS, RDS, DynamoDB, CloudFormation)
• Microservices & API Development: RESTful APIs, API Gateway
• Database Management: PostgreSQL, DynamoDB
• DevOps & Automation: Docker, Kubernetes, Terraform, CI/CD pipelines
• Testing & Monitoring: JUnit, Mockito, Prometheus, CloudWatch
• Version Control & Collaboration: Git, GitHub, Jira, Confluence

Nice-to-Have Skills:
• Experience with serverless architectures (AWS Lambda, Step Functions)
• Exposure to event-driven architectures (Kafka, SNS, SQS)

Qualifications:
• Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field
• 5+ years of hands-on experience in Java-based backend development
• Strong problem-solving skills with a focus on scalability and performance.
Posted 4 weeks ago
3.0 - 8.0 years
5 - 10 Lacs
Bengaluru
Work from Office
The Core AI, BI & Data Platforms Team has been established to create, operate, and run the Enterprise AI, BI, and Data platforms that reduce time to market for reporting, analytics, and data science teams to run experiments, train models, and generate insights, as well as to evolve and run the CoCounsel application and its shared capability, the CoCounsel AI Assistant. The Enterprise Data Platform aims to provide self-service capabilities for fast and secure ingestion and consumption of data across TR.

At Thomson Reuters, we are recruiting a team of motivated Cloud professionals to transform how we build, manage, and leverage our data assets. The Data Platform team in Bangalore is seeking an experienced Software Engineer with a passion for engineering cloud-based data platform systems. Join our dynamic team as a Software Engineer and take a pivotal role in shaping the future of our Enterprise Data Platform. You will develop and implement data processing applications and frameworks on cloud-based infrastructure, ensuring the efficiency, scalability, and reliability of our systems.

About the Role

In this opportunity as the Software Engineer, you will:
- Develop data processing applications and frameworks on cloud-based infrastructure in partnership with Data Analysts and Architects, with guidance from the Lead Software Engineer.
- Innovate with new approaches to meet data management requirements.
- Make recommendations about platform adoption, including technology integrations, application servers, libraries, AWS frameworks, documentation, and usability by stakeholders.
- Contribute to improving the customer experience.
- Participate in code reviews to maintain a high-quality codebase.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Work closely with product owners, designers, and other developers to understand requirements and deliver solutions.
- Effectively communicate and liaise across the data platform & management teams.
- Stay updated on emerging trends and technologies in cloud computing.

About You

You're a fit for the role of Software Engineer if you meet all or most of these criteria:
- Bachelor's degree in Computer Science, Engineering, or a related field
- 3+ years of relevant experience implementing data lakes and data management technologies for large-scale organizations.
- Experience building & maintaining data pipelines with excellent run-time characteristics such as low latency, fault tolerance, and high availability.
- Proficient in the Python programming language.
- Experience in AWS services and management, including serverless, container, queueing, and monitoring services such as Lambda, ECS, API Gateway, RDS, DynamoDB, Glue, S3, IAM, Step Functions, CloudWatch, SQS, and SNS.
- Good knowledge of consuming and building APIs
- Business Intelligence tools like Power BI
- Fluency in querying languages such as SQL
- Solid understanding of software development practices such as version control via Git, CI/CD, and release management
- Agile development cadence
- Good critical thinking, communication, documentation, troubleshooting, and collaboration skills.
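The "fault-tolerance" requirement for pipelines above can be sketched in plain Python: retry each record a bounded number of times, then divert persistent failures to a dead-letter list rather than failing the whole run (the function names and retry policy are illustrative, not from the posting):

```python
def run_pipeline(records, transform, max_retries=2):
    """Apply transform to each record; failures retry, then land in a
    dead-letter list so one bad record cannot stall the whole batch."""
    processed, dead_letter = [], []
    for record in records:
        for attempt in range(max_retries + 1):
            try:
                processed.append(transform(record))
                break
            except Exception as exc:
                if attempt == max_retries:
                    dead_letter.append((record, str(exc)))  # quarantine for inspection
    return processed, dead_letter
```

In an AWS setting the dead-letter list maps naturally onto an SQS dead-letter queue, which the posting's service list already includes.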
Posted 4 weeks ago
4.0 - 6.0 years
7 - 9 Lacs
Bengaluru
Hybrid
We are looking for an experienced Software Engineer - Informatica with 4 to 6 years of hands-on expertise in designing, developing, and optimizing large-scale ETL solutions using Informatica PowerCenter. The ideal candidate will lead ETL projects, mentor junior developers, and ensure high-performance data integration across enterprise systems.

About The Role

In this role as Software Engineer, you will:
- Analyze business and functional requirements to design and implement scalable data integration solutions
- Understand and interpret High-Level Design (HLD) documents and convert them into detailed Low-Level Design (LLD)
- Develop robust, reusable, and optimized Informatica mappings, sessions, and workflows
- Apply mapping optimization and performance-tuning techniques to ensure efficient ETL processes
- Conduct peer code reviews and suggest improvements for reliability and performance
- Prepare and execute comprehensive unit test cases and support system/integration testing
- Maintain detailed technical documentation, including LLDs, data flow diagrams, and test cases
- Build data pipelines and transformation logic in Snowflake, ensuring performance and scalability
- Develop and manage Unix shell scripts for automation, scheduling, and monitoring of ETL jobs
- Collaborate with cross-functional teams to support UAT, deployments, and production issues.

About You

You are a fit for this position if your background includes:
- 4-6 years of strong hands-on experience with Informatica PowerCenter
- Proficiency in developing and optimizing ETL mappings, workflows, and sessions
- Solid experience with performance-tuning techniques and best practices in ETL processes
- Hands-on experience with Snowflake for data loading, SQL transformations, and optimization
- Strong skills in Unix/Linux scripting for job automation
- Experience converting HLDs into LLDs and defining unit test cases
- Knowledge of data warehousing concepts, data modelling, and data quality frameworks

Good to Have:
- Knowledge of the Salesforce data model and integration (via Informatica or API-based solutions)
- Exposure to AWS cloud services like S3, Glue, Redshift, Lambda, etc.
- Familiarity with relational databases such as SQL Server and PostgreSQL
- Experience with job schedulers like Control-M, ESP, or equivalent
- Agile methodology experience and tools such as JIRA, Confluence, and Git
- Knowledge of DBT (Data Build Tool) for data transformation and orchestration
- Experience with Python scripting for data manipulation, automation, or integration tasks.
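An ETL mapping of the kind described above is essentially a per-row transformation, and the "unit test cases" bullet suggests isolating it as a pure function. A plain-Python sketch of the idea (field names and rules are invented for illustration, not taken from the posting):

```python
import csv
import io

def map_row(row):
    """One mapping rule per field: trim, cast, derive, mirroring an ETL mapping."""
    return {
        "customer_id": int(row["customer_id"]),
        "name": row["name"].strip().title(),
        "revenue": round(float(row["revenue"]), 2),
    }

def run_mapping(csv_text):
    """Run the mapping over a CSV source and return the transformed rows."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [map_row(r) for r in reader]
```

Keeping `map_row` free of I/O means each mapping rule can be asserted against a single hand-built row, which is the cheapest place to catch transformation bugs.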
Posted 4 weeks ago
8.0 - 10.0 years
10 - 15 Lacs
Gurugram
Work from Office
The Team: Financial Risk Analytics at S&P Global provides products and solutions to financial institutions to measure and manage their counterparty credit risk, market risk, regulatory risk capital, and derivative valuation adjustments. Using the latest analytics and technology, such as a fully vectorized pricing library, machine learning, and a big data stack for scalability, our products and solutions are used by everyone from the largest tier-one banks to smaller niche firms. Our products are available deployed, in the cloud, or run as a service. We need an enthusiastic and skilled Senior Python Developer who is interested in learning about quantitative analytics and perhaps looking to make a career at the intersection of financial analytics, big data, and mathematics!

The Impact: You will be working on a strategic component that allows clients to extract, on demand, the data required for pricing and risk calculations. This is an essential entry point to a risk calculation, which requires speed to market and good design to drive efficient and robust workflows.

What's in it for you: The successful candidate will gain exposure to risk analytics and the latest trending technology, allowing you to grow into a hybrid role specializing in both financial markets and technology: a highly rewarding, challenging, and marketable position in which to gain skills.

Responsibilities:
- Work on the Market Risk solution with a best-of-breed technology stack involving Python 3.10+, Airflow, Pandas, NumPy, and ECS (AWS).
- Join a fast-paced, dynamic team environment, building commercial products that are at the heart of the business and contribute directly to revenue generation.
- Design and implement end-to-end applications in Python, with an emphasis on efficiently writing functions over large datasets.
- Interpret and analyse business use cases and feature requests into technical designs and development tasks.
- Participate in regular design and code review meetings; be a responsive team player in system architecture and design discussions.
- Take pride in the high quality of your own work; always follow quality standards (unit tests, integration tests, and documented code).
- Be happy to coach and mentor junior engineers.
- Be delivery focused, have a passion for technology, and enjoy offering new ideas and approaches.

What We're Looking For:
- Bachelor's degree in Computer Science, Engineering, or a related discipline, or equivalent experience
- Strong software development experience; minimum 8 years' experience developing applications using Python, including Python 3.10+
- Core Python with rich knowledge of OO methodologies and design; experience writing Python code that is scalable and performant
- Experience with complex data types when designing, and anticipating issues that impact performance (under ETL processes) by generating metrics using industry-adopted profiling tools during development
- Experience working on AWS: ECS, S3, and ideally MWAA (hosted Airflow on AWS)
- Experience in data engineering/orchestration and scalable, efficient flow design; experience developing data pipelines using Airflow
- Good working competency in Docker, Git, and Linux
- Good working knowledge of Pandas and NumPy
- Understanding of CI/CD pipelines and test frameworks
- Agile and XP (Scrum, Kanban, TDD)
- Experience with cloud-based infrastructures, preferably AWS
- Demonstrable technical capacity in understanding technical deliveries and dependencies; strong experience working on software engineering projects in an Agile manner
- Fluent in English; a passionate individual who thrives on development and data and is hands-on.
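"Efficiently writing functions on large datasets," as this posting puts it, often comes down to streaming data in bounded chunks instead of materializing it. A stdlib-only sketch (the chunk size and the mean aggregate are arbitrary choices for the demo, not the team's actual workload):

```python
from itertools import islice

def chunked(iterable, size):
    """Yield lists of at most `size` items, so only one chunk is in memory at a time."""
    it = iter(iterable)
    while True:
        chunk = list(islice(it, size))
        if not chunk:
            return
        yield chunk

def running_mean(stream, chunk_size=10_000):
    """Single-pass mean over an arbitrarily large stream of numbers."""
    total, count = 0.0, 0
    for chunk in chunked(stream, chunk_size):
        total += sum(chunk)
        count += len(chunk)
    return total / count if count else float("nan")
```

The same chunking discipline is what Pandas' `read_csv(chunksize=...)` and NumPy block processing give you at scale; the generator version just makes the memory bound explicit.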
Posted 4 weeks ago
6.0 - 11.0 years
8 - 15 Lacs
Hyderabad
Work from Office
The Team: This team is part of the global Application Operations and Infrastructure group that provides production support for Ratings applications. These applications are critical for the analysts who drive the business through their actions. The team is responsible for the high availability and resiliency of these applications.

The Impact: As part of a global team of engineers, provide production support for Tier-1 business-critical applications. Troubleshoot application issues and work with the infrastructure and database teams to triage major incidents. Contribute to the delivery of innovative and continuously highly reliable technology services. Maintain a strong focus on developing shared integration services with automation and cloud enablement, and guide the team in designing technical solutions. Become an integral part of a high-performing global network of engineers working from India, Denver, New York, and London to help advance our technology.

What's in it for you:
- Working with a team of highly skilled, ambitious, and results-oriented professionals.
- An ever-challenging environment to hone your existing skills in automation, performance, service-layer testing, SQL scripting, etc.
- Plenty of skill-building, knowledge-sharing, and innovation opportunities.
- Building a fulfilling career with a global financial technology company.
- The ability to lead and build a world-class production support group.
- A highly technical, hands-on role that will help enhance team skills.
- Work on Tier-1 applications that are in the critical path for the business.
- The ability to work with cutting-edge technologies such as AWS, Oracle, and Ansible.
- The ability to grow within an organization that's part of the global team.

Responsibilities: This role requires extensive skills in operating within the AWS cloud platform, along with deep expertise in database engineering, performance tuning, backup and recovery solutions (such as Cohesity), cloud database technologies, and the auditing and security of database systems.
- Hands-on experience working with AWS, encompassing key services such as IAM (Identity and Access Management), Compute, Storage, Elastic Load Balancing, RDS (Relational Database Service), VPC (Virtual Private Cloud), TGW (Transit Gateway), Route 53, ACM, serverless computing, containerization, account administration, CloudWatch, CloudTrail, etc. Additional experience with other cloud providers is advantageous.
- Proficiency with configuration management tools such as Ansible.
- Solid understanding of CI/CD pipelines, utilizing tools such as Azure DevOps and GitHub for seamless integration and deployment.
- Proficiency in scripting languages such as PowerShell, Bash, and Python.
- Demonstrated ability to learn new technologies quickly and integrate them into existing systems.
- Collaborate with cross-functional teams to ensure the stability, security, and efficiency of our database environment.
- Support and resolve infrastructure-related issues across different business applications.
- As part of a global team of engineers, deliver innovative and continuously highly reliable technology services.
- Communicate well and manage multiple initiatives with multiple engineers, potentially across multiple time zones.
- Participate in on-call and a weekly rotating shift schedule.
- Involvement in architecture and development design reviews for new implementation and integration projects.
- Troubleshoot application issues and work with the infrastructure team to triage major incidents.
- Work with business users to understand needs and issues, develop root-cause analyses, and work with the team on solutions and enhancements.
- Manage error budgets to measure risk and balance availability against feature development.
- Drive automation to reduce manual toil.
- Measure, track, and report the SLOs.
- Create and manage systems and process documentation.
- Analyse and conduct post-incident reviews and drive the resulting actions.

What we're looking for:

Basic Qualifications:
- 6+ years of IT experience
- Bachelor's or Master's degree in Computer Science, Engineering, or a related subject
- Ability to architect highly available applications and servers on the cloud, adhering to best practices.
- Hands-on experience with automation tooling such as Shell, Python, Ansible, and Terraform
- Hands-on experience with DevOps tools such as ADO, Jenkins, Ansible Tower, and Docker.
- Hands-on experience integrating AWS services such as VPC, EC2, Route 53, and S3 to create scalable application environments.
- Experience performing root-cause analyses and automating solutions to address underlying issues.
- Exposure to database technologies such as Oracle, PostgreSQL, SQL Server, and MongoDB is desirable.
- A team player capable of high performance and flexibility in a dynamic working environment.
- Skill and ability to train others on technical and procedural topics.
- Ability to support and resolve infrastructure-related issues as required.

Preferred Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related technical discipline
- Proven working experience in AWS cloud platform engineering
- Expert knowledge of observability tools such as Splunk and OpenTelemetry.
- Expert knowledge of automating the building and deployment of containerized applications
- Expertise in infrastructure-as-code automation
- Certification in AWS cloud technologies or DevOps preferred.
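The "manage error budgets" responsibility above is concrete arithmetic: an availability SLO implies a fixed allowance of failed requests per window. A sketch (the SLO value and request counts are examples, not targets from the posting):

```python
def error_budget(slo, total_requests, failed_requests):
    """Return (budget, consumed) for an availability SLO over one window.
    budget  = failures the SLO permits;
    consumed = fraction of that allowance already spent (>1.0 means blown)."""
    budget = (1.0 - slo) * total_requests
    consumed = failed_requests / budget if budget else float("inf")
    return budget, consumed

# e.g. a 99.9% SLO over 1,000,000 requests allows 1,000 failures
```

SRE practice then gates risk on `consumed`: plenty of budget left means releases can proceed, while an exhausted budget shifts effort toward reliability work.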
Posted 4 weeks ago
5.0 - 8.0 years
1 - 2 Lacs
Kochi
Remote
Job Description:
Job Title: AWS Cloud Engineer
Location: Kochi, India (Remote Option Available)
We are looking for a highly skilled AWS Cloud Engineer with a minimum of 5 years of hands-on experience in AWS cloud technologies. The ideal candidate will have strong expertise in AWS services such as S3, EC2, MSK, Glue, DMS, and SageMaker, along with solid development experience in Python and Docker. This role involves troubleshooting issues, reviewing solution designs, and coding high-quality implementations.
Key Responsibilities:
- Work extensively with AWS services including S3, EC2, MSK, Glue, DMS, and SageMaker
- Develop, containerize, and deploy applications using Python and Docker
- Design and review system architecture and cloud-based solutions
- Troubleshoot and resolve issues in AWS infrastructure and application layers
- Collaborate with development and DevOps teams to build scalable and secure applications
Requirements:
- Minimum 5 years of hands-on experience in AWS Cloud
- Proficiency in Python and containerization using Docker
- Strong understanding of AWS data and streaming services
- Experience with AWS Glue, DMS, and SageMaker
- Ability to troubleshoot issues, analyze root causes, and implement effective solutions
- Strong communication and problem-solving skills
Preferred Qualifications:
- AWS Certification (Associate or Professional level) is a plus
Posted 4 weeks ago
6.0 - 9.0 years
30 - 32 Lacs
Hyderabad, Coimbatore, Bengaluru
Work from Office
Job Summary: We are seeking a skilled Cloud Migration Consultant with hands-on experience in assessing and migrating complex applications from AWS to Azure. The ideal candidate will work closely with Microsoft business units, participating in application intake, assessment, and migration planning. This role includes creating migration artifacts, leading client interactions, and supporting application modernization initiatives on Azure with occasional AWS exposure.
Key Responsibilities:
- Assess application readiness by documenting architecture, dependencies, and migration strategies.
- Conduct stakeholder interviews and generate discovery insights using tools like Azure Migrate, CloudockIt, and PowerShell.
- Develop architecture diagrams and migration playbooks, and manage Azure DevOps boards.
- Set up and configure applications in on-premises and cloud environments, primarily Azure.
- Support proof-of-concepts (PoCs) and provide expert advice on migration and modernization options.
- Collaborate with application, database, and infrastructure teams to ensure a smooth transition to migration factory teams.
- Track project progress, identify blockers and risks, and report timely status updates to leadership.
Required Skills and Qualifications:
- Minimum 4 years of experience in cloud migration and application assessment.
- Strong expertise in Azure IaaS and PaaS services (e.g., VMs, App Services, Azure Data Factory).
- Familiarity with AWS IaaS and PaaS components (e.g., EC2, RDS, Glue, S3).
- Proficiency in programming languages and frameworks including Java (Spring Boot), C#, .NET, Python, Angular, React.js, and REST APIs.
- Working knowledge of Kafka, Docker, Kubernetes, and Azure DevOps.
- Solid understanding of network infrastructure including VNets, NSGs, Firewalls, and WAFs.
- Experience with IAM concepts and technologies such as OAuth, SAML, Okta, and SiteMinder.
- Exposure to Big Data technologies like Databricks, Hadoop, Oracle, and DocumentDB.
Preferred Qualifications:
- Azure or AWS cloud certifications.
- Prior experience with enterprise-scale cloud migration projects, especially within the Microsoft ecosystem.
- Excellent communication skills and proven ability to manage stakeholder relationships effectively.
Location: Hyderabad / Bangalore / Coimbatore / Pune
Posted 4 weeks ago
3.0 - 5.0 years
4 - 7 Lacs
Hyderabad
Work from Office
- Data Analysis: Conduct in-depth analysis of data to identify trends, anomalies, and opportunities, utilizing SQL, AWS, and Python to extract and manipulate data.
- Business Transformation: Translate existing SQL queries into business transformation logic, enabling the conversion of raw data into actionable insights to drive strategic decision-making.
- Requirements Gathering: Collaborate with business stakeholders to gather and document clear and concise business requirements, ensuring a thorough understanding of data needs.
- Documentation: Develop and maintain documentation related to data analysis, transformation, and reporting processes, ensuring knowledge transfer and continuity.
- AWS Integration: Leverage AWS services to facilitate data extraction, storage, and analysis, making data readily available for the business.
- Quality Assurance: Implement data quality checks and validation processes to ensure the accuracy and integrity of data used in analyses.
Qualifications:
- Bachelor's degree in Business, Computer Science, or a related field.
- Proven experience as a Business Analyst with a strong focus on data analysis and transformation.
- Proficiency in SQL for querying and manipulating relational databases.
- Awareness of AWS services such as Redshift, S3, Athena, Lambda, Step Functions, and AWS Batch.
- Proficiency in Python for data analysis and scripting.
- Experience converting SQL queries into actionable business transformation logic.
- Strong problem-solving and critical-thinking skills.
- Excellent communication and interpersonal skills to work effectively with cross-functional teams and stakeholders.
- Attention to detail and a commitment to data accuracy and quality.
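Translating a SQL query into transformation logic, as this role describes, can be as simple as re-expressing a `GROUP BY` aggregation in code. A minimal sketch with hypothetical table and column names (`orders`, `region`, `amount`):

```python
from collections import defaultdict

# Hypothetical rows standing in for a relational "orders" table.
orders = [
    {"region": "South", "amount": 120.0},
    {"region": "North", "amount": 80.0},
    {"region": "South", "amount": 200.0},
]

def total_by_region(rows):
    """Python equivalent of:
    SELECT region, SUM(amount) FROM orders GROUP BY region;
    """
    totals = defaultdict(float)
    for row in rows:
        totals[row["region"]] += row["amount"]
    return dict(totals)
```

The same logic would typically run against rows fetched via a SQL client or an Athena query, with pandas or PySpark used at larger scale.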
Posted 4 weeks ago
6.0 - 11.0 years
15 - 30 Lacs
Hyderabad
Work from Office
Design, develop, and maintain scalable microservices using Java, Kotlin, and Spring Boot. Build and optimize data models and queries in MongoDB. Integrate and manage Apache Kafka for real-time data streaming and messaging. Implement CI/CD pipelines using Jenkins.
Required Candidate Profile
6+ years of experience in backend development with Java/Spring Boot. Experience with Jenkins for CI/CD automation. Familiarity with AWS services (EC2, S3, Lambda, RDS, etc.) or OpenShift for container orchestration.
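One concept behind Kafka's per-key ordering guarantee, relevant to the streaming work above, is deterministic key-based partitioning: records with the same key always land on the same partition. A simplified sketch (Kafka's default partitioner actually uses murmur2 hashing, not MD5; this only illustrates the idea):

```python
import hashlib

def partition_for(key: bytes, num_partitions: int) -> int:
    """Map a record key to a partition deterministically.

    Simplified stand-in for a Kafka-style partitioner: hashing the key
    and taking it modulo the partition count keeps all records for one
    key on one partition, preserving their relative order.
    """
    digest = hashlib.md5(key).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions
```

In a real Spring Boot service the producer client performs this mapping; understanding it matters when choosing keys so that related events stay ordered.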
Posted 4 weeks ago
8.0 - 12.0 years
25 - 40 Lacs
Chennai, Bengaluru, Delhi / NCR
Work from Office
AWS administration experience involving design, Landing Zone deployment, migration, and optimization. Design and develop AWS cloud solutions. Create architectural blueprints, diagrams, and documentation. Hands-on experience with AWS Terraform and CloudFormation automation.
Posted 4 weeks ago
3.0 - 8.0 years
5 - 15 Lacs
Hyderabad
Work from Office
Dear Candidates,
We are conducting a Walk-In Interview in Hyderabad for the position of Data Engineer on 20th/21st/22nd June 2025.
Position: Data Engineer
Job description:
- Expert knowledge in AWS Data Lake implementation and support (S3, Glue, DMS, Athena, Lambda, API Gateway, Redshift)
- Handling of data-related activities such as data parsing, cleansing, quality definition, data pipelines, storage, and ETL scripts
- Experience in programming languages: Python/PySpark/SQL
- Hands-on experience with data migration
- Experience consuming REST APIs using various authentication options within an AWS Lambda architecture
- Orchestrate triggers, debug, and schedule batch jobs using AWS Glue, Lambda, and Step Functions
- Understanding of AWS security features such as IAM roles and policies
- Exposure to DevOps tools
- AWS certification is highly preferred
Mandatory skills for Data Engineer: Python/PySpark, AWS Glue, Lambda, Redshift.
Date: 20th June 2025 to 22nd June 2025
Time: 9.00 AM to 6.00 PM
Eligibility: Any Graduate
Experience: 2-10 Years
Gender: Any
Interested candidates can walk in directly. For any queries, please contact us at +91 7349369478 / 8555079906
Interview Venue Details:
Selectify Analytics
Address: Capital Park (Jain Sadguru Capital Park), Ayyappa Society, Silicon Valley, Madhapur, Hyderabad, Telangana 500081
Contact Person: Mr. Deepak/Saqeeb/Ravi Kumar
Interview Time: 9.00 AM to 6.00 PM
Contact Number: +91 7349369478 / 8555079906
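The data parsing and cleansing duties described above often boil down to small, testable transformation functions applied record by record. A minimal sketch with hypothetical field names (`name`, `amount`) and an illustrative quality rule:

```python
def cleanse(record):
    """Cleanse one raw record: trim strings, coerce types, and drop
    records that fail a basic quality rule (illustrative sketch).
    Returns a clean dict, or None if the record should be rejected."""
    name = str(record.get("name", "")).strip()
    try:
        amount = float(record.get("amount"))
    except (TypeError, ValueError):
        return None  # unparseable amount fails the quality check
    if not name or amount < 0:
        return None  # empty name / negative amount fails the quality check
    return {"name": name, "amount": amount}

raw = [
    {"name": " Asha ", "amount": "10.5"},
    {"name": "", "amount": "3"},       # rejected: empty name
    {"name": "Ravi", "amount": "bad"}, # rejected: bad amount
]
clean = [r for r in (cleanse(x) for x in raw) if r is not None]
```

In an AWS Glue or Lambda job, the same function would run over records streamed from S3, with rejects routed to a quarantine location for review.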
Posted 4 weeks ago
7.0 - 12.0 years
6 - 16 Lacs
Noida, Gurugram, Delhi / NCR
Work from Office
Role & responsibilities
Total Experience: 7 yrs
Relevant experience: 6 yrs
Mandatory skills:
- Big Data technologies: Hadoop, Apache Spark, Hive
- AWS cloud services including S3, Redshift, EMR, etc.
- Strong expertise in RDBMS and SQL
Nice to have:
- Good experience in Linux and shell scripting
- Experience building data pipelines using Apache Airflow / Control-M
- Practical experience in Core Java/Python/Scala
- Experience with SDLC tools (e.g., Bamboo, JIRA, Git, Confluence, Bitbucket)
- Experience in Data Modelling, Data Quality, and Load Assurance
- Ability to communicate problems and solutions effectively with both business and technical stakeholders (written and verbal)
Added advantages:
- Scheduling tools: Control-M, Airflow
- Monitoring tools: Sumo Logic, Splunk
- Process familiarity with Incident Management
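Pipeline schedulers like Airflow and Control-M, mentioned above, fundamentally run tasks in dependency order over a directed acyclic graph. That ordering can be sketched with the standard library alone; the task names here are illustrative, not from any real pipeline:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline dependencies, Airflow-style but expressed with stdlib:
# each task maps to the set of tasks it depends on.
deps = {
    "extract": set(),
    "transform": {"extract"},
    "quality_check": {"transform"},
    "load": {"quality_check"},
}

# static_order() yields tasks so every dependency runs before its dependents.
order = list(TopologicalSorter(deps).static_order())
```

An actual Airflow DAG expresses the same graph with operators and `>>` chaining, and adds scheduling, retries, and monitoring on top.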
Posted 4 weeks ago
1.0 - 6.0 years
3 - 8 Lacs
Bengaluru
Work from Office
Responsibilities
- Design and develop new features to meet evolving business and technical needs
- Maintain and enhance existing functionality to ensure reliability and performance
- Collaborate directly with customers to gather requirements and understand business objectives
- Stay up to date with the latest technologies and apply them to influence project decisions and outcomes
Requirements
- 1+ year of experience developing commercial applications on .NET
- Good understanding of the Software Development Lifecycle
- Understanding of C#, including .NET 6/8 and .NET Framework 4.8
- Good knowledge and experience with Azure (Azure Functions, VMs, Cosmos DB, Azure SQL) or AWS (EC2, Lambda, S3, DynamoDB)
- Skills in front-end web development (React, Angular, TypeScript)
- Substantial knowledge of relational and non-relational databases
- Good knowledge of Event-Driven Architecture (CQRS & Event Sourcing), Domain-Driven Design, and Microservices
- Experience working with CI/CD (Azure DevOps, AWS CodePipeline)
- Experience with testing tools and techniques
- Good spoken English (at least B1 level according to CEFR)
Posted 1 month ago
8.0 - 13.0 years
15 - 25 Lacs
Gurugram
Remote
- Minimum 6 years of hands-on experience deploying, enhancing, and troubleshooting foundational AWS services (EC2, S3, RDS, VPC, CloudTrail, CloudFront, Lambda, EKS, ECS, etc.)
- 3+ years of experience with serverless technologies, services, and container technologies (Docker, Kubernetes, etc.):
  - Manage Kubernetes charts using Helm
  - Manage production application deployments in a Kubernetes cluster using kubectl
  - Expertise in deploying distributed apps with containers (Docker) and orchestration (Kubernetes, EKS)
  - Experience with infrastructure-as-code tools for provisioning and managing Kubernetes infrastructure
  - (Preferred) Certification in container orchestration systems and/or Certified Kubernetes Administrator
  - Experience with log management and analytics tools such as Splunk / ELK
- 3+ years of experience writing, debugging, and enhancing Terraform infrastructure-as-code scripts for EKS, EC2, S3, and other AWS services:
  - Expertise with key Terraform features such as infrastructure as code, execution plans, resource graphs, and change automation
  - Implement cluster services using Kubernetes and Docker, including building self-hosted Kubernetes clusters with Terraform
  - Manage provisioning of AWS infrastructure using Terraform
  - Develop and maintain infrastructure-as-code solutions using Terraform
- Ability to write scripts in JavaScript, Bash, Python, TypeScript, or similar languages
- Able to work independently and as part of a team to architect and implement new solutions and technologies
- Very strong written and verbal communication skills: able to communicate clearly and understandably, formally and informally, with all levels of employees and management
- Ability to identify, evaluate, learn, and POC new technologies for implementation
- Experience designing and implementing highly resilient AWS solutions
Posted 1 month ago
3.0 - 5.0 years
5 - 7 Lacs
Bengaluru
Work from Office
Responsibilities
- Develop key features and enhance the platform's performance, scalability, and security
- Design and implement secure, efficient APIs to support communication and data flow
- Optimize database performance and adhere to data standards
- Collaborate with team members to align on technical and business requirements
- Contribute to the maintenance of CI/CD processes and proactive monitoring
- Stay updated on advancements in PHP, Laravel, AWS, and healthcare technology
Requirements
- Experience with PHP frameworks, particularly Laravel, and a good understanding of modern development practices
- Knowledge of optimizing database performance, including query optimization and indexing
- Experience designing and implementing RESTful APIs with authentication standards such as OAuth 2.0
- Understanding of software architecture patterns, including MVC and microservices, to ensure maintainable and scalable code
CI/CD and Monitoring Skills
- Familiarity with CI/CD processes and tools, with experience in automating testing and deployment
- Knowledge of monitoring tools for system health and performance
Core Infrastructure Skills
- Practical experience deploying, managing, and scaling applications using AWS services like EC2, RDS, and S3
- Basic experience managing cloud-based architectures, including EC2 instances and load balancing
- Familiarity with containerization technologies such as Docker for application deployment
Nice to have
- Experience with front-end development; basic knowledge of front-end technologies (HTML, CSS, JavaScript) for collaborative development
Integration Skills
- Experience integrating third-party APIs, such as EMR systems or communication tools like Twilio
Regulatory Awareness
- Understanding of regulatory standards impacting healthcare technology, such as HIPAA compliance
Communication and Problem-Solving
- Strong communication skills to work effectively with team members and stakeholders
- Ability to analyze and troubleshoot technical issues to improve application performance
Posted 1 month ago