5.0 - 10.0 years
10 - 20 Lacs
Bengaluru, Mumbai (All Areas)
Work from Office
Key Responsibilities
- Design, develop, and optimize data pipelines using Python and AWS services such as Glue, Lambda, S3, EMR, Redshift, Athena, and Kinesis (see the sketch below).
- Implement ETL/ELT processes to extract, transform, and load data from various sources into centralized repositories (e.g., data lakes or data warehouses).
- Collaborate with cross-functional teams to understand business requirements and translate them into scalable data solutions.
- Monitor, troubleshoot, and enhance data workflows for performance and cost optimization.
- Ensure data quality and consistency by implementing validation and governance practices.
- Work on data security best practices in compliance with organizational policies and regulations.
- Automate repetitive data engineering tasks using Python scripts and frameworks.
- Leverage CI/CD pipelines for deployment of data workflows on AWS.
Required Skills and Qualifications
- Professional Experience: 5+ years of experience in data engineering or a related field.
- Programming: Strong proficiency in Python, with experience in libraries like pandas, PySpark, or boto3.
- AWS Expertise: Hands-on experience with core AWS services for data engineering, such as AWS Glue for ETL/ELT, S3 for storage, Redshift or Athena for data warehousing and querying, Lambda for serverless compute, Kinesis or SNS/SQS for data streaming, and IAM roles for security.
- Databases: Proficiency in SQL and experience with relational (e.g., PostgreSQL, MySQL) and NoSQL (e.g., DynamoDB) databases.
- Data Processing: Knowledge of big data frameworks (e.g., Hadoop, Spark) is a plus.
- DevOps: Familiarity with CI/CD pipelines and tools like Jenkins, Git, and CodePipeline.
- Version Control: Proficient with Git-based workflows.
- Problem Solving: Excellent analytical and debugging skills.
Optional Skills
- Knowledge of data modeling and data warehouse design principles.
- Experience with data visualization tools (e.g., Tableau, Power BI).
- Familiarity with containerization (e.g., Docker) and orchestration (e.g., Kubernetes).
- Exposure to other programming languages like Scala or Java.
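For flavor, a minimal sketch of the kind of S3-to-S3 Python transform this role describes; the bucket names, keys, and column logic are hypothetical placeholders, not the employer's actual pipeline:

```python
# Minimal sketch of an S3-to-S3 transform with pandas and boto3.
# Bucket names, keys, and the column logic are hypothetical placeholders.
import io

import boto3
import pandas as pd

s3 = boto3.client("s3")

def transform_orders(source_bucket: str, source_key: str,
                     dest_bucket: str, dest_key: str) -> None:
    # Extract: read a raw CSV from the landing bucket
    obj = s3.get_object(Bucket=source_bucket, Key=source_key)
    df = pd.read_csv(io.BytesIO(obj["Body"].read()))

    # Transform: basic validation and cleanup (illustrative only)
    df = df.dropna(subset=["order_id"]).drop_duplicates("order_id")
    df["order_date"] = pd.to_datetime(df["order_date"])

    # Load: write Parquet back to the curated zone (requires pyarrow)
    buf = io.BytesIO()
    df.to_parquet(buf, index=False)
    s3.put_object(Bucket=dest_bucket, Key=dest_key, Body=buf.getvalue())

if __name__ == "__main__":
    transform_orders("raw-landing-bucket", "orders/2024/01/orders.csv",
                     "curated-bucket", "orders/2024/01/orders.parquet")
```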
Posted 1 month ago
5.0 - 10.0 years
0 - 3 Lacs
Hyderabad
Hybrid
Dear Candidate,
Warm greetings from SAIS IT Services!
We are hiring a Java Developer for our client. Interested candidates can share their CV at Jyoti.r@saisservices.com. For queries, kindly reach me on 8360298749 with the details below.
Please fill in the following:
Total Exp:
CTC:
ECTC:
Notice Period:
Current Location:
Comfortable with Work from Office:
Job Description:
We're Hiring: Java Developer
Location: Hyderabad
Job Title: Java Developer
Experience: 5+ years
Work Mode: Hybrid (3 days WFO)
Strong hands-on experience in Java + AWS (Lambda mandatory, S3, EC2); 3 years in AWS.
Regards,
Jyoti Rani
8360298749
Jyoti.r@saisservices.com
Posted 1 month ago
7.0 - 12.0 years
10 - 20 Lacs
Hyderabad
Remote
Job Title: Senior Data Engineer
Location: Remote
Job Type: Full-time
Experience Level: 7+ years
About the Role: We are seeking a highly skilled Senior Data Engineer to join our team in building a modern data platform on AWS. You will play a key role in transitioning from legacy systems to a scalable, cloud-native architecture using technologies like Apache Iceberg, AWS Glue, Redshift, and Atlan for governance. This role requires hands-on experience across both legacy (e.g., Siebel, Talend, Informatica) and modern data stacks.
Responsibilities:
- Design, develop, and optimize data pipelines and ETL/ELT workflows on AWS.
- Migrate legacy data solutions (Siebel, Talend, Informatica) to modern AWS-native services.
- Implement and manage a data lake architecture using Apache Iceberg and AWS Glue (see the sketch below).
- Work with Redshift for data warehousing solutions, including performance tuning and modelling.
- Apply data quality and observability practices using Soda or similar tools.
- Ensure data governance and metadata management using Atlan (or other tools like Collibra or Alation).
- Collaborate with data architects, analysts, and business stakeholders to deliver robust data solutions.
- Build scalable, secure, and high-performing data platforms supporting both batch and real-time use cases.
- Participate in defining and enforcing data engineering best practices.
Required Qualifications:
- 7+ years of experience in data engineering and data pipeline development.
- Strong expertise with AWS services, especially Redshift, Glue, S3, and Athena.
- Proven experience with Apache Iceberg or similar open table formats (like Delta Lake or Hudi).
- Experience with legacy tools like Siebel, Talend, and Informatica.
- Knowledge of data governance tools like Atlan, Collibra, or Alation.
- Experience implementing data quality checks using Soda or equivalent.
- Strong SQL and Python skills; familiarity with Spark is a plus.
- Solid understanding of data modeling, data warehousing, and big data architectures.
- Strong problem-solving skills and the ability to work in an Agile environment.
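A minimal PySpark sketch of writing to an Apache Iceberg table through the Glue catalog, the kind of lakehouse work listed above. The catalog, database, table, and S3 paths are hypothetical, and the session assumes the Iceberg Spark runtime and Glue catalog integration are already on the classpath:

```python
# Minimal sketch: writing to an Apache Iceberg table via the AWS Glue catalog.
# Catalog/database/table names and S3 paths are hypothetical placeholders;
# assumes the iceberg-spark-runtime JAR and Glue catalog support are configured.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("iceberg-demo")
    .config("spark.sql.catalog.glue_catalog", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.glue_catalog.catalog-impl",
            "org.apache.iceberg.aws.glue.GlueCatalog")
    .config("spark.sql.catalog.glue_catalog.warehouse", "s3://demo-lake/warehouse/")
    .getOrCreate()
)

# Read raw data from the lake and land it in an Iceberg table.
df = spark.read.parquet("s3://demo-lake/raw/orders/")
df.writeTo("glue_catalog.analytics.orders").using("iceberg").createOrReplace()

# Iceberg tracks table snapshots, queryable via metadata tables:
spark.sql("SELECT * FROM glue_catalog.analytics.orders.snapshots").show()
```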
Posted 1 month ago
9.0 - 14.0 years
20 - 30 Lacs
Kochi, Bengaluru
Work from Office
Senior Data Engineer AWS (Glue, Data Warehousing, Optimization & Security)
We are looking for an experienced Senior Data Engineer (6+ years) with deep expertise in AWS cloud data services, particularly AWS Glue, to design, build, and optimize scalable data solutions. The ideal candidate will drive end-to-end data engineering initiatives, from ingestion to consumption, with a strong focus on data warehousing, performance optimization, self-service enablement, and data security. The candidate should also have experience consulting and troubleshooting in order to design best-fit solutions.
Key Responsibilities
- Consult with business and technology stakeholders to understand data requirements, troubleshoot, and advise on best-fit AWS data solutions.
- Design and implement scalable ETL pipelines using AWS Glue, handling structured and semi-structured data (see the sketch below).
- Architect and manage modern cloud data warehouses (e.g., Amazon Redshift, Snowflake, or equivalent).
- Optimize data pipelines and queries for performance, cost-efficiency, and scalability.
- Develop solutions that enable self-service analytics for business and data science teams.
- Implement data security, governance, and access controls.
- Collaborate with data scientists, analysts, and business stakeholders to understand data needs.
- Monitor, troubleshoot, and improve existing data solutions, ensuring high availability and reliability.
Required Skills & Experience
- 8+ years of experience in data engineering on the AWS platform.
- Strong hands-on experience with AWS Glue, Lambda, S3, Athena, Redshift, and IAM.
- Proven expertise in data modelling, data warehousing concepts, and SQL optimization.
- Experience designing self-service data platforms for business users.
- Solid understanding of data security, encryption, and access management.
- Proficiency in Python.
- Familiarity with DevOps practices and CI/CD.
- Strong problem-solving skills.
- Exposure to BI tools (e.g., QuickSight, Power BI, Tableau) for self-service enablement.
Preferred Qualifications
- AWS Certified Data Analytics – Specialty or Solutions Architect – Associate.
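As a flavor of the Glue work above, a minimal PySpark-based Glue job script; the catalog database, table, and output path are hypothetical placeholders, and the awsglue imports are provided inside the Glue job runtime rather than installed locally:

```python
# Minimal sketch of an AWS Glue ETL job script (runs inside the Glue job
# runtime, where the awsglue library is provided). Database, table, and
# path names are hypothetical placeholders.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a semi-structured source registered in the Glue Data Catalog.
dyf = glue_context.create_dynamic_frame.from_catalog(
    database="demo_db", table_name="raw_events"
)

# Drop obviously bad rows, then write partitioned Parquet to the curated zone.
cleaned = dyf.filter(lambda row: row["event_id"] is not None)
glue_context.write_dynamic_frame.from_options(
    frame=cleaned,
    connection_type="s3",
    connection_options={"path": "s3://demo-curated/events/",
                        "partitionKeys": ["event_date"]},
    format="parquet",
)

job.commit()
```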
Posted 1 month ago
7.0 - 12.0 years
22 - 27 Lacs
Hyderabad
Work from Office
Key Responsibilities
- Data Pipeline Development: Design, develop, and optimize robust data pipelines to efficiently collect, process, and store large-scale datasets for AI/ML applications.
- ETL Processes: Develop and maintain Extract, Transform, and Load (ETL) processes to ensure accurate and timely data delivery for machine learning models.
- Data Integration: Integrate diverse data sources (structured, unstructured, and semi-structured data) into a unified and scalable data architecture.
- Data Warehousing & Management: Design and manage data warehouses to store processed and raw data in a highly structured, accessible format for analytics and AI/ML models.
- AI/ML Model Development: Collaborate with Data Scientists to build, fine-tune, and deploy machine learning models into production environments, with a focus on model optimization, scalability, and operationalization.
- Automation: Implement automation techniques to support model retraining, monitoring, and reporting.
- Cloud & Distributed Systems: Work with cloud platforms (AWS, Azure, GCP) and distributed systems to store and process data efficiently, ensuring that AI/ML models are scalable and maintainable in the cloud environment.
- Data Quality & Governance: Implement data quality checks, monitoring, and governance frameworks to ensure the integrity and security of the data being used for AI/ML models.
- Collaboration: Work cross-functionally with Data Science, Business Intelligence, and other engineering teams to meet organizational data needs and ensure seamless integration with analytics platforms.
Required Skills and Qualifications
- Bachelor's or Master's degree in Computer Science, Engineering, Data Science, or a related field.
- Strong proficiency in Python for AI/ML and data engineering tasks (see the sketch below).
- Experience with AI/ML frameworks such as TensorFlow, PyTorch, Scikit-learn, and Keras.
- Proficiency in SQL and relational databases (e.g., MySQL, PostgreSQL, SQL Server).
- Strong experience with ETL pipelines and data wrangling on large datasets.
- Familiarity with cloud-based data engineering tools and services, e.g., AWS (S3, Lambda, Redshift), Azure, GCP.
- Solid understanding of big data technologies like Hadoop, Spark, and Kafka for data processing at scale.
- Experience managing and processing both structured and unstructured data.
- Knowledge of version control systems (e.g., Git) and agile development methodologies.
- Experience with containers and orchestration tools such as Docker and Kubernetes.
- Strong communication skills to collaborate effectively with cross-functional teams.
Preferred Skills
- Experience with data warehouses (e.g., Amazon Redshift, Google BigQuery, Snowflake).
- Familiarity with CI/CD pipelines for ML model deployment and automation.
- Familiarity with machine learning model monitoring and performance optimization.
- Experience with data visualization tools like Tableau, Power BI, or Plotly.
- Knowledge of deep learning models and frameworks.
- DevOps or MLOps experience for automating model deployment.
- Advanced statistics or math background for improving model performance and accuracy.
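A minimal scikit-learn sketch of the train-and-serialize step that sits at the end of such a pipeline; the feature names, target column, and output path are hypothetical:

```python
# Minimal sketch: training and persisting a model at the end of a data pipeline.
# Feature names, the target column, and the output path are hypothetical.
import joblib
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# In practice this frame would come from the curated zone (S3/Redshift/etc.).
df = pd.DataFrame({
    "amount": [12.0, 250.5, 3.2, 99.9, 41.0, 7.5],
    "num_items": [1, 5, 1, 3, 2, 1],
    "is_fraud": [0, 1, 0, 0, 1, 0],
})
X, y = df[["amount", "num_items"]], df["is_fraud"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.33, random_state=42
)

model = Pipeline([
    ("scale", StandardScaler()),
    ("clf", RandomForestClassifier(n_estimators=100, random_state=42)),
])
model.fit(X_train, y_train)
print("holdout accuracy:", model.score(X_test, y_test))

# Serialize so a downstream service (or retraining job) can load it.
joblib.dump(model, "model.joblib")
```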
Posted 1 month ago
5.0 - 8.0 years
3 - 7 Lacs
Hyderabad, Bengaluru
Work from Office
Key Responsibilities:
- Design, implement, and maintain cloud-based infrastructure on AWS.
- Manage and monitor AWS services, including EC2, S3, Lambda, RDS, CloudFormation, VPC, etc.
- Develop automation scripts for deployment, monitoring, and scaling using AWS services (see the sketch below).
- Collaborate with DevOps teams to automate build, test, and deployment pipelines.
- Ensure the security and compliance of cloud environments using AWS security best practices.
- Optimize cloud resource usage to reduce costs while maintaining high performance.
- Troubleshoot issues related to cloud infrastructure and services.
- Participate in capacity planning and disaster recovery strategies.
- Monitor application performance and make necessary adjustments to ensure optimal performance.
- Stay current with new AWS features and tools and evaluate their applicability for the organization.
Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proven experience as an AWS Engineer or in a similar cloud infrastructure role.
- In-depth knowledge of AWS services, including EC2, S3, RDS, Lambda, VPC, CloudWatch, etc.
- Proficiency in scripting languages such as Python, Shell, or Bash.
- Experience with infrastructure-as-code tools like Terraform or AWS CloudFormation.
- Strong understanding of networking concepts, cloud security, and best practices.
- Familiarity with containerization technologies (e.g., Docker, Kubernetes) is a plus.
- Excellent problem-solving, analytical, and troubleshooting skills.
- Strong communication skills, both written and verbal.
- AWS certifications (AWS Certified Solutions Architect, AWS Certified DevOps Engineer, etc.) are preferred.
Preferred Skills:
- Experience with serverless architectures and services.
- Knowledge of CI/CD pipelines and DevOps methodologies.
- Experience with monitoring and logging tools like CloudWatch, Datadog, or Prometheus.
- Knowledge in AWS FinOps.
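A minimal boto3 sketch of the monitoring-automation side of this role, pulling an average CPU metric from CloudWatch for each running EC2 instance; the region and the one-hour window are hypothetical choices:

```python
# Minimal sketch: report average CPU for running EC2 instances via CloudWatch.
# The region and the 1-hour window are hypothetical choices.
from datetime import datetime, timedelta, timezone

import boto3

ec2 = boto3.client("ec2", region_name="ap-south-1")
cloudwatch = boto3.client("cloudwatch", region_name="ap-south-1")

reservations = ec2.describe_instances(
    Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
)["Reservations"]

now = datetime.now(timezone.utc)
for res in reservations:
    for inst in res["Instances"]:
        stats = cloudwatch.get_metric_statistics(
            Namespace="AWS/EC2",
            MetricName="CPUUtilization",
            Dimensions=[{"Name": "InstanceId", "Value": inst["InstanceId"]}],
            StartTime=now - timedelta(hours=1),
            EndTime=now,
            Period=300,
            Statistics=["Average"],
        )
        points = stats["Datapoints"]
        avg = sum(p["Average"] for p in points) / len(points) if points else 0.0
        print(f"{inst['InstanceId']}: avg CPU {avg:.1f}% over the last hour")
```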
Posted 1 month ago
5.0 - 10.0 years
15 - 25 Lacs
Bengaluru, Mumbai (All Areas)
Work from Office
Hello Candidates,
We are hiring for a Java Developer. Please find the job description below.
Position: Java Developer
Experience: 5+ years
Location: Mumbai / Bengaluru
Skills: Java, SQL, AWS
JD for the position:
- Proven experience in Java development, with a strong understanding of object-oriented programming principles.
- Experience with AWS services, including ECS, S3, RDS, ElastiCache, and CloudFormation.
- Experience with microservices architecture and RESTful API design.
- Strong problem-solving skills and attention to detail.
- Experience in the financial services industry, particularly in trading or risk management, is a plus.
- Excellent communication and collaboration skills.
Important checkpoints for all requirements: candidates should have all necessary documents, as there is a very strict background verification covering all employment:
1) Documents from all previous companies (offer letters, experience letters, and relieving letters)
2) PF, UAN number, Form 16 & Form 26AS - mandatory
3) Educational documents - marksheets and degree certificates
Kindly revert with your acknowledgement and share your updated CV along with:
Total Experience:
Relevant Experience:
Current Salary:
Expected Salary:
Current Company / Last Company:
Notice Period / Last Working Date:
Reason for Job Change:
Current Location:
Preferred Location:
Have you applied for Mphasis before: YES/NO
Alternate Mail ID:
Alternate Phone No:
PAN Card No.:
NOTE: Interested candidates can share their resume at shrutia.talentsketchers@gmail.com
Regards,
Shruti TS
Posted 1 month ago
6.0 - 11.0 years
15 - 30 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Warm greetings from SP Staffing Services Private Limited!
We have an urgent opening with our CMMI Level 5 client for the below position. Please send your updated profile if you are interested.
Relevant Experience: 6 - 15 Yrs
Location: Pan India
Job Description:
Primarily looking for a data engineer with expertise in processing data pipelines using Databricks Spark SQL on Hadoop distributions like AWS EMR, Databricks, Cloudera, etc. The candidate should be:
- Very proficient in large-scale data operations using Databricks and overall very comfortable with Python (see the sketch below)
- Familiar with AWS compute, storage, and IAM concepts
- Experienced working with S3 data lakes as the storage tier
- Any ETL background (Talend, AWS Glue, etc.) is a plus but not required
- Cloud warehouse experience (Snowflake, etc.) is a huge plus
- Careful to evaluate alternative risks and solutions before taking action, optimizing the use of all available resources
- Able to develop solutions that meet business needs and reflect a clear understanding of the objectives, practices, and procedures of the corporation, department, and business unit
Skills:
- Hands-on experience with Databricks, Spark SQL, and the AWS cloud platform, especially S3, EMR, Databricks, Cloudera, etc.
- Experience with shell scripting
- Exceptionally strong analytical and problem-solving skills
- Relevant experience with ETL methods and with retrieving data from dimensional data models and data warehouses
- Strong experience with relational databases and data access methods, especially SQL
- Excellent collaboration and cross-functional leadership skills
- Excellent communication skills, both written and verbal
- Ability to manage multiple initiatives and priorities in a fast-paced, collaborative environment
- Ability to leverage data assets to respond to complex questions that require timely answers
- Working knowledge of migrating relational and dimensional databases to the AWS cloud platform
Interested candidates can share their resume at sankarspstaffings@gmail.com with the below details inline:
Overall Exp:
Relevant Exp:
Current CTC:
Expected CTC:
Notice Period:
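A minimal PySpark sketch of the Spark SQL work described above, reading from an S3 data lake and aggregating with SQL; the paths and view name are hypothetical:

```python
# Minimal sketch: Spark SQL over an S3 data lake (Databricks/EMR style).
# The S3 paths and view name are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sales-rollup").getOrCreate()

# Register raw Parquet data from the lake as a temporary view.
spark.read.parquet("s3://demo-lake/raw/sales/").createOrReplaceTempView("sales")

# Large-scale aggregation expressed in Spark SQL.
daily = spark.sql("""
    SELECT sale_date,
           region,
           SUM(amount) AS total_amount,
           COUNT(*)    AS num_orders
    FROM sales
    GROUP BY sale_date, region
""")

# Write the rollup back to the curated zone, partitioned by date.
daily.write.mode("overwrite").partitionBy("sale_date") \
    .parquet("s3://demo-lake/curated/daily_sales/")
```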
Posted 1 month ago
5.0 - 10.0 years
25 - 30 Lacs
Pune
Remote
- Passionate about TDD (Test First Development)
- Have at least 7 to 10 years of development experience with Python.
- Have at least 2 to 3 years of experience with React.
- Document key business workflows and software designs.
Required Candidate profile
- Have built complex applications with AWS Serverless technologies (AppSync, DynamoDB, DynamoDB Streams, Lambda, Cognito, S3, CloudFront, Route 53, Amplify); see the sketch below.
- Strong knowledge of GraphQL.
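A minimal sketch of the serverless pattern in this stack: a Python Lambda that could back an AppSync GraphQL resolver, persisting an item to DynamoDB. The table name and field names are hypothetical:

```python
# Minimal sketch: a Python Lambda backing an AppSync GraphQL resolver,
# persisting an item to DynamoDB. Table and field names are hypothetical.
import os
import uuid

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table(os.environ.get("TABLE_NAME", "notes"))

def handler(event, context):
    # AppSync direct Lambda resolvers deliver GraphQL args under "arguments".
    args = event.get("arguments", {})
    item = {
        "id": str(uuid.uuid4()),
        "title": args["title"],
        "body": args.get("body", ""),
    }
    table.put_item(Item=item)
    # The returned dict is mapped onto the GraphQL response type.
    return item
```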
Posted 1 month ago
8.0 - 10.0 years
1 - 2 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Urgent Hiring: Senior Software Engineer Java AWS
Location: Bangalore – Domlur (Work from Office – 4 days/week)
Type: Contract (6 months, extendable)
Notice Period: Immediate to 15 days
Open Positions: 2
Required Skills:
- Java (8/11/17), Spring Boot, Microservices Architecture
- Experience with Kafka or Apache Camel
- Minimum 2 years of AWS hands-on experience with EC2, ECS, S3, SQS, SNS, Lambda, DynamoDB, CloudFormation
Experience: 5+ Years
Posted 1 month ago
5.0 - 8.0 years
5 - 8 Lacs
Bengaluru
Work from Office
Develop, test, and maintain applications using Java and Spring Boot. Design and implement microservices architecture. Work with databases to ensure data integrity and performance. Collaborate with cross-functional teams to define, design,
Required Candidate profile
Proficiency in Java programming. Experience with Spring Boot framework. Knowledge of microservices architecture. Familiarity with databases (SQL/NoSQL). Basic understanding of Kafka and S3.
Posted 1 month ago
3.0 - 8.0 years
5 - 15 Lacs
Pune
Work from Office
P2 3 Java Full Stack - Angular (4-6 yrs) DL 20 Java web developers
Responsibilities:
- Design, develop, and maintain REST-based microservices using Java.
- Develop intuitive and responsive user interfaces using modern front-end technologies such as Angular, React, and HTML5.
- Build and optimize robust back-end services, ensuring seamless integration with databases (SQL).
- Deploy and manage cloud-native applications on AWS infrastructure.
- Collaborate with cross-functional teams, including UI/UX designers, DevOps, and product owners, to deliver end-to-end solutions.
- Ensure the application's performance, scalability, and reliability.
- Write clean, maintainable, and efficient code while following best practices, including unit testing and code reviews.
- Troubleshoot, debug, and optimize application code and infrastructure.
- Stay up to date with emerging technologies and industry trends to drive continuous improvement.
Required:
We are seeking highly skilled software engineers with expertise in full-stack development. The ideal candidate will have experience building scalable, cloud-native applications and a strong understanding of modern software development practices.
- Microservice Development: Hands-on experience developing and deploying REST-based microservices using Java frameworks (e.g., Spring and Hibernate).
- Full-Stack Development: Front-end proficiency in Angular, React, and HTML5 for building interactive UIs; back-end expertise in Java for business logic and APIs; strong understanding of SQL and experience with relational databases.
- Cloud Experience: Hands-on experience with AWS services (e.g., EC2, S3, Lambda, RDS) and familiarity with cloud-native architecture and deployment practices.
- Experience with CI/CD tools (Jenkins, GitHub, etc.) and containerization technologies (Docker, Kubernetes).
- Solid understanding of software development principles, including design patterns, clean code, system design, software architecture, and agile methodologies.
- Experience with Advertising, AdTech, Ad Servers (SSP/DSP), OTT, or media streaming is preferred.
Work Experience: P2: 3 to 5 yrs, P3: 5 to 8 yrs, P4: 8 to 12 yrs
Job Location: Pan India
Posted 1 month ago
8.0 - 12.0 years
20 - 30 Lacs
Hyderabad
Work from Office
- Design and development of cloud-hosted web applications for the insurance industry, from high-level architecture and network infrastructure to low-level creation of site layout, user experience, database schema, data structures, workflows, graphics, unit testing, and end-to-end integration testing.
- Working from static application mock-ups and wireframes, develop front-end user interfaces and page templates in HTML5, CSS, Sass, Less, TypeScript, Bootstrap, Angular, and third-party controls like Kendo UI/Infragistics.
- Proficiency in AWS services like Lambda, EC2, S3, and IAM for deploying and managing applications.
- Excellent programming skills in Python, with the ability to develop, maintain, and debug Python-based applications.
- Develop, maintain, and debug applications using .NET Core and C#.
- Stay up to date with the latest industry trends and technologies related to PostgreSQL, AWS, and Python.
- Design and implement risk management business functionality and in-database analytics.
- Identify complex data problems and review related information to develop and evaluate options, then design and implement solutions.
- Design and develop functional and responsive web applications by collaborating with other engineers in the Agile team.
- Develop REST APIs and understand WCF services.
- Prepare documentation and specifications.
Posted 1 month ago
4.0 - 9.0 years
0 - 3 Lacs
Pune
Work from Office
We are seeking a highly skilled and motivated Full-Stack Node.js Developer to join our dynamic engineering team. The ideal candidate will be responsible for designing, developing, and maintaining robust and scalable backend services, APIs, and integrations, as well as contributing to the development of our user interfaces. This role requires strong expertise in Node.js, PostgreSQL, and a solid understanding of various AWS services, including S3, Athena, RDS, and EC2. Experience with Stripe integration for payment processing and a proven ability to both write and consume APIs are essential, along with proficiency in front-end technologies like HTML and CSS.
Key Responsibilities:
- Design, develop, and maintain high-performance, scalable, and secure backend services using Node.js.
- Develop and implement RESTful APIs for various internal and external applications, ensuring high availability and performance.
- Integrate with third-party APIs, including payment gateways like Stripe, and other external services.
- Manage and optimize PostgreSQL databases, including schema design, query optimization, and data migration.
- Work extensively with AWS services, specifically: Amazon S3 to store and manage application data, backups, and other static assets; AWS Athena to develop and execute analytical queries on data stored in S3 for reporting and insights; Amazon RDS (PostgreSQL) to configure, manage, and optimize PostgreSQL instances; and Amazon EC2 to deploy, manage, and scale Node.js applications.
- Develop responsive and engaging user interfaces using HTML and CSS.
- Implement and maintain secure coding practices, including data encryption, authentication, and authorization mechanisms.
- Collaborate with the client and team to define requirements and deliver high-quality software solutions.
- Participate in code reviews, ensuring code quality, maintainability, and adherence to best practices.
- Troubleshoot and debug production issues, providing timely resolutions.
- Contribute to the continuous improvement of our development processes and tools.
Qualifications:
Technical Skills:
- Proven experience as a Node.js developer with a strong understanding of its asynchronous nature, event loop, and best practices.
- Expertise in database design, development, and optimization with PostgreSQL.
- Hands-on experience with AWS services: S3 (object storage and management), Athena (serverless query service for S3 data), RDS for PostgreSQL (managed relational database service), and EC2 (virtual servers for deploying applications).
- Proficiency in designing, building, and consuming RESTful APIs.
- Experience integrating with payment processing platforms, specifically Stripe.
- Strong proficiency in HTML5 and CSS3, including responsive design principles.
- Familiarity with version control systems (Git).
- Understanding of the software development lifecycle (SDLC) and agile methodologies.
- Experience with Redis for caching, session management, and task scheduling.
Experience:
- 5+ years of experience in full-stack development with Node.js.
- 3+ years of experience working with PostgreSQL.
- 5+ years of experience with AWS cloud services.
Nice to have:
- Familiarity with other AWS services (e.g., Lambda, SQS, SNS).
- Experience with microservices architecture.
- Familiarity with JavaScript frameworks/libraries (e.g., React, Angular, Vue.js).
Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and interpersonal abilities.
- Ability to work independently and as part of a team.
- Proactive and eager to learn new technologies.
Posted 1 month ago
5.0 - 7.0 years
3 - 7 Lacs
Hyderabad, Bengaluru
Work from Office
Key Responsibilities:
- Design, implement, and maintain cloud-based infrastructure on AWS.
- Manage and monitor AWS services, including EC2, S3, Lambda, RDS, CloudFormation, VPC, etc.
- Develop automation scripts for deployment, monitoring, and scaling using AWS services.
- Collaborate with DevOps teams to automate build, test, and deployment pipelines.
- Ensure the security and compliance of cloud environments using AWS security best practices.
- Optimize cloud resource usage to reduce costs while maintaining high performance.
- Troubleshoot issues related to cloud infrastructure and services.
- Participate in capacity planning and disaster recovery strategies.
- Monitor application performance and make necessary adjustments to ensure optimal performance.
- Stay current with new AWS features and tools and evaluate their applicability for the organization.
Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proven experience as an AWS Engineer or in a similar cloud infrastructure role.
- In-depth knowledge of AWS services, including EC2, S3, RDS, Lambda, VPC, CloudWatch, etc.
- Proficiency in scripting languages such as Python, Shell, or Bash.
- Experience with infrastructure-as-code tools like Terraform or AWS CloudFormation.
- Strong understanding of networking concepts, cloud security, and best practices.
- Familiarity with containerization technologies (e.g., Docker, Kubernetes) is a plus.
- Excellent problem-solving, analytical, and troubleshooting skills.
- Strong communication skills, both written and verbal.
- AWS certifications (AWS Certified Solutions Architect, AWS Certified DevOps Engineer, etc.) are preferred.
Preferred Skills:
- Experience with serverless architectures and services.
- Knowledge of CI/CD pipelines and DevOps methodologies.
- Experience with monitoring and logging tools like CloudWatch, Datadog, or Prometheus.
- Knowledge in AWS FinOps.
Posted 1 month ago
10.0 - 15.0 years
12 - 17 Lacs
Bengaluru
Work from Office
Grade: 7
Purpose of your role
This role sits within the ISS Data Platform Team. The Data Platform team is responsible for building and maintaining the platform that enables the ISS business to operate. This role is appropriate for a Lead Data Engineer capable of taking ownership and delivering a subsection of the wider data platform.
Key Responsibilities
- Design, develop, and maintain scalable data pipelines and architectures to support data ingestion, integration, and analytics.
- Be accountable for technical delivery and take ownership of solutions.
- Lead a team of senior and junior developers, providing mentorship and guidance.
- Collaborate with enterprise architects, business analysts, and stakeholders to understand data requirements, validate designs, and communicate progress.
- Drive technical innovation within the department to increase code reusability, code quality, and developer productivity.
- Challenge the status quo by bringing the very latest data engineering practices and techniques.
Essential Skills and Experience
Core Technical Skills
- Expert in leveraging cloud-based data platform (Snowflake, Databricks) capabilities to create an enterprise lakehouse.
- Advanced expertise with the AWS ecosystem and experience using a variety of core AWS data services like Lambda, EMR, MSK, Glue, and S3.
- Experience designing event-based or streaming data architectures using Kafka (see the sketch below).
- Advanced expertise in Python and SQL. Open to expertise in Java/Scala, but enterprise experience of Python is required.
- Expert in designing, building, and using CI/CD pipelines to deploy infrastructure (Terraform) and pipelines with test automation.
- Data Security & Performance Optimization: Experience implementing data access controls to meet regulatory requirements.
- Experience using both RDBMS (Oracle, Postgres, MSSQL) and NoSQL (DynamoDB, OpenSearch, Redis) offerings.
- Experience implementing CDC ingestion.
- Experience using orchestration tools (Airflow, Control-M, etc.).
Bonus Technical Skills
- Strong experience in containerisation and deploying applications to Kubernetes.
- Strong experience in API development using Python-based frameworks like FastAPI.
Key Soft Skills
- Problem-Solving: Leadership experience in problem-solving and technical decision-making.
- Communication: Strong in strategic communication and stakeholder engagement.
- Project Management: Experienced in overseeing project lifecycles, working with Project Managers to manage resources.
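As a flavor of the event-driven work above, a minimal sketch using the kafka-python client (an assumed library choice; confluent-kafka would work equally well). The broker address, topic, and event shape are hypothetical:

```python
# Minimal sketch of Kafka event production with the kafka-python client
# (an assumed library choice). Broker address and topic are hypothetical.
import json
from datetime import datetime, timezone

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="broker.example.internal:9092",
    # Serialize event payloads as JSON bytes.
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    # Key by entity so events for one entity stay ordered within a partition.
    key_serializer=lambda k: k.encode("utf-8"),
)

event = {
    "event_type": "position_updated",
    "account_id": "ACC-123",
    "occurred_at": datetime.now(timezone.utc).isoformat(),
}
producer.send("position-events", key=event["account_id"], value=event)
producer.flush()  # block until the broker acknowledges
```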
Posted 1 month ago
4.0 - 9.0 years
25 - 35 Lacs
Bengaluru
Hybrid
Dodge Position Title: Software Engineer
STG Labs Position Title:
Location: Bangalore, India
About Dodge
Dodge Construction Network exists to deliver the comprehensive data and connections the construction industry needs to build thriving communities. Our legacy is deeply rooted in empowering our customers with transformative insights, igniting their journey towards unparalleled business expansion and success. We serve decision-makers who seek reliable growth and who value relationships built on trust and quality. By combining our proprietary data with cutting-edge software, we deliver to our customers the essential intelligence needed to excel within their respective landscapes. We propel the construction industry forward by transforming data into tangible guidance, driving unparalleled advancement. Dodge is the catalyst for modern construction. https://www.construction.com/
About Symphony Technology Group (STG)
STG is a Silicon Valley (California) based private equity firm with a long and successful track record of transforming high-potential software and software-enabled services companies, as well as insights-oriented companies, into definitive market leaders. The firm brings expertise, flexibility, and resources to build strategic value and unlock the potential of innovative companies. Partnering to build customer-centric, market-winning portfolio companies, STG creates sustainable foundations for growth that bring value to all existing and future stakeholders. The firm is dedicated to transforming and building outstanding technology companies in partnership with world-class management teams. With over $5.0 billion in assets under management, including a recently raised $2.0 billion fund, STG's expansive portfolio has consisted of more than 30 global companies. STG Labs is the incubation center for many of STG's portfolio companies, building their engineering, professional services, and support delivery teams in India. STG Labs offers an entrepreneurial start-up environment for software and AI engineers, data scientists and analysts, and project and product managers, and provides a unique opportunity to work directly for a software or technology company. Based in Bangalore, STG Labs supports hybrid working. https://stg.com
Roles and Responsibilities
- Design, build, and maintain scalable data pipelines and ETL processes leveraging AWS services.
- Collaborate closely with data architects, business analysts, and DevOps teams to translate business requirements into technical data solutions.
- Apply SDLC best practices, including planning, coding standards, code reviews, testing, and deployment.
- Automate workflows and optimize data pipelines for efficiency, performance, and reliability.
- Implement monitoring and logging to ensure the health and performance of data systems.
- Ensure data security and compliance through adherence to industry and internal standards.
- Participate actively in agile development processes and contribute to sprint planning, stand-ups, retrospectives, and documentation efforts.
Qualifications
Hands-on working knowledge and experience is required in:
- Data Structures
- Memory Management
- Basic Algorithms (Search, Sort, etc.)
Hands-on working knowledge and experience is preferred in:
- AWS Data Services: Glue, EMR, Kinesis, Lambda, Athena, Redshift, S3
- Scripting & Programming Languages: Python, Bash, SQL
- Version Control & CI/CD Tools: Git, Jenkins, Bitbucket
- Database Systems & Data Engineering: data modeling, data warehousing principles
- Infrastructure as Code (IaC): Terraform, CloudFormation
- Containerization & Orchestration: Docker, Kubernetes
Certifications Preferred: AWS certifications (Data Analytics Specialty, Solutions Architect Associate).
Posted 1 month ago
5.0 - 8.0 years
30 - 35 Lacs
Hyderabad
Work from Office
Requirements
- 4+ years of overall experience in software development.
- 3+ years of hands-on experience in Python (OOP concepts), with extensive ability to write complex Python code.
- 2+ years of hands-on experience in AWS (Lambda, S3, EC2, Step Functions); see the sketch below.
- Knowledge of code versioning tools (Git) and databases.
- Skilled in working with JIRA workflows.
- Experience in Azure is good to have.
- Good logical thinking.
- Strong analytical and debugging skills.
- Strong communication skills.
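A minimal boto3 sketch of the Step Functions work listed above, starting a state machine execution and polling its status; the state machine ARN, region, and input payload are hypothetical:

```python
# Minimal sketch: start a Step Functions execution and poll its status.
# The state machine ARN, region, and input payload are hypothetical.
import json
import time

import boto3

sfn = boto3.client("stepfunctions", region_name="ap-south-1")

resp = sfn.start_execution(
    stateMachineArn="arn:aws:states:ap-south-1:123456789012:stateMachine:demo-etl",
    input=json.dumps({"source_key": "raw/2024/01/orders.csv"}),
)

# Poll until the execution finishes (fine for a short demo; prefer
# EventBridge or callbacks for production workflows).
while True:
    desc = sfn.describe_execution(executionArn=resp["executionArn"])
    if desc["status"] != "RUNNING":
        print("finished with status:", desc["status"])
        break
    time.sleep(5)
```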
Posted 1 month ago
7.0 - 10.0 years
10 - 15 Lacs
Bengaluru
Work from Office
Role Overview
We are seeking an experienced Data Engineer with 7-10 years of experience to design, develop, and optimize data pipelines while integrating machine learning (ML) capabilities into production workflows. The ideal candidate will have a strong background in data engineering, big data technologies, cloud platforms, and ML model deployment. This role requires expertise in building scalable data architectures, processing large datasets, and supporting machine learning operations (MLOps) to enable data-driven decision-making.
Key Responsibilities
Data Engineering & Pipeline Development
- Design, develop, and maintain scalable, robust, and efficient data pipelines for batch and real-time data processing.
- Build and optimize ETL/ELT workflows to extract, transform, and load structured and unstructured data from multiple sources.
- Work with distributed data processing frameworks like Apache Spark, Hadoop, or Dask for large-scale data processing.
- Ensure data integrity, quality, and security across the data pipelines.
- Implement data governance, cataloging, and lineage tracking using appropriate tools.
Machine Learning Integration
- Collaborate with data scientists to deploy, monitor, and optimize ML models in production.
- Design and implement feature engineering pipelines to improve model performance.
- Build and maintain MLOps workflows, including model versioning, retraining, and performance tracking (see the sketch below).
- Optimize ML model inference for low-latency and high-throughput applications.
- Work with ML frameworks such as TensorFlow, PyTorch, and Scikit-learn, and deployment tools like Kubeflow, MLflow, or SageMaker.
Cloud & Big Data Technologies
- Architect and manage cloud-based data solutions using AWS, Azure, or GCP.
- Utilize serverless computing (AWS Lambda, Azure Functions) and containerization (Docker, Kubernetes) for scalable deployment.
- Work with data lakehouses (Delta Lake, Iceberg, Hudi) for efficient storage and retrieval.
Database & Storage Management
- Design and optimize relational (PostgreSQL, MySQL, SQL Server) and NoSQL (MongoDB, Cassandra, DynamoDB) databases.
- Manage and optimize data warehouses (Snowflake, BigQuery, Redshift, Databricks) for analytical workloads.
- Implement data partitioning, indexing, and query optimizations for performance improvements.
Collaboration & Best Practices
- Work closely with data scientists, software engineers, and DevOps teams to develop scalable and reusable data solutions.
- Implement CI/CD pipelines for automated testing, deployment, and monitoring of data workflows.
- Follow best practices in software engineering, data modeling, and documentation.
- Continuously improve the data infrastructure by researching and adopting new technologies.
Required Skills & Qualifications
Technical Skills:
- Programming Languages: Python, SQL, Scala, Java
- Big Data Technologies: Apache Spark, Hadoop, Dask, Kafka
- Cloud Platforms: AWS (Glue, S3, EMR, Lambda), Azure (Data Factory, Synapse), GCP (BigQuery, Dataflow)
- Data Warehousing: Snowflake, Redshift, BigQuery, Databricks
- Databases: PostgreSQL, MySQL, MongoDB, Cassandra
- ETL/ELT Tools: Airflow, dbt, Talend, Informatica
- Machine Learning Tools: MLflow, Kubeflow, TensorFlow, PyTorch, Scikit-learn
- MLOps & Model Deployment: Docker, Kubernetes, SageMaker, Vertex AI
- DevOps & CI/CD: Git, Jenkins, Terraform, CloudFormation
Soft Skills:
- Strong analytical and problem-solving abilities.
- Excellent collaboration and communication skills.
- Ability to work in an agile and cross-functional team environment.
- Strong documentation and technical writing skills.
Preferred Qualifications
- Experience with real-time streaming solutions like Apache Flink or Spark Streaming.
- Hands-on experience with vector databases and embeddings for ML-powered applications.
- Knowledge of data security, privacy, and compliance frameworks (GDPR, HIPAA).
- Experience with GraphQL and REST API development for data services.
- Understanding of LLMs and AI-driven data analytics.
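A minimal MLflow sketch of the model-versioning workflow mentioned above; the experiment name, params, and data are hypothetical, and the run logs to the local ./mlruns store unless a tracking server is configured:

```python
# Minimal sketch: tracking a training run with MLflow (logs to the local
# ./mlruns store unless MLFLOW_TRACKING_URI points at a server).
# Experiment name, params, and data are hypothetical placeholders.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

mlflow.set_experiment("demo-churn-model")
with mlflow.start_run():
    params = {"C": 0.5, "max_iter": 200}
    model = LogisticRegression(**params).fit(X_train, y_train)

    # Log params, a metric, and the model artifact for versioned retrieval.
    mlflow.log_params(params)
    mlflow.log_metric("test_accuracy", model.score(X_test, y_test))
    mlflow.sklearn.log_model(model, artifact_path="model")
```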
Posted 2 months ago
9.0 - 12.0 years
20 - 25 Lacs
Hyderabad
Work from Office
Design, manage, and optimize our cloud infrastructure to ensure high availability, reliability, and scalability of services. Architect, deploy, and maintain AWS infrastructure using Infrastructure-as-Code (IaC) tools such as Terraform or CloudFormation.
Required Candidate profile
Experience in a Site Reliability Engineer or DevOps role, with a focus on AWS cloud infrastructure and AWS services such as EC2, S3, RDS, VPC, Lambda, CloudFormation, and CloudWatch.
Posted 2 months ago
5.0 - 8.0 years
15 - 22 Lacs
Chennai
Work from Office
We are hiring passionate DevOps professionals with strong Kubernetes and AWS experience. Immediate joiners or candidates with 15 days' notice preferred.
Key Responsibilities:
- Administer and manage Kubernetes clusters (CKA preferred)
- Implement Infrastructure as Code using Terraform or CloudFormation
- Automate CI/CD pipelines using Jenkins, GitOps, and Helm charts
- Manage and optimize AWS services: ALB/NLB, Lambda, RDS, S3, Route 53, API Gateway, CloudFront
- Monitor systems with tools like Datadog and Prometheus
- Apply security best practices and ensure cost optimization
- Collaborate with Agile teams to deliver scalable and reliable infrastructure
Required Skills:
- Kubernetes (must)
- AWS (EC2, RDS, S3, Lambda, etc.)
- Docker, Helm, Jenkins, Git
- Terraform or CloudFormation
- MongoDB Atlas (nice to have)
- Monitoring: Prometheus, Datadog
Why Join Us?
- Competitive pay (up to 22 LPA)
- Opportunity to work on cutting-edge cloud-native tech
- Fast-paced, agile environment
- Immediate onboarding
Posted 2 months ago
3.0 - 8.0 years
9 - 18 Lacs
Hyderabad
Hybrid
Data Engineer with Python development experience
Experience: 3+ Years
Mode: Hybrid (2-3 days/week)
Location: Hyderabad
Key Responsibilities
- Develop, test, and deploy data processing pipelines using AWS Serverless technologies such as AWS Lambda, Step Functions, DynamoDB, and S3 (see the sketch below).
- Implement ETL processes to transform and process structured and unstructured data efficiently.
- Collaborate with business analysts and other developers to understand requirements and deliver solutions that meet business needs.
- Write clean, maintainable, and well-documented code following best practices.
- Monitor and optimize the performance and cost of serverless applications.
- Ensure high availability and reliability of the pipeline through proper design and error-handling mechanisms.
- Troubleshoot and debug issues in serverless applications and data workflows.
- Stay up to date with emerging technologies in the AWS and serverless ecosystem to recommend improvements.
Required Skills and Experience
- 3-5 years of hands-on Python development experience, including experience with libraries like boto3, pandas, or similar tools for data processing.
- Strong knowledge of AWS services, especially Lambda, S3, DynamoDB, Step Functions, SNS, SQS, and API Gateway.
- Experience building data pipelines or workflows to process and transform large datasets.
- Familiarity with serverless architecture and event-driven programming.
- Knowledge of best practices for designing secure and scalable serverless applications.
- Proficiency in version control systems (e.g., Git) and collaboration tools.
- Understanding of CI/CD pipelines and DevOps practices.
- Strong debugging and problem-solving skills.
- Familiarity with database systems, both SQL (e.g., RDS) and NoSQL (e.g., DynamoDB).
Preferred Qualifications
- AWS certifications (e.g., AWS Certified Developer - Associate or AWS Certified Solutions Architect - Associate).
- Familiarity with testing frameworks (e.g., pytest) and ensuring test coverage for Python applications.
- Experience with Infrastructure as Code (IaC) tools such as AWS CDK or CloudFormation.
- Knowledge of monitoring and logging tools.
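A minimal sketch of the event-driven pattern this listing describes: a Lambda handler triggered by an S3 upload that records object metadata in DynamoDB. The table name and attributes are hypothetical:

```python
# Minimal sketch: Lambda triggered by an S3 ObjectCreated event, recording
# object metadata in DynamoDB. Table name and attributes are hypothetical.
import os

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table(os.environ.get("TABLE_NAME", "ingested_files"))

def handler(event, context):
    # S3 event notifications deliver one or more records per invocation.
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        size = record["s3"]["object"].get("size", 0)

        table.put_item(Item={
            "file_key": key,
            "bucket": bucket,
            "size_bytes": size,
            "status": "RECEIVED",
        })
    return {"processed": len(event["Records"])}
```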
Posted 2 months ago
6.0 - 10.0 years
22 - 25 Lacs
Bengaluru
Work from Office
- Proficiency in Python, SQL, data transformation, and scripting.
- Experience with data pipeline and workflow tools such as Apache Airflow, Flyte, or Argo.
- Hands-on experience with Spark/PySpark, Docker, and Kubernetes.
- Strong experience with relational databases (e.g., SQL Server, PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra).
- Expertise in cloud data platforms such as AWS (Glue, Redshift, S3), Azure (Data Factory, Synapse), or GCP (BigQuery, Dataflow).
Posted 2 months ago
6.0 - 8.0 years
8 - 10 Lacs
Pune
Work from Office
Hello Visionary!
We empower our people to stay resilient and relevant in a constantly changing world. We're looking for people who are always searching for creative ways to grow and learn, people who want to make a real impact, now and in the future. Does that sound like you? Then it seems like you'd make a great addition to our vibrant team.
Siemens founded the new business unit Siemens Foundational Technologies (formerly known as Siemens IoT Services) on April 1, 2019, with its headquarters in Munich, Germany. It has been crafted to unlock the digital future of its clients by offering end-to-end support on their outstanding digitalization journey. Siemens Foundational Technologies is a strategic advisor and a trusted implementation partner in digital transformation and industrial IoT, with a global network of more than 8,000 employees in 10 countries and 21 offices. Highly skilled and experienced specialists offer services ranging from consulting to craft & prototyping to solution & implementation and operation, everything out of one hand.
We are looking for a Senior DevOps Engineer.
You'll make a difference by:
Key Responsibilities:
- Design, implement, and maintain CI/CD pipelines using GitLab, including configuring GitLab Runners.
- Build, manage, and scale containerized applications using Docker, Kubernetes, and Helm.
- Automate infrastructure provisioning and management with Terraform.
- Manage and optimize cloud-based environments, especially AWS.
- Administer and optimize Kafka clusters for data streaming and processing.
- Oversee the performance and reliability of databases and Linux environments.
- Monitor and enhance system health using tools like Prometheus and Grafana.
- Collaborate with cross-functional teams to implement DevOps best practices.
- Ensure system security, scalability, and disaster recovery readiness.
- Troubleshoot and resolve technical issues across the infrastructure.
Required Skills & Qualifications:
- 6-8 years of experience in DevOps, system administration, or a related role.
- Expertise in CI/CD tools and workflows, especially GitLab Pipelines and GitLab Runners.
- Proficiency with containerization and orchestration tools like Docker, Kubernetes, and Helm.
- Strong hands-on experience with Docker Swarm, including creating and managing Docker clusters, and packaging Docker images for deployment.
- Strong hands-on experience with Kubernetes, including managing clusters and deploying applications.
- Strong hands-on experience with Terraform for Infrastructure as Code (IaC).
- In-depth knowledge of AWS services, including EC2, S3, IAM, EKS, MSK, Route 53, and VPC.
- Solid experience in managing and maintaining Kafka ecosystems.
- Strong Linux system administration skills.
- Proficiency in database management, optimization, and troubleshooting.
- Experience with monitoring tools like Prometheus and Grafana.
- Excellent scripting skills in languages like Bash and Python.
- Strong problem-solving skills and the ability to work in a fast-paced environment.
- Excellent communication skills and a collaborative mindset.
Good to Have Skills:
- Experience with Keycloak for identity and access management.
- Familiarity with Nginx or Traefik for reverse proxy and load balancing.
- Hands-on experience in PostgreSQL maintenance, including backups, tuning, and troubleshooting.
- Knowledge of the railway domain, including industry-specific challenges and standards.
- Experience in implementing and managing high-availability architectures.
- Exposure to distributed systems and microservices architecture.
Desired Skills:
- 5-8 years of experience is required.
- Great communication skills.
- Analytical and problem-solving skills.
This role is based in Pune and is an individual contributor role. You might be required to visit other locations within India and outside. In return, you'll get the chance to work with teams impacting entire cities, countries, and the shape of things to come.
Posted 2 months ago
5.0 - 10.0 years
7 - 12 Lacs
Pune
Work from Office
We are looking for a Cloud Application Developer.
You'll make a difference by:
- Developing backend (Python) and smaller frontend applications (Angular) on our AWS-based managed cloud environment.
- Contributing hands-on to the development of larger composite applications.
- Operating and troubleshooting existing applications.
- Breaking down and implementing high-level concepts created by architects and the PO.
- Having exposure to AWS cloud platform services (Lambda, ECS, S3, RDS/DynamoDB); certification is a plus.
- Having practical experience in setting up and maintaining CI/CD pipelines, containerization (Docker), and version control systems (Git).
- Having proficiency in full-stack technologies (front-end Angular, databases, APIs).
You'll win us over by:
- Holding a BE/B.Tech/MCA/M.Tech/M.Sc degree with a good academic record.
- 5+ years of experience in software development, with a focus on Python.
- Familiarity with agile development processes and principles.
Optional skills: C#, Kubernetes, DevOps engineering, Infrastructure-as-Code.
Posted 2 months ago