4.0 - 9.0 years
10 - 18 Lacs
Noida
Work from Office
Precognitas Health Pvt. Ltd., a fully owned subsidiary of Foresight Health Solutions LLC, is seeking a Data Engineer to build and optimize the data pipelines, processing frameworks, and analytics infrastructure that power critical healthcare insights. Are you a bright, energetic, and skilled data engineer who wants to make a meaningful impact in a dynamic environment? Do you enjoy designing and implementing scalable data architectures, ML pipelines, automated ETL workflows, and cloud-native solutions that process large datasets efficiently? Are you passionate about transforming raw data into actionable insights that drive better healthcare outcomes? If so, join us! You'll play a crucial role in shaping our data strategy, optimizing data ingestion, and ensuring seamless data flow across our systems while leveraging the latest cloud and big data technologies.

Required Skills & Experience:
- 4+ years of experience in data engineering, data pipelines, and ETL/ELT workflows.
- Strong Python programming skills, with expertise in NumPy, Pandas, and data manipulation techniques.
- Hands-on experience with orchestration tools such as Prefect, Apache Airflow, or AWS Step Functions for managing complex workflows.
- Proficiency in AWS services, including AWS Glue, AWS Batch, S3, Lambda, RDS, Athena, and Redshift.
- Experience with Docker containerization and Kubernetes for scalable, efficient data processing.
- Strong understanding of data processing layers, batch and streaming data architectures, and analytics frameworks.
- Expertise in SQL and NoSQL databases, query optimization, and data modeling for structured and unstructured data.
- Familiarity with big data technologies such as Apache Spark, Hadoop, or similar frameworks.
- Experience implementing data validation, quality checks, and observability for robust data pipelines.
- Strong knowledge of Infrastructure as Code (IaC) using Terraform or AWS CDK for managing cloud-based data infrastructure.
- Ability to work with distributed systems, event-driven architectures (Kafka, Kinesis), and scalable data storage solutions.
- Experience with CI/CD for data workflows, including version control (Git), automated testing, and deployment pipelines.
- Knowledge of data security, encryption, and access control best practices in cloud environments.
- Strong problem-solving skills and the ability to collaborate with cross-functional teams, including data scientists and software engineers.

Compensation will be commensurate with experience. If you are interested, please send your application to jobs@precognitas.com. For more information about our work, visit www.caliper.care
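The "data validation, quality checks, and observability" requirement above describes a standard ETL pattern. As a rough, stdlib-only sketch of that pattern (the table, columns, and sample data are invented for illustration, not from the posting):

```python
# Hypothetical sketch of an extract -> validate -> load pipeline with a
# basic quality check. Names and values are illustrative only.
import csv
import io
import sqlite3

RAW = """patient_id,visit_date,hba1c
p001,2024-01-15,6.8
p002,2024-01-16,
p003,2024-01-17,7.4
"""

def extract(text):
    return list(csv.DictReader(io.StringIO(text)))

def validate(rows):
    """Quality check: drop rows missing the measurement, count rejects."""
    clean = [r for r in rows if r["hba1c"]]
    return clean, len(rows) - len(clean)

def load(rows, conn):
    conn.execute("CREATE TABLE labs (patient_id TEXT, visit_date TEXT, hba1c REAL)")
    conn.executemany(
        "INSERT INTO labs VALUES (:patient_id, :visit_date, :hba1c)", rows
    )

conn = sqlite3.connect(":memory:")
rows, rejected = validate(extract(RAW))
load(rows, conn)
loaded = conn.execute("SELECT COUNT(*) FROM labs").fetchone()[0]
print(loaded, rejected)  # 2 1  -> two rows loaded, one rejected
```

In a real pipeline, the reject count would feed a metrics/observability system rather than a print statement.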
Posted 19 hours ago
10.0 - 15.0 years
15 - 25 Lacs
Kolkata, Hyderabad, Bengaluru
Hybrid
Experience: 10+ Years

Role Overview: We are seeking an experienced AWS Data & Analytics Architect with a strong delivery background and excellent communication skills. The ideal candidate will have over 10 years of experience and a proven track record of managing teams and client relationships. You will be responsible for leading data modernization and transformation projects using AWS services.

Key Responsibilities:
- Lead and architect data modernization/transformation projects using AWS services.
- Manage and mentor a team of data engineers and analysts.
- Build and maintain strong client relationships, ensuring successful project delivery.
- Design and implement scalable data architectures and solutions.
- Oversee the migration of large datasets to AWS, ensuring data integrity and security.
- Collaborate with stakeholders to understand business requirements and translate them into technical solutions.
- Ensure best practices in data management and governance are followed.

Required Skills and Experience:
- 10+ years of experience in data architecture and analytics.
- Hands-on experience with AWS services such as Redshift, S3, Glue, Lambda, RDS, and others.
- Proven experience delivering 1-2 large data migration/modernization projects using AWS.
- Strong leadership and team management skills.
- Excellent communication and interpersonal skills.
- Deep understanding of data modeling, ETL processes, and data warehousing.
- Experience with data governance and security best practices.
- Ability to work in a fast-paced, dynamic environment.

Preferred Qualifications:
- AWS Certified Solutions Architect – Professional or AWS Certified Big Data – Specialty.
- Experience with other cloud platforms (e.g., Azure, GCP) is a plus.
- Familiarity with machine learning and AI technologies.
Posted 1 day ago
4.0 - 9.0 years
11 - 12 Lacs
Hyderabad
Work from Office
We are seeking a highly skilled DevOps Engineer to join our dynamic development team. In this role, you will be responsible for designing, developing, and maintaining both frontend and backend components of our applications using DevOps and associated technologies. You will collaborate with cross-functional teams to deliver robust, scalable, and high-performing software solutions that meet our business needs. The ideal candidate will have a strong background in DevOps, experience with modern frontend frameworks, and a passion for full-stack development.

Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 4 to 9+ years of experience in full-stack development, with a strong focus on DevOps.

DevOps with AWS Data Engineer - Roles & Responsibilities:
- Use AWS services such as EC2, VPC, S3, IAM, RDS, and Route 53.
- Automate infrastructure using Infrastructure as Code (IaC) tools like Terraform or AWS CloudFormation.
- Build and maintain CI/CD pipelines using tools such as AWS CodePipeline, Jenkins, or GitLab CI/CD.
- Automate build, test, and deployment processes for Java applications.
- Use Ansible, Chef, or AWS Systems Manager for managing configurations across environments.
- Containerize Java apps using Docker; deploy and manage containers using Amazon ECS, EKS (Kubernetes), or Fargate.
- Monitoring & logging using Amazon CloudWatch, Prometheus + Grafana, the ELK Stack (Elasticsearch, Logstash, Kibana), and AWS X-Ray for distributed tracing.
- Manage access with IAM roles/policies; use AWS Secrets Manager / Parameter Store for managing credentials.
- Enforce security best practices, encryption, and audits.
- Automate backups for databases and services using AWS Backup, RDS Snapshots, and S3 lifecycle rules.
- Implement Disaster Recovery (DR) strategies.
- Work cross-functionally and closely with development teams to integrate DevOps practices.
- Document pipelines, architecture, and troubleshooting runbooks.
- Monitor and optimize AWS resource usage using AWS Cost Explorer, Budgets, and Savings Plans.

Must-Have Skills:
- Experience working on Linux-based infrastructure.
- Excellent understanding of Ruby, Python, Perl, and Java.
- Configuring and managing databases such as MySQL and MongoDB.
- Excellent troubleshooting skills.
- Selecting and deploying appropriate CI/CD tools.
- Working knowledge of various tools, open-source technologies, and cloud services.
- Awareness of critical concepts in DevOps and Agile principles.
- Managing stakeholders and external interfaces.
- Setting up tools and required infrastructure.
- Defining and setting development, testing, release, update, and support processes for DevOps operation.
- Technical skills to review, verify, and validate the software code developed in the project.

Interview Mode: F2F for candidates residing in Hyderabad / Zoom for other states
Location: 43/A, MLA Colony, Road No. 12, Banjara Hills, 500034
Time: 2 - 4 PM
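The monitoring-and-logging responsibility above usually reduces to turning raw logs into metrics and alerts. As a minimal, platform-agnostic sketch of that step in pure Python (the log format and the 5% threshold are invented for illustration; CloudWatch or Prometheus would do this at scale):

```python
# Hypothetical sketch of log-to-metric extraction: parse structured log
# lines, compute a server-error rate, and decide whether to alert.
import re

LOG = """\
2024-06-01T10:00:01Z status=200 path=/api/orders
2024-06-01T10:00:02Z status=500 path=/api/orders
2024-06-01T10:00:03Z status=200 path=/health
2024-06-01T10:00:04Z status=503 path=/api/orders
"""

STATUS_RE = re.compile(r"status=(\d{3})")

def error_rate(log_text):
    """Fraction of requests whose status code is a 5xx server error."""
    statuses = [int(m.group(1)) for m in STATUS_RE.finditer(log_text)]
    errors = sum(1 for s in statuses if s >= 500)
    return errors / len(statuses)

rate = error_rate(LOG)
should_alert = rate > 0.05  # illustrative alert threshold: 5% server errors
print(f"{rate:.0%} errors, alert={should_alert}")  # 50% errors, alert=True
```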
Posted 1 day ago
7.0 - 12.0 years
15 - 25 Lacs
Hyderabad, Bengaluru, Mumbai (All Areas)
Hybrid
Job Summary: We are seeking a highly motivated and experienced Senior Data Engineer to join our team. This role requires a deep curiosity about our business and a passion for technology and innovation. You will be responsible for designing and developing robust, scalable data engineering solutions that drive our business intelligence and data-driven decision-making processes. If you thrive in a dynamic environment and have a strong desire to deliver top-notch data solutions, we want to hear from you.

Key Responsibilities:
- Collaborate with agile teams to design and develop cutting-edge data engineering solutions.
- Build and maintain distributed, low-latency, reliable data pipelines, ensuring high availability and timely delivery of data.
- Design and implement optimized data engineering solutions for Big Data workloads to handle increasing data volumes and complexities.
- Develop high-performance, real-time data ingestion solutions for streaming workloads.
- Adhere to best practices and established design patterns across all data engineering initiatives.
- Ensure code quality through elegant design, efficient coding, and performance optimization.
- Focus on data quality and consistency by implementing monitoring processes and systems.
- Produce detailed design and test documentation, including Data Flow Diagrams, Technical Design Specs, and Source-to-Target Mapping documents.
- Perform data analysis to troubleshoot and resolve data-related issues.
- Automate data engineering pipelines and data validation processes to eliminate manual intervention.
- Implement data security and privacy measures, including access controls, key management, and encryption techniques.
- Stay current with technology trends, experiment with new tools, and educate team members.
- Collaborate with analytics and business teams to improve data models and enhance data accessibility.
- Communicate effectively with both technical and non-technical stakeholders.

Qualifications:
- Education: Bachelor's degree in Computer Science, Computer Engineering, or a related field.
- Experience: Minimum of 5 years in architecting, designing, and building data engineering solutions and data platforms.
- Proven experience building Lakehouses or Data Warehouses on platforms like Databricks or Snowflake.
- Expertise in designing and building highly optimized batch/streaming data pipelines using Databricks.
- Proficiency with data acquisition and transformation tools such as Fivetran and dbt.
- Strong experience building efficient data engineering pipelines using Python and PySpark.
- Experience with distributed data processing frameworks such as Apache Hadoop, Apache Spark, or Flink.
- Familiarity with real-time data stream processing using tools like Apache Kafka, Kinesis, or Spark Structured Streaming.
- Experience with AWS services, including S3, EC2, EMR, Lambda, RDS, DynamoDB, Redshift, and Glue Catalog.
- Expertise in advanced SQL programming and performance tuning.

Key Skills:
- Strong problem-solving abilities and perseverance in the face of ambiguity.
- Excellent emotional intelligence and interpersonal skills.
- Ability to build and maintain productive relationships with internal and external stakeholders.
- A self-starter mentality with a focus on growth and quick learning.
- Passion for operational products and creating outstanding employee experiences.
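"Advanced SQL programming" in postings like the one above typically means constructs such as window functions. A small, self-contained sketch using Python's built-in sqlite3 (the table and rows are invented for illustration; Redshift or Databricks SQL use the same `OVER` syntax):

```python
# Hypothetical window-function example: a per-customer running total
# computed with SUM() OVER (PARTITION BY ... ORDER BY ...).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, day INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [("alice", 1, 10.0), ("alice", 2, 15.0), ("bob", 1, 7.0), ("alice", 3, 5.0)],
)

rows = conn.execute("""
    SELECT customer, day,
           SUM(amount) OVER (
               PARTITION BY customer ORDER BY day
           ) AS running_total
    FROM orders
    ORDER BY customer, day
""").fetchall()

for r in rows:
    print(r)
# alice accumulates 10.0 -> 25.0 -> 30.0; bob stays at 7.0
```

(Window functions require SQLite 3.25+, which ships with modern Python builds.)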
Posted 2 days ago
5.0 - 10.0 years
16 - 20 Lacs
Mumbai, Goregaon
Work from Office
Role Overview
We are seeking a highly skilled Engineering Manager with deep expertise in the MERN stack (MongoDB, Express, React, Node.js), AWS infrastructure, and DevOps practices. This role requires both hands-on technical leadership and strong people management to lead a team of engineers building scalable, high-performance applications.

Key Responsibilities
- Lead, mentor, and manage a team of full-stack developers working primarily with MERN.
- Own architecture decisions, code quality, and engineering practices across multiple microservices.
- Collaborate with Product, Design, and QA teams to define and deliver on product roadmaps.
- Implement CI/CD pipelines, infrastructure as code, and automated testing strategies.
- Ensure system scalability, security, and performance optimization across services.
- Drive sprint planning, code reviews, and technical documentation standards.
- Work closely with DevOps to maintain uptime and operational excellence.

Required Skills
- 6+ years of experience with full-stack JavaScript development (MERN stack)
- 2+ years in a leadership/managerial role
- Strong understanding of Node.js backend and API development
- Hands-on experience with React.js, component design, and front-end state management
- Proficiency in MongoDB and designing scalable NoSQL schemas
- Experience with AWS services (EC2, S3, RDS, Lambda, CloudWatch, IAM)
- Working knowledge of Docker, GitHub Actions, or similar CI/CD tools
- Familiarity with monitoring tools like New Relic, Datadog, or Prometheus
- Solid experience managing agile workflows and team velocity
Posted 2 days ago
11.0 - 20.0 years
25 - 40 Lacs
Hyderabad, Chennai, Greater Noida
Hybrid
Primary Skills
- Proficiency in AWS Services: Deep knowledge of EC2, S3, RDS, Lambda, VPC, IAM, AWS EventBridge, AWS B2Bi (EDI Generator), CloudFormation, and more.
- Cloud Architecture Design: Ability to design scalable, resilient, and cost-optimized architectures.
- Networking & Connectivity: Understanding of VPC peering, Direct Connect, Route 53, and load balancing.
- Security & Compliance: Implementing IAM policies, encryption, KMS, and compliance frameworks like HIPAA or GDPR.
- Infrastructure as Code (IaC): Using tools like AWS CloudFormation or Terraform to automate deployments.
- DevOps Integration: Familiarity with CI/CD pipelines, AWS CodePipeline, and container orchestration (ECS, EKS).
- Cloud Migration: Planning and executing lift-and-shift or re-architecting strategies for cloud adoption.
- Monitoring & Optimization: Using CloudWatch, X-Ray, and Trusted Advisor for performance tuning and cost control.

Secondary Skills
- Programming Skills: Python, Java, or Node.js for scripting and automation.
- Serverless Architecture: Designing with Lambda, API Gateway, and Step Functions.
- Cost Management: Understanding pricing models (On-Demand, Reserved, Spot) and using Cost Explorer.
- Disaster Recovery & High Availability: Multi-AZ deployments, backups, and failover strategies.
- Soft Skills: Communication, stakeholder management, and documentation.
- Team Collaboration: Working with DevOps, security, and development teams to align cloud goals.
- Certifications: AWS Certified Solutions Architect – Associate/Professional, and optionally DevOps Engineer or Security Specialty.
Posted 2 days ago
0.0 - 1.0 years
5 - 9 Lacs
Kolkata
Work from Office
Key Responsibilities
- Collaborate with data scientists to support end-to-end ML model development, including data preparation, feature engineering, training, and evaluation.
- Build and maintain automated pipelines for data ingestion, transformation, and model scoring using Python and SQL.
- Assist in model deployment using CI/CD pipelines (e.g., Jenkins) and ensure smooth integration with production systems.
- Develop tools and scripts to support model monitoring, logging, and retraining workflows.
- Work with data from relational databases (RDS, Redshift) and preprocess it for model consumption.
- Analyze pipeline performance and model behavior; identify opportunities for optimization and refactoring.
- Contribute to the development of a feature store and standardized processes to support reproducible data science.

Required Skills & Experience
- 1-3 years of hands-on experience in Python programming for data science or ML engineering tasks.
- Solid understanding of machine learning workflows, including model training, validation, deployment, and monitoring.
- Proficiency in SQL and working with structured data from sources like Redshift and RDS.
- Familiarity with ETL pipelines and data transformation best practices.
- Basic understanding of ML model deployment strategies and CI/CD tools like Jenkins.
- Strong analytical mindset with the ability to interpret and debug data/model issues.

Preferred Qualifications
- Exposure to frameworks like scikit-learn, XGBoost, LightGBM, or similar.
- Knowledge of ML lifecycle tools (e.g., MLflow, Ray).
- Familiarity with cloud platforms (AWS preferred) and scalable infrastructure.
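Model monitoring, as mentioned in the responsibilities above, often starts with a simple drift check on incoming features. A toy, dependency-free sketch (the feature values and the 0.2 threshold are invented; real systems use more robust statistics such as KS tests or PSI):

```python
# Hypothetical model-monitoring check: compare a feature's mean in live
# traffic against its training-time baseline and flag drift when the
# shift exceeds a threshold. All numbers are illustrative.
from statistics import mean

def drift_check(baseline, live, threshold=0.2):
    """Return (absolute mean shift, drifted?) for one feature."""
    shift = abs(mean(live) - mean(baseline))
    return shift, shift > threshold

train_glucose = [5.1, 5.4, 5.0, 5.3, 5.2]  # training-time feature values
live_glucose = [5.6, 5.8, 5.5, 5.9, 5.7]   # values seen in production

shift, drifted = drift_check(train_glucose, live_glucose)
print(f"shift={shift:.2f} drifted={drifted}")  # shift=0.50 drifted=True
```

A drifted flag would typically trigger the retraining workflow the posting describes rather than just printing.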
Posted 3 days ago
10.0 - 15.0 years
20 - 35 Lacs
Bengaluru
Work from Office
The role of an AWS Senior Architect is a senior-level position focused on designing, implementing, and managing robust cloud solutions on Amazon Web Services (AWS). It requires deep expertise in a wide range of AWS services, including:
- Compute: EC2, Lambda, ECS, EKS
- Storage: S3, EFS, FSx
- Databases: RDS, DynamoDB, Aurora
- Networking: VPC, Route 53, CloudFront
- Security: IAM, KMS, GuardDuty, WAF
- Monitoring
Posted 4 days ago
5.0 - 10.0 years
7 - 12 Lacs
Chennai
Remote
Location: 100% Remote
Employment Type: Full-Time
Must have own laptop and Internet connection
Work hours: 11 AM to 8 PM IST

Position Summary: We are looking for a highly skilled and self-driven Full Stack Developer with deep expertise in React.js, Node.js, and AWS cloud services. The ideal candidate will play a critical role in designing, developing, and deploying full-stack web applications in a secure and scalable cloud environment.

Key Responsibilities:
- Design and develop scalable front-end applications using React.js and modern JavaScript/TypeScript frameworks.
- Build and maintain robust backend services using Node.js, Express, and RESTful APIs.
- Architect and deploy full-stack solutions on AWS using services such as Lambda, API Gateway, ECS, RDS, S3, CloudFormation, CloudWatch, and DynamoDB.
- Ensure application performance, security, scalability, and maintainability.
- Work collaboratively in Agile/Scrum environments and participate in sprint planning, code reviews, and daily standups.
- Integrate CI/CD pipelines and automate testing and deployment workflows using AWS-native tools or services like Jenkins, CodeBuild, or GitHub Actions.
- Troubleshoot production issues, optimize system performance, and implement monitoring and alerting solutions.
- Maintain clean, well-documented, and reusable code and technical documentation.

Required Qualifications:
- 5+ years of professional experience as a full-stack developer.
- Strong expertise in React.js (Hooks, Context, Redux, etc.).
- Advanced backend development experience with Node.js and related frameworks.
- Proven hands-on experience designing and deploying applications on AWS Cloud.
- Solid understanding of RESTful services, microservices architecture, and cloud-native design.
- Experience working with relational databases (PostgreSQL, MySQL) and DynamoDB.
- Proficiency with Git and modern DevOps practices (CI/CD, Infrastructure as Code, etc.).
- Strong communication skills and ability to collaborate in distributed teams.
Posted 4 days ago
2.0 - 7.0 years
20 - 25 Lacs
Bengaluru
Work from Office
Experience: 2+ years
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Hybrid (Bengaluru)
Must-have skills: AWS, Go, Python

Requirements:
We are looking for a Backend Engineer to help us through the next level of technology changes needed to revolutionize healthcare for India. We are seeking individuals who can understand real-world scenarios and come up with scalable tech solutions for millions of patients to make healthcare accessible. The role comes with a good set of challenges to solve, and offers an opportunity to build new systems that will be rolled out at scale.
- You have 2 to 4 years or more of software development experience, with expertise in designing and implementing high-performance web applications.
- Very strong understanding of and experience with any of Java, Scala, GoLang, or Python.
- Experience writing optimized queries in relational databases like MySQL, Redshift, or Postgres.
- Exposure to basic data engineering concepts such as data pipelines, Hadoop, or Spark.
- You write clean, testable code and love to build platforms that enable other teams to build on top of them.

Some of the challenges we solve include:
- Clinical decision support
  - Early Detection: Digitally assist doctors in identifying high-risk patients for early intervention.
  - Track & Advise: Analyze patients' vitals/test values across visits to assist doctors in personalizing chronic care.
  - Risk Prevention: Assist doctors in monitoring the progression of chronic disease by drawing attention to additional symptoms and side effects.
- EMR (Electronic Medical Records): Clinical software to write prescriptions and manage clinical records.
- AI-powered features
  - Adapts to the doctor's practice: Learns from the doctor's prescribing preferences and provides relevant auto-fill recommendations for faster prescriptions.
  - Longitudinal patient journey: AI analyzes the longitudinal journey of patients to assist doctors in early detection.
  - Medical language processing: AI-driven automatic digitization of printed prescriptions and test reports.
- Core platform
  - Pharma advertising platform reaching doctors at the moment of truth.
  - Real-world evidence to generate market insights for B2B consumption.
- Virtual Store: Online pharmacy + diagnostic solutions helping patients with one-click ordering.

Technologies we use:
- Distributed tech: Kafka, Elasticsearch
- Databases: MongoDB, RDS
- Cloud platform: AWS
- Languages: Go, Python, PHP
- UI tech: React, React Native
- Caching: Redis
- Big data: AWS Athena, Redshift
- APM: New Relic

Responsibilities:
- Develop testable, reusable services with structured, granular, and well-commented code.
- Contribute to API building, data pipeline setup, and new tech initiatives needed for the core platform.
- Adapt to new technologies and situations as the company demands, with the vision of providing the best customer experience.
- Meet expected deliverables and quality standards with every release.
- Collaborate with teams to design, develop, test, and refine deliverables that meet the objectives.
- Perform code reviews and implement improvement plans.

Additional Responsibilities:
- Pitch in during the design and architectural solutioning of business problems.
- Organize, lead, and motivate the development team to meet expected timelines and quality standards across releases.
- Actively contribute to development process improvement plans.
- Assist peers through code reviews and juniors through mentoring.

Must-have Skills:
- Sound understanding of Computer Science fundamentals, including data structures and space and time complexity.
- Excellent problem-solving skills.
- Solid understanding of any modern object-oriented programming language (Java, Ruby, Python) and/or functional language (Scala, GoLang).
- Understanding of MPP (massively parallel processing) and frameworks like Spark.
- Experience working with databases (RDBMS: MySQL, Redshift, etc.; NoSQL: Couchbase, MongoDB, Cassandra, etc.).
- Experience working with open-source libraries and frameworks.
- Strong command of version-control tools (Git/Bitbucket).

Good-to-have Skills:
- Knowledge of microservices architecture.
- Experience working with Kafka.
- Exposure to ORM frameworks (ActiveRecord, SQLAlchemy, etc.).
- Working knowledge of full-text search (Elasticsearch, Solr, etc.).

Skills: AWS, Go, Python
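The MPP requirement above refers to the map/shuffle/reduce model that frameworks like Spark implement. A toy, single-process Python sketch of that model using the classic word-count example (a real engine would distribute each stage across workers; this only shows the shape):

```python
# Illustrative map -> shuffle -> reduce word count in one process.
from collections import defaultdict
from itertools import chain

docs = ["early detection saves lives", "early intervention helps", "detection matters"]

# Map: each "partition" (here, a document) emits (word, 1) pairs independently.
mapped = [[(word, 1) for word in doc.split()] for doc in docs]

# Shuffle: group pairs by key, as the framework would between stages.
groups = defaultdict(list)
for word, count in chain.from_iterable(mapped):
    groups[word].append(count)

# Reduce: combine each key's values.
counts = {word: sum(vals) for word, vals in groups.items()}
print(counts["early"], counts["detection"])  # 2 2
```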
Posted 4 days ago
3.0 - 8.0 years
20 - 35 Lacs
Bengaluru
Hybrid
Shift: (GMT+05:30) Asia/Kolkata (IST)
Must-have skills: Golang, Python, Java

Requirements:
We are looking for a Backend Engineer to help us through the next level of technology changes needed to revolutionize healthcare for India. We are seeking individuals who can understand real-world scenarios and come up with scalable tech solutions for millions of patients to make healthcare accessible. The role comes with a good set of challenges to solve, and offers an opportunity to build new systems that will be rolled out at scale.
- You have 4 to 7 years or more of software development experience, with expertise in designing and implementing high-performance web applications.
- Very strong understanding of and experience with any of Java, Scala, GoLang, or Python.
- Experience writing optimized queries in relational databases like MySQL, Redshift, or Postgres.
- Exposure to basic data engineering concepts such as data pipelines, Hadoop, or Spark.
- You write clean, testable code and love to build platforms that enable other teams to build on top of them.

Some of the challenges we solve include:
- Clinical decision support
  - Early Detection: Digitally assist doctors in identifying high-risk patients for early intervention.
  - Track & Advise: Analyze patients' vitals/test values across visits to assist doctors in personalizing chronic care.
  - Risk Prevention: Assist doctors in monitoring the progression of chronic disease by drawing attention to additional symptoms and side effects.
- EMR (Electronic Medical Records): Clinical software to write prescriptions and manage clinical records.
- AI-powered features
  - Adapts to the doctor's practice: Learns from the doctor's prescribing preferences and provides relevant auto-fill recommendations for faster prescriptions.
  - Longitudinal patient journey: AI analyzes the longitudinal journey of patients to assist doctors in early detection.
  - Medical language processing: AI-driven automatic digitization of printed prescriptions and test reports.
- Core platform
  - Pharma advertising platform reaching doctors at the moment of truth.
  - Real-world evidence to generate market insights for B2B consumption.
- Virtual Store: Online pharmacy + diagnostic solutions helping patients with one-click ordering.

Technologies we use:
- Distributed tech: Kafka, Elasticsearch
- Databases: MongoDB, RDS
- Cloud platform: AWS
- Languages: Go, Python, PHP
- UI tech: React, React Native
- Caching: Redis
- Big data: AWS Athena, Redshift
- APM: New Relic

Responsibilities:
- Develop testable, reusable services with structured, granular, and well-commented code.
- Contribute to API building, data pipeline setup, and new tech initiatives needed for the core platform.
- Adapt to new technologies and situations as the company demands, with the vision of providing the best customer experience.
- Meet expected deliverables and quality standards with every release.
- Collaborate with teams to design, develop, test, and refine deliverables that meet the objectives.
- Perform code reviews and implement improvement plans.

Additional Responsibilities:
- Pitch in during the design and architectural solutioning of business problems.
- Organize, lead, and motivate the development team to meet expected timelines and quality standards across releases.
- Actively contribute to development process improvement plans.
- Assist peers through code reviews and juniors through mentoring.

Must-have Skills:
- Sound understanding of Computer Science fundamentals, including data structures and space and time complexity.
- Excellent problem-solving skills.
- Solid understanding of any modern object-oriented programming language (Java, Ruby, Python) and/or functional language (Scala, GoLang).
- Understanding of MPP (massively parallel processing) and frameworks like Spark.
- Experience working with databases (RDBMS: MySQL, Redshift, etc.; NoSQL: Couchbase, MongoDB, Cassandra, etc.).
- Experience working with open-source libraries and frameworks.
- Strong command of version-control tools (Git/Bitbucket).

Good-to-have Skills:
- Knowledge of microservices architecture.
- Experience working with Kafka.
- Exposure to ORM frameworks (ActiveRecord, SQLAlchemy, etc.).
- Working knowledge of full-text search (Elasticsearch, Solr, etc.).
Posted 5 days ago
4.0 - 9.0 years
5 - 9 Lacs
Mumbai
Hybrid
Key Responsibilities
- Design and develop scalable software solutions for recruitment workflows
- Build and maintain data-intensive applications with high performance and reliability
- Collaborate with cross-functional teams to define, design, and ship new features
- Optimize applications for maximum speed and scalability
- Implement security and data protection measures
- Participate in code reviews and contribute to engineering best practices

Requirements
- 4-6 years of experience developing software on a Java/J2EE and relational database stack
- Strong understanding of system design and scalable architecture principles
- Proficiency with technologies like Spring, Hibernate, SQL, and REST
- Experience designing and implementing microservices-based architecture
- Familiarity with setting up and deploying applications on cloud providers like AWS or GCP
- Ability to harness AI as an engineering assistant to improve productivity and code quality

Preferred Qualifications
- Experience with frontend development using JavaScript frameworks like Backbone or Angular
- Data science experience: fetching data from multiple sources, modeling, and extracting information
- Familiarity with tools like MongoDB, Hadoop, Mahout, Neo4j
- Information security knowledge (OWASP security principles)

Our Tech Stack
Java, Spring, Hibernate, MySQL (RDS), MongoDB, Apache Solr, Spring Cloud, S3, Angular 2, Backbone JS, Azure OpenAI. Our applications are hosted on AWS and GCP.
Posted 5 days ago
5.0 - 10.0 years
11 - 12 Lacs
Hyderabad
Work from Office
We are seeking a highly skilled DevOps Engineer to join our dynamic development team. In this role, you will be responsible for designing, developing, and maintaining both frontend and backend components of our applications using DevOps and associated technologies. You will collaborate with cross-functional teams to deliver robust, scalable, and high-performing software solutions that meet our business needs. The ideal candidate will have a strong background in DevOps, experience with modern frontend frameworks, and a passion for full-stack development.

Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 5 to 10+ years of experience in full-stack development, with a strong focus on DevOps.

DevOps with AWS Data Engineer - Roles & Responsibilities:
- Use AWS services such as EC2, VPC, S3, IAM, RDS, and Route 53.
- Automate infrastructure using Infrastructure as Code (IaC) tools like Terraform or AWS CloudFormation.
- Build and maintain CI/CD pipelines using tools such as AWS CodePipeline, Jenkins, or GitLab CI/CD.
- Automate build, test, and deployment processes for Java applications.
- Use Ansible, Chef, or AWS Systems Manager for managing configurations across environments.
- Containerize Java apps using Docker; deploy and manage containers using Amazon ECS, EKS (Kubernetes), or Fargate.
- Monitoring & logging using Amazon CloudWatch, Prometheus + Grafana, the ELK Stack (Elasticsearch, Logstash, Kibana), and AWS X-Ray for distributed tracing.
- Manage access with IAM roles/policies; use AWS Secrets Manager / Parameter Store for managing credentials.
- Enforce security best practices, encryption, and audits.
- Automate backups for databases and services using AWS Backup, RDS Snapshots, and S3 lifecycle rules.
- Implement Disaster Recovery (DR) strategies.
- Work cross-functionally and closely with development teams to integrate DevOps practices.
- Document pipelines, architecture, and troubleshooting runbooks.
- Monitor and optimize AWS resource usage using AWS Cost Explorer, Budgets, and Savings Plans.

Must-Have Skills:
- Experience working on Linux-based infrastructure.
- Excellent understanding of Ruby, Python, Perl, and Java.
- Configuring and managing databases such as MySQL and MongoDB.
- Excellent troubleshooting skills.
- Selecting and deploying appropriate CI/CD tools.
- Working knowledge of various tools, open-source technologies, and cloud services.
- Awareness of critical concepts in DevOps and Agile principles.
- Managing stakeholders and external interfaces.
- Setting up tools and required infrastructure.
- Defining and setting development, testing, release, update, and support processes for DevOps operation.
- Technical skills to review, verify, and validate the software code developed in the project.

Interview Mode: F2F for candidates residing in Hyderabad / Zoom for other states
Location: 43/A, MLA Colony, Road No. 12, Banjara Hills, 500034
Time: 2 - 4 PM
Posted 1 week ago
4.0 - 7.0 years
0 - 1 Lacs
Bengaluru
Hybrid
Job Requirements
Job Description: AWS Developer

Quest Global, a leading global technology and engineering services company, is seeking an experienced AWS Developer to join our team. As an AWS Developer, you will play a key role in designing, developing, and maintaining cloud-based applications using Amazon Web Services (AWS) and Java development skills.

Responsibilities:
- Designing, developing, and deploying scalable and reliable cloud-based applications on the AWS platform.
- Collaborating with cross-functional teams to gather requirements and translate them into technical solutions.
- Writing clean, efficient, and maintainable code in Java.
- Implementing best practices for security, scalability, and performance optimization.
- Troubleshooting and resolving issues related to AWS infrastructure and applications.
- Conducting code reviews and providing constructive feedback to ensure code quality.
- Keeping up to date with the latest AWS services, tools, and best practices.

Join our dynamic team at Quest Global and contribute to the development of cutting-edge cloud-based applications using AWS and Java. Apply now and take your career to new heights!

Note: This job description is intended to provide a general overview of the position and does not encompass all the tasks and responsibilities that may be assigned to the role.

Work Experience Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Minimum 5 years of experience as an AWS Developer or in a similar role.
- Strong proficiency in Java.
- In-depth knowledge of AWS services such as EC2, S3, Lambda, RDS, and DynamoDB.
- Experience with cloud-based application development and deployment.
- Familiarity with DevOps practices and tools.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration abilities.
Posted 1 week ago
5.0 - 8.0 years
7 - 11 Lacs
Chennai, Malaysia, Malaysia
Work from Office
Responsibilities for Data Engineer
Create and maintain the optimal data pipeline architecture; assemble large, complex data sets that meet functional and non-functional business requirements.
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS big data technologies.
Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs.
Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.
Create data tools for analytics and data science team members that assist them in building and optimizing our product into an innovative industry leader.
Work with data and analytics experts to strive for greater functionality in our data systems.
Qualifications for Data Engineer
We are looking for a candidate with 5+ years of experience in a Data Engineer role who holds a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field. They should also have experience using the following software/tools:
Advanced working SQL knowledge and experience with relational databases, query authoring (SQL), and working familiarity with a variety of databases.
Experience building and optimizing big data pipelines, architectures, and data sets.
Experience performing root cause analysis of internal and external data and processes to answer specific business questions and identify opportunities for improvement.
Strong analytic skills for working with unstructured data sets.
Ability to build processes supporting data transformation, data structures, metadata, dependency, and workload management.
A successful history of manipulating, processing, and extracting value from large, disconnected datasets.
Working knowledge of message queuing, stream processing, and highly scalable big data stores.
Strong project management and organizational skills.
Experience supporting and working with cross-functional teams in a dynamic environment.
Experience with big data tools: Hadoop, Spark, Kafka, etc.
Experience with relational SQL and NoSQL databases, including MongoDB, Postgres, Cassandra, AWS Redshift, and Snowflake.
Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
Experience with AWS cloud services: EC2, EMR, Glue, RDS, Redshift.
Experience with stream-processing systems: Storm, Spark Streaming, etc.
Experience with object-oriented/object-function scripting languages: Python, Java, etc.
Location: Chennai, India / Kuala Lumpur, Malaysia
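As a rough illustration of the query-authoring skills this role calls for, here is a minimal sketch using Python's built-in SQLite driver; the table and column names are invented for the example, not taken from any real schema:

```python
import sqlite3

# In-memory database standing in for a relational store (schema is invented).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY,
                         customer_id INTEGER REFERENCES customers(id),
                         amount REAL);
    -- Indexing the join/filter column avoids full-table scans on lookups.
    CREATE INDEX idx_orders_customer ON orders(customer_id);
""")
conn.executemany("INSERT INTO customers VALUES (?, ?)",
                 [(1, "APAC"), (2, "EMEA")])
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(10, 1, 120.0), (11, 1, 80.0), (12, 2, 50.0)])

# Aggregate revenue per region -- the kind of metric a pipeline would feed.
rows = conn.execute("""
    SELECT c.region, SUM(o.amount) AS revenue
    FROM customers c
    JOIN orders o ON o.customer_id = c.id
    GROUP BY c.region
    ORDER BY revenue DESC
""").fetchall()
print(rows)  # [('APAC', 200.0), ('EMEA', 50.0)]
```

The same join/aggregate pattern carries over directly to Redshift, Athena, or Postgres; only the connection layer changes.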
Posted 1 week ago
2.0 - 5.0 years
2 - 7 Lacs
Bengaluru
Work from Office
o Deploy applications on AWS using services such as EC2, ECS, S3, RDS, or Lambda
o Implement CI/CD pipelines using GitHub Actions, Jenkins, or CodePipeline
o Apply DevSecOps best practices, including container security (Docker, ECR), infrastructure as code (Terraform), and runtime monitoring
Team Collaboration & Agility
o Participate in Agile ceremonies (stand-ups, sprint planning, retros)
o Work closely with product, design, and AI engineers to build secure and intelligent systems
Posted 1 week ago
8.0 - 12.0 years
15 - 30 Lacs
Hyderabad
Work from Office
Duration: Full Time
Job Description: We are seeking a highly experienced and hands-on PHP Developer with leadership or managerial experience to join our growing team. The ideal candidate will be proficient in Laravel, CodeIgniter, React.js, Ajax, AWS, and SQL, with a proven track record of leading development teams and delivering robust, scalable web applications.
Key Responsibilities:
Lead and manage a team of developers, ensuring timely and quality delivery of projects.
Architect, design, and develop high-performance web applications using PHP frameworks (Laravel and CodeIgniter).
Integrate and manage front-end components using React.js.
Work with Ajax for seamless asynchronous user experiences.
Design and maintain SQL databases for high availability and performance.
Deploy, manage, and troubleshoot applications hosted on AWS.
Ensure coding standards, best practices, and secure programming techniques.
Collaborate with cross-functional teams, including product managers and designers.
Perform code reviews, mentorship, and performance evaluations.
Required Skills & Experience:
8+ years of experience in PHP development.
Strong hands-on experience with the Laravel and CodeIgniter frameworks.
Proficiency with React.js for front-end integration.
Experience with Ajax for dynamic web functionality.
Solid understanding of AWS services like EC2, S3, RDS, etc.
Proficient in MySQL/SQL database design and optimization.
Previous experience leading a team or managing developers (must-have).
Strong problem-solving, debugging, and analytical skills.
Excellent communication and leadership skills.
Preferred Qualifications:
Familiarity with CI/CD pipelines and DevOps practices.
Experience with RESTful APIs and third-party integrations.
Knowledge of version control tools like Git.
Bachelor's/Master's degree in Computer Science or a related field.
Posted 1 week ago
5.0 - 6.0 years
11 - 12 Lacs
Hyderabad
Work from Office
We are seeking a highly skilled DevOps Engineer to join our dynamic development team. In this role, you will be responsible for designing, developing, and maintaining both frontend and backend components of our applications using DevOps practices and associated technologies. You will collaborate with cross-functional teams to deliver robust, scalable, and high-performing software solutions that meet our business needs. The ideal candidate will have a strong background in DevOps, experience with modern frontend frameworks, and a passion for full-stack development.
Requirements:
Bachelor's degree in Computer Science Engineering or a related field.
5 to 6+ years of experience in full-stack development, with a strong focus on DevOps.
DevOps with AWS Data Engineer - Roles & Responsibilities:
Use AWS services like EC2, VPC, S3, IAM, RDS, and Route 53.
Automate infrastructure using Infrastructure as Code (IaC) tools like Terraform or AWS CloudFormation.
Build and maintain CI/CD pipelines using tools like AWS CodePipeline, Jenkins, and GitLab CI/CD.
Automate build, test, and deployment processes for Java applications.
Use Ansible, Chef, or AWS Systems Manager for managing configurations across environments.
Containerize Java apps using Docker.
Deploy and manage containers using Amazon ECS, EKS (Kubernetes), or Fargate.
Set up monitoring and logging using Amazon CloudWatch, Prometheus + Grafana, the ELK Stack (Elasticsearch, Logstash, Kibana), and AWS X-Ray for distributed tracing.
Manage access with IAM roles/policies.
Use AWS Secrets Manager / Parameter Store for managing credentials.
Enforce security best practices, encryption, and audits.
Automate backups for databases and services using AWS Backup, RDS Snapshots, and S3 lifecycle rules.
Implement Disaster Recovery (DR) strategies.
Work closely with development teams to integrate DevOps practices.
Document pipelines, architecture, and troubleshooting runbooks.
Monitor and optimize AWS resource usage.
Use AWS Cost Explorer, Budgets, and Savings Plans.
Must-Have Skills:
Experience working on Linux-based infrastructure.
Excellent understanding of Ruby, Python, Perl, and Java.
Experience configuring and managing databases such as MySQL and MongoDB.
Excellent troubleshooting skills.
Experience selecting and deploying appropriate CI/CD tools.
Working knowledge of various tools, open-source technologies, and cloud services.
Awareness of critical concepts in DevOps and Agile principles.
Managing stakeholders and external interfaces.
Setting up tools and required infrastructure.
Defining and setting development, testing, release, update, and support processes for DevOps operation.
The technical skills to review, verify, and validate the software code developed in the project.
Interview Mode: F2F for candidates residing in Hyderabad; Zoom for other states.
Location: 43/A, MLA Colony, Road No. 12, Banjara Hills, 500034
Time: 2 - 4 PM
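One of the responsibilities above (automating backups with S3 lifecycle rules) can be sketched as a plain dict in the shape boto3's `put_bucket_lifecycle_configuration` expects. The bucket name, prefix, and retention periods here are invented for illustration; the actual API call is shown only in a comment so the sketch runs offline:

```python
def backup_lifecycle_rules(prefix: str, glacier_after: int, expire_after: int) -> dict:
    """Build an S3 lifecycle configuration that tiers backup objects to
    Glacier after `glacier_after` days and deletes them after `expire_after`
    days. Returns the structure boto3 expects as LifecycleConfiguration."""
    return {
        "Rules": [{
            "ID": f"backups-{prefix}",
            "Filter": {"Prefix": prefix},
            "Status": "Enabled",
            "Transitions": [{"Days": glacier_after, "StorageClass": "GLACIER"}],
            "Expiration": {"Days": expire_after},
        }]
    }

# Invented retention policy: Glacier after 30 days, delete after a year.
config = backup_lifecycle_rules("rds-snapshots/", glacier_after=30, expire_after=365)

# Applying it (requires credentials, so left as a comment):
#   import boto3
#   boto3.client("s3").put_bucket_lifecycle_configuration(
#       Bucket="my-backup-bucket", LifecycleConfiguration=config)
print(config["Rules"][0]["ID"])  # backups-rds-snapshots/
```

Keeping the rule builder as a pure function makes the policy easy to unit-test and to feed into Terraform or CloudFormation equivalents later.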
Posted 1 week ago
7.0 - 12.0 years
20 - 35 Lacs
Pune
Work from Office
7+ years in software engineering, with 4+ years using AWS. Programming languages: C# and Python, along with SQL and Spark. The position requires a minimum three-hour overlap with team members in the US-Pacific time zone. Strong experience with some (or all) of the following: Lambda and Step Functions, API Gateway, Fargate, ECS, S3, SQS, Kinesis, Firehose, DynamoDB, RDS, Athena, and Glue. Solid foundation in data structures and algorithms, with in-depth knowledge of and passion for coding standards and proven design patterns (RESTful and GraphQL APIs, for example). You might also have: DevOps experience (GitHub, GitHub Actions, Docker); experience building CI/CD and server/deployment automation solutions; and container orchestration technologies.
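For the Lambda-plus-SQS corner of the stack listed above, a handler in Python might look like this minimal sketch. The event shape follows the standard payload SQS delivers to Lambda; the order schema and the per-message computation are invented for the example:

```python
import json

def handler(event: dict, context=None) -> dict:
    """Consume an SQS-triggered Lambda event: each record body carries a
    JSON order (invented schema) and we compute a per-message total."""
    totals = {}
    for record in event.get("Records", []):
        order = json.loads(record["body"])
        totals[record["messageId"]] = order["qty"] * order["unit_price"]
    return {"processed": len(totals), "totals": totals}

# Simulated invocation with the shape SQS hands to Lambda:
event = {"Records": [
    {"messageId": "m1", "body": json.dumps({"qty": 3, "unit_price": 2.5})},
    {"messageId": "m2", "body": json.dumps({"qty": 1, "unit_price": 9.0})},
]}
result = handler(event)
print(result)  # {'processed': 2, 'totals': {'m1': 7.5, 'm2': 9.0}}
```

Because the handler is a plain function over a dict, it can be unit-tested locally exactly like this before being wired to a real queue or a Step Functions state.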
Posted 1 week ago
5.0 - 10.0 years
30 - 40 Lacs
Pune, Ahmedabad
Work from Office
We are seeking an experienced Sr. Java Developer with expertise in the Java Spring and Spring Boot frameworks, REST APIs, and the cloud. The ideal candidate will have 6+ years of hands-on experience developing scalable and robust applications, along with experience with any cloud services (AWS/Azure/GCP).
Job Title: Sr. Java Developer
Location: Ahmedabad/Pune
Experience: 5+ Years
Educational Qualification: UG: BS/MS in Computer Science or another engineering/technical degree
Key Responsibilities:
Responsible for the complete software development life cycle, including requirement analysis, design, development, deployment, and support.
Responsible for developing software products for Agentic AI Security.
Write clean, testable, readable, scalable, and maintainable Java code.
Design, develop, and implement highly scalable software features and infrastructure on our security platform, ready for cloud-native deployment, from inception to completion.
Participate actively and contribute to design and development discussions.
Develop a solid understanding of advanced cloud computing and cloud security concepts, and be able to explain them to others.
Work cross-functionally with Product Management, SRE, Software, and Quality Engineering teams to deliver new security-as-a-service offerings to the market in a timely fashion with excellent quality.
Clearly communicate goals and desired outcomes to internal project teams.
Work closely with customer support teams to improve end-customer outcomes.
Required Skills:
Strong programming skills in Java, with experience in building distributed systems.
6+ years of experience in software engineering, with a focus on cloud-native application development, at large organizations or innovative startups.
3+ years of experience and a deep understanding of building connectors for low-code/no-code and Agentic AI platforms like Microsoft Copilot Studio, Microsoft Power Platform, Salesforce Agentforce, Zapier, CrewAI, Marketo, etc.
5+ years of experience building connectors for SaaS applications like Microsoft O365, Power Apps, Salesforce, ServiceNow, etc. Preferred: experience with security products (data security and DLP, CASB, SASE) and integration with third-party APIs and services.
5+ years of experience running workloads on cloud-based architectures (AWS/GCP experience preferred).
5+ years of experience with cloud technologies like Elasticsearch, Redis, Kafka, MongoDB, and Spring Boot.
Experience with Docker and Kubernetes or other container orchestration platforms.
Excellent troubleshooting abilities; able to isolate issues found during testing and verify bug fixes once they are resolved.
Experience with backend development (REST APIs, databases, and serverless computing) of distributed cloud applications.
Experience building and delivering services and workflows at scale, leveraging microservices architectures.
Experience with the agile process and working with software development teams building full-stack products deployed on the cloud at scale.
Good understanding of public cloud design considerations and limitations in the areas of microservice architectures, security, global network infrastructure, distributed systems, and load balancing.
Strong understanding of the principles of DevOps and continuous delivery.
Can-do attitude and the ability to make trade-off judgements with data-driven decision-making.
High energy and the ability to work in a fast-paced environment.
Enjoys working with many different teams; strong collaboration and communication skills.
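The connector-building work described in this listing might look, in very stripped-down form, like the sketch below: a thin client whose HTTP transport is injected, so it can be exercised offline. The endpoint paths, field names, and pagination scheme are all invented for the example, not any real SaaS API (the listing's stack is Java; Python is used here only to keep the illustration compact):

```python
from typing import Callable

class SaaSConnector:
    """Minimal connector skeleton: bearer-token auth plus cursor-based
    paging. `transport` is any callable (url, headers) -> dict, injected
    so the class stays testable without network access."""

    def __init__(self, base_url: str, token: str,
                 transport: Callable[[str, dict], dict]):
        self.base_url = base_url.rstrip("/")
        self.headers = {"Authorization": f"Bearer {token}"}
        self.transport = transport

    def list_records(self, resource: str) -> list:
        """Follow `next` cursors until the (invented) API stops returning one."""
        records, url = [], f"{self.base_url}/{resource}"
        while url:
            page = self.transport(url, self.headers)
            records.extend(page["items"])
            url = page.get("next")
        return records

# Stub transport standing in for real HTTP, serving two canned pages:
pages = {
    "https://api.example.com/v1/contacts":
        {"items": [{"id": 1}], "next": "https://api.example.com/v1/contacts?page=2"},
    "https://api.example.com/v1/contacts?page=2":
        {"items": [{"id": 2}]},
}
conn = SaaSConnector("https://api.example.com/v1", "dummy-token",
                     lambda url, headers: pages[url])
print(conn.list_records("contacts"))  # [{'id': 1}, {'id': 2}]
```

Swapping the stub for a real HTTP client (and per-platform auth) is all that separates this skeleton from a production connector, which is why the transport is kept pluggable.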
Posted 1 week ago
3.0 - 8.0 years
5 - 10 Lacs
Bengaluru
Work from Office
The Core AI, BI & Data Platforms Team has been established to create, operate, and run the Enterprise AI, BI, and Data platforms that reduce time to market for reporting, analytics, and data science teams to run experiments, train models, and generate insights, as well as to evolve and run the CoCounsel application and its shared capability, the CoCounsel AI Assistant. The Enterprise Data Platform aims to provide self-service capabilities for fast and secure ingestion and consumption of data across TR.
At Thomson Reuters, we are recruiting a team of motivated Cloud professionals to transform how we build, manage, and leverage our data assets. The Data Platform team in Bangalore is seeking an experienced Software Engineer with a passion for engineering cloud-based data platform systems. Join our dynamic team as a Software Engineer and take a pivotal role in shaping the future of our Enterprise Data Platform. You will develop and implement data processing applications and frameworks on cloud-based infrastructure, ensuring the efficiency, scalability, and reliability of our systems.
About the Role
In this opportunity as the Software Engineer, you will:
Develop data processing applications and frameworks on cloud-based infrastructure in partnership with Data Analysts and Architects, with guidance from the Lead Software Engineer.
Innovate with new approaches to meet data management requirements.
Make recommendations about platform adoption, including technology integrations, application servers, libraries, AWS frameworks, documentation, and usability by stakeholders.
Contribute to improving the customer experience.
Participate in code reviews to maintain a high-quality codebase.
Collaborate with cross-functional teams to define, design, and ship new features.
Work closely with product owners, designers, and other developers to understand requirements and deliver solutions.
Effectively communicate and liaise across the data platform and management teams.
Stay updated on emerging trends and technologies in cloud computing.
About You
You're a fit for the role of Software Engineer if you meet all or most of these criteria:
Bachelor's degree in Computer Science, Engineering, or a related field.
3+ years of relevant experience in the implementation of data lakes and data management technologies for large-scale organizations.
Experience building and maintaining data pipelines with excellent run-time characteristics such as low latency, fault tolerance, and high availability.
Proficiency in the Python programming language.
Experience with AWS services and management, including serverless, container, queueing, and monitoring services such as Lambda, ECS, API Gateway, RDS, DynamoDB, Glue, S3, IAM, Step Functions, CloudWatch, SQS, and SNS.
Good knowledge of consuming and building APIs.
Business intelligence tools like Power BI.
Fluency in querying languages such as SQL.
Solid understanding of software development practices such as version control via Git, CI/CD, and release management.
Agile development cadence.
Good critical thinking, communication, documentation, troubleshooting, and collaborative skills.
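The "fault tolerance" called for in pipelines like these often reduces to disciplined retries around flaky sources. One possible sketch of such a wrapper is below; the backoff delay is set to zero so the example runs instantly, and the flaky extractor is a stand-in invented for illustration:

```python
import time

def with_retries(fn, attempts: int = 3, base_delay: float = 0.0):
    """Call fn(); on failure, retry with exponential backoff
    (base_delay * 2**attempt). Re-raises the last error if all
    attempts fail. base_delay is 0.0 here so the demo is instant."""
    last_err = None
    for attempt in range(attempts):
        try:
            return fn()
        except Exception as err:  # real pipelines would catch narrower types
            last_err = err
            time.sleep(base_delay * (2 ** attempt))
    raise last_err

calls = {"n": 0}
def flaky_extract():
    """Fails twice, then succeeds -- simulating a transient source outage."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return ["row-1", "row-2"]

rows = with_retries(flaky_extract, attempts=5)
print(rows, calls["n"])  # ['row-1', 'row-2'] 3
```

Orchestration services such as Step Functions or Airflow provide the same retry/backoff semantics declaratively, which is usually preferable to hand-rolled loops at the task level.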
Posted 1 week ago
8.0 - 13.0 years
10 - 14 Lacs
Hyderabad, Gurugram
Work from Office
What's in it for You
Career Development: Build a meaningful career with a leading global company at the forefront of technology.
Dynamic Work Environment: Work in an environment that is dynamic and forward-thinking, directly contributing to innovative solutions.
Skill Enhancement: Enhance your software development skills on an enterprise-level platform.
Versatile Experience: Gain full-stack experience and exposure to cloud technologies.
Leadership Opportunities: Mentor peers and influence the product's future as part of a skilled team.
Work Flexibility: Benefit from a flexible work arrangement, balancing office time with the option to work from home.
Community Engagement: Utilize five paid days for charity work or volunteering, supporting your passion for community service.
Responsibilities:
Design and implement cloud solutions using AWS and Azure.
Develop and maintain Infrastructure as Code (IaC) with Terraform.
Create and manage CI/CD pipelines using GitHub Actions and Azure DevOps.
Automate deployment processes and the provisioning of compute instances and storage.
Orchestrate container deployments with Kubernetes.
Develop automation scripts in Python, PowerShell, and Bash.
Monitor and optimize cloud resources for performance and cost-efficiency using tools like Datadog and Splunk.
Configure Security Groups, IAM policies, and roles in AWS/Azure.
Troubleshoot production issues and ensure system reliability.
Collaborate with development teams to integrate DevOps and MLOps practices.
Create comprehensive documentation and provide technical guidance.
Continuously evaluate and integrate new AWS services and technologies.
Cloud engineering certifications (AWS, Terraform) are a plus.
Excellent communication and problem-solving skills.
Minimum Qualifications:
Bachelor's degree in Computer Science or equivalent experience.
Minimum of 8+ years in cloud engineering, DevOps, or Site Reliability Engineering (SRE).
Hands-on experience with AWS and Azure cloud services, including IAM, Compute, Storage, ELB, RDS, VPC, TGW, Route 53, ACM, serverless computing, containerization, CloudWatch, CloudTrail, SQS, and SNS.
Experience with configuration management tools like Ansible, Chef, or Puppet.
Proficiency in Infrastructure as Code (IaC) using Terraform.
Strong background in CI/CD pipelines using GitHub Actions and Azure DevOps.
Knowledge of MLOps or LLMOps practices.
Proficiency in scripting languages: Python, PowerShell, Bash.
Ability to work collaboratively in a fast-paced environment.
Preferred Qualifications:
Advanced degree in a technical field.
Extensive experience with ReactJS and modern web technologies.
Proven leadership in agile and project management.
Advanced knowledge of CI/CD and industry best practices in software development.
Posted 1 week ago
6.0 - 10.0 years
20 - 30 Lacs
Bengaluru
Work from Office
We are seeking an experienced Senior Software Engineer with 6+ years of experience to design and review complex system architectures, focusing on microservices and high-performance applications. You will be responsible for building scalable solutions, optimizing performance, and ensuring seamless integration across multiple components and services. Strong expertise in C#, with an emphasis on building distributed microservices. Experience working on cloud infrastructure such as AWS (EC2, RDS, EKS). Experience using task-tracking tools like Jira and documentation platforms like Confluence.
Posted 1 week ago
5.0 - 8.0 years
11 - 12 Lacs
Hyderabad
Work from Office
We are seeking a highly skilled DevOps Engineer to join our dynamic development team. In this role, you will be responsible for designing, developing, and maintaining both frontend and backend components of our applications using DevOps practices and associated technologies. You will collaborate with cross-functional teams to deliver robust, scalable, and high-performing software solutions that meet our business needs. The ideal candidate will have a strong background in DevOps, experience with modern frontend frameworks, and a passion for full-stack development.
Requirements:
Bachelor's degree in Computer Science Engineering or a related field.
5 to 8+ years of experience in full-stack development, with a strong focus on DevOps.
DevOps with AWS Data Engineer - Roles & Responsibilities:
Use AWS services like EC2, VPC, S3, IAM, RDS, and Route 53.
Automate infrastructure using Infrastructure as Code (IaC) tools like Terraform or AWS CloudFormation.
Build and maintain CI/CD pipelines using tools like AWS CodePipeline, Jenkins, and GitLab CI/CD.
Automate build, test, and deployment processes for Java applications.
Use Ansible, Chef, or AWS Systems Manager for managing configurations across environments.
Containerize Java apps using Docker.
Deploy and manage containers using Amazon ECS, EKS (Kubernetes), or Fargate.
Set up monitoring and logging using Amazon CloudWatch, Prometheus + Grafana, the ELK Stack (Elasticsearch, Logstash, Kibana), and AWS X-Ray for distributed tracing.
Manage access with IAM roles/policies.
Use AWS Secrets Manager / Parameter Store for managing credentials.
Enforce security best practices, encryption, and audits.
Automate backups for databases and services using AWS Backup, RDS Snapshots, and S3 lifecycle rules.
Implement Disaster Recovery (DR) strategies.
Work closely with development teams to integrate DevOps practices.
Document pipelines, architecture, and troubleshooting runbooks.
Monitor and optimize AWS resource usage.
Use AWS Cost Explorer, Budgets, and Savings Plans.
Must-Have Skills:
Experience working on Linux-based infrastructure.
Excellent understanding of Ruby, Python, Perl, and Java.
Experience configuring and managing databases such as MySQL and MongoDB.
Excellent troubleshooting skills.
Experience selecting and deploying appropriate CI/CD tools.
Working knowledge of various tools, open-source technologies, and cloud services.
Awareness of critical concepts in DevOps and Agile principles.
Managing stakeholders and external interfaces.
Setting up tools and required infrastructure.
Defining and setting development, testing, release, update, and support processes for DevOps operation.
The technical skills to review, verify, and validate the software code developed in the project.
Interview Mode: F2F for candidates residing in Hyderabad; Zoom for other states.
Location: 43/A, MLA Colony, Road No. 12, Banjara Hills, 500034
Time: 2 - 4 PM
Posted 1 week ago