6.0 - 10.0 years
25 - 30 Lacs
Hyderabad
Work from Office
We seek a Technical Lead with deep expertise in the MEAN/MERN stacks to lead a team of 10 developers, ensuring timely delivery of scalable, high-performance applications. You will architect solutions, optimize databases, manage sprint workflows, and guide the transition to microservices. The role balances hands-on coding, team mentorship, and strategic planning aligned with business goals.

Key Responsibilities:

Team Leadership & Delivery
- Lead a team of 10 developers, ensuring adherence to timelines and resource efficiency.
- Drive sprint planning, task allocation, and daily standups to meet project milestones.
- Conduct code reviews and enforce best practices for maintainable, scalable code.

Technical Expertise
- Design and develop CRM applications using MongoDB, Node.js, Angular, and React, handling millions of records with optimized queries and indexing (see the sketch below).
- Plan and execute upgrades of Angular, Node.js, and MongoDB to maintain security and performance.
- Architect microservices-based solutions and modularize monolithic systems.

Scalability & Performance
- Optimize database performance (MongoDB), APIs (Node.js), and frontend rendering (Angular/React).
- Implement caching, load balancing, and horizontal/vertical scaling strategies.

DevOps & Cloud
- Build CI/CD pipelines for automated testing and deployment.
- Leverage AWS services (Lambda, SQS, S3, EC2) for serverless architectures and scalable infrastructure, along with comparable services on other clouds such as GCP.

Problem-Solving
- Debug complex issues across the stack, providing data-driven solutions.
- Anticipate risks (e.g., bottlenecks, downtime) and implement preventive measures.

Mandatory Requirements
1. 6-9 years of hands-on experience in MEAN/MERN stacks, including:
2. MongoDB: schema design, aggregation pipelines, sharding, replication.
3. Node.js: REST/GraphQL APIs, middleware, asynchronous processing.
4. Angular/React: state management, component lifecycle, performance tuning.
5. Proven expertise in Agile sprint management, resource tracking, and deadline-driven delivery.
6. Experience upgrading Angular, Node.js, and MongoDB in production environments.
7. Leadership skills with a track record of managing teams of 8-10 members.
8. Strong grasp of microservices, event-driven architecture, and scalability patterns.
9. Analytical thinking with excellent debugging and problem-solving abilities.

Preferred Skills
- DevOps: CI/CD pipelines (Jenkins/GitLab), Docker, Kubernetes.
- AWS: Lambda, SQS, S3, EC2, CloudFormation.
- Monitoring: New Relic, Prometheus, Grafana.
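Illustrative only, not part of the posting: a minimal pymongo sketch of the index-plus-aggregation pattern described above. The connection string, collection, and field names are assumptions.

```python
from datetime import datetime, timedelta, timezone
from pymongo import MongoClient, ASCENDING, DESCENDING

client = MongoClient("mongodb://localhost:27017")  # assumed local instance
crm = client["crm"]["interactions"]                # hypothetical collection

# Compound index so customer/date-range lookups avoid full collection scans.
crm.create_index([("customer_id", ASCENDING), ("created_at", DESCENDING)])

# Index-backed aggregation: per-customer touch counts over the last 30 days.
window_start = datetime.now(timezone.utc) - timedelta(days=30)
pipeline = [
    {"$match": {"created_at": {"$gte": window_start}}},
    {"$group": {"_id": "$customer_id", "touches": {"$sum": 1}}},
    {"$sort": {"touches": -1}},
    {"$limit": 100},
]
top_customers = list(crm.aggregate(pipeline))
print(top_customers[:5])
```

The compound index matches the $match and $group keys, which is what keeps queries like this fast at millions of records.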
Posted 3 weeks ago
8.0 - 12.0 years
20 - 25 Lacs
Hyderabad, Bengaluru, Delhi / NCR
Work from Office
We are looking for an experienced Senior Full Stack Developer to lead and deliver critical components of a modern SaaS platform. This is a technically demanding role with a focus on system architecture, integration, and mentoring junior team members. The successful candidate will drive high-quality outcomes across the front end and back end, working closely with cross-functional teams.

What You Will Do
* Architect, develop, and maintain robust web applications
* Design scalable, modular system components
* Lead by example in code quality, review practices, and test coverage
* Integrate third-party APIs and manage data flows
* Troubleshoot and resolve complex technical issues
* Write clean, testable, maintainable code
* Produce clear, structured technical documentation
* Collaborate in planning, delivery, and technical decision-making

Location: Remote - Bengaluru, Hyderabad, Delhi / NCR, Chennai, Pune, Kolkata, Ahmedabad, Mumbai
Posted 3 weeks ago
2.0 - 7.0 years
4 - 6 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Develop scalable microservices using Java Spring Boot. Design and implement REST APIs and integrate them with the frontend and external services. Deploy and manage services using AWS offerings such as EC2, S3, Lambda, RDS, and ECS.

Required Candidate Profile
- Use CI/CD pipelines for automated builds and deployments (e.g., Jenkins, GitHub Actions)
- Collaborate with frontend, QA, DevOps, and business teams
- Write unit and integration tests to ensure code quality
Posted 4 weeks ago
4.0 - 9.0 years
15 - 25 Lacs
Hyderabad
Work from Office
- Python, with experience in ETL and data engineering concepts (PySpark, NumPy, Pandas, AWS Glue, and Airflow); see the sketch below
- SQL, exclusively on Oracle, with hands-on work experience
- SQL profilers / query analyzers
- AWS cloud services (S3, RDS, Redshift)
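Illustrative only, not part of the posting: a hedged sketch of the kind of PySpark ETL work this role lists. Paths, bucket names, and columns are assumptions.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Read raw CSV landed in S3 (hypothetical bucket and layout).
raw = spark.read.option("header", True).csv("s3://example-bucket/raw/orders/")

cleaned = (
    raw.dropDuplicates(["order_id"])
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount") > 0)
)

# Partitioned Parquet output, queryable from Redshift Spectrum or Athena.
cleaned.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/curated/orders/"
)
```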
Posted 4 weeks ago
3.0 - 8.0 years
10 - 15 Lacs
Kochi
Remote
The Position
We are seeking an Intermediate Full Stack Developer to contribute to the delivery of a modern SaaS platform. This role involves working across front-end and back-end layers, with a focus on feature implementation, data integration, and high-quality code delivery. The developer will collaborate with product, design, and engineering teams in a fast-paced environment where attention to detail, clear communication, and initiative are expected.

What You Will Do
* Develop and maintain robust web applications
* Implement modular, scalable system components
* Contribute to code reviews and maintain consistent coding standards
* Integrate third-party APIs and manage data flows
* Debug, troubleshoot, and resolve issues as they arise
* Write clean, testable, maintainable code
* Produce clear technical documentation
* Participate in planning and delivery sessions

Tech Stack
* TypeScript
* NestJS (Node)
* ReactJS
* GraphQL
* AWS (Lambda, DynamoDB, S3, CloudWatch); see the sketch below
* Bitbucket, Jira, Confluence
* Experience in unit testing, API development, and distributed systems is preferred

Requirements
* Minimum 3 years of hands-on full stack development experience
* Bachelor's degree in Information Technology, Computer Science, Software Engineering, or a related field
* Very good level of English communication (spoken and written)
* Ability to work Indian Standard Time (IST) hours, with flexibility to adjust start times during onboarding to overlap with Australian business hours for the first week
* Proactive approach to problem solving and clear communication with distributed teams
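Illustrative only: the posting's stack is TypeScript/NestJS, but for consistency with the other sketches on this page here is a minimal Lambda-plus-DynamoDB handler in Python (boto3). The table name, key schema, and event shape are assumptions.

```python
import boto3

table = boto3.resource("dynamodb").Table("Projects")  # hypothetical table

def handler(event, context):
    # Persist a record, then confirm with a key lookup.
    table.put_item(Item={"pk": event["id"], "name": event["name"]})
    found = table.get_item(Key={"pk": event["id"]}).get("Item")
    return {"statusCode": 200, "body": str(found)}
```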
Posted 4 weeks ago
3.0 - 6.0 years
4 - 6 Lacs
Mumbai
Work from Office
What you will do for Sectona

Key Responsibilities:

Cloud Infrastructure Management: Design, implement, and maintain cloud infrastructure on AWS. Manage compute resources, storage, and networking components. Provision, configure, and monitor EC2 instances, S3 storage, and VPCs.

Operating System Management: Configure and manage Windows and Unix-based VMs (Linux/Ubuntu). Perform patch management, security configuration, and system upgrades. Ensure high availability and performance of cloud-hosted environments.

Active Directory Integration: Implement and manage Active Directory (AD) services, including AWS Directory Service, within the cloud environment. Integrate on-prem AD with AWS using AWS Managed AD or AD Connector.

Networking: Design and manage secure network architectures, including VPCs, subnets, VPNs, and routing configurations. Implement network security best practices (firewalls, security groups, NACLs). Troubleshoot and resolve network connectivity issues, ensuring optimal network performance.

Storage Solutions: Implement scalable storage solutions using AWS S3, EBS, and Glacier. Manage backup and recovery strategies for cloud-hosted environments.

Database Management: Manage relational (RDS, Aurora) and NoSQL (DynamoDB) databases in the cloud. Ensure database performance, security, and high availability.

Load Balancer & Auto Scaling: Configure and manage AWS Elastic Load Balancers (ELB) to distribute traffic across instances. Implement Auto Scaling policies to ensure elasticity and high availability of applications.

Performance Tuning: Monitor system performance and apply necessary optimizations. Identify and resolve performance bottlenecks across the compute, network, storage, and database layers.

Security & Compliance: Implement security best practices in line with AWS security standards (IAM, encryption, security groups, etc.). Regularly audit cloud environments for compliance with internal and external security regulations (see the audit sketch below).

Skills and experience you require
- Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent work experience)
- 4+ years of hands-on experience with AWS, including EC2, S3, VPC, RDS, Lambda, and IAM
- Proficiency in managing both Windows and Unix/Linux servers in a cloud environment
- Strong experience with Active Directory integration in a cloud infrastructure
- Solid understanding of cloud networking, VPC design, and security groups
- Knowledge of cloud storage solutions such as EBS, S3, and Glacier
- Experience with cloud-based databases: RDS (MySQL and MS SQL Server)
- Familiarity with load balancing (Elastic Load Balancer) and Auto Scaling in AWS
- Experience with cloud monitoring tools such as AWS CloudWatch, CloudTrail, or third-party tools
- Familiarity with cloud services in Azure (e.g., VMs, Azure AD, Azure Storage) and GCP
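Illustrative only, not part of the posting: a small boto3 sketch of the kind of compliance audit described above, checking for world-open SSH and unencrypted buckets. Region and credentials are assumed to come from the environment.

```python
import boto3
from botocore.exceptions import ClientError

ec2 = boto3.client("ec2")
s3 = boto3.client("s3")

# Flag security groups that allow SSH from anywhere.
for sg in ec2.describe_security_groups()["SecurityGroups"]:
    for perm in sg.get("IpPermissions", []):
        open_ssh = perm.get("FromPort") == 22 and any(
            r.get("CidrIp") == "0.0.0.0/0" for r in perm.get("IpRanges", [])
        )
        if open_ssh:
            print("Open SSH:", sg["GroupId"])

# Report buckets with no default encryption configured.
for bucket in s3.list_buckets()["Buckets"]:
    try:
        s3.get_bucket_encryption(Bucket=bucket["Name"])
    except ClientError:
        print("No default encryption:", bucket["Name"])
```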
Posted 4 weeks ago
8.0 - 12.0 years
8 - 12 Lacs
Pune, Bengaluru
Hybrid
Role & responsibilities
- Overall 8+ years of prior experience as a Data Engineer / Data Analyst / BI Engineer
- At least 5 years of consulting or client service delivery experience on Amazon Web Services (AWS)
- At least 5 years of experience developing data ingestion, data processing, and analytical pipelines for big data, relational databases, NoSQL, and data warehouse solutions
- Minimum of 5 years of hands-on experience with AWS and big data technologies such as Python, SQL, EC2, S3, Lambda, Spark/SparkSQL, Redshift, Snowflake, and SnapLogic
- Prior experience with SnapLogic, AWS Glue, and Lambda is a must-have (see the Glue sketch below)
- 3-5+ years of hands-on experience in programming languages such as Python, PySpark, Spark, and SQL
- 2+ years of experience with DevOps tools such as GitLab, Jenkins, CodeBuild, CodePipeline, and CodeDeploy
- Bachelor's or higher degree in Computer Science or a related discipline
- AWS certification such as Solutions Architect Associate, AWS Developer Associate, or AWS Big Data Specialty (nice to have)
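Illustrative only: a hedged skeleton of an AWS Glue PySpark job of the kind this role calls for. Catalog database, table, and output path are placeholders.

```python
import sys
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())

# Read from the Glue Data Catalog (hypothetical database/table).
dyf = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db", table_name="raw_orders"
)

# Drop a junk column and write curated Parquet back to S3.
glue_context.write_dynamic_frame.from_options(
    frame=dyf.drop_fields(["_corrupt_record"]),
    connection_type="s3",
    connection_options={"path": "s3://example-bucket/curated/orders/"},
    format="parquet",
)
```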
Posted 4 weeks ago
3.0 - 6.0 years
7 - 11 Lacs
Bengaluru
Work from Office
The candidate must have the following knowledge base for developing front-end software applications:
1. Develop robust and responsive user interfaces using React.js / Redux and related libraries while ensuring high performance and scalability.
2. Collaborate closely with the design team to translate UI/UX designs into code, ensuring a seamless user experience.
3. Implement best practices in frontend development, including code reviews, testing, and documentation.
4. Utilize AWS services to architect, deploy, and maintain scalable and secure frontend applications.
5. Optimize applications for maximum performance and scalability in AWS cloud environments.
6. Work collaboratively with cross-functional teams including backend developers, DevOps engineers, and QA testers to deliver high-quality products.
7. 3 to 5 years of professional experience in frontend development using React.js.
8. Strong proficiency in HTML5, CSS3, JavaScript, and related web technologies.
9. Extensive knowledge of AWS services such as EC2, S3, Lambda, CloudFront, etc.
10. Experience in optimizing frontend applications for performance and scalability on AWS.
11. Familiarity with version control systems (e.g., Git) and CI/CD pipelines.
12. Familiarity with MQTT, pub-sub protocols such as WAMP, socket communication, and REST API client development, working with tools like Postman (see the sketch below).

Additional knowledge preferred:
1. Knowledge of backend technologies and frameworks like Node.js, Python, or Ruby on Rails.
2. Familiarity with containerization (Docker, Kubernetes).
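Illustrative only: the role itself is React-focused, but as a minimal sketch of the MQTT pub-sub pattern mentioned in item 12, here is a Python subscriber using the paho-mqtt 1.x API. The broker host and topic are assumptions.

```python
import paho.mqtt.client as mqtt  # paho-mqtt 1.x callback API

def on_message(client, userdata, msg):
    # A frontend would typically receive this over WebSockets; here we just print.
    print(msg.topic, msg.payload.decode())

client = mqtt.Client()
client.on_message = on_message
client.connect("broker.example.com", 1883)   # hypothetical broker
client.subscribe("devices/+/telemetry")      # wildcard topic subscription
client.loop_forever()
```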
Posted 4 weeks ago
7.0 - 12.0 years
12 - 22 Lacs
Hyderabad
Work from Office
Job Title: Sr. Managed Services Engineer - AWS (L3)
Company: SHI | LOCUZ
Location: Hyderabad
Experience: 8+ Years
Level: L3 Managed Services
Shift: 24/7 Support (Rotational Shifts)
Notice Period: Immediate Joiners or Max 15 to 20 Days

About the Role:
We are looking for a seasoned Sr. Managed Services Engineer - AWS (L3) to join our expert team supporting SHI Complete and expert-level AWS services. The ideal candidate will have strong hands-on experience with core AWS services and managed services delivery, and a passion for proactive monitoring and automation in cloud environments.

Key Responsibilities:
- Perform in-depth reviews of customer AWS environments
- Evaluate business requirements and develop tailored service delivery plans
- Configure, monitor, and maintain AWS infrastructure for performance and availability
- Handle L3-level escalations and troubleshoot complex customer incidents/tickets
- Conduct proactive system checks, health monitoring, and performance tuning (see the monitoring sketch below)
- Implement data backup and recovery best practices
- Maintain security compliance and ensure adherence to SLAs and KPIs
- Prepare AWS-level change roadmaps for continuous improvement
- Lead incident response and root cause analysis for critical issues
- Collaborate with L1, L2, and vendor support teams
- Mentor junior engineers and ensure knowledge transfer

Required Skills & Experience:
- 8+ years of IT experience, with strong exposure to managed services environments
- Deep hands-on experience with a wide range of AWS services, including but not limited to: CloudWatch, EC2, EBS, S3, RDS, EKS, Lambda, CloudFormation, CloudTrail, VPC, Route 53, Transit Gateway, IAM, Security Hub, GuardDuty, AWS Backup, WAF & Shield, ACM, FSx, EFS, Elastic Beanstalk, API Gateway, Amazon WorkSpaces, Control Tower
- Excellent understanding of AWS operational excellence and the Well-Architected Framework
- Experience with 24x7 production environments and ITIL-based service delivery
- Strong troubleshooting and analytical skills
- Excellent communication and documentation skills

Nice to Have:
- AWS certifications (e.g., Solutions Architect Associate/Professional, SysOps Administrator, DevOps Engineer)
- Familiarity with Infrastructure as Code (IaC) tools like Terraform or CloudFormation
- Experience with monitoring/alerting via EventBridge, SNS, SQS, or third-party tools

Why Join Us?
- Work with leading-edge AWS technologies
- Be part of a high-performance managed services team
- Great learning opportunities and certifications
- Stable and growth-oriented career path in cloud infrastructure

Apply now and be part of our mission to deliver expert AWS support 24x7 for enterprise customers!
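Illustrative only, not part of the posting: a boto3 sketch of the proactive-monitoring work described above, creating a CloudWatch CPU alarm. The instance ID, thresholds, and SNS topic ARN are placeholders.

```python
import boto3

cloudwatch = boto3.client("cloudwatch")

# Alarm on sustained high CPU for a (hypothetical) customer instance.
cloudwatch.put_metric_alarm(
    AlarmName="ec2-high-cpu-i-0abc123",
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0abc123"}],
    Statistic="Average",
    Period=300,                 # 5-minute datapoints
    EvaluationPeriods=3,        # 15 minutes above threshold before alarming
    Threshold=80.0,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:ap-south-1:111122223333:ops-alerts"],
)
```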
Posted 4 weeks ago
5.0 - 10.0 years
10 - 20 Lacs
Hyderabad, Bengaluru
Work from Office
Role: AWS Data Engineer
Experience: 5+ Years
Location: Hyderabad / Bangalore
Notice: Immediate

Role & responsibilities
- Hands-on experience with AWS services including S3, Glue, API Gateway, and SQS
- Strong data engineering skills on AWS, with proficiency in Python, PySpark, and SQL
- Experience with batch job scheduling and managing data dependencies (see the DAG sketch below)
- Knowledge of data processing tools like Spark and Airflow
- Automate repetitive tasks and build reusable frameworks to improve efficiency
- Provide Run/DevOps support and manage the ongoing operation of data services

If you are interested, please share your resume with abhishikth.sanku@rite.digital
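Illustrative only: a minimal Airflow DAG (2.4+ style) showing the batch scheduling and task-dependency management the role mentions. DAG ID, schedule, and task bodies are assumptions.

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull files from S3")   # placeholder for real extract logic

def load():
    print("load curated data")    # placeholder for real load logic

with DAG(
    dag_id="daily_orders",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # load runs only after extract succeeds
```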
Posted 4 weeks ago
4.0 - 9.0 years
10 - 18 Lacs
Noida
Work from Office
Precognitas Health Pvt. Ltd., a fully owned subsidiary of Foresight Health Solutions LLC, is seeking a Data Engineer to build and optimize the data pipelines, processing frameworks, and analytics infrastructure that power critical healthcare insights.

Are you a bright, energetic, and skilled data engineer who wants to make a meaningful impact in a dynamic environment? Do you enjoy designing and implementing scalable data architectures, ML pipelines, automated ETL workflows, and cloud-native solutions that process large datasets efficiently? Are you passionate about transforming raw data into actionable insights that drive better healthcare outcomes? If so, join us! You'll play a crucial role in shaping our data strategy, optimizing data ingestion, and ensuring seamless data flow across our systems while leveraging the latest cloud and big data technologies.

Required Skills & Experience:
- 4+ years of experience in data engineering, data pipelines, and ETL/ELT workflows
- Strong Python programming skills, with expertise in NumPy, Pandas, and data manipulation techniques
- Hands-on experience with orchestration tools like Prefect, Apache Airflow, or AWS Step Functions for managing complex workflows (see the sketch below)
- Proficiency in AWS services, including AWS Glue, AWS Batch, S3, Lambda, RDS, Athena, and Redshift
- Experience with Docker containerization and Kubernetes for scalable and efficient data processing
- Strong understanding of data processing layers, batch and streaming data architectures, and analytics frameworks
- Expertise in SQL and NoSQL databases, query optimization, and data modeling for structured and unstructured data
- Familiarity with big data technologies like Apache Spark, Hadoop, or similar frameworks
- Experience implementing data validation, quality checks, and observability for robust data pipelines
- Strong knowledge of Infrastructure as Code (IaC) using Terraform or AWS CDK for managing cloud-based data infrastructure
- Ability to work with distributed systems, event-driven architectures (Kafka, Kinesis), and scalable data storage solutions
- Experience with CI/CD for data workflows, including version control (Git), automated testing, and deployment pipelines
- Knowledge of data security, encryption, and access control best practices in cloud environments
- Strong problem-solving skills and the ability to collaborate with cross-functional teams, including data scientists and software engineers

Compensation will be commensurate with experience. If you are interested, please send your application to jobs@precognitas.com. For more information about our work, visit www.caliper.care
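Illustrative only, not part of the posting: a small Prefect 2 flow combining the orchestration and data-validation skills listed above. File names and the validation rule are assumptions.

```python
import pandas as pd
from prefect import flow, task

@task
def validate(df: pd.DataFrame) -> pd.DataFrame:
    # Basic quality gate: no null identifiers, no duplicate records.
    assert df["patient_id"].notna().all(), "null patient_id found"
    return df.drop_duplicates("patient_id")

@flow
def ingest(path: str) -> None:
    df = pd.read_csv(path)
    clean = validate(df)          # runs as a tracked Prefect task
    clean.to_parquet("clean.parquet")

if __name__ == "__main__":
    ingest("visits.csv")  # hypothetical input file
```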
Posted 1 month ago
6.0 - 10.0 years
30 - 35 Lacs
Bengaluru
Work from Office
We are seeking an experienced PySpark Developer / Data Engineer to design, develop, and optimize big data processing pipelines using Apache Spark and Python (PySpark). The ideal candidate should have expertise in distributed computing, ETL workflows, data lake architectures, and cloud-based big data solutions.

Key Responsibilities:
- Develop and optimize ETL/ELT data pipelines using PySpark on distributed computing platforms (Hadoop, Databricks, EMR, HDInsight)
- Work with structured and unstructured data to perform data transformation, cleansing, and aggregation
- Implement data lake and data warehouse solutions on AWS (S3, Glue, Redshift), Azure (ADLS, Synapse), or GCP (BigQuery, Dataflow)
- Optimize PySpark jobs through performance tuning, partitioning, and caching strategies (see the sketch below)
- Design and implement real-time and batch data processing solutions
- Integrate data pipelines with Kafka, Delta Lake, Iceberg, or Hudi for streaming and incremental updates
- Ensure data security, governance, and compliance with industry best practices
- Work with data scientists and analysts to prepare and process large-scale datasets for machine learning models
- Collaborate with DevOps teams to deploy, monitor, and scale PySpark jobs using CI/CD pipelines, Kubernetes, and containerization
- Perform unit testing and validation to ensure data integrity and reliability

Required Skills & Qualifications:
- 6+ years of experience in big data processing, ETL, and data engineering
- Strong hands-on experience with PySpark (Apache Spark with Python)
- Expertise in SQL, the DataFrame API, and RDD transformations
- Experience with big data platforms (Hadoop, Hive, HDFS, Spark SQL)
- Knowledge of cloud data processing services (AWS Glue, EMR, Databricks, Azure Synapse, GCP Dataflow)
- Proficiency in writing optimized queries, partitioning, and indexing for performance tuning
- Experience with workflow orchestration tools like Airflow, Oozie, or Prefect
- Familiarity with containerization and deployment using Docker, Kubernetes, and CI/CD pipelines
- Strong understanding of data governance, security, and compliance (GDPR, HIPAA, CCPA, etc.)
- Excellent problem-solving, debugging, and performance optimization skills
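Illustrative only: a hedged sketch of the partitioning, caching, and broadcast-join tuning techniques named above. Paths and join keys are assumptions.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("tuning-demo").getOrCreate()

events = spark.read.parquet("s3://example/events/")    # large fact table
dims = spark.read.parquet("s3://example/dim_users/")   # small dimension

# Broadcast the small side to avoid a shuffle; cache the reused result.
joined = events.join(broadcast(dims), "user_id").cache()

# Write partitioned by date so downstream reads can prune files.
joined.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example/curated/events/"
)
```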
Posted 1 month ago
6.0 - 10.0 years
30 - 35 Lacs
Kochi, Hyderabad, Coimbatore
Work from Office
1. Knowledge of data warehouse and data lake architectures
2. Experience building data pipelines using PySpark
3. Strong SQL skills
4. Exposure to the AWS environment and services such as S3, EC2, EMR, Athena, and Redshift (see the Athena sketch below)
5. Good to have: programming skills in Python
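Illustrative only: a minimal boto3 snippet showing how SQL skills and AWS exposure combine when running a query through Athena. Database, table, and output bucket are placeholders.

```python
import boto3

athena = boto3.client("athena")

resp = athena.start_query_execution(
    QueryString="SELECT order_date, sum(amount) FROM curated.orders GROUP BY 1",
    QueryExecutionContext={"Database": "curated"},
    ResultConfiguration={"OutputLocation": "s3://example-bucket/athena-results/"},
)
print("query id:", resp["QueryExecutionId"])  # poll this ID for results
```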
Posted 1 month ago
2.0 - 4.0 years
4 - 6 Lacs
Mumbai, Maharashtra
Work from Office
About the Role: Grade Level (for internal use): 10

S&P Global Dow Jones Indices

The Role: Development Engineer - Python Full Stack

S&P Dow Jones Indices, a global leader in providing investable and benchmark indices to the financial markets, is looking for a Development Engineer with full stack experience to join our technology team. This is mostly a back-end development role but will also support UI development work.

The Team: You will be part of a global technology team comprising Dev, QA, and BA teams, and will be responsible for analysis, design, development, and testing.

Responsibilities and Impact: You will be working on one of the key systems responsible for calculating re-balancing weights and asset selections for S&P indices. Ultimately, the output of this team is used to maintain some of the most recognized and important investable assets globally.
- Development of RESTful web services and databases; supporting UI development requirements
- Interfacing with various AWS infrastructure and services, deploying to a Docker environment
- Coding, documentation, testing, debugging, and tier-3 support
- Working directly with stakeholders and the technical architect to formalize/document requirements for supporting the existing application as well as new initiatives
- Performing application and system performance tuning and troubleshooting performance issues
- Coordinating closely with the QA team and the scrum master to optimize team velocity and task flow
- Helping establish and maintain technical standards via code reviews and pull requests

What's in it for you: This is an opportunity to work on a team of highly talented and motivated engineers at a highly respected company. You will work on new development as well as enhancements to existing functionality.

What We're Looking For: Basic Qualifications
- 7-10 years of IT experience in application development and support, primarily in back-end API and database development roles, with at least some UI development experience
- Bachelor's degree in Computer Science, Information Systems, or Engineering, or, in lieu, a demonstrated equivalence in work experience
- Proficiency in modern Python (3.10+; minimum 4 years of dedicated, recent Python experience)
- AWS services experience, including API Gateway, ECS / Docker, DynamoDB, S3, Kafka, SQS
- SQL database experience, with at least 1 year of Postgres
- Python library experience including Pydantic, SQLAlchemy, and at least one of Flask, FastAPI, or Sanic, focusing on creating RESTful endpoints for data services (see the sketch below)
- JavaScript / TypeScript experience and at least one of Vue 3, React, or Angular
- Strong unit testing skills with PyTest or unittest, and API testing using Postman or Bruno
- CI/CD build process experience using Jenkins
- Experience with software testing (unit testing, integration testing, test-driven development)
- Strong work ethic and good communication skills

Additional Preferred Qualifications:
- Basic understanding of financial markets (stocks, funds, indices, etc.)
- Experience working in mission-critical enterprise organizations
- A passion for creating high-quality code and broad unit test coverage
- Ability to understand complex business problems, break them into smaller executable parts, and delegate

About S&P Global Dow Jones Indices: At S&P Dow Jones Indices, we provide iconic and innovative index solutions backed by unparalleled expertise across the asset-class spectrum. By bringing transparency to the global capital markets, we empower investors everywhere to make decisions with conviction. We're the largest global resource for index-based concepts, data, and research, and home to iconic financial market indicators such as the S&P 500 and the Dow Jones Industrial Average. More assets are invested in products based upon our indices than any other index provider in the world. With over USD 4 trillion in passively managed assets linked to our indices and over USD 13 trillion benchmarked to our indices, our solutions are widely considered indispensable in tracking market performance, evaluating portfolios, and developing investment strategies. S&P Dow Jones Indices is a division of S&P Global (NYSE: SPGI). S&P Global is the world's foremost provider of credit ratings, benchmarks, analytics, and workflow solutions in the global capital, commodity, and automotive markets. With every one of our offerings, we help many of the world's leading organizations navigate the economic landscape so they can plan for tomorrow, today.

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments, and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People, Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.
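Illustrative only, not part of the posting: a minimal FastAPI-plus-Pydantic sketch of the RESTful data-service pattern the qualifications describe. The route, model, and data are hypothetical; production code would query Postgres via SQLAlchemy rather than return a literal.

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Constituent(BaseModel):
    ticker: str
    weight: float

@app.get("/indices/{index_id}/constituents", response_model=list[Constituent])
def constituents(index_id: str) -> list[Constituent]:
    # Placeholder data source; a real endpoint would look up index_id in the DB.
    return [Constituent(ticker="ABC", weight=0.042)]
```

Run locally with, for example, `uvicorn app:app --reload` and the response model enforces the schema on the way out.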
Posted 1 month ago
10.0 - 17.0 years
9 - 15 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Dear Candidate,

Please find the job description below.

Role: MLOps + ML Engineer

Role Overview: We are looking for a highly experienced MLOps and ML Engineer to lead the design, deployment, and optimization of machine learning systems at scale. This role requires deep expertise in MLOps practices, CI/CD automation, and AWS SageMaker, with a strong foundation in machine learning engineering and cloud-native development.

Key Responsibilities:
- Architect and implement robust MLOps pipelines for model development, deployment, monitoring, and governance
- Lead the operationalization of ML models using AWS SageMaker and other AWS services
- Build and maintain CI/CD pipelines for ML workflows using tools like GitHub Actions, Jenkins, or AWS CodePipeline
- Automate model lifecycle management, including retraining, versioning, and rollback
- Collaborate with data scientists, ML engineers, and DevOps teams to ensure seamless integration and scalability
- Monitor production models for performance, drift, and reliability (see the sketch below)
- Establish best practices for reproducibility, security, and compliance in ML systems

Required Skills:
- 10+ years of experience in ML engineering, MLOps, or related fields
- Deep hands-on experience with AWS SageMaker, Lambda, S3, CloudWatch, and related AWS services
- Strong programming skills in Python and experience with Docker, Kubernetes, and Terraform
- Expertise in CI/CD tools and infrastructure as code
- Familiarity with model monitoring tools (e.g., Evidently, Prometheus, Grafana)
- Solid understanding of ML algorithms, data pipelines, and production-grade systems

Preferred Qualifications:
- AWS Certified Machine Learning Specialty or DevOps Engineer certification
- Experience with feature stores, model registries, and real-time inference systems
- Leadership experience in cross-functional ML/AI teams

Primary Skills: MLOps, ML Engineering, AWS services (SageMaker/S3/CloudWatch)

Regards,
Divya Grover
+91 8448403677
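Illustrative only: one small piece of the model-monitoring work described above, sketched as a custom drift metric published to CloudWatch. The namespace, metric name, and PSI value are assumptions; an alarm on this metric could trigger the retraining pipeline.

```python
import boto3

cloudwatch = boto3.client("cloudwatch")

def publish_drift(feature: str, psi: float) -> None:
    # Emit a population-stability-index datapoint for one input feature.
    cloudwatch.put_metric_data(
        Namespace="MLOps/ModelMonitor",  # hypothetical namespace
        MetricData=[{
            "MetricName": "FeaturePSI",
            "Dimensions": [{"Name": "Feature", "Value": feature}],
            "Value": psi,
        }],
    )

publish_drift("age", 0.31)  # PSI above ~0.25 usually signals real drift
```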
Posted 1 month ago
10.0 - 15.0 years
15 - 25 Lacs
Kolkata, Hyderabad, Bengaluru
Hybrid
Experience: 10+ Years

Role Overview: We are seeking an experienced AWS Data & Analytics Architect with a strong background in delivery and excellent communication skills. The ideal candidate will have over 10 years of experience and a proven track record in managing teams and client relationships. You will be responsible for leading data modernization and transformation projects using AWS services.

Key Responsibilities:
- Lead and architect data modernization/transformation projects using AWS services
- Manage and mentor a team of data engineers and analysts
- Build and maintain strong client relationships, ensuring successful project delivery
- Design and implement scalable data architectures and solutions
- Oversee the migration of large datasets to AWS, ensuring data integrity and security (see the sketch below)
- Collaborate with stakeholders to understand business requirements and translate them into technical solutions
- Ensure best practices in data management and governance are followed

Required Skills and Experience:
- 10+ years of experience in data architecture and analytics
- Hands-on experience with AWS services such as Redshift, S3, Glue, Lambda, RDS, and others
- Proven experience delivering 1-2 large data migration/modernization projects on AWS
- Strong leadership and team management skills
- Excellent communication and interpersonal skills
- Deep understanding of data modeling, ETL processes, and data warehousing
- Experience with data governance and security best practices
- Ability to work in a fast-paced, dynamic environment

Preferred Qualifications:
- AWS Certified Solutions Architect Professional or AWS Certified Big Data Specialty
- Experience with other cloud platforms (e.g., Azure, GCP) is a plus
- Familiarity with machine learning and AI technologies
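Illustrative only: a tiny boto3 sketch of one integrity-checked step in the kind of S3 migration mentioned above. Bucket and key are placeholders; note the ETag-equals-MD5 shortcut holds only for single-part uploads.

```python
import hashlib
import boto3

s3 = boto3.client("s3")

def upload_checked(path: str, bucket: str, key: str) -> None:
    s3.upload_file(path, bucket, key)
    etag = s3.head_object(Bucket=bucket, Key=key)["ETag"].strip('"')
    with open(path, "rb") as f:
        local_md5 = hashlib.md5(f.read()).hexdigest()
    # ETag matches the MD5 only for single-part uploads; multipart objects
    # (ETag contains "-") need a part-wise comparison instead.
    if "-" not in etag and etag != local_md5:
        raise RuntimeError(f"integrity check failed for {key}")
```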
Posted 1 month ago
7.0 - 12.0 years
30 - 45 Lacs
Noida, Pune, Gurugram
Hybrid
Role: Lead Data Engineer
Experience: 7-12 years

Must-Have:
- 7+ years of relevant experience in data engineering and delivery
- 7+ years of relevant work experience in big data concepts
- Worked on cloud implementations
- Experience in Snowflake, SQL, and AWS (Glue, EMR, S3, Aurora, RDS, AWS architecture)
- Good experience with AWS cloud and microservices: AWS Glue, S3, Python, and PySpark
- Good aptitude, strong problem-solving abilities, analytical skills, and the ability to take ownership as appropriate
- Able to code, debug, tune performance, and deploy apps to the production environment
- Experience working in Agile methodology
- Ability to learn new technologies quickly and help the team do the same
- Excellent communication and coordination skills

Good to have:
- Experience with DevOps tools (Jenkins, Git, etc.) and practices, including continuous integration and delivery (CI/CD) pipelines
- Spark, Python, SQL (exposure to Snowflake), big data concepts, AWS Glue
- Worked on cloud implementations (migration, development, etc.)

Role & Responsibilities:
- Be accountable for delivery of the project within the defined timelines and with good quality
- Work with clients and offshore leads to understand requirements, produce high-level designs, and complete development and unit testing activities
- Keep all stakeholders updated about task and project status/risks/issues, if any
- Work closely with management wherever and whenever required to ensure smooth execution and delivery of the project
- Guide the team technically and give the team direction on how to plan, design, implement, and deliver projects

Education: BE/B.Tech from a reputed institute.
Posted 1 month ago
5.0 - 10.0 years
7 - 17 Lacs
Bengaluru
Work from Office
About this role: Wells Fargo is seeking a Lead Software Engineer (Lead Data Engineer).

In this role, you will:
- Lead complex technology initiatives, including companywide efforts with broad impact
- Act as a key participant in developing standards and companywide best practices for engineering complex and large-scale technology solutions
- Design, code, test, debug, and document for projects and programs
- Review and analyze complex, large-scale technology solutions for tactical and strategic business objectives, the enterprise technological environment, and technical challenges that require in-depth evaluation of multiple factors, including intangibles or unprecedented technical factors
- Make decisions in developing standards and companywide best practices for engineering and technology solutions, requiring understanding of industry best practices and new technologies, and influencing and leading the technology team to meet deliverables and drive new initiatives
- Collaborate and consult with key technical experts, the senior technology team, and external industry groups to resolve complex technical issues and achieve goals
- Lead projects and teams, or serve as a peer mentor

Required Qualifications:
- 5+ years of software engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education

Desired Qualifications:
- 5+ years of experience in data engineering
- 5+ years of overall software development experience
- 5+ years of Python development experience, including 3+ years in the Spark framework
- 5+ years of Oracle or SQL Server experience designing, coding, and delivering database applications
- Expert knowledge and considerable development experience with at least two of the following: Kafka, ETL, big data, NoSQL databases, S3 or other object stores
- Strong understanding of data flow design and how to implement designs in Python
- Experience writing and debugging complex PL/SQL or T-SQL stored procedures
- Excellent troubleshooting and debugging skills
- Ability to analyze a feature story, design a robust solution for it, and create specs for complex business rules and calculations
- Ability to understand business problems and articulate a corresponding solution
- Excellent verbal, written, and interpersonal communication skills

Job Expectations:
- Strong knowledge and understanding of the Dremio framework
- Database query design and optimization
- Strong experience with the development ecosystem of applications (JIRA, ALM, GitHub, uDeploy (UrbanCode Deploy), Jenkins, Artifactory, SVN, etc.)
- Knowledge and understanding of multiple source code version control systems, including working with branches, tags, and labels
Posted 1 month ago
3.0 - 5.0 years
0 - 1 Lacs
Pune
Work from Office
Company: Covalensedigital
Location: Pune
Experience: 3.5 to 5 Years
Hiring For: Consultant

About the Role
Covalensedigital is seeking a passionate and experienced Full Stack Developer to join our technology team. The ideal candidate will have strong backend development expertise in Python (Flask or Django), along with experience in MySQL, frontend development, and data visualization. You will be responsible for building and maintaining scalable internal tools and dashboards that support business operations and insights.

Key Responsibilities

Backend Development
- Develop RESTful APIs using Flask or Django (see the sketch below)
- Implement authentication, authorization, and business logic
- Write scalable and modular Python code

Database Management
- Design and optimize MySQL schemas
- Perform migrations, indexing, and tuning for performance

Frontend Development
- Build intuitive user interfaces with HTML, CSS, and JavaScript
- Integrate frontend components with backend APIs

Data Visualization
- Utilize Pandas, NumPy, Plotly, and Matplotlib for analysis and dashboarding
- Deliver real-time insights through dynamic visualizations

Deployment & DevOps
- Deploy applications on AWS (EC2, RDS, S3)
- Manage environment variables, backups, and basic CI/CD pipelines

Required Skills
- Minimum 2 years of hands-on experience in Flask or Django
- Proficiency in MySQL (queries, joins, procedures)
- Strong understanding of REST APIs and JSON
- Experience in frontend development with HTML, CSS, and JavaScript
- Data analysis and visualization using Pandas, NumPy, Plotly, and Matplotlib
- AWS deployment experience
- Familiarity with Git and collaborative workflows

Nice to Have
- Exposure to React or Vue.js
- Experience with Docker or containerized deployments
- Familiarity with background jobs (Celery, cron)
- Understanding of secure coding practices

Immediate Joiners Preferred
Work Location: Pune (Onsite/Hybrid as per project needs)

Send your updated resume to: kalaivanan.balasubramaniam@covalensedigital.com

Thanks,
Kalai
8015302990
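Illustrative only, not part of the posting: a minimal Flask endpoint backed by MySQL, of the dashboard-API kind described above. Connection details, table, and route are assumptions (PyMySQL is one possible driver).

```python
from flask import Flask, jsonify
import pymysql  # assumed MySQL driver

app = Flask(__name__)

def get_conn():
    # Connection details are placeholders.
    return pymysql.connect(host="localhost", user="app", password="secret",
                           database="dashboards",
                           cursorclass=pymysql.cursors.DictCursor)

@app.get("/api/metrics/daily")
def daily_metrics():
    conn = get_conn()
    try:
        with conn.cursor() as cur:
            cur.execute(
                "SELECT day, total FROM daily_totals ORDER BY day DESC LIMIT 30"
            )
            return jsonify(cur.fetchall())  # DictCursor rows serialize cleanly
    finally:
        conn.close()
```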
Posted 1 month ago
7.0 - 12.0 years
15 - 25 Lacs
Hyderabad, Bengaluru, Mumbai (All Areas)
Hybrid
Job Summary: We are seeking a highly motivated and experienced Senior Data Engineer to join our team. This role requires a deep curiosity about our business and a passion for technology and innovation. You will be responsible for designing and developing robust, scalable data engineering solutions that drive our business intelligence and data-driven decision-making processes. If you thrive in a dynamic environment and have a strong desire to deliver top-notch data solutions, we want to hear from you.

Key Responsibilities:
- Collaborate with agile teams to design and develop cutting-edge data engineering solutions
- Build and maintain distributed, low-latency, and reliable data pipelines, ensuring high availability and timely delivery of data
- Design and implement optimized data engineering solutions for big data workloads to handle increasing data volumes and complexities
- Develop high-performance real-time data ingestion solutions for streaming workloads (see the sketch below)
- Adhere to best practices and established design patterns across all data engineering initiatives
- Ensure code quality through elegant design, efficient coding, and performance optimization
- Focus on data quality and consistency by implementing monitoring processes and systems
- Produce detailed design and test documentation, including data flow diagrams, technical design specs, and source-to-target mapping documents
- Perform data analysis to troubleshoot and resolve data-related issues
- Automate data engineering pipelines and data validation processes to eliminate manual interventions
- Implement data security and privacy measures, including access controls, key management, and encryption techniques
- Stay updated on technology trends, experimenting with new tools and educating team members
- Collaborate with analytics and business teams to improve data models and enhance data accessibility
- Communicate effectively with both technical and non-technical stakeholders

Qualifications:
- Education: Bachelor's degree in Computer Science, Computer Engineering, or a related field
- Experience: Minimum of 5+ years in architecting, designing, and building data engineering solutions and data platforms
- Proven experience building Lakehouses or data warehouses on platforms like Databricks or Snowflake
- Expertise in designing and building highly optimized batch/streaming data pipelines using Databricks
- Proficiency with data acquisition and transformation tools such as Fivetran and dbt
- Strong experience building efficient data engineering pipelines using Python and PySpark
- Experience with distributed data processing frameworks such as Apache Hadoop, Apache Spark, or Flink
- Familiarity with real-time data stream processing using tools like Apache Kafka, Kinesis, or Spark Structured Streaming
- Experience with various AWS services, including S3, EC2, EMR, Lambda, RDS, DynamoDB, Redshift, and Glue Catalog
- Expertise in advanced SQL programming and performance tuning

Key Skills:
- Strong problem-solving abilities and perseverance in the face of ambiguity
- Excellent emotional intelligence and interpersonal skills
- Ability to build and maintain productive relationships with internal and external stakeholders
- A self-starter mentality with a focus on growth and quick learning
- Passion for operational products and creating outstanding employee experiences
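Illustrative only: a hedged Spark Structured Streaming sketch of the Kafka-based real-time ingestion mentioned above. The broker, topic, and S3 paths are placeholders, and the spark-sql-kafka package must be available on the cluster.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("stream-ingest").getOrCreate()

stream = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "orders")
    .load()
)

# Kafka values arrive as bytes; cast to string before downstream parsing.
parsed = stream.select(F.col("value").cast("string").alias("payload"))

query = (
    parsed.writeStream.format("parquet")
    .option("path", "s3://example/stream/orders/")
    .option("checkpointLocation", "s3://example/checkpoints/orders/")
    .start()
)
query.awaitTermination()
```

The checkpoint location is what gives the stream exactly-once file output across restarts.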
Posted 1 month ago
5.0 - 10.0 years
16 - 20 Lacs
Mumbai, Goregaon
Work from Office
Role Overview
We are seeking a highly skilled Engineering Manager with deep expertise in the MERN stack (MongoDB, Express, React, Node.js), AWS infrastructure, and DevOps practices. This role requires both hands-on technical leadership and strong people management to lead a team of engineers building scalable, high-performance applications.

Key Responsibilities
- Lead, mentor, and manage a team of full-stack developers working primarily with MERN
- Own architecture decisions, code quality, and engineering practices across multiple microservices
- Collaborate with Product, Design, and QA teams to define and deliver on product roadmaps
- Implement CI/CD pipelines, infrastructure as code, and automated testing strategies
- Ensure system scalability, security, and performance optimization across services
- Drive sprint planning, code reviews, and technical documentation standards
- Work closely with DevOps to maintain uptime and operational excellence

Required Skills
- 6+ years of experience with full-stack JavaScript development (MERN stack)
- 2+ years in a leadership/managerial role
- Strong understanding of Node.js backend and API development
- Hands-on experience with React.js, component design, and front-end state management
- Proficiency in MongoDB and designing scalable NoSQL schemas
- Experience with AWS services (EC2, S3, RDS, Lambda, CloudWatch, IAM)
- Working knowledge of Docker, GitHub Actions, or similar CI/CD tools
- Familiarity with monitoring tools like New Relic, Datadog, or Prometheus
- Solid experience managing agile workflows and team velocity
Posted 1 month ago
0.0 - 1.0 years
2 - 3 Lacs
Bengaluru
Work from Office
Responsibilities:
* Collaborate with cross-functional teams on project delivery
* Develop backend solutions using Python, FastAPI & AWS (see the caching sketch below)
* Optimize performance through Redis DB & Nginx
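Illustrative only: a minimal sketch of the FastAPI-plus-Redis performance pattern this posting pairs together, using a 60-second cache in front of a stubbed lookup. Host, port, TTL, and route are assumptions.

```python
import redis
from fastapi import FastAPI

app = FastAPI()
cache = redis.Redis(host="localhost", port=6379, decode_responses=True)

@app.get("/items/{item_id}")
def get_item(item_id: str):
    # Serve from Redis when possible; fall back to the (stubbed) source of truth.
    if (hit := cache.get(item_id)) is not None:
        return {"item": hit, "cached": True}
    value = f"value-for-{item_id}"   # stand-in for a real DB lookup
    cache.setex(item_id, 60, value)  # cache for 60 seconds
    return {"item": value, "cached": False}
```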
Posted 1 month ago
3.0 - 6.0 years
40 - 45 Lacs
Kochi, Kolkata, Bhubaneswar
Work from Office
We are seeking experienced Data Engineers with over 3 years of experience to join our team at Intuit, through Cognizant. The selected candidates will be responsible for developing and maintaining scalable data pipelines, managing data warehousing solutions, and working with advanced cloud environments. The role requires strong technical proficiency and the ability to work onsite in Bangalore.

Key Responsibilities:
- Design, build, and maintain data pipelines to ingest, process, and analyze large datasets using PySpark
- Work on data warehouse and data lake solutions to manage structured and unstructured data
- Develop and optimize complex SQL queries for data extraction and reporting
- Leverage AWS cloud services such as S3, EC2, EMR, Athena, and Redshift for data storage, processing, and analytics
- Collaborate with cross-functional teams to ensure the successful delivery of data solutions that meet business needs
- Monitor data pipelines and troubleshoot any issues related to data integrity or system performance

Required Skills:
- 3+ years of experience in data engineering or related fields
- In-depth knowledge of data warehouses and data lakes
- Proven experience building data pipelines using PySpark
- Strong expertise in SQL for data manipulation and extraction
- Familiarity with AWS cloud services, including S3, EC2, EMR, Athena, and Redshift

Preferred Skills:
- Python programming experience is a plus
- Experience working in Agile environments with tools like JIRA and GitHub
Posted 1 month ago
8.0 - 10.0 years
12 - 20 Lacs
Chennai
Hybrid
We are looking for a skilled Solutions Engineer with 8-10 years of experience in technical leadership, software architecture, and design. In this role, you'll have the opportunity to design and develop scalable, reliable software solutions, utilizing a range of AWS services and open-source technologies. If you're passionate about creating impactful solutions and bridging technical and business needs, we'd love to hear from you!

Key Responsibilities:
- Lead the architecture and design of complex, scalable software solutions
- Develop and implement robust applications using AWS Lambda, Python, EC2, S3, PHP, Laravel, serverless microservices, and open-source software (experience with Drupal is a plus)
- Apply Agile methodologies and DevOps principles to ensure efficient development cycles
- Optimize AWS cloud solutions to balance performance and cost
- Collaborate with non-technical stakeholders to explain technical concepts effectively

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related technical field
- 8-10 years of experience in a technical leadership role focused on software architecture and design
- Proficiency in software development, with strong experience in AWS cloud services and programming languages
- Excellent problem-solving and critical-thinking skills
- Effective communication and interpersonal skills
Posted 1 month ago
5.0 - 10.0 years
14 - 18 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
- 5+ years of working experience in Python
- 4+ years of hands-on experience with AWS development: PySpark, Lambda, CloudWatch (alerts), SNS, SQS, CloudFormation, Docker, ECS, Fargate, and ECR
- Very strong hands-on knowledge of using Python for integrations between systems through different data formats (see the SQS sketch below)
- Expertise in deploying and maintaining applications in AWS, plus hands-on experience with Kinesis streams and auto-scaling
- Team player with very good written and verbal communication skills
- Strong problem-solving and decision-making skills
- Ability to solve complex software system issues
- Collaborate with business and other teams to understand business requirements and work on project deliverables
- Participate in requirements gathering and understanding
- Design solutions based on the available framework and code
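Illustrative only, not part of the posting: a small boto3 sketch of the SQS-based system integration pattern listed above, with a JSON producer and a long-polling consumer. The queue URL and message shape are placeholders.

```python
import json
import boto3

sqs = boto3.client("sqs")
queue_url = "https://sqs.ap-south-1.amazonaws.com/111122223333/orders"  # placeholder

# Producer: push an event for a downstream system.
sqs.send_message(QueueUrl=queue_url, MessageBody=json.dumps({"order_id": 42}))

# Consumer: long-poll, process, then delete to acknowledge.
messages = sqs.receive_message(
    QueueUrl=queue_url, MaxNumberOfMessages=10, WaitTimeSeconds=20
).get("Messages", [])
for msg in messages:
    print(json.loads(msg["Body"]))
    sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=msg["ReceiptHandle"])
```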
Posted 1 month ago