5.0 - 10.0 years
20 - 25 Lacs
Gurugram
Work from Office
Role & responsibilities

Key Responsibilities:
• Design, build, and maintain scalable and efficient data pipelines to move data between cloud-native databases (e.g., Snowflake) and SaaS providers using AWS Glue and Python
• Implement and manage ETL/ELT processes to ensure seamless data integration and transformation
• Ensure information security and compliance with data governance standards
• Maintain and enhance data environments, including data lakes, warehouses, and distributed processing systems
• Use version control systems (e.g., GitHub) to manage code and collaborate effectively with the team

Primary Skills:
• Enhancements, new development, defect resolution, and production support of ETL development using AWS-native services
• Integration of data sets using AWS services such as Glue and Lambda functions
• Use of AWS SNS to send emails and alerts
• Authoring ETL processes using Python and PySpark
• ETL process monitoring using CloudWatch events
• Connecting to data sources such as S3 and validating data using Athena
• Experience in CI/CD using GitHub Actions
• Proficiency in Agile methodology
• Extensive working experience with advanced, complex SQL

Secondary Skills:
• Experience working with Snowflake and understanding of its architecture, including internal and external tables, stages, and masking policies

Competencies / Experience:
• Deep technical skills in AWS Glue (Crawler, Data Catalog): 5 years
• Hands-on experience with Python and PySpark: 3 years
• PL/SQL experience: 3 years
• CloudFormation and Terraform: 2 years
• CI/CD with GitHub Actions: 1 year
• Experience with BI systems (Power BI, Tableau): 1 year
• Good understanding of AWS services such as S3, SNS, Secrets Manager, Athena, and Lambda: 2 years

Additionally, familiarity with any of the following is highly desirable: Jira, GitHub, Snowflake
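The posting pairs CloudWatch-event monitoring of Glue jobs with SNS alerting. The shape of that glue code can be unit-tested without any AWS runtime if the alert formatting is kept pure; the sketch below follows the documented "Glue Job State Change" event detail (`jobName`, `state`, `message`), but treat the field names as assumptions and verify them against your own events.

```python
# Sketch of the alerting step: turn a Glue job state-change event (as
# delivered by CloudWatch Events / EventBridge) into an SNS-ready message.
# Field names in the event are assumptions based on the documented shape.

def build_glue_alert(event: dict):
    """Return an SNS subject/body for failed jobs, or None for non-failures."""
    detail = event.get("detail", {})
    if detail.get("state") not in {"FAILED", "TIMEOUT"}:
        return None  # only alert on failures
    return {
        "subject": f"Glue job {detail.get('jobName', '?')} {detail.get('state')}",
        "body": detail.get("message", "no error message supplied"),
    }
```

In the Lambda wired to the event rule, the result would be passed to boto3's `sns.publish(TopicArn=..., Subject=..., Message=...)`; keeping the formatting separate from the publish call is what makes it testable.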
Posted 1 month ago
4.0 - 6.0 years
6 - 8 Lacs
Bengaluru, Bellandur
Hybrid
Hiring an AWS Data Engineer for a 6-month hybrid contractual role based in Bellandur, Bengaluru. The ideal candidate will have 4-6 years of experience in data engineering, with strong expertise in AWS services (S3, EC2, RDS, Lambda, EKS), PostgreSQL, Redis, Apache Iceberg, and Graph/Vector Databases. Proficiency in Python or Golang is essential. Responsibilities include designing and optimizing data pipelines on AWS, managing structured and in-memory data, implementing advanced analytics with vector/graph databases, and collaborating with cross-functional teams. Prior experience with CI/CD and containerization (Docker/Kubernetes) is a plus.
Posted 1 month ago
7.0 - 12.0 years
1 - 2 Lacs
Hyderabad
Remote
Role & responsibilities

We are looking for a highly experienced Senior Cloud Data Engineer to lead the design, development, and optimization of our cloud-based data infrastructure. This role requires deep technical expertise in AWS services, data engineering best practices, and infrastructure automation. You will be instrumental in shaping our data architecture and enabling data-driven decision-making across the organization.

Key Responsibilities:
• Design, build, and maintain scalable and secure data pipelines using AWS Glue, Redshift, and Python.
• Develop and optimize SQL queries and stored procedures for complex data transformations and migrations.
• Automate infrastructure provisioning and deployment using Terraform, ensuring repeatability and compliance.
• Architect and implement data lake and data warehouse solutions on AWS.
• Collaborate with cross-functional teams including data scientists, analysts, and DevOps to deliver high-quality data solutions.
• Monitor, troubleshoot, and optimize data workflows for performance, reliability, and cost-efficiency.
• Implement data quality checks, validation frameworks, and monitoring tools.
• Ensure data security, privacy, and compliance with industry standards and regulations.
• Lead code reviews, mentor junior engineers, and promote best practices in data engineering.
• Participate in capacity planning, cost optimization, and performance tuning of cloud data infrastructure.
• Evaluate and integrate new tools and technologies to improve data engineering capabilities.
• Document technical designs, processes, and operational procedures.
• Support business intelligence and analytics teams by ensuring timely and accurate data availability.

Required Skills & Experience:
• 10+ years of experience in data engineering or cloud data architecture.
• Strong expertise in AWS Redshift, including schema design, performance tuning, and workload management.
• Proficiency in SQL and stored procedures for ETL and data migration tasks.
• Hands-on experience with Terraform for infrastructure as code (IaC) in AWS environments.
• Deep knowledge of AWS Glue for ETL orchestration and job development.
• Advanced programming skills in Python, especially for data processing and automation.
• Solid understanding of data warehousing, data lakes, and cloud-native data architectures.

Preferred candidate profile:
• AWS certifications (e.g., AWS Certified Data Analytics – Specialty, AWS Certified Solutions Architect).
• Experience with CI/CD pipelines and DevOps practices.
• Familiarity with additional AWS services such as S3, Lambda, CloudWatch, Step Functions, and IAM.
• Knowledge of data governance, lineage, and cataloging tools (e.g., AWS Glue Data Catalog, Apache Atlas).
• Experience with real-time data processing frameworks (e.g., Kinesis, Kafka, Spark Streaming).
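The posting leans on SQL stored procedures for Redshift transformations and migrations. One pattern that comes up constantly there is the staged "upsert": delete target rows that match the staging keys, then insert the staging rows, inside one transaction. A minimal sketch that generates that SQL from Python (table and column names below are placeholders, not from the posting):

```python
# Build a Redshift-style staged upsert as a single transactional script.
# This is the classic delete-then-insert pattern; newer Redshift also offers
# MERGE, so treat this as one option rather than the canonical approach.

def build_upsert_sql(target: str, staging: str, key: str, columns: list) -> str:
    cols = ", ".join(columns)
    return (
        f"BEGIN;\n"
        f"DELETE FROM {target} USING {staging} "
        f"WHERE {target}.{key} = {staging}.{key};\n"
        f"INSERT INTO {target} ({cols}) SELECT {cols} FROM {staging};\n"
        f"END;"
    )
```

Generating the script rather than hand-writing it per table keeps migrations uniform and easy to review; in practice the identifiers should come from a vetted catalog, never from user input.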
Posted 1 month ago
5.0 - 10.0 years
15 - 25 Lacs
Noida, New Delhi, Delhi / NCR
Hybrid
• Build and manage data infrastructure on AWS, including S3, Glue, Lambda, OpenSearch, Athena, and CloudWatch, using IaC tools such as Terraform.
• Design and implement scalable ETL pipelines with integrated validation and monitoring.
• Set up data quality frameworks using tools like Great Expectations, integrated with PostgreSQL or AWS Glue jobs.
• Implement automated validation checks at key points in the data flow: post-ingest, post-transform, and pre-load.
• Build centralized logging and alerting pipelines (e.g., using CloudWatch Logs, Fluent Bit, SNS, Filebeat, Logstash, or third-party tools).
• Define CI/CD processes for deploying and testing data pipelines (e.g., using Jenkins, GitHub Actions).
• Collaborate with developers and data engineers to enforce schema versioning, rollback strategies, and data contract enforcement.

Preferred candidate profile:
• 5+ years of experience in DataOps, DevOps, or data infrastructure roles.
• Proven experience with infrastructure as code (e.g., Terraform, CloudFormation).
• Proven experience with real-time data streaming platforms (e.g., Kinesis, Kafka).
• Proven experience building production-grade data pipelines and monitoring systems in AWS.
• Hands-on experience with tools like AWS Glue, S3, Lambda, Athena, and CloudWatch.
• Strong knowledge of Python and scripting for automation and orchestration.
• Familiarity with data validation frameworks such as Great Expectations, Deequ, or dbt tests.
• Experience with SQL-based data systems (e.g., PostgreSQL).
• Understanding of security, IAM, and compliance best practices in cloud data environments.
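The posting places validation checks at fixed points in the flow (post-ingest, post-transform, pre-load). The idea can be sketched with a minimal hand-rolled checker in the spirit of Great Expectations — this is not its API, just the shape of the technique — returning one result per rule so failures can be logged or routed to alerting:

```python
# Minimal expectation-style checker: each rule yields a pass/fail record,
# so a pipeline stage can run the same checks post-ingest, post-transform,
# and pre-load, and alert on any failure. Rule names are illustrative.

def run_checks(rows: list, checks: dict) -> list:
    results = []
    for column, rules in checks.items():
        values = [r.get(column) for r in rows]
        if rules.get("not_null"):
            results.append({"column": column, "check": "not_null",
                            "passed": all(v is not None for v in values)})
        if rules.get("unique"):
            non_null = [v for v in values if v is not None]
            results.append({"column": column, "check": "unique",
                            "passed": len(non_null) == len(set(non_null))})
    return results
```

In a real deployment the rules would live in version-controlled config next to the pipeline definition, and a failed pre-load check would block the load rather than just report.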
Posted 1 month ago
3.0 - 6.0 years
18 - 25 Lacs
Bengaluru
Hybrid
Join the team designing and developing innovative software solutions to meet client needs while providing expert technical support.

Hey, thanks so much for joining us! We totally get how exhausting job hunting can be, so let's dive right in and share what we've got for you.

Who we are and what we offer at Nextpoint

Nextpoint delivers transformative software and services for all law-kind. Our award-winning team is 100% focused on making it simple, fluid, and affordable for law firms of all sizes to win the day, with streamlined ediscovery workflows, simplified case management, and best-in-class security at every point. Our secure, cloud-based solution lets teams begin document review in minutes with powerful data analytics tools, a user-friendly interface, and collaborative access from anywhere. Innovative case prep and presentation features exceed expectations of what smart eLaw software can do. We're problem solvers, simplifiers, and challenge seekers, all united by a shared goal: fostering a happy workplace and delivering great results for our clients. At Nextpoint, we value innovation, creativity, diversity, and initiative, thriving in a tech-forward environment where our team enjoys coming to work. If you thrive in a relaxed, informal setting and love taking on new challenges, Nextpoint is the perfect place for you to grow and be rewarded.

What you'd be doing in this role

As our Cloud Developer, you will play a critical role in the delivery of cloud-based solutions that support our clients' success. The role embeds you with a motivated development team where you will identify, plan, and execute tasks that deliver real value to our clients. You'll play multiple roles and contribute to our mission.

The impact you'll make:
• Build new tools and solutions to enable client self-service.
• Apply comprehensive knowledge of designing and developing high-quality, efficient software.
• Define and document best practices and strategies.
• Seek ways to push and improve Nextpoint's development processes by introducing new methods and technologies.
• Deliver software solutions to increase our clients' command of their data.
• Identify and track meaningful metrics.

Your expertise:
• 3+ years of cloud software or infrastructure architecture experience.
• 3+ years of experience with core AWS services and microservices architecture.
• Strong grasp of security best practices across AWS S3, IAM, VPC, Batch, Step Functions, and AWS Lambda is mandatory.
• Working knowledge of MySQL is mandatory.
• Working knowledge of Python and Elastic/dtSearch is desirable.
• Knowledge of web services such as API, REST, and RPC.
• 2+ years of experience in RoR development is a major plus.
• Passionate software engineer who contributes to defining, implementing, and maintaining highly scalable, distributed, and resilient software applications.
• Ability to effectively communicate technical information, both orally and in writing.
• Great team member; a strategic and analytical thinker who takes responsibility and works collaboratively.
• Experience working in an Agile environment.
• Legal technology experience is a plus, but not required.

What we bring to the table besides salary:
• Competitive pay that matches your experience
• Flexible hybrid schedule (2-3 days in-person)
• Annual professional development stipend to keep growing
• Most importantly, a psychologically safe and inclusive environment where everyone feels welcome and supported
• And more good stuff!

Don't just take our word for it that we're awesome:
• 50 Best Startups to Work For in Chicago, IL (2025)
• 100 Best Places to Work in Chicago, IL (2024)
• 50 Best Startups to Work For in the US (2024)
• Glassdoor, G2, Capterra, Nextpoint Culture

Equal Opportunity Employer

Nextpoint is an equal opportunity employer. We strive to foster a diverse workplace, actively seeking to recruit, retain, and promote people of color and LGBTQ+ candidates.
All applicants will be considered for employment without attention to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran or disability status. If you are interested in our company, we encourage you to apply even if you do not see an applicable job description, as we are growing fast and are always on the lookout for high-performing, curious, and entrepreneurial-minded individuals to join our team!
Posted 1 month ago
5.0 - 7.0 years
9 - 14 Lacs
Noida
Work from Office
We are seeking a skilled AWS Databricks Platform Administrator to manage and optimize our Databricks environment. The ideal candidate will have strong expertise in user access management and user persona development, plus the ability to collaborate with architects to implement configuration changes. This role involves ensuring the security, performance, and reliability of the Databricks platform while supporting users and maintaining compliance with organizational policies.

• Good experience with the SDLC
• Databricks platform administration is a must
• Must have security and access control experience, including user provisioning
• Services integration experience
• Should be able to work with enterprise architects
• Good to have: API experience

Required Skills & Qualifications:
• 5-7 years of experience as a Databricks Administrator or in a similar role.
• Strong experience with AWS services (IAM, S3, EC2, Lambda, Glue, etc.).
• Expertise in Databricks administration, workspace management, and security configurations.
• Hands-on experience with AD groups, user access management, RBAC, and IAM policies.
• Experience in developing and managing user personas within enterprise environments.
• Strong understanding of network security, authentication, and data governance.
• Proficiency in Python, SQL, and Spark for troubleshooting and automation.
• Familiarity with Terraform, CloudFormation, or Infrastructure as Code (IaC) is a plus.
• Knowledge of CI/CD pipelines and DevOps best practices is desirable.
• Excellent communication and documentation skills.

Preferred Certifications (nice to have):
• AWS Certified Solutions Architect (Associate/Professional)
• Databricks Certified Data Engineer / Administrator
• Certified Information Systems Security Professional (CISSP)

Mandatory Competencies: Data Science - Databricks, Cloud - AWS, Cloud - Azure, Cloud - AWS Lambda, Data on Cloud - AWS S3, Python - Python, Database - SQL, Big Data - SPARK, Beh - Communication and collaboration
Posted 1 month ago
4.0 - 9.0 years
8 - 16 Lacs
Kolkata
Remote
Enhance/modify applications, configure existing systems, and provide user support.

• .NET full stack or [Angular 18+ developer + .NET backend]
• SQL Server
• Angular version 18+ (nice to have)
• Angular version 15+ (mandatory)
Posted 1 month ago
6.0 - 8.0 years
8 - 15 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Location: HYD/BGL/CHN; Experience: 5-8 years; Notice period: immediate to 15 days.

• Design, develop, and maintain robust and scalable server-side applications using Node.js and JavaScript/TypeScript.
• Develop and consume RESTful APIs and integrate with third-party services.
• In-depth knowledge of AWS cloud, including familiarity with services such as S3, Lambda, DynamoDB, Glue, Apache Airflow, SQS, SNS, ECS, Step Functions, EMR (Elastic MapReduce), EKS (Elastic Kubernetes Service), and Key Management Service.
• Hands-on experience with Terraform.
• Specialization in designing and developing fully automated end-to-end data processing pipelines for large-scale data ingestion, curation, and transformation.
• Experience in deploying Spark-based ingestion frameworks, testing automation tools, and CI/CD pipelines.
• Knowledge of unit testing frameworks and best practices.
• Working experience with SQL and NoSQL (preferred) databases, including joins, aggregations, window functions, date functions, partitions, indexing, and performance improvement ideas.
• Experience with database systems such as Oracle, MySQL, PostgreSQL, MongoDB, or other NoSQL databases.
• Familiarity with ORM/ODM libraries (e.g., Sequelize, Mongoose).
• Proficiency in using Git for version control.
• Understanding of testing frameworks (e.g., Jest, Mocha, Chai) and writing unit and integration tests.
• Collaborate with front-end developers to integrate user-facing elements with server-side logic.
• Design and implement efficient database schemas and ensure data integrity.
• Write clean, well-documented, and testable code.
• Participate in code reviews to ensure code quality and adherence to coding standards.
• Troubleshoot and debug issues in development and production environments.
• Knowledge of security best practices for web applications (authentication, authorization, data validation).
• Strong communication and collaboration skills, with the ability to interact with technical and non-technical stakeholders.
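The posting asks specifically for window functions and partitions. The "latest row per key" ranking pattern is a good concrete example; the standard-library sqlite3 module is used below purely as a stand-in engine (the posting's stack is Node.js/Postgres), since the same SQL runs on PostgreSQL and MySQL 8+. Table and data are made up for illustration.

```python
# Demonstrate ROW_NUMBER() OVER (PARTITION BY ... ORDER BY ...) to pick the
# most recent order per customer. Requires SQLite >= 3.25 for window functions.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, placed_at TEXT, total REAL);
    INSERT INTO orders VALUES
        ('alice', '2024-01-01', 10.0),
        ('alice', '2024-02-01', 25.0),
        ('bob',   '2024-01-15', 7.5);
""")
latest = conn.execute("""
    SELECT customer, total FROM (
        SELECT customer, total,
               ROW_NUMBER() OVER (PARTITION BY customer
                                  ORDER BY placed_at DESC) AS rn
        FROM orders
    ) WHERE rn = 1
    ORDER BY customer
""").fetchall()
# latest -> [('alice', 25.0), ('bob', 7.5)]
```

The subquery ranks rows inside each customer partition; filtering on `rn = 1` keeps one row per partition, which is usually far cheaper and clearer than a self-join on MAX(placed_at).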
Posted 1 month ago
7.0 - 12.0 years
25 - 40 Lacs
Bengaluru
Hybrid
Role - Data Engineer Experience - 7+ Years Notice - Immediate Skills - AWS (S3, Glue, Lambda, EC2), Spark, Pyspark, Python, Airflow
Posted 1 month ago
7.0 - 12.0 years
12 - 22 Lacs
Hyderabad
Work from Office
We are looking to hire an experienced and enthusiastic AWS DevOps Engineer to provide hands-on development and support for all aspects of our AWS infrastructure and CI/CD environment. The primary responsibility of this role will be to build a scalable AI platform that includes configurable workflows and flexible APIs. The role will involve a mix of independent work and collaboration with engineering teams focused on AWS infrastructure and automation.

Typical Duties and Responsibilities:
1. Deploy, maintain, and manage AWS systems.
2. Ensure AWS systems are available, reliable, secure, and scalable.
3. Provide primary support for cloud and enterprise deployments.
4. Analyze manual processes to identify those that can be automated.
5. Maintain and improve the organization's cloud infrastructure.
6. Collaborate with the core engineering team to lead the organization's platform security efforts.
7. Develop policies and standards for the CI/CD environment.
8. Infrastructure as Code (IaC): experience with AWS CloudFormation.
9. Hands-on experience in deploying AWS Lambda, API Gateway, Step Functions, and event-driven architectures using AWS SAM.
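The Lambda + API Gateway combination in duty 9 is ultimately just a function over a dict. The sketch below follows the documented Lambda proxy-integration contract (`body` in the event; `statusCode`/`headers`/`body` in the response), so it can be unit-tested with no AWS runtime at all; the greeting logic is a placeholder, not anything from the posting.

```python
# Minimal Lambda handler for an API Gateway proxy integration.
# The event/response field names follow the documented proxy contract.
import json

def handler(event: dict, context=None) -> dict:
    try:
        payload = json.loads(event.get("body") or "{}")
    except json.JSONDecodeError:
        return {"statusCode": 400, "body": json.dumps({"error": "invalid JSON"})}
    name = payload.get("name", "world")  # placeholder business logic
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello {name}"}),
    }
```

With AWS SAM this function would be declared as an `AWS::Serverless::Function` with an API event source; locally it runs under `sam local invoke` or plain pytest, which is the point of keeping the handler pure.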
Posted 1 month ago
6.0 - 8.0 years
8 - 10 Lacs
Noida, India
Work from Office
Full-stack developer with 6-8 years of experience in designing and developing robust, scalable, and maintainable applications applying Object-Oriented Design principles.

• Strong experience in Spring frameworks such as Spring Boot, Spring Batch, and Spring Data, plus Hibernate and JPA.
• Strong experience in microservices architecture and implementation.
• Strong knowledge of HTML, CSS, JavaScript, and React.
• Experience with SOAP web services, REST web services, and the Java Messaging Service (JMS) API.
• Familiarity with designing, developing, and deploying web applications using Amazon Web Services (AWS).
• Good experience with AWS services: S3, Lambda, SQS, SNS, DynamoDB, IAM, and API Gateway.
• Hands-on experience in SQL and PL/SQL, with the ability to write complex queries.
• Hands-on experience with REST APIs.
• Experience with version control systems (e.g., Git).
• Knowledge of web standards and accessibility guidelines.
• Knowledge of CI/CD pipelines and experience with tools such as JIRA, Splunk, and SonarQube.
• Strong analytical and problem-solving abilities.
• Good experience in JUnit testing and mocking techniques.
• Experience in SDLC processes (Waterfall/Agile), Docker, Git, and SonarQube.
• Excellent communication and interpersonal skills; ability to work independently and as part of a team.

Mandatory Competencies: Java - Core Java, Others - Microservices, Java Fullstack - React JS, Java Fullstack - HTML CSS, Java - Spring Framework Core, Java Others - Spring Boot, Cloud - AWS, Java Others - Spring Batch, Java - Hibernate/JPA, Java Fullstack - JavaScript, Data on Cloud - AWS S3, Cloud - AWS Lambda, Java - SQL, Agile - Agile, Java Fullstack - WebServices/REST, Fundamental Technical Skills - Spring Framework/Hibernate/JUnit etc., Beh - Communication and collaboration
Posted 1 month ago
6.0 - 8.0 years
8 - 10 Lacs
Noida
Work from Office
Full-stack developer with 6-8 years of experience in designing and developing robust, scalable, and maintainable applications applying Object-Oriented Design principles.

• Strong experience in Spring frameworks such as Spring Boot, Spring Batch, and Spring Data, plus Hibernate and JPA.
• Strong experience in microservices architecture and implementation.
• Strong knowledge of HTML, CSS, JavaScript, and Angular.
• Experience with SOAP web services, REST web services, and the Java Messaging Service (JMS) API.
• Familiarity with designing, developing, and deploying web applications using Amazon Web Services (AWS).
• Good experience with AWS services: S3, Lambda, SQS, SNS, DynamoDB, IAM, and API Gateway.
• Hands-on experience in SQL and PL/SQL, with the ability to write complex queries.
• Hands-on experience with REST APIs.
• Experience with version control systems (e.g., Git).
• Knowledge of web standards and accessibility guidelines.
• Knowledge of CI/CD pipelines and experience with tools such as JIRA, Splunk, and SonarQube.
• Strong analytical and problem-solving abilities.
• Good experience in JUnit testing and mocking techniques.
• Experience in SDLC processes (Waterfall/Agile), Docker, Git, and SonarQube.
• Excellent communication and interpersonal skills; ability to work independently and as part of a team.

Mandatory Competencies: Java - Core Java, Architecture - Micro Service, Others - Microservices, Java Fullstack - Angular 2+, Java Fullstack - HTML CSS, Java - Spring Framework Core, Cloud - AWS, Cloud - AWS Lambda, Java Others - Spring Boot, Java Others - Spring Batch, Java - Hibernate/JPA, Java Fullstack - JavaScript, Java Fullstack - WebServices/REST, Data on Cloud - AWS S3, Java - SQL, Database - PL/SQL, Python - Rest API, DevOps - Git, DevOps - CI/CD, Fundamental Technical Skills - Spring Framework/Hibernate/JUnit etc., DevOps - Docker, Agile - Agile, Beh - Communication and collaboration
Posted 1 month ago
8.0 - 13.0 years
15 - 30 Lacs
Hyderabad
Work from Office
• 8-12 years of cloud software development experience under Agile development life-cycle processes and tools.
• Experience in microservices architecture.
• Experience in Node.js applications.
• Strong knowledge of AWS cloud platform services such as AWS Lambda, GraphQL, NoSQL databases, AWS Kinesis, and queues/topics such as SNS and SQS.
• Strong in Object-Oriented Analysis & Design (OOAD); programming languages used for cloud: Java, C++, Go.
• Strong experience with PaaS and IaaS cloud computing.
• Good hands-on experience with serverless frameworks, the AWS JS SDK, and AWS services such as Lambda, SNS, SES, SQS, SSM, S3, EC2, IAM, CloudWatch, Kinesis, and CloudFormation.
• Experience in Agile methodology with tools like JIRA, Git, GitLab, SVN, and Bitbucket as an active scrum member.
• Good to have: Okta or OAuth2 knowledge.
• IoT and Sparkplug B knowledge or work experience is an added advantage.
Posted 1 month ago
3.0 - 8.0 years
9 - 30 Lacs
Mumbai
Work from Office
Responsibilities:
• Design, develop, test, and deploy Java apps on AWS using services such as Lambda, DynamoDB, API Gateway, and SNS.
• Monitor app performance with Amazon CloudWatch and optimize resources for efficiency.

Benefits: health insurance, provident fund, annual bonus.
Posted 1 month ago
6.0 - 11.0 years
20 - 35 Lacs
Gurugram
Hybrid
Must-Have Skills (Core Requirements)

Look for resumes that mention hands-on experience with:
• Amazon S3 – storing and organizing data
• AWS Glue – running ETL jobs (basic PySpark knowledge is a plus)
• Glue Catalog – maintaining metadata for datasets
• Amazon Athena – querying data using SQL
• Parquet or CSV – basic familiarity with data file formats
• AWS Lambda – for simple automation or triggers
• Basic IAM knowledge – setting up access permissions
• CloudWatch – monitoring jobs or logs
• Understanding of ETL/ELT pipelines

Good-to-Have Skills (Preferred but not mandatory)

These add value but are not essential at this level:
• AWS Lake Formation – access control and permissions
• Apache Airflow or Step Functions – workflow orchestration
• Amazon Redshift – experience with data warehouse usage
• AWS DMS or Kinesis – for data ingestion
• Terraform or CloudFormation – for infrastructure setup
• Exposure to QuickSight or other dashboarding tools
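The S3 + Glue Catalog + Athena trio above works best when objects are laid out in Hive-style `key=value` prefixes, because Athena can then prune partitions instead of scanning the whole dataset. A tiny helper that builds those prefixes deterministically is worth sketching; the bucket layout and dataset name below are illustrative assumptions.

```python
# Build a Hive-style partition prefix (year=/month=/day=) for an S3 dataset,
# the layout Glue crawlers and Athena partition projection both understand.
from datetime import date

def partition_prefix(dataset: str, day: date) -> str:
    return (f"{dataset}/year={day.year:04d}/"
            f"month={day.month:02d}/day={day.day:02d}/")

# partition_prefix("events", date(2024, 3, 7))
# -> "events/year=2024/month=03/day=07/"
```

An ETL job would write objects under `s3://<bucket>/` plus this prefix, and an Athena query filtering on `year`, `month`, and `day` then reads only the matching prefixes.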
Posted 1 month ago
4.0 - 6.0 years
6 - 9 Lacs
Ahmedabad
Work from Office
Role Overview:

As a DevOps Engineer at ChartIQ, you'll play a critical role not only in building, maintaining, and scaling the infrastructure that supports our development and QA needs, but also in driving new, exciting cloud-based solutions that will add to our offerings. Your work will ensure that the platforms used by our team remain available, responsive, and high-performing. In addition to maintaining the current infrastructure, you will contribute to the development of new cloud-based solutions, helping us expand and enhance our platform's capabilities to meet the growing needs of our financial services customers. You will also contribute to light JavaScript programming, assist with QA testing, and troubleshoot production issues. Working in a fast-paced, collaborative environment, you'll wear multiple hats and support the infrastructure for a wide range of development teams. This position is based in Ahmedabad, India, and requires working overlapping hours with teams in the US; the preferred working hours extend until 12 noon EST to ensure effective collaboration across time zones.

Key Responsibilities:
• Design, implement, and manage infrastructure using Terraform or other Infrastructure-as-Code (IaC) tools.
• Leverage AWS or equivalent cloud platforms to build and maintain scalable, high-performance infrastructure that supports data-heavy applications and JavaScript-based visualizations.
• Understand component-based architecture and cloud-native applications.
• Implement and maintain site reliability practices, including monitoring and alerting using tools like DataDog, ensuring the platform's availability and responsiveness across all environments.
• Design and deploy high-availability architecture to support continuous access to alerting engines.
• Support and maintain configuration management systems like ServiceNow CMDB.
• Manage and optimize CI/CD workflows using GitHub Actions or similar automation tools.
• Work with OIDC (OpenID Connect) integrations across Microsoft, AWS, GitHub, and Okta to ensure secure access and authentication.
• Contribute to QA testing (both manual and automated) to ensure high-quality releases and stable operation of our data visualization tools and alerting systems.
• Participate in light JavaScript programming tasks, including HTML and CSS fixes for our charting library.
• Assist with deploying and maintaining mobile applications on the Apple App Store and Google Play Store.
• Troubleshoot and manage network issues, ensuring smooth data flow and secure access to all necessary environments.
• Collaborate with developers and other engineers to troubleshoot and optimize production issues.
• Help with the deployment pipeline, working with various teams to ensure smooth software releases and updates for our library and related services.

Required Qualifications:
• Proficiency with Terraform or other Infrastructure-as-Code tools.
• Experience with AWS or other cloud services (Azure, Google Cloud, etc.).
• Solid understanding of component-based architecture and cloud-native applications.
• Experience with site reliability tools like DataDog for monitoring and alerting.
• Experience designing and deploying high-availability architecture for web-based applications.
• Familiarity with ServiceNow CMDB and other configuration management tools.
• Experience with GitHub Actions or other CI/CD platforms to manage automation pipelines.
• Strong understanding and practical experience with OIDC integrations across platforms like Microsoft, AWS, GitHub, and Okta.
• Solid QA testing experience, including manual and automated testing techniques (beginner/intermediate).
• JavaScript, HTML, and CSS skills to assist with troubleshooting and web app development.
• Experience deploying and maintaining mobile apps on the Apple App Store and Google Play Store that utilize web-based charting libraries.
• Basic network management skills, including troubleshooting and ensuring smooth network operations for data-heavy applications.
• Knowledge of package publishing tools such as Maven, Node, and CocoaPods to ensure seamless dependency management and distribution across platforms.

Additional Skills and Traits for Success in a Startup-Like Environment:
• Ability to wear multiple hats: adapt to the ever-changing needs of a startup environment within a global organization.
• Self-starter with a proactive attitude, able to work independently and manage your time effectively.
• Strong communication skills to work with cross-functional teams, including engineering, QA, and product teams.
• Ability to work in a fast-paced, high-energy environment.
• Familiarity with agile methodologies and working in small teams with a flexible approach to meeting deadlines.
• Basic troubleshooting skills to resolve infrastructure or code-related issues quickly.
• Knowledge of containerization tools such as Docker and Kubernetes is a plus.
• Understanding of DevSecOps and basic security practices is a plus.

Preferred Qualifications:
• Experience with CI/CD pipeline management, automation, and deployment strategies.
• Familiarity with serverless architectures and AWS Lambda.
• Experience with monitoring and logging frameworks such as Prometheus, Grafana, or similar.
• Experience with Git, version control workflows, and source code management.
• Security-focused mindset, experience with vulnerability scanning, and managing secure application environments.
Posted 1 month ago
8.0 - 13.0 years
12 - 48 Lacs
Hyderabad
Work from Office
We are seeking a Backend Lead Developer with strong expertise in Python and AWS serverless technologies to lead the backend development of our AI-driven product. Benefits: health insurance, provident fund.
Posted 1 month ago
5.0 - 9.0 years
3 - 7 Lacs
Noida
Work from Office
We are looking for a skilled Python Developer with 5 to 9 years of experience to design, develop, and maintain serverless applications using Python and AWS technologies. The ideal candidate will have extensive experience in building scalable, high-performance back-end systems and a deep understanding of AWS serverless services such as Lambda, DynamoDB, SNS, SQS, and S3. This role is based in Bangalore and Mumbai.

Roles and Responsibility:
• Design and implement robust, scalable, and secure back-end services using Python and AWS serverless technologies.
• Build and maintain serverless applications leveraging AWS Lambda, DynamoDB, API Gateway, S3, SNS, SQS, and other AWS services.
• Provide technical leadership and mentorship to a team of engineers, promoting best practices in software development, testing, and DevOps.
• Collaborate closely with cross-functional teams including front-end developers, product managers, and DevOps engineers to deliver high-quality solutions that meet business needs.
• Implement and manage CI/CD pipelines, automated testing, and monitoring to ensure high availability and rapid deployment of services.
• Optimize back-end services for performance, scalability, and cost-effectiveness, ensuring efficient use of AWS resources.
• Ensure all solutions adhere to industry best practices for security, including data protection, access controls, and encryption.
• Create and maintain comprehensive technical documentation, including architecture diagrams, API documentation, and deployment guides.
• Diagnose and resolve complex technical issues in production environments, ensuring minimal downtime and disruption.
• Stay updated with the latest trends and best practices in Python, AWS serverless technologies, and fintech/banking technology stacks, and apply this knowledge to improve our systems.

Requirements:
• Minimum 7 years of experience in back-end software development, with at least 5 years of hands-on experience in Python.
• Extensive experience with AWS serverless technologies, including Lambda, DynamoDB, API Gateway, S3, SNS, SQS, ECS, EKS, and other related services.
• Proven experience in leading technical teams and delivering complex, scalable cloud-based solutions in the fintech or banking sectors.
• Strong proficiency in Python and related frameworks (e.g., Flask, Django).
• Deep understanding of AWS serverless architecture and best practices.
• Experience with infrastructure as code (IaC) tools such as AWS CloudFormation or Terraform.
• Familiarity with RESTful APIs, microservices architecture, and event-driven systems.
• Knowledge of DevOps practices, including CI/CD pipelines, automated testing, and monitoring using AWS services (e.g., CodePipeline, CloudWatch, X-Ray).
• Demonstrated ability to lead and mentor engineering teams, fostering a culture of collaboration, innovation, and continuous improvement.
• Strong analytical and problem-solving skills, with the ability to troubleshoot and resolve complex technical issues in a fast-paced environment.
• Excellent verbal and written communication skills, with the ability to effectively convey technical concepts to both technical and non-technical stakeholders.
• Experience with other cloud platforms (e.g., Azure, GCP) and containerization technologies like Docker and Kubernetes.
• Familiarity with financial services industry regulations and compliance requirements.
• Relevant certifications such as AWS Certified Solutions Architect, AWS Certified Developer, or similar.
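For the event-driven side of this stack, an SQS-triggered Lambda is the workhorse. The sketch below follows the documented SQS event shape (`Records` with `messageId` and `body`) and the partial-batch-response contract (`batchItemFailures`), so a poison message is retried or dead-lettered on its own instead of failing the whole batch; the `process` rule is a placeholder, not the posting's logic.

```python
# SQS-triggered Lambda with partial batch failure reporting.
# Event and response field names follow the documented contracts.
import json

def handler(event: dict, context=None) -> dict:
    failures = []
    for record in event.get("Records", []):
        try:
            payload = json.loads(record["body"])
            process(payload)
        except Exception:
            # Report only this message as failed; the rest of the batch
            # is considered processed and will be deleted from the queue.
            failures.append({"itemIdentifier": record["messageId"]})
    return {"batchItemFailures": failures}

def process(payload: dict) -> None:
    """Placeholder business rule: require an order_id field."""
    if "order_id" not in payload:
        raise ValueError("missing order_id")
```

Note that partial batch responses only take effect when the event source mapping is configured with `ReportBatchItemFailures`; without it, any raised exception retries the entire batch.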
Posted 1 month ago
6.0 - 8.0 years
2 - 6 Lacs
Pune
Work from Office
We are looking for a skilled Python AWS Developer with 6 to 8 years of experience. The ideal candidate will have expertise in developing scalable and efficient applications on the AWS platform.

Roles and Responsibilities
Design, develop, and deploy scalable and efficient applications on the AWS platform.
Collaborate with cross-functional teams to identify and prioritize project requirements.
Develop high-quality code that meets industry standards and best practices.
Troubleshoot and resolve technical issues efficiently.
Participate in code reviews and contribute to improving overall code quality.
Stay updated with the latest trends and technologies in Python and AWS development.

Job Requirements
Strong proficiency in the Python programming language.
Experience with AWS services such as EC2, S3, Lambda, etc.
Knowledge of database management systems such as MySQL or PostgreSQL.
Familiarity with agile development methodologies and version control systems like Git.
Excellent problem-solving skills and attention to detail.
Ability to work collaboratively in a team environment.

Additional Info
The company name is Apptad Technologies Pvt Ltd., and the industry is Employment Firms/Recruitment Services Firms.
Posted 1 month ago
5.0 - 10.0 years
4 - 8 Lacs
Noida
Work from Office
We are looking for a skilled Database Engineer with 5 to 10 years of experience to design, develop, and maintain our database infrastructure. This position is based remotely.

Roles and Responsibilities
Design, develop, and maintain the database infrastructure to store and manage company data efficiently and securely.
Work with databases of varying scales, including small-scale and big-data processing.
Implement data security measures to protect sensitive information and comply with relevant regulations.
Optimize database performance by analyzing query execution plans, implementing indexing strategies, and improving data retrieval and storage mechanisms.
Collaborate with cross-functional teams to understand data requirements and support the design of the database architecture.
Migrate data from spreadsheets or other sources to relational database systems or cloud-based solutions like Google BigQuery and AWS.
Develop import workflows and scripts to automate data import processes.
Ensure data integrity and enforce data quality standards, including data validation rules, constraints, and referential integrity.
Monitor database health and resolve issues, while collaborating with the full-stack web developer to implement efficient data access and retrieval mechanisms.
Demonstrate creativity in problem-solving and contribute ideas for improving data engineering processes and workflows, exploring third-party technologies as alternatives to legacy approaches for efficient data pipelines.
Embrace a learning mindset, staying updated with emerging database technologies, tools, and best practices, and use Python for tasks such as data manipulation, automation, and scripting.
Collaborate with the Data Research Engineer to estimate development efforts and meet project deadlines, taking accountability for achieving development milestones.
Prioritize tasks to ensure timely delivery in a fast-paced environment with rapidly changing priorities, while also collaborating with fellow members of the Data Research Engineering Team as required.
Perform tasks with precision and build reliable systems, leveraging online resources such as Stack Overflow, ChatGPT, Bard, etc. effectively, considering their capabilities and limitations.

Job Requirements
Proficiency in SQL and relational database management systems like PostgreSQL or MySQL, along with database design principles.
Strong familiarity with Python for scripting and data manipulation tasks; additional knowledge of Python OOP is advantageous.
Demonstrated problem-solving skills with a focus on optimizing database performance and automating data import processes.
Knowledge of cloud-based databases like AWS RDS and Google BigQuery.
Experience with data warehousing concepts and tools (e.g., Snowflake, Redshift) to support advanced analytics and reporting.
Skills in working with APIs for data ingestion or connecting third-party systems, which could streamline data acquisition processes.
Proficiency with tools like Prometheus, Grafana, or the ELK Stack for real-time database monitoring and health checks beyond basic troubleshooting.
Familiarity with continuous integration/continuous deployment (CI/CD) tools (e.g., Jenkins, GitHub Actions).
Deeper expertise in cloud platforms (e.g., AWS Lambda, GCP Dataflow) for serverless data processing or orchestration.
Knowledge of database development and administration concepts, especially with relational databases like PostgreSQL and MySQL.
Knowledge of Python programming, including data manipulation, automation, and object-oriented programming (OOP), with experience in modules such as Pandas, SQLAlchemy, gspread, PyDrive, and PySpark.
Knowledge of SQL and an understanding of database design principles, normalization, and indexing.
Knowledge of data migration, ETL (Extract, Transform, Load) processes, and integrating data from various sources.
Knowledge of data security best practices, including access controls, encryption, and compliance standards.
Strong problem-solving and analytical skills with attention to detail.
Creative and critical thinking.
Strong willingness to learn and expand knowledge in data engineering.
Familiarity with Agile development methodologies is a plus.
Experience with version control systems, such as Git, for collaborative development.
Ability to thrive in a fast-paced environment with rapidly changing priorities.
Ability to work collaboratively in a team environment.
Good and effective communication skills.
Comfortable with autonomy and the ability to work independently.

About Company
Marketplace is an experienced team of industry experts dedicated to helping readers make informed decisions and choose the right products with ease. We arm people with trusted advice and guidance, so they can make confident decisions and get back to doing the things they care about most.
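As a small illustration of the automated import workflow with data-validation rules and indexing described in this role, here is a minimal sketch using Python's built-in sqlite3 module as a stand-in for PostgreSQL/MySQL; the table, columns, and sample rows are hypothetical:

```python
import sqlite3

ROWS = [  # Hypothetical spreadsheet export: (name, email)
    ("Asha", "asha@example.com"),
    ("Ravi", "ravi@example.com"),
    ("", "bad-row"),  # fails the validation rule below
]

def valid(row):
    """Basic data-quality rule: non-empty name and a plausible email."""
    name, email = row
    return bool(name) and "@" in email

conn = sqlite3.connect(":memory:")
# Constraints (NOT NULL, UNIQUE) enforce integrity at the database level.
conn.execute(
    "CREATE TABLE contacts (id INTEGER PRIMARY KEY, name TEXT NOT NULL, email TEXT UNIQUE)"
)
# An index speeds up lookups by email, one of the simpler indexing strategies.
conn.execute("CREATE INDEX idx_contacts_email ON contacts(email)")

# Validate before loading, then insert in a single batch.
clean = [r for r in ROWS if valid(r)]
conn.executemany("INSERT INTO contacts (name, email) VALUES (?, ?)", clean)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM contacts").fetchone()[0]
print(count)  # 2: only the valid rows are imported
```

The same shape (validate, batch-insert via parameterized SQL, enforce constraints in the schema) carries over to PostgreSQL or MySQL drivers with minimal changes.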
Posted 1 month ago
2.0 - 6.0 years
4 - 8 Lacs
Noida, Nagpur, Pune
Work from Office
Job Title: Full-Stack Developer

Position Overview: We are an innovative tech company looking for a talented MERN Stack Developer to join our dynamic team. If you're passionate about building scalable, high-performance applications using MongoDB, Express.js, React, Node.js, AWS Lambda, and serverless technologies, we want to hear from you!

Location: Pune/Nagpur/Noida/Faridabad
Job Role: Full-time
Work mode: Work from office (Mon-Fri)

Key Responsibilities:
Develop and maintain scalable web applications using the MERN stack (MongoDB, Express.js, React, Node.js).
Implement serverless architecture using AWS Lambda, API Gateway, and other AWS services.
Utilize TypeScript for type-safe, maintainable code in both the frontend and backend.
Leverage React Hooks and the Context API for efficient state management.
Manage global state with Redux to ensure a smooth, high-performance user experience.
Work with Google AI libraries and integrate AI-based functionality into the application (e.g., image recognition, object detection).
Develop camera-based features that integrate with the front-end React application for AI-driven processing.
Design and optimize efficient APIs and manage integration with third-party services.
Collaborate closely with cross-functional teams to define features, troubleshoot issues, and improve the product.

Required Skills & Experience:
MERN Stack: proficiency with MongoDB, Express.js, React, and Node.js.
AWS Lambda & Serverless: hands-on experience building serverless applications with AWS Lambda, API Gateway, and related services.
TypeScript: strong experience with TypeScript for both client-side and server-side development.
React & Redux: in-depth knowledge of React Hooks, the Context API, and Redux for state management.
Google AI Libraries: familiarity with Google AI and machine learning libraries for implementing features like image recognition, object detection, or other AI-driven functionality.
Camera-Based Application Development: experience developing and integrating camera-based applications, including video/image processing and AI integration.
Version Control: proficient with Git and GitHub for version control and collaboration.
API Development: experience building and consuming RESTful APIs.

Preferred Qualifications:
Experience with GraphQL and Apollo Client.
Familiarity with other AI and ML libraries (TensorFlow, OpenCV, etc.).
Understanding of Docker and containerization.
Experience with CI/CD pipelines and Agile development methodologies.
Posted 1 month ago
7.0 - 12.0 years
8 - 12 Lacs
Noida
Work from Office
Job Title: IoT Stack Developers
Company: Apptad Technologies Pvt Ltd. (Employment Firms/Recruitment Services Firms)
Job Location:
Job Type: 6-month contractual
Experience: 7 to 12 years
Client: Vanguard Logistics
No. of positions: 2
Duration: Immediate
Ref: 6566532

Apptad is looking for an IoT profile. It is a long-term job opportunity with us.

Responsibilities:
Create a modern, scalable architecture for the OT data pipeline with AWS-native services.
Support development of global solutions and replace custom local solutions with standard global ones.
Develop and integrate IoT devices, sensors, and microcontrollers to enable seamless data collection and connectivity.
Hands-on experience with cloud platforms like AWS IoT Core, Microsoft Azure IoT Hub, or Google Cloud IoT.
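A typical early step in the OT data pipeline described above is normalizing raw device messages before they enter downstream AWS services. The sketch below assumes a JSON sensor payload; the field names and the `normalize_reading` helper are illustrative assumptions, not from the posting:

```python
import json
from datetime import datetime, timezone

def normalize_reading(raw: str) -> dict:
    """Parse a hypothetical JSON sensor payload and normalize it for the
    downstream pipeline: coerce the value to float and ensure a timestamp."""
    msg = json.loads(raw)
    return {
        "device_id": msg["device_id"],
        "metric": msg["metric"],
        "value": float(msg["value"]),  # devices may send numbers as strings
        # Fall back to ingestion time when the device omits its own timestamp.
        "ts": msg.get("ts") or datetime.now(timezone.utc).isoformat(),
    }

reading = normalize_reading('{"device_id": "dev-42", "metric": "temp_c", "value": "21.5"}')
```

In a real deployment this kind of function would sit behind an ingestion service (for example, an AWS IoT Core rule forwarding to a Lambda function), with rejected or malformed payloads routed to a dead-letter queue.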
Posted 1 month ago
8.0 - 12.0 years
15 - 30 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Lead design, development, and deployment of cloud-native and hybrid solutions on AWS and GCP. Ensure robust infrastructure using services like GKE, GCE, Cloud Functions, Cloud Run (GCP) and EC2, Lambda, ECS, S3, etc. (AWS).
Posted 1 month ago
6.0 - 11.0 years
25 - 40 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
-Design, build, and deploy cloud-native and hybrid solutions on AWS and GCP
-Experience in Glue, Athena, PySpark, Step Functions, Lambda, SQL, ETL, DWH, Python, EC2, EBS/EFS, CloudFront, Cloud Functions, Cloud Run (GCP), GKE, GCE, ECS, S3, etc.
Posted 1 month ago
5.0 - 10.0 years
15 - 30 Lacs
Bengaluru
Work from Office
Role & responsibilities
Skill: Java + AWS
Location: Bangalore
Experience: 5+ years
Notice period: Immediate joiners

Preferred candidate profile

Key Responsibilities:
-Design, develop, and maintain Java-based microservices using the Spring Boot framework.
-Proficient with Java 17 or 21; able to design and present in architecture forums.
-Expert-level understanding of event-driven architecture.
-Build RESTful APIs and integrate with external/internal services.
-Deploy and manage services on AWS cloud using tools like EC2, ECS/EKS, Lambda, S3, RDS, and API Gateway.
-Collaborate with front-end developers, DevOps, and QA teams to deliver high-quality software.
-Ensure best practices in code quality, performance, security, and scalability.
-Participate in Agile ceremonies, including sprint planning, stand-ups, and retrospectives.
-Write unit, integration, and performance tests to ensure code reliability.
-Monitor, troubleshoot, and optimize existing services in production.

Required Skills & Experience:
1. 5+ years of experience in backend development using Java, with strong expertise in Spring Boot, Spring Cloud, and building microservices.
2. Experience with REST APIs, JSON, and API integration.
3. Good knowledge of AWS services for deployment, storage, and compute.
4. Familiarity with CI/CD pipelines and tools like Jenkins, Git, Maven/Gradle.
5. Understanding of containerization using Docker and orchestration with Kubernetes (nice to have).
6. Experience with relational and NoSQL databases (e.g., MySQL, PostgreSQL, DynamoDB, MongoDB).
7. Solid understanding of application performance monitoring and logging tools.
Posted 1 month ago