Home
Jobs

21 AWS Lambda Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

3.0 - 5.0 years

7 - 13 Lacs

New Delhi, Gurugram, Delhi / NCR

Hybrid

Source: Naukri

Title: Infrastructure Engineer
Location: Gurugram, Haryana
Company: Morningstar is a leading provider of independent investment research in North America, Europe, Australia, and Asia. We offer a wide variety of products and solutions that serve market participants of all kinds, including individual and institutional investors in public and private capital markets, financial advisors, asset managers, retirement plan providers and sponsors, and issuers of securities. Morningstar India has been a Great Place to Work-certified company for the past eight consecutive years.
Role: As an Infrastructure Engineer, you will be at the forefront of deploying and maintaining the core infrastructure that powers the organization's technology landscape. This role requires a strategic thinker with hands-on expertise in infrastructure technologies, a strong grasp of project execution, and the ability to coordinate cross-functional efforts. You'll ensure that systems are resilient, secure, scalable, and high-performing, while driving innovation and efficiency.
Shift: General
Responsibilities:
• Infrastructure Design & Architecture
o Design and maintain robust, scalable, and secure infrastructure solutions that align with business goals.
o Partner with cross-functional teams to gather infrastructure requirements and recommend optimal solutions.
• System Implementation & Operations
o Deploy, configure, and manage infrastructure components including compute, storage, networking, and virtualization platforms.
o Monitor infrastructure health and performance, troubleshoot issues, and optimize systems for peak efficiency.
• Team Collaboration
o Work closely with DevOps, Security, and Development teams to ensure seamless delivery of infrastructure services.
• Security & Compliance
o Implement infrastructure security best practices, patch management, and hardening techniques.
o Support compliance initiatives and participate in internal and external security audits.
• Project Management
o Maintain documentation and drive continuous communication among stakeholders.
• Automation & Innovation
o Drive automation of infrastructure provisioning and management using tools like Terraform, Ansible, or similar.
o Stay current with emerging infrastructure trends and recommend improvements or adoptions that drive efficiency.
• Disaster Recovery & Business Continuity
o Design, implement, and regularly test disaster recovery and backup strategies to ensure system resiliency.
o Maintain and improve business continuity plans to minimize downtime and data loss.
Qualifications:
• 3-5 years of relevant professional experience in infrastructure and cloud services.
• Strong hands-on experience with AWS services including EC2, S3, IAM, Route 53, Lambda, Kinesis, ElastiCache, DynamoDB, Aurora, and Elasticsearch.
• Proficient in infrastructure automation using Terraform and AWS CloudFormation.
• Hands-on experience with configuration management tools such as Ansible, Chef, or Puppet.
• Strong working knowledge of CI/CD tools, particularly Jenkins.
• Proficient with Git and other version control systems.
• Hands-on experience with Docker and container orchestration using Amazon EKS or Kubernetes.
• Proficient in scripting with Bash and PowerShell.
• Solid experience with both Linux and Windows Server administration.
• Experience setting up monitoring, logging, and alerting solutions using tools like CloudWatch, Nagios, etc.
• Working knowledge of Python and the AWS Boto3 SDK.
Morningstar is an equal opportunity employer.
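
As a hedged illustration of the Boto3 and CloudWatch monitoring skills this listing asks for (not part of the posting itself), a minimal sketch that surfaces CloudWatch alarms currently firing might look like this; the region default is an assumption.

```python
# Minimal sketch: list CloudWatch alarms currently in the ALARM state.
# Assumes AWS credentials are configured; the region default is illustrative.
import boto3

def firing_alarms(region_name: str = "ap-south-1") -> list[dict]:
    """Return CloudWatch alarms currently in the ALARM state."""
    cloudwatch = boto3.client("cloudwatch", region_name=region_name)
    alarms = []
    paginator = cloudwatch.get_paginator("describe_alarms")
    for page in paginator.paginate(StateValue="ALARM"):
        alarms.extend(page["MetricAlarms"])
    return alarms

if __name__ == "__main__":
    for alarm in firing_alarms():
        print(f"{alarm['AlarmName']}: {alarm.get('StateReason', 'no reason recorded')}")
```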

Posted 11 hours ago

Apply

6.0 - 11.0 years

8 - 15 Lacs

Pune

Hybrid

Source: Naukri

- Experience developing applications using Python, Glue (ETL), Lambda, and Step Functions, alongside AWS EKS, S3, EMR, RDS data stores, CloudFront, and API Gateway.
- Experience with AWS services such as Amazon Elastic Compute Cloud (EC2), Glue, Amazon S3, EKS, and Lambda.
Required candidate profile:
- 7+ years of experience in software development and technical leadership, preferably with strong financial knowledge in building complex trading applications.
- Research and evaluate new technologies.
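
To ground the Glue (ETL) skill this listing names, a minimal AWS Glue job script might look like the hedged sketch below; the catalog database, table, and S3 path are placeholders, not details from the posting.

```python
# Hedged AWS Glue ETL sketch (PySpark): read a catalog table, filter, write Parquet to S3.
# Database/table names and the S3 path are illustrative placeholders.
import sys
from awsglue.transforms import Filter
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

trades = glue_context.create_dynamic_frame.from_catalog(
    database="trading_db", table_name="raw_trades"
)
settled = Filter.apply(frame=trades, f=lambda row: row["status"] == "SETTLED")

glue_context.write_dynamic_frame.from_options(
    frame=settled,
    connection_type="s3",
    connection_options={"path": "s3://example-bucket/curated/trades/"},
    format="parquet",
)
job.commit()
```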

Posted 2 days ago

Apply

5.0 - 10.0 years

22 - 37 Lacs

Pune, Gurugram, Bengaluru

Hybrid

Source: Naukri

Experience and compensation (maximums): 5-8 years (Lead, 23 LPA); 8-10 years (Senior Lead, 35 LPA); 10+ years (Architect, 42 LPA).
Location: Bangalore preferred; Hyderabad, Chennai, Pune, and Gurgaon also considered.
Notice: Immediate to a maximum of 15 days.
Mode of work: Hybrid.
Skills: Athena, Step Functions, Spark (PySpark), ETL fundamentals, SQL (basic and advanced), Glue, Python, Lambda, data warehousing, EBS/EFS, AWS EC2, Lake Formation, Aurora, S3, modern data platform fundamentals, PL/SQL, CloudFront.
Job description: We are looking for an experienced AWS Data Engineer to design, build, and manage robust, scalable, and high-performance data pipelines and data platforms on AWS. The ideal candidate will have a strong foundation in ETL fundamentals, data modeling, and modern data architecture, with hands-on expertise across a broad spectrum of AWS services including Athena, Glue, Step Functions, Lambda, S3, and Lake Formation.
Key responsibilities:
- Design and implement scalable ETL/ELT pipelines using AWS Glue, Spark (PySpark), and Step Functions.
- Work with structured and semi-structured data using Athena, S3, and Lake Formation to enable efficient querying and access control.
- Develop and deploy serverless data processing solutions using AWS Lambda and integrate them into pipeline orchestration.
- Perform advanced SQL and PL/SQL development for data transformation, analysis, and performance tuning.
- Build data lakes and data warehouses using S3, Aurora, and Athena.
- Implement data governance, security, and access control strategies using AWS tools including Lake Formation, CloudFront, EBS/EFS, and IAM.
- Develop and maintain metadata, lineage, and data cataloging capabilities.
- Participate in data modeling exercises for both OLTP and OLAP environments.
- Work closely with data scientists, analysts, and business stakeholders to understand data requirements and deliver actionable insights.
- Monitor, debug, and optimize data pipelines for reliability and performance.
Required skills and experience:
- Strong experience with AWS data services: Glue, Athena, Step Functions, Lambda, Lake Formation, S3, EC2, Aurora, EBS/EFS, CloudFront.
- Proficient in PySpark, Python, SQL (basic and advanced), and PL/SQL.
- Solid understanding of ETL/ELT processes and data warehousing concepts.
- Familiarity with modern data platform fundamentals and distributed data processing.
- Experience in data modeling (conceptual, logical, physical) for analytical and operational use cases.
- Experience with orchestration and workflow management tools within AWS.
- Strong debugging and performance tuning skills across the data stack.
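
As a hedged illustration of the Athena querying skill listed above, the sketch below submits an Athena query from Python and polls for completion; the database, table, and output bucket are invented placeholders, not details from the posting.

```python
# Hedged sketch: run an Athena query via boto3 and wait for it to finish.
# Database, table, and output S3 location are illustrative placeholders.
import time
import boto3

athena = boto3.client("athena")

def run_query(sql: str, database: str, output_s3: str) -> str:
    """Submit a query and return its final state (SUCCEEDED/FAILED/CANCELLED)."""
    execution = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output_s3},
    )
    query_id = execution["QueryExecutionId"]
    while True:
        state = athena.get_query_execution(QueryExecutionId=query_id)[
            "QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            return state
        time.sleep(2)

print(run_query(
    "SELECT event_date, COUNT(*) FROM events GROUP BY event_date",
    database="analytics_lake",
    output_s3="s3://example-athena-results/",
))
```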

Posted 5 days ago

Apply

6.0 - 11.0 years

8 - 18 Lacs

Hyderabad

Work from Office

Source: Naukri

Mandatory skills: .NET Core, AWS Cloud, Angular 10 or above.
Key responsibilities:
Full stack development: Design, develop, and maintain web applications using .NET technologies, including C#, .NET Core, and ASP.NET Web API. Build and maintain front-end applications using Angular 10+ versions. Implement responsive and user-friendly UI features, ensuring a seamless user experience across devices.
Cloud development and management: Utilize AWS services (S3, Lambda, EC2, CloudWatch) for hosting, deployment, and monitoring of applications. Work with AWS services for automation, infrastructure management, and scaling solutions.
Backend development and API design: Develop robust backend APIs using .NET 4.6.1 / .NET Core 3, ensuring high performance and security. Integrate third-party APIs and services into applications, ensuring scalability and reliability.
Code quality and CI/CD: Implement best practices for code quality and standards. Use tools like SonarQube to keep the code free from errors and maintain high quality. Work with Jenkins for continuous integration and continuous deployment (CI/CD), ensuring smooth deployments and minimal downtime.
Collaboration and Agile practices: Collaborate effectively with cross-functional teams, including designers, product managers, and other developers. Use Agile methodologies for efficient development, and actively participate in sprint planning, stand-ups, and retrospectives. Track and manage tasks using Jira, ensuring all tasks are completed on time and according to project requirements.
Version control and Docker: Manage source code and collaborate with the team using Git for version control. Use Docker for containerization and deployment, ensuring consistent environments across development, staging, and production.
Required skills and qualifications:
- Full stack development: proven experience as a full-stack developer using .NET technologies (e.g., .NET 4.6.1, .NET Core 3, ASP.NET Web API 2).
- Frontend technologies: strong hands-on experience with Angular 10+ and other front-end technologies.
- Cloud technologies: hands-on experience working with AWS services such as S3, Lambda, CloudWatch, and EC2.
- CI/CD and code quality: experience with tools like Jenkins, SonarQube, and other DevOps practices.
- Version control and collaboration tools: experience using Git for version control and Jira for task tracking.
- Containerization: knowledge of Docker for creating and managing containerized applications.
- Strong problem-solving skills: ability to troubleshoot, debug, and optimize both front-end and back-end issues.
- Team player: strong communication skills and the ability to collaborate effectively in a team-oriented environment.
Preferred qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Experience with other cloud platforms or services is a plus.
- Familiarity with Agile methodologies and Scrum practices.
- Familiarity with additional tools such as Kubernetes, Terraform, or other infrastructure automation tools.

Posted 1 week ago

Apply

7.0 - 12.0 years

10 - 20 Lacs

Hyderabad

Work from Office

Source: Naukri

JD: .NET AWS Lead
Overall experience: 8+ years (Developer). Relevant experience: 4+ years (Developer).
Work location: Hyderabad. Work timing: general shift. Work mode: WFO. Position type: permanent (full-time). Notice period: 0-30 days.
Mandatory skills: .NET Core, AWS Cloud, Angular 10 or above.
Key responsibilities:
Full stack development: Design, develop, and maintain web applications using .NET technologies, including C#, .NET Core, and ASP.NET Web API. Build and maintain front-end applications using Angular 10+ versions. Implement responsive and user-friendly UI features, ensuring a seamless user experience across devices.
Cloud development and management: Utilize AWS services (S3, Lambda, EC2, CloudWatch) for hosting, deployment, and monitoring of applications. Work with AWS services for automation, infrastructure management, and scaling solutions.
Backend development and API design: Develop robust backend APIs using .NET 4.6.1 / .NET Core 3, ensuring high performance and security. Integrate third-party APIs and services into applications, ensuring scalability and reliability.
Code quality and CI/CD: Implement best practices for code quality and standards. Use tools like SonarQube to keep the code free from errors and maintain high quality. Work with Jenkins for continuous integration and continuous deployment (CI/CD), ensuring smooth deployments and minimal downtime.
Collaboration and Agile practices: Collaborate effectively with cross-functional teams, including designers, product managers, and other developers. Use Agile methodologies for efficient development, and actively participate in sprint planning, stand-ups, and retrospectives. Track and manage tasks using Jira, ensuring all tasks are completed on time and according to project requirements.
Version control and Docker: Manage source code and collaborate with the team using Git for version control. Use Docker for containerization and deployment, ensuring consistent environments across development, staging, and production.
Required skills and qualifications:
- Full stack development: proven experience as a full-stack developer using .NET technologies (e.g., .NET 4.6.1, .NET Core 3, ASP.NET Web API 2).
- Frontend technologies: strong hands-on experience with Angular 10+ and other front-end technologies.
- Cloud technologies: hands-on experience working with AWS services such as S3, Lambda, CloudWatch, and EC2.
- CI/CD and code quality: experience with tools like Jenkins, SonarQube, and other DevOps practices.
- Version control and collaboration tools: experience using Git for version control and Jira for task tracking.
- Containerization: knowledge of Docker for creating and managing containerized applications.
- Strong problem-solving skills: ability to troubleshoot, debug, and optimize both front-end and back-end issues.
- Team player: strong communication skills and the ability to collaborate effectively in a team-oriented environment.
Preferred qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Experience with other cloud platforms or services is a plus.
- Familiarity with Agile methodologies and Scrum practices.
- Familiarity with additional tools such as Kubernetes, Terraform, or other infrastructure automation tools.

Posted 1 week ago

Apply

8.0 - 13.0 years

12 - 22 Lacs

Pune

Hybrid

Source: Naukri

- Experience developing applications using Python, Glue (ETL), Lambda, and Step Functions, alongside AWS EKS, S3, EMR, RDS data stores, CloudFront, and API Gateway.
- Experience with AWS services such as Amazon Elastic Compute Cloud (EC2), Glue, Amazon S3, EKS, and Lambda.
Required candidate profile:
- 10+ years of experience in software development and technical leadership, preferably with strong financial knowledge in building complex trading applications.
- 5+ years of people management experience.
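
To illustrate the Step Functions skill this listing repeats, the hedged sketch below kicks off a state machine execution from Python; the state machine ARN and input payload are placeholders, not details from the posting.

```python
# Hedged sketch: start an AWS Step Functions execution and check its status.
# The state machine ARN and input payload are illustrative placeholders.
import json
import boto3

sfn = boto3.client("stepfunctions")

response = sfn.start_execution(
    stateMachineArn="arn:aws:states:ap-south-1:123456789012:stateMachine:TradePipeline",
    input=json.dumps({"trade_date": "2024-01-31", "source": "s3://example-bucket/raw/"}),
)

status = sfn.describe_execution(executionArn=response["executionArn"])["status"]
print(f"Execution {response['executionArn']} is {status}")
```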

Posted 1 week ago

Apply

6.0 - 10.0 years

9 - 18 Lacs

Hyderabad

Hybrid

Source: Naukri

Primary skills (mandatory):
- AWS working experience
- AWS Glue or equivalent product experience
- Lambda functions
- Python programming
- Kubernetes knowledge
Roles and responsibilities: develop code, deployment, testing, bug fixing.
Interview rounds: 2; a face-to-face round is mandatory.
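
As a hedged sketch of the Lambda-plus-Python combination this listing asks for, here is a minimal handler that logs which objects triggered it; the event shape (an S3 put notification) is an assumption, not something specified by the employer.

```python
# Minimal AWS Lambda handler sketch in Python.
# The event shape (an S3 put notification) is an illustrative assumption.
import json
import logging

logger = logging.getLogger()
logger.setLevel(logging.INFO)

def lambda_handler(event, context):
    """Log which S3 objects triggered this invocation and return a summary."""
    keys = [record["s3"]["object"]["key"] for record in event.get("Records", [])]
    logger.info("Processing %d object(s): %s", len(keys), keys)
    return {"statusCode": 200, "body": json.dumps({"processed": keys})}
```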

Posted 1 week ago

Apply

7.0 - 12.0 years

15 - 30 Lacs

Pune, Ahmedabad

Work from Office

Source: Naukri

We are seeking a seasoned Lead Platform Engineer with a strong background in platform development and a proven track record of leading technology design and teams. The ideal candidate will have at least 8 years of overall experience, with a minimum of 5 years in relevant roles. This position entails owning module design and spearheading the implementation process alongside a team of talented platform engineers.
Job title: Lead Platform Engineer
Job location: Ahmedabad/Pune (work from office)
Required experience: 7+ years
Educational qualification: UG: BS/MS in Computer Science, or other engineering/technical degree
Key responsibilities:
- Lead the design and architecture of robust, scalable platform modules, ensuring alignment with business objectives and technical standards.
- Drive the implementation of platform solutions, collaborating closely with platform engineers and cross-functional teams to achieve project milestones.
- Mentor and guide a team of platform engineers, fostering an environment of growth and continuous improvement.
- Stay abreast of emerging technologies and industry trends, incorporating them into the platform to enhance functionality and user experience.
- Ensure the reliability and security of the platform through comprehensive testing and adherence to best practices.
- Collaborate with senior leadership to set technical strategy and goals for the platform engineering team.
Requirements:
- Minimum of 8 years of experience in software or platform engineering, with at least 5 years in roles directly relevant to platform development and team leadership.
- Expertise in Python programming, with a solid foundation in writing clean, efficient, and scalable code.
- Proven experience in serverless application development, designing and implementing microservices, and working within event-driven architectures.
- Demonstrated experience in building and shipping high-quality SaaS platforms/applications on AWS, showcasing a portfolio of successful deployments.
- Comprehensive understanding of cloud computing concepts, AWS architectural best practices, and familiarity with a range of AWS services, including but not limited to Lambda, RDS, DynamoDB, and API Gateway.
- Exceptional problem-solving skills, with a proven ability to optimize complex systems for efficiency and scalability.
- Excellent communication skills, with a track record of effective collaboration with team members and successful engagement with stakeholders across various levels.
- Previous experience leading technology design and engineering teams, with a focus on mentoring, guiding, and driving the team towards achieving project milestones and technical excellence.
Good to have:
- AWS Certified Solutions Architect, AWS Certified Developer, or other relevant cloud development certifications.
- Experience with the AWS Boto3 SDK for Python.
- Exposure to other cloud platforms such as Azure or GCP.
- Knowledge of containerization and orchestration technologies, such as Docker and Kubernetes.
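
To make the serverless, event-driven requirement above concrete, here is a hedged Python sketch of a Lambda handler that persists events from an SQS trigger into DynamoDB; the table name and message fields are assumptions, not details from the posting.

```python
# Hedged sketch: event-driven Lambda consuming SQS messages and writing to DynamoDB.
# Table name and message fields are illustrative assumptions.
import json
import os
import boto3

TABLE_NAME = os.environ.get("EVENTS_TABLE", "platform-events")
table = boto3.resource("dynamodb").Table(TABLE_NAME)

def lambda_handler(event, context):
    """Persist each SQS record as an item keyed by event_id."""
    records = event.get("Records", [])
    for record in records:
        payload = json.loads(record["body"])
        table.put_item(
            Item={
                "event_id": payload["event_id"],
                "event_type": payload.get("event_type", "unknown"),
                "payload": record["body"],
            }
        )
    return {"written": len(records)}
```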

Posted 3 weeks ago

Apply

4.0 - 5.0 years

4 - 6 Lacs

Vadodara

Work from Office

Source: Naukri

About the job: We are looking for a DevOps Engineer to maintain, upgrade, and manage our software, hardware, and networks with a strong focus on AWS and containerization. Resourcefulness and problem-solving are essential in this role. You should be able to diagnose and resolve issues quickly, while collaborating with interdisciplinary teams and users. Your goal will be to ensure that our AWS-based infrastructure, including containerized environments, runs smoothly, securely, and efficiently.
Responsibilities:
- Monitor and maintain systems, including configuration, security management, patching, automation, hardening, and upgrades.
- Set up and manage infrastructure on AWS, including EC2, RDS, S3, Route 53, and other AWS services.
- Manage containerized environments using Docker and Kubernetes (EKS).
- Build and manage CI/CD pipelines using Jenkins, particularly for containerized applications.
- Ensure system uptime, availability, reliability, and security across AWS environments and containerized workloads.
- Manage cloud monitoring and logging using AWS CloudWatch.
- Handle backup and disaster recovery strategies for AWS-based applications and databases.
- Collaborate with vendors and IT providers for specific AWS and containerization requirements.
- Document procedures, policies, and configurations in an internal wiki.
- Troubleshoot server-side and cloud-related issues, including container orchestration problems.
- Asset management and cost optimization in AWS.
Skills:
- Proven experience as a DevOps Engineer or System Administrator with a focus on AWS and containerization.
- In-depth knowledge of AWS services such as EC2, EKS (Kubernetes), RDS, S3, Load Balancers, CloudWatch, and Route 53.
- Strong experience with Docker and Kubernetes (preferably EKS).
- Experience with AWS security best practices (IAM, Security Groups, VPC).
- Familiarity with CI/CD pipelines and tools like Jenkins, especially for containerized environments.
- Strong scripting skills (Bash, Python, or PowerShell) for automation on AWS.
- Knowledge of system security, data backup, and recovery.
- Experience with databases, networks (LAN, WAN), and patch management.
- Excellent communication skills.
Nice to have:
- Experience with Azure services, such as Azure Kubernetes Service (AKS), Virtual Machines, and Azure DevOps.
- Familiarity with Infrastructure as Code (IaC) tools such as Terraform, AWS CloudFormation, or Azure Resource Manager (ARM) templates.
- Azure certifications such as Azure Administrator Associate or Azure Solutions Architect Expert.
- Basic knowledge of Azure Active Directory, role-based access control (RBAC), and Azure Networking.
Benefits:
- Health insurance and personal accident insurance.
- Flexible working hours in a motivational environment.
- Paid time off and referral programs.
- Discover a rewarding work/life balance.
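
As a hedged example of the "scripting for automation on AWS" and backup responsibilities above, the Python sketch below takes a date-stamped manual RDS snapshot; the DB instance identifier is a placeholder.

```python
# Hedged sketch: create a date-stamped manual RDS snapshot for backup purposes.
# The DB instance identifier is an illustrative placeholder.
from datetime import datetime, timezone
import boto3

rds = boto3.client("rds")

def snapshot_instance(db_instance_id: str) -> str:
    """Create a manual snapshot and return its identifier."""
    stamp = datetime.now(timezone.utc).strftime("%Y%m%d-%H%M")
    snapshot_id = f"{db_instance_id}-manual-{stamp}"
    rds.create_db_snapshot(
        DBSnapshotIdentifier=snapshot_id,
        DBInstanceIdentifier=db_instance_id,
    )
    return snapshot_id

if __name__ == "__main__":
    print("Created snapshot:", snapshot_instance("example-app-db"))
```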

Posted 3 weeks ago

Apply

5.0 - 7.0 years

10 - 18 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Source: Naukri

Data Engineer. Mandatory skills: AWS, Kafka, ETL, Glue, Lambda, Python, SQL.

Posted 3 weeks ago

Apply

10.0 - 15.0 years

9 - 15 Lacs

Chennai, Bengaluru

Work from Office

Source: Naukri

Roles & responsibilities:
- Lead data migration efforts from legacy systems (e.g., on-premises databases) to cloud-based platforms on AWS.
- Collaborate with cross-functional teams to gather requirements and define migration strategies.
- Develop and implement migration processes to move legacy applications and data to cloud platforms like AWS.
- Write scripts and automation to support data migration, system configuration, and cloud infrastructure provisioning.
- Optimize existing data structures and processes for performance and scalability in the new environment.
- Ensure the migration adheres to performance, security, and compliance standards.
- Identify potential issues, troubleshoot, and implement fixes during the migration process.
- Maintain documentation of migration processes and post-migration maintenance plans.
- Provide technical support post-migration to ensure smooth operation of the migrated systems.
Primary skills (required):
- Proven experience in leading data migration projects and migrating applications, services, or data to cloud platforms (preferably AWS).
- Knowledge of migration tools such as AWS Database Migration Service (DMS), AWS Server Migration Service (SMS), and AWS Migration Hub.
- Expertise in data mapping, validation, transformation, and ETL processes.
- Proficiency in Python, Java, or similar programming languages.
- Experience with scripting languages such as Shell, PowerShell, or Bash.
Cloud technologies (AWS focus):
- Strong knowledge of AWS services relevant to data migration (e.g., S3, Redshift, Lambda, RDS, DMS, Glue).
- Experience working with CI/CD pipelines (Jenkins, GitLab CI/CD) and infrastructure as code (IaC) using Terraform or AWS CloudFormation.
- Experience in database management and in migrating relational (e.g., MySQL, PostgreSQL, Oracle) and non-relational (e.g., MongoDB) databases.
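
As a hedged illustration of the AWS DMS skill set above, the sketch below starts an existing DMS replication task from Python and reports its status; the task ARN is a placeholder, and the task itself is assumed to have been created beforehand (for example via the console or Terraform).

```python
# Hedged sketch: kick off an existing AWS DMS replication task and report its status.
# The replication task ARN is an illustrative placeholder.
import boto3

dms = boto3.client("dms")
TASK_ARN = "arn:aws:dms:ap-south-1:123456789012:task:EXAMPLETASKID"

dms.start_replication_task(
    ReplicationTaskArn=TASK_ARN,
    StartReplicationTaskType="start-replication",  # "resume-processing" continues a stopped task
)

task = dms.describe_replication_tasks(
    Filters=[{"Name": "replication-task-arn", "Values": [TASK_ARN]}]
)["ReplicationTasks"][0]
print(f"{task['ReplicationTaskIdentifier']}: {task['Status']}")
```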

Posted 3 weeks ago

Apply

3.0 - 6.0 years

10 - 17 Lacs

Noida

Hybrid

Source: Naukri

The right person for this position should have 3-6 years of experience in backend development. They should be passionate, tech-savvy, and academically sound, with an interest in cloud technologies like AWS and Azure and in technologies that drive the headless domain. They should be able to understand the Pentair product domain and develop products using industry best practices. The role requires hands-on experience with Node.js, Go, or Python, and application development experience building services for a SaaS-based platform for residential and commercial IoT. They should be able to define a low-level design for any problem statement.
Roles and responsibilities:
- Develop the Smart Products & IoT technology within the segment.
- Take responsibility for successful execution of segment-focused projects aimed at developing smart products and IoT solutions; such projects/products include fully developed commercial products, minimum viable products, rapid prototypes, and proof-of-concepts. Ensure these projects follow the appropriate standard process used at Pentair (such as 3D and Rapid3D).
- Continuously innovate on existing IoT solutions with the latest applicable techniques to boost product capabilities and business value.
- Collaborate with Pentair-wide technical resources to develop IoT cloud, web, and mobile solutions and support IoT product solutions: design and develop technical design documents for software development; develop detailed technical architecture block diagrams for cloud solutions; code and implement the application layer as Infrastructure as Code and Software as a Service solutions; implement the production platform, building back-end automation tools; review product applications and create test platforms to review coding quality.
- Coordinate with the Product Engineer from the Filtration Business Unit to develop a project plan to integrate IoT. Support product risk assessment, develop guides for IoT design requirements, and support developing test plans for integration.
- Provide solutions to issues related to the connection of networks and platforms.
Skills required:
- Bachelor's degree in computer science or equivalent.
- More than 3 years of working experience with Amazon Web Services infrastructure and Platform as a Service tools.
- 3+ years' experience with the programming languages Python, Java, and Node.js.
- Extensive experience working with cloud-based datastores like S3, DynamoDB, and MongoDB.
- Deep understanding of mobile and web technology stacks, Swagger API specifications, and RESTful APIs.
- In-depth understanding of computer programming and network security.
- Expert understanding of data modeling, database design, and performance monitoring and tuning.
- Experience collaborating with global technology teams is a plus.
- Deep technical knowledge of Agile software development.
- Experience working with IoT vendors / third-party service providers is a plus.
- Willingness to travel up to 20% of the time to other Pentair sites and customer locations; travel will be both domestic and international.
Key interfaces: global project team members, GEC engineering team, external vendors and suppliers.
Qualifications and experience: M.Tech/B.Tech in Computer Science / Electronics Engineering from a good engineering college.
Other requirements: team player; good communication and presentation skills; ability to multitask; design thinking; passion for design and technology; a can-do attitude; excellent interpersonal skills.
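
To ground the IoT-backend and cloud datastore requirements above, here is a hedged Python sketch of a service function that stores a device telemetry reading in DynamoDB; the table name and attribute names are assumptions, not part of the posting.

```python
# Hedged sketch: store an IoT device telemetry reading in DynamoDB.
# Table name and attribute names are illustrative assumptions.
from datetime import datetime, timezone
from decimal import Decimal
import boto3

telemetry_table = boto3.resource("dynamodb").Table("device-telemetry")

def record_reading(device_id: str, metric: str, value: float) -> None:
    """Write one reading keyed by device_id and an ISO-8601 timestamp."""
    telemetry_table.put_item(
        Item={
            "device_id": device_id,
            "recorded_at": datetime.now(timezone.utc).isoformat(),
            "metric": metric,
            # DynamoDB expects Decimal rather than float for numeric values.
            "value": Decimal(str(value)),
        }
    )

record_reading("pool-pump-0042", "flow_rate_lpm", 57.3)
```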

Posted 3 weeks ago

Apply

5.0 - 8.0 years

12 - 18 Lacs

Pune, Delhi / NCR

Hybrid

Source: Naukri

- 5+ years of experience deploying, enhancing, and troubleshooting AWS services (EC2, S3, RDS, VPC, CloudTrail, CloudFront, Lambda, EKS, ECS).
- 3+ years of experience with serverless technologies and services, Docker, and Kubernetes.
- Experience in JavaScript, Bash, Python, and TypeScript.

Posted 4 weeks ago

Apply

6.0 - 10.0 years

0 - 2 Lacs

Gurugram

Remote

Source: Naukri

We are seeking an experienced AWS Data Engineer to join our dynamic team. The ideal candidate will have hands-on experience building and managing scalable data pipelines on AWS, utilizing Databricks, and a deep understanding of the Software Development Life Cycle (SDLC). They will play a critical role in enabling our data architecture, driving data quality, and ensuring the reliable and efficient flow of data throughout our systems.
Required skills:
- 7+ years of comprehensive experience working as a Data Engineer, with expertise in AWS services (S3, Glue, Lambda, etc.).
- In-depth knowledge of Databricks, pipeline development, and data engineering.
- 2+ years of experience working with Databricks for data processing and analytics.
- Architect and design pipelines (e.g., Delta Live Tables).
- Proficient in programming languages such as Python, Scala, or Java for data engineering tasks.
- Experience with SQL and relational databases (e.g., PostgreSQL, MySQL).
- Experience with ETL/ELT tools and processes in a cloud environment.
- Familiarity with big data processing frameworks (e.g., Apache Spark).
- Experience with data modeling, data warehousing, and building scalable architectures.
- Understand and implement security aspects when consuming data from different sources.
Preferred qualifications:
- Experience with Apache Airflow or other workflow orchestration tools; Terraform, Python, and Spark experience will be preferred.
- AWS Certified Solutions Architect, AWS Certified Data Analytics Specialty, or similar certifications.
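
As a hedged illustration of the Spark-on-AWS pipeline work described above, the PySpark sketch below reads raw JSON events from S3, applies a simple quality filter, and writes partitioned Parquet back out; the bucket paths and column names are placeholders, and the job assumes it runs on a cluster (Databricks, EMR, or Glue) that already has S3 access configured.

```python
# Hedged PySpark sketch: raw JSON in S3 -> filtered, partitioned Parquet in S3.
# Bucket paths and column names are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("events-curation").getOrCreate()

raw = spark.read.json("s3://example-raw-bucket/events/")

curated = (
    raw.filter(F.col("event_id").isNotNull())           # basic quality gate
       .withColumn("event_date", F.to_date("event_ts"))  # partition column
       .dropDuplicates(["event_id"])
)

(curated.write
        .mode("overwrite")
        .partitionBy("event_date")
        .parquet("s3://example-curated-bucket/events/"))
```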

Posted 1 month ago

Apply

4.0 - 8.0 years

9 - 19 Lacs

Gurugram, Chennai, Bengaluru

Work from Office

Source: Naukri

Skills: AWS Glue, Lambda, PySpark, Python, SQL

Posted 1 month ago

Apply

9.0 - 11.0 years

6 - 7 Lacs

Raipur

Work from Office

Source: Naukri

Job Title: System Administrator - Big Data (SA-BD)
Reports to: The Joint Chief Executive Officer, CHiPS
Number of positions: 1
Responsibility summary: Chhattisgarh infotech Promotion Society (CHiPS), Government of Chhattisgarh, invites applications from enterprising and aspiring candidates for the position of System Administrator - Big Data (SA-BD). CHiPS (www.chips.gov.in) is the nodal agency and prime mover for propelling IT growth and implementation of IT and e-Governance projects in the State of Chhattisgarh. CHiPS is involved in the end-to-end implementation of some mega IT projects like SOC, SSDG, SWAN, GTS, e-Procurement, etc. A professional approach is being adopted for the implementation of IT projects using the services of e-governance experts and consultants from corporate and academia. ICT has the potential to significantly improve this contribution. In doing so, the Government of Chhattisgarh seeks to create an IT environment in the state wherein investments in IT are not only encouraged but actively facilitated. We aim to achieve quality and excellence in state government services for the citizens, state transactions with citizens and businesses, and internal state governmental operations/functions through the strategic deployment of information technologies.
The role of the SA-BD is to ensure that the strategic and organizational objectives as well as the values of CHiPS are put into practice. In conjunction with other members of staff, they will ensure organizational growth by directing and managing operational activities to ensure they are delivered in accordance with the strategic objectives. The SA-BD will be responsible for monitoring the systems to ensure the highest level of infrastructure performance; managing and coordinating all infrastructure projects to meet client needs; ensuring that standards and procedures are followed during design and implementation of information systems; helping create organizational and program budgets in collaboration with the JCEO and other team members; and undertaking other miscellaneous tasks as and when they arise. They are required to work with the staff team and contribute to the development and implementation of organizational strategies, policies and practices.
The candidate should be an outcome-oriented executive, capable of leading the creation of, and energizing, the institutional, human and technical capacities necessary to realize the unique and ambitious ICT agenda of the State. S/he should also be familiar with a variety of the field's concepts, practices, and procedures, rely on extensive experience and judgment to plan and accomplish goals, and be capable of multi-tasking. A wide degree of creativity and latitude is expected to secure the necessary cooperation and convergence of resources and activities. The candidate should fulfil the mandatory requirements listed below and should embody a rich combination of the requirements listed as desired.
Mandatory:
Educational qualification: B.E./B.Tech. (Information Technology / Computer Science / Electronics & IT), or M.E./M.Tech., or M.Sc. in Mathematics / Statistics / Operations Research / Computer Science / IT, or Ph.D. in a quantitative discipline (such as Computer Science, Bioinformatics or Statistics), recognized by or under the regulations of the relevant regulatory body, obtained upon successful completion of studies (excluding studies in distance education mode) as a regularly enrolled student. In respect of degrees or diplomas awarded abroad, the candidate should submit relevant details establishing equivalence with the above qualification, and the decision of the Selection Committee regarding the acceptability of such qualification as an equivalent qualification shall be final.
Age: The candidate should be energetic and dynamic, as the job profile entails extensive interaction with various stakeholders, and should be result-oriented. S/he should not be more than 35 years of age on the date of issue of the recruitment notice. For age-related relaxations, please refer to the Recruitment Rules.
Experience: At least 9 years in the case of a Bachelor's degree, or 7 years in the case of a Master's degree, of experience in collecting, storing, processing, and analyzing huge sets of data. The candidate must have at least 4.5 years of relevant work experience as a Data Analyst/Scientist or in similar quantitative analysis positions. The primary focus will be on choosing optimal solutions to use for these purposes, then maintaining, implementing, and monitoring them, along with responsibility for integrating them with the architecture used across the organization. The candidate should be a highly technical computing architect able to collaborate with our customers and partners on solutions in Big Data and Analytics. These engagements will focus on real-time and batch-based Big Data processing, Business Intelligence, and Machine Learning. This role will specifically focus on building innovative solutions that leverage the value of data.
Job description: The right person will be highly technical and analytical, and possess significant software development and/or IT and networking implementation/consulting experience.
- Strong knowledge of and experience with statistics, and potentially other advanced math as well; able to translate numbers into insights.
- Programming experience, ideally in Python or Java and R.
- Build a new, innovative data platform based on big data technologies.
- Build development methodologies for company-wide principles and strategy.
- Drive innovation around data management and processing.
- Successfully lead a team of big data developers.
- Create technology-specific roadmaps as directed.
- Coordinate and work with the DevOps team to build and manage data infrastructure.
- Coordinate and work together with the data science and reporting teams.
- Deep knowledge of data mining, machine learning, natural language processing, or information retrieval.
- Experience processing large amounts of structured and unstructured data; MapReduce experience is a plus.
- Enough programming knowledge to clean and scrub noisy datasets.
- Work with state government departments to gauge and create demand for strategic solutions and fulfil the same; must have the ability to reach out to and work with senior government officials in various departments.
- Manage implementation of identified projects based on a broad and detailed knowledge of current and emerging technologies, and provide technical input into projects undertaken by or impacting the organization.
- Advise and inform management, departments and Members on technical issues as part of the decision-making process for technical direction and procurement of new systems.
Desirable:
- Experience working within the software development or internet industries is highly desired; working knowledge of modern software development practices and technologies such as agile methodologies and DevOps is highly desired.
- Understanding of application, server, and network security is highly desired.
- Technical: web services development/deployment experience, IT systems and network engineering experience, security and compliance experience, etc.
- Operational: website/web services as well as traditional IT networking, operations, management, and security experience.
- Economic and business: RFP/acquisition support; market analysis; cost-benefit analysis.
- Knowledge of the underlying infrastructure requirements such as networking, storage, and hardware optimization.
- Ability to think strategically about business, product, and technical challenges in an enterprise environment.
- Understanding of Agile methodologies, and the ability to apply these practices to analytics projects.
- Implementation and tuning experience in the Apache Hadoop ecosystem, including tools such as Hadoop Streaming, Spark, Pig and Hive.
- Implementation and tuning experience with data warehousing platforms, including knowledge of data warehouse schema design, query tuning and optimization, and data migration and integration.
- Experience with requirements for the analytics presentation layer, including dashboards, reporting, and OLAP.
Technical skills:
- Management of a Hadoop cluster, with all included services; ability to solve any ongoing issues with operating the cluster.
- Proficiency with Hadoop v2, MapReduce, HDFS.
- Experience building stream-processing systems using solutions such as Storm or Spark Streaming.
- Good knowledge of Big Data querying tools, such as Pig, Hive, and Impala.
- Experience with Spark.
- Experience with integration of data from multiple data sources.
- Experience with NoSQL databases, such as HBase, Cassandra, MongoDB.
- Knowledge of various ETL techniques and frameworks, such as Flume.
- Experience with various messaging systems, such as Kafka or RabbitMQ.
- Experience with Big Data ML toolkits, such as Mahout, SparkML, or H2O.
- Good understanding of the Lambda Architecture, along with its advantages and drawbacks.
- Experience with Cloudera / MapR / Hortonworks.
KINDLY NOTE: Applications are accepted on or before June 7th, 2025 only.
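
To illustrate the stream-processing skills listed above (Spark Streaming fed by Kafka), here is a hedged PySpark Structured Streaming sketch; the broker address, topic, and checkpoint path are placeholders, and the job assumes the spark-sql-kafka connector package is available on the cluster.

```python
# Hedged sketch: Spark Structured Streaming job counting Kafka events per minute.
# Broker, topic, and checkpoint location are illustrative placeholders; the
# spark-sql-kafka connector must be available on the cluster.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kafka-event-counts").getOrCreate()

events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker-1:9092")
         .option("subscribe", "citizen-service-events")
         .load()
)

counts = (
    events.select(F.col("value").cast("string").alias("payload"), F.col("timestamp"))
          .groupBy(F.window("timestamp", "1 minute"))
          .count()
)

query = (
    counts.writeStream.outputMode("complete")
          .format("console")
          .option("checkpointLocation", "/tmp/checkpoints/kafka-event-counts")
          .start()
)
query.awaitTermination()
```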

Posted 1 month ago

Apply

6.0 - 9.0 years

10 - 20 Lacs

Hyderabad

Work from Office

Source: Naukri

This requirement is to source profiles with 6-9 years of overall experience, including a minimum of 4 years in data engineering. Note: look for combinations with Informatica, IICS, and Python (if Informatica is not available, Talend is acceptable), plus PySpark, SQL, Step Functions, Lambda and EMR at a high level. Location: Hyderabad.
Key responsibilities and accountabilities:
- Design, build and maintain complex ELT/ETL jobs that deliver business value.
- Extract, transform and load data from various sources including databases, APIs, and flat files using IICS or Python/SQL.
- Translate high-level business requirements into technical specs.
- Conduct unit testing, integration testing, and system testing of data integration solutions to ensure accuracy and quality.
- Ingest data from disparate sources into the data lake and data warehouse.
- Cleanse and enrich data and apply adequate data quality controls.
- Provide technical expertise and guidance to team members on Informatica IICS/IDMC and data engineering best practices to guide the future development of MassMutual's Data Platform.
- Develop re-usable tools to help streamline the delivery of new projects.
- Collaborate closely with other developers and provide mentorship.
- Evaluate and recommend tools, technologies, processes and reference architectures.
- Work in an Agile development environment, attending daily stand-up meetings and delivering incremental improvements.
- Participate in code reviews, ensure all solutions are aligned to architectural and requirement specifications, and provide feedback on code quality, design, and performance.
Knowledge, skills and abilities: please refer to 'Education and experience'.
Education and experience:
- Bachelor's degree in computer science, engineering, or a related field; Master's degree preferred.
- Data: 5+ years of experience with data analytics and data warehousing; sound knowledge of data warehousing concepts.
- SQL: 5+ years of hands-on experience with SQL and query optimization for data pipelines.
- ELT/ETL: 5+ years of experience in Informatica / 3+ years of experience in IICS/IDMC.
- Migration experience: experience with Informatica on-prem to IICS/IDMC migration.
- Cloud: 5+ years' experience working in an AWS cloud environment.
- Python: 5+ years of hands-on development experience with Python.
- Workflow: 4+ years of experience with orchestration and scheduling tools (e.g. Apache Airflow).
- Advanced data processing: experience using data processing technologies such as Apache Spark or Kafka.
- Troubleshooting: experience with troubleshooting and root cause analysis to determine and remediate potential issues.
- Communication: excellent communication, problem-solving, organizational and analytical skills; able to work independently and to provide leadership to small teams of developers.
- Reporting: experience with data reporting (e.g. MicroStrategy, Tableau, Looker) and data cataloging tools (e.g. Alation).
- Experience in design and implementation of ETL solutions with effective design and optimized performance, including ETL development with industry-standard recommendations for job recovery, failover, logging, and alerting mechanisms.
Application requirements: no special requirements.
Support hours: India GCC – US (EST) hours overlap of 2-3 hours.
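
To ground the orchestration requirement above (e.g. Apache Airflow), here is a hedged sketch of a small DAG that runs an extract task followed by a load task; the task bodies, DAG id, and schedule are placeholders, not the employer's actual pipeline.

```python
# Hedged sketch: a minimal Apache Airflow DAG with an extract step followed by a load step.
# Task bodies, DAG id, and schedule are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    print("Extracting source data (placeholder)")

def load(**context):
    print("Loading curated data (placeholder)")

with DAG(
    dag_id="example_daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task
```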

Posted 1 month ago

Apply

10.0 - 15.0 years

22 - 37 Lacs

Hyderabad

Work from Office

Source: Naukri

Senior Java AWS Technical Lead - OneTax Platform
About the role: Join our OneTax engineering team as a Senior Java AWS Technical Lead. You will drive the design and development of scalable, resilient tax processing systems on AWS. As a hands-on technical leader, you will shape architecture, mentor engineers, and ensure delivery excellence in a high-impact, compliance-driven environment.
Key responsibilities:
- Lead and mentor a team of engineers, fostering technical growth and best practices.
- Architect and design robust, secure microservices and APIs using Java, Spring Boot, and AWS.
- Deliver hands-on solutions for complex business challenges, ensuring high performance and reliability.
- Integrate with internal and external systems, leveraging AWS services (EC2, Lambda, RDS, DynamoDB, S3, SQS, etc.).
- Drive DevOps, CI/CD, and automation for rapid, safe deployments.
- Collaborate with product, QA, and cross-functional teams to deliver high-quality features on schedule.
- Troubleshoot and optimize distributed systems for scalability and cost efficiency.
- Contribute to technical strategy, evaluating new technologies and driving innovation.
Required qualifications:
- Bachelor's or master's degree in computer science or a related field.
- 10+ years of software development experience, with deep expertise in Java and Spring Boot.
- Proven technical leadership and mentoring experience.
- Strong background in AWS architecture and cloud-native development.
- Experience with microservices, RESTful APIs, and distributed systems.
- Proficiency in relational and NoSQL databases.
- Solid understanding of DevOps, CI/CD, and containerization (Docker, ECS/EKS).
- Excellent problem-solving, debugging, and communication skills.
- Experience working in Agile teams.
Preferred:
- Experience in the Financial Services or Tax domain.
- AWS certifications (Associate/Professional).
- Familiarity with monitoring, logging, and tracing tools.
- Contributions to open source or the technical community.
Why OneTax? Be part of a global leader, building mission-critical tax solutions that impact millions. You'll work with cutting-edge technology and a talented team, and you could shape the future of tax processing.

Posted 1 month ago

Apply

8.0 - 13.0 years

17 - 25 Lacs

Pune

Remote

Source: Naukri

Senior Java Developer
Job mode: Remote (EST working hours)
Notice: 30 days
Job description:
• Should have strong hands-on experience of 8-10 years in Java development.
• Should have strong knowledge of Java 11+, Spring, Spring Boot, Hibernate, and REST web services.
• Should have strong knowledge of J2EE design patterns and microservices design patterns.
• Should have strong hands-on knowledge of SQL/PostgreSQL; exposure to NoSQL databases is good to have.
• Should have strong knowledge of AWS services (Lambda, EC2, RDS, API Gateway, S3, CloudFront, Airflow).
• Good to have hands-on knowledge of React.
• Good to have exposure to Python and PySpark as a secondary skill.
• Implement solutions that are aligned with business/IT strategies and comply with Nuveen's architectural standards.
• Should have good knowledge of CI/CD pipelines.
• Should be strong in writing unit test cases and debugging Sonar issues.
• Should be able to lead/guide a team of junior developers.
• Should be able to collaborate with BAs and Solution Architects to create HLD and LLD documents.
• Should be able to create architecture flow diagrams, logical flow diagrams and data flow diagrams.
• Good to have experience in the Asset Management domain.

Posted 1 month ago

Apply

10 - 17 years

37 - 55 Lacs

Bengaluru

Remote

Source: Naukri

Role & responsibilities:
1. Overall, 12 to 16 years of C++ development experience.
2. Experience managing a team of 10.
3. Must have handled scrum calls.
4. Experience helping the team resolve technical queries.
5. Experience in resolving complex technical issues.
6. Must have handled projects independently.
7. Experience working with US/UK clients.
8. Ability to install and configure additional software and packages on Linux, primarily those needed for coding, testing, dumping memory footprints, debugging, etc.
9. Agile/Scrum experience.
10. Ability to develop and triage on Linux.
11. Ability to set up a Linux IDE.
12. Ability to integrate the IDE with a source code system such as ClearCase.
13. Ability to debug, test, compile and rerun the modified executables on Linux OS.
14. Ability to code and test in C++.
15. Nice to have: Docker knowledge and the ability to deploy containers and software in them.

Posted 1 month ago

Apply

5 - 10 years

5 - 15 Lacs

Hyderabad

Hybrid

Source: Naukri

We are seeking an experienced Senior DevOps Engineer with deep expertise in building automation and CI/CD pipelines within a serverless AWS environment. The ideal candidate will have hands-on experience managing AWS Lambda at scale, designing infrastructure with AWS CDK, and implementing pipelines using GitHub Actions. This role will play a key part in scaling, securing, and optimizing our cloud-native architecture.
Key responsibilities:
- Design, implement, and maintain robust CI/CD pipelines using GitHub Actions and AWS CDK.
- Build and manage serverless applications with a focus on scalability, performance, and reliability.
- Configure and maintain key AWS services including IAM, API Gateway, Lambda (600+ functions), SNS, SQS, EventBridge, CloudFront, S3, RDS, RDS Proxy, Secrets Manager, KMS, and CloudWatch.
- Develop infrastructure as code (IaC) using AWS CDK and CloudFormation templates.
- Code primarily in TypeScript, with additional scripting in Python as needed.
- Implement and optimize DynamoDB and other AWS-native databases.
- Enforce best practices for cloud security, monitoring, and cost optimization.
- Collaborate with development, QA, and architecture teams to enhance deployment workflows and reduce release cycles.
Required skills and experience:
- Strong expertise in AWS serverless technologies, including large-scale Lambda function management.
- Extensive experience with AWS CDK and GitHub Actions for pipeline automation.
- Hands-on with AWS services: IAM, API Gateway, Lambda, SQS, SNS, S3, CloudWatch, RDS, EventBridge, and others.
- Proficient in TypeScript; familiarity with Python is a plus.
- Solid understanding of CI/CD practices, infrastructure automation, and Git-based workflows.
- Experience building scalable and secure serverless systems in production.
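
As a hedged illustration of the CDK-defined serverless infrastructure described above (shown with CDK's Python bindings rather than the TypeScript this role prefers), the sketch below defines a stack with one Lambda function fronted by API Gateway; the stack name, construct names, and the "lambda/" asset directory are placeholders.

```python
# Hedged AWS CDK (v2, Python bindings) sketch: a Lambda function behind an API Gateway REST API.
# Stack name, construct names, and the "lambda/" asset directory are illustrative placeholders.
from aws_cdk import App, Stack
from aws_cdk import aws_apigateway as apigw
from aws_cdk import aws_lambda as _lambda
from constructs import Construct

class ServerlessApiStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        handler = _lambda.Function(
            self, "ApiHandler",
            runtime=_lambda.Runtime.PYTHON_3_11,
            handler="app.handler",              # lambda/app.py defines handler()
            code=_lambda.Code.from_asset("lambda"),
        )

        # Proxy all API Gateway routes to the single Lambda handler.
        apigw.LambdaRestApi(self, "ServerlessApi", handler=handler)

app = App()
ServerlessApiStack(app, "ServerlessApiStack")
app.synth()
```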

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click


Download the Mobile App

Instantly access job listings, apply easily, and track applications.
