
52470 AWS Jobs - Page 2

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

6.0 - 10.0 years

30 - 35 Lacs

Bengaluru

Work from Office

Source: Naukri

Experience: 7 to 8 years. Location: Bangalore. Notice period: Immediate.

Responsibilities:
- Design, implement, and manage CI/CD pipelines.
- Automate infrastructure provisioning and configuration using Ansible and Terraform.
- Collaborate with development and operations teams to ensure smooth deployment and operation of applications.
- Monitor system performance and troubleshoot issues as they arise.
- Ensure security and compliance across all environments.

Requirements:
- Proven experience with Ansible and Terraform.
- Strong background in DevOps practices and tools.
- Experience with cloud platforms (AWS, Azure, GCP).
- Proficiency in scripting languages (Python, Bash, etc.).
- Excellent problem-solving skills and attention to detail.
- Ability to work in a fast-paced, collaborative environment.

Skills: DevOps
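The posting pairs Terraform and Ansible (HCL and YAML tools) with Python scripting. Terraform itself is not written in Python, but as a hedged illustration of the same provision-from-code idea, here is a minimal boto3 sketch; the region, AMI ID, and tag values are hypothetical placeholders.

```python
# Illustrative sketch only: this role's stack would normally express this as a
# Terraform resource block, with Ansible handling post-boot configuration.
import boto3

ec2 = boto3.client("ec2", region_name="ap-south-1")  # placeholder region

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder AMI ID
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "Name", "Value": "ci-runner"}],
    }],
)
print(response["Instances"][0]["InstanceId"])
```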

Posted 4 hours ago

Apply

12.0 - 15.0 years

20 - 25 Lacs

Thane

Work from Office

Source: Naukri

Do you think in queries? Dream in architecture? Scale in petabytes? Then you might be who we're looking for. At Netcore Cloud, we're building systems that move with the speed of millions of users. Behind that scale? Data. Behind that data? You. We're searching for a Head of Database Engineering: not just someone to manage databases, but someone who can reimagine what's possible with distributed systems, real-time data, and cloud-native scale.

- Architect the future: lead the end-to-end design of secure, scalable, polyglot DB systems across #AWS and #GCP.
- Outsmart the cloud bills: slice costs without sacrificing performance, and brag about it in reviews.
- Automate or die trying: manual tasks are outdated. Everything from provisioning to patching should run while you sleep.
- Security = default: #IAM, encryption, compliance, audit trails. You don't bolt it on, you build it in.
- Own the numbers: uptime, latency, throughput. You'll lead with metrics, not guesswork.

Our ideal data genius:
- 12-15 years wrangling large-scale DBs, with 5+ years leading smart, scrappy, high-impact teams
- Fluency across #MySQL, #PostgreSQL, #MongoDB, #Cassandra, #Druid, #Vertica, #Elasticsearch
- Hands-on with #Python/#Bash, #Terraform, and a mindset to build once, scale forever
- Deep experience in GCP and AWS: not just using cloud, but taming it
- Obsessed with observability, reliability, and building DBs that developers love to work with

What's in it for you?
- Your architecture will impact millions of users across the globe
- Work alongside high-caliber product, DevOps, and security teams
- Lead from the front: your vision, your roadmap, your team
- Be the brain behind the DB that powers some of the fastest-growing digital platforms
- Real autonomy, real challenges, real scale

Skills: MySQL, MongoDB, Cassandra, Python, Database, AWS, GCP, Polyglot, Terraform, Postgres

Posted 4 hours ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Pune

Work from Office

Source: Naukri

The future is our choice. At Atos, as the global leader in secure and decarbonized digital, our purpose is to help design the future of the information space. Together we bring the diversity of our people's skills and backgrounds to make the right choices with our clients, for our company, and for our own futures.

Role: Java Backend Developer. Total experience: 4-8 years. Job location: Pune and Chennai. Mode of hire: Permanent. Educational qualification: any full-time graduate.

Job Roles and Responsibilities: At least 4+ years of experience in developing web-based applications using React, Spring Boot, microservices, REST APIs/web services, Spring MVC, JEE (Java 8, JSF, JPA, JSP, Servlets, JDBC), XML (DOM, SAX, XSLT), DHTML/HTML5, JavaScript, JBoss, etc. Good experience with development IDEs such as IntelliJ IDEA. The core technologies used are: Java 8 (or above), Spring Framework, JUnit 4 & 5, FreeMarker templates, Oracle (approximately 10% of the time), Git | Ruby | Bundler | Gradle | Leiningen | GoCD | Ansible (mandatory), JavaScript/ReactJS/Redux | NodeJS (optional). Experience developing on AWS is ideal.

Job Requirements: Must have knowledge of and experience working in an Agile environment using the Scrum framework. Knowledge of or experience with Jenkins, Maven, and CI/CD is required. Excellent analytical, communication, and team-player skills are required. Onsite client-facing experience would be a plus. Knowledge of FreeMarker would be a plus. Knowledge of performance monitoring and metrics tools such as Glowroot or JProfiler would be a plus.

Our Offering: Global cutting-edge IT projects that shape the future of digital and have a positive impact on the environment. Wellbeing programs and work-life balance, with integration and passion-sharing events. Company initiative benefits. Courses and conferences. Attractive salary. Hybrid work culture.

Posted 4 hours ago

Apply

6.0 - 11.0 years

16 - 31 Lacs

Chennai

Hybrid

Source: Naukri

Role & Responsibilities

Software requirements: Proficiency in Java programming and understanding of RESTful APIs. Expertise in Java 8, Core Java, OOP concepts, Spring Boot, and microservices. Knowledge of the Software Development Life Cycle (SDLC) and Agile methodologies. Cloud: AWS or GCP (must have).

Overall responsibilities: Collaborate closely with cross-functional teams to gather and understand technology requirements, designing solutions to meet business needs. Develop technical specifications and detailed documentation for new features and enhancements. Stay current with the latest technology trends and advancements, suggesting ways to incorporate them into existing solutions. Conduct code reviews to ensure the quality, maintainability, and performance of the codebase. Participate in the resolution of technical issues and provide technical support to team members. Collaborate with the testing team to ensure that software solutions are thoroughly tested and meet quality standards.

Technical skills: Back-end development: experience with RESTful API design and development. Microservices design patterns (CQRS, Saga, Circuit Breaker, API Gateway, Service Discovery, and others). Database knowledge: familiarity with SQL. Cloud: AWS or GCP (must have).
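Of the microservices design patterns this role lists, the Circuit Breaker is compact enough to sketch directly. Below is a minimal, framework-free Python illustration of the idea; the failure threshold, cooldown, and RuntimeError are illustrative choices, not any particular library's API.

```python
import time

class CircuitBreaker:
    """Open the circuit after max_failures consecutive errors; fail fast while
    open; allow a single trial call after reset_seconds (half-open state)."""

    def __init__(self, max_failures=3, reset_seconds=30.0):
        self.max_failures = max_failures
        self.reset_seconds = reset_seconds
        self.failures = 0
        self.opened_at = None

    def call(self, func, *args, **kwargs):
        if self.opened_at is not None:
            if time.monotonic() - self.opened_at < self.reset_seconds:
                raise RuntimeError("circuit open: failing fast")
            self.opened_at = None  # half-open: allow one trial call
        try:
            result = func(*args, **kwargs)
        except Exception:
            self.failures += 1
            if self.failures >= self.max_failures:
                self.opened_at = time.monotonic()
            raise
        self.failures = 0  # success closes the circuit again
        return result
```

Wrapping a flaky downstream call in something like `breaker.call(fetch, url)` stops a service from hammering a failing dependency, which is the behavior the pattern exists to provide.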

Posted 5 hours ago

Apply

5.0 - 8.0 years

6 - 12 Lacs

Pune

Hybrid

Source: Naukri

A Day in the Life: Our Global Diabetes Capability Center in Pune is expanding to serve more people living with diabetes globally. Our state-of-the-art facility is dedicated to transforming diabetes management through innovative solutions and technologies that reduce the burden of living with diabetes. You will play a critical role in our software development. Your primary responsibility will be building enterprise-grade interactive web applications, with a keen eye for user experience and the ability to deploy and scale enterprise-grade applications. In this role, you will collaborate closely with our Data Science team to design, prototype, develop, and deploy full-stack applications.

This role offers a dynamic opportunity to join Medtronic's Diabetes business. Medtronic has announced its intention to separate the Diabetes division to promote future growth and innovation within the business and reallocate investments and resources across Medtronic, subject to applicable information and consultation requirements. While you will start your employment with Medtronic, upon establishment of SpinCo or the transition of the Diabetes business to another company, your employment may transfer to either SpinCo or the other company, at Medtronic's discretion and subject to any applicable information and consultation requirements in your jurisdiction.

Responsibilities may include the following, and other duties may be assigned:
- Collaborate with data scientists and stakeholders to understand business requirements and translate them into technical solutions.
- Design and build end-to-end solutions in React.
- Build responsive, user-friendly interfaces with an emphasis on usability, performance, and accessibility.
- Develop responsive, interactive UIs using modern JavaScript frameworks such as React, Vue, or Angular to present data science insights and AI-driven results.
- Build and maintain enterprise full-stack web applications serving a user base of more than 10,000 users.
- Design interfaces that allow business users to interact with AI/ML and Gen AI models, provide input parameters, view predictions, and interpret results.
- Collaborate with data scientists to integrate model outputs into front-end components in real time or batch.
- Deploy, monitor, and maintain applications on AWS, leveraging services like EC2, Fargate, Lambda, S3, and AWS Cognito.
- Ensure performance, security, and scalability of applications in cloud environments.
- Participate in code reviews, testing, and agile ceremonies to ensure high-quality delivery.
- Mentor and provide guidance to junior engineers.

Required Knowledge and Experience:
- Bachelor's or Master's degree in Computer Science, Computer Engineering, or related fields.
- 5+ years of experience in building web applications using React, preferably working on data-driven applications.
- Front-end development using HTML5, CSS3, JS, and one or more JavaScript frameworks like React, Angular, or equivalent; React is preferred.
- Understanding of authentication and security best practices for web applications, e.g., SSO and Contrast Security integrations.
- Basic knowledge of SQL, Python, and database technologies.
- Hands-on experience with cloud computing platforms (AWS).
- Familiarity with CI/CD pipelines and automated testing frameworks.
- Expertise in deploying, scaling, and monitoring enterprise-level web applications.
- Portfolios or GitHub links for React projects.

Preferred Qualifications:
- JavaScript/React certifications.
- Experience in JavaScript frameworks for building web applications, preferably React JS.
- Experience with microservices architecture and containerization (e.g., Docker, Kubernetes).
- Ability to learn and apply new technologies quickly.

If interested, please share your updated CV at ashwini.ukekar@medtronic.com
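The AWS list above centers on Lambda behind services like API Gateway and Cognito. As a hedged sketch of the shape such a backend endpoint takes, here is a hypothetical Python Lambda handler in the API Gateway proxy format; the patient_id parameter and the stubbed risk_score are invented for illustration.

```python
import json

def lambda_handler(event, context):
    """Hypothetical API Gateway proxy handler feeding a React front end."""
    params = event.get("queryStringParameters") or {}
    patient_id = params.get("patient_id", "demo")
    # A real system would call the data-science model here; we stub the output.
    prediction = {"patient_id": patient_id, "risk_score": 0.42}
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(prediction),
    }
```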

Posted 5 hours ago

Apply

8.0 - 11.0 years

20 - 35 Lacs

Pune

Work from Office

Source: Naukri

A Day in the Life: Our Global Diabetes Capability Center in Pune is expanding to serve more people living with diabetes globally. Our state-of-the-art facility is dedicated to transforming diabetes management through innovative solutions and technologies that reduce the burden of living with diabetes. In this role as a Senior Machine Learning Engineer in the Digital Technology team, you will collaborate with data scientists and stakeholders to understand business requirements and translate them into machine learning solutions.

This role offers a dynamic opportunity to join Medtronic's Diabetes business. Medtronic has announced its intention to separate the Diabetes division to promote future growth and innovation within the business and reallocate investments and resources across Medtronic, subject to applicable information and consultation requirements. While you will start your employment with Medtronic, upon establishment of SpinCo or the transition of the Diabetes business to another company, your employment may transfer to either SpinCo or the other company, at Medtronic's discretion and subject to any applicable information and consultation requirements in your jurisdiction.

Responsibilities may include the following, and other duties may be assigned:
- Design, develop, and implement machine learning models and algorithms to solve complex business problems.
- Use software design principles to develop production-ready code.
- Collect, preprocess, and analyze large datasets to train and evaluate machine learning models.
- Optimize and fine-tune machine learning models for performance and accuracy.
- Deploy machine learning models into production environments and monitor their performance.
- Develop processes and tools to monitor and analyze model performance and data accuracy.
- Stay up to date with the latest advancements in machine learning and apply them to improve existing models and algorithms.
- Collaborate with cross-functional teams to integrate machine learning solutions into existing systems and workflows.
- Document and communicate machine learning solutions, methodologies, and results to technical and non-technical stakeholders.
- Mentor and provide guidance to junior machine learning engineers.

Required Knowledge and Experience:
- Bachelor's degree from an accredited institution in a technical discipline such as the sciences, technology, engineering, or mathematics.
- 5+ years of industry experience in writing production-level, scalable code (e.g., in Python).
- 4+ years of experience with one or more of the following machine learning topics: classification, clustering, optimization, recommendation systems, deep learning.
- 4+ years of industry experience with distributed computing frameworks such as PySpark.
- 4+ years of industry experience with popular ML frameworks such as Spark MLlib, Keras, TensorFlow, PyTorch, HuggingFace Transformers, and other libraries (like scikit-learn, spaCy, gensim, etc.).
- 4+ years of industry experience with major cloud computing services like AWS, Azure, or GCP.
- 1+ years of experience in building and scaling generative AI applications, specifically around frameworks like LangChain, PGVector, Pinecone, and Bedrock. Experience in building agentic AI applications.
- 2+ years of technical leadership leading junior engineers in a product development setting.

Preferred Qualifications:
- MS or Ph.D. in Computer Science, Software Engineering, Electrical Engineering, or related fields.
- Proficiency with containerization services.
- Proficiency with SageMaker for model deployment.
- Experience working in a CI/CD framework.
- Experience in deploying and scaling open-source LLM models.

If interested, please share your updated CV at ashwini.ukekar@medtronic.com
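For a sense of the "production-level, scalable code (e.g., in Python)" bar, a minimal train-and-evaluate loop with scikit-learn (one of the libraries named above) might look like the sketch below; it uses synthetic data in place of a real dataset.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in data; a real pipeline would load preprocessed features.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
print(f"accuracy: {accuracy_score(y_test, model.predict(X_test)):.3f}")
```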

Posted 5 hours ago

Apply

0.0 - 3.0 years

4 - 9 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Source: Naukri

Join Tortez Solutions as a Python Developer (0–3 yrs). Work on backend development, REST APIs, automation, and real-world AI solutions. Skills: Python, Flask/FastAPI, Git, SQL. Follow us at https://www.linkedin.com/company/tortez-solutions to know more.
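For applicants wondering what "Flask/FastAPI plus REST APIs" means in practice, here is a minimal FastAPI sketch; the task resource and in-memory store are invented for illustration, and it assumes Pydantic v2 for model_dump().

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class TaskIn(BaseModel):
    title: str
    done: bool = False

TASKS: list[dict] = []  # in-memory store for the sketch; a real API would use SQL

@app.post("/tasks")
def create_task(task: TaskIn) -> dict:
    record = {"id": len(TASKS) + 1, **task.model_dump()}
    TASKS.append(record)
    return record

@app.get("/tasks")
def list_tasks() -> list[dict]:
    return TASKS
```

Saved as main.py, this runs with `uvicorn main:app --reload`.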

Posted 5 hours ago

Apply

8.0 - 13.0 years

18 - 20 Lacs

Nashik, Pune, Bengaluru

Work from Office

Source: Naukri

8+ years of total experience in software testing, with 5+ years focused on performance testing. Expertise in JMeter (preferred), LoadRunner, or other similar tools. Good understanding of APM tools such as Dynatrace, AppDynamics, or New Relic. Required candidate profile: knowledge of web technologies (HTTP, REST, SOAP), databases (SQL/NoSQL), and cloud platforms (AWS/Azure). Experience working with CI/CD tools (Jenkins, Git, Docker, etc.).
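JMeter and LoadRunner scenarios are built in their own UIs and test-plan files rather than in Python, so as a Python-flavored illustration of the same load-testing idea, here is a minimal scenario in Locust, a Python-based alternative tool; the endpoints and task weights are invented.

```python
from locust import HttpUser, task, between

class WebsiteUser(HttpUser):
    wait_time = between(1, 3)  # simulated think time between user actions

    @task(3)  # weight: browsing happens 3x as often as searching
    def browse_home(self):
        self.client.get("/")

    @task(1)
    def search_api(self):
        self.client.get("/api/search", params={"q": "demo"})
```

Run with `locust -f loadtest.py --host=https://target.example.com` (hypothetical host) and drive the ramp-up from Locust's web UI.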

Posted 6 hours ago

Apply

4.0 - 9.0 years

24 - 36 Lacs

Bengaluru

Work from Office

Source: Naukri

Responsibilities: * Design, develop, test, and maintain software solutions using Python, React, AWS, Kubernetes, LLMs, LangChain, and CrewAI. * Collaborate with cross-functional teams on project delivery and technical strategy.

Posted 6 hours ago

Apply

5.0 - 10.0 years

20 - 30 Lacs

Hyderabad

Work from Office

Source: Naukri

About Client: Hiring for one of the topmost MNCs!

Job Title: Snowflake Developer / Snowflake Data Engineer
Qualification: Any graduate or above
Relevant Experience: 4 to 12 years
Skills: Snowflake, Python/PySpark, SQL, AWS services

Role description / expectations from the role: Strong experience in building and designing data warehouses, data lakes, and data marts, with end-to-end implementation experience focusing on large enterprise-scale Snowflake implementations on any of the hyperscalers. Strong experience with building productionized data ingestion and data pipelines in Snowflake. Should have good experience with Snowflake RBAC and data security. Strong experience with Snowflake features, including new Snowflake features. Should have good experience in Python/PySpark. Should have experience with AWS services (S3, Glue, Lambda, Secrets Manager, DMS) and a few Azure services (Blob Storage, ADLS, ADF). Should have experience with or knowledge of orchestration and scheduling tools like Airflow. Should have a good understanding of ETL processes and ETL tools.

Location: Hyderabad
CTC Range: 20 LPA to 30 LPA
Notice Period: Any
Shift Timing: N/A
Mode of Interview: Virtual
Mode of Work: Work from Office

Vardhani, IT Staffing Analyst, Black and White Business Solutions Pvt Ltd, Bangalore, Karnataka, INDIA. 8686127477 | vardhani@blackwhite.in | www.blackwhite.in
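As a sense-check for the Python-plus-Snowflake work described, a minimal query through the snowflake-connector-python package might look like this; the account locator, credentials, warehouse, and table are all hypothetical placeholders.

```python
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.ap-south-1",  # placeholder account locator
    user="ETL_USER",
    password="***",                # use a secrets manager in practice
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)
try:
    cur = conn.cursor()
    cur.execute("SELECT COUNT(*) FROM ORDERS WHERE LOAD_DATE = CURRENT_DATE")
    print(cur.fetchone()[0])
finally:
    conn.close()
```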

Posted 7 hours ago

Apply

5.0 - 10.0 years

15 - 25 Lacs

Bengaluru

Work from Office

Source: Naukri

About Client: Hiring for one of the most prestigious multinational corporations!

Job Title: Data Engineering
Qualification: Any graduate or above
Relevant Experience: 4-10 years
Required Technical Skill Set: SQL, Snowflake, Python, Cloud (AWS, GCP, or Azure)
Location: Bangalore
CTC Range: 15 LPA - 30 LPA
Notice Period: Immediate
Shift Timing: N/A
Mode of Interview: Virtual
Mode of Work: WFO (Work from Office)

Pooja Singh KS, IT Staffing Analyst, Black and White Business Solutions Pvt Ltd, Bangalore, Karnataka, INDIA. pooja.singh@blackwhite.in | www.blackwhite.in

Posted 7 hours ago

Apply

4.0 - 9.0 years

10 - 18 Lacs

Noida

Work from Office

Source: Naukri

Precognitas Health Pvt. Ltd., a fully owned subsidiary of Foresight Health Solutions LLC, is seeking a Data Engineer to build and optimize our data pipelines, processing frameworks, and analytics infrastructure that power critical healthcare insights. Are you a bright, energetic, and skilled data engineer who wants to make a meaningful impact in a dynamic environment? Do you enjoy designing and implementing scalable data architectures and ML pipelines, automating ETL workflows, and working with cloud-native solutions to process large datasets efficiently? Are you passionate about transforming raw data into actionable insights that drive better healthcare outcomes? If so, join us! You'll play a crucial role in shaping our data strategy, optimizing data ingestion, and ensuring seamless data flow across our systems while leveraging the latest cloud and big data technologies.

Required Skills & Experience:
- 4+ years of experience in data engineering, data pipelines, and ETL/ELT workflows.
- Strong Python programming skills with expertise in NumPy, Pandas, and data manipulation techniques.
- Hands-on experience with orchestration tools like Prefect, Apache Airflow, or AWS Step Functions for managing complex workflows.
- Proficiency in AWS services, including AWS Glue, AWS Batch, S3, Lambda, RDS, Athena, and Redshift.
- Experience with Docker containerization and Kubernetes for scalable and efficient data processing.
- Strong understanding of data processing layers, batch and streaming data architectures, and analytics frameworks.
- Expertise in SQL and NoSQL databases, query optimization, and data modeling for structured and unstructured data.
- Familiarity with big data technologies like Apache Spark, Hadoop, or similar frameworks.
- Experience implementing data validation, quality checks, and observability for robust data pipelines.
- Strong knowledge of Infrastructure as Code (IaC) using Terraform or AWS CDK for managing cloud-based data infrastructure.
- Ability to work with distributed systems, event-driven architectures (Kafka, Kinesis), and scalable data storage solutions.
- Experience with CI/CD for data workflows, including version control (Git), automated testing, and deployment pipelines.
- Knowledge of data security, encryption, and access-control best practices in cloud environments.
- Strong problem-solving skills and ability to collaborate with cross-functional teams, including data scientists and software engineers.

Compensation will be commensurate with experience. If you are interested, please send your application to jobs@precognitas.com. For more information about our work, visit www.caliper.care
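Since the posting calls out orchestration tools such as Prefect, Apache Airflow, and AWS Step Functions, here is a minimal Airflow DAG sketch showing how an extract step and a transform step would be wired together; the DAG id and task bodies are hypothetical, and `schedule_interval` is the pre-2.4 spelling of Airflow's `schedule` argument.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw files from S3")  # placeholder for real extraction logic

def transform():
    print("clean and validate records")  # placeholder for real transforms

with DAG(
    dag_id="healthcare_etl",          # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task  # transform runs only after extract succeeds
```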

Posted 7 hours ago

Apply

2.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

InnoVites, who are we? InnoVites is one of the world's leading providers of software solutions for the cable and wire manufacturing and distribution industry, transforming cable factories into smart factories. It feels great to build and deliver software solutions that exactly address customer needs, and that's exactly what InnoVites is about. We have a laser-sharp focus on our target market: the wire and cable industry. With our deep understanding of this vertical, we love building software that creates high value for our customers. It has resulted in continuous growth of the company, while we serve customers located in over 20 countries worldwide. As our customers benefit from the global energy transition, the future at InnoVites is electrifying! As a team we cherish craftsmanship and celebrate team results. Every day we seek opportunities to learn from each other and grow. Together, we can make a difference, and you can too. Dream it. Build it. Do it here.

Job Title: Generative AI (GenAI) Solutions Engineer
Location: Chennai
Type: Full-time / Internship
Department: Design / Product

About the Role: We are looking for a skilled and proactive Generative AI Solutions Engineer with 1-2 years of relevant background in building intelligent applications using Large Language Models (LLMs), RAG architectures, semantic search, and vector databases. The ideal candidate should also have hands-on experience in AI-powered chatbot development and integration. Strong proficiency in Python, Node.js, and PostgreSQL with PGVector is required. This role is suited for someone passionate about building real-world applications powered by cutting-edge generative AI technologies.

Key Responsibilities:
- Design and develop intelligent chatbots using LLMs and vector-based retrieval systems.
- Implement RAG pipelines combining document retrieval with generative model responses.
- Build semantic search systems using PGVector and high-dimensional embeddings.
- Develop solutions to automatically read and interpret emails, extract key content, and trigger workflows.
- Create tools to scan and understand documents (PDFs, Word files, etc.) using OCR, embeddings, and LLMs.
- Integrate AI models with backend systems and APIs using Python and Node.js.
- Optimize the performance of AI pipelines and ensure high-quality, context-aware outputs.
- Document workflows, prompt strategies, and AI integration best practices.

Qualifications:
- Postgraduate degree in Computer Science, AI/ML, Data Science, or related fields.
- 1-2 years of experience working with LLMs, semantic embeddings, and RAG architectures.
- Experience with PGVector and PostgreSQL for vector-based search.
- Proficiency in Python and Node.js for application and API development.
- Experience developing chatbots, email automation, or document intelligence tools.
- Familiarity with OCR tools, document parsing libraries, and AI frameworks like Hugging Face Transformers, LangChain, or FAISS.

Bonus Points (Good to Have):
- Hands-on experience with email classification, intent detection, or Outlook/Gmail API integration.
- Experience using tools like Tesseract OCR, pdfplumber, LangChain agents, or vector indexers.
- Deployment experience on cloud platforms (AWS/GCP/Azure).
- Exposure to MLOps, containerization (Docker), or conversational UX design.
- Interest or background in enterprise knowledge search, RPA, or process automation.

Why Join Us?
- Gain real-world experience working on enterprise-grade applications.
- Work with a highly experienced team and receive mentorship.
- Encouragement to self-learn, explore, and innovate in a supportive environment.
- Opportunity to transition into a full-time role after a successful internship.
- Flexible and collaborative work culture with room for growth and creativity.
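To make the PGVector requirement concrete: retrieval in a RAG pipeline often reduces to a nearest-neighbor query over an embeddings column. A minimal sketch using the pgvector Python package with psycopg2 follows; the connection string, table schema, and three-dimensional query vector are toy placeholders (real embeddings run to hundreds of dimensions).

```python
import numpy as np
import psycopg2
from pgvector.psycopg2 import register_vector

conn = psycopg2.connect("dbname=rag user=app password=***")  # placeholder DSN
register_vector(conn)  # lets psycopg2 send/receive pgvector values

query_embedding = np.array([0.12, -0.31, 0.77])  # placeholder query vector

cur = conn.cursor()
cur.execute(
    # <-> is pgvector's L2-distance operator; assumes a docs(id, content, embedding) table
    "SELECT id, content FROM docs ORDER BY embedding <-> %s LIMIT 5",
    (query_embedding,),
)
for doc_id, content in cur.fetchall():
    print(doc_id, content[:80])
```

The retrieved rows would then be placed into the LLM prompt, which is the "retrieval-augmented" half of RAG.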

Posted 8 hours ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

8+ years of experience in business systems architecture, enterprise applications management, or IT consulting.

End-to-End Data Architecture & Insights Enablement: Design and implement scalable data pipelines and analytics frameworks. This includes creating and maintaining data models in Google BigQuery, applying dbt transformations, integrating with data ingestion tools like Fivetran, and enabling self-serve analytics via Tableau dashboards and reports. Collaborate with data engineering and business stakeholders to ensure consistent, trustworthy, and actionable insights.

Technical Expertise:
- Deep understanding of ERP, CRM, CLM, HRIS, and financial systems (e.g., Salesforce, NetSuite, Oracle, Bob, Ironclad).
- Strong hands-on experience with data architecture and analytics platforms, including Google BigQuery (data modeling, table design), dbt for data transformation, Fivetran or similar ETL/ELT tools for ingestion, and Tableau (or equivalent BI tools) for building interactive dashboards and reports.
- Experience with API integration, Workato, and automation tools.
- Proficiency in cloud platforms (AWS, Azure, or GCP) and SaaS solutions.

Business Acumen & Leadership:
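As a small illustration of the BigQuery end of this stack, a self-serve metric like the one below could be pulled with the google-cloud-bigquery client; the project, dataset, and table names are invented for the sketch.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")  # placeholder project

query = """
    SELECT region, SUM(amount) AS total
    FROM `my-analytics-project.finance.invoices`  -- hypothetical table
    GROUP BY region
    ORDER BY total DESC
"""
for row in client.query(query).result():
    print(row["region"], row["total"])
```

In the stack described above, a dbt model would typically materialize this aggregation so Tableau reads a governed table instead of ad-hoc SQL.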

Posted 8 hours ago

Apply

8.0 - 10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Job Description: We are seeking a highly skilled and motivated Principal Engineer / Team Lead to join our dynamic Claim Solutions team. This role demands a strong technical leader with a proven track record in designing, developing, and deploying robust API solutions within the insurance domain. You will be responsible for driving technical excellence, leading complex projects, and mentoring a team of 4-5 engineers. This position is a blend of 80% hands-on technical work and 20% team leadership, offering a unique opportunity to shape the future of our insurance technology platform.

Day-to-Day Responsibilities: Writing code and participating in code reviews and design discussions. Designing and developing services for the claims organization. Troubleshooting and resolving production issues. Participating in team meetings and technical discussions. Mentoring and guiding team members. Conducting performance reviews and providing feedback. Collaborating with all cross-functional teams and stakeholders. Staying up to date on industry trends and technologies.

Technical Leadership (80%):
- API Design & Architecture: Design and develop reliable, scalable, secure, and high-performance APIs for the claims organization. Define API standards, best practices, and architectural patterns. Conduct technical feasibility studies and proofs of concept for new initiatives.
- Development & Implementation: Write clean, efficient, and well-documented code using Python and other technologies as needed. Work with various AWS services, including Lambda, API Gateway, S3, SQS, SNS, DynamoDB, RDS, EC2, Fargate, etc. Ensure code quality by writing unit tests. Ensure adherence to security and compliance requirements.
- Optimization & Troubleshooting: Ensure systems and services operate at web scale. Identify and resolve performance bottlenecks. Implement monitoring and logging solutions to ensure system stability. Conduct root-cause analysis for production issues and implement preventive measures. Implement cost-optimization mechanisms.
- Insurance Domain Expertise: Develop a deep understanding of insurance industry processes and data models. Collaborate with business analysts and product owners to translate requirements into technical solutions. Stay abreast of emerging trends and technologies in the insurtech space.
- Code Review and Standards: Conduct rigorous code reviews to maintain code quality and consistency. Enforce coding standards and best practices across the team. Contribute to the development and maintenance of technical documentation.

Team Leadership (20%):
- Team Mentorship & Guidance: Provide technical mentorship and guidance to team members. Conduct regular one-on-one meetings to discuss progress, challenges, and career development. Foster a collaborative and supportive team environment.
- Task Assignment & Management: Assign tasks and responsibilities to team members based on their skills and experience. Monitor team progress and ensure timely delivery of projects. Identify and address any roadblocks or challenges faced by the team.
- Performance Evaluation & Feedback: Conduct performance evaluations and provide constructive feedback to team members. Identify training and development needs for team members. Assist in the recruitment and onboarding of new team members.
- Communication & Collaboration: Facilitate effective communication and collaboration within the team and with other stakeholders. Represent the team in technical discussions and meetings. Help define and refine Agile processes within the team.
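To ground the AWS service list above, here is a hedged boto3 sketch of the kind of claims read/write such services might involve; the table name, key schema, and items are hypothetical, not Verisk's actual data model.

```python
import boto3

dynamodb = boto3.resource("dynamodb", region_name="us-east-1")  # placeholder region
claims = dynamodb.Table("claims")  # hypothetical table keyed on claim_id

claims.put_item(Item={"claim_id": "CLM-1001", "status": "OPEN", "amount": 2500})

response = claims.get_item(Key={"claim_id": "CLM-1001"})
print(response.get("Item"))
```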
Qualifications

Education: Bachelor's or Master's degree in Computer Science, Engineering, or a related field.

Experience: 8-10 years of experience in software development, with a focus on API design and development.

Technical Skills:
- Strong proficiency in at least one programming language (Java, Python, .NET, Node.js); Python is highly preferred.
- Experience with API design and development (REST, GraphQL).
- Experience or knowledge working with data science teams in building and deploying their AI/ML models to production is highly desirable.
- Strong experience leading projects end to end.
- Knowledge of API security best practices (OAuth, JWT).
- Experience with any public cloud platform (AWS, Azure, GCP); AWS is highly preferred.
- Experience with database systems (SQL, NoSQL).
- Knowledge of CI/CD pipelines and DevOps practices.
- Experience with containerization and orchestration technologies (Docker, Fargate, ECS, Kubernetes, etc.).

Insurance Domain Knowledge: Understanding of insurance industry processes and data models is highly desirable. Experience with insurtech solutions is a plus.

Leadership Skills: Proven ability to lead and mentor engineers. Excellent communication and interpersonal skills. Strong problem-solving and analytical skills. Ability to work in a fast-paced and dynamic environment.

Other Skills: Strong understanding of software development methodologies (Agile, Scrum). Excellent problem-solving and debugging skills. Ability to work independently and as part of a team. Strong attention to detail and commitment to quality.

About Us: For over 50 years, Verisk has been the leading data analytics and technology partner to the global insurance industry, delivering value to our clients through expertise and scale. We empower communities and businesses to make better decisions on risk, faster. At Verisk, you'll have the chance to use your voice and build a rewarding career that's as unique as you are, with work flexibility and the support, coaching, and training you need to succeed. For the eighth consecutive year, Verisk is proudly recognized as a Great Place to Work® for outstanding workplace culture in the US, the fourth consecutive year in the UK, Spain, and India, and the second consecutive year in Poland. We value learning, caring, and results and make inclusivity and diversity a top priority. In addition to our Great Place to Work® Certification, we've been recognized by The Wall Street Journal as one of the Best-Managed Companies and by Forbes as a World's Best Employer and Best Employer for Women, testaments to the value we place on workplace culture. We're 7,000 people strong. We relentlessly and ethically pursue innovation. And we are looking for people like you to help us translate big data into big ideas. Join us and create an exceptional experience for yourself and a better tomorrow for future generations.
Verisk Businesses:
- Underwriting Solutions: provides underwriting and rating solutions for auto and property, general liability, and excess and surplus to assess and price risk with speed and precision
- Claims Solutions: supports end-to-end claims handling with analytic and automation tools that streamline workflow, improve claims management, and support better customer experiences
- Property Estimating Solutions: offers property estimation software and tools for professionals in estimating all phases of building and repair to make day-to-day workflows the most efficient
- Extreme Event Solutions: provides risk modeling solutions to help individuals, businesses, and society become more resilient to extreme events
- Specialty Business Solutions: provides an integrated suite of software for full end-to-end management of insurance and reinsurance business, helping companies manage their businesses through efficiency, flexibility, and data governance
- Marketing Solutions: delivers data and insights to improve the reach, timing, relevance, and compliance of every consumer engagement
- Life Insurance Solutions: offers end-to-end, data-insight-driven core capabilities for carriers, distribution, and direct customers across the entire policy lifecycle of life and annuities, for both individual and group
- Verisk Maplecroft: provides intelligence on sustainability, resilience, and ESG, helping people, businesses, and societies become stronger

Verisk Analytics is an equal opportunity employer. All members of the Verisk Analytics family of companies are equal opportunity employers. We consider all qualified applicants for employment without regard to race, religion, color, national origin, citizenship, sex, gender identity and/or expression, sexual orientation, veteran's status, age, or disability. Verisk's minimum hiring age is 18, except in countries with a higher age limit, subject to applicable law. https://www.verisk.com/company/careers/

Unsolicited resumes sent to Verisk, including unsolicited resumes sent to a Verisk business mailing address, fax machine or email address, or directly to Verisk employees, will be considered Verisk property. Verisk will NOT pay a fee for any placement resulting from the receipt of an unsolicited resume. Verisk Employee Privacy Notice

Posted 8 hours ago

Apply

5.0 years

0 Lacs

India

On-site

Source: Glassdoor

Job Summary

Job Title: Senior Data Engineer – Machine Learning & Data Engineering
Location: Gurgaon [IND]
Department: Data Engineering / Data Science
Employment Type: Full-Time
Years of Experience: 5-10

About the Role: We are looking for a Senior Data Engineer with a strong background in machine learning infrastructure, data pipeline development, and collaboration with data scientists to drive the deployment and scalability of advanced analytics and AI solutions. You will play a pivotal role in building and optimizing the data systems that power ML models, dashboards, and strategic insights across the company.

Key Responsibilities:
- Design, develop, and optimize scalable data pipelines and ETL/ELT processes to support ML workflows and analytics.
- Collaborate with data scientists to operationalize machine learning models in production environments (batch, real-time).
- Build and maintain data lakes, data warehouses, and feature stores using modern cloud technologies (e.g., AWS/GCP/Azure, Snowflake, Databricks).
- Implement and maintain ML infrastructure, including model versioning, CI/CD for ML, and monitoring tools (MLflow, Airflow, Kubeflow, etc.).
- Develop and enforce data quality, governance, and security standards.
- Troubleshoot data issues and support the lifecycle from model development to deployment.
- Partner with software engineers and DevOps teams to ensure data systems are robust, scalable, and secure.
- Mentor junior engineers and provide technical leadership on data and ML infrastructure.

Qualifications

Required:
- 5+ years of experience in data engineering, ML infrastructure, or a related field.
- Proficient in Python, SQL, and big data processing frameworks (Spark, Flink, or similar).
- Experience with orchestration tools like Apache Airflow, Prefect, or Luigi.
- Hands-on experience deploying and managing machine learning models in production.
- Deep knowledge of cloud platforms (AWS, GCP, or Azure) and containerization (Docker, Kubernetes).
- Familiarity with CI/CD tools for data and ML pipelines.
- Experience with version control, testing, and reproducibility in data workflows.

Preferred:
- Experience with feature stores (e.g., Feast), ML experiment tracking (e.g., MLflow), and monitoring solutions.
- Background in supporting NLP, computer vision, or time-series ML models.
- Strong communication skills and ability to work cross-functionally with data scientists, analysts, and engineers.
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
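Since the role lists MLflow among its model-versioning and monitoring tools, a minimal tracking call looks like the sketch below; the experiment name, parameter, and metric values are illustrative.

```python
import mlflow

mlflow.set_experiment("churn-model")  # hypothetical experiment name

with mlflow.start_run():
    mlflow.log_param("n_estimators", 100)
    mlflow.log_metric("auc", 0.91)
```

Runs logged this way appear in the MLflow UI, providing the model-versioning and comparison trail the posting asks for.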

Posted 8 hours ago

Apply

5.0 years

3 Lacs

Thiruvananthapuram

On-site

Source: Glassdoor

Job Requirements: Quest Global is an organization at the forefront of innovation and one of the world's fastest-growing engineering services firms, with deep domain knowledge and recognized expertise in the top OEMs across seven industries. We are a twenty-five-year-old company on a journey to becoming a centenary one, driven by aspiration, hunger, and humility. We are looking for humble geniuses who believe that engineering has the potential to make the impossible possible; innovators who are not only inspired by technology and innovation but also perpetually driven to design, develop, and test as a trusted partner for Fortune 500 customers. As a team of remarkably diverse engineers, we recognize that what we are really engineering is a brighter future for us all. If you want to contribute to meaningful work and be part of an organization that truly believes that when you win, we all win, and when you fail, we all learn, then we're eager to hear from you. The achievers and courageous challenge-crushers we seek have the following characteristics and skills:

Roles & Responsibilities:
- Collaborate with business stakeholders to gather and translate data requirements into analytical solutions.
- Analyze large and complex datasets to identify trends, patterns, and actionable insights.
- Design, develop, and maintain interactive dashboards and reports using Elasticsearch/Kibana or Power BI.
- Conduct ad-hoc analyses and deliver data-driven narratives to support business decision-making.
- Ensure data accuracy, consistency, and integrity through rigorous validation and quality checks.
- Write and optimize SQL queries, views, and data models for reporting and analysis.
- Present findings through compelling visualizations, presentations, and written summaries.
- Work closely with data engineers and architects to enhance data pipelines and infrastructure.
- Contribute to the development and standardization of KPIs, metrics, and data governance practices.

Required Skills (Technical Competency):
- Bachelor's or Master's degree in Data Science, Computer Science, Statistics, or a related field.
- 5+ years of experience in a data analyst or business intelligence role.
- Proficiency in SQL and data visualization tools such as Power BI, Kibana, or similar.
- Proficiency in Python, Excel, and data storytelling.
- Understanding of data modelling, ETL concepts, and basic data architecture.
- Strong analytical thinking and problem-solving skills.
- Excellent communication and stakeholder management skills.
- Adherence to Information Security Management policies and procedures.

Desired Skills: Elasticsearch/Kibana, Power BI, AWS, Python, SQL, data modelling, data analysis, data quality checks, data validation, data visualization, stakeholder communication, Excel, data storytelling, team collaboration, problem-solving, analytical thinking, presentation skills, ETL concepts.

Posted 8 hours ago

Apply

1.0 years

0 Lacs

Cochin

On-site

Source: Glassdoor

Position: Cloud Technical Engineer (Microsoft & AWS)
Experience: 1+ years
Salary: 3-15 LPA
Location: Kochi

Job Description: Looking for a skilled Cloud Technical Engineer with hands-on experience in both Microsoft and AWS environments. The ideal candidate should be capable of managing, deploying, and troubleshooting cloud-based infrastructure and services.

Key Skills:
- AWS services (EC2, S3, VPC, IAM, Lambda)
- Microsoft Azure or Windows Server/Active Directory
- Cloud automation (Terraform/CloudFormation/PowerShell)
- Server administration & cloud security
- Good communication and troubleshooting skills

Job Types: Full-time, Permanent
Pay: ₹240,352.44 - ₹1,380,545.48 per year
Benefits: Health insurance, paid sick time, Provident Fund
Schedule: Day shift, morning shift
Supplemental Pay: Commission pay, performance bonus, yearly bonus
Experience: Cloud Engineer: 1 year (Required)
Work Location: In person
Speak with the employer: +91 8281289352

Posted 8 hours ago

Apply

3.0 - 5.0 years

8 Lacs

India

On-site

Source: Glassdoor

Back-End Developer (Onsite – Kerala)
Location: Onsite – Kerala
Salary: ₹8 LPA (Annual)
Job Type: Full-Time

About Xpress Health: Xpress Health is a rapidly growing healthcare technology company connecting medical professionals with hospitals and clinics across Ireland and India. We're building reliable, secure, and scalable systems to support our healthcare staffing platform, and are seeking a talented Back-End Developer to join our Kerala-based engineering team.

Role Overview: As a Back-End Developer, you will be responsible for designing, building, and maintaining the server-side applications and APIs that support our mobile and web platforms. This is an onsite role based in Kerala.

Key Responsibilities:
- Design and implement robust, scalable, and secure backend systems
- Develop APIs and integrate third-party services
- Optimize application performance, latency, and scalability
- Collaborate with frontend developers, product managers, and QA teams
- Maintain clear documentation for code, systems, and APIs
- Ensure data security and compliance best practices
- Debug issues and implement fixes in a timely manner

Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field
- 3-5 years of experience in backend development
- Proficiency in Node.js, Python, or similar backend languages
- Experience with RESTful APIs, microservices, and cloud infrastructure (AWS/GCP)
- Strong database knowledge (MySQL, PostgreSQL, MongoDB, etc.)
- Understanding of Git, CI/CD pipelines, and containerization (Docker)
- Ability to work onsite at our Kerala office
- Strong problem-solving and team collaboration skills

Nice to Have:
- Experience in healthcare or workforce platforms
- Knowledge of security protocols and data compliance (e.g., GDPR)

Job Type: Full-time
Pay: ₹800,000.00 per year
Benefits: Paid time off
Location Type: In-person
Schedule: Evening shift, fixed shift, Monday to Friday, UK shift
Application Question(s): Tell us about a book or movie you love. What would you say to get someone else interested in it?
Experience: Back-end development: 4 years (Required)
Language: English (Required)
Work Location: In person

Posted 8 hours ago

Apply

5.0 years

12 Lacs

India

On-site

Source: Glassdoor

Tech Lead – Xpress Health Location: Onsite - Ernakulam, Kerala Salary: As per experience Job Type: Full-Time About Xpress Health: Xpress Health is a fast-growing healthcare technology company that connects medical professionals with healthcare institutions across Ireland and India. As we expand our digital infrastructure, we're looking for an experienced Tech Lead to drive technology decisions, mentor the engineering team, and ensure the delivery of high-quality, scalable software solutions. Role Overview: The Tech Lead will be responsible for leading the development team, architecting technical solutions, and managing the end-to-end software development lifecycle. You’ll collaborate closely with product, design, and leadership teams to translate business requirements into robust and scalable systems. Key Responsibilities: Lead and mentor the development team across web, mobile, and backend platforms Design and implement scalable software architecture and code standards Drive full-cycle product development: planning, coding, testing, deployment Review and improve code quality through code reviews and best practices Collaborate with Product Managers and UX Designers for feature planning Ensure systems are secure, performant, and maintainable Stay updated on emerging technologies and recommend tools/frameworks Manage sprints, timelines, and technical documentation Requirements: Bachelor's or Master’s degree in Computer Science, Engineering, or related field 5+ years of hands-on development experience Strong proficiency in one or more stacks (e.g., MERN, MEAN, Flutter, Node.js, Python) Experience with cloud platforms (AWS, GCP, or Azure) Good understanding of CI/CD, DevOps practices, and scalable system design Proven leadership and mentoring skills Strong communication and project management skills Nice to Have: Experience working in healthcare or staffing platforms Familiarity with security and compliance in healthcare tech (e.g., GDPR, HIPAA) Job Type: Full-time Pay: ₹1,200,000.00 per year Benefits: Paid time off Schedule: Evening shift Fixed shift Monday to Friday UK shift Application Question(s): What’s a challenge you’ve faced in life, and how did you explain it to others at the time? Experience: hands-on development: 5 years (Required) Language: English (Required) Work Location: In person

Posted 8 hours ago

Apply

10.0 years

12 Lacs

Cochin

On-site

Source: Glassdoor

Qualifications & Experience

Education: Diploma or bachelor's degree in mechanical engineering.
Experience: Minimum 10 years in QA/QC roles, preferably in oil & gas projects.
Certifications: CSWIP, AWS (American Welding Society), API, or equivalent.
Knowledge: Familiarity with ISO 9001, industry codes, and Oman regulatory standards.

Responsibilities:
- Implementing the project construction quality program at site.
- Monitoring the quality performance of welders and conducting toolbox talks for welders to improve their performance.
- Monitoring that welding activities follow the WPS and project specification.
- Coordinating with the client regarding quality and technical issues.
- Responsible for all quality control activities at site.
- Attending the monthly QC meeting with the client.
- Carrying out WQT for welders as per the required specifications.
- Ensuring painting has been done within scope and as per the approved client specification.
- Inspecting for holidays and discontinuities using high-voltage and low-voltage holiday detectors.
- Preparing hydro-test packages and ensuring pre-test punch-list items (item 'A') are cleared before the hydro test.
- Carrying out MEI completion for completed lines and preparing the MEI completion package.
- Ensuring bolt torquing is carried out at the required torque values and as per procedure.

Job Type: Full-time
Pay: Up to ₹100,000.00 per month
Schedule: Day shift
Application Question(s):
- How many years of experience do you have?
- Do you have experience in oil & gas?
- If you got a chance to work in the UAE or Oman, would you be interested?
- Do you have a CSWIP certificate?
Work Location: In person

Posted 8 hours ago

Apply

4.0 years

6 - 7 Lacs

Panchkula

On-site

Source: Glassdoor

We are seeking a highly analytical and forward-thinking Senior Data Analyst with 4+ years of experience in data analytics and AI automation. The ideal candidate will play a key role in driving data-driven decisions and optimizing business operations through intelligent automation. You will work closely with cross-functional teams to translate business needs into scalable analytical solutions and AI-powered tools. Key Responsibilities Design, develop, and maintain dashboards, reports, and data visualizations to support strategic initiatives. Conduct advanced statistical analysis, forecasting, and predictive modeling to extract actionable insights. Automate data pipelines, reporting processes, and business workflows using AI and machine learning tools. Collaborate with stakeholders to identify key metrics, KPIs, and data requirements for business growth. Leverage AI/ML models to solve business challenges and streamline manual operations. Develop and maintain documentation for data models, analytics processes, and automation logic. Ensure data quality, consistency, and governance across all analytics initiatives. Mentor junior analysts and contribute to the data team's best practices. Qualifications & Skills Bachelor’s or Master’s degree in Data Science, Computer Science, Statistics, Economics, or a related field. 4+ years of experience in data analytics, business intelligence, and AI-based automation. Proficiency in SQL, Python (Pandas, NumPy, Scikit-learn), and data visualization tools (e.g., Power BI, Tableau). Experience with AI/ML frameworks and tools (e.g., TensorFlow, PyTorch, Azure ML, or AutoML platforms). Strong knowledge of ETL processes, data modeling, and database management. Demonstrated ability to automate workflows using AI tools such as RPA (e.g., UiPath, Power Automate) or custom solutions. Excellent problem-solving skills and ability to communicate complex ideas clearly to non-technical stakeholders. Experience working in Agile or cross-functional teams is a plus. Preferred Skills Experience with cloud platforms (AWS, Azure, or GCP) and cloud-native analytics tools. Knowledge of Natural Language Processing (NLP) or Computer Vision applications. Background in business strategy or domain-specific analytics (e.g., finance, marketing, operations). Job Types: Full-time, Permanent Pay: ₹50,000.00 - ₹60,000.00 per month Benefits: Paid sick time Paid time off Schedule: Monday to Friday Night shift US shift Ability to commute/relocate: Panchkula, Haryana: Reliably commute or planning to relocate before starting work (Required) Experience: Data Analytics: 4 years (Required) Location: Panchkula, Haryana (Required) Shift availability: Night Shift (Required) Overnight Shift (Required) Work Location: In person
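As a concrete (invented) example of the KPI rollups this role would feed into dashboards, a small pandas aggregation might look like this; the columns and figures are placeholders.

```python
import pandas as pd

orders = pd.DataFrame({
    "month":   ["2024-01", "2024-01", "2024-02", "2024-02"],
    "channel": ["web", "store", "web", "store"],
    "revenue": [1200.0, 800.0, 1500.0, 650.0],
})

kpis = (
    orders.groupby(["month", "channel"], as_index=False)["revenue"]
    .sum()
    # month-over-month growth within each channel
    .assign(mom_growth=lambda d: d.groupby("channel")["revenue"].pct_change())
)
print(kpis)
```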

Posted 8 hours ago

Apply

3.0 - 5.0 years

0 Lacs

Delhi

On-site

Source: Glassdoor

About us: Bain & Company is a global management consulting firm that helps the world's most ambitious change makers define the future. Across 65 offices in 40 countries, we work alongside our clients as one team with a shared ambition to achieve extraordinary results, outperform the competition, and redefine industries. Since our founding in 1973, we have measured our success by the success of our clients, and we proudly maintain the highest level of client advocacy in the industry. In 2004, the firm established its presence in the Indian market by opening the Bain Capability Center (BCC) in New Delhi. The BCC is now known as BCN (Bain Capability Network), with its nodes across various geographies. BCN is an integral part and the largest unit of Expert Client Delivery (ECD). ECD plays a critical role as it adds value to Bain's case teams globally by supporting them with analytics and research solutioning across all industries, specific domains for corporate cases, client development, private equity diligence, or Bain intellectual property. The BCN comprises Consulting Services, Knowledge Services, and Shared Services.

Who you will work with: Bain Capability Network (BCN) collaborates with global case teams to address clients' pressing business challenges. It is integrated with Bain's diverse capabilities and industry practices, leveraging sector expertise, data, research, and analytics to enhance intellectual property and deliver impactful client solutions. As part of the BCN Data Engineering team, you will play a pivotal role in supporting Bain & Company's client engagements (case work) and the development of innovative, data-driven products. This role requires a blend of technical expertise, problem-solving, and collaboration, as you'll work closely with Bain consultants, product teams, and global stakeholders to deliver impactful data solutions.

What you'll do:
- Write complex code to develop scalable, flexible, user-friendly applications across a robust technology stack.
- Evaluate potential technologies for adoption, including open-source frameworks, libraries, and tools.
- Construct, test, install, and maintain software applications.
- Contribute to the planning for acceptance testing and implementation of new software, performing supporting activities to ensure that customers have the information and assistance they need for a successful implementation.
- Develop secure and highly performant services and APIs.
- Ensure the maintainability and quality of code.

About you:
- A Bachelor's or Master's degree in Computer Science or a related field
- 3 to 5 years of experience in full stack development
- Proficiency in back-end technologies such as Node.js and Python (Django/Flask)
- Experience working with relational and non-relational databases (e.g., MySQL, PostgreSQL, MongoDB)
- Strong proficiency in JavaScript, TypeScript, or similar programming languages
- Familiarity with modern development tools like Git, Docker, and CI/CD pipelines
- Experience with front-end frameworks (e.g., React.js, Angular, or Vue.js)
- Knowledge of RESTful APIs and/or GraphQL
- Understanding of front-end and back-end architecture and design principles
- Basic knowledge of cloud platforms (e.g., AWS, Azure, or Google Cloud) and containerization tools like Docker or Kubernetes
- Sound SDLC skills, preferably with experience in an agile environment
- Excellent English communication skills, with the ability to effectively interface across cross-functional technology teams and the business
What makes us a great place to work We are proud to be consistently recognized as one of the world's best places to work, a champion of diversity and a model of social responsibility. We are currently ranked the #1 consulting firm on Glassdoor’s Best Places to Work list, and we have maintained a spot in the top four on Glassdoor's list for the last 12 years. We believe that diversity, inclusion and collaboration is key to building extraordinary teams. We hire people with exceptional talents, abilities and potential, then create an environment where you can become the best version of yourself and thrive both professionally and personally. We are publicly recognized by external parties such as Fortune, Vault, Mogul, Working Mother, Glassdoor and the Human Rights Campaign for being a great place to work for diversity and inclusion, women, LGBTQ and parents.

Posted 8 hours ago

Apply

5.0 years

0 Lacs

Delhi

On-site

Source: Glassdoor

The Role Context: This is an exciting opportunity to join a dynamic and growing organization, working at the forefront of technology trends and developments in social impact sector. Wadhwani Center for Government Digital Transformation (WGDT) works with the government ministries and state departments in India with a mission of “ Enabling digital transformation to enhance the impact of government policy, initiatives and programs ”. We are seeking a highly motivated and detail-oriented individual to join our team as a Data Engineer with experience in the designing, constructing, and maintaining the architecture and infrastructure necessary for data generation, storage and processing and contribute to the successful implementation of digital government policies and programs. You will play a key role in developing, robust, scalable, and efficient systems to manage large volumes of data, make it accessible for analysis and decision-making and driving innovation & optimizing operations across various government ministries and state departments in India. Key Responsibilities: a. Data Architecture Design : Design, develop, and maintain scalable data pipelines and infrastructure for ingesting, processing, storing, and analyzing large volumes of data efficiently. This involves understanding business requirements and translating them into technical solutions. b. Data Integration: Integrate data from various sources such as databases, APIs, streaming platforms, and third-party systems. Should ensure the data is collected reliably and efficiently, maintaining data quality and integrity throughout the process as per the Ministries/government data standards. c. Data Modeling: Design and implement data models to organize and structure data for efficient storage and retrieval. They use techniques such as dimensional modeling, normalization, and denormalization depending on the specific requirements of the project. d. Data Pipeline Development/ ETL (Extract, Transform, Load): Develop data pipeline/ETL processes to extract data from source systems, transform it into the desired format, and load it into the target data systems. This involves writing scripts or using ETL tools or building data pipelines to automate the process and ensure data accuracy and consistency. e. Data Quality and Governance: Implement data quality checks and data governance policies to ensure data accuracy, consistency, and compliance with regulations. Should be able to design and track data lineage, data stewardship, metadata management, building business glossary etc. f. Data lakes or Warehousing: Design and maintain data lakes and data warehouse to store and manage structured data from relational databases, semi-structured data like JSON or XML, and unstructured data such as text documents, images, and videos at any scale. Should be able to integrate with big data processing frameworks such as Apache Hadoop, Apache Spark, and Apache Flink, as well as with machine learning and data visualization tools. g. Data Security : Implement security practices, technologies, and policies designed to protect data from unauthorized access, alteration, or destruction throughout its lifecycle. It should include data access, encryption, data masking and anonymization, data loss prevention, compliance, and regulatory requirements such as DPDP, GDPR, etc. h. Database Management: Administer and optimize databases, both relational and NoSQL, to manage large volumes of data effectively. i. 
Desired Skills / Competencies:

Education: A Bachelor's or Master's degree in Computer Science, Software Engineering, Data Science, or equivalent, with at least 5 years of experience.
Database Management: Strong expertise in working with SQL databases (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB, Cassandra).
Big Data Technologies: Familiarity with big data technologies such as Apache Hadoop, Spark, and related ecosystem components for processing and analyzing large-scale datasets.
ETL Tools: Experience with ETL tools (e.g., Apache NiFi, Apache Airflow, Talend Open Studio, Pentaho, InfoSphere) for designing and orchestrating data workflows.
Data Modeling and Warehousing: Knowledge of data modeling techniques and experience with data warehousing solutions (e.g., Amazon Redshift, Google BigQuery, Snowflake).
Data Governance and Security: Understanding of data governance principles and best practices for ensuring data quality and security.
Cloud Computing: Experience with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services for scalable and cost-effective data storage and processing.
Streaming Data Processing: Familiarity with real-time data processing frameworks (e.g., Apache Kafka, Apache Flink) for handling streaming data.

KPIs:

Data Pipeline Efficiency: Measure the efficiency of data pipelines in terms of data processing time, throughput, and resource utilization. KPIs could include average time to process data, data ingestion rates, and pipeline latency.
Data Quality Metrics: Track data quality metrics such as completeness, accuracy, consistency, and timeliness of data. KPIs could include data error rates, missing values, data duplication rates, and data validation failures.
System Uptime and Availability: Monitor the uptime and availability of data infrastructure, including databases, data warehouses, and data processing systems. KPIs could include system uptime percentage, mean time between failures (MTBF), and mean time to repair (MTTR).
Data Storage Efficiency: Measure the efficiency of data storage systems in terms of storage utilization, data compression rates, and data retention policies. KPIs could include storage utilization rates, data compression ratios, and data storage costs per unit.
Data Security and Compliance: Track adherence to data security policies and regulatory compliance requirements such as DPDP, GDPR, HIPAA, or PCI DSS. KPIs could include security incident rates, data access permissions, and compliance audit findings.
Data Processing Performance: Monitor the performance of data processing tasks such as ETL processes, data transformations, and data aggregations. KPIs could include data processing time, CPU usage, and memory consumption.
Scalability and Performance Tuning: Measure the scalability and performance of data systems under varying workloads and data volumes. KPIs could include scalability benchmarks, system response times under load, and performance improvements achieved through tuning.
Resource Utilization and Cost Optimization: Track resource utilization and costs associated with data infrastructure, including compute resources, storage, and network bandwidth. KPIs could include cost per data unit processed, cost per query, and cost savings achieved through optimization.
Incident Response and Resolution: Monitor the response time and resolution time for data-related incidents and issues. KPIs could include incident response time, time to diagnose and resolve issues, and customer satisfaction ratings for support services.
Documentation and Knowledge Sharing: Measure the quality and completeness of documentation for data infrastructure, data pipelines, and data processes. KPIs could include documentation coverage, documentation update frequency, and knowledge-sharing activities such as internal training sessions or knowledge-base contributions.
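As a worked illustration of the uptime KPIs above, this short script computes MTTR, MTBF, and availability from a hypothetical incident log; the timestamps and the 60-day observation window are invented values.

```python
# Illustrative computation of two uptime KPIs named above: MTBF and MTTR.
# Incident data is hypothetical; in practice it would come from monitoring.
from datetime import datetime, timedelta

# (failure_start, restored_at) pairs for one service, oldest first
incidents = [
    (datetime(2024, 1, 3, 9, 0),   datetime(2024, 1, 3, 9, 40)),
    (datetime(2024, 2, 11, 14, 0), datetime(2024, 2, 11, 15, 10)),
]
observation_window = timedelta(days=60)

downtime = sum((end - start for start, end in incidents), timedelta())
mttr = downtime / len(incidents)      # mean time to repair
uptime = observation_window - downtime
mtbf = uptime / len(incidents)        # mean time between failures

print(f"MTTR: {mttr}, MTBF: {mtbf}, "
      f"availability: {uptime / observation_window:.4%}")
```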
Years of experience of the current role holder: New Position
Ideal years of experience: 3 - 5 years
Career progression for this role: CTO WGDT (Head of Incubation Centre)

*******************************************************************************

Wadhwani Corporate Profile: (Click on this link)

Our Culture:
WF is a global not-for-profit that works like a start-up, at a fast-moving, dynamic pace where change is the only constant and flexibility is the key to success. The three mantras that we practice across job roles, levels, functions, programs, and initiatives are Quality, Speed, and Scale, in that order. We are an ambitious and inclusive organization where everyone is encouraged to contribute and ideate. We are intensely and insanely focused on driving excellence in everything we do. We want individuals with the drive for excellence and the passion to do whatever it takes to deliver world-class outcomes to our beneficiaries. We set our own standards, often more rigorous than what our beneficiaries demand, and we want individuals who love it this way. We have a creative and highly energetic environment, one in which we look to each other to innovate new solutions not only for our beneficiaries but for ourselves too. Individuals who collaborate with a borderless mentality, often going beyond hierarchy and the siloed definitions of functional KRAs, will thrive in our environment. This is a workplace where expertise is shared with colleagues around the globe. Individuals uncomfortable with change, constant innovation, and short learning cycles, or those looking for stability and orderly working days, may not find WF to be the right place for them. Finally, we want individuals who want to do greater good for society by leveraging their area of expertise, skills, and experience.
The Foundation is an equal opportunity employer, with no bias towards gender, race, colour, ethnicity, country, language, age, or any other dimension that comes in the way of progress. Join us and be a part of us!

Education: Bachelor's in Technology / Master's in Technology
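Returning to responsibility (e) above, the sketch below shows one possible shape of rule-based data-quality checks (completeness and duplicate detection). The records, field names, and rules are illustrative only, not part of the role description.

```python
# Sketch of simple rule-based data-quality checks: completeness and duplicates.
# Field names and sample records are invented for illustration.
def check_completeness(rows, required=("citizen_id", "district")):
    """Return the fraction of rows in which every required field is non-empty."""
    rows = list(rows)
    ok = sum(all(r.get(f) for f in required) for r in rows)
    return ok / len(rows) if rows else 1.0

def check_no_duplicates(rows, key="citizen_id"):
    """Flag duplicate primary keys, a common ingestion defect."""
    seen, dupes = set(), []
    for r in rows:
        if r[key] in seen:
            dupes.append(r[key])
        seen.add(r[key])
    return dupes

records = [{"citizen_id": "A1", "district": "Pune"},
           {"citizen_id": "A1", "district": "Pune"},   # duplicate key
           {"citizen_id": "",   "district": "Thane"}]  # incomplete row
print(check_completeness(records))   # 0.666...
print(check_no_duplicates(records))  # ['A1']
```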

Posted 8 hours ago

Apply

4.0 years

0 Lacs

Delhi

On-site

Job Title: Software Engineer - AI/ML
Location: Mohali / Chandigarh / Delhi
Experience: 4-8 years

About the Role: We are seeking a highly experienced and innovative AI & ML engineer to lead the design, development, and deployment of advanced AI/ML solutions, including Large Language Models (LLMs), for enterprise-grade applications. You will work closely with cross-functional teams to drive AI strategy, define architecture, and ensure scalable and efficient implementation of intelligent systems.

Key Responsibilities:
Design and architect end-to-end AI/ML solutions, including data pipelines, model development, training, and deployment.
Develop and implement ML models for classification, regression, NLP, computer vision, and recommendation systems.
Build, fine-tune, and integrate Large Language Models (LLMs) such as GPT, BERT, and LLaMA into enterprise applications (a minimal integration sketch follows this listing).
Evaluate and select appropriate frameworks, tools, and technologies for AI/ML projects.
Lead AI experimentation, proof-of-concepts (PoCs), and model performance evaluations.
Collaborate with data engineers, product managers, and software developers to integrate models into production environments.
Ensure robust MLOps practices, version control, reproducibility, and model monitoring.
Stay up to date with advancements in AI/ML, especially in generative AI and LLMs, and apply them innovatively.

Requirements:
Bachelor's or Master's degree in Computer Science, Data Science, AI/ML, or a related field.
At least 4 years of experience in AI/ML.
Deep understanding of machine learning algorithms, neural networks, and deep learning architectures.
Proven experience working with LLMs, transformer models, and prompt engineering.
Hands-on experience with ML frameworks such as TensorFlow, PyTorch, Hugging Face, and LangChain.
Proficiency in Python and experience with cloud platforms (AWS, Azure, or GCP) for ML workloads.
Strong knowledge of MLOps tools (MLflow, Kubeflow, SageMaker, etc.) and practices.
Excellent problem-solving and communication skills.

Preferred Qualifications:
Experience with vector databases (e.g., Pinecone, FAISS, Weaviate) and embeddings.
Exposure to real-time AI systems, streaming data, or edge AI.
Contributions to AI research, open-source projects, or publications in AI/ML.

Interested candidates can apply here or share their resume at hr@softprodigy.com.
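To give a concrete flavour of the LLM-integration responsibility above, here is a minimal sketch using the Hugging Face pipeline API. It is an illustration under assumptions, not the employer's stack: the model choice (facebook/bart-large-mnli), the support-ticket example, and the candidate labels are all invented for demonstration.

```python
# Minimal sketch: wiring a pre-trained transformer into an application.
# Model name, example text, and labels are illustrative assumptions only.
from transformers import pipeline

# A zero-shot classifier routes free-text input against ad-hoc categories
# without first training a task-specific model.
classifier = pipeline(
    "zero-shot-classification",
    model="facebook/bart-large-mnli",
)

ticket = "My payment failed twice but the amount was still deducted."
result = classifier(
    ticket,
    candidate_labels=["billing", "technical issue", "account access"],
)

# The pipeline returns labels sorted by score, highest first.
print(result["labels"][0], round(result["scores"][0], 3))
```

In production, a sketch like this would typically sit behind a service endpoint, with batching, monitoring, and model versioning handled by the MLOps tooling named in the requirements.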

Posted 8 hours ago

Apply