
17333 Automate Jobs - Page 34

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

6.0 years

10 - 42 Lacs

Delhi, India

On-site

Industry & Sector: A fast-growing company in the Web Application and SaaS development space, we build scalable B2B and consumer products for fintech, e-commerce and enterprise automation clients. Our on-site engineering hub in India is expanding to deliver end-to-end digital solutions using modern JavaScript, cloud and DevOps stacks.

Position: Full Stack Developer (Node.js & React)

Role & Responsibilities: Design, develop and ship high-performing web applications using React for the front end and Node.js/Express for the back end. Create RESTful and GraphQL APIs, integrate third-party services and ensure secure data exchange with SQL or NoSQL stores. Implement responsive UI components, state management and reusable design patterns for optimal user experience. Optimise application performance, conduct code reviews and enforce best-practice standards across the stack. Collaborate with Product, UX and QA teams to refine requirements, estimate effort and deliver sprint commitments on time. Automate CI/CD pipelines, unit tests and monitoring dashboards to guarantee reliability in production.

Skills & Qualifications (Must-Have): 3–6 years professional experience building full-stack JavaScript applications. Advanced proficiency in React.js, Redux/Context and modern ES6+ features. Hands-on expertise with Node.js, Express/Koa and REST API design. Solid understanding of relational (MySQL/PostgreSQL) and document databases (MongoDB). Comfort with Git workflows, Agile/Scrum and writing unit/integration tests. Ability to work on-site in India and collaborate in a fast-paced team environment.

Preferred: Experience with TypeScript, Next.js or server-side rendering. Knowledge of Docker, Kubernetes and cloud platforms (AWS, Azure or GCP). Exposure to microservices, message queues and event-driven architectures. Familiarity with performance profiling, Web Vitals and accessibility standards. Understanding of DevOps metrics, Grafana/Prometheus or similar monitoring.

Benefits & Culture Highlights: Mentor-driven engineering culture that values clean code and continuous learning. Competitive salary, performance bonuses and rapid career-growth pathways. On-site hackathons, tech talks and sponsored certifications to keep your skills future-ready. Join us to build high-impact products that reach millions while elevating your full-stack mastery.

Skills: sql, express, git, html/css, rest api, node.js, gcp, integration testing, javascript, next.js, kubernetes, mongodb, agile, aws, react.js, nosql, unit testing, typescript, azure, docker, redux

Posted 2 days ago

Apply

3.0 years

8 - 30 Lacs

Hyderabad, Telangana, India

On-site

Industry: Information Technology Services – Cloud Data Engineering & Analytics. We empower enterprises across finance, retail, and healthcare to unlock actionable insights by modernizing their data estates on cloud-native platforms. Our on-site engineering team in India designs, builds, and optimizes high-throughput data pipelines, delivering real-time analytics and governed self-service reporting at scale.

Role & Responsibilities: Design and implement highly performant Snowflake data warehouses, including database objects, schema design, and access controls. Develop end-to-end ELT pipelines using Snowflake Snowpipe, Streams, Tasks, and Python/SQL scripts to ingest structured and semi-structured data. Optimize query performance through clustering, caching, micro-partitioning, and metadata management, ensuring sub-second analytics response times. Integrate Snowflake with AWS, Azure, and on-prem sources via S3, Kafka, APIs, and third-party ETL tools such as Matillion or Fivetran. Automate code deployment, testing, and monitoring with Git, CI/CD, and Infrastructure-as-Code frameworks (Terraform/CloudFormation). Collaborate with data architects, analysts, and business stakeholders to translate requirements into secure, reusable data models and marts.

Skills & Qualifications (Must-Have): 3+ years of hands-on Snowflake engineering with deep SQL proficiency. Expertise in designing dimensional and data-vault models and implementing ELT pipelines. Solid understanding of AWS or Azure cloud storage, IAM, and networking concepts. Proficiency in Python or Scala for data transformation and automation. Experience tuning warehouses, resource monitors, and access rules for cost governance.

Preferred: Familiarity with dbt, Airflow, or similar orchestration frameworks. Knowledge of data-mesh and data-governance best practices. Exposure to CI/CD pipelines with Docker and Kubernetes.

Benefits & Culture: Competitive salary with performance bonuses and ongoing Snowflake certification sponsorship. On-site collaborative workspace fostering continuous learning and weekly tech talks. Robust health insurance and flexible leave policy to support work-life balance.

Skills: sql, snowpipe, performance tuning, terraform, streams, cloudformation, kubernetes, tasks, snowflake data engineer, aws, snowflake, dbt, python, data modeling, azure, docker, airflow
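For readers unfamiliar with the Streams and Tasks pattern this posting describes, the following is a minimal, hypothetical sketch using the snowflake-connector-python package. The account settings, database, schema, table, warehouse, and task names are placeholders for illustration only, not details from the listing.

```python
# Hypothetical Streams + Tasks ELT sketch; all object names are placeholders.
import os
import snowflake.connector

ELT_STATEMENTS = [
    # Capture row-level changes on the raw landing table.
    """CREATE STREAM IF NOT EXISTS raw_db.ingest.orders_stream
       ON TABLE raw_db.ingest.orders_raw""",
    # Schedule a task that merges new changes into the curated model.
    """CREATE TASK IF NOT EXISTS raw_db.ingest.orders_merge_task
       WAREHOUSE = transform_wh
       SCHEDULE = '5 MINUTE'
       WHEN SYSTEM$STREAM_HAS_DATA('raw_db.ingest.orders_stream')
       AS
       MERGE INTO analytics_db.core.orders AS tgt
       USING raw_db.ingest.orders_stream AS src
       ON tgt.order_id = src.order_id
       WHEN MATCHED THEN UPDATE SET tgt.amount = src.amount
       WHEN NOT MATCHED THEN INSERT (order_id, amount) VALUES (src.order_id, src.amount)""",
    # Tasks are created suspended; resume to start the schedule.
    "ALTER TASK raw_db.ingest.orders_merge_task RESUME",
]

def main() -> None:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        role="TRANSFORM_ROLE",
    )
    try:
        with conn.cursor() as cur:
            for stmt in ELT_STATEMENTS:
                cur.execute(stmt)
    finally:
        conn.close()

if __name__ == "__main__":
    main()
```

In practice the same statements are often version-controlled and deployed through the CI/CD and Infrastructure-as-Code tooling the posting mentions, rather than run ad hoc.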

Posted 2 days ago

Apply

3.0 years

8 - 30 Lacs

Hyderabad, Telangana, India

On-site

Industry & Sector: A fast-scaling provider of analytics & data engineering services within the enterprise software and digital transformation sector in India seeks an on-site PySpark Engineer to build and optimize high-volume data pipelines on modern big-data platforms.

Role & Responsibilities: Design, develop, and maintain PySpark-based batch and streaming pipelines for data ingestion, cleansing, transformation, and aggregation. Optimize Spark jobs for performance and cost, tuning partitions, caching strategies, and join execution plans. Integrate diverse data sources (RDBMS, NoSQL, cloud storage, and REST APIs) into unified, consumable datasets for analytics and reporting teams. Implement robust data quality, error handling, and lineage tracking using Spark SQL, Delta Lake, and metadata tools. Collaborate with Data Architects and BI teams to translate analytical requirements into scalable data models. Follow Agile delivery practices, write unit and integration tests, and automate deployments through Git-driven CI/CD pipelines.

Skills & Qualifications (Must-Have): 3+ years of hands-on PySpark development in production environments. Deep knowledge of Spark SQL, DataFrames, RDD optimizations, and performance tuning. Proficiency in Python 3, object-oriented design, and writing reusable modules. Experience with the Hadoop ecosystem, Hive/Impala, and cloud object storage such as S3, ADLS, or GCS. Strong SQL skills and understanding of star/snowflake schema modeling.

Preferred: Exposure to Delta Lake, Apache Airflow, or Kafka for orchestration and streaming. Experience deploying on Databricks or EMR and configuring autoscaling clusters. Knowledge of Docker or Kubernetes for containerized data workloads.

Benefits & Culture Highlights: Hands-on work with modern open-source tech stacks and leading cloud platforms. Mentorship from senior data engineers and architects, fostering rapid skill growth. Performance-based bonuses, skill-development stipends, and a collaborative, innovation-driven environment.

Skills: sql, hadoop ecosystem, pyspark engineer, scala, python 3, performance tuning, problem solving, apache airflow, hive, emr, kubernetes, agile, impala, pyspark, dataframes, delta lake, rdd optimizations, object-oriented design, python, spark, data modeling, databricks, spark sql, docker, hadoop, etl, kafka
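As a point of reference for the batch pipelines described above, here is a minimal PySpark sketch covering ingest, cleanse, aggregate, and write. The storage paths, column names, and the choice of Parquet output are illustrative assumptions, not requirements from the listing.

```python
# Minimal PySpark batch pipeline sketch; paths and columns are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_batch").getOrCreate()

# Ingest: read raw files from cloud object storage (placeholder S3 path).
raw = spark.read.parquet("s3a://example-bucket/raw/orders/")

# Cleanse: drop malformed rows and normalise types.
clean = (
    raw.dropna(subset=["order_id", "order_ts", "amount"])
       .withColumn("order_date", F.to_date("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
)

# Aggregate: daily revenue and order count per customer.
daily = (
    clean.groupBy("order_date", "customer_id")
         .agg(F.sum("amount").alias("daily_revenue"),
              F.count("order_id").alias("order_count"))
)

# Load: write a partitioned, analytics-ready dataset.
(daily.repartition("order_date")
      .write.mode("overwrite")
      .partitionBy("order_date")
      .parquet("s3a://example-bucket/curated/daily_revenue/"))

spark.stop()
```

A production version of this job would typically add the data-quality checks, lineage metadata, and Delta Lake output the posting mentions, plus unit tests run from a CI/CD pipeline.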

Posted 2 days ago

Apply

6.0 years

10 - 42 Lacs

Hyderabad, Telangana, India

On-site

Industry & Sector: A fast-growing company in the Web Application and SaaS development space, we build scalable B2B and consumer products for fintech, e-commerce and enterprise automation clients. Our on-site engineering hub in India is expanding to deliver end-to-end digital solutions using modern JavaScript, cloud and DevOps stacks.

Position: Full Stack Developer (Node.js & React)

Role & Responsibilities: Design, develop and ship high-performing web applications using React for the front end and Node.js/Express for the back end. Create RESTful and GraphQL APIs, integrate third-party services and ensure secure data exchange with SQL or NoSQL stores. Implement responsive UI components, state management and reusable design patterns for optimal user experience. Optimise application performance, conduct code reviews and enforce best-practice standards across the stack. Collaborate with Product, UX and QA teams to refine requirements, estimate effort and deliver sprint commitments on time. Automate CI/CD pipelines, unit tests and monitoring dashboards to guarantee reliability in production.

Skills & Qualifications (Must-Have): 3–6 years professional experience building full-stack JavaScript applications. Advanced proficiency in React.js, Redux/Context and modern ES6+ features. Hands-on expertise with Node.js, Express/Koa and REST API design. Solid understanding of relational (MySQL/PostgreSQL) and document databases (MongoDB). Comfort with Git workflows, Agile/Scrum and writing unit/integration tests. Ability to work on-site in India and collaborate in a fast-paced team environment.

Preferred: Experience with TypeScript, Next.js or server-side rendering. Knowledge of Docker, Kubernetes and cloud platforms (AWS, Azure or GCP). Exposure to microservices, message queues and event-driven architectures. Familiarity with performance profiling, Web Vitals and accessibility standards. Understanding of DevOps metrics, Grafana/Prometheus or similar monitoring.

Benefits & Culture Highlights: Mentor-driven engineering culture that values clean code and continuous learning. Competitive salary, performance bonuses and rapid career-growth pathways. On-site hackathons, tech talks and sponsored certifications to keep your skills future-ready. Join us to build high-impact products that reach millions while elevating your full-stack mastery.

Skills: sql, express, git, html/css, rest api, node.js, gcp, integration testing, javascript, next.js, kubernetes, mongodb, agile, aws, react.js, nosql, unit testing, typescript, azure, docker, redux

Posted 2 days ago

Apply

4.0 years

8 - 30 Lacs

Hyderabad, Telangana, India

On-site

Industry & Sector: We are a high-growth software engineering consultancy serving enterprise clients in banking, e-commerce, and telecommunications, delivering robust back-end platforms built on the Java ecosystem.

Primary Title: Java Software Engineer (On-site, India)

Role & Responsibilities: Develop and enhance microservice APIs using Java 11+, Spring Boot, and Hibernate, ensuring low-latency and fault-tolerant operation. Translate business requirements into technical designs, writing clean, modular, and testable code that adheres to SOLID principles. Optimize application performance through profiling, tuning JVM parameters, and refactoring for concurrency. Integrate relational and NoSQL databases, design efficient queries, and manage data migrations. Collaborate with DevOps to containerize services with Docker and automate CI/CD pipelines in Jenkins or GitLab. Mentor junior engineers, perform code reviews, and uphold engineering excellence standards.

Skills & Qualifications (Must-Have): 4 years of professional experience building production systems in Core Java and Spring Boot. Hands-on expertise with RESTful API design, JSON, and Swagger/OpenAPI. Proficiency in SQL (MySQL, PostgreSQL) and ORM frameworks such as JPA or Hibernate. Strong grasp of data structures, algorithms, multithreading, and JVM internals. Working knowledge of Git, unit testing with JUnit/Mockito, and Agile Scrum ceremonies. Bachelor's degree in Computer Science or equivalent.

Preferred: Exposure to microservices patterns (circuit breaker, service discovery) and messaging queues like Kafka or RabbitMQ. Experience deploying containers on Kubernetes or OpenShift. Familiarity with cloud services (AWS, Azure, GCP) and infrastructure as code. Knowledge of observability stacks such as Prometheus, Grafana, and ELK. Certification in Java or a cloud platform.

Benefits & Culture Highlights: Collaborative, engineer-led culture that rewards innovation and continuous learning. On-premises training labs, sponsored certifications, and access to global tech conferences. Transparent career ladder with fast-track leadership opportunities. Join us to craft high-impact software that powers millions of daily transactions while advancing your Java mastery in a supportive, growth-oriented environment.

Skills: elk, openshift, mysql, mockito, swagger/openapi, junit, prometheus, infrastructure as code, grafana, spring boot, agile scrum, ci/cd pipelines, git, multithreading, jvm internals, jpa, gcp, sql, docker, hibernate, data structures, azure, restful api design, microservices, json, core java, kubernetes, aws, postgresql, algorithms

Posted 2 days ago

Apply

4.0 years

0 Lacs

India

On-site

A Bit About Us: Appknox is a leading mobile application security platform that helps enterprises automate security testing across their mobile apps, APIs, and DevSecOps pipelines. Trusted by global banks, fintechs, and government agencies, we enable secure mobile experiences with speed and confidence.

About The Role: We're looking for a Jr. Technical Support Engineer to join our global support team and provide world-class assistance to customers in US time zones, working 8 pm to 5 am IST. You will troubleshoot, triage, and resolve technical issues related to Appknox's mobile app security platform, working closely with the Engineering, Product, and Customer Success teams.

Key Responsibilities: Respond to customer issues via email, chat, and voice/VoIP calls during US business hours. Diagnose, replicate, and resolve issues related to the DAST, SAST, and API security modules. Troubleshoot integration issues across CI/CD pipelines, API connections, SDKs, and mobile app builds. Document known issues and solutions in the internal knowledge base and help center. Escalate critical bugs to engineering with full context, reproduction steps, and logs. Guide customers on secure implementation best practices and platform usage. Collaborate with product and QA teams to suggest feature improvements based on customer feedback. Participate in on-call support rotations if needed.

Requirements: 1–4 years of experience in technical support, delivery, or QA roles at a SaaS or cybersecurity company. Excellent communication and documentation skills in English. Comfortable working independently and handling complex technical conversations with customers. Basic understanding of mobile platforms (Android, iOS), REST APIs, networking architecture, and security concepts (OWASP, CVEs, etc.). Familiarity with command-line tools, mobile build systems (Gradle/Xcode), and HTTP proxies (Burp). Ability to work full-time within US time zones, with a stable internet connection and workstation.

Good-To-Have Skills: Experience working in a product-led cybersecurity company. Knowledge of scripting languages (Python, Bash) or log analysis tools. Familiarity with CI/CD tools (Jenkins, GitHub Actions, GitLab CI). Familiarity with ticketing and support tools like Freshdesk, Jira, Postman, and Slack.

Compensation: As per industry standards.

Interview Process: Application: submit your resume and complete your application via our job portal. Screening: we review your background and fit, typically inviting you to a 15-minute Profile Evaluation call on Cutshort. Assignment Round: you receive a real-world take-home task to complete within 48 hours. Panel Interview: meet with a cross-functional interview panel to assess technical skills, problem-solving, and collaboration. Stakeholder Interview: a focused discussion with the Director to evaluate strategic alignment and high-level fit. HR Round: a final chat to discuss cultural fit, compensation, notice period, and next steps.

Personality Traits We Admire: A confident and dynamic working persona that brings fun to the team; a sense of humour is an added advantage. A great attitude toward asking questions, learning, and suggesting process improvements. Attention to detail and the ability to identify edge cases. Highly motivated, bringing fresh ideas and perspectives that help us move toward our goals faster. Follows timelines and shows absolute commitment to deadlines.

Why Join Us:
Freedom & Responsibility: If you are a person who enjoys challenging work and pushing your boundaries, this is the right place for you. We appreciate new ideas and ownership, as well as flexibility with working hours.
Great Salary & Equity: We keep up with market standards and provide pay packages that reflect them. As Appknox continues to grow, you'll have a great opportunity to earn more and grow with us. We also provide equity options for our top performers.
Holistic Growth: We foster a culture of continuous learning and take a holistic approach to training and developing our greatest asset: our employees. We will support you on that journey.
Transparency: Being part of a start-up is an amazing experience, not least because of open communication and transparency at multiple levels. Working with Appknox will give you the opportunity to experience it all first-hand.

Skills: SaaS, Cyber Security, Technical Support, JIRA, SDK, CI/CD, API
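To give a flavour of the day-to-day triage work this role involves, here is a small, hypothetical Python helper that replays a customer-reported API call through a local intercepting proxy (for example Burp on 127.0.0.1:8080) and records the evidence needed for an engineering escalation. The endpoint, token format, and proxy address are placeholders, not Appknox specifics.

```python
# Hypothetical support-triage helper; endpoint, token, and proxy are placeholders.
import json
import requests

PROXIES = {"http": "http://127.0.0.1:8080", "https": "http://127.0.0.1:8080"}

def reproduce(endpoint: str, token: str) -> None:
    # verify=False lets the proxy's self-signed certificate intercept TLS;
    # only appropriate in a controlled debugging session.
    resp = requests.get(
        endpoint,
        headers={"Authorization": f"Token {token}"},
        proxies=PROXIES,
        verify=False,
        timeout=30,
    )
    # Record status, headers, and a body preview for the escalation ticket.
    evidence = {
        "url": endpoint,
        "status_code": resp.status_code,
        "headers": dict(resp.headers),
        "body_preview": resp.text[:500],
    }
    print(json.dumps(evidence, indent=2))

if __name__ == "__main__":
    reproduce("https://api.example.com/v1/scans/123/", token="REDACTED")
```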

Posted 2 days ago

Apply

3.0 years

8 - 30 Lacs

Kochi, Kerala, India

On-site

Industry: Information Technology Services – Cloud Data Engineering & Analytics. We empower enterprises across finance, retail, and healthcare to unlock actionable insights by modernizing their data estates on cloud-native platforms. Our on-site engineering team in India designs, builds, and optimizes high-throughput data pipelines, delivering real-time analytics and governed self-service reporting at scale.

Role & Responsibilities: Design and implement highly performant Snowflake data warehouses, including database objects, schema design, and access controls. Develop end-to-end ELT pipelines using Snowflake Snowpipe, Streams, Tasks, and Python/SQL scripts to ingest structured and semi-structured data. Optimize query performance through clustering, caching, micro-partitioning, and metadata management, ensuring sub-second analytics response times. Integrate Snowflake with AWS, Azure, and on-prem sources via S3, Kafka, APIs, and third-party ETL tools such as Matillion or Fivetran. Automate code deployment, testing, and monitoring with Git, CI/CD, and Infrastructure-as-Code frameworks (Terraform/CloudFormation). Collaborate with data architects, analysts, and business stakeholders to translate requirements into secure, reusable data models and marts.

Skills & Qualifications (Must-Have): 3+ years of hands-on Snowflake engineering with deep SQL proficiency. Expertise in designing dimensional and data-vault models and implementing ELT pipelines. Solid understanding of AWS or Azure cloud storage, IAM, and networking concepts. Proficiency in Python or Scala for data transformation and automation. Experience tuning warehouses, resource monitors, and access rules for cost governance.

Preferred: Familiarity with dbt, Airflow, or similar orchestration frameworks. Knowledge of data-mesh and data-governance best practices. Exposure to CI/CD pipelines with Docker and Kubernetes.

Benefits & Culture: Competitive salary with performance bonuses and ongoing Snowflake certification sponsorship. On-site collaborative workspace fostering continuous learning and weekly tech talks. Robust health insurance and flexible leave policy to support work-life balance.

Skills: sql, snowpipe, performance tuning, terraform, streams, cloudformation, kubernetes, tasks, snowflake data engineer, aws, snowflake, dbt, python, data modeling, azure, docker, airflow

Posted 2 days ago

Apply

3.0 years

8 - 30 Lacs

Kochi, Kerala, India

On-site

Industry & Sector: A fast-scaling provider of analytics & data engineering services within the enterprise software and digital transformation sector in India seeks an on-site PySpark Engineer to build and optimize high-volume data pipelines on modern big-data platforms.

Role & Responsibilities: Design, develop, and maintain PySpark-based batch and streaming pipelines for data ingestion, cleansing, transformation, and aggregation. Optimize Spark jobs for performance and cost, tuning partitions, caching strategies, and join execution plans. Integrate diverse data sources (RDBMS, NoSQL, cloud storage, and REST APIs) into unified, consumable datasets for analytics and reporting teams. Implement robust data quality, error handling, and lineage tracking using Spark SQL, Delta Lake, and metadata tools. Collaborate with Data Architects and BI teams to translate analytical requirements into scalable data models. Follow Agile delivery practices, write unit and integration tests, and automate deployments through Git-driven CI/CD pipelines.

Skills & Qualifications (Must-Have): 3+ years of hands-on PySpark development in production environments. Deep knowledge of Spark SQL, DataFrames, RDD optimizations, and performance tuning. Proficiency in Python 3, object-oriented design, and writing reusable modules. Experience with the Hadoop ecosystem, Hive/Impala, and cloud object storage such as S3, ADLS, or GCS. Strong SQL skills and understanding of star/snowflake schema modeling.

Preferred: Exposure to Delta Lake, Apache Airflow, or Kafka for orchestration and streaming. Experience deploying on Databricks or EMR and configuring autoscaling clusters. Knowledge of Docker or Kubernetes for containerized data workloads.

Benefits & Culture Highlights: Hands-on work with modern open-source tech stacks and leading cloud platforms. Mentorship from senior data engineers and architects, fostering rapid skill growth. Performance-based bonuses, skill-development stipends, and a collaborative, innovation-driven environment.

Skills: sql, hadoop ecosystem, pyspark engineer, scala, python 3, performance tuning, problem solving, apache airflow, hive, emr, kubernetes, agile, impala, pyspark, dataframes, delta lake, rdd optimizations, object-oriented design, python, spark, data modeling, databricks, spark sql, docker, hadoop, etl, kafka

Posted 2 days ago

Apply

4.0 years

8 - 30 Lacs

Mumbai Metropolitan Region

On-site

Industry & Sector: We are a high-growth software engineering consultancy serving enterprise clients in banking, e-commerce, and telecommunications, delivering robust back-end platforms built on the Java ecosystem.

Primary Title: Java Software Engineer (On-site, India)

Role & Responsibilities: Develop and enhance microservice APIs using Java 11+, Spring Boot, and Hibernate, ensuring low-latency and fault-tolerant operation. Translate business requirements into technical designs, writing clean, modular, and testable code that adheres to SOLID principles. Optimize application performance through profiling, tuning JVM parameters, and refactoring for concurrency. Integrate relational and NoSQL databases, design efficient queries, and manage data migrations. Collaborate with DevOps to containerize services with Docker and automate CI/CD pipelines in Jenkins or GitLab. Mentor junior engineers, perform code reviews, and uphold engineering excellence standards.

Skills & Qualifications (Must-Have): 4 years of professional experience building production systems in Core Java and Spring Boot. Hands-on expertise with RESTful API design, JSON, and Swagger/OpenAPI. Proficiency in SQL (MySQL, PostgreSQL) and ORM frameworks such as JPA or Hibernate. Strong grasp of data structures, algorithms, multithreading, and JVM internals. Working knowledge of Git, unit testing with JUnit/Mockito, and Agile Scrum ceremonies. Bachelor's degree in Computer Science or equivalent.

Preferred: Exposure to microservices patterns (circuit breaker, service discovery) and messaging queues like Kafka or RabbitMQ. Experience deploying containers on Kubernetes or OpenShift. Familiarity with cloud services (AWS, Azure, GCP) and infrastructure as code. Knowledge of observability stacks such as Prometheus, Grafana, and ELK. Certification in Java or a cloud platform.

Benefits & Culture Highlights: Collaborative, engineer-led culture that rewards innovation and continuous learning. On-premises training labs, sponsored certifications, and access to global tech conferences. Transparent career ladder with fast-track leadership opportunities. Join us to craft high-impact software that powers millions of daily transactions while advancing your Java mastery in a supportive, growth-oriented environment.

Skills: elk, openshift, mysql, mockito, swagger/openapi, junit, prometheus, infrastructure as code, grafana, spring boot, agile scrum, ci/cd pipelines, git, multithreading, jvm internals, jpa, gcp, sql, docker, hibernate, data structures, azure, restful api design, microservices, json, core java, kubernetes, aws, postgresql, algorithms

Posted 2 days ago

Apply

4.0 - 6.0 years

0 Lacs

Pune, Maharashtra, India

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Digital Engineering – NCLC – M365 Copilot Developer

The opportunity: We are seeking professionals with expertise in the M365 suite, particularly in M365 Copilot development, extended configurations, and integrations, along with experience in Azure OpenAI / GenAI and related technologies. These professionals will join our Digital Engineering Consulting team. This is a great opportunity to be part of a leading firm and play a key role in the growth of our service offerings.

Your Key Responsibilities: Produce high-quality solution or infrastructure deliverables in accordance with project timelines and specifications, using sound coding and programming skills. Perform coding, debugging, testing and troubleshooting throughout the development process, contributing to moderately complex aspects of a project. Maintain and enhance systems by fixing complicated errors, raising risks and escalating issues where necessary. Work with users to capture detailed requirements, translating designs and solution architecture into design specifications. Monitor and report on potential risks and opportunities of emerging technologies, and seek areas for continuous improvement. Ensure all activities adhere to the relevant processes, procedures, standards and technical design. Develop and promote best practices for usage, operations and development. Bring strong analytical and communication skills with an intense drive to learn and adapt.

Skills and Attributes: 4–6 years of development experience in O365 / M365. 6–12 months of hands-on experience with Microsoft 365 Copilot (must have).

Core Copilot Skills: Configure and customize robust AI solutions using Microsoft's Copilot Studio and M365 Copilot. Experience in creating custom agents. Develop declarative agents to extend the capabilities of M365 Copilot. Integrate the Power Platform (Power Automate, Power Apps, Power BI) for automation and custom apps. Leverage Azure services (Functions, OpenAI, etc.) for scalable Copilot solutions. Craft effective prompts to guide AI models and optimize their performance. Ability to integrate services using Power Automate cloud flows, Azure Functions and Azure APIM. Strong expertise in Microsoft Copilot Studio.

Core Microsoft 365 Skills: Microsoft 365 Suite: deep understanding of core applications like Word, Excel, PowerPoint, Outlook, Teams, SharePoint, OneDrive, and Power BI. Microsoft Graph API: proficiency in using the Graph API to access and manipulate data across various Microsoft 365 services. Power Platform: knowledge of Power Apps, Power Automate, and Power BI for building custom solutions and automating workflows.

AI and Machine Learning (optional): AI fundamentals: understanding of AI concepts, machine learning, and natural language processing. Prompt engineering: ability to craft effective prompts to guide the AI model and generate desired outputs. AI frameworks: knowledge of frameworks such as LangChain. Model training and fine-tuning: knowledge of training and fine-tuning AI models for specific use cases.

Development and Engineering (must have): Programming languages: proficiency in languages like C#, Python, JavaScript, and TypeScript. Web development: experience with HTML, CSS, and JavaScript frameworks like React or Angular. Azure Functions and Logic Apps: knowledge of serverless computing and workflow automation. Azure DevOps: understanding of CI/CD pipelines and DevOps practices.

Soft Skills: Problem-solving and analytical skills: ability to identify problems, analyze data, and propose solutions. Communication skills: effective communication with both technical and non-technical stakeholders. Collaboration: working effectively in cross-functional teams. Adaptability: staying updated with the latest trends and technologies. Customer focus: understanding client needs and delivering solutions that meet their requirements. Maintain a positive and constructive outlook, focus on driving results, and find solutions and breakthroughs to problems and challenges; have a positive influence on peers. Strong interpersonal and communication skills.

To qualify for the role, you must have: A bachelor's or master's degree. A minimum of 4–6 years of experience, preferably with a background in a professional services firm. Strong knowledge of the M365 suite of products. Excellent communication skills; consulting experience preferred.

Ideally, you'll also have: The analytical ability to manage multiple projects and prioritize tasks into manageable work products. The ability to operate independently or with minimal supervision.

What Working At EY Offers: At EY, we're dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: support, coaching and feedback from some of the most engaging colleagues around; opportunities to develop new skills and progress your career; and the freedom and flexibility to handle your role in a way that's right for you.

EY | Building a better working world: EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
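As a hedged illustration of the Microsoft Graph skill this role calls for (not an EY deliverable or code sample), the sketch below acquires an app-only token with MSAL and lists SharePoint sites. The tenant ID, client ID, and secret are placeholders you would supply from your own app registration.

```python
# Illustrative Graph API sketch; tenant, client ID, and secret are placeholders.
import msal
import requests

TENANT_ID = "<tenant-id>"
CLIENT_ID = "<app-registration-client-id>"
CLIENT_SECRET = "<client-secret>"

app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)

# Client-credentials flow: request the default Graph scope for the app's permissions.
token = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])
if "access_token" not in token:
    raise RuntimeError(f"Token request failed: {token.get('error_description')}")

resp = requests.get(
    "https://graph.microsoft.com/v1.0/sites?search=*",
    headers={"Authorization": f"Bearer {token['access_token']}"},
    timeout=30,
)
resp.raise_for_status()
for site in resp.json().get("value", []):
    print(site.get("displayName"), site.get("webUrl"))
```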

Posted 2 days ago

Apply

3.0 years

8 - 30 Lacs

Pune, Maharashtra, India

On-site

Industry: Information Technology Services – Cloud Data Engineering & Analytics. We empower enterprises across finance, retail, and healthcare to unlock actionable insights by modernizing their data estates on cloud-native platforms. Our on-site engineering team in India designs, builds, and optimizes high-throughput data pipelines, delivering real-time analytics and governed self-service reporting at scale.

Role & Responsibilities: Design and implement highly performant Snowflake data warehouses, including database objects, schema design, and access controls. Develop end-to-end ELT pipelines using Snowflake Snowpipe, Streams, Tasks, and Python/SQL scripts to ingest structured and semi-structured data. Optimize query performance through clustering, caching, micro-partitioning, and metadata management, ensuring sub-second analytics response times. Integrate Snowflake with AWS, Azure, and on-prem sources via S3, Kafka, APIs, and third-party ETL tools such as Matillion or Fivetran. Automate code deployment, testing, and monitoring with Git, CI/CD, and Infrastructure-as-Code frameworks (Terraform/CloudFormation). Collaborate with data architects, analysts, and business stakeholders to translate requirements into secure, reusable data models and marts.

Skills & Qualifications (Must-Have): 3+ years of hands-on Snowflake engineering with deep SQL proficiency. Expertise in designing dimensional and data-vault models and implementing ELT pipelines. Solid understanding of AWS or Azure cloud storage, IAM, and networking concepts. Proficiency in Python or Scala for data transformation and automation. Experience tuning warehouses, resource monitors, and access rules for cost governance.

Preferred: Familiarity with dbt, Airflow, or similar orchestration frameworks. Knowledge of data-mesh and data-governance best practices. Exposure to CI/CD pipelines with Docker and Kubernetes.

Benefits & Culture: Competitive salary with performance bonuses and ongoing Snowflake certification sponsorship. On-site collaborative workspace fostering continuous learning and weekly tech talks. Robust health insurance and flexible leave policy to support work-life balance.

Skills: sql, snowpipe, performance tuning, terraform, streams, cloudformation, kubernetes, tasks, snowflake data engineer, aws, snowflake, dbt, python, data modeling, azure, docker, airflow

Posted 2 days ago

Apply

3.0 - 6.0 years

8 - 24 Lacs

Pune, Maharashtra, India

On-site

A well-funded technology consulting and managed services firm in the enterprise cloud and digital transformation sector is expanding its on-site engineering team in India. We deliver mission-critical Azure architectures, migrations, and automation for BFSI, retail, and manufacturing clients, enabling high availability, security, and cost efficiency at scale.

Role & Responsibilities: Design, deploy, and harden Azure landing zones, subscriptions, and resource groups aligned with the Microsoft Cloud Adoption Framework. Automate provisioning with ARM/Bicep or Terraform, embedding governance, tagging, and policy compliance from day one. Build CI/CD pipelines in Azure DevOps that integrate testing, security scans, and multi-stage releases to AKS, App Services, and Functions. Monitor performance, cost, and security using Azure Monitor, Log Analytics, and Sentinel, driving proactive remediation. Troubleshoot production incidents across networking, storage, and compute layers, restoring services within agreed RTO/RPO. Document runbooks and mentor junior engineers on cloud best practices and Infrastructure as Code.

Skills & Qualifications (Must-Have): 3–6 years of hands-on Azure engineering across IaaS, PaaS, and serverless workloads. Expertise in Infrastructure as Code with ARM, Bicep, or Terraform. Proficiency in scripting (PowerShell or Python) for automation and DevOps tasks. Solid grasp of virtual networking (VNet peering, VPN/ExpressRoute, NSGs, and Azure Firewall). Experience configuring Azure DevOps or GitHub Actions for build, test, and release. Working knowledge of identity and access management with Azure AD and RBAC.

Preferred: Exposure to containers and orchestration with AKS and Docker. AZ-104 or AZ-305 certification. Operational knowledge of Sentinel, Defender for Cloud, and cost-management tooling. Background in Windows and Linux system administration. Client-facing consulting or migration project experience.

Benefits & Culture: On-site, engineer-driven culture with rapid career progression and sponsored Azure certifications. Access to enterprise-grade lab environments, hackathons, and global knowledge-sharing forums. Competitive salary, performance bonus, and flexible leave policy.

Skills: nsg, sentinel, identity and access management, networking, rbac, virtual networking, expressroute, infrastructure as code, containers, terraform, aks, serverless workloads, iaas, scripting, paas, vnet peering, azure engineering, azure devops, azure ad, arm, docker, vpn, bicep, powershell, cost-management, azure firewall, azure cloud engineer, python, defender for cloud, orchestration, github actions
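As a minimal sketch of the "governance and tagging from day one" idea mentioned above, the snippet below creates a resource group with mandatory tags using the Azure SDK for Python. The subscription ID, resource group name, tags, and region are placeholder assumptions; real landing zones would typically do this through ARM/Bicep or Terraform instead.

```python
# Minimal Azure SDK sketch; subscription, names, tags, and region are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

SUBSCRIPTION_ID = "<subscription-id>"
REQUIRED_TAGS = {"costCenter": "CC-1234", "environment": "dev", "owner": "platform-team"}

# Works with az login, a managed identity, or environment-variable credentials.
credential = DefaultAzureCredential()
client = ResourceManagementClient(credential, SUBSCRIPTION_ID)

# Idempotent create-or-update keeps the provisioning script safe to re-run.
rg = client.resource_groups.create_or_update(
    "rg-demo-dev-001",
    {"location": "centralindia", "tags": REQUIRED_TAGS},
)
print(f"Provisioned {rg.name} in {rg.location} with tags {rg.tags}")
```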

Posted 2 days ago

Apply

3.0 years

8 - 30 Lacs

Pune, Maharashtra, India

On-site

Industry & Sector: A fast-scaling provider of analytics & data engineering services within the enterprise software and digital transformation sector in India seeks an on-site PySpark Engineer to build and optimize high-volume data pipelines on modern big-data platforms.

Role & Responsibilities: Design, develop, and maintain PySpark-based batch and streaming pipelines for data ingestion, cleansing, transformation, and aggregation. Optimize Spark jobs for performance and cost, tuning partitions, caching strategies, and join execution plans. Integrate diverse data sources (RDBMS, NoSQL, cloud storage, and REST APIs) into unified, consumable datasets for analytics and reporting teams. Implement robust data quality, error handling, and lineage tracking using Spark SQL, Delta Lake, and metadata tools. Collaborate with Data Architects and BI teams to translate analytical requirements into scalable data models. Follow Agile delivery practices, write unit and integration tests, and automate deployments through Git-driven CI/CD pipelines.

Skills & Qualifications (Must-Have): 3+ years of hands-on PySpark development in production environments. Deep knowledge of Spark SQL, DataFrames, RDD optimizations, and performance tuning. Proficiency in Python 3, object-oriented design, and writing reusable modules. Experience with the Hadoop ecosystem, Hive/Impala, and cloud object storage such as S3, ADLS, or GCS. Strong SQL skills and understanding of star/snowflake schema modeling.

Preferred: Exposure to Delta Lake, Apache Airflow, or Kafka for orchestration and streaming. Experience deploying on Databricks or EMR and configuring autoscaling clusters. Knowledge of Docker or Kubernetes for containerized data workloads.

Benefits & Culture Highlights: Hands-on work with modern open-source tech stacks and leading cloud platforms. Mentorship from senior data engineers and architects, fostering rapid skill growth. Performance-based bonuses, skill-development stipends, and a collaborative, innovation-driven environment.

Skills: sql, hadoop ecosystem, pyspark engineer, scala, python 3, performance tuning, problem solving, apache airflow, hive, emr, kubernetes, agile, impala, pyspark, dataframes, delta lake, rdd optimizations, object-oriented design, python, spark, data modeling, databricks, spark sql, docker, hadoop, etl, kafka

Posted 2 days ago

Apply

6.0 years

10 - 42 Lacs

Pune, Maharashtra, India

On-site

Industry & Sector: A fast-growing company in the Web Application and SaaS development space, we build scalable B2B and consumer products for fintech, e-commerce and enterprise automation clients. Our on-site engineering hub in India is expanding to deliver end-to-end digital solutions using modern JavaScript, cloud and DevOps stacks.

Position: Full Stack Developer (Node.js & React)

Role & Responsibilities: Design, develop and ship high-performing web applications using React for the front end and Node.js/Express for the back end. Create RESTful and GraphQL APIs, integrate third-party services and ensure secure data exchange with SQL or NoSQL stores. Implement responsive UI components, state management and reusable design patterns for optimal user experience. Optimise application performance, conduct code reviews and enforce best-practice standards across the stack. Collaborate with Product, UX and QA teams to refine requirements, estimate effort and deliver sprint commitments on time. Automate CI/CD pipelines, unit tests and monitoring dashboards to guarantee reliability in production.

Skills & Qualifications (Must-Have): 3–6 years professional experience building full-stack JavaScript applications. Advanced proficiency in React.js, Redux/Context and modern ES6+ features. Hands-on expertise with Node.js, Express/Koa and REST API design. Solid understanding of relational (MySQL/PostgreSQL) and document databases (MongoDB). Comfort with Git workflows, Agile/Scrum and writing unit/integration tests. Ability to work on-site in India and collaborate in a fast-paced team environment.

Preferred: Experience with TypeScript, Next.js or server-side rendering. Knowledge of Docker, Kubernetes and cloud platforms (AWS, Azure or GCP). Exposure to microservices, message queues and event-driven architectures. Familiarity with performance profiling, Web Vitals and accessibility standards. Understanding of DevOps metrics, Grafana/Prometheus or similar monitoring.

Benefits & Culture Highlights: Mentor-driven engineering culture that values clean code and continuous learning. Competitive salary, performance bonuses and rapid career-growth pathways. On-site hackathons, tech talks and sponsored certifications to keep your skills future-ready. Join us to build high-impact products that reach millions while elevating your full-stack mastery.

Skills: sql, express, git, html/css, rest api, node.js, gcp, integration testing, javascript, next.js, kubernetes, mongodb, agile, aws, react.js, nosql, unit testing, typescript, azure, docker, redux

Posted 2 days ago

Apply

4.0 years

8 - 30 Lacs

Pune, Maharashtra, India

On-site

Industry & Sector: We are a high-growth software engineering consultancy serving enterprise clients in banking, e-commerce, and telecommunications, delivering robust back-end platforms built on the Java ecosystem.

Primary Title: Java Software Engineer (On-site, India)

Role & Responsibilities: Develop and enhance microservice APIs using Java 11+, Spring Boot, and Hibernate, ensuring low-latency and fault-tolerant operation. Translate business requirements into technical designs, writing clean, modular, and testable code that adheres to SOLID principles. Optimize application performance through profiling, tuning JVM parameters, and refactoring for concurrency. Integrate relational and NoSQL databases, design efficient queries, and manage data migrations. Collaborate with DevOps to containerize services with Docker and automate CI/CD pipelines in Jenkins or GitLab. Mentor junior engineers, perform code reviews, and uphold engineering excellence standards.

Skills & Qualifications (Must-Have): 4 years of professional experience building production systems in Core Java and Spring Boot. Hands-on expertise with RESTful API design, JSON, and Swagger/OpenAPI. Proficiency in SQL (MySQL, PostgreSQL) and ORM frameworks such as JPA or Hibernate. Strong grasp of data structures, algorithms, multithreading, and JVM internals. Working knowledge of Git, unit testing with JUnit/Mockito, and Agile Scrum ceremonies. Bachelor's degree in Computer Science or equivalent.

Preferred: Exposure to microservices patterns (circuit breaker, service discovery) and messaging queues like Kafka or RabbitMQ. Experience deploying containers on Kubernetes or OpenShift. Familiarity with cloud services (AWS, Azure, GCP) and infrastructure as code. Knowledge of observability stacks such as Prometheus, Grafana, and ELK. Certification in Java or a cloud platform.

Benefits & Culture Highlights: Collaborative, engineer-led culture that rewards innovation and continuous learning. On-premises training labs, sponsored certifications, and access to global tech conferences. Transparent career ladder with fast-track leadership opportunities. Join us to craft high-impact software that powers millions of daily transactions while advancing your Java mastery in a supportive, growth-oriented environment.

Skills: elk, openshift, mysql, mockito, swagger/openapi, junit, prometheus, infrastructure as code, grafana, spring boot, agile scrum, ci/cd pipelines, git, multithreading, jvm internals, jpa, gcp, sql, docker, hibernate, data structures, azure, restful api design, microservices, json, core java, kubernetes, aws, postgresql, algorithms

Posted 2 days ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

Job Role: Senior Manager – Performance Marketing

You should apply if you have: Expertise in media mix optimization and analytics in a D2C growth startup. Proven success in executing multi-channel marketing campaigns for D2C and eCommerce businesses. Prior experience and a proven record in running performance marketing campaigns for D2C businesses across website and app. Strong skills in building advanced dashboards for real-time reporting. A data-driven mindset focused on measuring and optimizing campaign performance. Experience in marketing leadership roles at D2C brands or marketplaces. A passion for scaling businesses, supported by an entrepreneurial spirit. Proficiency in analyzing marketing data, user feedback, and campaign results to shape strategies. Knowledge of A/B testing, incrementality studies, and channel optimization. The ability to develop strategies for user acquisition, retention, and lifecycle optimization. Skills in identifying and testing innovative approaches to capture market demand. A proven track record of working across teams to share insights and drive results.

You should not apply if you: Can't work in ambiguity. Can't build processes or structures. Prefer to work in a silo as an individual contributor. Are not passionate about the health and nutrition industry. Have not managed teams before. Have not worked directly with a large consumer brand before. Don't enjoy a hands-on approach and prefer a purely delegative leadership style.

Skills Required: Bachelor's/Master's degree in Marketing, Business Analytics, or related fields. Experience in marketing analytics, performance marketing, or MarTech. Advanced knowledge of marketing platforms like Google, Meta, AppsFlyer and MoEngage, with the ability to do media mix optimisation based on incrementality. Proficiency in visualization tools like Power BI and Tableau, with experience in automation. Strong analytical skills for deriving actionable insights from large datasets. Expertise in A/B testing, lift studies, and performance tracking. Excellent communication and stakeholder management.

What will you do? Lead data-driven strategies to drive growth across D2C and eCommerce channels. Develop and automate dashboards for performance monitoring and insights. Oversee campaign optimization across platforms like Amazon, Meta, and Google. Analyze customer lifecycle data to improve acquisition, retention, and conversion. Manage marketing budgets to maximize ROI through targeted campaigns. Collaborate with internal and external stakeholders for seamless project execution. Stay updated on trends and leverage analytics for strategic decision-making.

Work Experience: Proven experience in analytics-focused marketing roles within D2C or eCommerce.

Working Days: Monday – Friday

Location: Golf Course Ex Road, Sector 58, Gurugram, Haryana

Perks: Friendly atmosphere, high learning and personal growth opportunities, and a diverse work environment.

About Nutrabay: Nutrabay is one of the largest health & nutrition stores in India. We are proudly a bootstrapped business with lakhs of customers who trust us. Our vision is to keep growing, maintain a sustainable business model and continue to be the market leader in this segment by launching many innovative products.

Why Nutrabay: We believe in an open, intellectually honest culture where everyone is given the autonomy to contribute and do their life's best work. As part of the dynamic team at Nutrabay, you will have a chance to learn new things, solve new problems, build your competence and be part of an innovative marketing-and-tech startup that's revolutionising the health industry. Working with Nutrabay can be fun and a place of unique growth opportunity. Here you will learn how to maximise the potential of your available resources, and you will get the opportunity to do work that helps you master a variety of transferable skills that are relevant across roles and departments. You will feel appreciated and valued for the work you deliver. We are creating a unique company culture that embodies respect and honesty, which builds more loyal employees than a company that simply shells out cash. We trust our employees and their voice, and we ask for their opinions on important business issues.

Funding: We raised $5 million in Series A funding.

Posted 2 days ago

Apply

4.0 years

8 - 30 Lacs

Gurugram, Haryana, India

On-site

Industry & Sector: We are a high-growth software engineering consultancy serving enterprise clients in banking, e-commerce, and telecommunications, delivering robust back-end platforms built on the Java ecosystem.

Primary Title: Java Software Engineer (On-site, India)

Role & Responsibilities: Develop and enhance microservice APIs using Java 11+, Spring Boot, and Hibernate, ensuring low-latency and fault-tolerant operation. Translate business requirements into technical designs, writing clean, modular, and testable code that adheres to SOLID principles. Optimize application performance through profiling, tuning JVM parameters, and refactoring for concurrency. Integrate relational and NoSQL databases, design efficient queries, and manage data migrations. Collaborate with DevOps to containerize services with Docker and automate CI/CD pipelines in Jenkins or GitLab. Mentor junior engineers, perform code reviews, and uphold engineering excellence standards.

Skills & Qualifications (Must-Have): 4 years of professional experience building production systems in Core Java and Spring Boot. Hands-on expertise with RESTful API design, JSON, and Swagger/OpenAPI. Proficiency in SQL (MySQL, PostgreSQL) and ORM frameworks such as JPA or Hibernate. Strong grasp of data structures, algorithms, multithreading, and JVM internals. Working knowledge of Git, unit testing with JUnit/Mockito, and Agile Scrum ceremonies. Bachelor's degree in Computer Science or equivalent.

Preferred: Exposure to microservices patterns (circuit breaker, service discovery) and messaging queues like Kafka or RabbitMQ. Experience deploying containers on Kubernetes or OpenShift. Familiarity with cloud services (AWS, Azure, GCP) and infrastructure as code. Knowledge of observability stacks such as Prometheus, Grafana, and ELK. Certification in Java or a cloud platform.

Benefits & Culture Highlights: Collaborative, engineer-led culture that rewards innovation and continuous learning. On-premises training labs, sponsored certifications, and access to global tech conferences. Transparent career ladder with fast-track leadership opportunities. Join us to craft high-impact software that powers millions of daily transactions while advancing your Java mastery in a supportive, growth-oriented environment.

Skills: elk, openshift, mysql, mockito, swagger/openapi, junit, prometheus, infrastructure as code, grafana, spring boot, agile scrum, ci/cd pipelines, git, multithreading, jvm internals, jpa, gcp, sql, docker, hibernate, data structures, azure, restful api design, microservices, json, core java, kubernetes, aws, postgresql, algorithms

Posted 2 days ago

Apply

6.0 years

10 - 42 Lacs

Gurugram, Haryana, India

On-site

Industry & Sector: A fast-growing company in the Web Application and SaaS development space, we build scalable B2B and consumer products for fintech, e-commerce and enterprise automation clients. Our on-site engineering hub in India is expanding to deliver end-to-end digital solutions using modern JavaScript, cloud and DevOps stacks.

Position: Full Stack Developer (Node.js & React)

Role & Responsibilities: Design, develop and ship high-performing web applications using React for the front end and Node.js/Express for the back end. Create RESTful and GraphQL APIs, integrate third-party services and ensure secure data exchange with SQL or NoSQL stores. Implement responsive UI components, state management and reusable design patterns for optimal user experience. Optimise application performance, conduct code reviews and enforce best-practice standards across the stack. Collaborate with Product, UX and QA teams to refine requirements, estimate effort and deliver sprint commitments on time. Automate CI/CD pipelines, unit tests and monitoring dashboards to guarantee reliability in production.

Skills & Qualifications (Must-Have): 3–6 years professional experience building full-stack JavaScript applications. Advanced proficiency in React.js, Redux/Context and modern ES6+ features. Hands-on expertise with Node.js, Express/Koa and REST API design. Solid understanding of relational (MySQL/PostgreSQL) and document databases (MongoDB). Comfort with Git workflows, Agile/Scrum and writing unit/integration tests. Ability to work on-site in India and collaborate in a fast-paced team environment.

Preferred: Experience with TypeScript, Next.js or server-side rendering. Knowledge of Docker, Kubernetes and cloud platforms (AWS, Azure or GCP). Exposure to microservices, message queues and event-driven architectures. Familiarity with performance profiling, Web Vitals and accessibility standards. Understanding of DevOps metrics, Grafana/Prometheus or similar monitoring.

Benefits & Culture Highlights: Mentor-driven engineering culture that values clean code and continuous learning. Competitive salary, performance bonuses and rapid career-growth pathways. On-site hackathons, tech talks and sponsored certifications to keep your skills future-ready. Join us to build high-impact products that reach millions while elevating your full-stack mastery.

Skills: sql, express, git, html/css, rest api, node.js, gcp, integration testing, javascript, next.js, kubernetes, mongodb, agile, aws, react.js, nosql, unit testing, typescript, azure, docker, redux

Posted 2 days ago

Apply

0 years

0 Lacs

Bengaluru East, Karnataka, India

On-site

Senior Associate Application Developer Bangalore, Karnataka, India The mission of the Power Platform (PWP) Center of Excellence at AXA XL is to leverage Power Platform tools to improve business activities by reducing costs through automation and digitization of existing operations, growing revenue through improving existing operational processes, reducing risk by eliminating shadow IT and the use of unapproved applications, and transforming the business through innovation, strategic alignment and contributions. The overarching vision is a coordinating function that ensures change (application development) initiatives are delivered consistently and efficiently, through standard processes and competent staff, driving innovation and improvement and breaking down silos to share knowledge and successes. Miro is an online collaborative whiteboard platform designed for teams to brainstorm, plan, and organize ideas visually. It supports real-time collaboration, offers a wide range of templates and pre-built elements, and integrates with tools like Jira. The Enterprise Plan includes advanced security, compliance, and dedicated customer support. The Power Platform (PWP) Engineer is a significant role within the PWP COE team, working closely with citizen developers, architects, testers and stakeholders to deliver high-quality application changes that meet the needs of our insurance business stakeholders. This position is expected to provide excellent service to internal customers/stakeholders and requires experience sustaining the entire Power Platform service while collaborating with other technical teams to deliver successful projects within the platform. An excellent understanding of security related to Microsoft collaboration platforms is highly desirable. Miro, as a SaaS product, is jointly supported by the Power Platform @ AXA XL and the Miro Product Team @ AXA XL. Power Automate cloud flows are used to support the ticket processing handled by the Miro Service Desk team. What You’ll Be DOING What will your essential responsibilities include? Develop, maintain, and test applications and automations using Power Platform products for COE use. Follow standard development practices for implementing automation solutions. Participate in mentoring and development activities within the development community. Review and handle PWP Silva tickets for new environments, connectors, development, and Power BI workspaces. Ensure the maintenance of PWP @ AXA XL COE components, such as service accounts. Assist with the implementation plans for features released within the PWP/Miro suite. Facilitate the development and maintenance of PWP/Miro champion networks. Act as an SME for PWP/Miro, coordinating with design and architecture teams to ensure high service quality. Create documentation, reports, articles, and presentations for training and knowledge sharing. Manage service levels to meet all commitments and address variances to minimize impact on cost, schedule, and quality. Adhere to internal procedures and audit requirements, such as data requests and related activities. Establish and sustain relationships with internal stakeholders to foster enduring and successful partnerships. Review, mentor, and coach, while promoting standards, best practices, and lessons learned. Stay updated with emerging technologies. Collaborate with end users, product analysts, and developers to comprehend features and technical implementations.
Complete Software Development Lifecycle deliverables promptly and accurately, ensuring they are auditable, testable, and maintainable. Estimate work requests with varying degrees of confidence. Meet with internal customers, technical teams and stakeholders to discuss requirements and prepare documentation or presentations. You will report to the Application Manager. What You Will BRING We’re looking for someone who has these abilities and skills: Required Skills And Abilities Agile software development lifecycle experience. Experience with Jira features and functionalities. Experience with Silva features and functionalities. Stakeholder management. Power BI dashboard and report building. A highly motivated candidate possessing enthusiasm, excellent communication skills, the ability to rapidly acquire new knowledge, and a commitment to delivering value. Ability to manage your individual workload and collaborate well with other members of the COE/Product teams. Bachelor's degree (or equivalent industry experience) in science or engineering, with software experience or education. Experience with agile software development practices, specifically the Scrum Agile paradigm. Desired Skills And Abilities Experience with test/behavior-driven development. Experience with good test design and application development. Experience working with third-party vendors. Experience with .NET Development/DevOps/Full Stack. Experience with Python. Knowledge of scripting languages such as PowerShell, Azure CLI, and Bash. Exposure to these and other preview Power Platform products: Power Virtual Agents/Copilot Studio, Power Pages, Dataverse. Experience with Microsoft Office 365 workload implementation and administration - Teams, SharePoint Online, etc. Understanding of security concepts such as identity management, HTTPS certification, and identity federation. Experience with Power Platform tenant administration. Experience with the Azure platform. Familiarity with governance practices and strategies. SCIM implementation experience. Miro Addin enablement experience. Miro Guard implementation experience. Who WE Are AXA XL, the P&C and specialty risk division of AXA, is known for solving complex risks. For mid-sized companies, multinationals and even some inspirational individuals, we don’t just provide re/insurance, we reinvent it. How? By combining a comprehensive and efficient capital platform, data-driven insights, leading technology, and the best talent in an agile and inclusive workspace, empowered to deliver top client service across all our lines of business − property, casualty, professional, financial lines and specialty. With an innovative and flexible approach to risk solutions, we partner with those who move the world forward. Learn more at axaxl.com What We OFFER Inclusion AXA XL is committed to equal employment opportunity and will consider applicants regardless of gender, sexual orientation, age, ethnicity and origins, marital status, religion, disability, or any other protected characteristic. At AXA XL, we know that an inclusive culture enables business growth and is critical to our success. That’s why we have made a strategic commitment to attract, develop, advance and retain the most inclusive workforce possible, and create a culture where everyone can bring their full selves to work and reach their highest potential. It’s about helping one another — and our business — to move forward and succeed.
Five Business Resource Groups focused on gender, LGBTQ+, ethnicity and origins, disability and inclusion with 20 Chapters around the globe. Robust support for Flexible Working Arrangements. Enhanced family-friendly leave benefits. Named to the Diversity Best Practices Index. Signatory to the UK Women in Finance Charter. Learn more at axaxl.com/about-us/inclusion-and-diversity. AXA XL is an Equal Opportunity Employer. Total Rewards AXA XL’s Reward program is designed to take care of what matters most to you, covering the full picture of your health, wellbeing, lifestyle and financial security. It provides competitive compensation and personalized, inclusive benefits that evolve as you do. We’re committed to rewarding your contribution for the long term, so you can be your best self today and look forward to the future with confidence. Sustainability At AXA XL, Sustainability is integral to our business strategy. In an ever-changing world, AXA XL protects what matters most for our clients and communities. We know that sustainability is at the root of a more resilient future. Our 2023-26 Sustainability strategy, called “Roots of resilience”, focuses on protecting natural ecosystems, addressing climate change, and embedding sustainable practices across our operations. Our Pillars Valuing nature: How we impact nature affects how nature impacts us. Resilient ecosystems - the foundation of a sustainable planet and society - are essential to our future. We’re committed to protecting and restoring nature - from mangrove forests to the bees in our backyard - by increasing biodiversity awareness and inspiring clients and colleagues to put nature at the heart of their plans. Addressing climate change: The effects of a changing climate are far-reaching and significant. Unpredictable weather, increasing temperatures, and rising sea levels cause both social inequalities and environmental disruption. We’re building a net zero strategy, developing insurance products and services, and mobilizing to advance thought leadership and investment in societal-led solutions. Integrating ESG: All companies have a role to play in building a more resilient future. Incorporating ESG considerations into our internal processes and practices builds resilience from the roots of our business. We’re training our colleagues, engaging our external partners, and evolving our sustainability governance and reporting. AXA Hearts in Action: We have established volunteering and charitable giving programs to help colleagues support causes that matter most to them, known as AXA XL’s “Hearts in Action” programs. These include our Matching Gifts program, Volunteering Leave, and our annual volunteering day - the Global Day of Giving. For more information, please see axaxl.com/sustainability.

Posted 3 days ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

Remote

We’re Hiring: Mobile Application Developer (Node.js) Location: "Remote" Experience: 5+ Years CTC: ₹5–8 LPA Company: iCloudEMS – Education Management Solutions About Us iCloudEMS is a trusted name in delivering cloud-based Education ERP solutions for higher education institutions. We empower universities and colleges to automate and streamline academic and administrative processes, making education more efficient and accessible. Role Overview We’re looking for a skilled Mobile Application Developer with strong expertise in Node.js to build and maintain high-performing, scalable mobile apps for our ERP and EdTech platforms. You'll work closely with our product and backend teams to deliver seamless user experiences. What You’ll Do Develop and maintain mobile applications using React Native or Flutter. Build robust backend services and RESTful APIs using Node.js. Collaborate with cross-functional teams to define, design, and ship new features. Integrate third-party APIs and internal microservices. Optimize applications for maximum speed and scalability. Participate in code reviews and follow best development practices. What We’re Looking For 5+ years of experience in mobile app development. Strong proficiency in Node.js and JavaScript. Basic knowledge of PHP. Hands-on experience with React Native or Flutter. Experience with REST APIs, Git, and mobile debugging tools. Familiarity with databases like MongoDB, MySQL, or PostgreSQL. Good understanding of app architecture and security best practices. Strong problem-solving and communication skills. Nice to Have Experience in EdTech or ERP systems. Familiarity with AWS, Firebase, or CI/CD tools. Knowledge of Docker or containerization.

Posted 3 days ago

Apply

0 years

0 Lacs

Bengaluru East, Karnataka, India

On-site

Bangalore, Karnataka, India AXA XL is creating a new delivery model based on agile and a new vendor model to enable more efficient delivery of technology, with the business fully embedded in what we deliver. This Digital Factory will first deliver new capabilities for Fusion Transformation and will then be rolled out to other capabilities within GT (Global Technology). As a Platform Engineer, your role is crucial in maintaining and optimizing the reliability, availability, and performance of our Salesforce platform. You'll collaborate with cross-functional teams to ensure a seamless experience for users and contribute to the overall success of our business processes. What You’ll Be DOING What will your essential responsibilities include? Customize and configure the Salesforce platform to enhance its reliability, incorporating features like custom objects, fields, workflows, and validation rules. Manage user accounts, profiles, and permission sets, ensuring optimal access levels and security settings. Import, export, and maintain data within Salesforce, conducting cleanup and deduplication when necessary. Experience in Salesforce administration, including user account maintenance, security settings, and workflow rules. Experience in application support, including triaging incidents and assisting support teams in resolving issues. Proven experience in optimizing the Salesforce platform for performance, security, and audit. Experience in understanding and implementing Salesforce AI capabilities will be an added advantage. Hands-on experience implementing integrations with various systems. Provide support for SSO, certificates, backup & recovery, and key encryption issues. Monitor transactions and performance with Dynatrace, including event monitoring, notifications, and logging errors. Support pen testing requirements. Onboard all privileged users to CyberArk and non-privileged users to Aveksa for regular user access management. Rotate service/integration account credentials every 60 days and store them in CyberArk Safe. Conduct a proof of concept on using Azure Key Vault as an alternative to CyberArk to automate credential rotation. Perform non-release-dependent changes based on requirements and approvals. Implement findings of Salesforce Optimizer to enhance the performance and security of the platform. Implement and maintain automated deployment pipelines for Salesforce applications using CI/CD tools. Ensure smooth and efficient deployment processes across different Salesforce environments. Develop and maintain automation processes using Salesforce Process Builder, Workflow Rules, and Flow to streamline business processes. Create and maintain reports and dashboards to offer insights into sales, marketing, and customer service performance. Collaborate with different departments to understand their requirements and provide Salesforce solutions. Provide user support, answer questions, and offer training to ensure effective platform utilization. Work with the IT team to integrate Salesforce with other business systems and applications. Maintain data security and compliance with relevant regulations and company policies. Keep detailed records of configuration changes, customizations, and user guides for the Salesforce platform. Excellent understanding of Salesforce, including experience with Sales Cloud, Service Cloud, or other Salesforce products. Salesforce Administrator certification is often required or highly recommended.
Proficiency in using Salesforce tools and features, with familiarity in data management, database concepts, and reporting. In-depth knowledge of Salesforce architecture, components, and deployment mechanisms. Understanding of Salesforce security features and best practices. Ability to analyze complex business processes and design solutions within Salesforce. Excellent communication skills for collaboration, understanding requirements, and providing effective training and support. Project management skills for handling Salesforce implementations, upgrades, and integrations. Attention to detail to maintain data accuracy and system integrity. Ability to troubleshoot and resolve technical issues, collaborating with Salesforce support when necessary. Collaborative attitude to work with different teams and departments. Stay up-to-date with new Salesforce features and best practices. Must have: Salesforce Admin/Developer background, Dynatrace, Commvault, Shield Encryption, Azure Key Vault, SSO, CyberArk. You will report to the Platform Lead. What You Will BRING We’re looking for someone who has these abilities and skills: Required Skills And Abilities Excellent understanding of Salesforce, including experience with Sales Cloud, Service Cloud, or other Salesforce products. Salesforce Administrator certification is often required or highly recommended. Proficiency in using Salesforce tools and features, with familiarity in data management, database concepts, and reporting. Ability to analyze complex business processes and design solutions within Salesforce. Excellent communication skills for collaboration, understanding requirements, and providing effective training and support. Desired Skills And Abilities Adaptability to new/different strategies, programs, technologies, practices, cultures, etc.; comfortable with change, able to make transitions easily. Effective communication skills, both verbal and written. Proven ability to clearly articulate goals and desired outcomes and influence key decisions to ensure deliverables are met. Proven ability to establish and maintain effective relationships and leverage those relationships to deliver on goals. Ability to effectively integrate colleagues and teams that are currently disparate, introducing new technologies and processes. Proven planning and organization skills, creating work schedules, prioritizing workload, preparing in advance, and setting realistic timescales. Bachelor’s degree or equivalent work experience. Who WE Are AXA XL, the P&C and specialty risk division of AXA, is known for solving complex risks. For mid-sized companies, multinationals and even some inspirational individuals, we don’t just provide re/insurance, we reinvent it. How? By combining a comprehensive and efficient capital platform, data-driven insights, leading technology, and the best talent in an agile and inclusive workspace, empowered to deliver top client service across all our lines of business − property, casualty, professional, financial lines and specialty. With an innovative and flexible approach to risk solutions, we partner with those who move the world forward. Learn more at axaxl.com What We OFFER Inclusion AXA XL is committed to equal employment opportunity and will consider applicants regardless of gender, sexual orientation, age, ethnicity and origins, marital status, religion, disability, or any other protected characteristic. At AXA XL, we know that an inclusive culture enables business growth and is critical to our success.
That’s why we have made a strategic commitment to attract, develop, advance and retain the most inclusive workforce possible, and create a culture where everyone can bring their full selves to work and reach their highest potential. It’s about helping one another — and our business — to move forward and succeed. Five Business Resource Groups focused on gender, LGBTQ+, ethnicity and origins, disability and inclusion with 20 Chapters around the globe. Robust support for Flexible Working Arrangements. Enhanced family-friendly leave benefits. Named to the Diversity Best Practices Index. Signatory to the UK Women in Finance Charter. Learn more at axaxl.com/about-us/inclusion-and-diversity. AXA XL is an Equal Opportunity Employer. Total Rewards AXA XL’s Reward program is designed to take care of what matters most to you, covering the full picture of your health, wellbeing, lifestyle and financial security. It provides competitive compensation and personalized, inclusive benefits that evolve as you do. We’re committed to rewarding your contribution for the long term, so you can be your best self today and look forward to the future with confidence. Sustainability At AXA XL, Sustainability is integral to our business strategy. In an ever-changing world, AXA XL protects what matters most for our clients and communities. We know that sustainability is at the root of a more resilient future. Our 2023-26 Sustainability strategy, called “Roots of resilience”, focuses on protecting natural ecosystems, addressing climate change, and embedding sustainable practices across our operations. Our Pillars Valuing nature: How we impact nature affects how nature impacts us. Resilient ecosystems - the foundation of a sustainable planet and society - are essential to our future. We’re committed to protecting and restoring nature - from mangrove forests to the bees in our backyard - by increasing biodiversity awareness and inspiring clients and colleagues to put nature at the heart of their plans. Addressing climate change: The effects of a changing climate are far-reaching and significant. Unpredictable weather, increasing temperatures, and rising sea levels cause both social inequalities and environmental disruption. We’re building a net zero strategy, developing insurance products and services, and mobilizing to advance thought leadership and investment in societal-led solutions. Integrating ESG: All companies have a role to play in building a more resilient future. Incorporating ESG considerations into our internal processes and practices builds resilience from the roots of our business. We’re training our colleagues, engaging our external partners, and evolving our sustainability governance and reporting. AXA Hearts in Action: We have established volunteering and charitable giving programs to help colleagues support causes that matter most to them, known as AXA XL’s “Hearts in Action” programs. These include our Matching Gifts program, Volunteering Leave, and our annual volunteering day - the Global Day of Giving. For more information, please see axaxl.com/sustainability.

Posted 3 days ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Overview Job Purpose At ICE, we never rest. We are on a mission as a team. We are problem solvers and partners, always starting with our customers to solve their challenges and create opportunities. Our start-up roots keep us nimble, flexible, and moving fast. We take ownership and make decisions. We all work for one company and work together to drive growth across the business. We engage in robust debates to find the best path, and then we move forward as one team. We take pride in what we do, acting with integrity and passion, so that our customers can perform better. We are experts and enthusiasts - combining ever-expanding knowledge with leading technology to consistently deliver results, solutions and opportunities for our customers and stakeholders. Every day we work toward transforming global markets. The Manager, Systems Engineering is responsible for guiding and supporting a team of engineers in creating and maintaining the core automation services within ICE’s infrastructure. This role involves setting clear objectives, providing resources, and ensuring that the team adheres to high standards for automated tooling. A successful manager will oversee the software development lifecycle, establish best practices, and foster a collaborative environment where engineers can excel. Key responsibilities include mentoring team members, providing constructive feedback, and facilitating professional development. Additionally, the manager will drive root cause analysis discussions, recommend and approve automation tools, and actively participate in strategic technical decisions to align team efforts with business goals. Responsibilities Assist in the design, planning and implementation of solutions using the Python programming language. Provide education and mentorship to team members, operations staff, and other departments on best practices and automation methodologies. Delegate, assign, and ensure work is completed by subordinate staff. Assist in the design, planning and implementation of server and automation solutions. Tune and design systems infrastructure for maximum available performance. Automate manual tasks through scripting. Oversee the development and maintenance of automation scripts in Ansible and Python, ensuring the delivery of reusable, testable, and efficient code. Collaborate with cross-functional teams to facilitate the development and upkeep of RESTful APIs while ensuring adherence to best practices. Mentor and coach junior developers to enhance their skills and knowledge. Facilitate productive code reviews, design reviews, and architecture discussions. Oversee the analysis, programming, and modification of software enhancement requests. Identify and resolve complex software development challenges promptly, providing technical guidance to the team. Collaborate with internal teams to understand business and functional requirements to ensure automation processes meet organizational needs. Guide the team in using various architectures, tools, and frameworks to automate internal processes. Actively participate in technical analysis, problem resolution, and proposing solutions. Coordinate efforts across developers, operations staff, and release engineers, fostering a service-oriented team environment. Manage on-call rotations to ensure efficient after-hours support. Lead the team to ensure robust support for production operations in a 24/7 environment, driving technical excellence to meet organizational goals.
Knowledge And Experience 5+ years of experience engineering Operating Systems, as well as Software Development Engineering, Tools Automation, or a similar role in platform delivery. Experience as a people manager or in a team lead role with delegation duties. Degree in an engineering discipline or equivalent experience in Systems Engineering / Development. Solid experience coding with any one or a combination of PowerShell, Python, Ruby, etc. Fundamental understanding of SDLC processes and tools (Git, Puppet, etc.). Experience with automation/configuration management using Puppet, Chef, Ansible or equivalent. Working knowledge of multi-tiered, highly available, and resilient application design. Working knowledge of horizontal and vertical scaling for performance and high availability. Top-tier critical thinking, analytics, and problem-solving skills. Ability to work in a service-oriented team environment. Strong understanding of project management, organization, and time management. Customer focus and dedication to the best possible user experience. Ability to communicate effectively with technical or business resources. Understanding of Continuous Integration and Delivery concepts. Fluent speaking, reading, and writing in English. Desired Knowledge And Experience Experience working in a GitOps organization with a drive to automate everything. Working knowledge of the creation, support and deployment of Docker containers. Working knowledge of the setup and configuration of Kubernetes. 2+ years of experience in Ansible code development.

Posted 3 days ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Introduction In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology. Your Role And Responsibilities As a consultant, you will serve as a client-facing practitioner who sells, leads and implements expert services utilizing the breadth of IBM's offerings and technologies. A successful Consultant is regarded by clients as a trusted business advisor who collaborates to provide innovative solutions used to solve the most challenging business problems. You will work on developing solutions that excel at user experience, style, performance, reliability and scalability to reduce costs and improve profit and shareholder value. Your Primary Responsibilities Include Build, automate and release solutions based on clients' priorities and requirements. Explore and discover risks, resolve issues that affect release scope, schedule and quality, and bring potential solutions to the table. Make sure that all integration solutions meet the client specifications and are delivered on time. Preferred Education Master's Degree Required Technical And Professional Expertise Business change adoption: Develop and deploy a change management approach including stakeholder analysis, change impact analysis, communication plan, sponsor roadmap, coaching plan, training plan, resistance management plan and adoption plan. OCM COE: Build Lenovo corporate-level change management competency and a change mindset and culture. Provides technical guidance to the business in the area of expertise. Understands how areas within departments integrate to drive functional or business unit objectives. Provides some input on technical direction and strategy. Preferred Technical And Professional Experience Excellent written and oral communication and interpersonal skills. PROSCI, APMG certification. Creating communication plans/strategies, sending impactful communications, building narratives around progress, measures and KPIs, and understanding how to utilize different comms channels.

Posted 3 days ago

Apply

3.0 years

0 Lacs

India

Remote

About BeGig BeGig is the leading tech freelancing marketplace. We empower innovative, early-stage, non-tech founders to bring their visions to life by connecting them with top-tier freelance talent. By joining BeGig, you’re not just taking on one role—you’re signing up for a platform that will continuously match you with high-impact opportunities tailored to your expertise. Your Opportunity Join our network as a Data Engineer and work directly with visionary startups to design, build, and optimize data pipelines and systems. You’ll help transform raw data into actionable insights, ensuring that data flows seamlessly across the organization to support informed decision-making. Enjoy the freedom to structure your engagement on an hourly or project basis—all while working remotely. Role Overview As a Data Engineer, you will: Design & Develop Data Pipelines: Build and maintain scalable, robust data pipelines that power analytics and machine learning initiatives. Optimize Data Infrastructure: Ensure data is processed efficiently, securely, and in a timely manner. Collaborate & Innovate: Work closely with data scientists, analysts, and other stakeholders to streamline data ingestion, transformation, and storage. What You’ll Do Data Pipeline Development: Design, develop, and maintain end-to-end data pipelines using modern data engineering tools and frameworks. Automate data ingestion, transformation, and loading processes across various data sources. Implement data quality and validation measures to ensure accuracy and reliability. Infrastructure & Optimization: Optimize data workflows for performance and scalability in cloud environments (AWS, GCP, or Azure). Leverage tools such as Apache Spark, Kafka, or Airflow for data processing and orchestration. Monitor and troubleshoot pipeline issues, ensuring smooth data operations. Technical Requirements & Skills Experience: 3+ years in data engineering or a related field. Programming: Proficiency in Python, SQL, and familiarity with Scala or Java is a plus. Data Platforms: Experience with big data technologies like Hadoop, Spark, or similar. Cloud: Working knowledge of cloud-based data solutions (e.g., AWS Redshift, BigQuery, or Azure Data Lake). ETL & Data Warehousing: Hands-on experience with ETL processes and data warehousing solutions. Tools: Familiarity with data orchestration tools such as Apache Airflow or similar. Database Systems: Experience with both relational (PostgreSQL, MySQL) and NoSQL databases. What We’re Looking For A detail-oriented data engineer with a passion for building efficient, scalable data systems. A proactive problem-solver who thrives in a fast-paced, dynamic environment. A freelancer with excellent communication skills and the ability to collaborate with cross-functional teams. Why Join Us? Immediate Impact: Tackle challenging data problems that drive real business outcomes. Remote & Flexible: Work from anywhere with engagements structured to suit your schedule. Future Opportunities: Leverage BeGig’s platform to secure additional data-focused roles as your expertise grows. Innovative Work: Collaborate with startups at the forefront of data innovation and technology. Ready to Transform Data? Apply now to become a key Data Engineer for our client and a valued member of the BeGig network!

Posted 3 days ago

Apply

4.0 - 8.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Job Description The candidate will be responsible for forecast performance analysis across the model inventory, conceptualization of metrics to track performance at the model and aggregated level, and evaluation of different macro-economic and business drivers and their corresponding impact on models. The candidate will also be involved in maintaining the macroeconomic variables inventory across PPNR forecasting models, conducting reviews of different variables and expected linkages between different variable types, performing sensitivity testing, and supporting model developers with required information on different macro drivers. The candidate will evaluate innovative techniques to identify performance-improving drivers in models to aid future iterations. The candidate will be required to analyze granular-level data, identify patterns and outlier behavior across different balance sheet and income streams, and drive stakeholder insights via a thorough understanding of the data. The role would include conceptualization of quantitative and qualitative techniques and decision-based modeling approaches for incorporation of drivers with sparse data into models. The candidate will be required to have a strong understanding of different macro-economic variables, and an ability to understand and interpret macro variable and index computations. The candidate should be able to understand interlinkages between variables and interpret economic scenarios in terms of possible impact on different variables. Key tasks would include identification and analysis of outliers in historical series, conducting comparative analysis across different scenarios to assess trends and anomalies, and performing sensitivity analysis of macro drivers on model forecasts. The candidate should possess relevant programming knowledge (such as Python, R, SAS or SQL) to automate data processes, generate summary reports, and present outcomes to senior Business/Finance stakeholders, supporting data-driven insights with impactful representation. Required Skillsets Bachelor’s/Master’s degree in economics/statistics/engineering, MBA, or any degree with a focus on economics and analytics Past experience analyzing large datasets to extract relevant information and drive business insights 4-8 years of experience working in analytics, with exposure to analyzing macro-economic variables Good command of Python/R/SAS, SQL and Excel; proficient in handling and analyzing large datasets from multiple source systems and alternate analytical/visualization tools Detail-oriented with a good understanding of statistical models and forecasting techniques Adequate understanding of finance, banking business and products and associated income streams. Good presentation and interpretation skills Ability to communicate with and manage senior stakeholders across cross-functional teams Ready to work in a collaborative environment with teams Creative thinking for identifying new opportunities ------------------------------------------------------ Job Family Group: Risk Management ------------------------------------------------------ Job Family: Risk Analytics, Modeling, and Validation ------------------------------------------------------ Time Type: ------------------------------------------------------ Most Relevant Skills Credible Challenge, Data Analysis, Laws and Regulations, Management Reporting, Policy and Procedure, Referral and Escalation, Risk Controls and Monitors, Risk Identification and Assessment, Risk Remediation.
------------------------------------------------------ Other Relevant Skills Financial Planning and Analysis (FP&A), Model Governance, Model Transformations, Performance Modeling. ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.

Posted 3 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies