
30073 GCP Jobs - Page 36

Set up a Job Alert
JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

6.0 - 10.0 years

0 Lacs

Haryana

On-site

As a core member of Periscope's technology team at McKinsey, you will play a vital role in developing and deploying enterprise products, ensuring the firm stays at the forefront of technology. Your responsibilities will include software development projects, focusing on building and improving deployment pipelines, automation, and toolsets for cloud-based solutions on the AWS, GCP, and Azure platforms. You will also work across database management, Kubernetes cluster setup, performance tuning, and continuous delivery.

Your role will be hands-on: you will spend approximately 80% of your time on software development tasks related to deployment pipelines and cloud-based solutions. You will continually expand your expertise by independently experimenting with new technologies, frameworks, and approaches, and your strong understanding of agile engineering practices will enable you to guide teams on improving their engineering processes. In this position, you will not only contribute to software development projects but also coach and mentor other DevOps engineers to strengthen the organization's capability.

You will be based in either the Bangalore or Gurugram office as a member of Periscope's technology team within McKinsey. Periscope is McKinsey's asset-based arm in the Marketing & Sales practice, and the role reflects the firm's commitment to innovation and exceptional client solutions. By combining consulting approaches with technology solutions, Periscope delivers actionable insights that drive revenue growth and optimize commercial decision-making. The Periscope platform offers a unique blend of intellectual property, prescriptive analytics, and cloud-based tools, delivering over 25 solutions focused on insights and marketing.
Your qualifications should include a Bachelor's or Master's degree in computer science or a related field, along with a minimum of 6 years of experience in technology solutions, particularly in microservices architectures and SaaS delivery models. You should demonstrate expertise in working with leading cloud providers such as AWS, GCP, and Azure, as well as proficiency in Linux systems and DevOps tools like Ansible, Terraform, Helm, Docker, and Kubernetes. Furthermore, your experience should cover areas such as load balancing, network security, API management, and supporting web front-end and data-intensive applications.

Your problem-solving skills, systems design expertise in Python, and strong monitoring and troubleshooting abilities will be essential in contributing to development tasks and ensuring uninterrupted system operation. Your familiarity with automation practices and modern monitoring tools like Elastic, Sentry, and Grafana will also be valuable in upholding system quality attributes. Strong communication skills and the ability to collaborate effectively in a team environment are crucial for conveying technical decisions, fostering alignment, and thriving in high-pressure situations. Your role at Periscope by McKinsey will involve not only technical contributions but also leadership in enhancing the team's capabilities and driving continuous improvement in technology solutions.

Posted 2 days ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

The Red Hat Customer Experience and Engagement (CEE) team is looking for an experienced engineer to join our Solutions Support team in LOCATION. In this role, you will become an expert in Red Hat's offerings and technologies, like Red Hat OpenShift, Red Hat Enterprise Linux (RHEL), and Red Hat Ansible Automation Platform. You'll provide skilful, direct technical support to a very small subset of our customers, working closely with your customer, Red Hat's Global Support team, Critical Accounts team, and Engineering teams. You will interact with some of Red Hat's most strategic, critical, and innovative customers. You will be a part of Red Hat's unique culture, enriched with open practices in management, decision-making, DEI, and associate growth. Red Hat consistently ranks as one of the best workplaces in technology due to our culture and our focus on associate growth, work/life balance, and associate opportunity. You'll be able to bring innovative solutions to complex problems, and you will have the opportunity to be a part of several Red Hat recognition programs to connect, recognize, and celebrate success. As a Red Hat engineer, you can collaborate with international teams to improve open-source software.

Responsibilities include: providing high-level technical support to your customers through web-based support and phone support; working with Red Hat enterprise customers across the globe on a 24x7 basis, which requires you to work in different shifts periodically; meeting with your customers regularly to ensure that Red Hat is aligned with the customers' support priorities; collaborating with other Red Hat teams engaged with your customers; performing technical diagnostics and troubleshooting customer technical issues to develop solutions; exceeding customer expectations with outstanding communication and customer service; and consulting with and developing relationships with Red Hat engineering teams to guide solutions and improve customer satisfaction.
Share your knowledge by contributing to the global Red Hat Knowledge Management System, and present troubleshooting instructions and solutions to other engineers within Red Hat.

Requirements: 5+ years of relevant experience. Ability to communicate clearly and effectively with your customer across technical and non-technical communications. Excellent troubleshooting and debugging skills, and a passion for technical investigation and issue resolution. Linux system administration experience, including system installation, configuration, and maintenance. Basic knowledge of Linux containers. Experience with container orchestration (Kubernetes) and cloud services such as AWS, Azure, and GCP; knowledge of Ansible and YAML; Linux scripting experience; and an understanding of typical change windows/change controls. A prior Red Hat Certified Engineer (RHCE) or other Linux certification is preferred; a successful associate in this role is expected to be able to pass the RHCE certification within 90 days.

Red Hat is the world's leading provider of enterprise open-source software solutions, using a community-powered approach to deliver high-performing Linux, cloud, container, and Kubernetes technologies. Spread across 40+ countries, our associates work flexibly across work environments, from in-office to fully remote, depending on the requirements of their role. Red Hatters are encouraged to bring their best ideas, no matter their title or tenure. We're a leader in open source because of our open and inclusive environment. We hire creative, passionate people ready to contribute their ideas, help solve complex problems, and make an impact. Red Hat's culture is built on the open-source principles of transparency, collaboration, and inclusion, where the best ideas can come from anywhere and anyone. We empower people from different backgrounds, perspectives, and experiences to come together to share ideas, challenge the status quo, and drive innovation.
Our aspiration is that everyone experiences this culture with equal opportunity and access, and that all voices are not only heard but also celebrated. We welcome and encourage applicants from all the beautiful dimensions that compose our global village.

Posted 2 days ago

Apply

5.0 - 9.0 years

0 Lacs

Kochi, Kerala

On-site

You will be responsible for designing, developing, and optimizing ETL pipelines using PySpark on Google Cloud Platform (GCP). Your role will involve working with BigQuery, Cloud Dataflow, Cloud Composer (Apache Airflow), and Cloud Storage for data transformation and orchestration. You will need to develop and optimize Spark-based ETL processes for large-scale data processing. It will be essential to implement best practices for data governance, security, and monitoring in a cloud environment. Collaboration with data engineers, analysts, and business stakeholders to understand data requirements is a key aspect of your role. Troubleshooting performance bottlenecks and optimizing Spark jobs for efficient execution will be part of your daily tasks. Automation of data workflows using Apache Airflow or Cloud Composer is crucial. You will also be responsible for ensuring data quality, validation, and consistency across pipelines.

To excel in this role, you should have at least 5 years of experience in ETL development with a focus on PySpark. Strong hands-on experience with Google Cloud Platform (GCP) services such as BigQuery, Cloud Dataflow/Apache Beam, Cloud Composer (Apache Airflow), and Cloud Storage is required. Proficiency in Python and PySpark for big data processing is a must, and experience with data lake architectures, data warehousing concepts, and SQL for data querying and transformation is essential. Experience with CI/CD pipelines for data pipeline automation, strong debugging and problem-solving skills, and familiarity with Kafka or Pub/Sub for real-time data processing are desirable. Knowledge of Terraform for infrastructure automation on GCP, experience with containerization (Docker, Kubernetes), and familiarity with DevOps and monitoring tools like Prometheus, Stackdriver, or Datadog will be beneficial.
Your skills in GCP, PySpark, and ETL will be put to the test as you contribute to the development and optimization of data pipelines in a cloud environment.
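The posting above calls for ensuring data quality, validation, and consistency across pipelines. As a minimal illustrative sketch of such a row-level quality gate (plain Python rather than PySpark, and the field names "order_id" and "amount" are hypothetical, not from the posting):

```python
# Row-level data-quality gate of the kind an ETL pipeline applies before
# loading to a warehouse. Bad rows are quarantined instead of loaded.

def validate_record(record):
    """Return True if the record passes basic quality rules."""
    if not record.get("order_id"):  # required key must be present and non-empty
        return False
    amount = record.get("amount")
    if not isinstance(amount, (int, float)) or amount < 0:
        return False  # amount must be a non-negative number
    return True

def partition_records(records):
    """Split records into (valid, rejected) lists for load vs. quarantine."""
    valid, rejected = [], []
    for r in records:
        (valid if validate_record(r) else rejected).append(r)
    return valid, rejected

rows = [
    {"order_id": "A1", "amount": 10.5},
    {"order_id": "",   "amount": 3.0},   # missing key -> rejected
    {"order_id": "A2", "amount": -1},    # negative amount -> rejected
]
good, bad = partition_records(rows)
```

In a real PySpark job the same predicate would typically be applied with a DataFrame filter so valid and rejected rows can be written to separate destinations.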

Posted 2 days ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As a Software Engineer (Java) at SAP, you will be an integral part of the Data Foundation Services team in Business Data Cloud. Your focus will be on developing robust and scalable integration mechanisms between SAP's business applications and the unified data fabric, enabling seamless data movement and real-time interoperability across systems. Additionally, you will contribute to customer observability features that capture service-level interactions and behavioral data points across multiple applications, providing actionable insights, ensuring data quality, and enhancing user experience through real-time analytics and monitoring. Your responsibilities will include the end-to-end development of services and pipelines to support distributed data processing, data transformations, and intelligent automation. This role offers a unique opportunity to contribute to SAP's evolving data platform initiatives, utilizing your expertise in Java, Python, Kafka, DevOps, real-time analytics, intelligent monitoring, BTP, and Hyperscaler ecosystems.

Key Responsibilities:
- Develop microservices using Java, RESTful APIs, and messaging frameworks like Apache Kafka.
- Collaborate with cross-functional teams to establish secure, reliable, and performant communication across SAP applications.
- Build and maintain distributed data processing pipelines for large-scale data ingestion, transformation, and routing.
- Develop an observability framework for customer insights.
- Work closely with DevOps to enhance CI/CD pipelines, monitoring, and deployment strategies using modern GitOps practices.
- Ensure platform reliability, scalability, and security through automated testing, logging, and telemetry.
- Support cloud-native deployment of services on SAP BTP and major Hyperscalers (AWS, Azure, GCP).
- Engage in SAP's broader data platform efforts, including Datasphere, SAP Analytics Cloud, and BDC runtime architecture.
- Adhere to best practices in microservices architecture, including service discovery, load balancing, and fault tolerance.
- Stay updated with industry trends and technologies to drive continuous improvement in the development process.

Required Skills and Qualifications:
- 5+ years of hands-on experience in backend development using Java, with strong object-oriented design and integration patterns.
- Exposure to log aggregation tools like Splunk, ELK, etc.
- Proven experience with Apache Kafka or similar messaging systems in distributed environments.
- Familiarity with SAP Business Technology Platform (BTP), SAP Datasphere, SAP Analytics Cloud, or HANA is highly desirable.
- Knowledge of CI/CD pipelines, containerization (Docker), Kubernetes, and DevOps best practices.
- Working experience with Hyperscaler environments such as AWS, Azure, or GCP.
- Passion for clean code, automated testing, performance tuning, and continuous improvement.
- Strong communication skills and the ability to collaborate effectively with global teams across different time zones.

Join SAP's Business Data Cloud (BDC) organization and be part of the Foundation Services team that drives SAP's Data & AI strategy. Located in Bangalore, India, you'll work in a collaborative, inclusive, and high-impact environment, contributing to cutting-edge engineering efforts that enable innovation and integration across SAP's data platform. At SAP, we value inclusion, health, well-being, and flexible working models to ensure that every individual, regardless of background, feels included and empowered to perform at their best. We are committed to creating a diverse and equitable workplace, investing in our employees' development, and unleashing all talents to build a better world. SAP is an equal opportunity and affirmative action employer, providing accessibility accommodations to applicants with disabilities. If you require assistance during the application process, please email Careers@sap.com.
Join SAP in transforming the way businesses operate and experience growth through innovative enterprise application software and services. Be part of a purpose-driven, future-focused team that collaborates to deliver solutions that meet global challenges. At SAP, you have the opportunity to bring out your best every day.

Posted 2 days ago

Apply

3.0 - 7.0 years

0 Lacs

Navi Mumbai, Maharashtra

On-site

You should have proven experience in designing and developing large-scale, distributed backend systems and services. A strong understanding of Object-Oriented Programming (OOP) principles, design patterns, and best practices is required. You should be proficient in working with relational databases such as MySQL and PostgreSQL, as well as skilled with NoSQL databases like MongoDB and Redis. Familiarity with microservices architecture, containerization tools like Docker and Kubernetes, and experience using version control systems (e.g., Git) is essential. Knowledge of CI/CD pipelines for automated integration and deployment is also necessary. You must possess strong analytical and problem-solving skills to handle complex technical challenges and be comfortable working in Agile/Scrum development environments. Excellent interpersonal, consultative, and communication abilities are expected, and you should be self-driven, with a proactive approach and a commitment to high-quality software delivery.

Strong programming skills in Python and SQL are required, along with hands-on experience managing large datasets and optimizing data processing workflows. A deep understanding of scalability, performance optimization, and security best practices is crucial. Familiarity with cloud platforms like AWS, GCP, and Azure is desirable (preferred, not mandatory). You should be able to thrive in a fast-paced, dynamic environment with minimal supervision.

This is a full-time, permanent position with benefits including paid sick time and paid time off. The work location is in person, with a schedule consisting of day, fixed, and morning shifts. The application deadline is 08/07/2025, and the expected start date is also 08/07/2025. For further details, you can speak with the employer at +91 9137737544.
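The Python-and-SQL requirement above can be illustrated with a small, self-contained sketch using the standard library's sqlite3 module; the schema and data here are hypothetical, chosen only to show parameterized querying:

```python
import sqlite3

# In-memory relational store standing in for MySQL/PostgreSQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT, city TEXT)")
conn.executemany(
    "INSERT INTO users (name, city) VALUES (?, ?)",
    [("Asha", "Pune"), ("Ravi", "Mumbai"), ("Meera", "Pune")],
)

# Parameter binding (the ? placeholder) keeps user input out of the SQL
# text -- one of the security best practices the role calls for.
rows = conn.execute(
    "SELECT name FROM users WHERE city = ? ORDER BY name", ("Pune",)
).fetchall()
names = [r[0] for r in rows]  # -> ["Asha", "Meera"]
```

The same pattern (placeholders plus a tuple of parameters) carries over to the MySQL and PostgreSQL client libraries, only the placeholder syntax differs.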

Posted 2 days ago

Apply

3.0 - 7.0 years

0 Lacs

Punjab

On-site

As a part of Maropost, you will be contributing to a unified commerce experience that transforms ecommerce, retail, marketing automation, merchandising, helpdesk, and AI operations for fast-growing businesses. With over 5,000 global brands already benefiting from our platform, we are on a journey to empower 100,000+ brands with the same relentless focus on customer success. At Maropost, we are driven by customer obsession, extreme urgency, excellence, and resourcefulness, and we are looking for individuals who are ready to make a significant impact and be part of our transformative journey. If you are a bold thinker who thrives on change and sees opportunities in every challenge, Maropost is the place for you to turn your ideas into action.

Your responsibilities will include building and managing a REST API stack for Maropost Web Apps, working on architecture design aligned with our big data and analytics product vision, driving innovation within the engineering team, and providing technical leadership while following industry-standard best practices. You will also be involved in designing and developing complex web applications, integrating with ML and NLP engines, and handling DevOps, DBMS, and scaling on Azure or GCP.

To be successful at Maropost, you should hold a B.E./B.Tech degree and have at least 3 years of experience in building and architecting backend applications, web apps, and analytics, preferably in the commerce cloud or marketing automation domain. Your experience should also include deploying applications at scale in production systems, managing API endpoints for multimodal clients, and a strong grasp of platform security capabilities. Additionally, you should have a knack for problem-solving, efficient coding practices, and very strong interpersonal communication and collaboration skills.
If you are enthusiastic about learning, contributing to a challenging startup environment, and embodying Maropost's core values of being Customer Obsessed, having Extreme Urgency, striving for Excellence, and being Resourceful, we encourage you to join us in our mission to drive the business forward and achieve both short- and long-term goals for mutual success. Join Maropost as a builder and be a part of the team that is shaping the future of commerce and customer success. Contact us today if you are ready to make a difference and drive results that will benefit us all.

Posted 2 days ago

Apply

3.0 - 7.0 years

0 Lacs

Thiruvananthapuram, Kerala

On-site

As a Backend Developer at our company, you will be responsible for designing and developing scalable backend services and RESTful APIs using Node.js and Express.js. Your focus will be on building microservices with an emphasis on modularity, performance, and reliability. You will also be tasked with implementing containerized solutions using Docker and orchestrating them using Kubernetes or equivalent technologies. Additionally, you will integrate and deploy services on cloud platforms such as AWS, Azure, or GCP. Collaboration with front-end developers and product teams will be crucial as you work together to define APIs and application flows. Ensuring security, performance, and observability best practices across all services will be a key part of your role. You will participate in agile development, conduct code reviews, and contribute to CI/CD automation and deployment pipelines.

To be successful in this role, you should have at least 3 years of hands-on experience with Node.js and Express.js (or NestJS). A strong understanding of REST APIs, asynchronous programming, and event-driven architecture is essential. You must also possess solid experience with Docker and knowledge of Kubernetes or other orchestration tools. Practical experience deploying and managing services on cloud platforms like AWS, Azure, or GCP is required, as is proficiency with Git, Agile processes, and CI/CD tooling (such as GitHub Actions, Bitbucket, GitLab CI, Jenkins, etc.). Additionally, familiarity with databases like PostgreSQL, MongoDB, or others will be beneficial.

Bonus skills that would be nice to have include experience with React.js or Angular, familiarity with serverless architectures (e.g., AWS Lambda, Azure Functions), exposure to message queues (Kafka, RabbitMQ, etc.), and knowledge of unit testing and performance monitoring tools (e.g., Jest, Mocha, New Relic, Prometheus).
Qualifications for this role include a Bachelor's degree in Computer Science, Engineering, or equivalent practical experience. A proven track record of shipping production-ready backend services is also required. Certifications in AWS/Azure/GCP are optional but considered a significant plus. This is a full-time position with benefits including health insurance and Provident Fund. The work schedule is during the day shift, and the work location is in person.

Posted 2 days ago

Apply

3.0 - 7.0 years

0 Lacs

Haryana

On-site

As a Node.js Developer at Fitelo, a fast-growing health and wellness platform, you will play a crucial role in leading the data strategy. Collaborating with a team of innovative thinkers, front-end experts, and domain specialists, you will be responsible for designing robust architectures, implementing efficient APIs, and ensuring that our systems are both lightning-fast and rock-solid. Your role goes beyond mere coding; it involves shaping the future of health and wellness technology by crafting elegant solutions, thinking creatively, and making a significant impact on our platform.

Your responsibilities will include taking complete ownership of designing, developing, deploying, and maintaining server-side components and APIs using Node.js. You will manage the database operations lifecycle with MongoDB and PostgreSQL, collaborate with front-end developers for seamless integration, optimize application performance and scalability, and implement security protocols to safeguard data integrity. Additionally, you will oversee the entire development process, conduct code reviews, maintain documentation, research and integrate new technologies, and drive collaboration across teams to ensure successful project delivery.

The ideal candidate will have at least 3 years of experience in backend development, primarily with Node.js. Advanced proficiency in JavaScript and TypeScript, along with experience in frameworks like Express.js or Nest.js, is required. A strong understanding of asynchronous programming, event-driven architecture, SQL and NoSQL databases, RESTful APIs, GraphQL services, microservices architecture, and front-end integration is essential. Proficiency with version control tools, CI/CD pipelines, and cloud platforms, plus problem-solving, debugging, and testing-framework skills, are also key qualifications for this role.
If you are passionate about technology, enjoy crafting innovative solutions, and want to contribute to the future of health and wellness tech, we welcome you to join our team at Fitelo.

Qualifications:
- Bachelor's degree in technology

This is a full-time position with a day shift schedule, based in Gurugram.

Posted 2 days ago

Apply

4.0 - 5.0 years

0 Lacs

Greater Kolkata Area

On-site

Role: Data Integration Specialist
Experience: 4 - 5 Years
Location: India
Employment Type: Full-time

About The Role
We are looking for a highly skilled and motivated Data Integration Specialist with 4 to 5 years of hands-on experience to join our growing team in India. In this role, you will be responsible for designing, developing, implementing, and maintaining robust data pipelines and integration solutions that connect disparate systems and enable seamless data flow across the enterprise. You'll play a crucial part in ensuring data availability, quality, and consistency for various analytical and operational needs.

Key Responsibilities
- ETL/ELT Development: Design, develop, and optimize ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) processes using industry-standard tools and technologies.
- Data Pipeline Construction: Build and maintain scalable and efficient data pipelines from various source systems (databases, APIs, flat files, streaming data, cloud sources) to target data warehouses, data lakes, or analytical platforms.
- Tool Proficiency: Hands-on experience with at least one major ETL tool such as Talend, Informatica PowerCenter, SSIS, Apache NiFi, IBM DataStage, or similar platforms.
- Database Expertise: Proficient in writing and optimizing complex SQL queries across various relational databases (e.g., SQL Server, Oracle, PostgreSQL, MySQL) and NoSQL databases.
- Cloud Data Services: Experience with cloud-based data integration services on platforms like AWS (Glue, Lambda, S3, Redshift), Azure (Data Factory, Synapse Analytics), or GCP (Dataflow, BigQuery) is highly desirable.
- Scripting: Develop and maintain scripts (e.g., Python, shell scripting) for automation, data manipulation, and orchestration of data processes.
- Data Modeling: Understand and apply data modeling concepts (e.g., dimensional modeling, Kimball/Inmon methodologies) for data warehousing solutions.
- Data Quality & Governance: Implement data quality checks, validation rules, and participate in establishing data governance best practices to ensure data accuracy and reliability.
- Performance Tuning: Monitor, troubleshoot, and optimize data integration jobs and pipelines for performance, scalability, and reliability.
- Collaboration & Documentation: Work closely with data architects, data analysts, business intelligence developers, and business stakeholders to gather requirements, design solutions, and deliver data assets. Create detailed technical documentation for data flows, mappings, and transformations.
- Problem Solving: Identify and resolve complex data-related issues, ensuring data integrity and consistency.

Qualifications
- Education: Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related quantitative field.
- Experience: 4 to 5 years of dedicated experience in data integration, ETL development, or data warehousing.
- Core Skills: Strong proficiency in SQL and at least one leading ETL tool (as listed above).
- Programming: Hands-on experience with Python or shell scripting for data manipulation and automation.
- Databases: Solid understanding of relational database concepts and experience with various database systems.
- Analytical Thinking: Excellent analytical, problem-solving, and debugging skills with attention to detail.
- Communication: Strong verbal and written communication skills to articulate technical concepts to both technical and non-technical audiences.
- Collaboration: Ability to work effectively in a team environment and collaborate with cross-functional teams.

Preferred/Bonus Skills
- Experience with real-time data integration or streaming technologies (e.g., Kafka, Kinesis).
- Knowledge of Big Data technologies (e.g., Hadoop, Spark).
- Familiarity with CI/CD pipelines for data integration projects.
- Exposure to data visualization tools (e.g., Tableau, Power BI).
- Experience in specific industry domains (e.g., Finance, Healthcare, Retail).
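The ETL/ELT responsibilities listed above can be sketched, at their smallest, as an extract-transform-load pass over in-memory data. This uses only the Python standard library, and the column names are hypothetical:

```python
import csv
import io

# Extract: read from a CSV source (an in-memory file stands in for a
# flat-file or API extract).
source = io.StringIO("id,amount,currency\n1,100,usd\n2,250,eur\n")
records = list(csv.DictReader(source))

# Transform: normalize currency codes and cast amounts to integers.
for r in records:
    r["currency"] = r["currency"].upper()
    r["amount"] = int(r["amount"])

# Load: write into an in-memory target keyed by id (a real job would
# load a warehouse table or data-lake partition instead).
target = {r["id"]: r for r in records}
```

Production tools like Talend or Glue wrap the same three phases in connectors, scheduling, and monitoring; the sketch only shows the shape of the work.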

Posted 2 days ago

Apply

6.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

At PwC, our people in integration and platform architecture focus on designing and implementing seamless integration solutions and robust platform architectures for clients. They enable efficient data flow and optimise technology infrastructure for enhanced business performance. Those in integration architecture at PwC will focus on designing and implementing seamless integration solutions to connect various organisational systems and applications. Your work will involve creating robust architectures that enable efficient data flow and enhance overall business processes.

Focused on relationships, you are building meaningful client connections, and learning how to manage and inspire others. Navigating increasingly complex situations, you are growing your personal brand, deepening technical expertise and awareness of your strengths. You are expected to anticipate the needs of your teams and clients, and to deliver quality. Embracing increased ambiguity, you are comfortable when the path forward isn't clear, you ask questions, and you use these moments as opportunities to grow.

Skills
Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to:
- Respond effectively to the diverse perspectives, needs, and feelings of others.
- Use a broad range of tools, methodologies and techniques to generate new ideas and solve problems.
- Use critical thinking to break down complex concepts.
- Understand the broader objectives of your project or role and how your work fits into the overall strategy.
- Develop a deeper understanding of the business context and how it is changing.
- Use reflection to develop self-awareness, enhance strengths and address development areas.
- Interpret data to inform insights and recommendations.
- Uphold and reinforce professional and technical standards (e.g. refer to specific PwC tax and audit guidance), the Firm's code of conduct, and independence requirements.
Additional Job Description
Years of Experience: 6+ years of experience
Educational Qualifications: BE / B Tech / MCA / M Tech
Certifications: MCD-L1 (Must), MCD-L2 (Preferred), MCIA (Preferred), MCPA (Preferred), Workato Automation Pro I, II & III (Preferred), Azure Certifications for Integrations (Preferred); any additional certifications on integration platforms are an added advantage.

About Us
PricewaterhouseCoopers Acceleration Center (PwC AC) is the natural extension of PwC's leading-class global delivery capabilities. Our highly skilled resources assist with software development, ERP programming development, application integration, and support and maintenance services. Bangalore AC provides premium, cost-effective, high-quality technology services for projects based in the United States and global clients, focused on key horizontal and vertical end-to-end solutions.

Roles And Responsibilities
At PwC AC, as a Senior Integration Developer, the candidate will interact with offshore/onshore managers, architects, developers, and business analysts to understand the requirements, and is responsible for design and development using MuleSoft or other integration platforms such as Workato, Kong, or Azure Integration Services. The candidate should be able to design RAML, implement using Anypoint Studio, perform mappings and transformations, and deploy to the Anypoint Platform (CloudHub/on-prem/hybrid) manually or using CI/CD tools. Experience in leading and providing technical assistance to the team, performing design and code reviews, and meeting critical deadlines is expected.

Mandatory Skills
- Experience working with the latest MuleSoft versions (Mule 4 preferred).
- Experience/knowledge of other ESB tools such as Workato, Azure, Kong, Apigee, Boomi, Tibco, etc.
- Experience in designing RAML/OAS in the Design Center.
- Develop integration solutions using MuleSoft Anypoint Platform, including creating APIs, connectors, data mappings, and orchestrations.
- Develop flows using Anypoint Studio to leading industry standards; follow the API-led connectivity approach.
- Should be able to work with responses of various types, including XML, JSON, flat file, etc.
- Should be able to use the best-fit integration patterns according to the use case.
- Experience with source control systems (SVN/Git).
- Experience with Waterfall and Agile SDLC models (Scrum).
- Experience with SOAP and RESTful web services is a must.
- Should be able to work with all platform features, including API Manager, Runtime Manager, and Access Management.
- Should be able to fine-tune performance in case of bottlenecks.
- Should be able to monitor applications' core/vCore/memory usage.
- Should be able to develop reusable assets as deemed fit.
- Should be able to transform data from one format to another using DataWeave.
- Should be able to design schemas for app/API communication.
- Should have exposure to messaging queues such as ActiveMQ, RabbitMQ, Kafka, Anypoint MQ, etc.
- Should have a good understanding of data quality processes, methods, and the project lifecycle.
- Should have experience in writing SQL queries.
- Should have experience in working on technical design documents (HLD/LLD).
- Should be proactive in communications and lead by self.
- Exposure to multiple connectors/systems such as Salesforce, SAP, NetSuite, etc.
- Write MUnits for all the flows across various scenarios, and prepare the unit test document.
- Perform integration testing and prepare the end-to-end testing document.
- Should have development experience with both real-time and batch integrations.
- Should have knowledge of Java, microservices, and the Spring and Hibernate frameworks.
- Should follow the best coding standards and processes defined for the project, and recommend improvements to the defined processes if any.
- Follow the defined code migration strategy to higher environments.
- Provide technical leadership and guidance to junior developers on best practices for development and integration design.
Participate in code reviews, ensuring adherence to coding standards, best practices, and established guidelines.

Preferred Knowledge

Experience with cloud data technologies such as AWS, Azure, GCP, and databases.
Workato expertise: proven hands-on experience in designing and implementing simple to complex Workato recipes and integrations.
Workato recipe design and implementation: create and optimize complex Workato recipes to automate business processes and integrations.
Strong understanding of integration concepts, including REST APIs, webhooks, and middleware solutions.
Proficiency in JavaScript, Python, C#, or similar scripting languages for custom script development within Workato, Azure, or other iPaaS tools.
Data management: expertise in data mapping, transformation, and real-time synchronization.
Error handling: experience implementing comprehensive error handling and logging mechanisms.
Understanding of Workato frameworks such as AOF and related Workato concepts.
Experience with or exposure to integration design and patterns using APIs, webhooks, and pub/sub.
Worked in offshore/onsite engagements.

Preferred Skills

Good analytical and problem-solving skills.
Good communication and presentation skills.
Ability to work collaboratively in a team environment and engage with cross-functional teams.

Additional Information

Travel Requirements: Travel to client locations may be required as per project requirements.
Designation: Senior Associate
Location: Bangalore, India
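The mapping and transformation work described above would normally be expressed in DataWeave inside a Mule flow; as a language-neutral illustration only, the sketch below performs the same kind of JSON-to-XML payload mapping in Python using the standard library. All payload field and element names here are hypothetical.

```python
import json
import xml.etree.ElementTree as ET

def order_json_to_xml(payload: str) -> str:
    """Map a JSON order payload to the XML shape a downstream system expects.

    Illustrative only: a real Mule flow would express this mapping in
    DataWeave, and the field names are invented for the example.
    """
    order = json.loads(payload)
    root = ET.Element("Order", id=str(order["orderId"]))
    customer = ET.SubElement(root, "Customer")
    customer.text = order["customer"]["name"]
    items = ET.SubElement(root, "Items")
    for line in order["items"]:
        item = ET.SubElement(items, "Item", sku=line["sku"])
        item.text = str(line["qty"])
    return ET.tostring(root, encoding="unicode")

payload = json.dumps({
    "orderId": 42,
    "customer": {"name": "Acme"},
    "items": [{"sku": "A-1", "qty": 2}],
})
xml_out = order_json_to_xml(payload)
```

The same shape of transform (parse, restructure, serialize) underlies most of the XML/JSON/flat-file work the posting lists.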

Posted 2 days ago

Apply

10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

At PwC, our people in managed services focus on a variety of outsourced solutions and support clients across numerous functions. These individuals help organisations streamline their operations, reduce costs, and improve efficiency by managing key processes and functions on their behalf. They are skilled in project management, technology, and process optimization to deliver high-quality services to clients. Those in managed service management and strategy at PwC focus on transitioning and running services, along with managing delivery teams, programmes, commercials, performance, and delivery risk. Your work will involve the continuous improvement and optimisation of the managed services process, tools, and services.

Enhancing your leadership style, you motivate, develop, and inspire others to deliver quality. You are responsible for coaching, leveraging team members' unique strengths, and managing performance to deliver on client expectations. With your growing knowledge of how business works, you play an important role in identifying opportunities that contribute to the success of our Firm. You are expected to lead with integrity and authenticity, articulating our purpose and values in a meaningful way. You embrace technology and innovation to enhance your delivery and encourage others to do the same.

Skills

Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include, but are not limited to:
Analyse and identify the linkages and interactions between the component parts of an entire system.
Take ownership of projects, ensuring their successful planning, budgeting, execution, and completion.
Partner with team leadership to ensure collective ownership of quality, timelines, and deliverables.
Develop skills outside your comfort zone, and encourage others to do the same.
Effectively mentor others.
Use the review of work as an opportunity to deepen the expertise of team members.
Address conflicts or issues, engaging in difficult conversations with clients, team members, and other stakeholders, escalating where appropriate.
Uphold and reinforce professional and technical standards (e.g. refer to specific PwC tax and audit guidance), the Firm's code of conduct, and independence requirements.

Job Summary

Minimum Degree Required: Bachelor's Degree in Computer Science/IT or a relevant field
Degree Preferred: Bachelor's Degree or higher
Minimum Years of Experience: 10 year(s)
Certifications Required: XXX
Certifications Preferred: PMP certification or Certified Scrum Master

Required Knowledge/Skills

The Data & Analytics manager's primary responsibility is to manage and lead projects and activities related to building the modern data ecosystem while working with IT and the business to convert insights into strategic opportunities. We are in search of passionate, motivated, and creative Data & Analytics managers to join our Managed Services team to lead and manage a team of data architects, data technology engineers, data scientists, and reporting and BI experts, driving business outcomes through analytics interventions. The candidate is expected to demonstrate extensive knowledge and/or a proven record of success in the following areas:
Minimum 10 years of experience in data and analytics management, with at least 6 years of experience leading large data and analytics programs/initiatives.
Minimum 3 years' experience managing and delivering managed data and analytics projects/engagements; Managed Services/Operate/Production Support experience is a must.
Oversee the delivery of data analytics engagements: development and maintenance of data pipelines, data models, and analytics platforms.
Lead the implementation of analytic solutions, including dashboards, reports, and predictive models.
Strong understanding of service delivery and support models.
Extensive experience with ITIL processes such as incident management, problem management, knowledge management, release management, data DevOps, etc.
Monitor and optimize the performance of the analytics platforms, ensuring all SLAs are consistently met or exceeded.
Provide technical insights and recommendations to stakeholders, delivering consistently high-quality support.
Collaborate with cross-functional teams to address client requirements and resolve technical issues.
Contribute to the development of service offerings and strategic initiatives for data and analytics managed services.
Identify opportunities for expanding services and enhancing client value.
Lead and drive estimation and sizing efforts related to data analytics work; plan and build teams utilizing an onsite/offshore mix and deployment models for efficient delivery of solutions.
Excellent communication, problem-solving, quantitative, and analytical abilities; confident in decision making, with the ability to explain processes or choices as needed.
Perform as a team leader by creating a positive environment, developing team members through coaching and mentoring, and shaping the next generation of data analytics leaders.
Effectively monitor the workloads of the team while meeting client expectations and respecting the work-life quality of team members.
Recruit, train, develop, mentor, and supervise team members.

Preferred Knowledge/Skills

The candidate should have good experience working in a managed services organization, a passion for leading multiple data and analytics projects, and proven experience in the following key areas/activities:
8+ years of experience in data analytics with a focus on solution design and technical delivery.
Proven experience in managed services or a similar operational support role.
Proficiency in data visualization tools (e.g. Tableau, Power BI).
Hands-on experience with ETL, ELT, data warehousing, and cloud platforms (AWS, Azure, GCP, etc.).
Strong knowledge of SQL or other relevant programming languages such as Python.
Strong problem-solving and analytical abilities.
Ability to lead and work in collaborative, fast-paced environments.
Flexibility to work across time zones and support global clients.
Experience with sales and estimations is a plus.
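As a minimal illustration of the SLA-monitoring responsibility mentioned above, the Python sketch below computes SLA attainment over a set of incident records. The incident data and per-priority resolution targets are invented for the example; in a real engagement the targets come from the client contract.

```python
from datetime import datetime, timedelta

# Hypothetical incident records: (opened, resolved, priority).
incidents = [
    (datetime(2024, 1, 1, 9), datetime(2024, 1, 1, 11), "P1"),
    (datetime(2024, 1, 2, 9), datetime(2024, 1, 2, 18), "P2"),
    (datetime(2024, 1, 3, 9), datetime(2024, 1, 3, 10), "P1"),
]

# Assumed SLA resolution targets per priority (illustrative values).
sla = {"P1": timedelta(hours=4), "P2": timedelta(hours=8)}

def sla_attainment(records):
    """Fraction of incidents resolved within their priority's SLA target."""
    met = sum(1 for opened, resolved, prio in records
              if resolved - opened <= sla[prio])
    return met / len(records)

rate = sla_attainment(incidents)
```

In practice this calculation would run over the incident-management system's export and be broken down per client, per priority, and per reporting period.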

Posted 2 days ago

Apply

5.0 - 8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Description

We are seeking a high-impact AI/ML Engineer to lead the design, development, and deployment of machine learning and AI solutions across vision, audio, and language modalities. You'll be part of a fast-paced, outcome-oriented AI & Analytics team, working alongside data scientists, engineers, and product leaders to transform business use cases into real-time, scalable AI systems. This role demands strong technical leadership, a product mindset, and hands-on expertise in computer vision, audio intelligence, and deep learning.

Responsibilities:

Architect, develop, and deploy ML models for multimodal problems, including vision (image/video), audio (speech/sound), and NLP tasks.
Own the complete ML lifecycle: data ingestion, model development, experimentation, evaluation, deployment, and monitoring.
Leverage transfer learning, foundation models, or self-supervised approaches where suitable.
Design and implement scalable training pipelines and inference APIs using frameworks like PyTorch or TensorFlow.
Collaborate with MLOps, data engineering, and DevOps teams to productionize models using Docker, Kubernetes, or serverless infrastructure.
Continuously monitor model performance and implement retraining workflows to ensure accuracy over time.
Stay ahead of the curve on cutting-edge AI research (e.g., generative AI, video understanding, audio embeddings) and incorporate innovations into production systems.
Write clean, well-documented, and reusable code to support agile experimentation and long-term platform development.

Qualifications:

Bachelor's or Master's degree in Computer Science, Artificial Intelligence, Data Science, or a related field.
5-8 years of experience in AI/ML engineering, with at least 3 years in applied deep learning.

Technical Skills

Languages: expert in Python; good knowledge of R or Java is a plus.
ML/DL Frameworks: proficient with PyTorch, TensorFlow, Scikit-learn, ONNX.
Computer Vision: image classification, object detection, OCR, segmentation, tracking (YOLO, Detectron2, OpenCV, MediaPipe).
Audio AI: speech recognition (ASR), sound classification, audio embedding models (Wav2Vec2, Whisper, etc.).
Data Engineering: strong with Pandas, NumPy, SQL, and preprocessing pipelines for structured and unstructured data.
NLP/LLMs: working knowledge of Transformers, BERT/LLaMA, and the Hugging Face ecosystem is preferred.
Cloud & MLOps: experience with AWS/GCP/Azure, MLflow, SageMaker, Vertex AI, or Azure ML.
Deployment & Infrastructure: experience with Docker, Kubernetes, REST APIs, and serverless ML inference.
CI/CD & Version Control: Git, DVC, ML pipelines, Jenkins, Airflow, etc.

Soft Skills & Competencies

Strong analytical and systems thinking; able to break down business problems into ML components.
Excellent communication skills; able to explain models, results, and decisions to non-technical stakeholders.
Proven ability to work cross-functionally with designers, engineers, product managers, and analysts.
Demonstrated bias for action, rapid experimentation, and iterative delivery of impact.

(ref:hirist.tech)
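As a small illustration of the audio-embedding skill listed above: clips are typically compared by the cosine similarity of their embedding vectors, with values near 1.0 indicating acoustically similar audio. The three-dimensional vectors below are toy stand-ins for real Wav2Vec2- or Whisper-style embeddings, which have hundreds of dimensions.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy embeddings: clip_b is a near-duplicate of clip_a, clip_c is unrelated.
clip_a = [0.2, 0.7, 0.1]
clip_b = [0.21, 0.69, 0.12]
clip_c = [-0.6, 0.1, 0.8]

sim_ab = cosine_similarity(clip_a, clip_b)  # close to 1.0
sim_ac = cosine_similarity(clip_a, clip_c)  # much lower
```

The same similarity measure underlies speaker matching, sound-event retrieval, and deduplication tasks mentioned in the role.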

Posted 2 days ago

Apply



5.0 - 9.0 years

0 Lacs

noida, uttar pradesh

On-site

You will be joining an emerging software development and IT services company at the forefront of enterprise digital transformation. As a Full Stack Lead with over 8 years of experience, you will play a crucial role in architecting scalable applications while leading and mentoring a team of developers. Your responsibilities will include making technical decisions, ensuring timely project delivery, and collaborating with cross-functional teams to develop cutting-edge web applications.

On the backend, you must have a strong foundation in Java, Spring Boot, REST APIs, and SQL/NoSQL databases. You should possess at least 5 years of experience in Java, Spring Boot, and microservices architecture, along with a deep understanding of REST APIs and databases. Proficiency in build tools like Maven/Gradle and version control systems like Git is essential.

On the frontend, expertise in Angular 8+, TypeScript, RxJS, and NgRx is required. You should have at least 3 years of experience working with these technologies and a track record of building responsive UIs and reusable component libraries.

While not mandatory, experience with AWS/Azure/GCP, Docker/Kubernetes, and CI/CD pipelines would be advantageous. Familiarity with Agile/Scrum methodologies and the ability to thrive in fast-paced environments are also desirable.

This is a full-time position with office timings from 10:00 AM to 7:00 PM, based in Agra, Noida, or Hyderabad. The work mode is in-office with a 5-day working week. We are looking for candidates who can join immediately or within a notice period of 0-30 days.

If you meet the requirements and are ready to take on this hands-on leadership role, please share your resume with us at client@asajsolutions.com. Feel free to spread the word within your network as well. This is an excellent opportunity for an experienced Full Stack Lead to make a significant impact in a dynamic and innovative environment.

Posted 2 days ago

Apply

4.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In data analysis at PwC, you will focus on utilising advanced analytical techniques to extract insights from large datasets and drive data-driven decision-making. You will leverage skills in data manipulation, visualisation, and statistical modelling to support clients in solving complex business problems.

Focused on relationships, you are building meaningful client connections, and learning how to manage and inspire others. Navigating increasingly complex situations, you are growing your personal brand, deepening technical expertise and awareness of your strengths. You are expected to anticipate the needs of your teams and clients, and to deliver quality. Embracing increased ambiguity, you are comfortable when the path forward isn't clear, you ask questions, and you use these moments as opportunities to grow.

Skills

Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include, but are not limited to:
Respond effectively to the diverse perspectives, needs, and feelings of others.
Use a broad range of tools, methodologies and techniques to generate new ideas and solve problems.
Use critical thinking to break down complex concepts.
Understand the broader objectives of your project or role and how your work fits into the overall strategy.
Develop a deeper understanding of the business context and how it is changing.
Use reflection to develop self-awareness, enhance strengths and address development areas.
Interpret data to inform insights and recommendations.
Uphold and reinforce professional and technical standards (e.g. refer to specific PwC tax and audit guidance), the Firm's code of conduct, and independence requirements.
Years of Experience: Candidates with 4+ years of hands-on experience
Position: Senior Associate
Industry: Telecom / Network Analytics / Customer Analytics

Required Skills

Successful candidates will have demonstrated the following skills and characteristics:

Must Have
Proven experience with telco data, including call detail records (CDRs), customer churn models, and network analytics.
Deep understanding of predictive modeling for customer lifetime value and usage behavior.
Experience working with telco clients or telco data platforms (e.g., Amdocs, Ericsson, Nokia, AT&T).
Proficiency in machine learning techniques, including classification, regression, clustering, and time-series forecasting.
Strong command of statistical techniques (e.g., logistic regression, hypothesis testing, segmentation models).
Strong programming in Python or R, and SQL, with telco-focused data wrangling.
Exposure to big data technologies used in telco environments (e.g., Hadoop, Spark).
Experience working in the telecom industry across domains such as customer churn prediction, ARPU modeling, pricing optimization, and network performance analytics.
Strong communication skills to interface with technical and business teams.

Nice To Have
Exposure to cloud platforms (Azure ML, AWS SageMaker, GCP Vertex AI).
Experience working with telecom OSS/BSS systems or customer segmentation tools.
Familiarity with network performance analytics, anomaly detection, or real-time data processing.
Strong client communication and presentation skills.

Roles And Responsibilities
Assist on analytics projects within the telecom domain, driving the design, development, and delivery of data science solutions.
Develop and execute project and analysis plans under the guidance of the Project Manager.
Interact with and advise consultants/clients in the US as a subject matter expert to formalize the data sources to be used, the datasets to be acquired, and the data and use-case clarifications needed to get a strong hold on the data and the business problem to be solved.
Drive and conduct analysis using advanced analytics tools and coach junior team members.
Implement the necessary quality control measures to ensure deliverable integrity, such as data quality, model robustness, and explainability for deployments.
Validate analysis outcomes and recommendations with all stakeholders, including the client team.
Build storylines and make presentations to the client team and/or the PwC project leadership team.
Contribute to knowledge- and firm-building activities.

Professional And Educational Background
BE / B.Tech / MCA / M.Sc / M.E / M.Tech / Master's Degree / MBA from a reputed institute
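To illustrate the churn-prediction work described above, here is a minimal logistic-scoring sketch in plain Python. The feature names and coefficients are invented for the example; in practice they would be fitted (e.g. with scikit-learn's LogisticRegression) on CDR-derived features such as usage decline and support-call counts.

```python
import math

# Hypothetical coefficients from a fitted churn model (illustrative only).
INTERCEPT = -2.0
COEF = {"usage_decline_pct": 0.04, "support_calls": 0.6}

def churn_probability(features):
    """Score a subscriber's churn risk with a logistic (sigmoid) function."""
    z = INTERCEPT + sum(COEF[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))

# A stable subscriber vs. one with falling usage and repeated support calls.
low_risk = churn_probability({"usage_decline_pct": 5, "support_calls": 0})
high_risk = churn_probability({"usage_decline_pct": 50, "support_calls": 4})
```

Scores like these typically feed retention campaigns, with thresholds chosen from the model's precision/recall trade-off rather than fixed in code.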

Posted 2 days ago

Apply

4.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In data analysis at PwC, you will focus on utilising advanced analytical techniques to extract insights from large datasets and drive data-driven decision-making. You will leverage skills in data manipulation, visualisation, and statistical modelling to support clients in solving complex business problems.

PwC US - Acceleration Center is seeking candidates with a strong analytical background to work in our Analytics Consulting practice. Senior Associates will work as an integral part of business analytics teams in India alongside clients and consultants in the U.S., leading teams for high-end analytics consulting engagements and providing business recommendations to project teams.

Years of Experience: Candidates with 4+ years of hands-on experience

Must Have
Experience building ML models in cloud environments (at least one of Azure ML, GCP's Vertex AI platform, or AWS SageMaker).
Knowledge of predictive/prescriptive analytics, especially the use of log-log, log-linear, and Bayesian regression techniques, including machine learning algorithms (supervised and unsupervised), deep learning algorithms, and artificial neural networks.
Good knowledge of statistics, e.g., statistical tests and distributions.
Experience in data analysis, e.g., data cleansing, standardization, and data preparation for machine learning use cases.
Experience with machine learning frameworks and tools (e.g., scikit-learn, mlr, caret, H2O, TensorFlow, PyTorch, MLlib).
Advanced-level programming in SQL or Python/PySpark.
Expertise with visualization tools, e.g., Tableau, Power BI, AWS QuickSight, etc.

Nice To Have
Working knowledge of containerization (e.g., AWS EKS, Kubernetes), Docker, and data pipeline orchestration (e.g., Airflow).
Good communication and presentation skills.

Roles And Responsibilities
Develop and execute project and analysis plans under the guidance of the Project Manager.
Interact with and advise consultants/clients in the US as a subject matter expert to formalize the data sources to be used, the datasets to be acquired, and the data and use-case clarifications needed to get a strong hold on the data and the business problem to be solved.
Drive and conduct analysis using advanced analytics tools and coach junior team members.
Implement the necessary quality control measures to ensure deliverable integrity.
Validate analysis outcomes and recommendations with all stakeholders, including the client team.
Build storylines and make presentations to the client team and/or the PwC project leadership team.
Contribute to knowledge- and firm-building activities.

Professional And Educational Background
Any graduate / BE / B.Tech / MCA / M.Sc / M.E / M.Tech / Master's Degree / MBA

Posted 2 days ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description

Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Software Engineer.

In this role, you will:
Perform system development work as an ETL developer, drawing on a background in various data models and data warehousing concepts
Investigate live systems faults, diagnose problems, and propose and provide solutions
Report progress as required and advise of problems in good time
Convert requirements into sustainable technical solutions through coding best practices
Write, analyse, review, and rewrite programs to departmental and Group standards
Update programs to increase operating efficiency or adapt to new requirements
Review code from team members (Analyst/Developers) as part of the quality assurance process
Produce unit test plans with detailed expected results to fully exercise the code

Requirements

To be successful in this role, you should meet the following requirements:
Bachelor's degree or international equivalent
Excellent written and verbal communication skills; presentation skills preferred
Strong prioritisation and time management skills
Minimum 3 years of work experience, including experience in the financial domain (banking)
Good knowledge of and experience with DataStage, Linux, and conventional database technologies
Knowledge of and experience with GCP and cloud technologies, along with Python
Self-motivation, focus, attention to detail, and the ability to work efficiently to deadlines are essential
Ability to work with a degree of autonomy while also working in a collaborative team environment
Experience in designing, developing, and implementing Business Intelligence tools
Strong knowledge of scheduling tools such as Control-M

You'll achieve more when you join HSBC. www.hsbc.com/careers

HSBC is committed to building a culture where all employees are valued and respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.

Issued by – HSBC Software Development India

Posted 2 days ago

Apply

5.0 years

0 Lacs

Andhra Pradesh, India

On-site

At PwC, our people in infrastructure focus on designing and implementing robust, secure IT systems that support business operations. They enable the smooth functioning of networks, servers, and data centres to optimise performance and minimise downtime. Those in cloud operations at PwC will focus on managing and optimising cloud infrastructure and services to enable seamless operations and high availability for clients. You will be responsible for monitoring, troubleshooting, and implementing industry-leading practices for cloud-based systems.

Focused on relationships, you are building meaningful client connections, and learning how to manage and inspire others. Navigating increasingly complex situations, you are growing your personal brand, deepening technical expertise and awareness of your strengths. You are expected to anticipate the needs of your teams and clients, and to deliver quality. Embracing increased ambiguity, you are comfortable when the path forward isn't clear, you ask questions, and you use these moments as opportunities to grow.

Skills

Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to:
Respond effectively to the diverse perspectives, needs, and feelings of others
Use a broad range of tools, methodologies and techniques to generate new ideas and solve problems
Use critical thinking to break down complex concepts
Understand the broader objectives of your project or role and how your work fits into the overall strategy
Develop a deeper understanding of the business context and how it is changing
Use reflection to develop self-awareness, enhance strengths and address development areas
Interpret data to inform insights and recommendations
Uphold and reinforce professional and technical standards (e.g. refer to specific PwC tax and audit guidance), the Firm's code of conduct, and independence requirements
Job Title: Site Reliability Engineer (SRE) – Senior Associate
Location: Bangalore (Hybrid)
Department: Managed Services – Core Automation Team

Job Overview

We're seeking a Senior Associate with deep hands-on experience in scripting, automation, and RPA to help build intelligent, resilient systems across Managed Services. You'll work at the intersection of platform reliability and automation, developing scripts, automating runbooks, and integrating low-code/no-code solutions to eliminate manual work and improve operational efficiency. This role is ideal for someone who thrives on solving real-world production challenges with code, automation, and curiosity.

Key Responsibilities:
Automate repetitive infrastructure and application support activities using scripting (Python, Bash, PowerShell) and RPA/low-code platforms
Develop and maintain scripts and reusable components to drive system configuration, monitoring, and auto-remediation
Build self-healing workflows that identify and resolve issues proactively, minimizing human intervention
Integrate observability and alerting tools with automation pipelines to enable real-time anomaly detection and resolution
Leverage low-code/no-code automation platforms (e.g., Power Automate, UiPath, Automation Anywhere) to streamline manual business processes
Collaborate with operations, engineering, and platform teams to build reliable automation frameworks and support scaled delivery
Use GenAI and AI-driven tools to enhance decision automation and support proactive operations management
Create and maintain runbooks and documentation that evolve into automation-first playbooks
Continuously analyze operational inefficiencies and develop automation to close gaps

Required Skills And Qualifications:
5+ years of hands-on experience in Site Reliability Engineering, Automation Engineering, or RPA roles
Strong scripting proficiency in Python, Bash, and PowerShell for infrastructure and application automation
Practical experience with low-code/no-code platforms and RPA tools such as UiPath, Power Automate, Automation Anywhere, or similar
Solid understanding of automation across monitoring, alerting, configuration management, and incident response
Exposure to log aggregation tools (e.g., Elastic Stack, Splunk) for troubleshooting and automation triggers
Experience building self-healing systems and integrating with event-based automation platforms
Familiarity with cloud environments (AWS, Azure, GCP) and integrating automation across hybrid infrastructure
Experience applying GenAI/AI-driven solutions to automate operations and support predictive monitoring
Strong analytical and root cause analysis skills for solving recurring issues via automation
Ability to work independently and collaborate effectively in cross-functional teams

Desired Skills And Qualifications:
Experience working in a Managed Services or enterprise support environment with a focus on automation maturity
Understanding of ITIL/ITSM processes and how automation can improve service quality and consistency
Exposure to containerized environments (e.g., Docker, Kubernetes) and automation of application deployments
Experience with observability platforms like Datadog, Prometheus, or AppDynamics is a plus
Strong communication and stakeholder engagement skills to align automation initiatives with business needs

Education Requirements:
Bachelor's degree in Computer Science, IT, Engineering, or a related technical field
Certifications in RPA platforms, cloud technologies, or scripting/automation tools are a plus

Posted 2 days ago

Apply

4.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In data analysis at PwC, you will focus on utilising advanced analytical techniques to extract insights from large datasets and drive data-driven decision-making. You will leverage skills in data manipulation, visualisation, and statistical modelling to support clients in solving complex business problems.

PwC US - Acceleration Center is seeking candidates with a strong analytical background to work in our Analytics Consulting practice. Senior Associates will work as an integral part of business analytics teams in India alongside clients and consultants in the U.S., leading teams for high-end analytics consulting engagements and providing business recommendations to project teams.

Years of Experience: Candidates with 4+ years of hands-on experience

Must Have:
Experience in building ML models in cloud environments (at least one of Azure ML, GCP's Vertex AI platform, or AWS SageMaker)
Knowledge of predictive/prescriptive analytics, especially the use of Log-Log, Log-Linear, and Bayesian regression techniques, as well as machine learning algorithms (supervised and unsupervised), deep learning algorithms, and artificial neural networks
Good knowledge of statistics, e.g. statistical tests and distributions
Experience in data analysis, e.g. data cleansing, standardisation, and data preparation for machine learning use cases
Experience in machine learning frameworks and tools (e.g. scikit-learn, mlr, caret, H2O, TensorFlow, PyTorch, MLlib)
Advanced-level programming in SQL or Python/PySpark
Expertise with visualisation tools, e.g. Tableau, Power BI, AWS QuickSight

Nice To Have:
Working knowledge of containerisation (e.g. AWS EKS, Kubernetes), Docker, and data pipeline orchestration (e.g. Airflow)
Good communication and presentation skills

Roles And Responsibilities:
Develop and execute project and analysis plans under the guidance of the Project Manager
Interact with and advise consultants/clients in the US as a subject matter expert to formalise the data sources to be used, the datasets to be acquired, and the data and use-case clarifications needed to get a strong hold on the data and the business problem to be solved
Drive and conduct analysis using advanced analytics tools and coach junior team members
Implement the necessary quality control measures to ensure deliverable integrity
Validate analysis outcomes and recommendations with all stakeholders, including the client team
Build storylines and make presentations to the client team and/or the PwC project leadership team
Contribute to knowledge- and firm-building activities

Professional And Educational Background: Any graduate / BE / B.Tech / MCA / M.Sc / M.E / M.Tech / Master's Degree / MBA

Posted 2 days ago

Apply

4.0 - 6.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Summary

We are hiring a Senior DevOps Engineer with 4-6 years of experience to join our growing engineering team. The ideal candidate is proficient in AWS and Azure, has a solid development background (Python preferred), and demonstrates strong experience in infrastructure design, automation, and DevOps; exposure to GCP is a plus. You will be responsible for building, managing, and optimizing robust, secure, and scalable infrastructure solutions from scratch.

Key Responsibilities:
Design and implement cloud infrastructure using AWS, Azure, and optionally GCP
Build and manage Infrastructure-as-Code using Terraform
Develop and maintain CI/CD pipelines using tools such as GitHub Actions, Jenkins, or GitLab CI
Deploy and manage containerized applications using Docker and Kubernetes (EKS/AKS)
Set up and manage Kafka for distributed streaming and event processing
Build monitoring, logging, and alerting solutions using tools like Prometheus, Grafana, ELK, CloudWatch, and Azure Monitor
Ensure cost optimization and security best practices across all cloud environments
Collaborate with developers to debug application issues and improve system performance
Lead infrastructure architecture discussions and implement scalable, resilient solutions
Automate operational processes and drive DevOps culture and best practices across teams

Required Skills:
4-6 years of hands-on experience in DevOps/Site Reliability Engineering
Strong experience in multi-cloud environments (AWS + Azure); GCP exposure is a bonus
Proficiency in Terraform for IaC; experience with ARM Templates or CloudFormation is a plus
Solid experience with Kubernetes (EKS and AKS) and container orchestration
Proficiency in Docker and container lifecycle management
Hands-on experience with Kafka (setup, scaling, and monitoring)
Experience implementing monitoring, logging, and alerting solutions
Expertise in cloud security, IAM, RBAC, and cost optimization
Development experience in Python or any backend language
Excellent problem-solving and troubleshooting skills

Nice To Have:
Certifications: AWS DevOps Engineer, Azure DevOps Engineer, CKA/CKAD
Experience with GitOps, Helm, and service mesh
Familiarity with serverless architecture and event-driven systems

Education: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. (ref:hirist.tech)

Posted 2 days ago

Apply

10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

We are a fast-growing, product-led fintech company on a mission to transform the financial ecosystem using cutting-edge technology, user-centric design, and deep domain expertise. We are looking for an experienced Engineering Manager to lead and scale our MERN stack development team while building high-performance, scalable, and secure financial applications.

Key Responsibilities

Team Leadership & Management:
Lead, mentor, and manage a team of 6-10 full-stack engineers (MERN stack)
Drive performance management, team development, and career progression
Collaborate with product managers, designers, and QA to ensure timely delivery of high-quality features

Technical Ownership:
Own the architecture, design, and implementation of core fintech product modules using MongoDB, Express.js, React.js, and Node.js
Review and enforce coding standards, CI/CD pipelines, and software quality best practices
Ensure system scalability, security, performance, and availability

Project Delivery:
Translate business requirements into scalable tech solutions
Ensure sprint planning, estimation, execution, and stakeholder communication
Proactively identify risks and bottlenecks and implement mitigations

Product & Innovation Focus:
Work closely with leadership on technology strategy and product roadmaps
Foster a culture of innovation, continuous learning, and engineering excellence

Requirements:
10+ years of software development experience, with at least 3 years in team leadership roles
Proven track record of working in product-based fintech companies
Deep hands-on experience in the MERN stack (MongoDB, Express.js, React.js, Node.js)
Strong understanding of cloud-native architectures (AWS/GCP/Azure)
Proficiency in REST APIs, microservices, Docker, and container orchestration
Exposure to security best practices in fintech (e.g., data encryption, secure auth flows)
Strong debugging, optimization, and analytical skills
Excellent communication, interpersonal, and stakeholder management skills
Ability to work in a fast-paced, startup-like environment with a product ownership mindset

Nice To Have:
Experience with TypeScript, GraphQL, or WebSockets
Exposure to DevOps practices and observability tools
Prior experience building lending, payments, or investment platforms

What You Can Expect In Return:
ESOPs based on performance
Health insurance
Statutory benefits such as PF and Gratuity
Flexible working structure
Professional development opportunities
Collaborative and inclusive work culture

About The Company

EduFund is an early-stage platform that helps Indian parents plan for their child's higher education in advance. Our unique technology is inspired by years of asset management experience as well as personal experience in funding higher education. The EduFund team is filled with chai lovers, problem solvers, ridiculous jokes, and immeasurable passion for our work. Our founding team has had the privilege of working at companies like Reliance, Goldman Sachs, CRISIL, InCred, Upstoxx, LeverageEdu, HDFC, and many others. We have raised a seed round from notable investors such as Anchorage Capital Partners, ViewTrade, AngelList and other angels. We are headquartered in Ahmedabad, with teams in Mumbai and Pune. (ref:hirist.tech)

Posted 2 days ago

Apply

4.0 - 8.0 years

0 Lacs

haryana

On-site

As a Talent Acquisition Executive/Lead at Saarthee, you will play a pivotal role in driving talent acquisition strategies to support the company's growth objectives. Your primary responsibility will be to collaborate with the HR department, business leaders, and hiring managers to identify, attract, and hire top talent in the data analytics industry and related fields. If you are passionate about building high-performing teams and have a proven track record in sourcing, hiring, and retaining top talent, this exciting opportunity is for you.

Your key responsibilities will include leading the end-to-end recruitment process for technical roles in Data Engineering, Data Science, and Data Analytics. This will involve assessing candidates' proficiency in programming languages such as Python, Java, and Scala; data pipeline technologies such as ETL and Kafka; cloud platforms including AWS, Azure, and GCP; and big data technologies like Hadoop and Spark. You will design and implement technical assessment processes to ensure candidates meet the high technical standards required for projects, and collaborate with stakeholders to understand the specific technical requirements for each role.

Furthermore, you will be responsible for building and maintaining a robust pipeline of highly qualified candidates using various sourcing techniques and staying updated on industry trends in Data Engineering, Data Science, and Analytics. Your role will also involve implementing strategies to ensure diverse and inclusive hiring practices, with a focus on underrepresented groups in technology, and working on talent development and retention initiatives within the company.

To be successful in this role, you should have at least 4 years of experience in Talent Acquisition, with a strong background in recruiting for Data Engineering, Data Science, and Technology roles. You should possess technical knowledge in AI/ML, programming languages such as Python, R, and Java, big data technologies such as Hadoop and Spark, cloud platforms like AWS, Azure, and GCP, and analytics tools like Tableau and Power BI. Leadership skills, analytical thinking, excellent communication, and a commitment to excellence are essential for this position.

In addition to technical skills, soft skills such as problem-solving, collaboration, adaptability, attention to detail, continuous learning, and excellent verbal and written communication are also crucial for success in this role. If you are ready to take on the challenge of shaping the talent acquisition strategy for a dynamic and innovative company like Saarthee, we encourage you to apply for this role and be a part of our journey towards success.

Posted 2 days ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

JD - Data Engineer

This is a full-time position with D Square Consulting Services Pvt Ltd.
Location: Pan-India (Hybrid)
Experience: 5+ years
Notice Period: Candidates available to join immediately

Job Summary

We are seeking a skilled and experienced Data Engineer with strong Python expertise, API development experience, and a deep understanding of containerized, CI/CD-driven workflows. You will play a key role in designing, building, and scaling data pipelines and backend services that support our analytics and business intelligence platforms. This is a hands-on engineering role that requires a strong technical foundation and a collaborative mindset.

Key Responsibilities:
Design, implement, and optimize robust, scalable data pipelines and ETL workflows using modern Python tools and libraries
Build and maintain production-grade RESTful and/or GraphQL APIs to serve data to internal and external stakeholders
Collaborate with Data Analysts, Scientists, and Engineering teams to enable end-to-end data solutions
Containerize data services using Docker and manage deployments within Kubernetes environments
Develop and maintain CI/CD pipelines using GitHub Actions to automate testing, data validations, and deployment processes
Ensure code quality through rigorous unit testing, type annotations, and adherence to Python best practices
Participate in architecture reviews, design discussions, and code reviews in an agile development process
Proactively identify opportunities to optimize data access, transformation, and governance

Required Skills & Qualifications:
Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field
5+ years of hands-on experience in data engineering or backend development roles
Expert-level Python skills, with a strong understanding of idiomatic patterns, async programming, and typing
Proven experience in building production-grade RESTful or GraphQL APIs using frameworks like FastAPI, Graphene, or Strawberry
Hands-on experience with Docker, container-based workflows, and CI/CD automation using GitHub Actions
Experience working with Kubernetes for orchestrating deployments in production environments
Proficiency with SQL and data modeling; familiarity with ETL tools, data lakes, or warehousing concepts is a plus
Strong communicator with a proactive and self-driven approach to problem-solving

Desired Skills:
Familiarity with data orchestration tools (e.g., Airflow, Prefect)
Experience with streaming data platforms like Kafka or Spark
Knowledge of data governance, security, and observability best practices
Exposure to cloud platforms like AWS, GCP, or Azure

(ref:hirist.tech)

Posted 2 days ago

Apply

4.0 - 8.0 years

0 Lacs

karnataka

On-site

As a Python API Developer specializing in product development, you will leverage your 4+ years of experience to design, develop, and maintain high-performance, scalable APIs that drive our Generative AI products. Your role will involve close collaboration with data scientists, machine learning engineers, and product teams to seamlessly integrate Generative AI models (e.g., GPT, GANs, DALL-E) into production-ready applications. Your expertise in backend development, Python programming, and API design will be crucial in ensuring the successful deployment and execution of AI-driven features.

You should hold a Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field. Your professional experience should demonstrate hands-on involvement in designing and developing APIs, particularly with Generative AI models or machine learning models in a production environment. Proficiency in cloud-based infrastructures (AWS, Google Cloud, Azure) and services for deploying backend systems and AI models is essential. Additionally, you should have a strong background in working with backend frameworks and languages such as Python, Django, Flask, or FastAPI.

Your core technical skills include expertise in Python for backend development using frameworks such as Flask, Django, or FastAPI. You should possess a strong understanding of building and consuming RESTful APIs or GraphQL APIs, along with experience in designing and implementing API architectures. Familiarity with database management systems (SQL/NoSQL) such as PostgreSQL, MySQL, MongoDB, and Redis, and knowledge of cloud infrastructure (e.g., AWS, Google Cloud, Azure) are required. Experience with CI/CD pipelines, version control tools like Git, and Agile development methodologies is crucial for automating deployments and ensuring efficient backend operations.

Key responsibilities will involve collaborating closely with AI/ML engineers to integrate Generative AI models into backend services, handling data pipelines for real-time or batch processing, and engaging in design discussions to ensure the technical feasibility and scalability of features. Implementing caching mechanisms, rate limiting, and queueing systems to manage AI-related API requests, as well as ensuring backend services can handle high concurrency during resource-intensive generative AI processes, will also be essential. Your problem-solving skills, excellent communication abilities for interacting with cross-functional teams, and adaptability in staying updated on the latest technologies and trends in generative AI will be critical to success in this role.

Posted 2 days ago

Apply