
992 Dataflow Jobs - Page 13

JobPe aggregates listings for easy browsing; applications are submitted directly on each employer's job portal.

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Minimum qualifications:
- Bachelor's degree or equivalent practical experience.
- 5 years of experience with software development in one or more programming languages, and with data structures/algorithms.
- 3 years of experience testing, maintaining, or launching software products, and 1 year of experience with software design and architecture.
- 3 years of experience with one or more of the following: speech/audio (e.g., technology duplicating and responding to the human voice), reinforcement learning (e.g., sequential decision making), ML infrastructure, or specialization in another ML field.
- 3 years of experience with ML infrastructure (e.g., model deployment, model evaluation, optimization, data processing, debugging).

Preferred qualifications:
- 7 years of experience in software development with a focus on the design, development, and deployment of large-scale AI/ML applications.
- Experience in Python and with AI/ML libraries and frameworks (e.g., TensorFlow, JAX, scikit-learn, Pandas, NumPy).
- Experience architecting and deploying machine learning models in a cloud environment, including on Google Cloud Platform.
- Experience designing and implementing data processing pipelines using large-scale data engineering tools.
- Understanding of machine learning algorithms, statistical modeling techniques, and data analysis methodologies.

About The Job
Google's software engineers develop the next-generation technologies that change how billions of users connect, explore, and interact with information and one another. Our products need to handle information at massive scale, and extend well beyond web search. We're looking for engineers who bring fresh ideas from all areas, including information retrieval, distributed computing, large-scale system design, networking and data storage, security, artificial intelligence, natural language processing, UI design and mobile; the list goes on and is growing every day. As a software engineer, you will work on a specific project critical to Google's needs, with opportunities to switch teams and projects as you and our fast-paced business grow and evolve. We need our engineers to be versatile, display leadership qualities and be enthusiastic to take on new problems across the full stack as we continue to push technology forward.

We are seeking an experienced Senior AI Software Engineer to join the Human Resources Engineering (HRE) team in Hyderabad. You will be a technical lead, driving the design, development, and deployment of AI-powered solutions that impact Google's employees. This role offers the opportunity to work on Google-scale infrastructure, mentor a team of engineers, collaborate with cross-functional partners, and contribute to the future of HR at Google through AI applications. You will be responsible for translating business challenges and objectives into AI/ML systems.

At Corp Eng, we build world-leading business solutions that scale a more helpful Google for everyone. As Google's IT organization, we provide end-to-end solutions for organizations across Google. We deliver the right tools, platforms, and experiences for all Googlers as they create more helpful products and services for everyone. In the simplest terms, we are Google for Googlers.

Responsibilities
- Lead the technical design and architecture of AI/ML systems and infrastructure within the HRE domain, ensuring scalability, reliability, and performance.
- Drive the development and implementation of advanced AI/ML models and algorithms relevant to HR applications (e.g., talent acquisition, performance management, employee engagement, workforce planning) using various programming languages (e.g., Python) and Google's internal AI/ML platforms and frameworks (built on technologies like TensorFlow and JAX).
- Architect and oversee the development of data pipelines using Google's data processing infrastructure (e.g., Beam, Dataflow) for large-scale data ingestion, cleaning, transformation, and feature engineering (see the sketch after this posting).
- Take ownership of the deployment, monitoring, and optimization of AI/ML models in Google's production environments, establishing standard procedures for ML operations within the team.
- Provide technical leadership and mentorship to other engineers on the team, fostering a culture of technical excellence.

Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.
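The pipeline responsibilities above center on Beam/Dataflow ingestion. As a rough illustration only (not Google's internal tooling), here is a minimal Apache Beam batch pipeline in Python; the bucket, BigQuery table, and field names are hypothetical placeholders.

```python
# Minimal sketch, not Google's internal stack: a Beam batch pipeline that
# ingests a CSV from GCS, cleans rows, and writes them to BigQuery.
# The bucket, table, and schema below are hypothetical placeholders.
import csv

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_row(line):
    """Turn one CSV line into a dict matching the BigQuery schema."""
    emp_id, dept, score = next(csv.reader([line]))  # assumes 3 columns
    return {"emp_id": emp_id, "dept": dept, "score": float(score)}


def run():
    opts = PipelineOptions()  # add --runner=DataflowRunner etc. for Dataflow
    with beam.Pipeline(options=opts) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText("gs://example-bucket/hr/events.csv",
                                             skip_header_lines=1)
            | "Parse" >> beam.Map(parse_row)
            | "DropInvalid" >> beam.Filter(lambda r: r["emp_id"])
            | "Write" >> beam.io.WriteToBigQuery(
                "example-project:hr_analytics.engagement_scores",
                schema="emp_id:STRING,dept:STRING,score:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```

Run locally on the DirectRunner by default; passing `--runner=DataflowRunner` plus project, region, and a GCS temp location would execute the same graph on Dataflow.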

Posted 3 weeks ago


8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Minimum qualifications:
- Bachelor's degree in Computer Science, a related technical field, or equivalent practical experience.
- 8 years of experience with software development in one or more programming languages (e.g., Python, C, C++, Java, JavaScript).
- Experience in one or more disciplines: Machine Learning, Recommendation Systems, Natural Language Processing, Computer Vision, Pattern Recognition, or Artificial Intelligence.

Preferred qualifications:
- Understanding of agentic experiences/AI-ML and Large Language Models (LLMs), with strong coding skills.

About The Job
Like Google's own ambitions, the work of a Software Engineer goes beyond just Search. Software Engineering Managers have not only the technical expertise to take on and provide technical leadership to major projects, but also manage a team of Engineers. You not only optimize your own code but make sure Engineers are able to optimize theirs. As a Software Engineering Manager you manage your project goals, contribute to product strategy and help develop your team. Teams work all across the company, in areas such as information retrieval, artificial intelligence, natural language processing, distributed computing, large-scale system design, networking, security, data compression, and user interface design; the list goes on and is growing every day. Operating with scale and speed, our exceptional software engineers are just getting started, and as a manager, you guide the way. With technical and leadership expertise, you manage engineers across multiple teams and locations, a large product budget, and oversee the deployment of large-scale projects across multiple sites internationally.

At Corp Eng, we build world-leading business solutions that scale a more helpful Google for everyone. As Google's IT organization, we provide end-to-end solutions for organizations across Google. We deliver the right tools, platforms, and experiences for all Googlers as they create more helpful products and services for everyone. In the simplest terms, we are Google for Googlers.

Responsibilities
- Lead and manage a team of AI Software Engineers, fostering a collaborative and high-performing environment. This includes hiring, mentoring, performance management, and career development.
- Drive the design, development, and deployment of scalable and reliable AI/ML systems and infrastructure relevant to HR applications (e.g., talent acquisition, performance management, employee engagement, workforce planning).
- Collaborate with Product Managers and HR stakeholders to understand business needs, define product requirements, and translate them into technical specifications and project plans.
- Oversee the architecture and implementation of robust data pipelines using Google's data processing infrastructure (e.g., Beam, Dataflow) to support AI/ML initiatives.
- Stay abreast of the latest advancements in AI/ML and related technologies, evaluating their potential application within Human Resources and guiding the team's adoption of relevant innovations.

Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.

Posted 3 weeks ago


0 years

0 Lacs

India

On-site

About the Role
We are seeking a highly skilled and motivated Senior Data Engineer with hands-on experience across AWS, Azure, and GCP data ecosystems. The ideal candidate will be responsible for designing, building, and maintaining scalable data pipelines and architectures that support advanced analytics and real-time data processing.

Key Responsibilities

Technical Responsibilities
- Data Pipeline Development: Design and implement robust ETL/ELT pipelines using cloud-native tools.
- Cloud Expertise:
  - AWS: EMR, Kinesis, Redshift, Glue
  - Azure: HDInsight, Synapse Analytics, Stream Analytics
  - GCP: Cloud Dataproc, Dataflow, Composer
- Data Modeling: Develop and optimize data models for analytics and reporting.
- Data Governance: Ensure data quality, security, and compliance across platforms.
- Automation & Orchestration: Use tools like Apache Airflow, AWS Step Functions, and GCP Composer for workflow orchestration (see the sketch after this posting).
- Monitoring & Optimization: Implement monitoring, logging, and performance tuning for data pipelines.

Collaboration & Communication
- Work closely with data scientists, analysts, and business stakeholders to understand data needs.
- Translate business requirements into scalable technical solutions.
- Participate in code reviews, architecture discussions, and agile ceremonies.

Required Qualifications

Technical Skills
- Strong programming skills in Python, SQL, and optionally Scala or Java.
- Deep understanding of distributed computing, data warehousing, and stream processing.
- Experience with data lake architectures, data mesh, and real-time analytics.
- Proficiency in CI/CD practices and infrastructure as code (e.g., Terraform, CloudFormation).

Certifications (Preferred)
- AWS Certified Data Analytics – Specialty
- Microsoft Certified: Azure Data Engineer Associate
- Google Professional Data Engineer

Soft Skills & Attributes
- Analytical Thinking: Ability to break down complex problems and design scalable solutions.
- Communication: Strong verbal and written communication skills to explain technical concepts to non-technical stakeholders.
- Collaboration: Team player with a proactive attitude and the ability to work in cross-functional teams.
- Adaptability: Comfortable working in a fast-paced, evolving environment with shifting priorities.
- Ownership: High sense of accountability and a drive to deliver high-quality solutions.
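For the orchestration item above, here is a minimal Airflow 2.x sketch of the extract, transform, load pattern such a role works with daily; the DAG name and task bodies are hypothetical, not from the posting.

```python
# A minimal Airflow 2.x DAG sketch of the extract -> transform -> load
# orchestration pattern. Task bodies and names are hypothetical.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    print("pull raw data from the source system")


def transform(**context):
    print("apply cleansing and business rules")


def load(**context):
    print("load curated data into the warehouse")


with DAG(
    dag_id="daily_sales_elt",          # hypothetical pipeline name
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",        # cron strings work here too
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)

    t1 >> t2 >> t3  # linear dependency chain
```

The same DAG shape runs unchanged on GCP Cloud Composer, which is managed Airflow; AWS Step Functions would express the chain as a state machine instead.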

Posted 3 weeks ago


6.0 years

0 Lacs

Pune, Maharashtra, India

On-site

GCP Infrastructure Lead

Location: Bangalore, Pune
Experience: 6+ years

Responsibilities:
- 5+ years of demonstrated relevant experience deploying and supporting public cloud infrastructure (GCP as primary), IaaS and PaaS.
- Experience configuring and managing GCP infrastructure environment components:
  - Foundation components: networking (VPC, VPN, Interconnect, firewall and routes), IAM, folder structure, Organization Policy, VPC Service Controls, Security Command Center, etc.
  - Application components: BigQuery, Cloud Composer, Cloud Storage, Google Kubernetes Engine (GKE), Compute Engine, Cloud SQL, Cloud Monitoring, Dataproc, Data Fusion, Bigtable, Dataflow, etc.
- Design and implement Identity and Access Management (IAM) policies, custom roles, and service accounts across GCP projects and organizations.
- Implement and maintain Workload Identity Federation, IAM Conditions, and least-privilege access models.
- Integrate Google Cloud audit logs, access logs, and security logs with enterprise SIEM tools (e.g., Splunk, Chronicle, QRadar, or Exabeam).
- Configure Cloud Logging, Cloud Audit Logs, and Pub/Sub pipelines for log export to the SIEM (see the sketch after this posting).
- Collaborate with the Security Operations Center (SOC) to define alerting rules and dashboards based on IAM events and anomalies.
- Participate in threat modeling and incident response planning involving IAM and access events.
- Maintain compliance with regulatory and internal security standards (e.g., CIS GCP Benchmark, NIST, ISO 27001).
- Monitor and report on IAM posture, access drift, and misconfigurations.
- Support periodic access reviews and identity governance requirements.

Required Skills and Abilities:
- Mandatory skills: GCP networking (VPC, firewall, routes & VPN), CI/CD pipelines, Terraform, shell/Python scripting.
- Secondary skills: Composer, BigQuery, GKE, Dataproc.
- Good to have: certifications in any of the following: GCP Professional Cloud Architect, Cloud DevOps Engineer, Cloud Security Engineer, Cloud Network Engineer.
- Participate in incident discussions and work with the team toward resolving platform issues.
- Good verbal and written communication skills; ability to communicate with customers, developers, and other stakeholders.
- Mentor and guide team members.
- Good presentation skills; strong team player.

About Us:
We are a global leader in data warehouse migration and modernization to the cloud. We empower businesses by migrating their data, workloads, ETL, and analytics to the cloud by leveraging automation. We have our own products:
- Eagle: data warehouse assessment and migration planning
- Raven: automated workload conversion
- Pelican: automated data validation, which helps automate and accelerate data migration to the cloud
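The log-export responsibility above is typically a Cloud Logging sink routed to Pub/Sub. A minimal sketch with the google-cloud-logging client follows; the project, topic name, and filter are assumptions for illustration, not values from the posting.

```python
# Sketch: route IAM audit log entries to a Pub/Sub topic that a SIEM
# (Splunk, Chronicle, etc.) consumes. Project, topic, and filter are
# hypothetical; adjust to your organization's layout.
from google.cloud import logging as cloud_logging

PROJECT = "example-project"
TOPIC = f"pubsub.googleapis.com/projects/{PROJECT}/topics/siem-export"
# Keep only admin-activity audit entries touching IAM.
FILTER = (
    'logName:"cloudaudit.googleapis.com%2Factivity" '
    'AND protoPayload.serviceName="iam.googleapis.com"'
)

client = cloud_logging.Client(project=PROJECT)
sink = client.sink("iam-audit-to-siem", filter_=FILTER, destination=TOPIC)

if not sink.exists():
    # unique_writer_identity gives the sink its own service account,
    # which then needs the pubsub.publisher role on the topic.
    sink.create(unique_writer_identity=True)
    print(f"created sink; grant publish rights to {sink.writer_identity}")
```

From there, the SIEM subscribes to the topic, and the same filter vocabulary drives the SOC alerting rules mentioned above.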

Posted 3 weeks ago


5.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Description

Role Purpose
The purpose of this role is to interpret data and turn it into information (reports, dashboards, interactive visualizations, etc.) that can offer ways to improve a business, thus affecting business decisions.

Do
1. Manage the technical scope of the project in line with the requirements at all stages:
   a. Gather information from various sources (data warehouses, databases, data integration and modelling) and interpret patterns and trends.
   b. Develop record management processes and policies.
   c. Build and maintain relationships at all levels within the client base and understand their requirements.
   d. Provide sales data, proposals, data insights and account reviews to the client base.
   e. Identify areas to increase efficiency and automation of processes.
   f. Set up and maintain automated data processes.
   g. Identify, evaluate and implement external services and tools to support data validation and cleansing.
   h. Produce and track key performance indicators.
2. Analyze data sets and provide adequate information:
   a. Liaise with internal and external clients to fully understand data content.
   b. Design and carry out surveys and analyze survey data as per the customer requirement.
   c. Analyze and interpret complex data sets relating to the customer's business and prepare reports for internal and external audiences using business analytics reporting tools.
   d. Create data dashboards, graphs and visualizations to showcase business performance and provide sector and competitor benchmarking.
   e. Mine and analyze large datasets, draw valid inferences and present them successfully to management using a reporting tool.
   f. Develop predictive models and share insights with the clients as per their requirement.

Deliver
No. | Performance Parameter | Measure
1 | Analyzes data sets and provides relevant information to the client | Number of automations done, on-time delivery, CSAT score, zero customer escalations, data accuracy

Mandatory Skills: Google Cloud Dataflow
Experience: 5-8 years

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.

Posted 3 weeks ago


0.0 - 12.0 years

0 Lacs

Gurugram, Haryana

On-site

About the Role:
Grade Level (for internal use): 12

The Team:
As a member of the EDO, Collection Platforms & AI – Cognitive Engineering team, you will spearhead the design and delivery of robust, scalable ML infrastructure and pipelines that power natural language understanding, data extraction, information retrieval, and data sourcing solutions for S&P Global. You will define AI/ML engineering best practices, mentor fellow engineers and data scientists, and drive production-ready AI products from ideation through deployment. You'll thrive in a (truly) global team that values thoughtful risk-taking and self-initiative.

What's in it for you:
- Be part of a global company and build solutions at enterprise scale
- Lead and grow a technically strong ML engineering function
- Collaborate on and solve high-complexity, high-impact problems
- Shape the engineering roadmap for emerging AI/ML capabilities (including GenAI integrations)

Key Responsibilities:
- Architect, develop, and maintain production-ready data acquisition, transformation, and ML pipelines (batch & streaming)
- Serve as a hands-on lead: writing code, conducting reviews, and troubleshooting to extend and operate our data platforms
- Apply best practices in data modeling, ETL design, and pipeline orchestration using cloud-native solutions
- Establish CI/CD and MLOps workflows for model training, validation, deployment, monitoring, and rollback (a minimal promotion-gate sketch follows this posting)
- Integrate GenAI components (LLM inference endpoints, embedding stores, prompt services) into broader ML systems
- Mentor and guide engineers and data scientists; foster a culture of craftsmanship and continuous improvement
- Collaborate with cross-functional stakeholders (Data Science, Product, IT) to align on requirements, timelines, and SLAs

What We're Looking For:
- 8-12 years' professional software engineering experience with a strong MLOps focus
- Expert in Python and Apache for large-scale data processing
- Deep experience deploying and operating ML pipelines on AWS or GCP
- Hands-on proficiency with container/orchestration tooling
- Solid understanding of the full ML model lifecycle and CI/CD principles
- Skilled in streaming and batch ETL design (e.g., Airflow, Dataflow)
- Strong OOP design patterns, Test-Driven Development, and enterprise system architecture
- Advanced SQL skills (big-data variants a plus) and comfort with Linux/bash toolsets
- Familiarity with version control (Git, GitHub, or Azure DevOps) and code review processes
- Excellent problem-solving, debugging, and performance-tuning abilities
- Ability to communicate technical change clearly to non-technical audiences

Nice to have:
- Redis, Celery, SQS and Lambda based event-driven pipelines
- Prior work integrating LLM services (OpenAI, Anthropic, etc.) at scale
- Experience with Apache Avro and Apache
- Familiarity with Java and/or .NET Core (C#)

What's In It For You?

Our Purpose:
Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People:
We're more than 35,000 strong worldwide, so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.

Our Values: Integrity, Discovery, Partnership
At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits:
We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global. Our benefits include:
- Health & Wellness: health care coverage designed for the mind and body.
- Flexible Downtime: generous time off helps keep you energized for your time on.
- Continuous Learning: access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: it's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: from retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global:
At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Recruitment Fraud Alert:
If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, "pre-employment training" or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here.

Equal Opportunity Employer
S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment.
If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.
US Candidates Only: The EEO is the Law poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

IFTECH103.2 - Middle Management Tier II (EEO Job Group)
Job ID: 317386
Posted On: 2025-06-30
Location: Gurgaon, Haryana, India
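The MLOps responsibilities above call for validation gates before deployment and support for rollback. A minimal, framework-agnostic sketch of such a gate follows; the metric, uplift threshold, and registry path are hypothetical, not S&P Global's actual workflow.

```python
# Sketch of an MLOps promotion gate: a candidate model is deployed only if
# it beats the production model's metric by a minimum uplift. The metric
# choice, uplift, and artifact path are hypothetical.
import json
from pathlib import Path

import joblib

REGISTRY = Path("model_registry")       # stand-in for a real model registry
MIN_UPLIFT = 0.005                      # candidate must beat prod AUC by this


def promote_if_better(candidate_model, candidate_auc: float) -> bool:
    meta_file = REGISTRY / "production.json"
    prod_auc = (json.loads(meta_file.read_text())["auc"]
                if meta_file.exists() else 0.0)

    if candidate_auc < prod_auc + MIN_UPLIFT:
        print(f"rejected: {candidate_auc:.4f} vs prod {prod_auc:.4f}")
        return False

    REGISTRY.mkdir(exist_ok=True)
    joblib.dump(candidate_model, REGISTRY / "production.joblib")
    meta_file.write_text(json.dumps({"auc": candidate_auc}))
    print(f"promoted: {candidate_auc:.4f} (was {prod_auc:.4f})")
    return True
```

Keeping the previous artifact alongside the metadata is what makes the rollback half of the workflow cheap: demotion is just restoring the prior file.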

Posted 3 weeks ago


5.0 - 8.0 years

0 Lacs

Gurugram, Haryana

Remote

Lead Assistant Manager
EXL/LAM/1407563
Insurance Property & Casualties, Gurgaon
Posted On: 30 Jun 2025
End Date: 14 Aug 2025
Required Experience: 5 - 8 years

Basic Section
Number of Positions: 1
Band: B2
Band Name: Lead Assistant Manager
Cost Code: D014614
Campus/Non-Campus: Non-campus
Employment Type: Permanent
Requisition Type: New
Max CTC: 1200000.0000 - 2200000.0000
Complexity Level: Not applicable
Work Type: Hybrid (working partly from home and partly from office)

Organisational
Group: Operations Management
Sub Group: Global Products & Platforms
Organization: Insurance Property & Casualties
LOB: Analytics
SBU: Consulting
Country: India
City: Gurgaon
Center: EXL - Gurgaon Center 38

Skills: AZURE, AWS
Minimum Qualification: B.COM
Certification: No data available

Job Description

Job Title: Senior Data Engineer – Multi-Cloud (AWS, Azure, GCP)
Location: Gurgaon, Haryana (hybrid/remote options available)
Experience: 5+ years
Employment Type: Full-time

About the Role
We are seeking a highly skilled and motivated Senior Data Engineer with hands-on experience across AWS, Azure, and GCP data ecosystems. The ideal candidate will be responsible for designing, building, and maintaining scalable data pipelines and architectures that support advanced analytics and real-time data processing.

Key Responsibilities

Technical Responsibilities
- Data Pipeline Development: Design and implement robust ETL/ELT pipelines using cloud-native tools.
- Cloud Expertise:
  - AWS: EMR, Kinesis, Redshift, Glue
  - Azure: HDInsight, Synapse Analytics, Stream Analytics
  - GCP: Cloud Dataproc, Dataflow, Composer
- Data Modeling: Develop and optimize data models for analytics and reporting.
- Data Governance: Ensure data quality, security, and compliance across platforms.
- Automation & Orchestration: Use tools like Apache Airflow, AWS Step Functions, and GCP Composer for workflow orchestration.
- Monitoring & Optimization: Implement monitoring, logging, and performance tuning for data pipelines.

Collaboration & Communication
- Work closely with data scientists, analysts, and business stakeholders to understand data needs.
- Translate business requirements into scalable technical solutions.
- Participate in code reviews, architecture discussions, and agile ceremonies.

Required Qualifications

Technical Skills
- Strong programming skills in Python, SQL, and optionally Scala or Java.
- Deep understanding of distributed computing, data warehousing, and stream processing.
- Experience with data lake architectures, data mesh, and real-time analytics.
- Proficiency in CI/CD practices and infrastructure as code (e.g., Terraform, CloudFormation).

Certifications (Preferred)
- AWS Certified Data Analytics – Specialty
- Microsoft Certified: Azure Data Engineer Associate
- Google Professional Data Engineer

Soft Skills & Attributes
- Analytical Thinking: Ability to break down complex problems and design scalable solutions.
- Communication: Strong verbal and written communication skills to explain technical concepts to non-technical stakeholders.
- Collaboration: Team player with a proactive attitude and the ability to work in cross-functional teams.
- Adaptability: Comfortable working in a fast-paced, evolving environment with shifting priorities.
- Ownership: High sense of accountability and a drive to deliver high-quality solutions.

Workflow
Workflow Type: Back Office

Posted 3 weeks ago


0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

JD for L&A Business Consultant

Working as part of the Consulting team, you will take part in engagements related to a wide range of topics. Some examples of domains in which you will support our clients include the following:
- Proficiency in Individual and Group Life Insurance concepts and different types of Annuity products.
- Proficiency in different insurance plans: Qualified/Non-Qualified Plans, IRA, Roth IRA, CRA, SEP.
- Solid knowledge of the policy life cycle: Illustrations/Quote/Rating; New Business & Underwriting; Policy Servicing and Administration; Billing & Payment; Claims Processing; Disbursement (systematic withdrawals, RMD, surrenders); Regulatory Changes & Taxation.
- Understanding of business rules for pay-out.
- Understanding of upstream and downstream interfaces for the policy life cycle.
- Experience in DXC platforms: Vantage, wmA, nbA, CSA, Cyber-life, Life70, Life Asia, PerformancePlus.

Consulting skills:
- Experience in creating business process maps for future-state architecture, creating the WBS for an overall conversion strategy, and requirement refinement in multi-vendor engagements.
- Requirements gathering and elicitation: writing BRDs and FSDs, conducting JAD sessions and workshops to capture requirements, and working closely with the Product Owner.
- Work with the client to define the most optimal future-state operational process and related product configuration.
- Define scope by providing innovative solutions and challenging all new client requirements and change requests, while ensuring that the client gets the required business value.
- Elaborate and deliver clearly defined requirement documents with relevant dataflow and process flow diagrams.
- Work closely with the product design development team to analyse and extract functional enhancements.
- Provide product consultancy and assist the client with acceptance criteria gathering and support throughout the project life cycle.

Technology skills:
- Experienced in data migration projects, ensuring seamless transfer of data between systems while maintaining data integrity and security.
- Skilled in data analytics, utilizing various tools and techniques to extract insights and drive informed decision-making.
- Strong understanding of data governance principles and best practices, ensuring data quality and compliance.
- Collaborative team player, able to work closely with stakeholders and technical teams to define requirements and implement effective solutions.
- Industry certifications (AAPA/LOMA) will be an added advantage.
- Experience on these COTS products is preferable: FAST, ALIP, OIPA, wmA.

We expect you to work effectively as a team member and build good relationships with the client. You will have the opportunity to expand your domain knowledge and skills and will be able to collaborate frequently with other EY professionals with a wide variety of expertise.

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 3 weeks ago


0 years

0 Lacs

Kolkata, West Bengal, India

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

JD for L&A Business Consultant

Working as part of the Consulting team, you will take part in engagements related to a wide range of topics. Some examples of domains in which you will support our clients include the following:
- Proficiency in Individual and Group Life Insurance concepts and different types of Annuity products.
- Proficiency in different insurance plans: Qualified/Non-Qualified Plans, IRA, Roth IRA, CRA, SEP.
- Solid knowledge of the policy life cycle: Illustrations/Quote/Rating; New Business & Underwriting; Policy Servicing and Administration; Billing & Payment; Claims Processing; Disbursement (systematic withdrawals, RMD, surrenders); Regulatory Changes & Taxation.
- Understanding of business rules for pay-out.
- Understanding of upstream and downstream interfaces for the policy life cycle.
- Experience in DXC platforms: Vantage, wmA, nbA, CSA, Cyber-life, Life70, Life Asia, PerformancePlus.

Consulting skills:
- Experience in creating business process maps for future-state architecture, creating the WBS for an overall conversion strategy, and requirement refinement in multi-vendor engagements.
- Requirements gathering and elicitation: writing BRDs and FSDs, conducting JAD sessions and workshops to capture requirements, and working closely with the Product Owner.
- Work with the client to define the most optimal future-state operational process and related product configuration.
- Define scope by providing innovative solutions and challenging all new client requirements and change requests, while ensuring that the client gets the required business value.
- Elaborate and deliver clearly defined requirement documents with relevant dataflow and process flow diagrams.
- Work closely with the product design development team to analyse and extract functional enhancements.
- Provide product consultancy and assist the client with acceptance criteria gathering and support throughout the project life cycle.

Technology skills:
- Experienced in data migration projects, ensuring seamless transfer of data between systems while maintaining data integrity and security.
- Skilled in data analytics, utilizing various tools and techniques to extract insights and drive informed decision-making.
- Strong understanding of data governance principles and best practices, ensuring data quality and compliance.
- Collaborative team player, able to work closely with stakeholders and technical teams to define requirements and implement effective solutions.
- Industry certifications (AAPA/LOMA) will be an added advantage.
- Experience on these COTS products is preferable: FAST, ALIP, OIPA, wmA.

We expect you to work effectively as a team member and build good relationships with the client. You will have the opportunity to expand your domain knowledge and skills and will be able to collaborate frequently with other EY professionals with a wide variety of expertise.

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 3 weeks ago


0 years

0 Lacs

Kanayannur, Kerala, India

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

JD for L&A Business Consultant

Working as part of the Consulting team, you will take part in engagements related to a wide range of topics. Some examples of domains in which you will support our clients include the following:
- Proficiency in Individual and Group Life Insurance concepts and different types of Annuity products.
- Proficiency in different insurance plans: Qualified/Non-Qualified Plans, IRA, Roth IRA, CRA, SEP.
- Solid knowledge of the policy life cycle: Illustrations/Quote/Rating; New Business & Underwriting; Policy Servicing and Administration; Billing & Payment; Claims Processing; Disbursement (systematic withdrawals, RMD, surrenders); Regulatory Changes & Taxation.
- Understanding of business rules for pay-out.
- Understanding of upstream and downstream interfaces for the policy life cycle.
- Experience in DXC platforms: Vantage, wmA, nbA, CSA, Cyber-life, Life70, Life Asia, PerformancePlus.

Consulting skills:
- Experience in creating business process maps for future-state architecture, creating the WBS for an overall conversion strategy, and requirement refinement in multi-vendor engagements.
- Requirements gathering and elicitation: writing BRDs and FSDs, conducting JAD sessions and workshops to capture requirements, and working closely with the Product Owner.
- Work with the client to define the most optimal future-state operational process and related product configuration.
- Define scope by providing innovative solutions and challenging all new client requirements and change requests, while ensuring that the client gets the required business value.
- Elaborate and deliver clearly defined requirement documents with relevant dataflow and process flow diagrams.
- Work closely with the product design development team to analyse and extract functional enhancements.
- Provide product consultancy and assist the client with acceptance criteria gathering and support throughout the project life cycle.

Technology skills:
- Experienced in data migration projects, ensuring seamless transfer of data between systems while maintaining data integrity and security.
- Skilled in data analytics, utilizing various tools and techniques to extract insights and drive informed decision-making.
- Strong understanding of data governance principles and best practices, ensuring data quality and compliance.
- Collaborative team player, able to work closely with stakeholders and technical teams to define requirements and implement effective solutions.
- Industry certifications (AAPA/LOMA) will be an added advantage.
- Experience on these COTS products is preferable: FAST, ALIP, OIPA, wmA.

We expect you to work effectively as a team member and build good relationships with the client. You will have the opportunity to expand your domain knowledge and skills and will be able to collaborate frequently with other EY professionals with a wide variety of expertise.

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 3 weeks ago


0 years

0 Lacs

Trivandrum, Kerala, India

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

JD for L&A Business Consultant

Working as part of the Consulting team, you will take part in engagements related to a wide range of topics. Some examples of domains in which you will support our clients include the following:
- Proficiency in Individual and Group Life Insurance concepts and different types of Annuity products.
- Proficiency in different insurance plans: Qualified/Non-Qualified Plans, IRA, Roth IRA, CRA, SEP.
- Solid knowledge of the policy life cycle: Illustrations/Quote/Rating; New Business & Underwriting; Policy Servicing and Administration; Billing & Payment; Claims Processing; Disbursement (systematic withdrawals, RMD, surrenders); Regulatory Changes & Taxation.
- Understanding of business rules for pay-out.
- Understanding of upstream and downstream interfaces for the policy life cycle.
- Experience in DXC platforms: Vantage, wmA, nbA, CSA, Cyber-life, Life70, Life Asia, PerformancePlus.

Consulting skills:
- Experience in creating business process maps for future-state architecture, creating the WBS for an overall conversion strategy, and requirement refinement in multi-vendor engagements.
- Requirements gathering and elicitation: writing BRDs and FSDs, conducting JAD sessions and workshops to capture requirements, and working closely with the Product Owner.
- Work with the client to define the most optimal future-state operational process and related product configuration.
- Define scope by providing innovative solutions and challenging all new client requirements and change requests, while ensuring that the client gets the required business value.
- Elaborate and deliver clearly defined requirement documents with relevant dataflow and process flow diagrams.
- Work closely with the product design development team to analyse and extract functional enhancements.
- Provide product consultancy and assist the client with acceptance criteria gathering and support throughout the project life cycle.

Technology skills:
- Experienced in data migration projects, ensuring seamless transfer of data between systems while maintaining data integrity and security.
- Skilled in data analytics, utilizing various tools and techniques to extract insights and drive informed decision-making.
- Strong understanding of data governance principles and best practices, ensuring data quality and compliance.
- Collaborative team player, able to work closely with stakeholders and technical teams to define requirements and implement effective solutions.
- Industry certifications (AAPA/LOMA) will be an added advantage.
- Experience on these COTS products is preferable: FAST, ALIP, OIPA, wmA.

We expect you to work effectively as a team member and build good relationships with the client. You will have the opportunity to expand your domain knowledge and skills and will be able to collaborate frequently with other EY professionals with a wide variety of expertise.

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 3 weeks ago


6.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Role Description

We are recruiting Data Engineers with strong technical ability who can articulate well to a non-technical audience, and who will join our team on a permanent basis.

Role: The Data Engineer will engage with external clients and internal customers, understand their needs, and design, build, and maintain data pipelines and infrastructure using Google Cloud Platform (GCP). This involves the design and implementation of scalable data architectures, ETL processes, and data warehousing solutions on GCP. The role requires expertise in big data technologies, cloud computing, and data integration, as well as the ability to optimize data systems for performance and reliability. It calls for a blend of skills including programming, database management, cloud infrastructure, and data pipeline development. Additionally, problem-solving skills, attention to detail, and the ability to work in a fast-paced environment are valuable traits. You will frequently work as part of a scrum team, together with data scientists, ML engineers, and analyst developers, to design and implement robust data infrastructure that supports analytics and machine learning initiatives.

Responsibilities
- Design, build, and maintain scalable data pipelines and ETL processes using GCP services such as Cloud Dataflow, Cloud Dataproc, and BigQuery (see the sketch after this posting).
- Implement and optimize data storage solutions using GCP technologies like Cloud Storage, Cloud SQL, and Cloud Spanner.
- Develop and maintain data warehouses and data lakes on GCP, ensuring data quality, accessibility, and security.
- Collaborate with data scientists and analysts to understand data requirements and provide efficient data access solutions.
- Implement data governance and security measures to ensure compliance with regulations and best practices.
- Automate data workflows and implement monitoring and alerting systems for data pipelines.
- Share data engineering knowledge with the wider functions and develop reusable data integration patterns and best practices.

Skills/Experience
- BSc/MSc in Computer Science, Information Systems, or a related field, or equivalent work experience.
- Proven experience (6+ years) as a Data Engineer or in a similar role, preferably with GCP expertise.
- Strong proficiency in SQL and experience with NoSQL databases.
- Expertise in data modeling, ETL processes, and data warehousing concepts.
- Significant experience with GCP services such as BigQuery, Dataflow, Dataproc, Cloud Storage, and Pub/Sub.
- Proficiency in at least one programming language (e.g., Python, Java, or Scala) for data pipeline development.
- Experience with big data technologies such as Hadoop, Spark, and Kafka.
- Knowledge of data governance, security, and compliance best practices.
- GCP certifications (e.g., Professional Data Engineer) are highly advantageous.
- Effective communication skills to collaborate with cross-functional teams and explain technical concepts to non-technical stakeholders.

Skills: BigQuery, ETL, Data Management, Python
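Two of the everyday BigQuery tasks the responsibilities above imply are loading files from Cloud Storage and querying the result. A minimal sketch with the google-cloud-bigquery client; the project, dataset, table, and bucket names are hypothetical.

```python
# Sketch of two routine BigQuery tasks: batch-loading CSVs from GCS and
# running an aggregate query. Dataset, table, and bucket are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

# 1) Batch-load CSVs from GCS into a table (schema auto-detected).
load_job = client.load_table_from_uri(
    "gs://example-bucket/exports/orders_*.csv",
    "example-project.analytics.orders",
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
    ),
)
load_job.result()  # block until the load finishes

# 2) Query the loaded data.
rows = client.query(
    "SELECT dept, SUM(amount) AS total "
    "FROM `example-project.analytics.orders` GROUP BY dept"
).result()
for row in rows:
    print(row.dept, row.total)
```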

Posted 3 weeks ago


0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Roles And Responsibilities
- Contribute to designing cloud architectures and integration modules for enterprise-level systems.
- Manage and mentor the technical team and ensure quality delivery of solutions as per the process.
- Lead engagements with partners and customers, including stakeholder management, requirements gathering, and designing solutions, along with development and delivery.
- Collaborate with Program Management, Engineering, User Experience, and Product teams to identify gaps and work with cross-functional teams to design solutions.

Essential Skills
- Experience developing and managing scalable, high-performance production systems.
- Some experience working in Artificial Intelligence (AI) or RPA.
- Developing, designing, and maintaining high-quality production applications written in NodeJS or Python, with solid grounding in data structures, ML algorithms, and software design.
- Experience with complex API integrations and application development modules.
- Strong skills in designing database schemas, for both SQL and NoSQL databases.
- Experience with cloud technologies like GCP/AWS/Azure, leveraging serverless architectures and services such as Cloud Functions, AWS Lambda, Google Dataflow, and Google Pub/Sub (a minimal Cloud Function sketch follows this posting).
- Design, build, manage and operate the continuous delivery framework and tools, and act as a subject matter expert on CI/CD for developer teams.
- Full-stack development background with front-end experience in Angular, jQuery, or other JS frameworks.
- Strong problem-solving ability, and the ability to multitask and prioritize.
- Experience in Test-Driven Development and Agile methodologies.
- Good communication skills.
- Experience using tools like Git and Jira; Confluence is a plus.
- A team player who can collaborate with all stakeholders, with strong interpersonal skills.
- Self-starter with a drive to technically mentor your cohort of developers.

Good To Have
- Experience in designing reusable and scalable architecture for cloud applications.
- Experience in managing the design and production implementation of chat and voice bots.
- Exposure to developing, maintaining, and monitoring microservices.
- Experience in one or more of: chat/voice bot development, machine learning, Natural Language Processing (NLP), and contact center technologies.
- Application security.
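For the serverless item above, here is a minimal Pub/Sub-triggered Cloud Function using the documented 1st-gen Python background-function signature; the message shape and downstream action are hypothetical.

```python
# Sketch of a Pub/Sub-triggered Cloud Function (1st-gen Python signature).
# The message payload shape and downstream call are hypothetical.
import base64
import json


def handle_event(event, context):
    """Entry point: decode a Pub/Sub message and act on it."""
    payload = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
    # e.g. fan out to a bot, trigger a Dataflow job, or write to a database
    print(f"received {payload.get('type', 'unknown')} event "
          f"(messageId={context.event_id})")
```

Such a function would typically be deployed with something like `gcloud functions deploy handle_event --trigger-topic=events --runtime=python311` (topic name hypothetical).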

Posted 3 weeks ago


5.0 years

5 Lacs

Hyderābād

On-site

Job description

Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.

HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Consultant Specialist.

In this role, you will:
- Design and develop ETL processes: Lead the design and implementation of ETL processes using all kinds of batch/streaming tools to extract, transform, and load data from various sources into GCP. Collaborate with stakeholders to gather requirements and ensure that ETL solutions meet business needs.
- Data pipeline optimization: Optimize data pipelines for performance, scalability, and reliability, ensuring efficient data processing workflows. Monitor and troubleshoot ETL processes, proactively addressing issues and bottlenecks.
- Data integration and management: Integrate data from diverse sources, including databases, APIs, and flat files, ensuring data quality and consistency. Manage and maintain data storage solutions in GCP (e.g., BigQuery, Cloud Storage) to support analytics and reporting.
- GCP Dataflow development: Write Apache Beam based Dataflow jobs for data extraction, transformation, and analysis, ensuring optimal performance and accuracy (see the streaming sketch after this posting). Collaborate with data analysts and data scientists to prepare data for analysis and reporting.
- Automation and monitoring: Implement automation for ETL workflows using tools like Apache Airflow or Cloud Composer, enhancing efficiency and reducing manual intervention. Set up monitoring and alerting mechanisms to ensure the health of data pipelines and compliance with SLAs.
- Data governance and security: Apply best practices for data governance, ensuring compliance with industry regulations (e.g., GDPR, HIPAA) and internal policies. Collaborate with security teams to implement data protection measures and address vulnerabilities.
- Documentation and knowledge sharing: Document ETL processes, data models, and architecture to facilitate knowledge sharing and onboarding of new team members. Conduct training sessions and workshops to share expertise and promote best practices within the team.

Requirements

To be successful in this role, you should meet the following requirements:
- Experience: Minimum of 5 years of industry experience in data engineering or ETL development, with a strong focus on DataStage and GCP. Proven experience in designing and managing ETL solutions, including data modeling, data warehousing, and SQL development.
- Technical skills: Strong knowledge of GCP services (e.g., BigQuery, Dataflow, Cloud Storage, Pub/Sub) and their application in data engineering. Experience with cloud-based solutions, especially on GCP; a cloud-certified candidate is preferred. Experience and knowledge of big data processing in batch and streaming modes; proficiency in big data ecosystems, e.g., Hadoop, HBase, Hive, MapReduce, Kafka, Flink, Spark, etc. Familiarity with Java and Python for data manipulation on cloud/big data platforms.
- Analytical skills: Strong problem-solving skills with keen attention to detail. Ability to analyze complex data sets and derive meaningful insights.

You'll achieve more when you join HSBC. www.hsbc.com/careers

HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.

Issued by – HSBC Software Development India
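The Dataflow duties above revolve around Apache Beam jobs. A minimal streaming sketch (Pub/Sub in, windowed counts out to BigQuery) follows; the subscription, table, and message schema are hypothetical, not HSBC's.

```python
# Streaming Beam sketch: read JSON events from Pub/Sub, count per event
# type in 60-second windows, and write the counts to BigQuery. All
# resource names and the message schema are hypothetical.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.transforms.window import FixedWindows


def run():
    opts = PipelineOptions(streaming=True)
    with beam.Pipeline(options=opts) as p:
        (
            p
            | "Read" >> beam.io.ReadFromPubSub(
                subscription="projects/example-project/subscriptions/events-sub")
            | "Decode" >> beam.Map(lambda b: json.loads(b.decode("utf-8")))
            | "KeyByType" >> beam.Map(lambda e: (e["event_type"], 1))
            | "Window" >> beam.WindowInto(FixedWindows(60))
            | "Count" >> beam.CombinePerKey(sum)
            | "ToRow" >> beam.Map(lambda kv: {"event_type": kv[0], "n": kv[1]})
            | "Write" >> beam.io.WriteToBigQuery(
                "example-project:ops.event_counts",
                schema="event_type:STRING,n:INTEGER",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```

The fixed window is what turns an unbounded Pub/Sub stream into finite per-minute aggregates that a BigQuery sink can append safely.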

Posted 3 weeks ago


4.0 - 6.0 years

7 - 9 Lacs

Hyderābād

On-site

Job description

Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.

HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Marketing Title.

In this role, you will:
- Perform system development work around ETL, which can include both the development of new functions and facilities and the ongoing systems support of live systems.
- Be responsible for the documentation, coding, and maintenance of new and existing Extract, Transform, and Load (ETL) processes within the Enterprise Data Warehouse.
- Investigate live systems faults, diagnose problems, and propose and provide solutions.
- Work closely with various teams to design, build, test, deploy and maintain insightful MI reports.
- Support System Acceptance Testing, System Integration and Regression Testing.
- Identify any issues that may pose a delivery risk, formulate preventive actions or corrective measures, and escalate major project risks and issues to the service owner in a timely manner.
- Execute test cases and log defects.
- Be proactive in understanding the existing system, identifying areas for improvement, and taking ownership of assigned tasks.
- Work independently with minimal supervision while ensuring timely delivery of tasks.

Requirements

To be successful in this role, you should meet the following requirements:
- 4-6 years of experience in Data Warehousing, specialized in ETL. Given that the current team is highly technical in nature, the expectation is that the candidate has experience in technologies like DataStage, Teradata Vantage, Unix scripting, and scheduling using Control-M and DevOps tools.
- Good knowledge of SQL and the demonstrated ability to write efficient and optimized queries.
- Hands-on experience with or knowledge of GCP's data storage and processing services, such as BigQuery, Dataflow, Bigtable, Cloud Spanner, and Cloud SQL, would be an added advantage.
- Hands-on experience with Unix, Git and Jenkins would be an added advantage.
- The ability to develop and implement solutions both on-premises and on Google Cloud Platform (GCP), conducting migration where necessary to bring tools and other elements into the cloud, along with software upgrades.
- Proficiency in using JIRA and Confluence, and experience working on projects that follow Agile methodologies.

You'll achieve more when you join HSBC. www.hsbc.com/careers

HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.

Issued by – HSBC Software Development India

Posted 3 weeks ago


0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site

What makes Techjays an inspiring place to work: At Techjays, we are driving the future of artificial intelligence with a bold mission to empower businesses worldwide by helping them build AI solutions that transform industries. As an established leader in the AI space, we combine deep expertise with a collaborative, agile approach to deliver impactful technology that drives meaningful change. Our global team consists of professionals who have honed their skills at leading companies such as Google, Akamai, NetApp, ADP, Cognizant Consulting, and Capgemini. With engineering teams across the globe, we deliver tailored AI software and services to clients ranging from startups to large-scale enterprises. Be part of a company that’s pushing the boundaries of digital transformation. At Techjays, you’ll work on exciting projects that redefine industries, innovate with the latest technologies, and contribute to solutions that make a real-world impact. Join us on our journey to shape the future with AI.

We are seeking a Senior Data & AI/ML Engineer with deep expertise in GCP who will not only build intelligent and scalable data solutions but also champion our internal capability building and partner-level excellence. This is a high-impact role for a seasoned engineer who thrives in designing GCP-native, AI/ML-enabled data platforms. You’ll play a dual role as a hands-on technical lead and a strategic enabler, helping drive our Google Cloud Data & AI/ML specialization track forward through successful implementations, reusable assets, and internal skill development.

Preferred Qualifications:
GCP Professional Certifications: Data Engineer or Machine Learning Engineer.
Experience contributing to a GCP Partner specialization journey.
Familiarity with Looker, Data Catalog, Dataform, or other GCP data ecosystem tools.
Knowledge of data privacy, model explainability, and AI governance is a plus.

Primary Skills (Must-Have):
GCP Services: BigQuery, Dataflow, Pub/Sub, Vertex AI
ML Engineering: End-to-end ML pipelines using Vertex AI / Kubeflow
Programming: Python & SQL
MLOps: CI/CD for ML, model deployment & monitoring
Infrastructure-as-Code: Terraform
Data Engineering: ETL/ELT, real-time & batch pipelines
AI/ML Tools: TensorFlow, scikit-learn, XGBoost

Secondary Skills (Good-to-Have):
GCP Certifications: Professional Data Engineer or ML Engineer
Data Tools: Looker, Dataform, Data Catalog
AI Governance: Model explainability, privacy, compliance (e.g., GDPR, fairness)
GCP Partner Experience: Prior involvement in a specialization journey or partner enablement

Work Location: Coimbatore

Key Responsibilities
Data & AI/ML Architecture
Design and implement data architectures for real-time and batch pipelines, leveraging GCP services such as BigQuery, Dataflow, Dataproc, Pub/Sub, Vertex AI, and Cloud Storage.
Lead the development of ML pipelines, from feature engineering to model training and deployment, using Vertex AI, AI Platform, and Kubeflow Pipelines (a minimal pipeline sketch appears at the end of this listing).
Collaborate with data scientists to operationalize ML models and support MLOps practices using Cloud Functions, CI/CD, and Model Registry.
Define and implement data governance, lineage, monitoring, and quality frameworks.
Google Cloud Partner Enablement
Build and document GCP-native solutions and architectures that can be used for case studies and specialization submissions.
Lead client-facing PoCs or MVPs to showcase AI/ML capabilities using GCP.
Contribute to building repeatable solution accelerators in Data & AI/ML.
Work with the leadership team to align with Google Cloud Partner Program metrics.
Team Development
Mentor engineers and data scientists toward achieving GCP certifications, especially in Data Engineering and Machine Learning.
Organize and lead internal GCP AI/ML enablement sessions.
Represent the company in Google partner ecosystem events, tech talks, and joint GTM engagements.

What we offer
Best-in-class packages.
Paid holidays and flexible time-off policies.
Casual dress code and a flexible working environment.
Opportunities for professional development in an engaging, fast-paced environment.
Medical insurance covering self and family up to 4 lakhs per person.
Diverse and multicultural work environment.
Be part of an innovation-driven culture with ample support and resources to succeed.
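To make the "end-to-end ML pipelines using Vertex AI / Kubeflow" requirement concrete, here is a minimal sketch using the KFP v2 SDK; the component logic, base image, metric value, and GCS path are hypothetical placeholders, not part of the posting.

```python
# Minimal Kubeflow Pipelines (KFP v2) sketch: a one-step training pipeline
# compiled to a spec that Vertex AI Pipelines can run. Names are placeholders.
from kfp import compiler, dsl


@dsl.component(base_image="python:3.11")
def train_model(train_uri: str, metrics: dsl.Output[dsl.Metrics]) -> None:
    # A real component would read data from GCS/BigQuery and fit a model here.
    metrics.log_metric("accuracy", 0.92)  # placeholder metric


@dsl.pipeline(name="demo-training-pipeline")
def training_pipeline(train_uri: str = "gs://example-bucket/train.csv"):
    train_model(train_uri=train_uri)


if __name__ == "__main__":
    # Produces a pipeline spec that can be submitted to Vertex AI Pipelines.
    compiler.Compiler().compile(training_pipeline, "pipeline.json")
```

The compiled pipeline.json would then be submitted to Vertex AI Pipelines for managed execution.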

Posted 3 weeks ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

Location: Gurgaon (Hybrid/On-site)
Department: Data Engineering
Reports To: Project Manager / Client Stakeholders
Type: Full-Time

About The Client
The client is a leading data and AI/ML solutions provider, partnering with organizations across India and Australia to drive business transformation through data-driven insights. With a decade-long legacy and collaborations with technology leaders like AWS, Snowflake, Google Cloud Platform (GCP), and Databricks, BluePi delivers custom solutions that help enterprises achieve higher maturity and business outcomes.

Role Overview
As a Technical Lead – Data Engineer, you will play a pivotal role in designing, developing, and leading complex data projects on Google Cloud Platform and other modern data stacks. You will partner with cross-functional teams, drive architectural decisions, and ensure the delivery of scalable, high-performance data solutions aligned with business goals.

Key Responsibilities
Lead the design, development, and implementation of robust data pipelines, data warehouses, and cloud-based architectures.
Collaborate with business and technical teams to identify problems, define methodologies, and deliver end-to-end data solutions.
Own project modules, ensuring complete accountability for scope, design, and delivery.
Develop technical roadmaps and architectural vision for data projects, making critical decisions on technology selection, design patterns, and implementation.
Implement and optimize data governance frameworks on GCP.
Integrate GCP data services (BigQuery, Dataflow, Dataproc, Cloud Composer, Vertex AI Studio, GenAI) with platforms like Snowflake.
Write efficient, production-grade code in Python, SQL, and ETL/orchestration tools.
Utilize containerized solutions (Google Kubernetes Engine) for scalable deployments.
Apply expertise in PySpark (batch and real-time), Kafka, and advanced data querying for high-volume, distributed data environments (see the streaming sketch at the end of this listing).
Monitor, optimize, and troubleshoot system performance, ensuring parallelism, concurrency, and resilience.
Reduce job run-times and resource utilization through architecture optimization.
Develop and optimize data warehouses, including schema design and data modeling.
Mentor team members, contribute as an individual contributor, and ensure successful project delivery.

Required Skills & Qualifications
Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
Extensive hands-on experience with Google Cloud Platform data services (BigQuery, Dataflow, Dataproc, Cloud Composer, Vertex AI Studio, GenAI).
Proven experience with Snowflake integration and data governance on GCP.
Strong programming skills in Python, SQL, ETL, and orchestration tools.
Proficiency in PySpark (batch and real-time), Kafka, and data querying tools.
Experience with containerized solutions using Google Kubernetes Engine.
Demonstrated ability to work with large, distributed datasets, optimizing for performance and scalability.
Excellent communication skills for effective collaboration with internal teams and client stakeholders.
Strong documentation skills, including the ability to articulate design and business objectives.
Ability to balance short-term deliverables with long-term technical sustainability.
Experience with AWS, Databricks, and other cloud data platforms.
Prior leadership experience in data engineering teams.
Exposure to AI/ML solution delivery in enterprise settings.

Why Join
Opportunity to lead high-impact data projects for a reputed client in a fast-growing data consulting environment.
Work with cutting-edge technologies and global enterprise clients.
Collaborative, innovative, and growth-oriented culture.

Skills: cloud, dataflow, design, python, sql, snowflake, data, dataproc, google cloud, etl, orchestration tools, bigquery, cloud composer, pyspark, gcp, google kubernetes engine, genai, vertex ai studio, google cloud platform, kafka, data querying tools
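As a rough illustration of the "PySpark (batch and real-time)" and Kafka skills this role calls for, here is a minimal Structured Streaming sketch; the broker address, topic, and sink paths are hypothetical, and the job assumes the spark-sql-kafka connector package is available on the cluster.

```python
# Minimal PySpark Structured Streaming sketch: consume a Kafka topic and
# land raw events as Parquet. Broker, topic, and paths are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-ingest-demo").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "orders")                     # placeholder topic
    .load()
    # Kafka delivers key/value as binary; cast to string for downstream use.
    .select(col("key").cast("string"), col("value").cast("string"))
)

query = (
    events.writeStream.format("parquet")
    .option("path", "gs://example-bucket/raw/orders")            # placeholder sink
    .option("checkpointLocation", "gs://example-bucket/chk/orders")
    .start()
)
query.awaitTermination()
```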

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description
Some careers shine brighter than others. If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Consultant Specialist.

In this role, you will:
Design and Develop ETL Processes: Lead the design and implementation of ETL processes using batch and streaming tools to extract, transform, and load data from various sources into GCP. Collaborate with stakeholders to gather requirements and ensure that ETL solutions meet business needs.
Data Pipeline Optimization: Optimize data pipelines for performance, scalability, and reliability, ensuring efficient data processing workflows. Monitor and troubleshoot ETL processes, proactively addressing issues and bottlenecks.
Data Integration and Management: Integrate data from diverse sources, including databases, APIs, and flat files, ensuring data quality and consistency. Manage and maintain data storage solutions in GCP (e.g., BigQuery, Cloud Storage) to support analytics and reporting.
GCP Dataflow Development: Write Apache Beam based Dataflow jobs for data extraction, transformation, and analysis, ensuring optimal performance and accuracy (a minimal example follows this listing). Collaborate with data analysts and data scientists to prepare data for analysis and reporting.
Automation and Monitoring: Implement automation for ETL workflows using tools like Apache Airflow or Cloud Composer, enhancing efficiency and reducing manual intervention. Set up monitoring and alerting mechanisms to ensure the health of data pipelines and compliance with SLAs.
Data Governance and Security: Apply best practices for data governance, ensuring compliance with industry regulations (e.g., GDPR, HIPAA) and internal policies. Collaborate with security teams to implement data protection measures and address vulnerabilities.
Documentation and Knowledge Sharing: Document ETL processes, data models, and architecture to facilitate knowledge sharing and onboarding of new team members. Conduct training sessions and workshops to share expertise and promote best practices within the team.

Requirements
To be successful in this role, you should meet the following requirements:
Experience: Minimum of 5 years of industry experience in data engineering or ETL development, with a strong focus on DataStage and GCP. Proven experience in designing and managing ETL solutions, including data modeling, data warehousing, and SQL development.
Technical Skills: Strong knowledge of GCP services (e.g., BigQuery, Dataflow, Cloud Storage, Pub/Sub) and their application in data engineering. Experience with cloud-based solutions, especially on GCP; a cloud-certified candidate is preferred. Experience and knowledge of big data processing in batch and streaming modes, with proficiency in big data ecosystems, e.g., Hadoop, HBase, Hive, MapReduce, Kafka, Flink, Spark, etc. Familiarity with Java and Python for data manipulation on cloud/big data platforms.
Analytical Skills: Strong problem-solving skills with keen attention to detail. Ability to analyze complex data sets and derive meaningful insights.

You’ll achieve more when you join HSBC. www.hsbc.com/careers

HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.

Issued by – HSBC Software Development India
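For context on the "Apache Beam based Dataflow jobs" responsibility, here is a minimal Beam pipeline sketch; the bucket, table, and schema are hypothetical, and running it on Dataflow would additionally require runner, project, region, and temp-location options.

```python
# Minimal Apache Beam sketch: read CSV lines from GCS, parse, and write to
# BigQuery. Bucket, dataset, and schema are placeholders; pass
# --runner=DataflowRunner plus --project/--region/--temp_location to run
# the same code on Dataflow instead of the local runner.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_line(line: str) -> dict:
    user_id, amount = line.split(",")
    return {"user_id": user_id, "amount": float(amount)}


def run() -> None:
    options = PipelineOptions()  # picks up runner/project flags from argv
    with beam.Pipeline(options=options) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText("gs://example-bucket/input/*.csv")
            | "Parse" >> beam.Map(parse_line)
            | "Write" >> beam.io.WriteToBigQuery(
                "example-project:demo.transactions",
                schema="user_id:STRING,amount:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```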

Posted 3 weeks ago

Apply

4.0 - 6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description
Some careers shine brighter than others. If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Marketing Title.

In this role, you will:
Perform system development work around ETL, which can include both the development of new functions and facilities and the ongoing support of live systems.
Be responsible for the documentation, coding, and maintenance of new and existing Extract, Transform, and Load (ETL) processes within the Enterprise Data Warehouse.
Investigate live systems faults, diagnose problems, and propose and provide solutions.
Work closely with various teams to design, build, test, deploy and maintain insightful MI reports.
Support System Acceptance Testing, System Integration and Regression Testing.
Identify any issues that may pose a delivery risk, formulate preventive actions or corrective measures, and escalate major project risks and issues to the service owner in a timely manner.
Execute test cases and log defects.
Be proactive in understanding the existing system, identifying areas for improvement, and taking ownership of assigned tasks.
Work independently with minimal supervision while ensuring timely delivery of tasks.

Requirements
To be successful in this role, you should meet the following requirements:
4-6 years of experience in Data Warehousing, specialized in ETL. Given that the current team is highly technical in nature, the expectation is that the candidate has experience in technologies like DataStage, Teradata Vantage, Unix scripting, and scheduling using Control-M and DevOps tools.
Good knowledge of SQL and the demonstrated ability to write efficient, optimized queries (a small BigQuery client sketch follows this listing).
Hands-on experience with or knowledge of GCP’s data storage and processing services, such as BigQuery, Dataflow, Bigtable, Cloud Spanner and Cloud SQL, would be an added advantage.
Hands-on experience with Unix, Git and Jenkins would be an added advantage.
Ability to develop and implement solutions both on-premises and on Google Cloud Platform (GCP), including conducting migrations, where necessary, to bring tools and other elements into the cloud, and performing software upgrades.
Proficiency in using JIRA and Confluence, and experience working on projects that follow Agile methodologies.

You’ll achieve more when you join HSBC. www.hsbc.com/careers

HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.

Issued by – HSBC Software Development India
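As a small, hypothetical illustration of the BigQuery-side SQL work referenced above, here is a sketch using the google-cloud-bigquery Python client; the project, table, and query are placeholders.

```python
# Minimal google-cloud-bigquery sketch: run a parameterized query and iterate
# over the results. Project, dataset, and table names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # placeholder project

job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("min_amount", "FLOAT64", 100.0)
    ]
)
sql = """
    SELECT user_id, SUM(amount) AS total
    FROM `example-project.demo.transactions`
    WHERE amount >= @min_amount
    GROUP BY user_id
"""
for row in client.query(sql, job_config=job_config).result():
    print(row.user_id, row.total)
```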

Posted 3 weeks ago

Apply

1.0 years

4 - 7 Lacs

Haryāna

On-site

Job Title: Nursing/Health Care Assistant
Location: Oman
Employment Type: Full-Time (rotational shifts, weekend availability)
Salary: 250 to 300 OMR per month
Reports To: RNs / LPNs / Nurse Manager

Job Summary
We are seeking a compassionate and dedicated Nursing/Health Care Assistant to support our nursing and rehabilitation team in delivering exceptional patient care. Under the supervision of RNs/LPNs, you will assist with daily living activities, monitor vital signs, maintain hygiene and safety, support therapy sessions, manage feeding and incontinence, perform light housekeeping, and assist with admissions, transfers, and transportation.

Key Responsibilities
1. Personal Care & Activities of Daily Living
Assist patients with bathing, grooming, dressing, toileting, and incontinence care.
Support mobility: transfers, ambulation, positioning, turning to prevent bedsores, and range-of-motion exercises.
Provide tube feeding and feeding assistance when necessary.
2. Observation & Monitoring
Measure and record vital signs (BP, pulse, temperature, respiration) and intake/output per shift.
Observe and document changes in behaviour, mood, physical condition, or signs of distress/aggression, and report promptly.
Assist in restraining patients as per rehabilitation protocols.
3. Therapeutic Support
Aid physiotherapists and participate in group or individual therapy sessions.
Escort patients in emergency and non-emergency situations within the facility or to outpatient (OPD) appointments and events.
4. Medical & Equipment Care
Support light medical tasks under supervision (e.g., non-sterile dressings, routine equipment/supply care).
Perform inventory checks and ensure medical supplies/equipment are organized and functional.
5. Environment & Safety
Ensure patient rooms are clean and hygienic: change linens, sanitize equipment, tidy rooms.
Maintain infection control, follow health & safety protocols, and supervise patients to prevent falls or harm.
6. Admissions, Transfers & Documentation
Assist with patient admissions, transfers, and discharges.
Accurately record care activities, observations, vitals, feeding, and output in patient charts.
7. Emotional & Companionship Support
Provide compassionate companionship, basic patient education, and emotional support.

Qualifications & Skills
ANM diploma (2-year), CNA/Healthcare Assistant certification, GNM/BSc, or a relevant qualification.
1–3 years of healthcare experience minimum; 3+ years preferred.
CPR/BLS certification advantageous.
Valid Dataflow clearance (for international candidates).
Strong interpersonal, communication, empathy, and confidentiality skills.
Physically able to lift up to ~50 lbs, stand for long periods, and perform patient transfers.

Working Hours & Benefits
Schedule: Rotational shifts; weekend availability.
Benefits:
Free joining ticket (reimbursed after the 3-month probation period)
30 days’ paid annual leave after 1 year of service completion
Yearly up-and-down air ticket
Medical insurance
Life insurance
Accommodation (chargeable up to OMR 20/-)

Note: Interested candidates, please call us at 97699 11050 or 99302 65888, or email your CV to recruitment@thegrowthhive.org.

Job Type: Full-time
Pay: ₹40,000.00 - ₹60,000.00 per month
Benefits: Food provided; health insurance; Provident Fund
Schedule: Monday to Friday; rotational shift; weekend availability
Work Location: In person

Posted 3 weeks ago

Apply

6.0 years

5 - 9 Lacs

Bengaluru

On-site

Candidates for this position are preferred to be based in Bangalore, India and will be expected to comply with their team's hybrid work schedule requirements.

Who We Are:
Wayfair runs the largest custom e-commerce large parcel network in the United States, with approximately 1.6 million square meters of logistics space. The network is an inherently variable ecosystem that requires flexible, reliable, and resilient systems to operate efficiently. We are looking for a passionate Backend Software Engineer to join the Fulfillment Optimization team. This team builds the platforms that determine how customer orders are fulfilled, optimizing for Wayfair profitability and customer delight. A big part of our work revolves around enhancing and scaling customer-facing platforms that provide fulfillment information on our websites, starting at the top of the customer funnel on the search pages all the way through orders being delivered. Throughout this customer journey, we are responsible for maintaining an accurate representation of our dynamic supply chain, determining how different products will fit into boxes, predicting how these boxes will flow through warehouses and trucks, and ultimately surfacing the information our customers need to inform their decisions and the details our suppliers and carriers require to successfully execute on the promises made to our customers. We do all of this in milliseconds, thousands of times per second.

About the Role:
As a Data Engineer, you will be part of the Data Engineering team. This role is inherently multi-functional: the ideal candidate will work with Data Scientists, Analysts, and application teams across the company, as well as all other Data Engineering squads at Wayfair. We are looking for someone with a love for data, clarity in understanding requirements, and the ability to iterate quickly. Successful candidates will have strong engineering and communication skills and a belief that data-driven processes lead to phenomenal products.

What you'll do:
Build and launch data pipelines and data products focused on the SMART Org, helping teams push the boundaries of insights, creating new product features using data, and powering machine learning models.
Build cross-functional relationships to understand data needs, build key metrics, and standardize their usage across the organization.
Utilize current and leading-edge technologies in software engineering, big data, streaming, and cloud infrastructure (see the orchestration sketch at the end of this listing).

What You'll Need:
Bachelor's/Master's degree in Computer Science or a related technical subject area, or an equivalent combination of education and experience.
6+ years of relevant work experience in the Data Engineering field with web-scale data sets.
Demonstrated strength in data modeling, ETL development, and data lake architecture.
Data warehousing experience with big data technologies (Hadoop, Spark, Hive, Presto, Airflow, etc.).
Coding proficiency in at least one modern programming language (Python, Scala, etc.).
Experience building and operating highly available, distributed systems for data extraction, ingestion, and processing, with query performance tuning skills for large data sets.
Industry experience as a Big Data Engineer working alongside cross-functional teams such as Software Engineering, Analytics, and Data Science, with a track record of manipulating, processing, and extracting value from large datasets.
Strong business acumen.
Experience leading large-scale data warehousing and analytics projects, including using GCP technologies (BigQuery, Dataproc, GCS, Cloud Composer, Dataflow) or related big data technologies in other cloud platforms such as AWS and Azure.
Being a team player who introduces and follows best practices in the data engineering space.
Ability to effectively communicate (both written and verbally) technical information and the results of engineering design at all levels of the organization.

Good to have:
Understanding of NoSQL databases and pub-sub architecture setup.
Familiarity with BI tools like Looker, Tableau, AtScale, PowerBI, or similar.

About Wayfair Inc.
Wayfair is one of the world’s largest online destinations for the home. Whether you work in our global headquarters in Boston, or in our warehouses or offices throughout the world, we’re reinventing the way people shop for their homes. Through our commitment to industry-leading technology and creative problem-solving, we are confident that Wayfair will be home to the most rewarding work of your career. If you’re looking for rapid growth, constant learning, and dynamic challenges, then you’ll find that amazing career opportunities are knocking. No matter who you are, Wayfair is a place you can call home. We’re a community of innovators, risk-takers, and trailblazers who celebrate our differences, and know that our unique perspectives make us stronger, smarter, and well-positioned for success. We value and rely on the collective voices of our employees, customers, community, and suppliers to help guide us as we build a better Wayfair – and world – for all. Every voice, every perspective matters. That’s why we’re proud to be an equal opportunity employer. We do not discriminate on the basis of race, color, ethnicity, ancestry, religion, sex, national origin, sexual orientation, age, citizenship status, marital status, disability, gender identity, gender expression, veteran status, genetic information, or any other legally protected characteristic. Your personal data is processed in accordance with our Candidate Privacy Notice (https://www.wayfair.com/careers/privacy). If you have any questions or wish to exercise your rights under applicable privacy and data protection laws, please contact us at dataprotectionofficer@wayfair.com.
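To ground the orchestration tooling this posting mentions (Airflow and, by extension, Cloud Composer), here is a minimal Airflow DAG sketch; the DAG ID, schedule, and task logic are hypothetical, and the schedule= argument assumes Airflow 2.4 or later.

```python
# Minimal Apache Airflow DAG sketch: a daily two-step pipeline where an
# extract task feeds a transform task. IDs and logic are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract() -> None:
    print("pull raw data from source systems")  # placeholder work


def transform() -> None:
    print("build derived tables / metrics")  # placeholder work


with DAG(
    dag_id="daily_metrics_demo",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task  # transform runs only after extract succeeds
```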

Posted 3 weeks ago

Apply

0 years

9 - 9 Lacs

Bengaluru

On-site

Associate - Production Support Engineer

Job ID: R0388737
Full/Part-Time: Full-time
Regular/Temporary: Regular
Listed: 2025-06-27
Location: Bangalore

Position Overview
Job Title: Associate - Production Support Engineer
Location: Bangalore, India

Role Description
You will be operating within Corporate Bank Production as an Associate, Production Support Engineer in the Corporate Banking subdivisions. You will be accountable for driving a culture of proactive continual improvement in the Production environment through application and user-request support, troubleshooting and resolving errors in the production environment, automation of manual work, monitoring improvements, and platform hygiene. You will support the resolution of issues and conflicts and prepare reports and meetings. The candidate should have experience with all the relevant tools used in the Service Operations environment, have specialist expertise in one or more technical domains, and ensure that all associated Service Operations stakeholders are provided with an optimum level of service in line with Service Level Agreements (SLAs) / Operating Level Agreements (OLAs).
Ensure all BAU support queries from the business are handled on priority and within the agreed SLA, and ensure all application stability issues are well taken care of.
Support the resolution of incidents and problems within the team. Assist with the resolution of complex incidents. Ensure that the right problem-solving techniques and processes are applied.
Embrace a Continuous Service Improvement approach to resolve IT failings, drive efficiencies and remove repetition to streamline support activities, reduce risk, and improve system availability.
Be responsible for your own engineering delivery and, using data and analytics, drive a reduction in technical debt across the production environment with development and infrastructure teams.
Act as a Production Engineering role model to enhance the technical capability of the Production Support teams and to create a future operating model embedded with an engineering culture.

Deutsche Bank’s Corporate Bank division is a leading provider of cash management, trade finance and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.

What we’ll offer you
As part of our flexible scheme, here are just some of the benefits that you’ll enjoy:
Best-in-class leave policy
Gender-neutral parental leaves
100% reimbursement under childcare assistance benefit (gender neutral)
Sponsorship for industry-relevant certifications and education
Employee Assistance Program for you and your family members
Comprehensive hospitalization insurance for you and your dependents
Accident and term life insurance
Complimentary health screening for those 35 yrs. and above

Your key responsibilities
Lead by example to drive a culture of proactive continual improvement in the Production environment through automation of manual work, monitoring improvements and platform hygiene.
Carry out technical analysis of the Production platform to identify and remediate performance and resiliency issues.
Engage in the Software Development Lifecycle (SDLC) to enhance Production standards and controls.
Update the run book and KEDB as and when required.
Participate in all BCP and component failure tests based on the run books.
Understand the flow of data through the application infrastructure; understanding the dataflow is critical to providing the best operational support.
Perform event monitoring and management via a 24x7 workbench that both monitors and regularly probes the service environment, acting on the instructions of the run book (a simple health-probe sketch of the kind this work automates follows this listing).
Drive knowledge management across the supported applications and ensure full compliance.
Work with team members to identify areas of focus where training may improve team performance and incident resolution.

Your skills and experience
Recent experience of applying technical solutions to improve the stability of production environments.
Working experience with some of the following technology skills:
Technologies/Frameworks: Unix, Shell Scripting and/or Python; SQL stack; Oracle 12c/19c for PL/SQL, with familiarity with OEM tooling to review AWR reports and parameters; ITIL v3 Certified (must); Control-M, CRON scheduling; MQ - DBUS, IBM; JAVA 8/OpenJDK 11 (at least) for debugging, with familiarity with the Spring Boot framework; data streaming - Kafka (experience with the Confluent flavor a plus) and ZooKeeper; the Hadoop framework.
Configuration management tooling: Ansible.
Operating system/platform: RHEL 7.x (preferred), RHEL 6.x; OpenShift (as we move towards cloud computing, and Fabric is dependent on OpenShift).
CI/CD: Jenkins (preferred).
APM tooling: one or more of Splunk, AppDynamics, Geneos, NewRelic.
Other platforms: scheduling (Ctrl-M is a plus, Autosys, etc.); search (Elasticsearch and/or Solr+ is a plus).
Methodology: micro-services architecture, SDLC, Agile fundamentals.
Network topology: TCP, LAN, VPN, GSLB, GTM, etc.
Familiarity with TDD and/or BDD. Distributed systems. Experience on cloud platforms such as Azure or GCP is a plus. Familiarity with containerization/Kubernetes.
Tools: ServiceNow, Jira, Confluence, BitBucket and/or Git, IntelliJ, SQL Plus, simple Unix tooling (putty, mPutty, Exceed), (PL/)SQL Developer.
Good understanding of the ITIL Service Management framework, including Incident, Problem, and Change processes.
Ability to self-manage a book of work and ensure clear transparency on progress, with clear, timely communication of issues.
Excellent communication skills, both written and verbal, with attention to detail.
Ability to work in a Follow-the-Sun model, in virtual teams and in a matrix structure.
Service Operations experience within a global operations context.
6-9 years of experience in IT in large corporate environments, specifically in the area of controlled production environments or in Financial Services Technology in a client-facing function.
Global Transaction Banking experience is a plus.
Experience of end-to-end Level 2/3/4 management and a good overview of Production/Operations Management overall.
Experience of run-book execution.
Experience of supporting complex application and infrastructure domains.
Good analytical, troubleshooting and problem-solving skills.
Working knowledge of incident tracking tools (i.e., Remedy, Heat, etc.).

How we’ll support you
Training and development to help you excel in your career
Coaching and support from experts in your team
A culture of continuous learning to aid progression
A range of flexible benefits that you can tailor to suit your needs

About us and our teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
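As a flavor of the run-book automation this role describes, here is a minimal Python health-probe sketch of the sort a scheduler such as cron or Control-M might invoke; the endpoint URL and latency threshold are hypothetical.

```python
# Minimal health-probe sketch: hit a service endpoint, time the response,
# and exit non-zero on failure so a scheduler can raise an alert.
# The URL and latency threshold are placeholders.
import sys
import time
import urllib.request

URL = "http://service.example.internal/health"  # placeholder endpoint
MAX_LATENCY_SECS = 2.0                          # placeholder threshold


def probe() -> int:
    start = time.monotonic()
    try:
        with urllib.request.urlopen(URL, timeout=5) as resp:
            latency = time.monotonic() - start
            if resp.status != 200 or latency > MAX_LATENCY_SECS:
                print(f"DEGRADED status={resp.status} latency={latency:.2f}s")
                return 1
            print(f"OK latency={latency:.2f}s")
            return 0
    except Exception as exc:  # timeouts, DNS failures, connection resets
        print(f"DOWN error={exc}")
        return 2


if __name__ == "__main__":
    sys.exit(probe())
```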

Posted 3 weeks ago

Apply

5.0 years

3 - 9 Lacs

Chennai

On-site

The Industrial System Analytics (ISA) team within GDIA develops cutting-edge cloud analytic solutions using GCP tools and techniques to drive strategic insights across Ford. As a Product Owner (Supervisor), you will be a critical leader within our product-driven organization. You will be responsible for defining, prioritizing, and delivering high-value data products and analytical solutions that directly address key business challenges. This role requires a strong blend of strategic product thinking, hands-on agile execution, and the ability to lead, mentor, and guide your team (or cross-functional teams) to achieve exceptional outcomes in a dynamic, data-intensive environment.

You’ll have…
Bachelor's degree in a quantitative field such as Computer Science, Engineering, Information Systems, Business Analytics, or a related discipline.
5+ years of experience as a Product Owner, Business Analyst, or similar role managing digital products or data solutions.
Demonstrated experience in defining product roadmaps, managing backlogs, and prioritizing features.
Proven experience working within an Agile software development environment.
Experience gathering and translating business requirements into technical specifications and user stories.
Strong understanding of data analytics, AI/ML concepts, and how they can drive business value.
Familiarity with cloud platforms, preferably Google Cloud Platform (GCP) services (e.g., BigQuery, GCS, Dataflow).
Excellent communication, interpersonal, and stakeholder management skills.

Even better, you may have…
Master's degree or PhD in a quantitative field.
Experience supervising or mentoring other Product Owners or team members.
Hands-on experience with data visualization tools (e.g., Tableau, Power BI, Looker).
Proficiency in SQL and/or scripting languages (e.g., Python) for data exploration (see the short exploration sketch at the end of this listing).
Knowledge of Ford's internal data ecosystems or IT systems.
Experience with DevSecOps practices and tools (e.g., CI/CD pipelines, Jira, GitHub).
Certified Scrum Product Owner (CSPO) or similar Agile certification.
Proven ability to balance "doing it right" with "speed to delivery" in a fast-paced environment.
Inquisitive, proactive, and interested in learning new tools and techniques.

Product Strategy & Vision:
Translate high-level business objectives and customer needs into a clear product vision, strategy, and measurable outcomes for your product area.
Communicate product vision and strategy effectively to the development team, stakeholders, and leadership, ensuring alignment and buy-in.
Gather and analyze customer/internal feedback to continuously refine the product roadmap and drive improvements.

Backlog Management & Prioritization:
Own, define, and prioritize the product backlog, ensuring it is well-groomed with clear, actionable user stories and acceptance criteria.
Collaborate closely with engineering, data science, and UX teams to refine requirements and ensure technical feasibility and optimal solution design.
Manage interdependencies across features and product releases, identifying and proactively mitigating risks to delivery.

Stakeholder Collaboration & Communication:
Act as the primary liaison between business stakeholders, customers, and the development team, fostering strong relationships.
Translate complex technical concepts into understandable business language and vice versa, facilitating effective decision-making.
Manage stakeholder expectations and provide regular, transparent updates on product progress, risks, and achievements.
Act as a strategic consultant to the business, guiding them towards optimal data-driven solutions rather than just fulfilling requests.

Product Delivery & Quality Assurance:
Ensure that delivered software and analytical solutions meet desired business outcomes, quality standards, and compliance requirements (e.g., security, legal, Ford policies).
Collaborate with the team to define relevant analytics and metrics to track product performance, adoption, and realized business value.
Facilitate user acceptance testing and feedback loops to ensure product adoption and satisfaction.

Agile Leadership & Process Improvement:
Champion Agile software development principles, culture, and best practices within your team and across the organization.
Lead and facilitate team ceremonies (e.g., sprint planning, reviews, retrospectives) to ensure efficient and effective delivery.
Mentor, coach, and guide team members (including junior Product Owners, if applicable, or cross-functional team members) in product ownership best practices, problem-solving, and continuous improvement.
Ensure effective usage of agile tools (e.g., Jira) and derive meaningful insights for continuous improvement of processes and delivery.
Drive adoption of DevSecOps and software craftsmanship practices (CI/CD, TDD) where applicable.
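To illustrate the lightweight SQL/Python data exploration listed under the preferred skills, here is a minimal sketch pulling a BigQuery aggregate into pandas; the project and table names are hypothetical.

```python
# Minimal data-exploration sketch: pull an aggregate from BigQuery into a
# pandas DataFrame for quick inspection. Project/table are placeholders;
# .to_dataframe() requires the db-dtypes package alongside google-cloud-bigquery.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # placeholder project

sql = """
    SELECT status, COUNT(*) AS n
    FROM `example-project.demo.orders`
    GROUP BY status
    ORDER BY n DESC
"""
df = client.query(sql).to_dataframe()
print(df.head())
```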

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Udaipur, Rajasthan, India

On-site

For a quick response, please fill out the form: Job Application Form 34043 - Data Scientist - Senior I - Udaipur
https://docs.google.com/forms/d/e/1FAIpQLSeBy7r7b48Yrqz4Ap6-2g_O7BuhIjPhcj-5_3ClsRAkYrQtiA/viewform

3–5 years of experience in Data Engineering or similar roles
Strong foundation in cloud-native data infrastructure and scalable architecture design
Build and maintain reliable, scalable ETL/ELT pipelines using modern cloud-based tools
Design and optimize Data Lakes and Data Warehouses for real-time and batch processing
Ingest, transform, and organize large volumes of structured and unstructured data
Collaborate with analysts, data scientists, and backend engineers to define data needs
Monitor, troubleshoot, and improve pipeline performance, cost-efficiency, and reliability
Implement data validation, consistency checks, and quality frameworks (a small validation sketch follows this listing)
Apply data governance best practices and ensure compliance with privacy and security standards
Use CI/CD tools to deploy workflows and automate pipeline deployments
Automate repetitive tasks using scripting, workflow tools, and scheduling systems
Translate business logic into data logic while working cross-functionally
Strong in Python and familiar with libraries like pandas and PySpark
Hands-on experience with at least one major cloud provider (AWS, Azure, GCP)
Experience with ETL tools like AWS Glue, Azure Data Factory, GCP Dataflow, or Apache NiFi
Proficient with storage systems like S3, Azure Blob Storage, GCP Cloud Storage, or HDFS
Familiar with data warehouses like Redshift, BigQuery, Snowflake, or Synapse
Experience with serverless computing like AWS Lambda, Azure Functions, or GCP Cloud Functions
Familiar with data streaming tools like Kafka, Kinesis, Pub/Sub, or Event Hubs
Proficient in SQL, with knowledge of relational (PostgreSQL, MySQL) and NoSQL (MongoDB, DynamoDB) databases
Familiar with big data frameworks like Hadoop or Apache Spark
Experience with orchestration tools like Apache Airflow, Prefect, GCP Workflows, or ADF Pipelines
Familiarity with CI/CD tools like GitLab CI, Jenkins, Azure DevOps
Proficient with Git, GitHub, or GitLab workflows
Strong communication, collaboration, and problem-solving mindset
Experience with data observability or monitoring tools (bonus points)
Contributions to internal data platform development (bonus points)
Comfort working in data mesh or distributed data ownership environments (bonus points)
Experience building data validation pipelines with Great Expectations or similar tools (bonus points)
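To ground the data-validation bullet above, here is a minimal pandas sketch of the consistency checks such a pipeline might run; the columns and rules are hypothetical, and a production pipeline might use Great Expectations or a similar framework instead.

```python
# Minimal data-validation sketch: run simple consistency checks on a
# DataFrame before loading it downstream. Columns and rules are placeholders.
import pandas as pd

df = pd.DataFrame(
    {"order_id": [1, 2, 3], "amount": [10.0, 25.5, 7.25], "country": ["IN", "US", "IN"]}
)

checks = {
    "order_id is unique": df["order_id"].is_unique,
    "amount is non-negative": bool((df["amount"] >= 0).all()),
    "country has no nulls": bool(df["country"].notna().all()),
}

failures = [name for name, passed in checks.items() if not passed]
if failures:
    raise ValueError(f"validation failed: {failures}")
print("all checks passed")
```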

Posted 3 weeks ago

Apply

6.0 - 10.0 years

30 - 35 Lacs

Bengaluru

Work from Office

We are seeking an experienced PySpark Developer / Data Engineer to design, develop, and optimize big data processing pipelines using Apache Spark and Python (PySpark). The ideal candidate should have expertise in distributed computing, ETL workflows, data lake architectures, and cloud-based big data solutions.

Key Responsibilities:
Develop and optimize ETL/ELT data pipelines using PySpark on distributed computing platforms (Hadoop, Databricks, EMR, HDInsight).
Work with structured and unstructured data to perform data transformation, cleansing, and aggregation.
Implement data lake and data warehouse solutions on AWS (S3, Glue, Redshift), Azure (ADLS, Synapse), or GCP (BigQuery, Dataflow).
Optimize PySpark jobs using performance tuning, partitioning, and caching strategies (see the sketch at the end of this listing).
Design and implement real-time and batch data processing solutions.
Integrate data pipelines with Kafka, Delta Lake, Iceberg, or Hudi for streaming and incremental updates.
Ensure data security, governance, and compliance with industry best practices.
Work with data scientists and analysts to prepare and process large-scale datasets for machine learning models.
Collaborate with DevOps teams to deploy, monitor, and scale PySpark jobs using CI/CD pipelines, Kubernetes, and containerization.
Perform unit testing and validation to ensure data integrity and reliability.

Required Skills & Qualifications:
6+ years of experience in big data processing, ETL, and data engineering.
Strong hands-on experience with PySpark (Apache Spark with Python).
Expertise in SQL, the DataFrame API, and RDD transformations.
Experience with big data platforms (Hadoop, Hive, HDFS, Spark SQL).
Knowledge of cloud data processing services (AWS Glue, EMR, Databricks, Azure Synapse, GCP Dataflow).
Proficiency in writing optimized queries, partitioning, and indexing for performance tuning.
Experience with workflow orchestration tools like Airflow, Oozie, or Prefect.
Familiarity with containerization and deployment using Docker, Kubernetes, and CI/CD pipelines.
Strong understanding of data governance, security, and compliance (GDPR, HIPAA, CCPA, etc.).
Excellent problem-solving, debugging, and performance optimization skills.
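To make the partitioning and caching strategies above concrete, here is a minimal PySpark sketch; the input paths, partition count, and column names are hypothetical.

```python
# Minimal PySpark tuning sketch: repartition on the join key, broadcast the
# small dimension table, and cache a reused DataFrame. Paths and columns
# are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("tuning-demo").getOrCreate()

orders = spark.read.parquet("s3://example-bucket/orders/")        # large fact
countries = spark.read.parquet("s3://example-bucket/countries/")  # small dim

# Repartition the large table on the join key to reduce shuffle skew,
# then cache it because it feeds two downstream aggregations.
orders = orders.repartition(200, "country_code").cache()

# Broadcast the small table so the join avoids a full shuffle.
enriched = orders.join(broadcast(countries), on="country_code")

enriched.groupBy("country_name").count().show()
orders.groupBy("status").count().show()  # second use reads from the cache
```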

Posted 3 weeks ago

Apply