8424 Hadoop Jobs

Set up a job alert
JobPe aggregates listings for easy access in one place; you apply directly on the original job portal.

0.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Associate

Job Description & Summary
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Responsibilities
· 0-2 years of experience as an AI/ML engineer or in a similar role.
· Strong knowledge of machine learning frameworks (e.g., TensorFlow, PyTorch, scikit-learn).
· Hands-on experience with model development and deployment processes.
· Proficiency in programming languages such as Python.
· Experience with data preprocessing, feature engineering, and model evaluation techniques.
· Familiarity with cloud platforms (e.g., AWS) and containerization (e.g., Docker, Kubernetes).
· Familiarity with version control systems (e.g., GitHub).
· Proficiency in data manipulation and analysis using libraries such as NumPy and pandas.
· Good to have: knowledge of deep learning and MLOps tooling (Kubeflow, MLflow, Nextflow).
· Knowledge of text analytics, NLP, and generative AI.

Mandatory skill sets: MLOps, AI/ML
Preferred skill sets: MLOps, AI/ML
Years of experience required: 0-2
Education qualification: B.Tech / M.Tech / MBA / MCA
Degrees/Field of Study required: Master of Engineering, Bachelor of Engineering, Master of Business Administration
Required skills: Data Science
Optional skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline, Data Quality, Data Strategy {+ 22 more}
Travel requirements: Not specified
Available for work visa sponsorship? No
Government clearance required? No
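For illustration alongside the skills listed above, here is a minimal sketch of a train, evaluate, and track loop, assuming scikit-learn and MLflow as named in the posting; the dataset, hyperparameters, and run name are placeholders, not part of the listing.

```python
# Minimal sketch of a train/evaluate/track loop of the kind this role describes.
# Assumes scikit-learn and MLflow are installed; dataset and run name are illustrative.
import mlflow
import mlflow.sklearn
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, f1_score
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

with mlflow.start_run(run_name="rf-baseline"):
    params = {"n_estimators": 200, "max_depth": 5, "random_state": 42}
    model = RandomForestClassifier(**params).fit(X_train, y_train)
    preds = model.predict(X_test)

    # Log hyperparameters, evaluation metrics, and the fitted model for reproducibility.
    mlflow.log_params(params)
    mlflow.log_metric("accuracy", accuracy_score(y_test, preds))
    mlflow.log_metric("f1", f1_score(y_test, preds))
    mlflow.sklearn.log_model(model, "model")
```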

Posted 16 hours ago

Apply

3.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Responsibilities
· Design, develop, and maintain data pipelines using Azure Data Factory (ADF) for ingestion, transformation, and integration of data from various energy systems.
· Implement workflows using Logic Apps and Function Apps to automate data processing and system interactions.
· Manage and optimize data storage using Azure Data Lake Storage Gen2 (ADLS Gen2) and Azure SQL databases.
· Integrate real-time and batch data from IoT devices and energy platforms using Event Hub and Service Bus.
· Collaborate with cross-functional teams to support data needs across power distribution, automation, and monitoring solutions.
· Ensure data quality, security, and compliance with industry standards and regulations.
· Document data architecture, workflows, and best practices for operational transparency and scalability.

Mandatory skill sets: Azure Data Factory, Logic Apps, Function Apps, Azure SQL, Event Hub, Service Bus, and ADLS
Preferred skill sets: Python, SQL, and data modeling
Years of experience required: 3 to 10 years
Education qualification: Bachelor's degree in computer science, data science, or another engineering discipline; a master's degree is a plus.
Degrees/Field of Study required: Bachelor Degree, Master Degree
Required skills: Data Engineering
Optional skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 27 more}
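As a hedged illustration of the pipeline work above, here is a minimal Python sketch of a transform-and-load step that an ADF pipeline might invoke; the file, table, column, and connection names are hypothetical and not taken from the posting.

```python
# Hypothetical sketch of one pipeline step this role describes: clean a batch of
# IoT meter readings and load them into Azure SQL. In practice ADF would orchestrate
# this step; all file, table, and connection names below are illustrative only.
import pandas as pd
import pyodbc

def load_meter_readings(csv_path: str, conn_str: str) -> int:
    df = pd.read_csv(csv_path, parse_dates=["reading_time"])

    # Basic data-quality gates before load: drop duplicates and impossible values.
    df = df.drop_duplicates(subset=["meter_id", "reading_time"])
    df = df[df["kwh"] >= 0]

    with pyodbc.connect(conn_str) as conn:
        cur = conn.cursor()
        cur.fast_executemany = True  # batch insert for large files
        cur.executemany(
            "INSERT INTO dbo.MeterReadings (meter_id, reading_time, kwh) VALUES (?, ?, ?)",
            list(df[["meter_id", "reading_time", "kwh"]].itertuples(index=False, name=None)),
        )
        conn.commit()
    return len(df)
```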

Posted 16 hours ago

Apply

3.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Responsibilities
· Design, develop, and maintain data pipelines using Azure Data Factory (ADF) for ingestion, transformation, and integration of data from various energy systems.
· Implement workflows using Logic Apps and Function Apps to automate data processing and system interactions.
· Manage and optimize data storage using Azure Data Lake Storage Gen2 (ADLS Gen2) and Azure SQL databases.
· Integrate real-time and batch data from IoT devices and energy platforms using Event Hub and Service Bus.
· Collaborate with cross-functional teams to support data needs across power distribution, automation, and monitoring solutions.
· Ensure data quality, security, and compliance with industry standards and regulations.
· Document data architecture, workflows, and best practices for operational transparency and scalability.

Mandatory skill sets: Azure Data Factory, Logic Apps, Function Apps, Azure SQL, Event Hub, Service Bus, and ADLS
Preferred skill sets: Python, SQL, and data modeling
Years of experience required: 3 to 10 years
Education qualification: Bachelor's degree in computer science, data science, or another engineering discipline; a master's degree is a plus.
Degrees/Field of Study required: Bachelor Degree, Master Degree
Required skills: Data Engineering
Optional skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 27 more}

Posted 16 hours ago

Apply

3.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Responsibilities
· Design, develop, and maintain data pipelines using Azure Data Factory (ADF) for ingestion, transformation, and integration of data from various energy systems.
· Implement workflows using Logic Apps and Function Apps to automate data processing and system interactions.
· Manage and optimize data storage using Azure Data Lake Storage Gen2 (ADLS Gen2) and Azure SQL databases.
· Integrate real-time and batch data from IoT devices and energy platforms using Event Hub and Service Bus.
· Collaborate with cross-functional teams to support data needs across power distribution, automation, and monitoring solutions.
· Ensure data quality, security, and compliance with industry standards and regulations.
· Document data architecture, workflows, and best practices for operational transparency and scalability.

Mandatory skill sets: Azure Data Factory, Logic Apps, Function Apps, Azure SQL, Event Hub, Service Bus, and ADLS
Preferred skill sets: Python, SQL, and data modeling
Years of experience required: 3 to 10 years
Education qualification: Bachelor's degree in computer science, data science, or another engineering discipline; a master's degree is a plus.
Degrees/Field of Study required: Bachelor Degree, Master Degree
Required skills: Data Engineering
Optional skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 27 more}

Posted 16 hours ago

Apply

3.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Responsibilities
· Design, develop, and maintain data pipelines using Azure Data Factory (ADF) for ingestion, transformation, and integration of data from various energy systems.
· Implement workflows using Logic Apps and Function Apps to automate data processing and system interactions.
· Manage and optimize data storage using Azure Data Lake Storage Gen2 (ADLS Gen2) and Azure SQL databases.
· Integrate real-time and batch data from IoT devices and energy platforms using Event Hub and Service Bus.
· Collaborate with cross-functional teams to support data needs across power distribution, automation, and monitoring solutions.
· Ensure data quality, security, and compliance with industry standards and regulations.
· Document data architecture, workflows, and best practices for operational transparency and scalability.

Mandatory skill sets: Azure Data Factory, Logic Apps, Function Apps, Azure SQL, Event Hub, Service Bus, and ADLS
Preferred skill sets: Python, SQL, and data modeling
Years of experience required: 3 to 10 years
Education qualification: Bachelor's degree in computer science, data science, or another engineering discipline; a master's degree is a plus.
Degrees/Field of Study required: Bachelor Degree, Master Degree
Required skills: Data Engineering
Optional skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 27 more}

Posted 16 hours ago

Apply

5.0 years

4 - 9 Lacs

Hyderābād

Remote

Data Engineer (Remote, India)

Note: This is a full-time, remote, salaried position through Red Elk Consulting, LLC, based in India. The role is 100% dedicated to supporting Together Labs as a consultant and includes salary, benefits, vacation, and a local India-based support team.

We are seeking an experienced and motivated Data Engineer to join our dynamic data team. In this role, you will be responsible for designing, building, and maintaining scalable data pipelines, managing our data warehouse infrastructure, and supporting analytics initiatives across the organization. You will work closely with data scientists, analysts, and other stakeholders to ensure data quality, integrity, and accessibility, enabling the organization to make data-driven decisions.

RESPONSIBILITIES
· Design and Develop Data Pipelines: Architect, develop, and maintain robust and scalable data pipelines for ingesting, processing, and transforming large volumes of data from multiple sources in real-time and batch modes.
· Data Warehouse Management: Manage, optimize, and maintain the data warehouse infrastructure, ensuring data integrity, security, and availability. Oversee the implementation of best practices for data storage, partitioning, indexing, and schema design.
· ETL Processes: Design and build efficient ETL (Extract, Transform, Load) processes to move data across various systems while ensuring high performance, reliability, and scalability.
· Data Integration: Integrate diverse data sources (structured, semi-structured, and unstructured data) into a unified data model that supports analytics and reporting needs.
· Support Analytics and BI: Collaborate with data analysts, data scientists, and business intelligence teams to understand data requirements and provide data sets, models, and solutions that support their analytics needs.
· Data Quality and Governance: Establish and enforce data quality standards, governance policies, and best practices. Implement monitoring and alerting to ensure data accuracy, consistency, and completeness.
· Operational Excellence: Drive the development of automated systems for provisioning, deployment, monitoring, failover, and recovery. Implement systems to monitor key performance metrics, logs, and alerts with a focus on automation and reducing manual intervention.
· Cross-functional Collaboration: Work closely with product, engineering, and QA teams to ensure the infrastructure supports and enhances development workflows and that services are deployed and operated smoothly at scale.
· Incident Management & Root Cause Analysis: Act as a first responder to data production issues, leading post-mortems and implementing long-term solutions to prevent recurrence. Ensure all incidents are handled promptly with a focus on minimizing impact.
· Security & Compliance: Ensure our infrastructure is designed with security best practices in mind, including encryption, access control, and vulnerability scanning.
· Continuous Improvement: Stay up to date with industry trends, technologies, and best practices, bringing innovative ideas into the team to improve reliability, performance, and scale.

QUALIFICATIONS
Education & Experience:
· Bachelor's degree in Computer Science, Engineering, or a related technical field, or equivalent practical experience.
· 5+ years of experience in data engineering, with a strong background in systems architecture, distributed systems, cloud infrastructure, or a related field.
· Proven experience building and managing data pipelines, data warehouses, and ETL processes.

Technical Skills:
· Strong proficiency in SQL and experience with relational databases (e.g., PostgreSQL, MySQL, Oracle) and data warehousing solutions (e.g., Snowflake, Redshift, BigQuery).
· Expertise in data pipeline tools and frameworks (e.g., AWS Glue, Google Dataflow, Apache Airflow, Apache NiFi, dbt).
· Hands-on experience with cloud platforms and their data services (e.g., AWS, Azure, Google Cloud Platform).
· Proficiency in programming languages such as Python, Java, or Scala for data manipulation and automation.
· Knowledge of data modeling, schema design, and data governance principles.
· Familiarity with distributed data processing frameworks such as Apache Spark or Hadoop.
· Experience with BI tools (e.g., Tableau, Power BI, Looker).
· Experience with AWS and standard practices for working in cloud-based environments.

Soft Skills:
· Strong problem-solving and analytical skills with keen attention to detail.
· Excellent communication and collaboration skills, with the ability to work effectively with technical and non-technical stakeholders.
· Proactive mindset with the ability to work independently and handle multiple tasks in a fast-paced environment.

ABOUT US
Together Labs innovates technologies that empower people worldwide to connect, create, and earn in virtual worlds. Our mission is to redefine social media as a catalyst for authentic human connection through a family of products grounded in this core value. These include IMVU, the world's largest friendship discovery and social platform, and VCOIN, the first regulatory-approved transferable digital currency. For more information, please visit https://togetherlabs.com/

Founded in 2004 and based in the heart of Silicon Valley, Together Labs is led by a team dedicated to pioneering in virtual worlds. Together Labs is backed by venture investors Allegis Capital, Bridgescale Partners, and Best Buy Capital. Together Labs (formerly IMVU) has been named a Best Place to Work in Silicon Valley for nine years running.

HOW TO APPLY
Please familiarize yourself with our products and feel free to try out our core product at https://www.imvu.com/

Together Labs is an equal opportunity employer committed to fostering a culture of inclusion. Our unique differences enable us to learn, collaborate, and grow together. We welcome all applicants without regard to race, color, religious creed, sex, national origin, citizenship status, age, physical or mental disability, sexual orientation, gender identification, marital, parental, veteran or military status, unfavorable military discharge, decisions regarding reproductive health, or any other status protected by applicable federal, state, or local law.

This is a remote position.
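To make the ETL responsibilities above concrete, here is a minimal Apache Airflow DAG sketch, assuming Airflow 2.x; the DAG id, schedule, and task bodies are placeholders rather than anything specified in the listing.

```python
# Minimal Airflow DAG sketch for the extract-transform-load work described above.
# Assumes Apache Airflow 2.x; task bodies, paths, and the DAG id are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    ...  # pull raw events from a source system (API, S3, database)

def transform():
    ...  # dedupe, conform schemas, derive reporting columns

def load():
    ...  # write curated tables to the warehouse (e.g., Snowflake or Redshift)

with DAG(
    dag_id="daily_events_etl",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",  # batch mode; streaming sources would use a different pattern
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3
```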

Posted 17 hours ago

Apply

8.0 years

0 Lacs

Hyderābād

On-site

· At least 8+ years of experience and strong knowledge of the Scala programming language; able to write clean, maintainable, and efficient Scala code following best practices.
· Good knowledge of fundamental data structures and their usage.
· At least 8+ years of experience designing and developing large-scale, distributed data processing pipelines using Apache Spark and related technologies, with expertise in Spark Core, Spark SQL, and Spark Streaming.
· Experience with Hadoop, HDFS, Hive, and other big data technologies.
· Familiarity with data warehousing and ETL concepts and techniques.
· Expertise in database concepts and SQL/NoSQL operations.
· UNIX shell scripting is an added advantage for scheduling and running application jobs.
· At least 8 years of experience in project development life cycle activities and maintenance/support projects.
· Work in an Agile environment and participate in daily scrum standups, sprint planning, reviews, and retrospectives.
· Understand project requirements and translate them into technical solutions that meet the project quality standards.
· Ability to work in a team in a diverse, multi-stakeholder environment and collaborate with upstream/downstream functional teams to identify, troubleshoot, and resolve data issues.
· Strong problem-solving and analytical skills.
· Excellent verbal and written communication skills.
· Experience with, and desire to work in, a global delivery environment.
· Stay up to date with new technologies and industry trends in development.

Job Types: Full-time, Permanent, Contractual / Temporary
Pay: ₹5,000.00 - ₹9,000.00 per day
Work Location: In person
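The posting calls for Scala, but the shape of a Spark batch job is similar across language bindings; the following PySpark sketch is illustrative only, and the input path and column names are assumptions.

```python
# Illustrative Spark batch job: read raw events from HDFS, aggregate per user per
# day, and write the result back as partitioned Parquet. Paths and columns are
# hypothetical; the role itself would express this in Scala.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-usage-rollup").getOrCreate()

events = spark.read.option("header", True).csv("hdfs:///data/raw/events/")
daily = (
    events
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("user_id", "event_date")
    .agg(F.count("*").alias("events"), F.sum("bytes").alias("total_bytes"))
)
daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "hdfs:///data/curated/daily_usage/"
)
```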

Posted 17 hours ago

Apply

1.0 years

1 - 5 Lacs

Hyderābād

On-site

Are you looking for an opportunity to join a team of engineers positively affecting the experience of every consumer who uses Microsoft products? The OSSE team in the OPG group is focused on building client experiences and services that light up Microsoft Account experiences across all devices and platforms. We are passionate about working together to build delightful and inclusive account experiences that empower customers to get the most out of what Microsoft has to offer.

We are looking for a collaborative, inclusive, and customer-obsessed engineer to help us build and sustain authentication experiences like passkeys, and to engage with our customers by building experiences that help users keep their accounts secure and connected across multiple devices and applications. We are looking for an enthusiastic Software Engineer to help us build account experiences and deliver business intelligence through data for experiences across 1.5 billion Windows devices and various Microsoft products. Your responsibilities will include working closely with a variety of teams such as Engineering, Program Management, Design, and application partners to understand the key business questions for customer-facing scenarios, set up key performance indicators, and build data pipelines to identify insights and experiment ideas that move our business metrics.

Microsoft's mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond.

Responsibilities
· Enable the Windows, Developers, and Experiences team to do more with data across all aspects of the development lifecycle.
· Contribute to a data-driven culture as well as a culture of experimentation across the organization.
· Provide new platform offerings, and improve existing ones, with a fundamental understanding of the end-to-end scenarios.
· Collaborate with partner teams and customers to scope and deliver projects.
· Write secure, reliable, scalable, and maintainable code, then effectively debug it, test it, and support it.
· Author and design big data ETL pipelines in SCOPE, Scala, SQL, Python, or C#.

Qualifications
Required:
· Bachelor's degree in Computer Science or a related technical discipline with proven experience coding in languages including, but not limited to, C, C++, C#, Java, JavaScript, or Python, OR equivalent experience.
· Proven coding and debugging skills in C#, C++, Java, or SQL.
· Ability to work and communicate effectively across disciplines and teams.

Preferred:
· 1+ years of experience in data engineering.
· Understanding of, and experience with, cloud data technologies such as Azure Synapse, Azure Data Factory, SQL, Azure Data Explorer, Power BI, PowerApps, Hadoop, YARN, and Apache Spark.
· Excellent analytical skills with a systematic and structured approach to software design.

Microsoft is an equal opportunity employer. Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.

Posted 17 hours ago

Apply

3.0 - 5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

TransUnion's Job Applicant Privacy Notice

What We'll Bring
TransUnion is a global information and insights company that makes trust possible in the modern economy. We do this by providing a comprehensive picture of each person so they can be reliably and safely represented in the marketplace. As a result, businesses and consumers can transact with confidence and achieve great things. We call this Information for Good.® A leading presence in more than 30 countries across five continents, TransUnion provides solutions that help create economic opportunity, great experiences, and personal empowerment for hundreds of millions of people.

What You'll Bring
As a consultant on our team, you will join a global group of statisticians, data scientists, and industry experts on a mission to extract insights from data and put them to good use. You will have the opportunity to be part of a variety of analytical projects in a collaborative environment and be recognized for the work you deliver. TransUnion offers a culture of lifelong learning, and as an associate here, your growth potential is limitless.

The consultant role within the Research and Consulting team is responsible for delivering market-level business intelligence both to TransUnion's senior management and to Financial Services customers. You will work on projects across international markets, including Canada, Hong Kong, the UK, South Africa, the Philippines, and Colombia. To be successful in this position, you must have good organizational skills, a strategic mindset, and a flexible predisposition. You will also be expected to operate independently and to lead and present projects with minimal supervision.

How You'll Contribute
· You will develop a strong understanding of consumer credit data and how it applies to industry trends and research across different international markets.
· You will dig in by extracting data and performing segmentation and statistical analyses on large population datasets (using languages such as R, SQL, and Python on Linux and PC computing platforms).
· You will conduct analyses and quantitative research studies designed to understand complex industry trends and dynamics, leveraging a variety of statistical techniques.
· You will deliver analytic insights and recommendations in succinct and compelling presentations for internal and external customers at various levels, including an executive audience; you may lead key presentations to clients.
· You will perform multiple tasks simultaneously and deal with changing requirements and deadlines.
· You will develop strong consulting skills to help external customers by understanding their business needs and aligning them with TransUnion's product offerings and capabilities.
· You will help cultivate an environment that promotes excellence, innovation, and a collegial spirit.
· Through all these efforts, you will be a key contributor to driving the perception of TransUnion as an authority on lending dynamics and a worthwhile, trusted partner to our clients and prospects.

Impact You'll Make
· A Bachelor's or Master's degree in Statistics, Applied Mathematics, Operations Research, Economics, or an equivalent discipline.
· A minimum of 3-5 years of experience in a relevant field, such as data analytics, lending, or risk strategy.
· Advanced proficiency with one or more statistical programming languages, such as R.
· Advanced proficiency writing SQL queries for data extraction.
· Experience with big data platforms (e.g., Apache Hadoop, Apache Spark) preferred.
· Advanced experience with the MS Office suite, particularly Word, Excel, and PowerPoint.
· Strong time management skills with the ability to prioritize and contribute to multiple assignments simultaneously.
· Excellent verbal and written communication skills; you must be able to clearly articulate ideas to both technical and non-technical audiences.
· A highly analytical mindset with the curiosity to dig deeper into data, trends, and consumer behavior.
· A strong interest in banking, consumer lending, and finance is paramount, with curiosity as to why consumers act the way they do with their credit.
· A strong work ethic with a passion for team success.

This is a hybrid position and involves regular performance of job responsibilities virtually as well as in person at an assigned TU office location for a minimum of two days a week.

TransUnion Job Title: Sr Consultant, Data Analysis and Consulting
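As an illustrative sketch of the segmentation work described above (the team itself works mainly in R and SQL), here is a small pandas example that bands consumers by credit score and compares delinquency rates; the column names and cut points are hypothetical.

```python
# Hypothetical segmentation analysis: band consumers by credit score and compare
# delinquency rates per band. Column names and score bands are illustrative only.
import pandas as pd

def delinquency_by_score_band(df: pd.DataFrame) -> pd.DataFrame:
    bands = pd.cut(
        df["credit_score"],
        bins=[300, 580, 670, 740, 800, 850],
        labels=["subprime", "near-prime", "prime", "prime-plus", "super-prime"],
    )
    return (
        df.assign(score_band=bands)
          .groupby("score_band", observed=True)
          .agg(accounts=("account_id", "count"),
               delinquency_rate=("is_delinquent", "mean"))
    )
```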

Posted 17 hours ago

Apply

8.0 - 12.0 years

8 - 9 Lacs

Gurgaon

On-site

At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express.

This opportunity is in the American Express Sales and Business Enablement (SABE) Platform and Capabilities team, responsible for supporting new products and services that develop new business opportunities for Amex and expand product distribution to support enterprise value and drive revenue growth. The team is focused on meeting the business needs of internal as well as external customers by creating information products and services that leverage the American Express new-age digital ecosystem and proprietary closed-loop data.

The Enterprise Acquisition Product & Platform (EAPP) is a product journey team responsible for providing an integrated digital journey for customer onboarding across different businesses. EAPP uses a big data platform, cloud systems, APIs, and algorithms to integrate multiple Amex product, sales, risk, and information platforms, perform evaluations, and guide customers to a seamless onboarding. The current role at SABE is to enhance EAPP data quality (DQ) standards by establishing a proactive monitoring, management, and correction system across the process using enterprise and external tools.

Responsibilities
· Lead 3-4 analysts to drive key data quality initiatives for EAPP.
· Spearhead the model discussion and establish a framework in alignment with Enterprise guidelines.
· Define and execute a plan to support business-process-specific data analysis; create business rules on data trends, data monitoring, and data quality analysis, and take remediation actions.
· Perform exploratory analysis to understand data trends and create rationalized rules for quality monitoring.
· Partner with key stakeholders to communicate on key milestones in a timely manner and resolve any challenges or roadblocks proactively.
· Create, document, and execute rules-based monitoring for new application capabilities.
· Manage product issues and drive towards speedy issue resolution; manage customer expectations including scope, schedule, changes, and problem resolution.
· Drive on-time, high-quality project deliverables.
· Perform detailed root cause analysis (RCA) using Hive/GCP/Python.
· Document best-practice guidelines, and take appropriate action to highlight challenges and ensure resolution via the correct channels.

Required Skills
· 8-12 years of experience in the field of data management, data quality, and governance.
· Strong analytical skills, including the ability to think through all aspects of complex business requirements and generate possible courses of action.
· Demonstrated ability to drive results and manage multiple relationships and projects; proven ability to adjust quickly to shifting priorities, multiple demands, ambiguity, and rapid change.
· Ability to prioritize and deliver work within a deadline-driven climate.
· Strong communication skills oriented towards working with customers to document their product needs.
· Team player with demonstrated ability to collaborate across customer and technology organizations.
· Experience with data products and pipelines, and an understanding of systems.
· Strong governance mindset with an aptitude for data quality.
· SAS/SQL/Hive programming skills.
· Basic Unix skills/MVL scripts.
· Knowledge of big data/Hadoop/cloud/APIs or experience working in Python programming is a strong plus.
· Experience working within a complex data warehouse environment is a strong plus.

We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
· Competitive base salaries
· Bonus incentives
· Support for financial well-being and retirement
· Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
· Flexible working model with hybrid, onsite, or virtual arrangements depending on role and business need
· Generous paid parental leave policies (depending on your location)
· Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
· Free and confidential counseling support through our Healthy Minds program
· Career development and training opportunities

American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. An offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
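A minimal sketch of rules-based data-quality monitoring of the kind this role owns, assuming pandas; the field names and rules are illustrative, not Amex's actual checks.

```python
# Hypothetical rules-based DQ monitoring: each rule is a predicate over a batch of
# onboarding records, and failures are summarized for root-cause analysis.
# Field names and thresholds below are illustrative only.
import pandas as pd

RULES = {
    "application_id is unique": lambda df: ~df["application_id"].duplicated(),
    "country code is present": lambda df: df["country_code"].notna(),
    "requested limit is positive": lambda df: df["requested_limit"] > 0,
}

def run_dq_checks(df: pd.DataFrame) -> pd.DataFrame:
    results = []
    for name, rule in RULES.items():
        passed = rule(df)
        results.append({
            "rule": name,
            "failed_rows": int((~passed).sum()),
            "pass_rate": float(passed.mean()),
        })
    return pd.DataFrame(results)
```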

Posted 17 hours ago

Apply

2.0 years

7 - 10 Lacs

Noida

On-site

Every day, Global Payments makes it possible for millions of people to move money between buyers and sellers using our payments solutions for credit, debit, prepaid, and merchant services. Our worldwide team helps over 3 million companies, more than 1,300 financial institutions, and over 600 million cardholders grow with confidence and achieve amazing results. We are driven by our passion for success, and we are proud to deliver best-in-class payment technology and software solutions. Join our dynamic team and make your mark on the payments technology landscape of tomorrow.

Summary of This Role
Works throughout the software development life cycle and performs in a utility capacity to create, design, code, debug, maintain, test, implement, and validate applications with a broad understanding of a variety of languages and architectures. Analyzes existing applications or formulates logic for new applications, procedures, flowcharting, coding, and debugging programs. Maintains and utilizes application and programming documents in the development of code. Recommends changes in development, maintenance, and system standards. Creates appropriate deliverables and develops application implementation plans throughout the life cycle in a flexible development environment.

What Part Will You Play?
· Develops basic to moderately complex code using a front- or back-end programming language within a platform, as needed, in collaboration with business and technology teams for internal and external client software solutions.
· Creates and delivers routine program specifications for code development and support on a project or issue, with a moderate understanding of the application/database to better align interactions and technologies.
· Analyzes, modifies, and develops basic to moderately complex code and unit tests in order to develop application documentation.
· Performs testing and validation requirements for basic to moderately complex code changes.
· Performs corrective measures for basic to moderately complex code deficiencies and escalates alternative proposals.
· Applies a moderate understanding of procedures, methodology, and application standards, including Payment Card Industry (PCI) security compliance.

What Are We Looking For in This Role?
Minimum Qualifications
· BS in Computer Science, Information Technology, Business/Management Information Systems, or a related field.
· Typically a minimum of 2 years of professional experience in coding, designing, developing, and analyzing data.
· Typically has basic knowledge and use of one or more languages/technologies, including but not limited to: two or more modern programming languages used in the enterprise, experience working with various APIs and external services, and experience with both relational and NoSQL databases.

Preferred Qualifications
· BS in Computer Science, Information Technology, Business/Management Information Systems, or a related field.
· 4+ years of professional experience in coding, designing, developing, and analyzing data, and experience with IBM Rational Tools.

What Are Our Desired Skills and Capabilities?
· Skills/Knowledge: Developing professional expertise; applies company policies and procedures to resolve a variety of issues.
· Job Complexity: Works on problems of moderate scope where analysis of situations or data requires a review of a variety of factors. Exercises judgment within defined procedures and practices to determine appropriate action. Builds productive internal/external working relationships.
· Supervision: Normally receives general instructions on routine work and detailed instructions on new projects or assignments.
· Operating Systems: Linux distributions (one or more of Ubuntu, CentOS/RHEL, Amazon Linux), Microsoft Windows, z/OS, Tandem/HP NonStop.
· Database: Design, and familiarity with DDL and DML, for one or more of Oracle, MySQL, MS SQL Server, IMS, DB2, Hadoop.
· Back-end technologies: Java, Python, .NET, Ruby, Mainframe COBOL, Mainframe Assembler.
· Front-end technologies: HTML, JavaScript, jQuery, CICS.
· Web frameworks: Node.js, React.js, Angular, Redux.
· Development tools: Eclipse, Visual Studio, Webpack, Babel, Gulp.
· Mobile development: iOS, Android.
· Machine learning: Python, R, Matlab, TensorFlow, DMTK.

Global Payments Inc. is an equal opportunity employer. Global Payments provides equal employment opportunities to all employees and applicants for employment without regard to race, color, religion, sex (including pregnancy), national origin, ancestry, age, marital status, sexual orientation, gender identity or expression, disability, veteran status, genetic information, or any other basis protected by law. If you wish to request reasonable accommodations related to applying for employment or provide feedback about the accessibility of this website, please contact jobs@globalpay.com.

Posted 17 hours ago

Apply

3.0 years

7 - 8 Lacs

Noida

On-site

Posted On: 6 Aug 2025
Location: Noida, UP, India
Company: Iris Software

Why Join Us?
Are you inspired to grow your career at one of India's Top 25 Best Workplaces in the IT industry? Do you want to do the best work of your life at one of the fastest-growing IT services companies? Do you aspire to thrive in an award-winning work culture that values your talent and career aspirations? It's happening right here at Iris Software.

About Iris Software
At Iris Software, our vision is to be our client's most trusted technology partner, and the first choice for the industry's top professionals to realize their full potential. With over 4,300 associates across India, U.S.A., and Canada, we help our enterprise clients thrive with technology-enabled transformation across financial services, healthcare, transportation & logistics, and professional services. Our work covers complex, mission-critical applications with the latest technologies, such as high-value complex Application & Product Engineering, Data & Analytics, Cloud, DevOps, Data & MLOps, Quality Engineering, and Business Automation.

Working at Iris
Be valued, be inspired, be your best. At Iris Software, we invest in and create a culture where colleagues feel valued, can explore their potential, and have opportunities to grow. Our employee value proposition (EVP) is about "Being Your Best" – as a professional and person. It is about being challenged by work that inspires us, being empowered to excel and grow in your career, and being part of a culture where talent is valued. We're a place where everyone can discover and be their best version.

Job Description
· Must have 3+ years of hands-on experience in test automation development using Python.
· Must have basic knowledge of the big data and AI ecosystem.
· Must have API testing experience using Python with any framework available in the market.
· Continuous testing experience and expertise required.
· Proven success in a position of similar responsibilities in a QA environment.
· Must be strong in writing efficient Python code using data frames.
· Must have hands-on experience with Python, PySpark, Linux, big data (data validation), Jenkins, and GitHub.
· Good to have: AWS, Hadoop commands, QTest, Java, REST Assured, Selenium, pytest, Playwright, Cypress, Cucumber, Behave, JMeter, LoadRunner.

Mandatory Competencies
· QA/QE - QA Automation - Python
· Beh - Communication
· QA/QE - QA Manual - API Testing
· Big Data - Big Data - PySpark
· Operating System - Operating System - Linux

Perks and Benefits for Irisians
At Iris Software, we offer world-class benefits designed to support the financial, health, and well-being needs of our associates to help achieve harmony between their professional and personal growth. From comprehensive health insurance and competitive salaries to flexible work arrangements and ongoing learning opportunities, we're committed to providing a supportive and rewarding work environment. Join us and experience the difference of working at a company that values its employees' success and happiness.
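A minimal pytest sketch of the API test automation the posting asks for; the endpoint, payload, and expected fields are placeholders rather than a real service.

```python
# Minimal API test automation sketch using pytest and requests.
# BASE_URL, the /orders endpoint, and the response contract are hypothetical.
import pytest
import requests

BASE_URL = "https://api.example.internal"  # hypothetical service under test

@pytest.fixture(scope="session")
def session():
    s = requests.Session()
    s.headers.update({"Accept": "application/json"})
    return s

def test_create_order_returns_201_and_id(session):
    resp = session.post(
        f"{BASE_URL}/orders", json={"sku": "ABC-123", "qty": 2}, timeout=10
    )
    assert resp.status_code == 201
    body = resp.json()
    # Contract checks: required fields exist and have the expected types/values.
    assert isinstance(body.get("order_id"), str)
    assert body.get("qty") == 2
```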

Posted 17 hours ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Company Description
ZiffyHealth is an IoT-enabled, AI-driven health-tech platform focused on bridging the gap in the availability of healthcare professionals between urban and rural India. Based on a big-data Hadoop ecosystem, ZiffyHealth aims to make healthcare more accessible and affordable through cutting-edge technology. The platform is incubated at the Atal Incubation Center-Pinnacle, supported by the Atal Innovation Mission, NITI Aayog, Government of India. ZiffyHealth's mission is to create a world where everyone has the opportunity to lead a healthy and productive life.

Role Description
This is a full-time on-site role for an Assistant to the Chief Executive Officer, located in Pune. The Assistant to the CEO will be responsible for providing executive administrative assistance, managing office administration, handling day-to-day communications, and organizing tasks. Responsibilities also include managing schedules, coordinating meetings, and supporting the CEO in various administrative functions.

Qualifications
· Executive administrative assistance and administrative assistance skills
· Excellent communication and organization skills
· Strong office administration skills
· Ability to manage multiple tasks effectively
· Proficiency in the Microsoft Office Suite and other relevant software
· Ability to work independently and handle confidential information
· Bachelor's degree in Business Administration, Communications, or a related field is preferred

Posted 17 hours ago

Apply

0 years

2 - 4 Lacs

Calcutta

Remote

Role Description
This is a full-time role for a Technical Academic Writer at SB Learning Ventures, located in Kolkata. The Academic Writer will be responsible for developing high-quality academic content on technical subjects such as data analytics, machine learning, statistics, programming, and finance. The role requires producing original, well-researched assignments while adhering to academic standards and strict deadlines.

Key Responsibilities
1. Working on academic projects, research papers, dissertations, and case studies related to technical domains for university students, based on subject matter expertise.
2. Submitting well-structured, plagiarism-free solutions within the given deadlines.
3. Reading and analyzing assignment requirements thoroughly before starting the task.
4. Applying technical knowledge and analytical tools effectively while preparing assignments.

Skill Requirements
· Proven expertise in data analytics, with hands-on experience in visualisation tools such as Excel, Tableau, Power BI, and Google Looker Studio.
· Strong background in database management with proficiency in MySQL, SQLite, MS Access, and Server Studio.
· Advanced knowledge of machine learning, including programming in Python and SAS Studio (familiarity with MATLAB is a plus).
· Extensive experience in statistical analysis using tools like STATA, SPSS, Gretl, EViews, Advanced Excel, and RStudio.
· Solid understanding of economic and financial principles as they apply to technical solutions.
· Proficiency in Java development and application design.
· Experience with Hadoop software and big data technologies.
· Excellent problem-solving skills with the ability to drive strategic technology initiatives.
· Strong communication and leadership abilities to work effectively across cross-functional teams.
· Preferred: experience in technical writing or academic research writing.

Job Type: Full-time
Work Location: Remote (office location: Kolkata)
Pay: ₹20,000.00 - ₹35,000.00 per month

Posted 17 hours ago

Apply

3.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Responsibilities
  • Design, develop, and maintain data pipelines using Azure Data Factory (ADF) for ingestion, transformation, and integration of data from various energy systems.
  • Implement workflows using Logic Apps and Function Apps to automate data processing and system interactions.
  • Manage and optimize data storage using Azure Data Lake Storage Gen2 (ADLS Gen2) and Azure SQL databases.
  • Integrate real-time and batch data from IoT devices and energy platforms using Event Hub and Service Bus (see the sketch after this listing).
  • Collaborate with cross-functional teams to support data needs across power distribution, automation, and monitoring solutions.
  • Ensure data quality, security, and compliance with industry standards and regulations.
  • Document data architecture, workflows, and best practices for operational transparency and scalability.

Mandatory Skill Sets: Azure Data Factory, Logic Apps, Function Apps, Azure SQL, Event Hub, Service Bus, and ADLS
Preferred Skill Sets: Python, SQL, and data modeling
Years of Experience Required: 3 to 10 years
Education Qualification: Bachelor's degree in computer science, data science, or any other engineering discipline; a Master's degree is a plus.
Degrees/Field of Study Required: Bachelor Degree, Master Degree
Required Skills: Data Engineering
Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 27 more}
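As flagged in the responsibilities above, streaming device data through Event Hub is a core task in this role, and in practice it reduces to a few SDK calls. A minimal, illustrative sketch using the azure-eventhub Python SDK (v5); the connection string, hub name, and reading schema are placeholders, not details from this posting:

    # Publish a batch of (hypothetical) meter readings to Azure Event Hubs.
    # Requires: pip install azure-eventhub
    import json
    from azure.eventhub import EventHubProducerClient, EventData

    CONN_STR = "<event-hubs-namespace-connection-string>"  # placeholder
    EVENTHUB_NAME = "energy-telemetry"  # hypothetical hub name

    def publish_readings(readings):
        """Send a list of reading dicts as a single batch of events."""
        producer = EventHubProducerClient.from_connection_string(
            conn_str=CONN_STR, eventhub_name=EVENTHUB_NAME
        )
        with producer:
            batch = producer.create_batch()
            for reading in readings:
                batch.add(EventData(json.dumps(reading)))
            producer.send_batch(batch)

    publish_readings([{"meter_id": "m-001", "kwh": 1.42}])

On the consuming side, a Logic App or Function App would then route these events into ADLS Gen2 or Azure SQL, as the responsibilities describe.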

Posted 17 hours ago

Apply

5.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Title: Senior Data Architect
Years of Experience: 5 - 10 years

Job Description
The Senior Data Architect will design, govern, and optimize the entire data ecosystem for advanced analytics and AI workloads. This role ensures data is collected, stored, processed, and made accessible in a secure, performant, and scalable manner. The candidate will drive architecture design for structured/unstructured data, build data governance frameworks, and support the evolution of modern data platforms across cloud environments.

Key Responsibilities
  • Architect enterprise data platforms using Azure/AWS/GCP and modern data lake/data mesh patterns
  • Design logical and physical data models, semantic layers, and metadata frameworks
  • Establish data quality, lineage, governance, and security policies
  • Guide the development of ETL/ELT pipelines using modern tools and streaming frameworks
  • Integrate AI and analytics solutions with operational data platforms
  • Enable self-service BI and ML pipelines through Databricks, Synapse, or Snowflake
  • Lead architecture reviews, design sessions, and CoE reference architecture development

Technical Skills
  • Cloud Platforms: Azure Synapse, Databricks, Azure Data Lake, AWS Redshift
  • Data Modeling: ERwin, dbt, PowerDesigner
  • Storage & Processing: Delta Lake, Cosmos DB, PostgreSQL, Hadoop, Spark
  • Integration: Azure Data Factory, Kafka, Event Grid, SSIS
  • Metadata/Lineage: Purview, Collibra, Informatica
  • BI Platforms: Power BI, Tableau, Looker
  • Security & Compliance: RBAC, encryption at rest/in transit, NIST/FISMA

Qualifications
  • Bachelor's or Master's in Computer Science, Information Systems, or Data Engineering
  • Microsoft Certified: Azure Data Engineer / Azure Solutions Architect
  • Strong experience building cloud-native data architectures
  • Demonstrated ability to create data blueprints aligned with business strategy and compliance

Posted 18 hours ago

Apply

4.0 years

0 Lacs

Greater Kolkata Area

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Responsibilities
  • Design and develop data pipelines using Databricks and PySpark to ingest, process, and transform large volumes of data (see the sketch after this listing).
  • Implement ETL/ELT workflows to move data from source systems to data warehouses, data lakes, and lakehouses using cloud-native tools.
  • Work with structured and unstructured data stored in AWS or Azure data lakes.
  • Apply strong SQL and Python skills to manipulate and analyze data efficiently.
  • Collaborate with cross-functional teams to deliver cloud-based serverless data solutions.
  • Design innovative data solutions that address complex business requirements and support data-driven decision-making.
  • Maintain documentation and enforce best practices for data architecture, governance, and performance optimization.

Mandatory Skill Sets: Databricks, PySpark, and SQL on any cloud platform (AWS or Azure)
Preferred Skill Sets: ETL, PySpark
Years of Experience Required: 4 to 10 years
Education Qualification: Bachelor's degree in computer science, data science, or any other engineering discipline; a Master's degree is a plus.
Degrees/Field of Study Required: Master Degree, Bachelor Degree
Required Skills: PySpark
Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 27 more}
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
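The ingest-transform-load pattern named in the responsibilities above is compact in PySpark. A minimal, illustrative sketch; the paths, column names, and cleansing rule are assumptions, and Parquet stands in for Delta to keep the example dependency-free:

    # Minimal PySpark ETL: ingest raw CSV, clean and type it, write curated output.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("orders-etl").getOrCreate()

    # Ingest: raw files landed in the data lake (path is a placeholder).
    raw = spark.read.option("header", True).csv("s3a://raw-zone/orders/")

    # Transform: type the amount column, drop malformed rows, stamp a load date.
    clean = (
        raw.withColumn("amount", F.col("amount").cast("double"))
           .filter(F.col("amount").isNotNull())
           .withColumn("load_date", F.current_date())
    )

    # Load: partitioned Parquet in the curated zone; on Databricks,
    # .format("delta") would be the natural choice instead.
    clean.write.mode("overwrite").partitionBy("load_date").parquet("s3a://curated-zone/orders/")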

Posted 18 hours ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About McDonald's
One of the world's largest employers with locations in more than 100 countries, McDonald's Corporation has corporate opportunities in Hyderabad. Our global offices serve as dynamic innovation and operations hubs, designed to expand McDonald's global talent base and in-house expertise. Our new office in Hyderabad will bring together knowledge across business, technology, analytics, and AI, accelerating our ability to deliver impactful solutions for the business and our customers across the globe.

Position Summary
We are looking to hire a Data Engineer with a good understanding of the data product lifecycle and its standards and practices. You will be responsible for building scalable and efficient data solutions to support the Finance, Franchising & Development function, with a specific focus on the Finance Analytics product and initiatives. As a Data Engineer, you will collaborate with data scientists, analysts, and other cross-functional teams to ensure the availability, reliability, and performance of data systems. You will be a vital team member on initiatives that enable trusted financial data, support decision-making, and partner with business and technology teams to align data capabilities with strategic finance objectives. Expertise in cloud computing platforms, technologies, and data engineering best practices will play a crucial role in this domain.

Primary Responsibilities
  • Build and maintain relevant and reliable data products that support Finance Analytics.
  • Develop and implement new technology solutions as needed to ensure ongoing improvement, with data reliability and observability in view.
  • Participate in new software development and data engineering initiatives supporting Finance Analytics, ensuring timely and accurate delivery of financial data products.
  • Drive and implement data engineering best practices for pipeline development, data governance, data security, and quality across financial datasets.
  • Implement security and privacy controls in data workflows, ensuring compliance with finance regulatory requirements.
  • Monitor, troubleshoot, and improve the performance and reliability of existing finance data pipeline infrastructure.
  • Stay up to date with emerging data engineering technologies, trends, and best practices, and evaluate their applicability to evolving financial analytics needs.
  • Document data engineering processes, workflows, and solutions for knowledge sharing and future reference.
  • Partner and collaborate with other data engineers, particularly on finance-centric data models and processing frameworks.
  • Coordinate and work with teams distributed across time zones, as needed.

Skills
  • Applies technical data engineering expertise to develop reliable pipelines and improve data quality in support of finance and analytics initiatives.
  • Bachelor's or master's degree in computer science or a related engineering field, and deep experience with cloud computing.
  • 3+ years of professional experience in data engineering or related fields.
  • Proficiency in Python, Java, or Scala for data processing and automation.
  • Hands-on experience with data orchestration tools (e.g., Apache Airflow, Luigi) and big data ecosystems (e.g., Hadoop, Spark, NoSQL); see the sketch after this listing.
  • Good working knowledge of data quality functions such as cleansing, standardization, parsing, de-duplication, mapping, and hierarchy management.
  • Ability to perform extensive data analysis (comparing multiple datasets) using a variety of tools.
  • Effective communication and stakeholder management skills to drive alignment and adoption of data engineering standards.
  • Demonstrated experience in data management and data governance capabilities.
  • Familiarity with data warehousing principles and best practices.
  • Excellent problem solver: uses data and technology to solve problems or answer complex data-related questions.
  • Excellent collaboration skills to work effectively in cross-functional teams.

Work location: Hyderabad, India
Work pattern: Full-time role
Work mode: Hybrid
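The orchestration skill flagged in the list above lends itself to a short example. A minimal, illustrative Airflow 2.x DAG for a daily finance pipeline with a data quality gate; the DAG id, schedule, and task bodies are hypothetical, not details of any McDonald's system:

    # A three-step daily pipeline: extract -> validate -> load.
    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        print("pull finance extracts from source systems")

    def validate():
        # e.g. row counts, null checks, de-duplication before anything loads
        print("run data quality checks")

    def load():
        print("publish curated tables for analytics")

    with DAG(
        dag_id="finance_daily_pipeline",  # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",  # Airflow 2.4+ keyword
        catchup=False,
    ) as dag:
        t_extract = PythonOperator(task_id="extract", python_callable=extract)
        t_validate = PythonOperator(task_id="validate", python_callable=validate)
        t_load = PythonOperator(task_id="load", python_callable=load)
        t_extract >> t_validate >> t_load

Placing the quality gate between extract and load means a failed check stops bad data from ever reaching the curated layer.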

Posted 19 hours ago

Apply

6.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Position Overview
We are looking for a highly skilled and experienced Senior AI/ML Engineer to join our team. The ideal candidate will have a strong background in artificial intelligence, machine learning, and deep learning, with a proven track record of building and deploying scalable AI models in real-world applications.

Key Responsibilities
  • Design, develop, and deploy scalable AI/ML models and algorithms for real-time or batch processing applications.
  • Collaborate with cross-functional teams including data engineering, software development, and product management to integrate AI solutions into products.
  • Conduct in-depth research and experimentation to improve model performance and develop new approaches using state-of-the-art techniques.
  • Evaluate and optimize ML models for speed, accuracy, and scalability.
  • Maintain and improve existing ML infrastructure, pipelines, and toolkits.
  • Mentor junior developers and contribute to AI/ML best practices and standards.
  • Stay current with the latest research and developments in the AI/ML space.

Required Skills and Qualifications
  • 6+ years of experience in building and deploying machine learning and AI solutions.
  • Strong programming skills in Python (with libraries such as TensorFlow, PyTorch, Scikit-learn, NumPy, and Pandas); see the sketch after this listing.
  • Solid understanding of machine learning algorithms, statistical modelling, data preprocessing, and feature engineering.
  • Experience in building and tuning deep learning models (CNNs, RNNs, Transformers, etc.).
  • Proficiency in working with cloud platforms (AWS, GCP, Azure) and MLOps tools (MLflow, Kubeflow, etc.).
  • Experience with big data technologies (e.g., Spark, Hadoop) and data pipelines.
  • Strong problem-solving skills and the ability to translate business needs into technical solutions.
  • Familiarity with model explainability, bias mitigation, and responsible AI practices.
  • Experience in natural language processing (NLP), computer vision, or recommendation systems.
  • Familiarity with containerization and orchestration tools (Docker, Kubernetes).
  • Published papers or contributions to open-source ML/AI projects.

Certifications (good to have any)
  • Google Professional Machine Learning Engineer
  • Microsoft Certified: Azure AI Engineer Associate
  • AWS Certified Machine Learning – Specialty
  • TensorFlow Developer Certificate

Experience: 6+ years of experience in building and deploying machine learning and AI solutions.
Educational Qualification(s): Bachelor's in computer science, machine learning, data science, or a related field.

To view our Privacy Policy, please click the link below or copy and paste the URL into your browser:
https://gedu.global/wp-content/uploads/2023/09/GEDU-Privacy-Policy-22092023-V2.0-1.pdf
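As referenced in the skills list above, the preprocessing, training, and evaluation loop is the bread and butter of this role. A minimal, illustrative scikit-learn sketch; the synthetic dataset stands in for real features:

    # Baseline classifier with preprocessing and evaluation in one pipeline.
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import classification_report
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import StandardScaler

    X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=42
    )

    model = Pipeline([
        ("scale", StandardScaler()),    # preprocessing step
        ("clf", LogisticRegression()),  # simple baseline to beat
    ])
    model.fit(X_train, y_train)
    print(classification_report(y_test, model.predict(X_test)))

Keeping the scaler inside the pipeline ensures the test split is transformed only with statistics learned from the training data, which is exactly the evaluation discipline the role calls for.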

Posted 19 hours ago

Apply

8.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Are you passionate about solving complex business challenges through cutting-edge AI and data science? Are you a proven leader with a vision for leveraging advanced analytics to create scalable solutions? We are looking for an accomplished and innovative Lead Data Scientist to join our growing team. This pivotal role offers the unique opportunity to shape data-driven strategies, lead groundbreaking AI initiatives, and mentor a high-performing team of data scientists. If you're ready to make a significant impact, we invite you to explore this exciting opportunity.

As the Lead Data Scientist, you'll work at the intersection of innovation and business impact, spearheading AI/ML solutions that address complex challenges. This role not only requires exceptional technical expertise but also the ability to inspire and lead a talented team, driving excellence in every project.

Key Responsibilities

1. Leadership & Mentorship
  • Lead, inspire, and mentor a team of data scientists, fostering a culture of collaboration, innovation, and continuous learning.
  • Provide technical guidance to ensure the delivery of high-quality, scalable solutions within tight deadlines.
  • Promote best practices, drive knowledge sharing, and encourage cross-functional collaboration to achieve organizational goals.

2. AI/ML Solution Development
  • Architect and deploy scalable, enterprise-level AI solutions tailored to solve complex business problems.
  • Engineer and optimize Generative AI models (GenAI), Large Language Models (LLMs), and Transformer-based architectures for top-notch performance.
  • Utilize techniques like prompt engineering, transfer learning, and model optimization to deliver state-of-the-art AI solutions.

3. Natural Language Processing (NLP)
  • Design advanced NLP solutions leveraging tools such as Word2Vec, BERT, SpaCy, NLTK, CoreNLP, TextBlob, and GloVe.
  • Perform semantic analysis, sentiment analysis, text preprocessing, and tokenization to generate actionable business insights (see the sketch after this listing).

4. Cloud & Deployment
  • Build and deploy AI/ML solutions using frameworks like FastAPI or gRPC for seamless delivery of services.
  • Leverage cloud platforms such as AWS, Azure, or Google Cloud Platform (GCP) to design high-performance, scalable systems.
  • Deploy models using Docker containers on Kubernetes clusters for optimal scalability and reliability.

5. Database Management
  • Manage and optimize large-scale data processing using SQL and NoSQL databases.
  • Ensure seamless data flow and retrieval, enhancing overall system performance.

6. Big Data & Analytics
  • Utilize big data technologies like Hadoop, Spark, and Hive for analyzing and processing massive datasets.
  • Apply statistical and experimental design techniques to uncover meaningful insights and drive decision-making.

7. MLOps & CI/CD Pipelines
  • Develop and maintain robust MLOps pipelines to streamline the integration, testing, and deployment of machine learning models.
  • Ensure the scalability, reliability, and efficiency of AI/ML models in production environments.

8. Collaboration & Communication
  • Partner with product managers, business analysts, and engineering teams to identify challenges and propose innovative solutions.
  • Translate complex technical insights into actionable recommendations for technical and non-technical stakeholders alike.

Key Qualifications

Educational Background
  • Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related technical field.

Experience
  • 5-8 years of industry experience, with a minimum of 2 years in a leadership role managing and mentoring data science teams.
  • Proven track record in delivering end-to-end AI/ML solutions that solve real-world business challenges.

Technical Skills
  • Proficiency in Python and its data science libraries.
  • Advanced expertise in NLP tools like Word2Vec, BERT, NLTK, SpaCy, TextBlob, CoreNLP, and GloVe.
  • Strong knowledge of Transformer-based architectures and Generative AI/LLMs.
  • Hands-on experience with cloud platforms (AWS, Azure, GCP) and deployment technologies (FastAPI, gRPC, Docker, Kubernetes).
  • Proficiency in big data tools (Hadoop, Spark, Hive) and database systems (SQL/NoSQL).
  • Strong grasp of statistical methods, machine learning algorithms, and experimental design principles.

Domain Knowledge
  • Prior experience in online reputation management or product-based industries is highly desirable.

Additional Skills
  • Exceptional project management skills with the ability to manage multiple priorities simultaneously.
  • Excellent communication and storytelling skills to convey complex technical concepts effectively to diverse audiences.
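The NLP responsibilities above usually start from a simple baseline before reaching for Word2Vec or BERT. A minimal, illustrative TF-IDF sentiment classifier; the toy corpus and labels are invented for the example:

    # TF-IDF + logistic regression: a common first baseline for sentiment.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import Pipeline

    texts = [
        "great service, very happy",
        "terrible support, never again",
        "quick response and helpful",
        "slow, rude, and unhelpful",
    ]
    labels = [1, 0, 1, 0]  # 1 = positive, 0 = negative (toy data)

    clf = Pipeline([
        ("tfidf", TfidfVectorizer(ngram_range=(1, 2))),  # tokenize and weight terms
        ("model", LogisticRegression()),
    ])
    clf.fit(texts, labels)
    print(clf.predict(["helpful and quick"]))  # expected: [1]

In an online-reputation setting the same pipeline shape scales up: more data, richer features (embeddings, Transformer encoders), and the statistical evaluation discipline described under the qualifications.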

Posted 19 hours ago

Apply

3.0 - 4.0 years

0 Lacs

Gurugram, Haryana, India

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Job purpose
  • Work as a technology or functional consultant on FinCrime solutions modernisation and transformation projects.
  • Exhibit an understanding of financial services during client discussions and articulate client requirements into technical specifications.
  • Contribute as a team player in a team of consultants to deliver large technology programmes.

Work Experience Requirements
  • Understand high-level business requirements and relate them to appropriate AML / FinCrime product capabilities.
  • Define and validate customisation needs for AML products as per client requirements.
  • Review client processes and workflows, and make recommendations to the client to maximise benefits from the AML product.
  • Show in-depth knowledge of best banking practices and AML product modules.
  • Prior experience in one or more COTS products such as NetReveal, Norkom, Actimize, SAS AML VI/VIA, Fircosoft or Quantexa.

Your client responsibilities
  • Work as a Technical Business Systems Analyst on one or more FinCrime projects.
  • Interface and communicate with the onsite coordinators.
  • Complete assigned tasks on time and report status regularly to the lead.
  • Report status regularly to the manager and onsite coordinators.
  • Interface with customer representatives as and when needed.
  • Be willing to travel to customer locations on a need basis.

Mandatory skills

Technical:
  • Expert in the following NetReveal modules: Scenario Manager Configuration, Application Builder, Base Platform, Workflow Configurator, Services Manager, Batch Bridge, Scheduling Configuration, Command and Control, and the AML module; expert in Velocity templates.
  • NetReveal Optimization module, multi-entity and multi-currency platform, cloud platform, REST API development using Java.
  • CI/CD technologies (Bitbucket, Jenkins, Nexus, Serena).
  • Container technologies such as Docker and Kubernetes.
  • NetReveal v7.4 or above; proficient in Oracle SQL, PL/SQL, and WebSphere Application Server.
  • Experience in Agile methodology.
  • SQL and an understanding of big data technologies such as Spark, Hadoop, or Elasticsearch.
  • Scripting/programming: at least one programming or scripting language among Python, Java, or Unix shell script.
  • Experience in product migration and implementation; preferably part of at least one AML implementation.
  • Act as the Subject Matter Expert (SME) and possess excellent functional/operational knowledge of the activities performed by the various teams.
  • Should possess a high-level understanding of infrastructure designs, data models, and application/business architecture.

Functional:
  • Thorough knowledge of the AML/CTF transaction-monitoring, KYC, and sanctions processes.
  • Thorough knowledge of transaction monitoring and scenarios (see the sketch after this listing).
  • Should have developed one or more modules covering KYC (know your customer), CDD (customer due diligence), EDD (enhanced due diligence), sanction screening, PEP (politically exposed person) screening, adverse media screening, TM (transaction monitoring), and CM (case management).
  • Thorough knowledge of case management workflows.

Education and experience (mandatory)
MBA / MCA / BE / BTech or equivalent with banking industry experience of 3 to 4 years.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
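The transaction-monitoring knowledge called out above rests on scenario logic that can be illustrated outside any specific product. A minimal, product-agnostic sketch in pandas of a rolling-window threshold rule; the threshold, window, and transactions are hypothetical, not NetReveal configuration:

    # Flag accounts whose 7-day transaction total exceeds a threshold.
    import pandas as pd

    txns = pd.DataFrame({
        "account": ["A", "A", "A", "B"],
        "ts": pd.to_datetime(["2024-01-01", "2024-01-02", "2024-01-03", "2024-01-02"]),
        "amount": [4000.0, 4500.0, 3000.0, 900.0],
    })

    THRESHOLD = 10_000.0  # hypothetical structuring threshold

    rolling = (
        txns.sort_values("ts")
            .set_index("ts")
            .groupby("account")["amount"]
            .rolling("7D")
            .sum()
    )
    print(rolling[rolling > THRESHOLD])  # account A breaches on 2024-01-03

Products such as NetReveal wrap this kind of rule in configurable scenarios (thresholds, windows, customer segments) plus the alerting and case-management workflow around it.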

Posted 20 hours ago

Apply

Exploring Hadoop Jobs in India

The demand for Hadoop professionals in India has been on the rise in recent years, with many companies leveraging big data technologies to drive business decisions. As a job seeker exploring opportunities in the Hadoop field, it is important to understand the job market, salary expectations, career progression, related skills, and common interview questions.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Pune
  4. Hyderabad
  5. Chennai

These cities are known for their thriving IT industry and have a high demand for Hadoop professionals.

Average Salary Range

The average salary range for Hadoop professionals in India varies based on experience levels. Entry-level Hadoop developers can expect to earn between INR 4-6 lakhs per annum, while experienced professionals with specialized skills can earn upwards of INR 15 lakhs per annum.

Career Path

In the Hadoop field, a typical career path may include roles such as Junior Developer, Senior Developer, Tech Lead, and eventually progressing to roles like Data Architect or Big Data Engineer.

Related Skills

In addition to Hadoop expertise, professionals in this field are often expected to have knowledge of related technologies such as Apache Spark, HBase, Hive, and Pig. Strong programming skills in languages like Java, Python, or Scala are also beneficial.

Interview Questions

  • What is Hadoop and how does it work? (basic)
  • Explain the difference between HDFS and MapReduce. (medium; see the word-count sketch after this list)
  • How do you handle data skew in Hadoop? (medium)
  • What is YARN in Hadoop? (basic)
  • Describe the concept of NameNode and DataNode in HDFS. (medium)
  • What are the different types of join operations in Hive? (medium)
  • Explain the role of the ResourceManager in YARN. (medium)
  • What is the significance of the shuffle phase in MapReduce? (medium)
  • How does speculative execution work in Hadoop? (advanced)
  • What is the purpose of the Secondary NameNode in HDFS? (medium)
  • How do you optimize a MapReduce job in Hadoop? (medium)
  • Explain the concept of data locality in Hadoop. (basic)
  • What are the differences between Hadoop 1 and Hadoop 2? (medium)
  • How do you troubleshoot performance issues in a Hadoop cluster? (advanced)
  • Describe the advantages of using HBase over traditional RDBMS. (medium)
  • What is the role of the JobTracker in Hadoop? (medium)
  • How do you handle unstructured data in Hadoop? (medium)
  • Explain the concept of partitioning in Hive. (medium)
  • What is Apache ZooKeeper and how is it used in Hadoop? (advanced)
  • Describe the process of data serialization and deserialization in Hadoop. (medium)
  • How do you secure a Hadoop cluster? (advanced)
  • What is the CAP theorem and how does it relate to distributed systems like Hadoop? (advanced)
  • How do you monitor the health of a Hadoop cluster? (medium)
  • Explain the differences between Hadoop and traditional relational databases. (medium)
  • How do you handle data ingestion in Hadoop? (medium)
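
Several of the questions above (HDFS vs. MapReduce, the shuffle phase, data locality) become concrete with the classic word-count job. A minimal, illustrative pair of Hadoop Streaming scripts in Python; in a real submission each function would live in its own file:

    # Classic word count for Hadoop Streaming. The framework's shuffle phase
    # sorts mapper output by key, so the reducer sees each word's counts
    # contiguously and needs only O(1) state.
    import sys

    def mapper():
        # Emit "word<TAB>1" for every word on stdin.
        for line in sys.stdin:
            for word in line.strip().split():
                print(f"{word}\t1")

    def reducer():
        # Input arrives sorted by word; accumulate until the key changes.
        current, count = None, 0
        for line in sys.stdin:
            word, _, value = line.strip().partition("\t")
            if word != current:
                if current is not None:
                    print(f"{current}\t{count}")
                current, count = word, 0
            count += int(value)
        if current is not None:
            print(f"{current}\t{count}")

    if __name__ == "__main__":
        # In a real job each function is its own script; here argv picks the role.
        (mapper if sys.argv[1:] == ["map"] else reducer)()

Submitted with something like hadoop jar hadoop-streaming.jar -input <in> -output <out> -mapper mapper.py -reducer reducer.py (paths are placeholders), the job also illustrates data locality: map tasks are scheduled on the DataNodes that hold the input blocks.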

Closing Remark

As you navigate the Hadoop job market in India, remember to stay updated on the latest trends and technologies in the field. By honing your skills and preparing diligently for interviews, you can position yourself as a strong candidate for lucrative opportunities in the big data industry. Good luck on your job search!

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
