7.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Responsibilities Lead and mentor a team of Python developers. Design, develop, and maintain highly scalable data processing applications. Write efficient, reusable, and well-documented code. Deliver big data projects using Spark, Scala, Python, SQL, HQL, and Hive. Leverage data pipelining applications to package work. Maintain and tune existing Hadoop applications. Work closely with QA, Operations, and various teams to deliver error-free software on time. Perform code reviews and provide constructive feedback. Actively participate in daily agile/scrum meetings.

Requirements 7+ years of software development experience with Hadoop framework components (HDFS, Spark, PySpark, Sqoop, Hive, HQL, Scala). Experience in a leadership or supervisory role. 4+ years of experience using Python, SQL, and shell scripting. Experience in developing and tuning Spark applications (see the sketch after this posting). Excellent understanding of Spark architecture, data frames, and Spark tuning. Strong knowledge of database concepts, systems architecture, and data structures is a must. Process-oriented with strong analytical and problem-solving skills. Excellent written and verbal communication skills. Bachelor's degree in Computer Science or a related field. This job was posted by Manisha Rani from Amantya Technologies.
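As a rough illustration of the Spark-on-Hive work this role describes, here is a minimal PySpark sketch. The database, table, and column names (analytics.sales, region, amount, order_id) are hypothetical, and the shuffle-partition setting is a placeholder to size per workload, not a recommendation.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Enable Hive support so saved tables and HQL queries go through the metastore.
spark = (
    SparkSession.builder
    .appName("regional-sales-rollup")
    .enableHiveSupport()
    .getOrCreate()
)

# A common tuning lever: size shuffle partitions to the cluster, not the default 200.
spark.conf.set("spark.sql.shuffle.partitions", "64")

# Read a Hive table (hypothetical name), aggregate, and write back as a new table.
sales = spark.table("analytics.sales")
rollup = (
    sales
    .where(F.col("amount") > 0)
    .groupBy("region")
    .agg(F.sum("amount").alias("total_amount"),
         F.countDistinct("order_id").alias("orders"))
)
rollup.write.mode("overwrite").saveAsTable("analytics.sales_by_region")
```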
Posted 1 week ago
10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Who are we and what do we do? InMobi Group’s mission is to power intelligent, mobile-first experiences for enterprises and consumers. Its businesses across advertising, marketing, data and content platforms are shaping consumer experience in a world of connected devices. InMobi Group has been recognized on both the 2018 and 2019 CNBC Disruptor 50 lists and as one of Fast Company’s 2018 World’s Most Innovative Companies. Consistently featured among the “Great Places to Work” in India since 2017, our culture is our true north, enabling us to think big, solve complex challenges and grow with new opportunities. InMobians are passionate and driven, creative and fun-loving, take ownership and are outcome-driven. We invite you to join our tribe as we dream big and chase your passion.

About AI Engineering Director - Enterprise
Digital & AI transformation across InMobi is underway, aimed at unlocking productivity and realizing efficiencies across the business, streamlining critical process flows, and integrating cross-system architecture to achieve a lean and agile organizational setup. This involves a strong understanding of the company’s underlying business, system, and process challenges, and translating them into long-term sustainable solutions that are applicable across InMobi’s expanding business units. This team works across a spectrum of enterprise systems and products, building end-to-end features and capabilities that help establish a fully integrated blueprint of the company’s data architecture and system landscape. The enterprise systems span the end-to-end value chain, covering lead generation, sales & CRM, order fulfilment, service, financial systems, BI dashboards, and NLP-based analytics. Reporting to the Chief Digital Officer, this person will lead a team of engineers and collaborate deeply with product managers, designers, and other engineering teams to build AI solutions that are grounded in consistent architectures and incorporate ongoing advancements. These AI solutions will help in sales & marketing transformation, finance transformation, and the transformation of service and operational processes. Are you interested in shaping the era of AI, especially in an enterprise context and at scale? This role requires curiosity to understand the quickly growing AI landscape and its continuous disruptive changes, along with knowledge of modern coding languages, different platforms (especially GCP, Azure, and Salesforce), machine learning, and LLMs. As the Enterprise AI Engineering Leader, you are the face of the Enterprise AI engineering team, representing their work and unblocking the team when problems arise. You manage a highly skilled team and help in the growth of their careers. You drive action and alignment by cultivating relationships with senior management and partnerships with product management and design leaders. You bring a deep understanding and expertise in digital transformation opportunities across AI, supported by thought leadership in technologies such as Generative AI, advanced analytics, and app modernization. What you’ll do: Partner with the Enterprise Product team and all concerned stakeholders to understand business needs and translate them into technical requirements and solutions. Lead the development, deployment, and optimization of AI (GenAI) solutions, tools, and frameworks in the enterprise space for core business process areas.
In this role, you will work closely with cross-functional teams, bringing technical expertise and hands-on problem-solving to ensure the successful delivery of AI solutions that support our business objectives (in the broad process areas described earlier). Support POC/pilot projects to explore and validate the ever-evolving AI technologies and solutions. Stay updated on LLMs, MCP, A2A, ADK, Agentic AI, and the fast-changing AI technology and solution landscape, ensuring these innovations drive impact. Participate in the full AI development lifecycle, including data preparation, design, coding, testing, deployment, and support. Continuously assess and optimize software architecture, focusing on scalability, reliability, and alignment with emerging trends. Document designs, development processes, and best practices to promote knowledge sharing. Mentor and enable career growth for your team. Support upskilling and re-skilling of other engineering team members and business users. What you are good at: Supporting the development, adoption, and optimization of AI-driven applications to meet organizational needs. Solving technical challenges and developing scalable, innovative solutions. Applying change management principles to ensure successful technology rollouts. Proactively identifying and implementing automation/AI capabilities in an enterprise setup. Collaborating effectively with diverse stakeholders, including technical teams and business leaders. Adapting to fast-paced environments and evolving priorities with high energy and autonomy. Leveraging expertise in (Gen)AI solutions, technologies, and frameworks as listed above, and enabling the underlying system integrations and infrastructure to deliver impactful solutions. Demonstrating strong problem-solving and analytical skills. Excellent documentation and presentation skills for technical and non-technical audiences. What you'll bring: A bachelor’s degree in Computer Science, Engineering, or a related field; advanced degrees in AI and Data Science are a plus. 10-12 years of professional experience in engineering/software development with increasing responsibility. At least 4-5 years of proven experience implementing AI-driven applications. Experience in developing and deploying machine learning models and AI solutions. At least a few of the following: statistics and probability theory, regression modelling, clustering methods, deep learning, neural networks, natural language processing, text mining, computer vision, image recognition, time series forecasting, machine learning visualization tools, tree ensembles. Proficiency in programming languages like Python, R, or Java. Familiarity with AI and machine learning frameworks (e.g., TensorFlow, PyTorch; see the sketch after this posting). Experience with big data technologies and tools (e.g., Hadoop, Spark). At least 2 years in GenAI/LLM-based solutions. GCP/Azure/Salesforce/other platforms - data and AI services, data engineering, analytics, machine learning, cognitive services, cognitive search, and exposure to OpenAI solutions. Previous experience building user-facing (Gen)AI/LLM software applications. Knowledge of LLMs and RAG/agentic architecture, core concepts, and fundamentals. Knowledge of evolving protocols like MCP, A2A, ADK, and others. Strong technical proficiency in both frontend and backend development (e.g., React, Python, Java, TypeScript). Experience with design engineering, customer workshops, and defining clear problem statements.
Demonstrated ability to work with complex datasets and perform data preprocessing and analysis. Expertise needed in other areas: experience with code repositories, including but not limited to Azure DevOps and GitHub, along with an understanding of branching and tagging strategies and of Continuous Integration/Continuous Deployment (CI/CD) pipelines. Experience with cloud technologies and infrastructure as code (e.g., AWS, Kubernetes, Terraform). Familiarity with software design patterns, architecture trade-offs, and integration best practices. Knowledge of DevOps practices, CI/CD pipelines, and automated testing frameworks. Experience with database management and data processing.
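As a small, generic illustration of the ML-framework familiarity the posting asks for (not InMobi's stack), here is a minimal PyTorch training loop on synthetic data; every name and hyperparameter is a placeholder.

```python
import torch
from torch import nn

# Synthetic binary-classification data: 256 samples, 10 features.
X = torch.randn(256, 10)
y = (X.sum(dim=1) > 0).float().unsqueeze(1)

# A tiny feed-forward network.
model = nn.Sequential(nn.Linear(10, 16), nn.ReLU(), nn.Linear(16, 1))
loss_fn = nn.BCEWithLogitsLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)

for epoch in range(50):
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)   # forward pass on the full batch
    loss.backward()               # backpropagate
    optimizer.step()              # update weights

with torch.no_grad():
    accuracy = ((model(X) > 0).float() == y).float().mean()
print(f"final loss {loss.item():.3f}, train accuracy {accuracy:.2%}")
```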
Posted 1 week ago
10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: Data Architect
Location: Hyderabad, Chennai & Bangalore
Experience: 10+ Years
Job Summary We are seeking a highly experienced and strategic Data Architect to lead the design and implementation of robust data architecture solutions. The ideal candidate will have a deep understanding of data modeling, governance, integration, and analytics platforms. As a Data Architect, you will play a crucial role in shaping the data landscape across the organization, ensuring data availability, consistency, security, and quality. Mandatory Skills Enterprise data architecture and modeling Cloud data platforms (Azure, AWS, GCP) Data warehousing and lakehouse architecture Data governance and compliance frameworks ETL/ELT design and orchestration Master Data Management (MDM) Databricks architecture and implementation Key Responsibilities Lead, define, and implement end-to-end modern data platforms on public cloud using Databricks Design and manage scalable data models and storage solutions Collaborate with enterprise architects, data architects, ETL developers & engineers, data scientists, and information designers to define required data structures, formats, pipelines, metadata, and workload orchestration capabilities Establish data standards, governance policies, and best practices Oversee the integration of new data technologies and tools Lead the development of data pipelines, marts, and lakes Ensure data solutions are compliant with security and regulatory standards Address aspects such as data privacy & security, data ingestion & processing, data storage & compute, analytical & operational consumption, data modeling, data virtualization, self-service data preparation & analytics, AI enablement, and API integrations Mentor data engineers and developers on best practices Qualifications Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field Relevant certifications in cloud platforms, data architecture, or governance Technical Skills Data Modeling: Conceptual, logical, and physical modeling (ERwin, PowerDesigner, etc.) Cloud: Azure (ADF, Synapse, Databricks), AWS (Redshift, Glue), GCP (BigQuery) Databases: SQL Server, Oracle, PostgreSQL, NoSQL (MongoDB, Cassandra) Data Integration: Informatica, Talend, Apache NiFi Big Data: Hadoop, Spark, Kafka Governance Tools: Collibra, Alation, Azure Purview Scripting: Python, SQL, Shell DevOps/DataOps practices and CI/CD tools Soft Skills Strong leadership and stakeholder management Excellent communication and documentation skills Strategic thinking with problem-solving ability Collaborative and adaptive in cross-functional teams Good to Have Experience in AI/ML data lifecycle support Exposure to industry data standards and frameworks (TOGAF, DAMA-DMBOK) Experience with real-time analytics and streaming data solutions Work Experience Minimum 10 years in data engineering, architecture, or related roles At least 5 years of hands-on experience in designing data platforms on Azure Demonstrated knowledge of 2 full project cycles using Databricks as an architect Experience supporting and working with cross-functional teams in a dynamic environment Advanced working SQL knowledge and experience working with relational databases and unstructured datasets Experience with stream-processing systems such as Storm and Spark Streaming (see the sketch after this posting) Compensation & Benefits Competitive salary and annual performance-based bonuses Comprehensive health and optional parental insurance. Retirement savings plans and tax savings plans.
Work-Life Balance: Flexible work hours Key Result Areas (KRAs) Effective implementation of scalable and secure data architecture Governance and compliance adherence Standardization and optimization of data assets Enablement of self-service analytics and data democratization Key Performance Indicators (KPIs) Architecture scalability and reusability metrics Time-to-delivery for data initiatives Data quality and integrity benchmarks Compliance audit outcomes Satisfaction ratings from business stakeholders Contact: hr@bigtappanalytics.com
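The posting above asks for stream-processing experience such as Spark Streaming. As a hedged sketch only, here is a minimal PySpark Structured Streaming job that reads a Kafka topic and appends to a Delta table; the broker address, topic, and paths are hypothetical, and it assumes a Databricks-style runtime where the Kafka connector and Delta format are available.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("events-ingest").getOrCreate()

# Read a Kafka topic as an unbounded stream (hypothetical broker and topic).
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .load()
)

# Kafka delivers bytes; cast the payload to text and keep the event timestamp.
parsed = events.select(
    F.col("key").cast("string"),
    F.col("value").cast("string").alias("payload"),
    F.col("timestamp"),
)

# Append to a Delta table, with checkpointing so the stream can recover.
query = (
    parsed.writeStream
    .format("delta")
    .option("checkpointLocation", "/tmp/checkpoints/events")
    .outputMode("append")
    .start("/tmp/tables/events")
)
query.awaitTermination()
```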
Posted 1 week ago
5.0 years
0 Lacs
India
Remote
🚀 Hiring Big Data Engineer (Scala/AWS) 🚀 Location: Remote (India) Timings: 11 AM – 8 PM IST (Flexible) We’re looking for: ✔ Big Data Experts with strong Scala, Python & AWS skills ✔ Experience building scalable pipelines (batch/streaming) ✔ Hands-on with CI/CD, distributed systems & performance optimization Ideal Profile: - 5+ years in Big Data technologies (Spark, Hadoop, Kafka) - Proficient in Scala (must), Java/Python - AWS cloud-native development (EMR, Glue, S3, Lambda) - Passion for high-performance systems ✨ Why apply? Work with cutting-edge tech, flexible hours, and impactful projects! Interested? DM or tag someone who’d be a great fit! ⬇ #BigData #Scala #AWS #Hiring #DataEngineering
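The post above calls for AWS cloud-native pipeline work (EMR, Glue, S3, Lambda). As a minimal sketch of that workflow, in Python for consistency with the other sketches here even though the role is Scala-first, this stages a file to S3 and launches a Glue job with boto3. The bucket, file, and job names are hypothetical, and a real pipeline would add error handling and poll until completion.

```python
import boto3

s3 = boto3.client("s3")
glue = boto3.client("glue")

# Stage a raw extract in S3 (hypothetical bucket and key).
s3.upload_file("daily_extract.csv", "my-data-lake", "raw/daily_extract.csv")

# Kick off a Glue ETL job that transforms the staged file.
run = glue.start_job_run(
    JobName="daily-transform",
    Arguments={"--input_path": "s3://my-data-lake/raw/daily_extract.csv"},
)

# Check the run's status (a real pipeline would poll until a terminal state).
status = glue.get_job_run(JobName="daily-transform", RunId=run["JobRunId"])
print(status["JobRun"]["JobRunState"])
```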
Posted 1 week ago
5.0 - 8.0 years
20 - 35 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Greetings from LTIMindtree!! About the job Are you looking for a new career challenge? Are you ready to embark on a data-driven career with LTIMindtree? Working for a global leading manufacturing client, providing an engaging product experience through best-in-class PIM implementation, and building rich, relevant, and trusted product information across channels and digital touchpoints so that end customers can make informed purchase decisions will surely be a fulfilling experience. Location: Pan India. Interested candidates kindly apply at the link below and share an updated CV to Hemalatha1@ltimindtree.com https://forms.office.com/r/JhYtz7Vzbn Job Description Key Skills: Cloudera, Spark, Hive, Sqoop jobs Mandatory Skills: Cloudera Administration - Hadoop, Hive, Impala, Spark, Sqoop; maintaining/creating jobs and migration; CI/CD pipeline monitoring and performance tuning. Why join us? Work in industry-leading implementations for Tier-1 clients. Accelerated career growth and global exposure. Collaborative, inclusive work environment rooted in innovation. Exposure to a best-in-class automation framework. Innovation-first culture: we embrace automation, AI insights, and clean data. Know someone who fits this perfectly? Tag them; let's connect the right talent with the right opportunity. DM or email to know more. Let's build something great together!
Posted 1 week ago
0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Our Purpose Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title and Summary: Consultant, Advisors Client Services, Performance Analytics (Consultant – Performance Analytics, Advisors & Consulting Services)

Services within Mastercard is responsible for acquiring, engaging, and retaining customers by managing fraud and risk, enhancing cybersecurity, and improving the digital payments experience. We provide value-added services and leverage expertise, data-driven insights, and execution. Our Advisors & Consulting Services team combines traditional management consulting with Mastercard’s rich data assets, proprietary platforms, and technologies to provide clients with powerful strategic insights and recommendations. Our teams work with a diverse global customer base across industries, from banking and payments to retail and restaurants. The Advisors & Consulting Services group has five specializations: Strategy & Transformation, Performance Analytics, Business Experimentation, Marketing, and Program Management. Our Performance Analytics consultants translate data into insights by leveraging Mastercard and customer data to design, implement, and scale analytical solutions for customers. They use qualitative and quantitative analytical techniques and enterprise applications to synthesize analyses into clear recommendations and impactful narratives. Positions for different specializations and levels are available in separate job postings.
Please review our consulting specializations to learn more about all opportunities and apply for the position that is best suited to your background and experience: https://careers.mastercard.com/us/en/consulting-specializations-at-mastercard Roles and Responsibilities Client Impact Provide creative input on projects across a range of industries and problem statements Contribute to the development of analytics strategies and programs for regional and global clients by leveraging data and technology solutions to unlock client value Collaborate with the Mastercard team to understand clients’ needs, agenda, and risks Develop working relationships with client analysts/managers and act as a trusted and reliable partner Team Collaboration & Culture Collaborate with senior project delivery consultants to identify key findings, prepare effective presentations, and deliver recommendations to clients Independently identify trends, patterns, issues, and anomalies in a defined area of analysis, and structure and synthesize your own analysis to highlight relevant findings Lead internal and client meetings, and contribute to project management Contribute to the firm's intellectual capital Receive mentorship from performance analytics leaders for professional growth and development Qualifications Basic qualifications Undergraduate degree with data and analytics experience in business intelligence and/or descriptive, predictive, or prescriptive analytics Experience managing clients or internal stakeholders Ability to analyze large datasets and synthesize key findings Proficiency using data analytics software (e.g., Python, R, SQL, SAS) Advanced Word, Excel, and PowerPoint skills Ability to perform multiple tasks with multiple clients in a fast-paced, deadline-driven environment Ability to communicate effectively in English and the local office language (if applicable) Eligibility to work in the country where you are applying, as well as to apply for travel visas as required by travel needs Preferred Qualifications Additional data and analytics experience in building, managing, and maintaining database structures, working with data visualization tools (e.g., Tableau, Power BI), or working with the Hadoop framework and coding using Impala, Hive, or PySpark Ability to analyze large datasets and synthesize key findings to provide recommendations via descriptive analytics and business intelligence (a small illustrative sketch follows this posting) Experience managing tasks or workstreams in a collaborative team environment Ability to identify problems, brainstorm and analyze answers, and implement the best solutions Relevant industry expertise Corporate Security Responsibility All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must: Abide by Mastercard’s security policies and practices; Ensure the confidentiality and integrity of the information being accessed; Report any suspected information security violation or breach; and Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines. R-250104
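As a generic illustration of the descriptive-analytics work described above (not Mastercard data or tooling), here is a small pandas sketch that profiles a hypothetical transactions dataset; the columns and values are invented.

```python
import pandas as pd

# Hypothetical transaction-level data.
df = pd.DataFrame({
    "merchant_category": ["grocery", "travel", "grocery", "dining", "travel"],
    "amount": [52.10, 430.00, 18.75, 64.20, 212.50],
    "channel": ["card_present", "online", "card_present", "online", "online"],
})

# Descriptive summary: spend, ticket size, and volume by category and channel.
summary = (
    df.groupby(["merchant_category", "channel"])["amount"]
    .agg(total_spend="sum", avg_ticket="mean", txn_count="count")
    .reset_index()
    .sort_values("total_spend", ascending=False)
)
print(summary)
```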
Posted 1 week ago
3.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
We are seeking an experienced Data Engineer to join our data team. As a Senior Data Engineer, you will work on various data engineering tasks including designing and optimizing data pipelines, data modeling, and troubleshooting data issues. You will collaborate with other data team members, stakeholders, and data scientists to provide data-driven insights and solutions to the organization. 3+ years of experience is required. Responsibilities Design and optimize data pipelines for various data sources. Design and implement efficient data storage and retrieval mechanisms. Develop data modeling solutions and data validation mechanisms (see the sketch after this posting). Troubleshoot data-related issues and recommend process improvements. Collaborate with data scientists and stakeholders to provide data-driven insights and solutions. Coach and mentor junior data engineers in the team. Requirements 3+ years of experience in data engineering or a related field. Strong experience in designing and optimizing data pipelines and in data modeling. Strong proficiency in Python. Experience with big data technologies like Hadoop, Spark, and Hive. Experience with cloud data services such as AWS, Azure, and GCP. Strong experience with database technologies like SQL, NoSQL, and data warehousing. Knowledge of distributed computing and storage systems. An understanding of DevOps, Power Automate, and Microsoft Fabric is an added advantage. Strong analytical and problem-solving skills. Excellent communication and collaboration skills. Bachelor's degree in Computer Science, Data Science, or a computer-related field (Master's degree preferred). This job was posted by Himanshu Chavla from Tecblic.
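As a sketch of the "data validation mechanisms" the posting mentions, here is a minimal Python check that could gate a pipeline stage; the expected schema and the rules are made up for illustration.

```python
import pandas as pd

# Hypothetical expected schema for an orders feed.
EXPECTED_COLUMNS = {"order_id": "int64", "customer_id": "int64", "amount": "float64"}

def validate(df: pd.DataFrame) -> list[str]:
    """Return a list of human-readable validation failures (empty if clean)."""
    problems = []
    for col, dtype in EXPECTED_COLUMNS.items():
        if col not in df.columns:
            problems.append(f"missing column: {col}")
        elif str(df[col].dtype) != dtype:
            problems.append(f"{col}: expected {dtype}, got {df[col].dtype}")
    if "amount" in df.columns and (df["amount"] < 0).any():
        problems.append("amount contains negative values")
    if "order_id" in df.columns and df["order_id"].duplicated().any():
        problems.append("order_id contains duplicates")
    return problems

# Example run: this toy frame trips two of the rules.
df = pd.DataFrame({"order_id": [1, 2, 2],
                   "customer_id": [10, 11, 12],
                   "amount": [5.0, -1.0, 3.5]})
for failure in validate(df):
    print("validation failure:", failure)
```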
Posted 1 week ago
25.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
The Company PayPal has been revolutionizing commerce globally for more than 25 years. Creating innovative experiences that make moving money, selling, and shopping simple, personalized, and secure, PayPal empowers consumers and businesses in approximately 200 markets to join and thrive in the global economy. We operate a global, two-sided network at scale that connects hundreds of millions of merchants and consumers. We help merchants and consumers connect, transact, and complete payments, whether they are online or in person. PayPal is more than a connection to third-party payment networks. We provide proprietary payment solutions accepted by merchants that enable the completion of payments on our platform on behalf of our customers. We offer our customers the flexibility to use their accounts to purchase and receive payments for goods and services, as well as the ability to transfer and withdraw funds. We enable consumers to exchange funds more safely with merchants using a variety of funding sources, which may include a bank account, a PayPal or Venmo account balance, PayPal and Venmo branded credit products, a credit card, a debit card, certain cryptocurrencies, or other stored value products such as gift cards, and eligible credit card rewards. Our PayPal, Venmo, and Xoom products also make it safer and simpler for friends and family to transfer funds to each other. We offer merchants an end-to-end payments solution that provides authorization and settlement capabilities, as well as instant access to funds and payouts. We also help merchants connect with their customers, process exchanges and returns, and manage risk. We enable consumers to engage in cross-border shopping and merchants to extend their global reach while reducing the complexity and friction involved in enabling cross-border trade. Our beliefs are the foundation for how we conduct business every day. We live each day guided by our core values of Inclusion, Innovation, Collaboration, and Wellness. Together, our values ensure that we work together as one global team with our customers at the center of everything we do – and they push us to ensure we take care of ourselves, each other, and our communities.

Staff Software Engineer (R0126349)

Your way to impact In this role, you’ll drive the adoption of Generative AI across PayPal’s Site Reliability Engineering organization by building intelligent, scalable platforms—like AI-powered chatbots and predictive systems—that enhance system reliability and engineering productivity. Your work will enable automation at scale, reduce operational toil, and deliver real-time insights that empower engineers across the business. As a technical leader in SRE Labs, you’ll shape how reliability engineering evolves at PayPal, influencing architecture, best practices, and product direction. You’ll be part of a collaborative, high-performance team that brings cutting-edge technology into mission-critical environments. Your day to day In your day-to-day role, you will: Lead the design and development of GenAI-driven platforms, including intelligent chatbots, AI copilots, and automation tools for Site Reliability use cases. Build and deploy machine learning models in production using technologies like Python, LangChain, and LLM APIs (OpenAI, Anthropic, Azure OpenAI). Collaborate cross-functionally with product, infrastructure, and SRE teams to translate reliability pain points into scalable AI/ML-powered solutions. Develop scalable front-end and backend services using React.js and Node.js to build user-facing AI-powered applications and APIs. Drive system design and architecture decisions that ensure scalability, reliability, and maintainability of AI platforms. Mentor engineers, share best practices, and help evolve PayPal’s GenAI strategy and engineering standards. What Do You Need To Bring: A Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field—or equivalent practical experience. 8+ years of experience designing, developing, and deploying AI/ML solutions, with a strong emphasis on LLMs, chatbots, and scalable architecture.
Proficiency in Python, React.js, and Node.js, with hands-on experience building scalable applications and services. Hands-on expertise with LangChain, LangGraph, RESTful APIs, and cloud platforms, including deploying models in production. Strong understanding of system design principles with experience architecting large-scale, fault-tolerant distributed systems. Solid knowledge of ML pipelines, prompt engineering, retrieval-augmented generation (RAG), and integration of LLMs with enterprise systems (see the toy RAG sketch below). A proven ability to lead cross-functional projects, mentor peers, and influence technical direction while working in Agile, fast-paced environments. Good analytical and problem-solving skills. Strong verbal and written communication skills. Flexibility and willingness to learn new technologies and adapt quickly. We know the confidence gap and imposter syndrome can get in the way of meeting spectacular candidates. Please don't hesitate to apply. What you need to know about the role: SRE Labs, part of PayPal’s Site Reliability Engineering (SRE) organization, is seeking a Staff Software Engineer to lead and shape our Generative AI (GenAI) practice. In this high-impact role, you’ll architect and deliver scalable, production-grade AI solutions—ranging from intelligent chatbots to next-gen automation tools. You'll bring deep engineering expertise and a passion for AI innovation to transform how PayPal builds resilient and intelligent systems. This is a hands-on role with strategic reach: you’ll design, build, and lead AI-first platforms and services that redefine reliability engineering. You’ll work in a fast-paced, agile environment alongside top-tier engineers, product leads, and business partners to bring meaningful, customer-facing innovation to life. Meet our team PayPal SRE is a continuous engineering discipline that effectively combines software development and systems engineering to build and run scalable, distributed, fault-tolerant systems. The SRE team works together to ensure continuous improvement and to optimize for capacity and performance. SRE Labs enables innovation and builds enterprise-wide platforms to support Reliability.
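As a heavily simplified sketch of the retrieval-augmented generation (RAG) pattern named in the requirements (not PayPal's implementation; the model names and the toy runbook corpus are assumptions), this retrieves the closest document by embedding similarity and grounds a chat completion on it.

```python
import numpy as np
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# A toy "knowledge base" standing in for an SRE runbook corpus.
docs = [
    "Runbook: restart the payments gateway with `svc restart pay-gw`.",
    "Runbook: rotate API keys monthly via the secrets manager.",
]

def embed(texts):
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([d.embedding for d in resp.data])

doc_vecs = embed(docs)

def answer(question: str) -> str:
    q_vec = embed([question])[0]
    # Cosine similarity to pick the most relevant document.
    sims = doc_vecs @ q_vec / (np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(q_vec))
    context = docs[int(np.argmax(sims))]
    chat = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": f"Answer using only this context: {context}"},
            {"role": "user", "content": question},
        ],
    )
    return chat.choices[0].message.content

print(answer("How do I restart the payments gateway?"))
```

A production system would replace the in-memory list with a vector store and add citation and fallback handling, but the retrieve-then-ground flow is the core of the pattern.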
For the majority of employees, PayPal's balanced hybrid work model offers 3 days in the office for effective in-person collaboration and 2 days at your choice of either the PayPal office or your home workspace, ensuring that you equally have the benefits and conveniences of both locations. Our Benefits: At PayPal, we’re committed to building an equitable and inclusive global economy. And we can’t do this without our most important asset—you. That’s why we offer benefits to help you thrive in every stage of life. We champion your financial, physical, and mental health by offering valuable benefits and resources to help you care for the whole you. We have great benefits including a flexible work environment, employee shares options, health and life insurance and more. To learn more about our benefits please visit https://www.paypalbenefits.com Who We Are: To learn more about our culture and community visit https://about.pypl.com/who-we-are/default.aspx Commitment to Diversity and Inclusion PayPal provides equal employment opportunity (EEO) to all persons regardless of age, color, national origin, citizenship status, physical or mental disability, race, religion, creed, gender, sex, pregnancy, sexual orientation, gender identity and/or expression, genetic information, marital status, status with regard to public assistance, veteran status, or any other characteristic protected by federal, state, or local law. In addition, PayPal will provide reasonable accommodations for qualified individuals with disabilities. If you are unable to submit an application because of incompatible assistive technology or a disability, please contact us at paypalglobaltalentacquisition@paypal.com. Belonging at PayPal: Our employees are central to advancing our mission, and we strive to create an environment where everyone can do their best work with a sense of purpose and belonging. Belonging at PayPal means creating a workplace with a sense of acceptance and security where all employees feel included and valued. We are proud to have a diverse workforce reflective of the merchants, consumers, and communities that we serve, and we continue to take tangible actions to cultivate inclusivity and belonging at PayPal.
For any general requests for consideration of your skills, please join our Talent Community.

Machine Learning Scientist - Venmo Data Science

Job Description Summary: We are looking for a talented Machine Learning Scientist to join our Venmo Data Science team, driving the development of AI-driven solutions that will shape the future of Venmo. You will design and develop ML solutions including prediction, classification, clustering, and recommendation.
Job Description: With a strong background in Machine Learning and practical experience in building and implementing predictive models to solve business problems, you will help bring insights and identify additional opportunities from data and machine learning to market. Key roles and responsibilities include: Develop and implement advanced ML models, such as gradient boosted decision trees, LLMs, clustering, and deep learning models, to solve critical business problems related to Venmo (see the sketch below). Design and deploy scalable ML/AI solutions that enhance Venmo's ability to provide a seamless customer experience, working closely with our engineering group. Communicate complex concepts and the results of models and analyses to both technical and non-technical audiences, influencing partners and customers with your insights and expertise. Basic Requirements: Master's degree or equivalent experience in a quantitative field (Computer Science, Mathematics, Statistics, Engineering, Artificial Intelligence, etc.) with 3+ years of relevant industry experience, or a PhD with 2+ years of relevant industry experience. A great track record delivering solutions with attention to detail and efficiency. Experience in either the Product or Marketing domain is a big plus. Proficiency in programming languages such as Python and SQL. Fluent spoken and written English communication with business and engineering partners to exchange requirements, explain solution methodologies, and influence with insights. Familiarity with relevant machine learning frameworks and packages such as TensorFlow and PyTorch. GCP/Hadoop and big data experience is an advantage.
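The Venmo role names gradient boosted decision trees among its model families; as a generic scikit-learn sketch on synthetic data (nothing here reflects Venmo data, features, or models):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a tabular prediction problem.
X, y = make_classification(n_samples=2000, n_features=20,
                           n_informative=8, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# Hyperparameters are placeholders; real work would tune them.
model = GradientBoostingClassifier(n_estimators=200, learning_rate=0.05,
                                   max_depth=3)
model.fit(X_train, y_train)

# Evaluate on the held-out split.
probs = model.predict_proba(X_test)[:, 1]
print(f"test AUC: {roc_auc_score(y_test, probs):.3f}")
```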
REQ IDs: R0126346, R0126349
Posted 1 week ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Full-time Job Description Responsible for assembling large, complex sets of data that meet non-functional and functional business requirements. Responsible for identifying, designing, and implementing internal process improvements, including re-designing infrastructure for greater scalability, optimizing data delivery, and automating manual processes. Building the infrastructure required for optimal extraction, transformation, and loading of data from various data sources using Azure, Databricks, and SQL technologies. Responsible for the transformation of conceptual algorithms from R&D into efficient, production-ready code; the data developer must have a strong mathematical background in order to document and maintain the code. Responsible for integrating finished models into larger data processes using UNIX scripting and languages such as ksh, Python, Spark, and Scala. Produce and maintain documentation for released data sets, new programs, shared utilities, or static data; this must be done within department standards. Ensure quality deliverables to clients by following existing quality processes, manually calculating comparison data, developing statistical pass/fail testing, and visually inspecting data for reasonableness: the requirement is on-time with zero defects. Qualifications Education/Training B.E./B.Tech. with a major in Computer Science, BIS, CIS, Electrical Engineering, Operations Research, or another technical field. Course work or experience in Numerical Analysis, Mathematics, or Statistics is a plus. Hard Skills Proven experience working as a data engineer. Highly proficient in using the Spark framework (Python and/or Scala). Extensive knowledge of data warehousing concepts, strategies, and methodologies. Programming experience in Python, SQL, and Scala. Direct experience building data pipelines using Apache Spark (preferably in Databricks) and Airflow (see the sketch after this posting). Hands-on experience designing and delivering solutions using Azure, including Azure Storage, Azure SQL Data Warehouse, and Azure Data Lake. Experience with big data technologies (Hadoop). Databricks & Azure Big Data Architecture Certification would be a plus. Must be team-oriented with strong collaboration, prioritization, and adaptability skills. Ability to write highly efficient code in terms of performance/memory utilization. Basic knowledge of SQL; capable of handling common functions. Experience Minimum 3-6 years of experience as a data engineer. Experience modeling or manipulating large amounts of data is a plus. Experience with demographic or retail business data is a plus. Additional Information Our Benefits Flexible working environment Volunteer time off LinkedIn Learning Employee-Assistance-Program (EAP) About NIQ NIQ is the world’s leading consumer intelligence company, delivering the most complete understanding of consumer buying behavior and revealing new pathways to growth. In 2023, NIQ combined with GfK, bringing together the two industry leaders with unparalleled global reach. With a holistic retail read and the most comprehensive consumer insights—delivered with advanced analytics through state-of-the-art platforms—NIQ delivers the Full View™. NIQ is an Advent International portfolio company with operations in 100+ markets, covering more than 90% of the world’s population. For more information, visit NIQ.com Want to keep up with our latest updates?
Follow us on: LinkedIn | Instagram | Twitter | Facebook Our commitment to Diversity, Equity, and Inclusion NIQ is committed to reflecting the diversity of the clients, communities, and markets we measure within our own workforce. We exist to count everyone and are on a mission to systematically embed inclusion and diversity into all aspects of our workforce, measurement, and products. We enthusiastically invite candidates who share that mission to join us. We are proud to be an Equal Opportunity/Affirmative Action Employer, making decisions without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability status, age, marital status, protected veteran status or any other protected class. Our global non-discrimination policy covers these protected classes in every market in which we do business worldwide. Learn more about how we are driving diversity and inclusion in everything we do by visiting the NIQ News Center: https://nielseniq.com/global/en/news-center/diversity-inclusion
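As a hedged sketch of the pipeline orchestration this role involves (Airflow is named in the posting; the DAG ID, task names, and the extract/transform bodies are placeholders, and the `schedule` argument assumes Airflow 2.4+):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Placeholder: pull raw data from a source system.
    print("extracting")

def transform():
    # Placeholder: run a Spark/Databricks transformation step.
    print("transforming")

with DAG(
    dag_id="daily_sales_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    # Run extract before transform.
    extract_task >> transform_task
```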
Posted 1 week ago
1.0 - 2.6 years
4 - 6 Lacs
Hyderābād
Remote
Analyst – SERVICENOW QA - Deloitte Support Services India Private Limited Solutions Delivery-Canada is an integral part of the Information Technology Services group. The principal focus of this organization is the development and maintenance of technology solutions that e-enable the delivery of Function and Marketplace Services and Management Information Systems. Solutions Delivery Canada develops and maintains solutions built on varied technologies like Salesforce, Microsoft technologies, SAP, Hadoop, ETL, BI, ServiceNow, PowerAutomate, and OpenText. Solutions Delivery Canada has various groups which provide best-of-breed solutions to clients by following a streamlined system development methodology. Solutions Delivery Canada comprises groups like Usability, Application Architecture, Development, Quality Assurance and Performance. Work you’ll do Role: Operates as ServiceNow QA for ServiceNow implementation projects. Responsibilities Strategic Strong communication skills regarding technical topics and remote collaboration skills are critical to this role. Demonstrates an ability to provide accurate project estimates and timelines. Demonstrates an ability to deliver on project commitments. Produces work that consistently meets quality standards. Operational Demonstrates the ability to understand complex business and functional requirements. Demonstrates superior analytical skills in analyzing user, functional, and technical requirements. Demonstrates practical application of testing concepts in day-to-day project activities. Demonstrates the ability to derive test scenarios from business requirements. Demonstrates a working understanding of test design and of writing detailed test cases based on user scenarios. Demonstrates a working understanding of analyzing test execution results and the creation of appropriate test metrics. Demonstrates a working understanding of the defect management process. Demonstrates a working understanding of ServiceNow Test Management and Agile. Demonstrates a working understanding of automation testing concepts and hands-on ServiceNow ATF. Demonstrates a working understanding of automation execution, such as preparing smoke test scenarios and executing them on a continuous basis with each deployment. Demonstrates the ability to enhance automation scripts to build a regression suite. Demonstrates a working understanding of software testing techniques and strategies. Demonstrates a working understanding of mobile testing. Demonstrates a working understanding of the various test management and defect management tools. Demonstrates a working understanding of quality assurance and/or software development processes and methodologies, with the ability to share that knowledge with peers and project team members. Demonstrates excellent analytical skills and the ability to solve complex problems. Identifies ways of “working smarter” through elimination of unnecessary steps or duplication. Location: Hyderabad Work shift timings: 11 AM to 8 PM Qualifications B.Tech, BE, M.Tech or equivalent technical degree. 1-2.6 years of experience in a similar role in an enterprise organisation. Essential Tools/Technology Skills: ServiceNow: The candidate should have knowledge of various ServiceNow applications like GRC/IRM, ITSM, ITBM, CMDB, and Service Mapping.
Demonstrates a working understanding of QE tools like TFS, ServiceNow Test Management, and ServiceNow Agile. Demonstrates a working understanding of various ServiceNow applications like GRC/IRM, ITSM, and ITBM. Working experience with all ITSM modules. ITIL Foundation certified. Demonstrates a working understanding of ServiceNow ATF and Tricentis Tosca. Worked in Agile/Scrum. Good to have: Working experience/knowledge in ServiceNow ATF. Experience/knowledge of ITBM, ITFM, CMDB, etc. ServiceNow System Administrator Certification. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India. Benefits to help you thrive At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 302823
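ServiceNow ATF tests are configured inside the platform rather than written as external code, so the following is only a generic illustration of the smoke/regression split the posting describes, using pytest with hypothetical markers, a hypothetical instance URL, and a stubbed response.

```python
import pytest
import requests

BASE_URL = "https://example.service-now.test"  # hypothetical instance URL

@pytest.mark.smoke  # markers would be registered in pytest.ini
def test_instance_is_reachable():
    # Smoke check: the instance answers at all.
    resp = requests.get(BASE_URL, timeout=10)
    assert resp.status_code == 200

@pytest.mark.regression
def test_incident_form_fields():
    # Placeholder regression check; a real suite would query a known record
    # through the instance's REST API instead of this stub.
    expected_fields = {"number", "short_description", "priority"}
    actual_fields = {"number", "short_description", "priority"}  # stub
    assert expected_fields <= actual_fields
```

With this layout, `pytest -m smoke` gives the fast per-deployment gate and the full run serves as the regression suite.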
Posted 1 week ago
1.0 years
4 - 6 Lacs
Hyderābād
On-site
- 1+ years of data engineering experience
- Experience with data modeling, warehousing and building ETL pipelines
- Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala)
- Experience with one or more scripting languages (e.g., Python, KornShell)
As a Data Engineer, you will be working on building and maintaining complex data pipelines and assembling large, complex datasets to generate business insights, enable data-driven decision making, and support the rapidly growing and dynamic business demand for data. You will have an opportunity to collaborate and work with various teams of business analysts, managers, software development engineers, and data engineers to determine how best to design, implement, and support solutions. You will be challenged and provided with tremendous growth opportunity in a customer-facing, fast-paced, agile environment. Key job responsibilities * Design, implement, and support analytical data platform solutions for data-driven decisions and insights * Design data schemas and operate internal data warehouses and SQL/NoSQL database systems * Work on different data model designs, architecture, implementation, discussions, and optimizations * Interface with other teams to extract, transform, and load data from a wide variety of data sources using AWS big data technologies like EMR, Redshift, Elasticsearch, etc. (see the sketch after this posting) * Work on different AWS technologies such as S3, Redshift, Lambda, Glue, etc., and explore and learn the latest AWS technologies to provide new capabilities and increase efficiency * Work on the data lake platform and different components in the data lake, such as Hadoop and Amazon S3 * Work on SQL technologies on Hadoop, such as Spark, Hive, and Impala * Help continually improve ongoing analysis processes, optimizing or simplifying self-service support for customers * Possess strong verbal and written communication skills, be self-driven, and deliver high-quality results in a fast-paced environment * Recognize and adopt best practices in reporting and analysis: data integrity, test design, analysis, validation, and documentation * Enjoy working closely with your peers in a group of talented engineers and gain knowledge * Be enthusiastic about building deep domain knowledge of Amazon's various business domains * Own the development and maintenance of ongoing metrics, reports, analyses, dashboards, etc. to drive key business decisions. Experience with big data technologies such as Hadoop, Hive, Spark, and EMR. Experience with any ETL tool, like Informatica, ODI, SSIS, BODI, or Datastage. Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
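A minimal sketch of the extract-transform-load flow described above, in PySpark; the bucket names, paths, and columns are hypothetical, and EMR or Glue cluster specifics are omitted.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: raw CSV landed in S3 (hypothetical bucket and layout).
raw = spark.read.option("header", "true").csv("s3://example-bucket/raw/orders/")

# Transform: type the columns and derive a partition key.
orders = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("amount").isNotNull())
)

# Load: columnar, partitioned output that downstream SQL engines
# (e.g., Redshift Spectrum, Athena, Hive) can query efficiently.
(orders.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-bucket/curated/orders/"))
```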
Posted 1 week ago
2.0 years
1 - 6 Lacs
Hyderābād
On-site
Microsoft is a company where passionate innovators come to collaborate, envision what can be and take their careers further. This is a world of more possibilities, more innovation, more openness, and sky-is-the-limit thinking in a cloud-enabled world. Microsoft’s Azure Data engineering team is leading the transformation of analytics in the world of data with products like databases, data integration, big data analytics, messaging & real-time analytics, and business intelligence. The products in our portfolio include Microsoft Fabric, Azure SQL DB, Azure Cosmos DB, Azure PostgreSQL, Azure Data Factory, Azure Synapse Analytics, Azure Service Bus, Azure Event Grid, and Power BI. Our mission is to build the data platform for the age of AI, powering a new class of data-first applications and driving a data culture. Within Azure Data, the data integration team builds data gravity on the Microsoft Cloud. Massive volumes of data are generated – not just from transactional systems of record, but also from the world around us. Our data integration products – Azure Data Factory and Power Query – make it easy for customers to bring in, clean, shape, and join data, to extract intelligence. We do not just value differences or different perspectives. We seek them out and invite them in so we can tap into the collective power of everyone in the company. As a result, our customers are better served. Responsibilities • Build cloud-scale products with a focus on efficiency, reliability and security. Build and maintain end-to-end Build, Test and Deployment pipelines. Deploy and manage massive Hadoop, Spark and other clusters. Contribute to the architecture & design of the products. Triage issues and implement solutions to restore service with minimal disruption to the customer and business. Perform root cause analysis, trend analysis and post-mortems. Own components and drive them end to end, all the way from gathering requirements, development, testing, and deployment to ensuring high quality and availability post deployment. Embody our culture and values Qualifications Required/Minimum Qualifications • Bachelor's Degree in Computer Science, or related technical discipline AND 2+ years technical engineering experience with coding in languages including, but not limited to, C, C++, C#, Java, JavaScript, or Python OR equivalent experience. Experience in data integration or migrations or ELT or ETL tooling is mandatory. Other Requirements Ability to meet Microsoft, customer and/or government security screening requirements is required for this role. These requirements include, but are not limited to the following specialized security screenings: Microsoft Cloud Background Check: This position will be required to pass the Microsoft Cloud background check upon hire/transfer and every two years thereafter. 
Preferred/Additional Qualifications • Ability to meet Microsoft, customer and/or government security screening requirements is required for this role. These requirements include, but are not limited to the following specialized security screenings: Microsoft Cloud Background Check: This position will be required to pass the Microsoft Cloud background check upon hire/transfer and every two years thereafter. Equal Opportunity Employer (EOE) #azdat #azuredata #microsoftfabric #dataintegration Microsoft is an equal opportunity employer. Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
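As a hedged illustration of the data integration work this posting describes — a sketch, not Microsoft's implementation — triggering an Azure Data Factory pipeline run from Python with the azure-mgmt-datafactory SDK might look like this; the subscription, resource group, factory, and pipeline names are all hypothetical:

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient

    # All resource names below are placeholders
    client = DataFactoryManagementClient(
        credential=DefaultAzureCredential(),
        subscription_id="<subscription-id>",
    )

    # Start a pipeline run with a runtime parameter
    run = client.pipelines.create_run(
        resource_group_name="rg-data",
        factory_name="adf-example",
        pipeline_name="copy_orders",
        parameters={"ingest_date": "2025-01-01"},
    )
    print("Pipeline run id:", run.run_id)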
Posted 1 week ago
8.0 years
30 - 38 Lacs
Haryāna
Remote
Role: AWS Data Engineer Location: Gurugram Mode: Hybrid Type: Permanent Job Description: We are seeking a talented and motivated Data Engineer with 8+ years of hands-on experience to join our growing data team. The ideal candidate will have experience working with large datasets, building data pipelines, and utilizing AWS public cloud services to support the design, development, and maintenance of scalable data architectures. This is an excellent opportunity for individuals who are passionate about data engineering and cloud technologies and want to make an impact in a dynamic and innovative environment. Key Responsibilities: Data Pipeline Development: Design, develop, and optimize end-to-end data pipelines for extracting, transforming, and loading (ETL) large volumes of data from diverse sources into data warehouses or data lakes. Cloud Infrastructure Management: Implement and manage data processing and storage solutions in AWS (Amazon Web Services) using services like S3, Redshift, Lambda, Glue, Kinesis, and others. Data Modeling: Collaborate with data scientists, analysts, and business stakeholders to define data requirements and design optimal data models for reporting and analysis. Performance Tuning & Optimization: Identify bottlenecks and optimize query performance, pipeline processes, and cloud resources to ensure cost-effective and scalable data workflows. Automation & Scripting: Develop automated data workflows and scripts to improve operational efficiency using Python, SQL, or other scripting languages. Collaboration & Documentation: Work closely with data analysts, data scientists, and other engineering teams to ensure data availability, integrity, and quality. Document processes, architectures, and solutions clearly. Data Quality & Governance: Ensure the accuracy, consistency, and completeness of data. Implement and maintain data governance policies to ensure compliance and security standards are met. Troubleshooting & Support: Provide ongoing support for data pipelines and troubleshoot issues related to data integration, performance, and system reliability. Qualifications: Essential Skills: Experience: 8+ years of professional experience as a Data Engineer, with a strong background in building and optimizing data pipelines and working with large-scale datasets. AWS Experience: Hands-on experience with AWS cloud services, particularly S3, Lambda, Glue, Redshift, RDS, and EC2. ETL Processes: Strong understanding of ETL concepts, tools, and frameworks. Experience with data integration, cleansing, and transformation. Programming Languages: Proficiency in Python, SQL, and other scripting languages (e.g., Bash, Scala, Java). Data Warehousing: Experience with relational and non-relational databases, including data warehousing solutions like AWS Redshift, Snowflake, or similar platforms. Data Modeling: Experience in designing data models, schema design, and data architecture for analytical systems. Version Control & CI/CD: Familiarity with version control tools (e.g., Git) and CI/CD pipelines. Problem-Solving: Strong troubleshooting skills, with an ability to optimize performance and resolve technical issues across the data pipeline. Desirable Skills: Big Data Technologies: Experience with Hadoop, Spark, or other big data technologies. Containerization & Orchestration: Knowledge of Docker, Kubernetes, or similar containerization/orchestration technologies. Data Security: Experience implementing security best practices in the cloud and managing data privacy requirements. 
Data Streaming: Familiarity with data streaming technologies such as AWS Kinesis or Apache Kafka. Business Intelligence Tools: Experience with BI tools (Tableau, Quicksight) for visualization and reporting. Agile Methodology: Familiarity with Agile development practices and tools (Jira, Trello, etc.) Job Type: Permanent Pay: ₹3,000,000.00 - ₹3,800,000.00 per year Benefits: Work from home Schedule: Day shift Monday to Friday Experience: Data Engineering: 6 years (Required) AWS: 4 years (Required) Python: 4 years (Required) Work Location: Hybrid remote in Haryana, Haryana
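As a small, hedged sketch of the automation and scripting this role calls for — not the employer's actual tooling, and the job name and argument below are hypothetical — starting an AWS Glue job from Python with boto3 looks like this:

    import boto3

    glue = boto3.client("glue", region_name="ap-south-1")

    # Kick off a (hypothetical) Glue ETL job with a runtime argument
    response = glue.start_job_run(
        JobName="orders-to-redshift",
        Arguments={"--ingest_date": "2025-01-01"},
    )
    print("Started run:", response["JobRunId"])

A scheduled Lambda function or an Airflow DAG would typically wrap a call like this to automate the pipeline end to end.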
Posted 1 week ago
0 years
10 - 30 Lacs
Sonipat
Remote
Newton School of Technology is on a mission to transform technology education and bridge the employability gap. As India’s first impact university, we are committed to revolutionizing learning, empowering students, and shaping the future of the tech industry. Backed by renowned professionals and industry leaders, we aim to solve the employability challenge and create a lasting impact on society. We are currently looking for a Data Mining Engineer to join our Computer Science Department. This is a full-time academic role focused on data mining, analytics, and teaching/mentoring students in core data science and engineering topics. Key Responsibilities: ● Develop and deliver comprehensive and engaging lectures for the undergraduate "Data Mining", “Big Data”, and “Data Analytics” courses, covering the full syllabus from foundational concepts to advanced techniques. ● Instruct students on the complete data lifecycle, including data preprocessing, cleaning, transformation, and feature engineering. ● Teach the theory, implementation, and evaluation of a wide range of algorithms for Classification, Association Rule Mining, Clustering, and Anomaly Detection. ● Design and facilitate practical lab sessions and assignments that provide students with hands-on experience using modern data tools and software. ● Develop and grade assessments, including assignments, projects, and examinations, that effectively measure the Course Learning Objectives (CLOs). ● Mentor and guide students on projects, encouraging them to work with real-world or benchmark datasets (e.g., from Kaggle). ● Stay current with the latest advancements, research, and industry trends in data engineering and machine learning to ensure the curriculum remains relevant and cutting-edge. ● Contribute to the academic and research environment of the department and the university. Required Qualifications: ● A Ph.D. (or a Master's degree with significant, relevant industry experience) in Computer Science, Data Science, Artificial Intelligence, or a closely related field. ● Demonstrable expertise in the core concepts of data engineering and machine learning as outlined in the syllabus. ● Strong practical proficiency in Python and its data science ecosystem, specifically Scikit-learn, Pandas, NumPy, and visualization libraries (e.g., Matplotlib, Seaborn). ● Proven experience in teaching, preferably at the undergraduate level, with an ability to make complex topics accessible and engaging. ● Excellent communication and interpersonal skills. Preferred Qualifications: ● A strong record of academic publications in reputable data mining, machine learning, or AI conferences/journals. ● Prior industry experience as a Data Scientist, Big Data Engineer, Machine Learning Engineer, or in a similar role. ● Experience with big data technologies (e.g., Spark, Hadoop) and/or deep learning frameworks (e.g., TensorFlow, PyTorch). ● Experience in mentoring student teams for data science competitions or hackathons. Perks & Benefits: ● Competitive salary packages aligned with industry standards. ● Access to state-of-the-art labs and classroom facilities. ● To know more about us, feel free to explore our website: Newton School of Technology. We look forward to the possibility of having you join our academic team and help shape the future of tech education! 
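To make the lab expectations concrete, here is a minimal scikit-learn classification example of the kind a student assignment might start from — a sketch using a standard benchmark dataset, not actual course material from the university:

    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.metrics import classification_report

    # Load a benchmark dataset and hold out a stratified test split
    X, y = load_iris(return_X_y=True, as_frame=True)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=42, stratify=y
    )

    # Train and evaluate a shallow decision tree classifier
    clf = DecisionTreeClassifier(max_depth=3, random_state=42).fit(X_train, y_train)
    print(classification_report(y_test, clf.predict(X_test)))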
Job Type: Full-time Pay: ₹1,000,000.00 - ₹3,000,000.00 per year Benefits: Food provided Health insurance Leave encashment Paid sick time Paid time off Provident Fund Work from home Schedule: Day shift Monday to Friday Supplemental Pay: Performance bonus Quarterly bonus Yearly bonus Application Question(s): Are you interested in a full-time time onsite Instructor role? Are you ready to relocate to Sonipat - NCR Delhi? Are you ready to relocate to Pune? Work Location: In person Expected Start Date: 15/07/2025
Posted 1 week ago
1.0 years
5 - 10 Lacs
Gurgaon
On-site
- 1+ years of experience with data querying languages (e.g., SQL), scripting languages (e.g., Python), or statistical/mathematical software (e.g., R, SAS, Matlab) - 2+ years of experience as a data/research scientist, statistician, or quantitative analyst in an internet-based company with complex and big data sources Job Description Are you interested in applying your strong quantitative analysis and big data skills to world-changing problems? Are you interested in driving the development of methods, models and systems for capacity planning, transportation and fulfillment networks? If so, then this is the job for you. Our team is responsible for creating core analytics capabilities, platform development, and data engineering. We develop scalable analytics applications and research models to optimize operational processes. We standardize and optimize data sources and visualization efforts across geographies, and build and maintain the online BI services and data marts. You will work with professional software development managers, data engineers, scientists, business intelligence engineers and product managers using rigorous quantitative approaches to ensure high quality data tech products for our customers around the world, including India, Australia, Brazil, Mexico, Singapore and the Middle East. Amazon is growing rapidly and because we are driven by faster delivery to customers, a more efficient supply chain network, and lower cost of operations, our main focus is in the development of strategic models and automation tools fed by our massive amounts of available data. You will be responsible for building these models/tools that improve the economics of Amazon’s worldwide fulfillment networks in emerging countries as Amazon increases the speed and decreases the cost to deliver products to customers. You will identify and evaluate opportunities to reduce variable costs by improving fulfillment center processes, transportation operations and scheduling, and the execution to operational plans. You will also improve the efficiency of capital investment by helping the fulfillment centers to improve storage utilization and the effective use of automation. Finally, you will help create the metrics to quantify improvements to the fulfillment costs (e.g., transportation and labor costs) resulting from the application of these optimization models and tools. Major responsibilities include: · Translate business questions and concerns into specific analytical questions that can be answered with available data using BI tools; produce the required data when it is not available. · Apply statistical and machine learning methods to specific business problems and data. · Create global standard metrics across regions and perform benchmark analysis. · Ensure data quality throughout all stages of acquisition and processing, including such areas as data sourcing/collection, ground truth generation, normalization, transformation, cross-lingual alignment/mapping, etc. · Communicate proposals and results in a clear manner backed by data and coupled with actionable conclusions to drive business decisions. · Collaborate with colleagues from multidisciplinary science, engineering and business backgrounds. · Develop efficient data querying and modeling infrastructure. · Manage your own process. Prioritize and execute on high impact projects, triage external requests, and ensure to deliver projects in time. · Utilize code (Python, R, Scala, etc.) for analyzing data and building statistical models. 
Knowledge of statistical packages and business intelligence tools such as SPSS, SAS, S-PLUS, or R Experience with clustered data processing (e.g., Hadoop, Spark, MapReduce, and Hive) Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
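For context on the modeling this posting describes, a toy regression on synthetic shipment data could look like the sketch below — every value is fabricated purely to illustrate the Python workflow, and this is not Amazon's model:

    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LinearRegression

    # Synthetic stand-in for shipment-level cost data (all values made up)
    rng = np.random.default_rng(0)
    df = pd.DataFrame({
        "distance_km": rng.uniform(5, 500, 1000),
        "weight_kg": rng.uniform(0.1, 20, 1000),
    })
    df["cost"] = 20 + 0.15 * df["distance_km"] + 2.0 * df["weight_kg"] + rng.normal(0, 5, 1000)

    # Fit a simple cost model and inspect the learned coefficients
    model = LinearRegression().fit(df[["distance_km", "weight_kg"]], df["cost"])
    print(dict(zip(["distance_km", "weight_kg"], model.coef_.round(3))))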
Posted 1 week ago
50.0 years
0 Lacs
Bhilai, Chhattisgarh, India
On-site
INNODEED SYSTEMS PRIVATE LIMITED is a cutting-edge AI-driven software solutions company, revolutionizing web and mobile application development with advanced artificial intelligence. From creating intelligent native apps and AI-powered bots that automate and optimize business processes to launching and marketing your digital solutions, we provide comprehensive, end-to-end services. Our expert team, with over 50 years of combined experience, harnesses the power of AI to enhance efficiency, user engagement, and overall digital transformation. By integrating state-of-the-art AI technologies, we drive innovation, streamline operations, and deliver unparalleled digital experiences. Explore the future of AI-driven solutions with us at www.innodeed.com We are looking for an experienced J2EE Developer to build and enhance AI-powered enterprise applications. The ideal candidate should have a strong background in Java, J2EE frameworks, and AI/ML integrations. You will collaborate with AI engineers, data scientists, and front-end developers to develop scalable, high-performance applications. Key Responsibilities: · Develop, optimize, and maintain AI-powered applications using Java and J2EE technologies. · Integrate AI/ML models using TensorFlow, OpenAI APIs, or other ML frameworks. · Design and implement RESTful APIs and Web Services. · Work with databases like Oracle, MySQL, or PostgreSQL for data management. · Collaborate with front-end developers to ensure seamless application performance. · Implement authentication, authorization, and data security best practices. · Optimize application performance, scalability, and reliability. · Debug, test, and resolve issues to maintain high-quality application standards. · Stay updated with the latest advancements in J2EE, AI, and cloud computing. · Participate in Agile development processes and contribute to sprint planning. Required Skills & Qualifications: · Bachelor's or Master’s degree in Computer Science, IT, or a related field. · 3+ years of experience in Java/J2EE development. · Strong knowledge of Spring Framework, Spring Boot, Hibernate, JPA. · Proficiency in RDBMS (Oracle, MySQL, or PostgreSQL) is essential. · Experience integrating AI/ML models using TensorFlow, OpenAI API, or similar technologies. · Hands-on experience with RESTful APIs, SOAP, and Microservices architecture. · Familiarity with cloud-based AI services (AWS, Azure, Google AI APIs). · Proficiency in version control systems like Git. · Strong problem-solving skills and a passion for AI-driven innovation. · Experience working with AI-powered applications or machine learning models. · A strong understanding of OOP concepts is essential. · Knowledge of Big Data processing frameworks (Apache Kafka, Spark, Hadoop) is an added advantage. · Knowledge of the CI/CD pipeline is a plus.
Posted 1 week ago
3.0 years
10 - 12 Lacs
Delhi
On-site
Senior Fullstack AI/ML Engineer Location: Delhi Experience: 3-5 years Mode: On-site About the Role We are seeking a highly skilled Senior AI/ML Engineer to join our dynamic team. The ideal candidate will have extensive experience in designing, building, and deploying machine learning models and AI solutions to solve real-world business challenges. You will collaborate with cross-functional teams to create and integrate AI/ML models into end-to-end applications, ensuring models are accessible through APIs or product interfaces for real-time usage. Responsibilities Lead the design, development, and deployment of machine learning models for various use cases such as recommendation systems, computer vision, natural language processing (NLP), and predictive analytics. Work with large datasets to build, train, and optimize models using techniques such as classification, regression, clustering, and neural networks. Fine-tune pre-trained models and develop custom models based on specific business needs. Collaborate with data engineers to build scalable data pipelines and ensure the smooth integration of models into production. Collaborate with frontend/backend engineers to build AI-driven features into products or platforms. Build proof-of-concept or production-grade AI applications and tools with intuitive UIs or workflows. Ensure scalability and performance of deployed AI solutions within the full application stack. Implement model monitoring and maintenance strategies to ensure performance, accuracy, and continuous improvement of deployed models. Design and implement APIs or services that expose machine learning models to frontend or other systems. Utilize cloud platforms (AWS, GCP, Azure) to deploy, manage, and scale AI/ML solutions. Stay up-to-date with the latest advancements in AI/ML research, and apply innovative techniques to improve existing systems. Communicate effectively with stakeholders to understand business requirements and translate them into AI/ML-driven solutions. Document processes, methodologies, and results for future reference and reproducibility. Required Skills & Qualifications Experience: 5+ years of experience in AI/ML engineering roles, with a proven track record of successfully delivering machine learning projects. AI/ML Expertise: Strong knowledge of machine learning algorithms (supervised, unsupervised, reinforcement learning) and AI techniques, including NLP, computer vision, and recommendation systems. Programming Languages: Proficient in Python and relevant ML libraries such as TensorFlow, PyTorch, Scikit-learn, and Keras. Data Manipulation: Experience with data manipulation libraries such as Pandas, NumPy, and SQL for managing and processing large datasets. Model Development: Expertise in building, training, deploying, and fine-tuning machine learning models in production environments. Cloud Platforms: Experience with cloud platforms such as AWS, GCP, or Azure for the deployment and scaling of AI/ML models. MLOps: Knowledge of MLOps practices for model versioning, automation, and monitoring. Data Preprocessing: Proficient in data cleaning, feature engineering, and preparing datasets for model training. Strong experience building and deploying end-to-end AI-powered applications — not just models but full system integration. Hands-on experience with Flask, FastAPI, Django, or similar for building REST APIs for model serving. Understanding of system design and software architecture for integrating AI into production environments. 
Experience with frontend/backend integration (basic React/Next.js knowledge is a plus). Demonstrated projects where AI models were part of deployed user-facing applications. NLP & Computer Vision: Hands-on experience with natural language processing or computer vision projects. Big Data: Familiarity with big data tools and frameworks (e.g., Apache Spark, Hadoop) is an advantage. Problem-Solving Skills: Strong analytical and problem-solving abilities, with a focus on delivering practical AI/ML solutions. Nice to Have Experience with deep learning architectures (CNNs, RNNs, GANs, etc.) and techniques. Knowledge of deployment strategies for AI models using APIs, Docker, or Kubernetes. Experience building full-stack applications powered by AI (e.g., chatbots, recommendation dashboards, AI assistants, etc.). Experience deploying AI/ML models in real-time environments using API gateways, microservices, or orchestration tools like Docker and Kubernetes. Solid understanding of statistics and probability. Experience working in Agile development environments. What You'll Gain Be part of a forward-thinking team working on cutting-edge AI/ML technologies. Collaborate with a diverse, highly skilled team in a fast-paced environment. Opportunity to work on impactful projects with real-world applications. Competitive salary and career growth opportunities. Job Type: Full-time Pay: ₹1,000,000.00 - ₹1,200,000.00 per year Schedule: Day shift Fixed shift Work Location: In person
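Since the posting emphasizes exposing models through APIs, here is a minimal FastAPI sketch of model serving — an illustrative example rather than the company's stack; the model file and feature shape are hypothetical:

    from typing import List

    import joblib
    import numpy as np
    from fastapi import FastAPI
    from pydantic import BaseModel

    app = FastAPI(title="model-api")

    # Hypothetical pre-trained scikit-learn model serialized with joblib
    model = joblib.load("model.joblib")

    class Features(BaseModel):
        values: List[float]

    @app.post("/predict")
    def predict(payload: Features):
        # Wrap the single observation in a 2-D array as scikit-learn expects
        pred = model.predict(np.array([payload.values]))
        return {"prediction": pred.tolist()[0]}

Assuming the file is main.py, it can be run locally with: uvicorn main:app --reload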
Posted 1 week ago
3.0 - 5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Skills and Qualifications: • Overall 3-5 years of hands-on experience as a Data Engineer, with at least 2-3 years of direct Azure/AWS/GCP Data Engineering experience. • Strong SQL and Python development skills are mandatory. • Solid experience in data engineering, working with distributed architectures, ETL/ELT, and big data technologies. • Demonstrated knowledge and experience with Google Cloud BigQuery is a must. • Experience with DataProc and Dataflow is highly preferred. • Strong understanding of serverless data warehousing on GCP and familiarity with DWBI modeling frameworks. • Extensive experience in SQL across various database platforms. • Experience in data mapping and data modeling. • Familiarity with data analytics tools and best practices. • Hands-on experience with one or more programming/scripting languages such as Python, JavaScript, Java, R, or UNIX Shell. • Practical experience with Google Cloud services including but not limited to: o BigQuery, Bigtable o Cloud Dataflow, Cloud Dataproc o Cloud Storage, Pub/Sub o Cloud Functions, Cloud Composer o Cloud Spanner, Cloud SQL • Knowledge of modern data mining, cloud computing, and data management tools (such as Hadoop, HDFS, and Spark). • Familiarity with GCP tools like Looker, Airflow DAGs, Data Studio, App Maker, etc. • Hands-on experience implementing enterprise-wide cloud data lake and data warehouse solutions on GCP. • GCP Data Engineer Certification is highly preferred.
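To ground the BigQuery requirement, a minimal parameterized query via the google-cloud-bigquery client might look like the sketch below; the project, dataset, and table names are hypothetical:

    from google.cloud import bigquery

    client = bigquery.Client()  # uses Application Default Credentials

    # Parameterized aggregation over a (hypothetical) orders table
    query = """
        SELECT order_date, SUM(amount) AS revenue
        FROM `example-project.sales.orders`
        WHERE order_date >= @start
        GROUP BY order_date
        ORDER BY order_date
    """
    job_config = bigquery.QueryJobConfig(
        query_parameters=[bigquery.ScalarQueryParameter("start", "DATE", "2025-01-01")]
    )
    for row in client.query(query, job_config=job_config).result():
        print(row.order_date, row.revenue)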
Posted 1 week ago
1.0 years
2 - 3 Lacs
Indore
On-site
Excel/Google Sheets – Data cleaning, pivot tables, VLOOKUP, charts. SQL – Extracting and querying data from databases. Python or R – For data manipulation, analysis, and automation. Data Visualization Tools – Tableau, Power BI, Looker, or Google Data Studio. Statistics & Probability – Understanding of statistical methods and hypothesis testing. Data Cleaning & Wrangling – Preprocessing messy data for analysis. Machine Learning Basics – Predictive analytics, regression, classification (optional but valuable). Big Data Tools – Familiarity with Hadoop, Spark (for advanced roles). ETL Tools – Knowledge of data pipelines and integration tools. Cloud Platforms – AWS, GCP, Azure (for storage and analysis at scale). Soft Skills: Analytical Thinking – Ability to derive insights from complex data sets. Problem Solving – Using data to address business challenges. Attention to Detail – Ensuring accuracy in data work. Communication – Explaining insights clearly to non-technical stakeholders. Curiosity & Learning – Always exploring new tools and methods. For more information, contact 8827277596. Job Types: Full-time, Permanent Pay: ₹20,000.00 - ₹25,000.00 per month Benefits: Paid holidays, paid sick leave Experience: Data Analytics: 1 year (Required) Power BI: 1 year (Required) SQL: 1 year (Required) Work Location: In person
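As a concrete, hedged example of the Excel-style cleaning and pivoting listed above (the toy data is invented for illustration):

    import pandas as pd

    # Toy sales extract standing in for a messy source file (made-up data)
    df = pd.DataFrame({
        "region": ["North", "North", "South", None],
        "month": ["Jan", "Feb", "Jan", "Feb"],
        "sales": ["1000", "1,200", "850", "900"],
    })

    # Cleaning: drop incomplete rows, normalize the numeric column
    df = df.dropna(subset=["region"])
    df["sales"] = df["sales"].str.replace(",", "", regex=False).astype(int)

    # Pivot table equivalent to a spreadsheet PivotTable
    print(df.pivot_table(index="region", columns="month", values="sales", aggfunc="sum"))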
Posted 1 week ago
4.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About the Company KPMG in India is a leading professional services firm established in August 1993. The firm offers a wide range of services, including audit, tax, and advisory, to national and international clients across various sectors. KPMG operates from offices in 14 cities, including Mumbai, Bengaluru, Chennai, and Delhi. KPMG India is known for its rapid, performance-based, industry-focused, and technology-enabled services. The firm leverages its global network to provide informed and timely business advice, helping clients mitigate risks and seize opportunities. KPMG India is committed to quality and excellence, fostering a culture of growth, innovation, and collaboration. About the Role Data Scientist Job Location: Bangalore/Gurgaon Experience: 4-10 years Responsibilities 4+ years of work experience as a Data Scientist A combination of business focus, strong analytical and problem-solving skills, and programming knowledge to be able to quickly cycle hypothesis through the discovery phase of a project Advanced skills with statistical/programming software (e.g., R, Python) and data querying languages (e.g., SQL, Hadoop/Hive, Scala) Good hands-on skills in both feature engineering and hyperparameter optimization Experience producing high-quality code, tests, documentation Experience with Microsoft Azure or AWS data management tools such as Azure Data factory, data lake, Azure ML, Synapse, Databricks Understanding of descriptive and exploratory statistics, predictive modelling, evaluation metrics, decision trees, machine learning algorithms, optimization & forecasting techniques, and/or deep learning methodologies Proficiency in statistical concepts and ML algorithms Good knowledge of Agile principles and process Ability to lead, manage, build, and deliver customer business results through data scientists or professional services team Ability to share ideas in a compelling manner, to clearly summarize and communicate data analysis assumptions and results Self-motivated and a proactive problem solver who can work independently and in teams Equal Opportunity Statement KPMG India has a policy of providing equal opportunity for all applicants and employees regardless of their color, caste, religion, age, sex/gender, national origin, citizenship, sexual orientation, gender identity or expression, disability or other legally protected status. KPMG India values diversity and we request you to submit the details below to support us in our endeavor for diversity. Providing the below information is voluntary and refusal to submit such information will not be prejudicial to you.
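Since this posting highlights feature engineering and hyperparameter optimization, a small cross-validated grid-search sketch (standard scikit-learn benchmark data, not client work) could look like:

    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import GridSearchCV

    X, y = load_breast_cancer(return_X_y=True)

    # Cross-validated hyperparameter search over a small grid
    grid = GridSearchCV(
        RandomForestClassifier(random_state=42),
        param_grid={"n_estimators": [100, 300], "max_depth": [4, 8, None]},
        cv=5,
        scoring="roc_auc",
    )
    grid.fit(X, y)
    print(grid.best_params_, round(grid.best_score_, 4))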
Posted 1 week ago
8.0 - 13.0 years
14 - 22 Lacs
Mumbai
Work from Office
Design and build ETL pipelines to support enterprise data initiatives. Develop and implement databases and data collection systems. Acquire data from primary and secondary sources; maintain reliable data systems. Bring extensive experience in data ingestion, modelling, architecture design, and consulting for complex, cross-functional projects involving multiple stakeholders. Identify relevant data sources aligned with specific business requirements. Conduct data quality assessments and build validation processes to ensure integrity. Generate actionable insights from data sets, identifying key trends and patterns. Create detailed executive-level reports and dashboards for project teams. Act as a single point of contact for all data-related issues across business units. Collaborate with vendors and functional, operational, and technical teams to address and support data needs. Ensure data consistency and accuracy from all providers and external sources. Support the development of data platforms, maintaining focus on data quality and completeness. Create and maintain comprehensive technical documentation. Build for scalability and high performance in all development efforts. Troubleshoot data infrastructure and data processing issues effectively. Support data roadmap initiatives, recommending optimal technologies. Lead the automation of multi-step, repetitive tasks, improving efficiency and reducing manual errors.
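The data quality assessment and validation work described above can be illustrated with a small, hedged pandas sketch — the feed and column names below are invented:

    import pandas as pd

    def quality_report(df: pd.DataFrame, key: str) -> dict:
        # Basic data-quality assessment: completeness and key uniqueness
        return {
            "rows": len(df),
            "null_pct": df.isna().mean().round(3).to_dict(),
            "duplicate_keys": int(df[key].duplicated().sum()),
        }

    # Made-up extract standing in for a vendor feed
    feed = pd.DataFrame({"id": [1, 2, 2, 4], "amount": [10.0, None, 5.5, 8.0]})
    print(quality_report(feed, key="id"))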
Posted 1 week ago
3.0 - 6.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Summary Position Summary AI & Data In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI & Data will work with our clients to: Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements Experience: 3-6 years of hands-on experience in designing and developing conceptual, logical, and physical data models for relational, dimensional, and NoSQL data platforms. Knowledge of Data Vault, NoSQL, Dimensional Modeling, Graph data model, and proficiency in at least one of these. Proven experience with data warehousing, data lakes, and enterprise big data platforms. Knowledge of databases such as columnar databases, vector databases, graph databases, etc. Strong knowledge of metadata management, data modeling, and related tools (e.g., Erwin, ER/Studio). Experience with ETL tools and data ingestion protocols. Familiarity with cloud-based data warehousing solutions (e.g., Google BigQuery, AWS Redshift, Snowflake) and big data technologies (e.g., Hadoop, Spark). Experience in creating comprehensive documentation of data models, data dictionaries, and metadata. Preferred: Experience with cloud modernization projects and modern database technologies. Certification in data modeling or database design. Strong communication and presentation skills. Experience in creating data models that comply with data governance policies and regulatory requirements. Experience leading initiatives to modernize data platforms using cloud-based solutions such as Google BigQuery, AWS Redshift, Snowflake, etc. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. 
Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India. Benefits To Help You Thrive At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 302295
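To make the dimensional-modeling requirement in this posting tangible, here is a minimal star schema sketched in SQLite via Python — the table and column names are illustrative only, not a client model:

    import sqlite3

    # Minimal star schema: two dimension tables and one fact table
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, full_date TEXT, month TEXT, year INTEGER);
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, sku TEXT, category TEXT);
    CREATE TABLE fact_sales (
        date_key INTEGER REFERENCES dim_date(date_key),
        product_key INTEGER REFERENCES dim_product(product_key),
        quantity INTEGER,
        amount REAL
    );
    """)
    print("star schema created")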
Posted 1 week ago
4.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Line of Service Advisory Industry/Sector Not Applicable Specialism SAP Management Level Senior Associate Job Description & Summary A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients to more effectively run their business and understand what business questions can be answered and how to unlock the answers. Required Skills - Degree in Computer Science or a related discipline Minimum 4 years of relevant experience Fluency in Python or shell scripting Experience with data mining, modeling, mapping, and ETL processes Experience with Azure Data Factory, Data Lake, Databricks, Synapse Analytics, BI Dashboards, and BI implementation projects. Hands-on experience in Hadoop, PySpark, and Spark SQL. Knowledge of Azure/AWS, RESTful Web Services, SOAP, SOA, Microsoft SQL Server, MySQL Server, and Agile methodology is an advantage Strong analytical, problem-solving, and communication skills Excellent command of both written and spoken English. Should be able to design, develop, deliver, and maintain data infrastructure. Mandatory skill set: Hadoop, PySpark. Preferred skill set: Hadoop, PySpark. Years of experience required: 4-8. Qualifications: B.E/B.Tech Education (if blank, degree and/or field of study not specified) Degrees/Field Of Study Required Degrees/Field of Study preferred: Certifications (if blank, certifications not specified) Required Skills Hadoop Cluster, PySpark Optional Skills Desired Languages (If blank, desired languages not specified) Travel Requirements Not Specified Available for Work Visa Sponsorship? No Government Clearance Required? No Job Posting End Date
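For the mandatory Hadoop/PySpark skill set, a hedged sketch of a Spark aggregation over a Hive table follows — the database and table names are hypothetical:

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = (
        SparkSession.builder.appName("pyspark-sql-demo").enableHiveSupport().getOrCreate()
    )

    # Aggregate a (hypothetical) Hive transactions table to daily store revenue
    daily = (
        spark.table("sales.transactions")
        .groupBy("store_id", F.to_date("txn_ts").alias("txn_date"))
        .agg(F.sum("amount").alias("revenue"), F.count("*").alias("txn_count"))
    )
    daily.write.mode("overwrite").saveAsTable("sales.daily_store_revenue")

Code like this would typically run with little change on Azure Databricks or a Synapse Spark pool.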
Posted 1 week ago
5.0 - 12.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Responsibilities (5 to 12 years of experience): Design, develop, and deploy high-quality data processing applications and pipelines. Analyze and optimize existing data workflows, pipelines, and data integration processes. Develop highly scalable, testable, and maintainable code for data transformation and storage. Troubleshoot and resolve data-related issues and performance bottlenecks. Collaborate with cross-functional teams to understand data requirements and deliver solutions. Qualifications: Bachelor's degree or equivalent experience in Computer Science, Information Technology, or a related field. Hands-on development experience with Python and Apache Spark. Strong knowledge of Big Data technologies such as Hadoop, HDFS, Hive, Sqoop, Kafka, and RabbitMQ. Proficiency in working with SQL databases or relational database systems (SQL Server, Oracle). Familiarity with NoSQL databases like MongoDB, Cassandra, or HBase. Experience with cloud platforms (AWS, Azure Databricks, GCP) is a plus. Understanding of ETL techniques, data integration, and Agile methodologies.
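Given the Kafka requirement above, a minimal consumer sketch using the kafka-python package is shown below; the topic, broker address, and record fields are hypothetical:

    import json
    from kafka import KafkaConsumer  # kafka-python package

    # Consume JSON records from a (hypothetical) orders topic
    consumer = KafkaConsumer(
        "orders",
        bootstrap_servers="localhost:9092",
        value_deserializer=lambda b: json.loads(b.decode("utf-8")),
        auto_offset_reset="earliest",
    )
    for message in consumer:
        record = message.value
        if record.get("order_id"):  # basic quality gate before downstream processing
            print(record["order_id"], record.get("amount"))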
Posted 1 week ago
3.0 - 6.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Summary Position Summary AI & Data In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI & Data will work with our clients to: Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements Experience: 3-6 years of hands-on experience in designing and developing conceptual, logical, and physical data models for relational, dimensional, and NoSQL data platforms. Knowledge of Data Vault, NoSQL, Dimensional Modeling, Graph data model, and proficiency in at least one of these. Proven experience with data warehousing, data lakes, and enterprise big data platforms. Knowledge of databases such as columnar databases, vector databases, graph databases, etc. Strong knowledge of metadata management, data modeling, and related tools (e.g., Erwin, ER/Studio). Experience with ETL tools and data ingestion protocols. Familiarity with cloud-based data warehousing solutions (e.g., Google BigQuery, AWS Redshift, Snowflake) and big data technologies (e.g., Hadoop, Spark). Experience in creating comprehensive documentation of data models, data dictionaries, and metadata. Preferred: Experience with cloud modernization projects and modern database technologies. Certification in data modeling or database design. Strong communication and presentation skills. Experience in creating data models that comply with data governance policies and regulatory requirements. Experience leading initiatives to modernize data platforms using cloud-based solutions such as Google BigQuery, AWS Redshift, Snowflake, etc. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. 
Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India. Benefits To Help You Thrive At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 302295
Posted 1 week ago