
7331 Hadoop Jobs - Page 28

Set up a job alert
JobPe aggregates listings for easy access, but you apply on the original job portal directly.

2.0 - 6.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Role Overview: We are seeking a skilled Data Engineer with hands-on experience in Dataiku DSS to join our team. The ideal candidate will design and develop data pipelines, optimize workflows, and implement AI/ML models on cloud platforms. The role demands technical expertise, problem-solving ability, and a collaborative mindset.

Key Responsibilities:
- Design and develop scalable ETL (Extract, Transform, Load) pipelines to collect and process data from multiple sources.
- Configure and streamline Dataiku DSS workflows for efficient data processing and machine learning operations.
- Integrate Dataiku with cloud platforms such as AWS, Azure, and GCP, as well as big data tools like Snowflake, Hadoop, and Spark.
- Develop and deploy AI/ML models for predictive analytics using Dataiku.
- Implement MLOps and DataOps practices within the platform for model deployment and data flow automation.
- Monitor job performance and automate data workflows for improved scalability and reliability.
- Customize Dataiku functionality with Python or R scripts for enhanced analytics.
- Manage and support the Dataiku platform, ensuring its reliability and performance.

Must-Have Skills:
- 2 to 6 years of hands-on experience with the Dataiku DSS platform.
- Strong proficiency in Python and SQL for scripting and data manipulation.
- Solid understanding of ETL processes and data pipeline development.
- Experience with cloud environments (AWS, Azure, GCP).
- Familiarity with big data frameworks such as Spark and Hadoop.
- Good understanding of AI/ML model development and deployment practices.
- Ability to automate workflows and monitor performance effectively.
- Strong analytical thinking and problem-solving abilities.
- Excellent verbal and written communication skills.
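For context, a Python recipe of the kind this role would write inside Dataiku DSS might look like the sketch below. The dataset names and the cleanup step are hypothetical placeholders, and the `dataiku` package is only available when the script runs inside DSS.

```python
# Hypothetical sketch of a Dataiku DSS Python recipe. It runs inside DSS,
# where the `dataiku` package is available; dataset names are placeholders.
import dataiku
import pandas as pd

# Read the input dataset defined in the Flow into a pandas dataframe
raw = dataiku.Dataset("raw_transactions")
df = raw.get_dataframe()

# Simple, illustrative transform: coerce amounts to numbers and drop bad rows
df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
df = df.dropna(subset=["customer_id", "amount"])

# Write the cleaned result to the output dataset, inferring the schema
out = dataiku.Dataset("transactions_prepared")
out.write_with_schema(df)
```

The same read-transform-write pattern carries over to Spark-backed recipes when the data is too large for pandas.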

Posted 6 days ago

Apply

8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

LinkedIn is the world’s largest professional network, built to create economic opportunity for every member of the global workforce. Our products help people make powerful connections, discover exciting opportunities, build necessary skills, and gain valuable insights every day. We’re also committed to providing transformational opportunities for our own employees by investing in their growth. We aspire to create a culture that’s built on trust, care, inclusion, and fun, where everyone can succeed. Join us to transform the way the world works.

This role will be based in Bangalore, India. At LinkedIn, our approach to flexible work is centered on trust and optimized for culture, connection, clarity, and the evolving needs of our business. The work location of this role is hybrid, meaning it will be performed both from home and from a LinkedIn office on select days, as determined by the business needs of the team.

As part of our world-class software engineering team, you will be charged with building the next-generation infrastructure and platforms for LinkedIn, including but not limited to: an application and service delivery platform, massively scalable data storage and replication systems, a cutting-edge search platform, a best-in-class AI platform, an experimentation platform, and a privacy and compliance platform. You will work and learn among the best, putting to use your passion for distributed technologies and algorithms, API and systems design, and your passion for writing code that performs at extreme scale. LinkedIn has already pioneered well-known open-source infrastructure projects like Apache Kafka, Pinot, Azkaban, Samza, Venice, DataHub, and Feathr. We also work with industry-standard open-source infrastructure products like Kubernetes, gRPC, and GraphQL. Come join our infrastructure teams and share the knowledge with a broader community while making a real impact within our company.

Responsibilities:
- You will own the technical strategy for broad or complex requirements with insightful and forward-looking approaches that go beyond the direct team and solve large open-ended problems.
- You will design, implement, and optimize the performance of large-scale distributed systems with security and compliance in mind.
- You will improve the observability and understandability of various systems with a focus on improving developer productivity and system sustenance.
- You will effectively communicate with the team, partners, and stakeholders.
- You will mentor other engineers, define our challenging technical culture, and help to build a fast-growing team.
- You will work closely with the open-source community to participate in and influence cutting-edge open-source projects (e.g., Apache Iceberg).
- You will deliver incremental impact by driving innovation while iteratively building and shipping software at scale.
- You will diagnose technical problems, debug in production environments, and automate routine tasks.

Basic Qualifications:
- BA/BS degree in Computer Science or a related technical discipline, or related practical experience.
- 8+ years of industry experience in software design, development, and algorithm-related solutions.
- 8+ years of experience programming in object-oriented languages such as Java, Python, or Go, and/or functional languages such as Scala, or other relevant coding languages.
- Hands-on experience developing distributed systems, large-scale systems, databases, and/or backend APIs.

Preferred Qualifications:
- Experience with the Hadoop (or similar) ecosystem (Gobblin, Kafka, Iceberg, ORC, MapReduce, YARN, HDFS, Hive, Spark, Presto).
- Experience with industry or open-source projects and/or academic research in data management, relational databases, and/or large-data, parallel and distributed systems.
- Experience in architecting, building, and running large-scale systems.
- Experience with open-source project management and governance.

Suggested Skills:
- Distributed systems
- Backend systems infrastructure
- Java

You will benefit from our culture: We strongly believe in the well-being of our employees and their families. That is why we offer generous health and wellness programs and time away for employees of all levels.

India Disability Policy: LinkedIn is an equal employment opportunity employer offering opportunities to all job seekers, including individuals with disabilities. For more information on our equal opportunity policy, please visit https://legal.linkedin.com/content/dam/legal/Policy_India_EqualOppPWD_9-12-2023.pdf

Global Data Privacy Notice for Job Candidates: This document provides transparency around the way in which LinkedIn handles personal data of employees and job applicants: https://legal.linkedin.com/candidate-portal
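As a rough illustration of the Kafka-centric ecosystem this posting names, the sketch below consumes events with the kafka-python client. The topic, broker address, and event fields are hypothetical placeholders, not anything specific to LinkedIn's stack.

```python
# Illustrative Kafka consumer using the kafka-python client (pip install kafka-python).
# Broker address, topic name, and event schema are hypothetical placeholders.
import json
from collections import Counter
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "page-view-events",                    # hypothetical topic
    bootstrap_servers=["localhost:9092"],  # hypothetical broker
    group_id="metrics-aggregator",
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

counts = Counter()
for message in consumer:                   # blocks and iterates as events arrive
    event = message.value
    counts[event.get("page", "unknown")] += 1   # stand-in for real processing logic
```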

Posted 6 days ago

Apply

0.0 - 12.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Bengaluru, Karnataka | Job ID JR2025463343 | Category: Information Technology | Role Type: Hybrid | Post Date: Jul. 17, 2025

Job Description
At Boeing, we innovate and collaborate to make the world a better place. We’re committed to fostering an environment for every teammate that’s welcoming, respectful and inclusive, with great opportunity for professional growth. Find your future with us.

Overview
As a leading global aerospace company, Boeing develops, manufactures and services commercial airplanes, defense products and space systems for customers in more than 150 countries. As a top U.S. exporter, the company leverages the talents of a global supplier base to advance economic opportunity, sustainability and community impact. Boeing’s diverse team is committed to innovating for the future, leading with sustainability, and cultivating a culture based on the company’s core values of safety, quality and integrity.

Technology for today and tomorrow
The Boeing India Engineering & Technology Center (BIETC) is a 5,500+ diverse engineering workforce that contributes to global aerospace growth. Our engineers deliver cutting-edge R&D, innovation, and high-quality engineering work in global markets, and leverage new-age technologies such as AI/ML, IoT, Cloud, Model-Based Engineering, and Additive Manufacturing, shaping the future of aerospace.

People-driven culture
At Boeing, we believe creativity and innovation thrive when every employee is trusted, empowered, and has the flexibility to choose, grow, learn, and explore. We offer variable arrangements depending upon business and customer needs, and professional pursuits that offer greater flexibility in the way our people work. We also believe that collaboration, frequent team engagements, and face-to-face meetings bring diverse perspectives and thoughts, enabling every voice to be heard and every perspective to be respected. No matter where or how our teammates work, we are committed to positively shaping people’s careers and being thoughtful about employee wellbeing. At Boeing, we are inclusive, diverse, and transformative. With us, you can create and contribute to what matters most in your career, community, country, and world. Join us in powering the progress of global aerospace.

Basic Qualifications (Required Skills/Experience):
- Bachelor’s degree with 8-12 years of relevant experience.
- Hands-on knowledge of designing and developing full-stack modules and components for various applications. This position is for a full-stack developer with excellent programming skills in the latest tools and technology frameworks such as JavaScript, jQuery, Java/J2EE, Azure App Services/Functions, and microservices.
- Must have experience with the Spring Boot framework, REST-based web services, and SOA.
- Experience in Object-Oriented Analysis and Design using Java and UML.
- Must have experience using software design patterns, standards, and best practices within the industry.
- Experience with Azure Cloud is highly preferable.
- Experience in automated test environments such as TDD/JUnit/Mockito/Jest is essential.
- Desire to learn and conduct hands-on experimentation, product evaluations, and proofs of concept across a diverse array of technologies.
- Good knowledge of database concepts, including writing SQL queries.
- Oracle/Postgres experience to analyze current scripts and develop new scripts based on customer data extraction requests.
- Experience with HTML, CSS, jQuery, Bootstrap, and JavaScript.
- Experience with frontend JavaScript frameworks: Angular, React JS, Backbone JS, Node JS.
- Experience with backend frameworks like Spring MVC, Spring Boot, JPA, and Service Discovery.
- Experience in database modelling.
- Experience with release tools (VSTS, Artifactory, GitLab, Maven), configuration management (Chef), monitoring, virtualization, and containerization is essential.
- Website troubleshooting and coding experience: IIS, Azure Web Apps, Apache Tomcat.
- Identity and authentication: SAML, SSO/Federation, AD/Azure AD, etc.
- Azure application development or support experience with Azure PaaS services (Redis Cache, Azure Blob, RabbitMQ, Cloud Service, etc.).
- Experience with Event Store and event-driven architecture.

Position Responsibilities:
- Demonstrated leadership skills with the ability to clearly communicate with other team members.
- Demonstrated ability to create positive impact on customers by developing polished, cohesive, effective, and user-friendly applications for large and complex aviation-related systems.
- The candidate must be a self-starter with a positive attitude, high ethics, strong analytical and creative problem-solving skills, and a track record of working successfully under pressure in a time-constrained environment.
- Able to communicate with others on the team virtually.
- Experience in understanding and interacting with multiple data formats.
- Ability to rapidly learn and understand software from source code.
- Knowledge and/or experience of SAFe projects and processes is desirable.
- Organization skills to manage a workload of various tasks, which may be interrupted by high-priority system issues or requests.
- Estimate user stories/features (story point estimation) and tasks in hours with the required level of accuracy and commit to them as part of Sprint Planning.
- Contribute to Backlog refinement meetings by promptly asking relevant questions to ensure requirements achieve the right level of DoR (Definition of Ready).
- Work with the Product Owner to confirm that the code and acceptance tests reflect the desired functionality.
- Raise any impediments/risks (technical/operational/personal) encountered and approach the Scrum Master/Technical Architect/PO accordingly to arrive at a solution.
- Update the status and the remaining effort for their tasks daily.
- Ensure change requests are treated correctly and tracked in the system, impact analysis is done, and risks/timelines are appropriately communicated.
- Develop working software towards committed Team PI Objectives and Iteration plans.
- Deliver working software in accordance with expected quality standards.
- Design for testability: design and evolve the system to support testability and test automation.
- Work on prototyping and evaluate technical feasibility.
- Ability to work in both Unix/Linux and Windows environments.

Preferred Qualifications (Desired Skills/Experience):
- Knowledge of Domain-Driven Design (DDD) is preferred.
- Experience with DevOps practices such as continuous integration and continuous deployment (using Microsoft Azure DevOps), configuration management, metrics and monitoring schemes, virtualization, and cloud computing using Microsoft Azure.
- Knowledge and/or experience of the aviation industry is desirable.
- Familiarity with Java software development to effectively perform analysis of issues and identify potential code defects.
- Beginner knowledge of big data (HDInsight/Hadoop), machine learning, and Azure Stream Analytics is preferred.
- Scripting experience with JavaScript and Python; should also be able to write scripts that fire off and orchestrate complete deployments of DEV, QA, and Production environments via tools such as Chef.

Typical Education & Experience: Degree and typical experience in an engineering classification: Bachelor's with 8 to 12 years' experience, or a master's degree with 7+ years' experience. Bachelor's, master's, or doctorate of science degree from an accredited course of study in engineering.

Relocation: This position does not offer relocation.
Applications for this position will be accepted until Jul. 25, 2025.
Export Control Requirements: This is not an Export Control position.
Education: Bachelor's Degree or Equivalent Required.
Relocation: This position offers relocation based on candidate eligibility.
Visa Sponsorship: Employer will not sponsor applicants for employment visa status.
Shift: Not a Shift Worker (India).

Equal Opportunity Employer: We are an equal opportunity employer. We do not accept unlawful discrimination in our recruitment or employment practices on any grounds including but not limited to race, color, ethnicity, religion, national origin, gender, sexual orientation, gender identity, age, physical or mental disability, genetic factors, military and veteran status, or other characteristics covered by applicable law. We have teams in more than 65 countries, and each person plays a role in helping us become one of the world’s most innovative, diverse and inclusive companies. We are proud members of the Valuable 500 and welcome applications from candidates with disabilities. Applicants are encouraged to share with our recruitment team any accommodations required during the recruitment process. Accommodations may include but are not limited to: conducting interviews in accessible locations that accommodate mobility needs, encouraging candidates to bring and use any existing assistive technology such as screen readers, and offering flexible interview formats such as virtual or phone interviews.

Your Benefits: No matter where you are in life, our benefits help prepare you for the present and the future. Competitive base pay and incentive programs. Industry-leading tuition assistance program pays your institution directly. Resources and opportunities to grow your career. Up to $10,000 match when you support your favorite nonprofit organizations.

Posted 6 days ago

Apply

2.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Category Engineering | Experience Manager | Primary Address Bangalore, Karnataka

Overview
Voyager (94001), India, Bangalore, Karnataka
Manager, Data Engineering

Do you love building and pioneering in the technology space? Do you enjoy solving complex business problems in a fast-paced, collaborative, inclusive, and iterative delivery environment? At Capital One, you'll be part of a big group of makers, breakers, doers and disruptors, who solve real problems and meet real customer needs. We are seeking Data Engineers who are passionate about marrying data with emerging technologies. As a Capital One Data Engineer, you’ll have the opportunity to be on the forefront of driving a major transformation within Capital One.

What You’ll Do:
- Collaborate with and across Agile teams to design, develop, test, implement, and support technical solutions in full-stack development tools and technologies
- Work with a team of developers with deep experience in machine learning, distributed microservices, and full-stack systems
- Utilize programming languages like Java, Scala, and Python, open-source RDBMS and NoSQL databases, and cloud-based data warehousing services such as Redshift and Snowflake
- Share your passion for staying on top of tech trends, experimenting with and learning new technologies, participating in internal and external technology communities, and mentoring other members of the engineering community
- Collaborate with digital product managers, and deliver robust cloud-based solutions that drive powerful experiences to help millions of Americans achieve financial empowerment
- Perform unit tests and conduct reviews with other team members to make sure your code is rigorously designed, elegantly coded, and effectively tuned for performance

Basic Qualifications:
- Bachelor’s Degree
- At least 4 years of experience in application development (internship experience does not apply)
- At least 2 years of experience in big data technologies
- At least 1 year of experience with cloud computing (AWS, Microsoft Azure, Google Cloud)
- At least 2 years of people management experience

Preferred Qualifications:
- 7+ years of experience in application development including Python, SQL, Scala, or Java
- 4+ years of experience with a public cloud (AWS, Microsoft Azure, Google Cloud)
- 4+ years of experience with distributed data/computing tools (MapReduce, Hadoop, Hive, EMR, Kafka, Spark, Gurobi, or MySQL)
- 4+ years of experience working on real-time data and streaming applications
- 4+ years of experience with NoSQL implementations (Mongo, Cassandra)
- 4+ years of data warehousing experience (Redshift or Snowflake)
- 4+ years of experience with UNIX/Linux including basic commands and shell scripting
- 2+ years of experience with Agile engineering practices

At this time, Capital One will not sponsor a new applicant for employment authorization for this position. No agencies please. Capital One is an equal opportunity employer (EOE, including disability/vet) committed to non-discrimination in compliance with applicable federal, state, and local laws. Capital One promotes a drug-free workplace.

Capital One will consider for employment qualified applicants with a criminal history in a manner consistent with the requirements of applicable laws regarding criminal background inquiries, including, to the extent applicable, Article 23-A of the New York Correction Law; San Francisco, California Police Code Article 49, Sections 4901-4920; New York City’s Fair Chance Act; Philadelphia’s Fair Criminal Records Screening Act; and other applicable federal, state, and local laws and regulations regarding criminal background inquiries.

If you have visited our website in search of information on employment opportunities or to apply for a position, and you require an accommodation, please contact Capital One Recruiting at 1-800-304-9102 or via email at RecruitingAccommodation@capitalone.com. All information you provide will be kept confidential and will be used only to the extent required to provide needed reasonable accommodations. For technical support or questions about Capital One's recruiting process, please send an email to Careers@capitalone.com.

Capital One does not provide, endorse nor guarantee and is not liable for third-party products, services, educational tools or other information available through this site. Capital One Financial is made up of several different entities. Please note that any position posted in Canada is for Capital One Canada, any position posted in the United Kingdom is for Capital One Europe, and any position posted in the Philippines is for Capital One Philippines Service Corp. (COPSSC).

How We Hire: We take finding great coworkers pretty seriously. Step 1, Apply: It only takes a few minutes to complete our application and assessment. Step 2, Screen and Schedule: If your application is a good match you’ll hear from one of our recruiters to set up a screening interview. Step 3, Interview(s): Now’s your chance to learn about the job, show us who you are, share why you would be a great addition to the team, and determine if Capital One is the place for you. Step 4, Decision: The team will discuss; if it’s a good fit for us and you, we’ll make it official!

How to Pick the Perfect Career Opportunity: Overwhelmed by a tough career choice? Read these tips from Devon Rollins, Senior Director of Cyber Intelligence, to help you accept the right offer with confidence.

Your wellbeing is our priority: Our benefits and total compensation package is designed for the whole person, caring for both you and your family. Healthy Body, Healthy Mind: you have options and we have the tools to help you decide which health plans best fit your needs. Save Money, Make Money: secure your present, plan for your future and reduce expenses along the way. Time, Family and Advice: options for your time, opportunities for your family, and advice along the way. It’s time to BeWell.

Career Journey: Here’s how the team fits together. We’re big on growth and knowing who and how coworkers can best support you.
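To illustrate the kind of streaming work the posting describes, here is a minimal PySpark Structured Streaming sketch that reads from Kafka and aggregates per key. The topic and broker are hypothetical, running it requires the spark-sql-kafka connector package, and it is a generic example rather than Capital One code.

```python
# Minimal PySpark Structured Streaming sketch: read events from Kafka,
# count them per key, and print the running totals to the console.
# Topic and broker are hypothetical; the spark-sql-kafka connector is required.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("streaming-aggregation-sketch").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "transactions")
    .load()
)

# Kafka delivers key/value as binary; cast to strings and count by key
counts = (
    events.select(F.col("key").cast("string").alias("account"),
                  F.col("value").cast("string").alias("payload"))
    .groupBy("account")
    .count()
)

query = (
    counts.writeStream.outputMode("complete")
    .format("console")
    .start()
)
query.awaitTermination()
```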

Posted 6 days ago

Apply

1.0 years

0 Lacs

Hyderabad, Telangana

Remote

Software Engineer | Hyderabad, Telangana, India | Date posted Jul 17, 2025 | Job number 1832398 | Work site: Up to 50% work from home | Travel: 0-25% | Role type: Individual Contributor | Profession: Software Engineering | Discipline: Software Engineering | Employment type: Full-Time

Overview
Microsoft is a company where passionate innovators come to collaborate, envision what can be, and take their careers further. This is a world of more possibilities, more innovation, more openness, and sky-is-the-limit thinking in a cloud-enabled world. Microsoft’s Azure Data engineering team is leading the transformation of analytics in the world of data with products like databases, data integration, big data analytics, messaging and real-time analytics, and business intelligence. The products in our portfolio include Microsoft Fabric, Azure SQL DB, Azure Cosmos DB, Azure PostgreSQL, Azure Data Factory, Azure Synapse Analytics, Azure Service Bus, Azure Event Grid, and Power BI. Our mission is to build the data platform for the age of AI, powering a new class of data-first applications and driving a data culture.

Within Azure Data, the data integration team builds data gravity on the Microsoft Cloud. Massive volumes of data are generated, not just from transactional systems of record, but also from the world around us. Our data integration products, Azure Data Factory and Power Query, make it easy for customers to bring in, clean, shape, and join data, to extract intelligence. The Fabric Data Integration team is currently seeking a Software Engineer to join their team. This team is in charge of designing, building, and operating a next-generation service that transfers large volumes of data from various source systems to target systems with minimal latency while providing a data-centric orchestration platform. The team focuses on advanced data movement/replication scenarios while maintaining user-friendly interfaces. Working collaboratively, the team utilizes a range of technologies to deliver high-quality products at a fast pace. We do not just value differences or different perspectives. We seek them out and invite them in so we can tap into the collective power of everyone in the company. As a result, our customers are better served.

Qualifications
Required/Minimum Qualifications: Bachelor's degree in computer science or a related technical discipline AND 2+ years of technical engineering experience with coding in languages including, but not limited to, C, C++, C#, Java, JavaScript, or Python.

Other Requirements: Ability to meet Microsoft, customer and/or government security screening requirements is required for this role. These requirements include, but are not limited to, the following specialized security screening: Microsoft Cloud Background Check. This position will be required to pass the Microsoft Cloud background check upon hire/transfer and every two years thereafter.

Preferred/Additional Qualifications:
- Bachelor's degree in Computer Science or a related technical field AND 1+ years of technical engineering experience with coding in languages including, but not limited to, C, C++, C#, or Java; OR master’s degree in computer science or a related technical field AND 1+ years of technical engineering experience with coding in languages including, but not limited to, C, C++, C#, or Java; OR equivalent experience.
- 1+ years of experience in developing and shipping system-level features in an enterprise production backend server system.
- 1+ years of experience building and supporting distributed cloud services at production grade.
- Experience building distributed systems with reliable guarantees.
- Understanding of data structures, algorithms, and distributed systems.
- Solve problems by always leading with passion and empathy for customers.
- A desire to work collaboratively, solve problems with groups, find win/win solutions, and celebrate successes.
- Enthusiasm, integrity, self-discipline, and results-orientation in a fast-paced environment.

#azdat #azuredata #azdataintegration

Responsibilities
- Build cloud-scale products with a focus on efficiency, reliability, and security.
- Build and maintain end-to-end build, test, and deployment pipelines.
- Deploy and manage massive Hadoop, Spark, and other clusters.
- Contribute to the architecture and design of the products.
- Triage issues and implement solutions to restore service with minimal disruption to the customer and business; perform root cause analysis, trend analysis, and post-mortems.
- Own components and drive them end to end, all the way from gathering requirements, development, testing, and deployment to ensuring high quality and availability post deployment.
- Embody our culture and values.

Benefits/perks listed below may vary depending on the nature of your employment with Microsoft and the country where you work: industry leading healthcare, educational resources, discounts on products and services, savings and investments, maternity and paternity leave, generous time away, giving programs, and opportunities to network and connect.

Microsoft is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.

Posted 6 days ago

Apply

0.0 - 8.0 years

0 Lacs

Hyderabad, Telangana

Remote

Senior Data Science Hyderabad, Telangana, India + 2 more locations Date posted Jul 17, 2025 Job number 1846341 Work site Up to 50% work from home Travel 0-25 % Role type Individual Contributor Profession Research, Applied, & Data Sciences Discipline Data Science Employment type Full-Time Overview Microsoft’s Cloud business is expanding, and the Cloud Supply Chain (CSCP) organization is responsible for enabling the hardware infrastructure underlying this growth including AI! CSCP’s vision is to empower customers to achieve more by delivering Cloud and AI capabilities at scale. Our mission is to deliver the world's computer with an industry-leading supply chain. The CSCP organization is responsible for traditional supply chain functions such as plan, source, make, deliver, but also manages supportability (spares), sustainability, and decommissioning of datacenter assets worldwide. We deliver the core infrastructure and foundational technologies for Microsoft's over 200 online businesses including Bing, MSN, Office 365, Xbox Live, OneDrive and the Microsoft Azure platform for external customers. Our infrastructure is supported by more than 300 datacenters around the world that enable services for more than 1 billion customers in over 90 countries. Microsoft Cloud Planning (MCP) is the central planning function within CSCP focused on forecasting, demand planning, and supply planning for all Microsoft Cloud services and associated hardware, directly impacting the success of Microsoft's cloud business. Qualifications Required: M.Sc. in Statistics, Applied Mathematics, Applied Economics, Computer Science or Engineering, Data Science, Operations Research or similar applied quantitative field 4-8 years of industry experience in developing production-grade statistical and machine learning code in a collaborative team environment. Prior experience in machine learning using R or Python (scikit / numpy / pandas / statsmodel). Prior experience in time series forecasting. Prior experience with typical data management systems and tools such as SQL. Knowledge and ability to work within a large-scale computing or big data context, and hands-on experience with Hadoop, Spark, DataBricks or similar. Excellent analytical skills; ability to understand business needs and translate them into technical solutions, including analysis specifications and models. Creative thinking skills with emphasis on developing innovative methods to solve hard problems under ambiguity and no obvious solutions. Good interpersonal and communication (verbal and written) skills, including the ability to write concise and accurate technical documentation and communicate technical ideas to non-technical audiences. Preferred: PhD in Statistics, Applied Mathematics, Applied Economics, Computer Science or Engineering, Data Science, Operations Research or similar applied quantitative field. Experience in machine learning using R or Python (scikit / numpy / pandas / statsmodel) with skill level at or near fluency. Experience with deep learning models (e.g., tensorflow, PyTorch, CNTK) and solid knowledge of theory and practice. Practical and professional experience contributing to and maintaining a large code base with code versioning systems such as Git. Knowledge of supply chain models, operations research techniques, optimization modelling and solvers. Responsibilities Researching and developing production-grade models (forecasting, anomaly detection, optimization, clustering, etc.) 
for our global cloud business by using statistical and machine learning techniques. Manage large volumes of data, and create new and improved solutions for data collection, management, analyses, and data science model development. Drive the onboarding of new data and the refinement of existing data sources through feature engineering and feature selection. Apply statistical concepts and cutting-edge machine learning techniques to analyze cloud demand, and optimize our data science model code for distributed computing platforms and task automation. Work closely with other data scientists and data engineers to deploy models that drive cloud infrastructure capacity planning. Present analytical findings and business insights to project managers, stakeholders, and senior leadership, and keep abreast of new statistical / machine learning techniques and implement them as appropriate to improve predictive performance.

Oversees and directs the plan or forecast across the company for demand planning. Evangelizes the demand plan with other leaders. Drives clarity and understanding of what is required to achieve the plan (e.g., promotions, sales resources, collaborative planning, forecasting, and replenishment [CPFR], budget, engineering changes) and assesses plans to mitigate potential risks and issues. Oversees the analysis of data and leads the team in identifying trends, patterns, correlations, and insights to develop new forecasting models and improve existing models. Oversees development of short- and long-term (e.g., weekly, monthly, quarterly) demand forecasts and develops and publishes key forecast accuracy metrics. Analyzes data to identify potential sources of forecasting error. Serves as an expert resource and leader of demand planning across the company and ensures that business drivers are incorporated into the plan (e.g., forecast, budget). Leads collaboration among the team and leverages data to identify pockets of opportunity to apply state-of-the-art algorithms to improve a solution to a business problem. Consistently leverages knowledge of techniques to optimize analysis using algorithms. Modifies statistical analysis tools for evaluating machine learning models. Solves deep and challenging problems in circumstances such as when model predictions are not correct, when models do not match the training data or the design outcomes, when the data is not clean, when it is unclear which analyses to run, and when the process is ambiguous. Provides coaching to team members on business context, interpretation, and the implications of findings. Interprets findings and their implications for multiple businesses, and champions methodological rigor by calling attention to the limitations of knowledge wherever biases in data, methods, and analysis exist. Generates and leverages insights that inform future studies and reframe the research agenda. Informs current business decisions by implementing and adapting supply-chain strategies through complex business intelligence. Connects across functional teams and the broader organization outside of Demand Planning to advocate for continuous improvement and maintain best practices. Leads broad governance and rhythm of the business processes that ensure cross-group collaboration, discussion of key issues, and an opportunity to build proposed solutions to address current or future business needs.

Benefits/perks listed below may vary depending on the nature of your employment with Microsoft and the country where you work: industry leading healthcare, educational resources, discounts on products and services, savings and investments, maternity and paternity leave, generous time away, giving programs, and opportunities to network and connect.

Microsoft is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
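As a small, generic illustration of the time-series forecasting skills the posting asks for (pandas plus statsmodels), the sketch below fits a Holt-Winters model to synthetic monthly data. The numbers are made up and unrelated to Microsoft's actual demand data.

```python
# Time-series forecasting sketch with pandas + statsmodels, on synthetic
# monthly demand data (not real cloud-demand figures).
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(seed=7)
periods = 48
index = pd.date_range("2021-01-01", periods=periods, freq="MS")
trend = np.linspace(100, 180, periods)
seasonality = 10 * np.sin(2 * np.pi * np.arange(periods) / 12)
demand = pd.Series(trend + seasonality + rng.normal(0, 3, periods), index=index)

# Holt-Winters model with additive trend and yearly seasonality
model = ExponentialSmoothing(demand, trend="add", seasonal="add", seasonal_periods=12)
fit = model.fit()

# Forecast the next six months
print(fit.forecast(6).round(1))
```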

Posted 6 days ago

Apply

10.0 years

0 Lacs

Hyderabad, Telangana

Remote

Principal Software Engineer | Hyderabad, Telangana, India | Date posted Jul 17, 2025 | Job number 1848084 | Work site: Up to 50% work from home | Travel: 0-25% | Role type: Individual Contributor | Profession: Software Engineering | Discipline: Software Engineering | Employment type: Full-Time

Overview
Are you interested in building the next state-of-the-art AI infrastructure to fuel Microsoft's rapidly growing multi-billion online advertiser business? Are you passionate about using big data and machine learning to solve real-world customer problems and delight hundreds of millions of Bing users? How about working in a fun and fast-paced environment where engineers are empowered to innovate? If so, please come join our Microsoft Ads team. We engineer massively scalable streaming systems and services that form the backbone of Bing's ~$10B monetization engine, covering all aspects of online advertising such as advertiser-facing demand ingestion/management/insights, preparation and transformation of huge volumes of advertising data to be served, and low-latency, high-throughput online serving. Tough competition in the industry has created enormous opportunities as well as technical challenges in Big Data, Distributed Systems, and Machine Learning/Deep Learning. If you would like to tackle these challenges and be part of a winning team, we are the right place for you!

Microsoft’s mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond.

Qualifications
Required Qualifications: Bachelor's Degree in Computer Science or a related technical field AND 6+ years of technical engineering experience with coding in languages including, but not limited to, C, C++, C#, Java, JavaScript, or Python; OR equivalent experience.

Preferred Qualifications: Bachelor's Degree in Computer Science or a related technical field AND 10+ years of technical engineering experience with coding in languages including, but not limited to, C, C++, C#, Java, JavaScript, or Python; OR Master's Degree in Computer Science or a related technical field AND 8+ years of technical engineering experience with coding in languages including, but not limited to, C, C++, C#, Java, JavaScript, or Python; OR equivalent experience. Work experience in the data engineering domain, including Hadoop-like distributed systems and the Spark/Scala platform.

#MicrosoftAI

Responsibilities
Partners with appropriate stakeholders to determine user requirements for a set of scenarios. Leads identification of dependencies and the development of design documents for a product, application, service, or platform. Leads by example and mentors others to produce extensible and maintainable code used across products. Leverages subject-matter expertise of cross-product features with appropriate stakeholders (e.g., project managers) to drive multiple groups' project plans, release plans, and work items. Holds accountability as a Designated Responsible Individual (DRI), mentoring engineers across products/solutions and working on-call to monitor systems/products/services for degradation, downtime, or interruptions. Proactively seeks new knowledge and adapts to new trends, technical solutions, and patterns that will improve the availability, reliability, efficiency, observability, and performance of products, while also driving consistency in monitoring and operations at scale, and shares knowledge with other engineers.

Benefits/perks listed below may vary depending on the nature of your employment with Microsoft and the country where you work: industry leading healthcare, educational resources, discounts on products and services, savings and investments, maternity and paternity leave, generous time away, giving programs, and opportunities to network and connect.

Microsoft is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.

Posted 6 days ago

Apply

2.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Voyager (94001), India, Bangalore, Karnataka
Manager, Data Engineering

Do you love building and pioneering in the technology space? Do you enjoy solving complex business problems in a fast-paced, collaborative, inclusive, and iterative delivery environment? At Capital One, you'll be part of a big group of makers, breakers, doers and disruptors, who solve real problems and meet real customer needs. We are seeking Data Engineers who are passionate about marrying data with emerging technologies. As a Capital One Data Engineer, you’ll have the opportunity to be on the forefront of driving a major transformation within Capital One.

What You’ll Do: Collaborate with and across Agile teams to design, develop, test, implement, and support technical solutions in full-stack development tools and technologies. Work with a team of developers with deep experience in machine learning, distributed microservices, and full-stack systems. Utilize programming languages like Java, Scala, and Python, open-source RDBMS and NoSQL databases, and cloud-based data warehousing services such as Redshift and Snowflake. Share your passion for staying on top of tech trends, experimenting with and learning new technologies, participating in internal and external technology communities, and mentoring other members of the engineering community. Collaborate with digital product managers, and deliver robust cloud-based solutions that drive powerful experiences to help millions of Americans achieve financial empowerment. Perform unit tests and conduct reviews with other team members to make sure your code is rigorously designed, elegantly coded, and effectively tuned for performance.

Basic Qualifications: Bachelor’s Degree. At least 4 years of experience in application development (internship experience does not apply). At least 2 years of experience in big data technologies. At least 1 year of experience with cloud computing (AWS, Microsoft Azure, Google Cloud). At least 2 years of people management experience.

Preferred Qualifications: 7+ years of experience in application development including Python, SQL, Scala, or Java. 4+ years of experience with a public cloud (AWS, Microsoft Azure, Google Cloud). 4+ years of experience with distributed data/computing tools (MapReduce, Hadoop, Hive, EMR, Kafka, Spark, Gurobi, or MySQL). 4+ years of experience working on real-time data and streaming applications. 4+ years of experience with NoSQL implementations (Mongo, Cassandra). 4+ years of data warehousing experience (Redshift or Snowflake). 4+ years of experience with UNIX/Linux including basic commands and shell scripting. 2+ years of experience with Agile engineering practices.

At this time, Capital One will not sponsor a new applicant for employment authorization for this position. No agencies please. Capital One is an equal opportunity employer (EOE, including disability/vet) committed to non-discrimination in compliance with applicable federal, state, and local laws. Capital One promotes a drug-free workplace. Capital One will consider for employment qualified applicants with a criminal history in a manner consistent with the requirements of applicable laws regarding criminal background inquiries, including, to the extent applicable, Article 23-A of the New York Correction Law; San Francisco, California Police Code Article 49, Sections 4901-4920; New York City’s Fair Chance Act; Philadelphia’s Fair Criminal Records Screening Act; and other applicable federal, state, and local laws and regulations regarding criminal background inquiries.
If you have visited our website in search of information on employment opportunities or to apply for a position, and you require an accommodation, please contact Capital One Recruiting at 1-800-304-9102 or via email at RecruitingAccommodation@capitalone.com. All information you provide will be kept confidential and will be used only to the extent required to provide needed reasonable accommodations. For technical support or questions about Capital One's recruiting process, please send an email to Careers@capitalone.com Capital One does not provide, endorse nor guarantee and is not liable for third-party products, services, educational tools or other information available through this site. Capital One Financial is made up of several different entities. Please note that any position posted in Canada is for Capital One Canada, any position posted in the United Kingdom is for Capital One Europe and any position posted in the Philippines is for Capital One Philippines Service Corp. (COPSSC).

Posted 6 days ago

Apply

10.0 - 15.0 years

0 Lacs

Delhi

On-site

Wingify is looking for a Senior Data Architect to join its team in Delhi. As a Senior Data Architect, you will be responsible for leading and mentoring a team of data engineers, optimizing scalable data infrastructure, driving data governance frameworks, collaborating with cross-functional teams, and ensuring data security, compliance, and quality. Your role will involve optimizing data processing workflows, fostering a culture of innovation and technical excellence, and aligning technical strategy with business objectives.

To be successful in this role, you should have at least 10 years of experience in software/data engineering, with a minimum of 3 years in a leadership position. You should possess expertise in backend development using programming languages like Java, PHP, Python, Node.JS, GoLang, JavaScript, HTML, and CSS. Proficiency in SQL, Python, and Scala for data processing and analytics is essential, along with a strong understanding of cloud platforms such as AWS, GCP, or Azure and their data services. Additionally, you should have experience with big data technologies like Spark, Hadoop, Kafka, and distributed computing frameworks, as well as hands-on experience with data warehousing solutions like Snowflake, Redshift, or BigQuery. Deep knowledge of data governance, security, and compliance, along with familiarity with NoSQL databases and automation/DevOps tools, is required. Strong leadership, communication, and stakeholder management skills are crucial for this role.

Preferred qualifications include experience in machine learning infrastructure or MLOps, exposure to real-time data processing and analytics, and interest in data structures, algorithm analysis and design, multicore programming, and scalable architecture. Prior experience in a SaaS or high-growth tech company would be advantageous.

Please note that candidates must have a minimum of 10 years of experience to be eligible for this role. Graduation from Tier-1 colleges, such as IIT, is preferred. Candidates from B2B product companies with high data traffic are encouraged to apply; those who do not meet these criteria are kindly requested not to apply.

Posted 6 days ago

Apply

2.0 - 6.0 years

0 Lacs

Pune, Maharashtra

On-site

This position is open in Noida, Gurugram, Indore, Pune, or Bangalore for candidates with 2-7 years of experience; applicants should be currently serving their notice period or available to join immediately. The role calls for 2-6 years of hands-on experience with Big Data technologies such as PySpark (DataFrame and Spark SQL), Hadoop, and Hive. You should also have good experience with Python and Bash scripts, a solid understanding of SQL and data warehouse concepts, and strong analytical, problem-solving, data analysis, and research skills. You should demonstrate the ability to think creatively and independently, along with excellent communication, presentation, and interpersonal skills. It would be beneficial if you have hands-on experience with cloud-platform Big Data services such as IAM, Glue, EMR, Redshift, S3, and Kinesis. Experience in orchestration with Airflow or another job scheduler, as well as experience migrating workloads from on-premise to cloud and between clouds, would be considered a plus.
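For reference, the PySpark DataFrame and Spark SQL skills called out above look roughly like the sketch below; the file path and column names are hypothetical placeholders.

```python
# Small PySpark sketch combining the DataFrame API and Spark SQL.
# File path and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("pyspark-sql-sketch").getOrCreate()

orders = spark.read.option("header", True).csv("/data/orders.csv")

# DataFrame API: filter completed orders and aggregate amounts per day
daily = (
    orders.filter(F.col("status") == "COMPLETE")
    .groupBy("order_date")
    .agg(F.sum(F.col("amount").cast("double")).alias("total_amount"))
)

# Spark SQL: register a temp view and query it with SQL
daily.createOrReplaceTempView("daily_totals")
top_days = spark.sql(
    "SELECT order_date, total_amount FROM daily_totals ORDER BY total_amount DESC LIMIT 10"
)
top_days.show()
```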

Posted 6 days ago

Apply

4.0 - 8.0 years

0 Lacs

Maharashtra

On-site

You will be responsible for developing risk segmentation models using sophisticated statistical techniques such as CHAID, cluster analysis, and pattern recognition. These models will help in identifying credit abuse and first-party fraud risks at the Acquisition and Existing Customer Management stages. Additionally, you will work on establishing swap-set analysis and P&L optimization using these models and have hands-on experience in Credit/Fraud Risk strategy development. Knowledge of credit/fraud risk analytics is preferred for this role.

You will collaborate closely with cross-functional teams, including business stakeholders, MRM, governance teams, and strategy implementation teams. Your responsibilities will involve implementing initiatives to ensure consistency and compliance with global credit policies. You will be expected to observe key credit abuse trends, threats, inaccuracies, and the drivers of these observations. Furthermore, you will support audit requirements from external/internal auditors and respond to queries from Independent Risk.

To excel in this role, you should possess strong analytical, strategic, and project management skills. You will be required to create storyboards, presentations, and project plans for discussions with senior management and for governance requirements, to derive valuable insights. Proficiency in utilizing UNIX and Statistical Analysis System (SAS) for risk, financial, and data analyses, including profiling, sampling, reconciliation, and hypothesis testing, is essential. Knowledge and experience in statistical procedures and financial analytic tools such as SQL, R, Python, Hadoop, Spark, SAS, and machine learning will be an advantage. Experience in pre/post validation of deployed strategies and models, and in monitoring strategy performance against established benchmarks, is also expected.

The ideal candidate should have a minimum of 4 years of relevant experience and hold an advanced degree (Master's required) in Statistics, Applied Mathematics, Operations Research, Economics, MBA (Finance), or another highly quantitative discipline. This position falls under the Decision Management and Business Analysis job families. It is a full-time opportunity at Citi, an equal opportunity and affirmative action employer. Citigroup Inc. and its subsidiaries ("Citi") welcome all qualified interested applicants to apply for career opportunities. If you are a person with a disability and require a reasonable accommodation to use Citi's search tools or apply for a career opportunity, please review Accessibility at Citi. Please refer to the "EEO is the Law" poster, the EEO is the Law Supplement, and the EEO Policy Statement for further information.
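As a generic illustration of the cluster-analysis step in risk segmentation (one of the techniques the posting names), the sketch below clusters synthetic account features with k-means in scikit-learn. It is not Citi's methodology, and the features are invented.

```python
# Illustrative risk-segmentation sketch: cluster synthetic account-level
# features with k-means. A generic example of the cluster-analysis step,
# not any bank's actual methodology; the features are made up.
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)
accounts = pd.DataFrame({
    "utilization": rng.uniform(0, 1, 500),
    "payment_ratio": rng.uniform(0, 2, 500),
    "cash_advance_share": rng.uniform(0, 0.5, 500),
})

# Standardize features so no single scale dominates the distance metric
X = StandardScaler().fit_transform(accounts)

# Segment accounts into a handful of groups
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0)
accounts["segment"] = kmeans.fit_predict(X)

# Profile each segment to support strategy and swap-set discussions
print(accounts.groupby("segment").mean().round(3))
```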

Posted 6 days ago

Apply

8.0 - 12.0 years

0 Lacs

Maharashtra

On-site

As a Lead Software Engineer at HERE Technologies, you will be responsible for developing full-stack software solutions and building extensive ETL pipelines. Joining the HERE Analytics group, you will play a key role in strengthening the infrastructure of big data visualization tools used to view complex, large-scale location attributes on a map. Your responsibilities will cover all aspects of the software development lifecycle, from refining product vision and gathering requirements to coding, testing, release, and support. Working collaboratively with team members located worldwide, you will tackle challenging problems related to large-scale data extraction, transformation, and enrichment.

In this role, you will implement tools to enhance automated and semi-automated map data processing, involving both a backend/service-based software stack and front-end visualization components for big data analysis. You will also utilize CI/CD tools, taking end-to-end ownership of the software you develop, including DevOps and testing. Collaborating closely with full-stack and frontend engineers, you will refine APIs and system integrations. Additionally, you will engage with other engineering teams and internal customers to identify new opportunities, address critical needs, and solve complex problems using your backend development expertise. Becoming an expert in leveraging internal platform resources and APIs, you will work in the AWS cloud computing environment.

To be successful in this role, you should have at least 8 years of software development experience and be proficient in Java, Python, Scala, or a similar functional programming language. You should also possess expertise in working with relational databases, cloud computing services like AWS, continuous integration (CI) and continuous deployment, ETL systems using big data processing engines such as Hadoop, Spark, and EMR, NoSQL databases, and SOAP/REST web services.

If you are passionate about driving innovation, creating positive change, and working on cutting-edge technologies in a collaborative global environment, we invite you to join our team at HERE Technologies.

Posted 6 days ago

Apply

8.0 - 12.0 years

0 Lacs

Haryana

On-site

Our Enterprise Technology division at Macquarie delivers cutting-edge solutions for global operations. We are currently seeking a Vice President who will be responsible for driving strategic direction and operational excellence. The ideal candidate will lead a talented team of engineers, fostering a culture of innovation and collaboration.

At Macquarie, we take pride in bringing together diverse individuals and empowering them to explore a wide range of possibilities. As a global financial services group operating in 34 markets with 55 years of unbroken profitability, we offer a supportive and friendly team environment where everyone's ideas contribute to driving outcomes.

In this key leadership position, you will have the opportunity to lead and mentor a high-performing team of engineers, cultivating a culture of innovation and continuous improvement. Your responsibilities will include developing and executing the strategic roadmap for enterprise data platforms, ensuring alignment with business objectives and timely project delivery. Collaboration with cross-functional teams to deliver effective data solutions, maintaining technical excellence, and embracing innovative technologies will be essential.

The successful candidate should possess:
- Extensive experience in data engineering and managing complex data platform projects
- Demonstrated leadership skills in managing and developing engineering teams
- Proficiency in data architecture, data warehousing, ETL processes, big data technologies (Hadoop, Spark, Kafka), AWS services, Kubernetes, and Docker
- Strong analytical and problem-solving abilities for data-driven decision-making
- Excellent communication and interpersonal skills for engaging and influencing stakeholders

If you are inspired to contribute to building a better future and are excited about the role or working at Macquarie, we encourage you to apply.

About Technology: Technology plays a crucial role in every aspect of Macquarie, for our people, customers, and communities. We are a global team that is passionate about accelerating the digital enterprise, connecting people and data, building platforms and applications, and designing tomorrow's technology solutions.

Our Commitment to Diversity, Equity, and Inclusion: We are dedicated to providing reasonable adjustments to individuals who may require support during the recruitment process and in their working arrangements. If you need additional assistance, please inform us during the application process.

Posted 6 days ago

Apply

6.0 years

0 Lacs

Kochi, Kerala, India

On-site

Tranzmeo is an innovative leader in the fiber optic sensing industry, dedicated to developing cutting-edge solutions that optimize pipeline monitoring, well monitoring, railways, aerospace, defense, and beyond. We harness the power of AI, ML, and Big Data to push the boundaries of sensing technology and provide real-time insights that matter.

We’re hiring a passionate and experienced Technical Lead to head our Data Science & Machine Learning initiatives. If you’re excited about applying AI/ML to advanced sensing technologies and want to make a tangible impact, we want to hear from you!

What You’ll Do:
🔹 Lead the design, development, and deployment of scalable AI/ML models tailored to fiber optic sensing data
🔹 Architect Big Data solutions for processing vast amounts of sensing data in real-time
🔹 Collaborate with product teams, sensor engineers, and stakeholders on innovative projects
🔹 Mentor and manage a talented team of data scientists and engineers
🔹 Ensure data quality, model performance, and security standards
🔹 Stay ahead of developments in AI/ML and fiber optic sensing fields to foster innovation

What We’re Looking For:
✅ B.Tech in Computer Science, Data Science, or related fields
✅ 6+ years of experience in data science, machine learning, and Big Data, preferably in sensing or telecom sectors
✅ Proven leadership and team management skills
✅ Strong programming skills (Python)
✅ Experience with cloud platforms (AWS, GCP, Azure)
✅ Hands-on experience with ML frameworks (TensorFlow, PyTorch, scikit-learn)
✅ Skilled in Big Data tools (Hadoop, Spark, Kafka, Airflow)
✅ Good understanding of fiber optic sensing technologies and data characteristics is a plus
✅ Excellent problem-solving, communication, and collaboration skills

Why Join Us?
- Work on innovative sensing technology projects with real-world impact
- Lead the AI and ML strategies in a pioneering industry
- Dynamic and growth-oriented work environment
- Opportunity to shape the future of fiber optic sensing solutions
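As a rough illustration of the kind of model work described above (not Tranzmeo's actual approach), here is a minimal anomaly-detection sketch over windowed sensor readings using scikit-learn; the synthetic signal, window size, and contamination rate are assumptions:

```python
# Minimal sketch: flag anomalous windows in a stream of sensor-like readings.
# The synthetic data, windowing, and contamination rate are illustrative assumptions.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)
readings = rng.normal(loc=0.0, scale=1.0, size=10_000)  # stand-in for a raw sensing signal

# Simple windowed features: mean and standard deviation per 100-sample window.
windows = readings.reshape(-1, 100)
features = np.column_stack([windows.mean(axis=1), windows.std(axis=1)])

model = IsolationForest(contamination=0.01, random_state=0).fit(features)
labels = model.predict(features)  # -1 = anomalous window, 1 = normal
print(f"flagged {np.sum(labels == -1)} of {len(labels)} windows")
```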

Posted 6 days ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As an Associate Product Support Engineer focused on Hadoop distributed systems, you will be responsible for providing technical support to enterprise clients. Your main tasks will involve troubleshooting and resolving issues within Hadoop environments, ensuring the stability and reliability of our customers' infrastructures. Working closely with experienced engineers, you will collaborate with customers to understand their problems and deliver effective solutions with empathy and professionalism.

Your key responsibilities will include addressing customer issues related to Hadoop clusters and core components (HDFS, YARN, MapReduce, Hive, etc.), and performing basic administrative tasks such as installation, configuration, and upgrades. You will document troubleshooting steps and solutions for knowledge sharing purposes.

To excel in this role, you should have a minimum of 3 years of hands-on experience as a Hadoop Administrator or in a similar support role. A strong understanding of Hadoop architecture and core components, along with proven experience in troubleshooting Hadoop-related issues, is essential. Proficiency in Linux operating systems, good communication skills, and excellent problem-solving abilities are also required. Experience with components like Spark, NiFi, and HBase, as well as exposure to data security and data engineering principles within Hadoop environments, will be advantageous.

Furthermore, prior experience in a customer-facing technical support role and familiarity with tools like Salesforce and Jira are considered beneficial. Knowledge of automation and scripting languages like Python and Bash is a plus. This role offers an opportunity for candidates passionate about Hadoop administration and customer support to deepen their expertise in a focused, high-impact environment.
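A minimal sketch of the kind of routine cluster health check this support work involves, assuming shell access to a node with the standard `hdfs` and `yarn` CLIs on the PATH (the commands are standard Hadoop tooling; the wrapper script itself is illustrative):

```python
# Illustrative health-check script: run standard Hadoop CLI commands and report failures.
# Assumes the hdfs/yarn binaries are available on the node where this runs.
import subprocess

CHECKS = {
    "HDFS capacity and datanode status": ["hdfs", "dfsadmin", "-report"],
    "HDFS safe mode state": ["hdfs", "dfsadmin", "-safemode", "get"],
    "YARN node health": ["yarn", "node", "-list", "-all"],
}

for name, cmd in CHECKS.items():
    result = subprocess.run(cmd, capture_output=True, text=True)
    status = "OK" if result.returncode == 0 else f"FAILED (exit {result.returncode})"
    print(f"[{status}] {name}")
    if result.returncode != 0:
        print(result.stderr.strip())
```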

Posted 6 days ago

Apply

2.0 - 6.0 years

0 Lacs

Haryana

On-site

As a Capabilities and Insights Analytics Analyst at McKinsey, you will have the opportunity to drive lasting impact and build long-term capabilities with clients in the healthcare industry. Thriving in a high-performance culture, you will embrace challenges, learn from setbacks, and demonstrate resilience in finding innovative solutions. Your journey at McKinsey will be supported by resources, mentorship, and opportunities that will accelerate your growth as a leader.

Colleagues at all levels will invest in your development, ensuring you receive the guidance and exposure needed to excel. Through structured programs and a culture of continuous learning, you will be empowered to take ownership of your development and embrace feedback for rapid growth. From day one, your voice will be valued, and your ideas will contribute to delivering exceptional results for clients. Embracing diverse perspectives and collaborating with colleagues from around the globe, you will work towards achieving the best outcomes for clients while fostering creativity and innovation.

In this role, you will be focused on healthcare value, aiming to improve the accessibility and affordability of healthcare for billions of people worldwide. By delivering high-quality analytical insights and leveraging advanced analytics tools, you will guide decision-making for clients, driving positive impact in the healthcare industry. Your responsibilities will include owning data models, developing healthcare content expertise, and honing project management and client communication skills. Collaborating with colleagues from various domains, you will contribute to solving complex business problems and driving innovation within the organization.

To excel in this role, you should have a bachelor's degree in business or engineering, along with at least 2 years of relevant experience. Proficiency in working with large databases, data visualization tools, and statistical analysis is preferred, while knowledge of SQL and additional tools such as R, Python, and Tableau would be beneficial. Strong problem-solving skills, entrepreneurial drive, and excellent communication abilities are essential for success in this dynamic and collaborative environment.

Join McKinsey to be part of a global community dedicated to making a difference in the healthcare industry, where your skills and contributions will have a meaningful impact on shaping a better future for healthcare worldwide.

Posted 6 days ago

Apply

3.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

Qualcomm India Private Limited is a leading technology innovator pushing the boundaries of what's possible to enable next-generation experiences and drive digital transformation for a smarter, connected future. As a Qualcomm Software Engineer, you will design, develop, create, modify, and validate embedded and cloud edge software, applications, and specialized utility programs to launch cutting-edge, world-class products exceeding customer needs. You will collaborate with various teams to design system-level software solutions and gather performance requirements and interfaces.

Minimum Qualifications:
- Possess a Bachelor's degree in Engineering, Information Systems, Computer Science, or a related field.

Senior Machine Learning & Data Engineer:
Join our team as a Senior Machine Learning & Data Engineer with expertise in Python development. Design scalable data pipelines, build and deploy ML/NLP models, and enable data-driven decision-making within the organization.

Key Responsibilities:
- Data Engineering & Infrastructure: Design and implement robust ETL pipelines and data integration workflows using SQL, NoSQL, and big data technologies.
- Machine Learning & NLP: Build, fine-tune, and deploy ML/NLP models using frameworks like TensorFlow, PyTorch, and Scikit-learn.
- Python Development: Develop scalable backend services using Python frameworks such as FastAPI, Flask, or Django.
- Collaboration & Communication: Work closely with cross-functional teams to integrate ML solutions into production systems.

Required Qualifications:
- Hold a Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Possess strong Python programming skills and experience with modern libraries and frameworks.
- Deep understanding of ML/NLP concepts and practical experience with LLMs and RAG architectures.

Automation Engineer:
As an Automation Engineer proficient in C#/Python development, you will play a crucial role in developing advanced solutions for product test automation. Collaborate with stakeholders to ensure successful implementation and operation of automation solutions.

Responsibilities:
- Design, develop, and maintain core APIs using C#.
- Identify, troubleshoot, and optimize API development and testing.
- Stay updated with industry trends in API development.

Requirements:
- Hold a Bachelor's degree in Computer Science, Engineering, or a related field.
- Proven experience in developing APIs using C# and Python.
- Strong understanding of software testing principles and methodologies.

Qualcomm is an equal opportunity employer committed to providing accessible processes for individuals with disabilities. For accommodations, contact disability-accommodations@qualcomm.com.
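As a hedged sketch of the model-serving pattern this posting mentions (a FastAPI service wrapping a scikit-learn model), the model file, input schema, and route below are illustrative placeholders, not a Qualcomm API:

```python
# Minimal sketch: expose a pre-trained scikit-learn model behind a FastAPI endpoint.
# The model path, feature schema, and route are illustrative placeholders.
import joblib
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
model = joblib.load("model.joblib")  # assumed to be a previously trained classifier

class PredictRequest(BaseModel):
    features: list[float]

@app.post("/predict")
def predict(req: PredictRequest) -> dict:
    prediction = model.predict([req.features])[0]
    return {"prediction": int(prediction)}

# Run with: uvicorn main:app --reload   (assuming this file is saved as main.py)
```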

Posted 6 days ago

Apply

3.0 years

0 Lacs

Bhubaneswar, Odisha, India

On-site

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Apache Spark
Good-to-have skills: NA
Minimum 3 years of experience is required
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and debugging processes to deliver high-quality applications that meet user expectations and business goals.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application specifications and user guides.
- Collaborate with cross-functional teams to gather requirements and provide technical insights.

Professional & Technical Skills:
- Must-have skills: Proficiency in Apache Spark.
- Good-to-have skills: Experience with data processing frameworks such as Hadoop.
- Strong understanding of distributed computing principles.
- Familiarity with programming languages such as Java or Scala.
- Experience in developing and deploying applications in cloud environments.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Apache Spark.
- This position is based at our Bhubaneswar office.
- 15 years of full-time education is required.
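For context, a minimal Apache Spark example of the distributed-computing basics this role calls for (a simple DataFrame aggregation in PySpark; the input path and column names are hypothetical):

```python
# Minimal PySpark aggregation sketch: group events by type and count them.
# The input path and column names are illustrative only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("event-counts").getOrCreate()

events = spark.read.option("header", True).csv("/data/events.csv")

counts = (
    events.groupBy("event_type")
          .agg(F.count("*").alias("event_count"))
          .orderBy(F.desc("event_count"))
)

counts.show(truncate=False)
```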

Posted 6 days ago

Apply

3.0 years

0 Lacs

Bhubaneswar, Odisha, India

On-site

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum 3 years of experience is required
Educational Qualification: 15 years of full-time education or above

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using the Databricks Unified Data Analytics Platform. Your typical day will involve working with the development team, analyzing business requirements, and developing solutions to meet those requirements.

Roles & Responsibilities:
- Design, develop, and maintain applications using the Databricks Unified Data Analytics Platform.
- Collaborate with cross-functional teams to analyze business requirements and develop solutions to meet those requirements.
- Develop and maintain technical documentation related to the application development process.
- Ensure that all applications are developed according to industry standards and best practices.
- Provide technical support and troubleshooting for applications developed using the Databricks Unified Data Analytics Platform.

Professional & Technical Skills:
- Must-have skills: Experience with the Databricks Unified Data Analytics Platform.
- Good-to-have skills: Experience with other big data technologies such as Hadoop, Spark, and Hive.
- Strong understanding of software development principles and methodologies.
- Experience with programming languages such as Python, Java, or Scala.
- Experience with database technologies such as SQL and NoSQL.
- Experience with version control systems such as Git or SVN.

Additional Information:
- The candidate should have a minimum of 3 years of experience in the Databricks Unified Data Analytics Platform.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful data-driven solutions.
- This position is based at our Bengaluru office.
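A minimal sketch of the kind of notebook code typically written on the Databricks platform (persisting a DataFrame as a Delta table and querying it); it assumes a Databricks or Delta-enabled Spark session already exists as `spark`, and the database, table, and columns are placeholders:

```python
# Illustrative Databricks-style snippet: persist a DataFrame as a Delta table and query it.
# Assumes a Delta-enabled Spark session is already available as `spark` (as in Databricks notebooks).
from pyspark.sql import Row

df = spark.createDataFrame([
    Row(customer_id=1, amount=120.0),
    Row(customer_id=2, amount=75.5),
])

spark.sql("CREATE DATABASE IF NOT EXISTS demo")
df.write.format("delta").mode("overwrite").saveAsTable("demo.transactions")

spark.sql(
    "SELECT customer_id, SUM(amount) AS total FROM demo.transactions GROUP BY customer_id"
).show()
```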

Posted 6 days ago

Apply

4.0 - 8.0 years

0 Lacs

Haryana

On-site

As a Senior Developer / Lead in Data Science at Material+, you will be responsible for leading the development and implementation of Generative AI models such as Azure OpenAI GPT and multi-agent system architectures. The role requires a strong background in Python and AI/ML libraries like TensorFlow, PyTorch, and Scikit-learn. Experience with frameworks like LangChain, AutoGen, or similar for multi-agent systems is essential, along with expertise in data preprocessing, feature engineering, and model evaluation techniques.

At least 4 years of experience in a similar Senior Developer / Lead Data Science role is preferred. The ideal candidate should be well-versed in big data tools such as Spark, Hadoop, and Databricks, as well as SQL and NoSQL databases. Familiarity with ReactJS for developing responsive and interactive user interfaces would be a bonus.

Material+ is looking for an immediate joiner located in Gurgaon or Bangalore. In addition to a challenging work environment, we offer professional development opportunities, mentorship, and a hybrid work mode with a remote-friendly workplace. We are committed to the well-being of our employees and provide benefits such as health and family insurance, ample leave days including maternity and paternity leaves, as well as wellness, meditation, and counseling sessions.

Join us at Material+, where you can contribute to cutting-edge projects in Data Science while enjoying a supportive and inclusive work culture.
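As an illustrative sketch only (not Material+'s implementation), here is a minimal Azure OpenAI chat call using the `openai` Python SDK; the endpoint, deployment name, API version, and prompts are placeholders:

```python
# Minimal Azure OpenAI sketch; endpoint, deployment name, and API version are placeholders.
# Assumes the `openai` v1 SDK and an Azure OpenAI resource with a deployed chat model.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    azure_endpoint="https://example-resource.openai.azure.com",
    api_version="2024-02-01",
)

response = client.chat.completions.create(
    model="gpt-4o-deployment",  # the Azure *deployment* name, not the raw model name
    messages=[
        {"role": "system", "content": "You summarise analytics findings."},
        {"role": "user", "content": "Summarise last week's churn numbers in two sentences."},
    ],
)
print(response.choices[0].message.content)
```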

Posted 6 days ago

Apply

0 years

0 Lacs

Andhra Pradesh, India

On-site

Organization Summary
A career within Operations Consulting services will provide you with the opportunity to help our clients optimize all elements of their operations to move beyond the role of a cost-effective business enabler and become a source of competitive advantage. We focus on product innovation and development, supply chain, procurement and sourcing, manufacturing operations, service operations, and capital asset programs to drive both growth and profitability. In our Operations Transformation team, you’ll work with our clients to transform their enterprise processes by leveraging integrated supply and demand planning solutions to enhance their core transaction processing and reporting competencies, ultimately strengthening their ability to support management decision-making and corporate strategy.

To really stand out and make us fit for the future in a constantly changing world, each and every one of us at PwC needs to be a purpose-led and values-driven leader at every level. To help us achieve this, we have the PwC Professional, our global leadership development framework. It gives us a single set of expectations across our lines, geographies, and career paths, and provides transparency on the skills we need as individuals to be successful and progress in our careers, now and in the future.

We are seeking an experienced Planning Solutions Architect with a strong technical background and extensive experience in implementing solutions such as o9 / SAP IBP. The ideal candidate will have completed at least 5 to 6 implementations and possess deep knowledge of the product and technical architecture. This role focuses on leading o9 / SAP IBP transformations for clients across industries, leading a team of consultants, translating business requirements into technical and architecture needs, and driving the implementation journey.
Key Responsibilities
- Collaborate with stakeholders to understand business requirements and translate them into technical specifications
- Lead design and process discussions with clients
- Design and architect the overall technology framework for o9 / IBP implementations
- Lead the technical implementation of o9 / IBP solutions, including integrations with legacy systems, data warehouse, and end-to-end ERP
- Develop testing strategy and test scenarios
- Identify gaps and develop custom design specifications
- Provide technical guidance, troubleshooting and support throughout the implementation process
- Lead the team through the implementation and hypercare process, actively coaching / mentoring junior consultants
- Conduct training sessions and knowledge transfer to internal teams and client teams
- Ensure best practices and quality standards are followed across the engagement delivery
- Lead client pursuit discussions and highlight experience and approach for o9 / IBP implementations across industries
- Mentor, guide and train a team of o9 / SAP IBP consultants
- Travel may be required for this role, depending on client requirements

Education
Degrees/Field of Study required: MBA / MTech or a Master's degree in a related field

Certifications
Certifications related to o9 / SAP IBP solutions or other relevant technologies

Required Skills
- o9 / SAP IBP functional and technical expertise in Demand Planning, Supply Planning, Master Planning and Scheduling, and IBP
- Supply chain planning domain experience

Optional Skills
- Advanced understanding of o9 / SAP IBP data models
- Experience with other supply chain planning solutions
- Database: SQL, Python on Hadoop, R scripts
- MS SSIS integration skills, Hadoop Hive

Travel Requirements
Yes

Posted 6 days ago

Apply

3.0 years

0 Lacs

Andhra Pradesh, India

On-site

At PwC, our people in infrastructure focus on designing and implementing robust, secure IT systems that support business operations. They enable the smooth functioning of networks, servers, and data centres to optimise performance and minimise downtime. In infrastructure engineering at PwC, you will focus on designing and implementing robust and scalable technology infrastructure solutions for clients. Your work will involve network architecture, server management, and cloud computing experience.

Data Modeler Job Description:
Looking for candidates with a strong background in data modeling, metadata management, and data system optimization. You will be responsible for analyzing business needs, developing long-term data models, and ensuring the efficiency and consistency of our data systems.

Key areas of expertise include:
- Analyze and translate business needs into long-term solution data models.
- Evaluate existing data systems and recommend improvements.
- Define rules to translate and transform data across data models.
- Work with the development team to create conceptual data models and data flows.
- Develop best practices for data coding to ensure consistency within the system.
- Review modifications of existing systems for cross-compatibility.
- Implement data strategies and develop physical data models.
- Update and optimize local and metadata models.
- Utilize canonical data modeling techniques to enhance data system efficiency.
- Evaluate implemented data systems for variances, discrepancies, and efficiency.
- Troubleshoot and optimize data systems to ensure optimal performance.
- Strong expertise in relational and dimensional modeling (OLTP, OLAP).
- Experience with data modeling tools (Erwin, ER/Studio, Visio, PowerDesigner).
- Proficiency in SQL and database management systems (Oracle, SQL Server, MySQL, PostgreSQL).
- Knowledge of NoSQL databases (MongoDB, Cassandra) and their data structures.
- Experience working with data warehouses and BI tools (Snowflake, Redshift, BigQuery, Tableau, Power BI).
- Familiarity with ETL processes, data integration, and data governance frameworks.
- Strong analytical, problem-solving, and communication skills.

Qualifications:
- Bachelor's degree in Engineering or a related field.
- 3 to 5 years of experience in data modeling or a related field.
- 4+ years of hands-on experience with dimensional and relational data modeling.
- Expert knowledge of metadata management and related tools.
- Proficiency with data modeling tools such as Erwin, PowerDesigner, or Lucid.
- Knowledge of transactional databases and data warehouses.

Preferred Skills:
- Experience in cloud-based data solutions (AWS, Azure, GCP).
- Knowledge of big data technologies (Hadoop, Spark, Kafka).
- Understanding of graph databases and real-time data processing.
- Certifications in data management, modeling, or cloud data engineering.
- Excellent communication and presentation skills.
- Strong interpersonal skills to collaborate effectively with various teams.
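To make the dimensional-modeling requirement concrete, a minimal star-schema sketch (one fact table with two dimensions), expressed as SQL DDL run through Python's built-in sqlite3 so the example is self-contained; the table and column names are illustrative:

```python
# Minimal star-schema sketch: two dimension tables and one fact table.
# sqlite3 is used only to keep the example self-contained; names are illustrative.
import sqlite3

ddl = """
CREATE TABLE dim_date (
    date_key     INTEGER PRIMARY KEY,   -- e.g. 20240115
    full_date    TEXT NOT NULL,
    month        INTEGER NOT NULL,
    year         INTEGER NOT NULL
);

CREATE TABLE dim_product (
    product_key  INTEGER PRIMARY KEY,
    product_name TEXT NOT NULL,
    category     TEXT
);

CREATE TABLE fact_sales (
    date_key     INTEGER NOT NULL REFERENCES dim_date(date_key),
    product_key  INTEGER NOT NULL REFERENCES dim_product(product_key),
    quantity     INTEGER NOT NULL,
    amount       REAL NOT NULL
);
"""

with sqlite3.connect(":memory:") as conn:
    conn.executescript(ddl)
    tables = [row[0] for row in conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table'")]
    print("Star schema created:", tables)
```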

Posted 6 days ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

The Applications Development Senior Programmer Analyst position is an intermediate-level role where you will be responsible for participating in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. Your main objective will be to contribute to applications systems analysis and programming activities.

Your responsibilities will include conducting tasks related to feasibility studies, time and cost estimates, IT planning, risk technology, applications development, model development, and establishing and implementing new or revised applications systems and programs to meet specific business needs or user areas. You will also be monitoring and controlling all phases of the development process, providing user and operational support on applications to business users, and recommending and developing security measures post-implementation to ensure successful system design and functionality.

Furthermore, you will be utilizing in-depth specialty knowledge of applications development to analyze complex problems/issues, provide evaluation of business process, system process, and industry standards, and make evaluative judgments. You will consult with users/clients and other technology groups on issues, recommend advanced programming solutions, and ensure essential procedures are followed while defining operating standards and processes.

As an Applications Development Senior Programmer Analyst, you will also serve as an advisor or coach to new or lower-level analysts, operate with a limited level of direct supervision, and exercise independence of judgment and autonomy. You will act as a subject matter expert to senior stakeholders and/or other team members and appropriately assess risk when making business decisions.

Qualifications:
- Must Have:
  - 8+ years of application/software development/maintenance
  - 5+ years of experience with big data technologies like Apache Spark, Hive, Hadoop
  - Knowledge of Python, Java, or Scala programming language
  - Experience with Java, web services, XML, JavaScript, microservices, etc.
  - Strong technical knowledge of Apache Spark, Hive, SQL, and the Hadoop ecosystem
  - Experience with developing frameworks and utility services, code quality tools
  - Ability to work independently, multi-task, and take ownership of various analyses
  - Strong analytical and communication skills
  - Banking domain experience is a must
- Good to Have:
  - Work experience in Citi or regulatory reporting applications
  - Hands-on experience with cloud technologies, AI/ML integration, and creation of data pipelines
  - Experience with vendor products like Tableau, Arcadia, Paxata, KNIME
  - Experience with API development and data formats

Education: Bachelor's degree/University degree or equivalent experience

This job description provides a high-level overview of the work performed. Other job-related duties may be assigned as required.

Posted 6 days ago

Apply

5.0 - 12.0 years

0 Lacs

Karnataka

On-site

The role requires you to design, test, and maintain software programs for operating systems or applications that need to be deployed at a client's end, while ensuring they meet 100% quality assurance parameters. As a Big Data Developer with expertise in Spark, Scala, and PySpark coding and scripting, you will be responsible for big data engineering development using the Hadoop/Spark ecosystem. With 5 to 12 years of experience, this position is based in Bangalore with a notice period of 0 to 30 days.

Key skills include proficiency in Spark, Scala, and PySpark coding, hands-on experience in big data, knowledge of the Hadoop ecosystem, and AWS cloud architecture. You will also be involved in data ingestion and integration into the Data Lake using Hadoop ecosystem tools such as Sqoop, Spark, Impala, Hive, Oozie, and Airflow. Candidates should be fluent in Python or Scala and possess strong communication skills.

Your responsibilities will include performing coding and ensuring optimal software/module development, determining operational feasibility, developing and automating processes for software validation, modifying software to fix errors or improve performance, and analyzing information to recommend system installations or modifications. Additionally, you will prepare reports on programming project specifications, activities, and status, ensure error-free code, compile comprehensive documentation and reports, coordinate with the team on project status, and provide feedback on usability and serviceability.

You will also be responsible for status reporting and maintaining a focus on customer requirements throughout project execution. This includes capturing client requirements, taking feedback regularly for on-time delivery, participating in continuing education and training, consulting with engineering staff, documenting solutions, and ensuring quality interaction with customers via email, fault report tracking, voice calls, etc. Timely response to customer requests, without complaints internally or externally, is crucial.

Performance parameters will be measured based on continuous integration, deployment, and monitoring of software, quality, customer satisfaction, and MIS & reporting. Mandatory skills include Python for Insights, with 5-8 years of experience.

Wipro is undergoing a digital transformation, and individuals who are inspired by reinvention and constant evolution are encouraged to join. This is an opportunity to be part of a purpose-driven business that empowers you to design your reinvention and realize your ambitions. Applications from people with disabilities are explicitly welcome.
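A minimal sketch of the ingestion orchestration described above, as an Apache Airflow DAG that chains a Sqoop import with a Spark transformation; the JDBC URL, table names, HDFS paths, and Spark script are placeholders (Airflow 2.4+ `schedule` argument assumed):

```python
# Illustrative Airflow DAG: nightly Sqoop import followed by a Spark transformation.
# JDBC URL, table names, target paths, and the Spark script path are placeholders.
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="data_lake_ingestion",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:

    sqoop_import = BashOperator(
        task_id="sqoop_import_orders",
        bash_command=(
            "sqoop import --connect jdbc:mysql://example-db:3306/sales "
            "--table orders --target-dir /datalake/raw/orders --num-mappers 4"
        ),
    )

    spark_transform = BashOperator(
        task_id="spark_transform_orders",
        bash_command=(
            "spark-submit /opt/jobs/transform_orders.py "
            "/datalake/raw/orders /datalake/curated/orders"
        ),
    )

    sqoop_import >> spark_transform
```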

Posted 6 days ago

Apply

5.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Skill Map: DevOps

Role Description: Creates DevOps pipelines for various aspects of delivery related to model implementation.

Key Accountabilities
- The role is accountable for platform simplification and optimisation via DevOps, achieving delivery efficiency through DevOps practices and knowledge of CI/CD pipelines.
- Understand business requirements and convert them into technical artefacts; manage and automate execution steps wherever applicable to reduce operational gaps.
- Work on AWS services – EC2, EBS, S3, RDS, Secrets Manager, Lambda, CloudWatch, and CloudFormation.
- Well-versed with cloud architecture (AWS), TeamCity, and Chef.
- Implement Change Requests (CRs) for the business and the team.
- Identify and resolve production and application development problems.
- Adhere to Agile methodology; ensure requirements documentation complies with Agile and audit standards.
- Service transition and DevOps implementation.

Essential Skills/Basic Qualifications
- 5+ years of experience with DevOps tools.
- Well-versed with cloud architecture (AWS); exposure to AWS-native DevOps pipelines, e.g. CloudFormation.
- Good experience with DevOps tools like Jenkins and Chef.
- Hands-on shell scripting is a must.
- Hands-on Python programming is a must.
- Knowledge of AWS and cloud solutions; AWS certification is desirable.
- Knowledge of different big data processing platforms including Hadoop/RDS.
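A minimal sketch of AWS-native pipeline automation of the kind mentioned above, using boto3 to validate and launch a CloudFormation stack; the stack name, template file, tags, and region are placeholders:

```python
# Illustrative boto3 snippet: validate a CloudFormation template, then create the stack.
# Stack name, template file, and region are placeholders; credentials come from the environment.
import boto3

cfn = boto3.client("cloudformation", region_name="eu-west-1")

with open("model-platform.template.yaml") as f:
    template_body = f.read()

cfn.validate_template(TemplateBody=template_body)

cfn.create_stack(
    StackName="model-platform-dev",
    TemplateBody=template_body,
    Capabilities=["CAPABILITY_NAMED_IAM"],
    Tags=[{"Key": "environment", "Value": "dev"}],
)

waiter = cfn.get_waiter("stack_create_complete")
waiter.wait(StackName="model-platform-dev")
print("stack created")
```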

Posted 6 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.


Featured Companies