0 years
0 Lacs
Navi Mumbai, Maharashtra, India
On-site
About Us: Planet Spark is reshaping the EdTech landscape by equipping kids and young adults with future-ready skills like public speaking and more. We're on a mission to spark curiosity, creativity, and confidence in learners worldwide. If you're passionate about meaningful impact, growth, and innovation—you're in the right place.
Location: Gurgaon (On-site)
Experience Level: Entry to Early Career (Freshers welcome!)
Shift Options: Domestic | Middle East | International
Working Days: 5 days/week (Wednesday & Thursday off) | Weekend availability required
Target Joiners: Any (Bachelor’s or Master’s)
🔥 What You'll Be Owning (Your Impact):
Lead Activation: Engage daily with high-intent leads through dynamic channels—calls, video consults, and more.
Sales Funnel Pro: Own the full sales journey—from first hello to successful enrollment.
Consultative Selling: Host personalized video consultations with parents/adult learners, pitch trial sessions, and resolve concerns with clarity and empathy.
Target Slayer: Consistently crush weekly revenue goals and contribute directly to Planet Spark’s growth engine.
Client Success: Ensure a smooth onboarding experience and transition for every new learner.
Upskill Mindset: Participate in hands-on training, mentorship, and feedback loops to constantly refine your game.
💡 Why Join Sales at Planet Spark?
Only Warm Leads: Skip the cold calls—our leads already know us and have completed a demo session.
High-Performance Culture: Be part of a fast-paced, energetic team that celebrates success and rewards hustle.
Career Fast-Track: Unlock rapid promotions, performance bonuses, and leadership paths.
Top-Notch Training: Experience immersive onboarding, live role-plays, and access to ongoing L&D programs.
Rewards & Recognition: Weekly shoutouts, cash bonuses, and exclusive events to celebrate your wins.
Make Real Impact: Help shape the minds of tomorrow while building a powerhouse career today.
🎯 What You Bring to the Table:
Communication Powerhouse: You can build trust and articulate ideas clearly in both spoken and written formats.
Sales-Driven: You know how to influence decisions, navigate objections, and close deals with confidence.
Empathy First: You genuinely care about clients’ goals and tailor your approach to meet them.
Goal-Oriented: You’re self-driven, proactive, and hungry for results.
Tech Fluent: Comfortable using CRMs, video platforms, and productivity tools.
✨ What’s in It for You?
💼 High-growth sales career with serious earning potential
🌱 Continuous upskilling in EdTech, sales, and communication
🧘 Supportive culture that values growth and well-being
🎯 Opportunity to work at the cutting edge of education innovation
Posted 4 days ago
0.0 - 7.0 years
0 - 0 Lacs
Noida, Uttar Pradesh
On-site
Varahe Analytics is on the hunt for a Video Editor with flair, someone who doesn’t just stitch clips together but crafts compelling narratives that inform, inspire, and spark conversation across digital platforms.
What You’ll Be Doing
- Craft punchy, fast-paced video content that simplifies complex ideas and makes them scroll-stopping.
- Edit political explainers, cultural deep dives, and trending narratives with engaging visuals, pacing, and transitions.
- Add motion graphics, sound design, memes, and moments of magic that make viewers say: “Wait, replay that!”
- Collaborate with our researchers, writers, and design team to turn insights into impact.
Your Editing Arsenal Should Include:
- Adobe Premiere Pro, After Effects, Photoshop, Illustrator, Audio, Geolayer Plugin, and emerging AI tools.
- Any editing software you swear by is welcome; it’s the storytelling that matters most.
- Understanding of visual storytelling, retention dynamics, and emotional pacing.
- A good eye for color grading, typography, and how visuals land on social feeds.
Bonus: Experience with high-speed editing workflows or content that’s gone viral.
Preferred Background
- Degree/Diploma in Film, Media, Communication, or Design (self-taught legends also welcome).
- 4-7 years of editing experience, preferably in content creation or digital storytelling.
- A showreel or portfolio is required; let your edits do the talking.
You’ll Thrive If You:
- Know how to keep viewers hooked within the first 5 seconds.
- Know the difference between a cut that tells a story vs. a cut that kills it.
- Can handle feedback, deadlines, and rapid turnarounds without compromising quality.
Job Location: Sector 8, Noida, Uttar Pradesh.
Ready to Join? Send your CV, showreel, and a few lines on the kind of stories you love to tell to parth.patel@varaheanalytics.com
Job Type: Full-time
Pay: ₹40,000.00 - ₹75,000.00 per month
Work Location: In person
Posted 4 days ago
0 years
0 Lacs
Delhi, India
On-site
Company Description
Strategic Engagement and Event Solutions (SEES) is a provider and enabler of technology, consulting, and management solutions in the meetings domain. SEES leverages technology and design to make meetings effective, engaging, and easy. The company's founding pillars are transparency, value, and sustainable inclusive partnerships with clients. By mapping the latest trends, SEES provides innovative solutions that drive lasting business results.
Role Description
This role is all about people power. You'll help Mommyfest's voice reach new moms, influencers, and wellness champions across India. You'll be the one building bridges between our festival and the vibrant communities we aim to celebrate.
Selected intern's day-to-day responsibilities include:
1. Work with SEES to create a thriving Mommyfest community through WhatsApp groups, Instagram polls, and contests
2. Identify and connect with mommy bloggers, micro-influencers, and parenting groups for collaborations
3. Craft fun and friendly outreach messages that spark conversations and partnerships
4. Coordinate influencer content (stories, reels, and posts) to align with our event buzz
5. Be the digital glue keeping our moms and creators engaged, supported, and hyped for Mommyfest
Why this rocks: You'll build a strong network in the creator economy and learn how to manage influencer relationships like a pro.
Qualifications
Strong Interpersonal Skills and Community Engagement abilities
Excellent Communication and Customer Service skills
Experience in Community Management
Proactive and organized with the ability to handle multiple tasks
Relevant experience in a similar or related role is a plus
Bachelor's degree in Communications, Public Relations, or a related field
Posted 4 days ago
0 years
0 Lacs
Greater Kolkata Area
On-site
About Us: Planet Spark is reshaping the EdTech landscape by equipping kids and young adults with future-ready skills like public speaking and more. We're on a mission to spark curiosity, creativity, and confidence in learners worldwide. If you're passionate about meaningful impact, growth, and innovation—you're in the right place.
Location: Gurgaon (On-site)
Experience Level: Entry to Early Career (Freshers welcome!)
Shift Options: Domestic | Middle East | International
Working Days: 5 days/week (Wednesday & Thursday off) | Weekend availability required
Target Joiners: Any (Bachelor’s or Master’s)
🔥 What You'll Be Owning (Your Impact):
Lead Activation: Engage daily with high-intent leads through dynamic channels—calls, video consults, and more.
Sales Funnel Pro: Own the full sales journey—from first hello to successful enrollment.
Consultative Selling: Host personalized video consultations with parents/adult learners, pitch trial sessions, and resolve concerns with clarity and empathy.
Target Slayer: Consistently crush weekly revenue goals and contribute directly to Planet Spark’s growth engine.
Client Success: Ensure a smooth onboarding experience and transition for every new learner.
Upskill Mindset: Participate in hands-on training, mentorship, and feedback loops to constantly refine your game.
💡 Why Join Sales at Planet Spark?
Only Warm Leads: Skip the cold calls—our leads already know us and have completed a demo session.
High-Performance Culture: Be part of a fast-paced, energetic team that celebrates success and rewards hustle.
Career Fast-Track: Unlock rapid promotions, performance bonuses, and leadership paths.
Top-Notch Training: Experience immersive onboarding, live role-plays, and access to ongoing L&D programs.
Rewards & Recognition: Weekly shoutouts, cash bonuses, and exclusive events to celebrate your wins.
Make Real Impact: Help shape the minds of tomorrow while building a powerhouse career today.
🎯 What You Bring to the Table:
Communication Powerhouse: You can build trust and articulate ideas clearly in both spoken and written formats.
Sales-Driven: You know how to influence decisions, navigate objections, and close deals with confidence.
Empathy First: You genuinely care about clients’ goals and tailor your approach to meet them.
Goal-Oriented: You’re self-driven, proactive, and hungry for results.
Tech Fluent: Comfortable using CRMs, video platforms, and productivity tools.
✨ What’s in It for You?
💼 High-growth sales career with serious earning potential
🌱 Continuous upskilling in EdTech, sales, and communication
🧘 Supportive culture that values growth and well-being
🎯 Opportunity to work at the cutting edge of education innovation
Posted 4 days ago
3.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.
As a Medicare Risk Adjustment Data Analyst, you’ll play a crucial role in supporting the development and enhancement of new analytical applications related to Medicare risk adjustment, as well as supporting existing applications such as Coding Transformation Modernization, Attribution Analytics, and the Financial Risk Adjustment and Management Engine.
Primary Responsibilities
This position is for the OI Clinical Solutions - Decision Intelligence team. Upon selection, you will be part of a dynamic team working on developing and delivering best-in-class analytics for end users. Your work will focus on understanding the CMS Medicare Advantage business and developing best-in-class analytics for the OI Clinical Solutions - Decision Intelligence team according to business/technical requirements. Here are the key responsibilities, qualities and experience we will look for in an ideal candidate:
Gather and analyze business and/or functional requirements from one or more client business teams
Validate requirements with stakeholders and the day-to-day project team; provide suggestions and recommendations in line with industry best practices
Develop and deliver best-in-class analytics for end users using Big Data and cloud platforms
Document, discuss and resolve business, data, data processing and BI/reporting issues within the team, across functional teams, and with business stakeholders
Present written and verbal data analysis findings to both the project team and business stakeholders as required to support the requirements-gathering phase and issue resolution activities
Manage changing business priorities and scope and work on multiple projects concurrently
Self-motivated and proactive with the ability to work in a fast-paced environment
Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.
Required Qualifications
3+ years of solid work experience with Python, Spark and Hive, including developing analytics at scale with these tools
3+ years of solid work experience developing end-to-end analytics pipelines on Hadoop/big data platforms
3+ years of solid work experience in SQL or associated languages
3+ years of experience converting business requirements into technical requirements and developing best-in-class code to those requirements
Proven interpersonal, collaboration, diplomatic, influencing, planning and organizational skills
Consistently demonstrate clear and concise written and verbal communication
Proven ability to effectively use complex analytical, interpretive and problem-solving techniques
Proven relationship management skills to partner and influence across organizational lines
Demonstrated ability to work under pressure and meet tight deadlines with proactivity, decisiveness and flexibility
Preferred Qualifications
AWS/GCP or any other cloud-based platform development experience
Domain experience: understanding of Medicare risk adjustment programs
Understanding of CMS datasets such as MMR/MOR/EDPS, etc.
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
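For context on the stack this listing names (Python, Spark and Hive at scale), here is a minimal, illustrative PySpark sketch of a Hive-backed aggregation. The database, table and column names (`analytics_db.member_claims`, `hcc_code`, and so on) are hypothetical and not taken from the posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hive support lets Spark read managed Hive tables directly.
spark = (
    SparkSession.builder
    .appName("risk-adjustment-analytics-sketch")
    .enableHiveSupport()
    .getOrCreate()
)

# Hypothetical Hive table of claim-level records.
claims = spark.table("analytics_db.member_claims")

# Aggregate to one row per member for a given payment year.
member_summary = (
    claims
    .filter(F.col("payment_year") == 2025)
    .groupBy("member_id")
    .agg(
        F.countDistinct("hcc_code").alias("distinct_hcc_count"),
        F.sum("allowed_amount").alias("total_allowed_amount"),
    )
)

# Persist the result back to Hive for downstream reporting.
member_summary.write.mode("overwrite").saveAsTable("analytics_db.member_risk_summary")
```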
Posted 4 days ago
7.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
The Impact:
Design, build, and measure complex ELT jobs to process disparate data sources and form a high-integrity, high-quality, clean data asset.
Execute and provide feedback on data modeling policies, procedures, processes, and standards.
Assist with capturing and documenting system flow and other pertinent technical information about data, database design, and systems.
Develop comprehensive data quality standards and implement effective tools to ensure data accuracy and reliability.
Collaborate with various Investment Management departments to gain a better understanding of new data patterns.
Collaborate with Data Analysts, Data Architects, and BI developers to ensure the design and development of scalable data solutions aligning with business goals.
Translate high-level business requirements into detailed technical specs.
The Minimum Qualifications
Education: Bachelor’s or Master’s degree in Computer Science, Information Systems or a related field.
Experience:
7-9 years of experience with data analytics, data modeling, and database design.
3+ years of coding and scripting (Python, Java, Scala) and design experience.
3+ years of experience with the Spark framework.
5+ years of experience with ELT methodologies and tools.
5+ years of mastery in designing, developing, tuning and troubleshooting SQL.
Knowledge of Informatica PowerCenter and Informatica IDMC.
Knowledge of distributed, column-oriented technologies used to build high-performance databases such as Vertica and Snowflake.
Strong data analysis skills for extracting insights from financial data.
Proficiency in reporting tools (e.g., Power BI, Tableau).
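As a rough illustration of the ELT pattern this role describes (land the raw data first, then transform with Spark SQL), here is a minimal sketch; the file path and column names are hypothetical, not the employer's actual pipeline.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("elt-positions-load-sketch").getOrCreate()

# Extract-Load: land the raw file as-is, then transform inside the lakehouse layer.
raw = spark.read.option("header", True).csv("/landing/positions/2025-07-23.csv")
raw.createOrReplaceTempView("raw_positions")

# Transform with SQL: cast types, standardise keys, deduplicate on the business key.
curated = spark.sql("""
    SELECT position_id, portfolio_code, market_value, as_of_date
    FROM (
        SELECT
            CAST(position_id AS BIGINT)            AS position_id,
            UPPER(TRIM(portfolio_code))            AS portfolio_code,
            CAST(market_value AS DECIMAL(18, 2))   AS market_value,
            TO_DATE(as_of_date, 'yyyy-MM-dd')      AS as_of_date,
            ROW_NUMBER() OVER (
                PARTITION BY position_id, as_of_date
                ORDER BY load_ts DESC
            ) AS rn
        FROM raw_positions
    ) t
    WHERE rn = 1
""")

# Publish the curated layer, partitioned for efficient downstream reads.
curated.write.mode("overwrite").partitionBy("as_of_date").parquet("/curated/positions")
```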
Posted 4 days ago
28.0 years
0 Lacs
Hyderabad, Telangana, India
Remote
About Client:
Our client is a global IT services company specializing in product engineering and digital transformation, known for its unique delivery platform, Excel Shore, which leverages data science to enhance client value. With over 28 years of experience, the company partners with technology-led businesses worldwide, including independent software vendors (ISVs), and offers end-to-end software product development, modernization, and platform integration services. It operates from multiple locations and serves a diverse range of industries, including software, media, travel, retail, and healthcare, with a team of over 7,500 employees dedicated to delivering innovative solutions. The company emphasizes a diverse and inclusive workplace, with a significant percentage of women in its workforce.
Job Title: Dotnet Fullstack Developer
Location: Pan India
Experience: 7+ years
Mode of Work: Remote/Hybrid
Job Type: Contract to hire
Notice Period: Immediate joiners
Project Tenure: Long-term project
Job Description:
Python, ML, NLP (LDA, embeddings, RAG), AI techniques, LLM-based matching (e.g., GPT/embeddings), time-series forecasting
Django is essential
Experience with Databricks, Azure ML Stack, OpenAI API, Spark, and fuzzy matching would be a plus
Builds and deploys ML pipelines (incl. MLOps, API endpoints, CI/CD)
Works with Langchain, Azure Synapse, Kubernetes, and modern ML frameworks
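The description above mentions embedding- and LLM-based matching alongside fuzzy matching. As a loose, simplified stand-in, the sketch below ranks free-text queries against a catalogue using TF-IDF character n-grams and cosine similarity; a production system would likely swap in learned embeddings (for example from an LLM), and all records shown are invented.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical catalogue records to match incoming free-text descriptions against.
catalogue = [
    "Stainless steel hex bolt M8 x 40 mm",
    "Copper pipe fitting 15 mm elbow",
    "LED panel light 600x600 40W cool white",
]
queries = ["m8 40mm hex bolt, stainless", "40 watt 60x60 led ceiling panel"]

# Vectorise both sets in a shared space; character n-grams tolerate typos and abbreviations.
vectoriser = TfidfVectorizer(analyzer="char_wb", ngram_range=(3, 5))
catalogue_vecs = vectoriser.fit_transform(catalogue)
query_vecs = vectoriser.transform(queries)

# Cosine similarity gives a ranked match per query.
scores = cosine_similarity(query_vecs, catalogue_vecs)
for query, row in zip(queries, scores):
    best = row.argmax()
    print(f"{query!r} -> {catalogue[best]!r} (score {row[best]:.2f})")
```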
Posted 4 days ago
0 years
0 Lacs
Chandigarh, India
On-site
About Us: Planet Spark is reshaping the EdTech landscape by equipping kids and young adults with future-ready skills like public speaking and more. We're on a mission to spark curiosity, creativity, and confidence in learners worldwide. If you're passionate about meaningful impact, growth, and innovation—you're in the right place.
Location: Gurgaon (On-site)
Experience Level: Entry to Early Career (Freshers welcome!)
Shift Options: Domestic | Middle East | International
Working Days: 5 days/week (Wednesday & Thursday off) | Weekend availability required
Target Joiners: Any (Bachelor’s or Master’s)
🔥 What You'll Be Owning (Your Impact):
Lead Activation: Engage daily with high-intent leads through dynamic channels—calls, video consults, and more.
Sales Funnel Pro: Own the full sales journey—from first hello to successful enrollment.
Consultative Selling: Host personalized video consultations with parents/adult learners, pitch trial sessions, and resolve concerns with clarity and empathy.
Target Slayer: Consistently crush weekly revenue goals and contribute directly to Planet Spark’s growth engine.
Client Success: Ensure a smooth onboarding experience and transition for every new learner.
Upskill Mindset: Participate in hands-on training, mentorship, and feedback loops to constantly refine your game.
💡 Why Join Sales at Planet Spark?
Only Warm Leads: Skip the cold calls—our leads already know us and have completed a demo session.
High-Performance Culture: Be part of a fast-paced, energetic team that celebrates success and rewards hustle.
Career Fast-Track: Unlock rapid promotions, performance bonuses, and leadership paths.
Top-Notch Training: Experience immersive onboarding, live role-plays, and access to ongoing L&D programs.
Rewards & Recognition: Weekly shoutouts, cash bonuses, and exclusive events to celebrate your wins.
Make Real Impact: Help shape the minds of tomorrow while building a powerhouse career today.
🎯 What You Bring to the Table:
Communication Powerhouse: You can build trust and articulate ideas clearly in both spoken and written formats.
Sales-Driven: You know how to influence decisions, navigate objections, and close deals with confidence.
Empathy First: You genuinely care about clients’ goals and tailor your approach to meet them.
Goal-Oriented: You’re self-driven, proactive, and hungry for results.
Tech Fluent: Comfortable using CRMs, video platforms, and productivity tools.
✨ What’s in It for You?
💼 High-growth sales career with serious earning potential
🌱 Continuous upskilling in EdTech, sales, and communication
🧘 Supportive culture that values growth and well-being
🎯 Opportunity to work at the cutting edge of education innovation
Posted 4 days ago
0 years
0 Lacs
Vadodara, Gujarat, India
On-site
About Us: Planet Spark is reshaping the EdTech landscape by equipping kids and young adults with future-ready skills like public speaking and more. We're on a mission to spark curiosity, creativity, and confidence in learners worldwide. If you're passionate about meaningful impact, growth, and innovation—you're in the right place.
Location: Gurgaon (On-site)
Experience Level: Entry to Early Career (Freshers welcome!)
Shift Options: Domestic | Middle East | International
Working Days: 5 days/week (Wednesday & Thursday off) | Weekend availability required
Target Joiners: Any (Bachelor’s or Master’s)
🔥 What You'll Be Owning (Your Impact):
Lead Activation: Engage daily with high-intent leads through dynamic channels—calls, video consults, and more.
Sales Funnel Pro: Own the full sales journey—from first hello to successful enrollment.
Consultative Selling: Host personalized video consultations with parents/adult learners, pitch trial sessions, and resolve concerns with clarity and empathy.
Target Slayer: Consistently crush weekly revenue goals and contribute directly to Planet Spark’s growth engine.
Client Success: Ensure a smooth onboarding experience and transition for every new learner.
Upskill Mindset: Participate in hands-on training, mentorship, and feedback loops to constantly refine your game.
💡 Why Join Sales at Planet Spark?
Only Warm Leads: Skip the cold calls—our leads already know us and have completed a demo session.
High-Performance Culture: Be part of a fast-paced, energetic team that celebrates success and rewards hustle.
Career Fast-Track: Unlock rapid promotions, performance bonuses, and leadership paths.
Top-Notch Training: Experience immersive onboarding, live role-plays, and access to ongoing L&D programs.
Rewards & Recognition: Weekly shoutouts, cash bonuses, and exclusive events to celebrate your wins.
Make Real Impact: Help shape the minds of tomorrow while building a powerhouse career today.
🎯 What You Bring to the Table:
Communication Powerhouse: You can build trust and articulate ideas clearly in both spoken and written formats.
Sales-Driven: You know how to influence decisions, navigate objections, and close deals with confidence.
Empathy First: You genuinely care about clients’ goals and tailor your approach to meet them.
Goal-Oriented: You’re self-driven, proactive, and hungry for results.
Tech Fluent: Comfortable using CRMs, video platforms, and productivity tools.
✨ What’s in It for You?
💼 High-growth sales career with serious earning potential
🌱 Continuous upskilling in EdTech, sales, and communication
🧘 Supportive culture that values growth and well-being
🎯 Opportunity to work at the cutting edge of education innovation
Posted 4 days ago
5.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
About The Role
Grade Level (for internal use): 10
The Team: The Data Engineering team is responsible for architecting, building, and maintaining our evolving data infrastructure, as well as curating and governing the data assets created on our platform. We work closely with various stakeholders to acquire, process, and refine vast datasets, focusing on creating scalable and optimized data pipelines. Our team possesses broad expertise in critical data domains, technology stacks, and architectural patterns. We foster knowledge sharing and collaboration, resulting in a unified strategy and seamless data management.
The Impact: This role is the foundation of the products delivered. The data onboarded is the base for the company, as it feeds into our products and platforms and is essential for supporting our advanced analytics and machine learning initiatives.
What’s in it for you:
Be part of a successful team which works on delivering top-priority projects that directly contribute to the company’s strategy.
Drive testing initiatives, including supporting the automation strategy, performance, and security testing. This is the place to enhance your testing skills while adding value to the business.
As an experienced member of the team, you will have the opportunity to own and drive a project end to end and collaborate with developers, business analysts and product managers who are experts in their domain, which can help you to build multiple skillsets.
Responsibilities
Design, develop, and maintain scalable and efficient data pipelines to process large volumes of data.
Implement ETL processes to acquire, validate, and process incoming data from diverse sources.
Collaborate with cross-functional teams, including data scientists, analysts, and software engineers, to understand data requirements and translate them into technical solutions.
Implement data ingestion, transformation, and integration processes to ensure data quality, accuracy, and consistency.
Optimize Spark jobs and data processing workflows for performance, scalability, and reliability.
Troubleshoot and resolve issues related to data pipelines, data processing, and performance bottlenecks.
Conduct code reviews and provide constructive feedback to junior team members to ensure code quality and adherence to best practices.
Stay updated with the latest advancements in Spark and related technologies and evaluate their potential for enhancing existing data engineering processes.
Develop and maintain documentation, including technical specifications, data models, and system architecture diagrams.
Stay abreast of emerging trends and technologies in the data engineering and big data space and propose innovative solutions to enhance data processing capabilities.
What We’re Looking For
5+ years of experience in Data Engineering or a related field
Strong experience in Python programming with expertise in building data-intensive applications
Proven hands-on experience with Apache Spark, including Spark Core, Spark SQL, Spark Streaming, and Spark MLlib
Solid understanding of distributed computing concepts, parallel processing, and cluster computing frameworks
Proficiency in data modeling, data warehousing, and ETL techniques
Experience with workflow management platforms, preferably Airflow
Familiarity with big data technologies such as Hadoop, Hive, or HBase
Strong knowledge of SQL and experience with relational databases
Hands-on experience with the AWS cloud data platform
Strong problem-solving and troubleshooting skills, with the ability to analyze complex data engineering issues and provide effective solutions
Excellent communication and collaboration skills, with the ability to work effectively in cross-functional teams
Nice to have: experience with Databricks
Preferred Qualifications: Bachelor’s degree in Information Technology, Computer Information Systems, Computer Engineering, Computer Science, or another technical discipline
What’s In It For You?
Our Purpose
Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.
Our People
We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it, we are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.
Our Values
Integrity, Discovery, Partnership
At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.
Benefits
We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global.
Our Benefits Include
Health & Wellness: Health care coverage designed for the mind and body.
Flexible Downtime: Generous time off helps keep you energized for your time on.
Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference.
For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries
Global Hiring And Opportunity At S&P Global
At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.
Recruitment Fraud Alert
If you receive an email from a spglobalind.com domain or any other regionally based domain, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, “pre-employment training” or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here.
Equal Opportunity Employer
S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment.
If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.
US Candidates Only: The EEO is the Law Poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf
IFTECH202.1 - Middle Professional Tier I (EEO Job Group)
Job ID: 315442
Posted On: 2025-07-23
Location: Ahmedabad, Gujarat, India
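The responsibilities in this listing centre on Spark data pipelines coordinated through a workflow manager, preferably Airflow. The following is a minimal Airflow 2.x DAG sketch with placeholder task bodies; the DAG id, schedule and task names are assumptions for illustration only, not the employer's actual orchestration.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_fn(**context):
    # Placeholder: pull the day's files from the source system.
    ...


def transform_fn(**context):
    # Placeholder: run the Spark transformation job for this batch.
    ...


def load_fn(**context):
    # Placeholder: publish curated tables and refresh downstream marts.
    ...


# Airflow 2.4+ style; older versions use schedule_interval instead of schedule.
with DAG(
    dag_id="daily_data_onboarding",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_fn)
    transform = PythonOperator(task_id="transform", python_callable=transform_fn)
    load = PythonOperator(task_id="load", python_callable=load_fn)

    # Linear dependency: extract, then transform, then load.
    extract >> transform >> load
```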
Posted 4 days ago
0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
About Us: Planet Spark is reshaping the EdTech landscape by equipping kids and young adults with future-ready skills like public speaking and more. We're on a mission to spark curiosity, creativity, and confidence in learners worldwide. If you're passionate about meaningful impact, growth, and innovation—you're in the right place.
Location: Gurgaon (On-site)
Experience Level: Entry to Early Career (Freshers welcome!)
Shift Options: Domestic | Middle East | International
Working Days: 5 days/week (Wednesday & Thursday off) | Weekend availability required
Target Joiners: Any (Bachelor’s or Master’s)
🔥 What You'll Be Owning (Your Impact):
Lead Activation: Engage daily with high-intent leads through dynamic channels—calls, video consults, and more.
Sales Funnel Pro: Own the full sales journey—from first hello to successful enrollment.
Consultative Selling: Host personalized video consultations with parents/adult learners, pitch trial sessions, and resolve concerns with clarity and empathy.
Target Slayer: Consistently crush weekly revenue goals and contribute directly to Planet Spark’s growth engine.
Client Success: Ensure a smooth onboarding experience and transition for every new learner.
Upskill Mindset: Participate in hands-on training, mentorship, and feedback loops to constantly refine your game.
💡 Why Join Sales at Planet Spark?
Only Warm Leads: Skip the cold calls—our leads already know us and have completed a demo session.
High-Performance Culture: Be part of a fast-paced, energetic team that celebrates success and rewards hustle.
Career Fast-Track: Unlock rapid promotions, performance bonuses, and leadership paths.
Top-Notch Training: Experience immersive onboarding, live role-plays, and access to ongoing L&D programs.
Rewards & Recognition: Weekly shoutouts, cash bonuses, and exclusive events to celebrate your wins.
Make Real Impact: Help shape the minds of tomorrow while building a powerhouse career today.
🎯 What You Bring to the Table:
Communication Powerhouse: You can build trust and articulate ideas clearly in both spoken and written formats.
Sales-Driven: You know how to influence decisions, navigate objections, and close deals with confidence.
Empathy First: You genuinely care about clients’ goals and tailor your approach to meet them.
Goal-Oriented: You’re self-driven, proactive, and hungry for results.
Tech Fluent: Comfortable using CRMs, video platforms, and productivity tools.
✨ What’s in It for You?
💼 High-growth sales career with serious earning potential
🌱 Continuous upskilling in EdTech, sales, and communication
🧘 Supportive culture that values growth and well-being
🎯 Opportunity to work at the cutting edge of education innovation
Posted 4 days ago
6.0 - 10.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a part of Microsoft's Cloud Supply Chain (CSCP) organization, your role will be crucial in supporting the growth of Microsoft's Cloud business, which includes AI technologies. The vision of CSCP is to empower customers to achieve more by providing Cloud Capacity Differentiated at Scale. The mission is to deliver capacity for all cloud services predictably through intelligent systems and continuous learning. The responsibilities of CSCP extend beyond traditional supply chain functions to include supportability, decommissioning, and disposition of data centre assets on a global scale.
Within the Cloud Manufacturing Operations and Fulfilment (CMOF) organization, your role will focus on developing scalable and secure data architecture to support analytics and business processes. You will lead the creation of data pipelines, models, and integration strategies to enable analytics and AI capabilities across CMOF. This position plays a critical role in aligning data infrastructure with Microsoft's evolving Security Future Initiative (SFI) and engineering best practices.
Key Responsibilities:
- Design and develop scalable data ingestion pipelines from multiple sources.
- Implement data orchestration using tools like Spark, PySpark, and Python.
- Develop ETL jobs to optimize data flow and reliability.
- Design logical and physical data models to support near real-time analytics.
- Perform data profiling and gap analysis for migration to next-gen platforms.
- Ensure data models support scalability, privacy, and governance.
- Adhere to Microsoft's SFI guidelines, data residency policies, and data privacy regulations.
- Implement data security measures like data masking and encryption.
- Collaborate with engineering teams to ensure system updates and data lineage tracking.
- Enable self-service BI and analytics using tools like Power BI and Azure Synapse.
- Create reusable datasets, data models, and visualizations aligned with business priorities.
- Translate business requirements into technical specs for scalable data solutions.
Qualifications:
Required:
- Bachelor's degree in computer science, MIS, Data Engineering, or equivalent.
- 5-8 years of experience in building cloud-based data systems and ETL frameworks.
- Proficiency in relational databases, cloud-based data systems, and data orchestration tools.
- Experience with visualization tools like Microsoft Power Platform and Fabric.
Preferred:
- Strong foundation in data modeling, warehousing, and data lake architecture.
- Familiarity with ERP systems such as SAP and Dynamics 365.
- Experience in modern development practices, agile methodologies, and version control.
- Hands-on experience in data security, compliance controls, and governance frameworks.
- Knowledge of AI applications for automated learning.
Key Competencies:
- Strong business acumen and strategic alignment of data capabilities.
- Deep understanding of data privacy, compliance, and lifecycle management.
- Excellent collaboration and communication skills across global teams.
- Self-starter mindset with the ability to thrive in a fast-paced environment.
- Strong analytical thinking, problem-solving skills, and continuous improvement mindset.
- Ability to drive change and promote a data-driven culture within the organization.
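As a hedged illustration of the ingestion-plus-masking responsibilities listed above, the PySpark sketch below hashes a PII column before publishing a curated layer. The landing path, column names and table layout are hypothetical, not Microsoft's actual pipeline.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("cmof-ingestion-sketch").getOrCreate()

# Hypothetical landing path for supplier order feeds (JSON lines).
orders = spark.read.json("/landing/supplier_orders/2025/07/23/")

# Basic hygiene plus masking: hash the contact e-mail so downstream analytics
# never see the raw value, and stamp the ingestion time for lineage.
curated = (
    orders
    .withColumn("contact_email_hash", F.sha2(F.col("contact_email"), 256))
    .drop("contact_email")
    .withColumn("ingested_at", F.current_timestamp())
    .dropDuplicates(["order_id"])
)

# Write as partitioned Parquet for the analytics layer (order_date is a hypothetical column).
curated.write.mode("append").partitionBy("order_date").parquet("/curated/supplier_orders/")
```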
Posted 5 days ago
0.0 - 6.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
Role: Senior Analyst - Data Engineering
Experience: 4 to 6 years
Location: Chennai, Tamil Nadu, India (CHN)
Job Description:
We are looking for a highly skilled and motivated Senior Engineer with deep expertise in the Databricks platform to join our growing data engineering and analytics team. As a Senior Engineer, you will play a crucial role in designing, building, and optimizing our data pipelines, data lakehouse solutions, and analytics infrastructure on Databricks. You will collaborate closely with data scientists, analysts, and other engineers to deliver high-quality, scalable, and reliable data solutions that drive business insights and decision-making.
Job Responsibilities:
Design, develop, and maintain scalable and robust data pipelines and ETL/ELT processes using Databricks, Spark (PySpark, Scala), Delta Lake, and related technologies.
Architect and implement data lakehouse solutions on Databricks, ensuring data quality, integrity, and performance.
Develop and optimize data models for analytical and reporting purposes within the Databricks environment.
Implement and manage data governance and security best practices within the Databricks platform, including Unity Catalog and RBAC.
Utilize Databricks Delta Live Tables (DLT) to build and manage reliable data pipelines.
Implement and leverage Change Data Feed (CDF) for efficient data synchronization and updates.
Monitor and troubleshoot data pipelines and system performance on the Databricks platform.
Collaborate with data scientists and analysts to understand their data requirements and provide efficient data access and processing solutions.
Participate in code reviews, ensuring adherence to coding standards and best practices.
Contribute to the development of technical documentation and knowledge sharing within the team.
Stay up to date with the latest advancements in Databricks and related data technologies.
Mentor and guide junior engineers on the team.
Participate in the planning and execution of data-related projects and initiatives.
Skills Required: Databricks, SQL, PySpark, Python, data modeling, DE concepts
Job Snapshot
Updated Date: 24-07-2025
Job ID: J_3897
Location: Chennai, Tamil Nadu, India
Experience: 4 - 6 Years
Employee Type: Permanent
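Since the role calls out Delta Lake Change Data Feed (CDF) specifically, here is a small sketch of enabling and reading CDF on Databricks; the table name `sales.orders_silver` and the starting version are placeholders, and on Databricks a `spark` session is already provided.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # On Databricks, `spark` already exists.

# Enable Change Data Feed on a (hypothetical) Delta table so downstream jobs
# can consume row-level inserts/updates/deletes instead of full reloads.
spark.sql("""
    ALTER TABLE sales.orders_silver
    SET TBLPROPERTIES (delta.enableChangeDataFeed = true)
""")

# Read only the changes committed since table version 12.
changes = (
    spark.read.format("delta")
    .option("readChangeFeed", "true")
    .option("startingVersion", 12)
    .table("sales.orders_silver")
)

# _change_type distinguishes insert / update_preimage / update_postimage / delete rows;
# drop the pre-images when only the latest state of each changed row is needed.
changes.filter("_change_type != 'update_preimage'").show()
```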
Posted 5 days ago
5.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Job Description:
Job Title: Data/AI Engineer – GenAI & Agentic AI Integration (Azure)
Location: Bangalore, India
Job Type: Full-Time
About the Role
We are seeking a highly skilled Data/AI Engineer to join our dynamic team, specializing in integrating cutting-edge Generative AI (GenAI) and Agentic AI solutions within the Azure cloud environment. The ideal candidate will have a strong background in Python, data engineering, and AI model integration, with hands-on experience working on Databricks, Snowflake, Azure Storage, and Palantir platforms. You will play a crucial role in designing, developing, and deploying scalable data and AI pipelines that power next-generation intelligent applications.
Key Responsibilities
Design, develop, and maintain robust data pipelines and AI integration solutions using Python on Azure Databricks.
Integrate Generative AI and Agentic AI models into existing and new workflows to drive business innovation and automation.
Collaborate with data scientists, AI researchers, software engineers, and product teams to deliver scalable and efficient AI-powered solutions.
Orchestrate data movement and transformation across Azure-native services including Azure Databricks, Azure Storage (Blob, Data Lake), and Snowflake, ensuring data quality, security, and compliance.
Integrate enterprise data using Palantir Foundry and leverage Azure services for end-to-end solutions.
Develop and implement APIs and services to facilitate seamless AI model deployment and integration.
Optimize data workflows for performance and scalability within Azure.
Monitor, troubleshoot, and resolve issues related to data and AI pipeline performance.
Document architecture, designs, and processes for knowledge sharing and operational excellence.
Stay current with advances in GenAI, Agentic AI, Azure data engineering best practices, and cloud technologies.
Required Qualifications
Bachelor’s or Master’s degree in Computer Science, Engineering, Data Science, or a related field (or equivalent practical experience).
5+ years of professional experience in data engineering or AI engineering roles.
Strong proficiency in Python for data processing, automation, and AI model integration.
Hands-on experience with Azure Databricks and Spark for large-scale data engineering.
Proficiency in working with Snowflake for cloud data warehousing.
In-depth experience with Azure Storage solutions (Blob, Data Lake) for data ingestion and management.
Familiarity with Palantir Foundry or similar enterprise data integration platforms.
Demonstrated experience integrating and deploying GenAI or Agentic AI models in production environments.
Knowledge of API development and integration for AI and data services.
Strong problem-solving skills and ability to work in a fast-paced, collaborative environment.
Excellent communication and documentation skills.
Preferred Qualifications
Experience with Azure Machine Learning, Azure Synapse Analytics, and other Azure AI/data services.
Experience with MLOps, model monitoring, and automated deployment pipelines in Azure.
Exposure to data governance, privacy, and security best practices.
Experience with visualization tools and dashboard development.
Knowledge of advanced AI model architectures, including LLMs and agent-based systems.
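A rough sketch of the Azure Databricks-to-Snowflake data movement this posting describes: read Parquet from ADLS Gen2 and append to a Snowflake table through the connector bundled with Databricks runtimes. The storage account, credentials, and table names are placeholders; in practice credentials would come from a secret scope rather than literals, and cluster authentication to ADLS is assumed to be configured.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # Provided automatically on Databricks.

# Hypothetical ADLS Gen2 location; the cluster is assumed to be configured
# with service principal or managed identity access to the storage account.
events = spark.read.parquet(
    "abfss://raw@contosodatalake.dfs.core.windows.net/events/2025/07/"
)

# Minimal cleanup before publishing.
curated = events.dropDuplicates(["event_id"]).where("event_type IS NOT NULL")

# Snowflake connection options; placeholder values for illustration only.
sf_options = {
    "sfUrl": "myaccount.snowflakecomputing.com",
    "sfUser": "svc_databricks",
    "sfPassword": "<from-secret-scope>",
    "sfDatabase": "ANALYTICS",
    "sfSchema": "CURATED",
    "sfWarehouse": "TRANSFORM_WH",
}

# The Snowflake connector is available on Databricks as the "snowflake" data source.
(
    curated.write.format("snowflake")
    .options(**sf_options)
    .option("dbtable", "EVENTS_CURATED")
    .mode("append")
    .save()
)
```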
#DataEngineer
Weekly Hours: 40
Time Type: Regular
Location: Bangalore, Karnataka, India
It is the policy of AT&T to provide equal employment opportunity (EEO) to all persons regardless of age, color, national origin, citizenship status, physical or mental disability, race, religion, creed, gender, sex, sexual orientation, gender identity and/or expression, genetic information, marital status, status with regard to public assistance, veteran status, or any other characteristic protected by federal, state or local law. In addition, AT&T will provide reasonable accommodations for qualified individuals with disabilities. AT&T is a fair chance employer and does not initiate a background check until an offer is made.
Posted 5 days ago
3.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Category: Engineering
Experience: Principal Associate
Primary Address: Bangalore, Karnataka
Overview
Voyager (94001), India, Bangalore, Karnataka
Principal Associate - Fullstack Engineering
Job Description
Generative AI Observability & Governance for ML Platform
At Capital One India, we work in a fast-paced and intellectually rigorous environment to solve fundamental business problems at scale. Using advanced analytics, data science and machine learning, we derive valuable insights about product and process design, consumer behavior, regulatory and credit risk, and more from large volumes of data, and use it to build cutting-edge patentable products that drive the business forward.
We’re looking for a Principal Associate, Full Stack to join the Machine Learning Experience (MLX) team! As a Capital One Principal Associate, Full Stack, you'll be part of a team focusing on observability and model governance automation for cutting-edge generative AI use cases. You will work on building solutions to collect metadata, metrics and insights from the large-scale GenAI platform, and build intelligent solutions to derive deep insights into the platform's use-case performance and compliance with industry standards. You will contribute to building a system to do this for Capital One models, accelerating the move from fully trained models to deployable model artifacts ready to be used to fuel business decisioning, and build an observability platform to monitor the models and platform components.
The MLX team is at the forefront of how Capital One builds and deploys well-managed ML models and features. We onboard and educate associates on the ML platforms and products that the whole company uses. We drive new innovation and research and we’re working to seamlessly infuse ML into the fabric of the company. The ML experience we're creating today is the foundation that enables each of our businesses to deliver next-generation ML-driven products and services for our customers.
What You’ll Do:
Lead the design and implementation of observability tools and dashboards that provide actionable insights into platform performance and health.
Leverage Generative AI models and fine-tune them to enhance observability capabilities, such as anomaly detection, predictive analytics, and a troubleshooting copilot.
Build and deploy well-managed core APIs and SDKs for observability of LLMs and proprietary Gen-AI Foundation Models, including training, pre-training, fine-tuning and prompting.
Work with model and platform teams to build systems that ingest large amounts of model and feature metadata and runtime metrics to build an observability platform and to make governance decisions to ensure ethical use, data integrity, and compliance with industry standards for Gen-AI.
Partner with product and design teams to develop and integrate advanced observability tools tailored to Gen-AI.
Collaborate as part of a cross-functional Agile team with data scientists, ML engineers, and other stakeholders to understand requirements and translate them into scalable and maintainable solutions.
Bring a research mindset: lead proofs of concept to showcase capabilities of large language models in the realm of observability and governance, enabling practical production solutions that improve platform users' productivity.
Basic Qualifications:
Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
At least 4 years of experience designing and building data-intensive solutions using distributed computing, with a deep understanding of microservices architecture. At least 4 years of experience programming with Python, Go, or Java. Proficiency in observability tools such as Prometheus, Grafana, the ELK Stack, or similar, with a focus on adapting them for Gen AI systems. Excellent knowledge of OpenTelemetry and prior experience building SDKs and APIs. Hands-on experience with Generative AI models and their application in observability or related areas. At least 2 years of experience with cloud platforms like AWS, Azure, or GCP. Preferred Qualifications: At least 4 years of experience building, scaling, and optimizing ML systems. At least 3 years of experience in MLOps, using either open-source tools like MLflow or commercial tools. At least 2 years of experience developing applications using Generative AI (open-source or commercial LLMs), and some experience with recent open-source libraries such as LangChain and Haystack and vector databases like OpenSearch, Chroma, and FAISS. Prior experience leveraging open-source observability libraries such as Langfuse, Phoenix, OpenInference, or Helicone is preferred. Contributions to open-source libraries, specifically Gen-AI and ML solutions. Authored/co-authored a paper on an ML technique, model, or proof of concept. Preferred experience with an industry-recognized ML framework such as scikit-learn, PyTorch, Dask, Spark, or TensorFlow. Prior experience with NVIDIA GPU telemetry and experience with CUDA. Knowledge of data governance and compliance, particularly in the context of machine learning and AI systems. No agencies please. Capital One is an equal opportunity employer (EOE, including disability/vet) committed to non-discrimination in compliance with applicable federal, state, and local laws. Capital One promotes a drug-free workplace. Capital One will consider for employment qualified applicants with a criminal history in a manner consistent with the requirements of applicable laws regarding criminal background inquiries, including, to the extent applicable, Article 23-A of the New York Correction Law; San Francisco, California Police Code Article 49, Sections 4901-4920; New York City's Fair Chance Act; Philadelphia's Fair Criminal Records Screening Act; and other applicable federal, state, and local laws and regulations regarding criminal background inquiries. If you have visited our website in search of information on employment opportunities or to apply for a position, and you require an accommodation, please contact Capital One Recruiting at 1-800-304-9102 or via email at RecruitingAccommodation@capitalone.com. All information you provide will be kept confidential and will be used only to the extent required to provide needed reasonable accommodations. For technical support or questions about Capital One's recruiting process, please send an email to Careers@capitalone.com. Capital One does not provide, endorse, nor guarantee and is not liable for third-party products, services, educational tools, or other information available through this site. Capital One Financial is made up of several different entities. Please note that any position posted in Canada is for Capital One Canada, any position posted in the United Kingdom is for Capital One Europe, and any position posted in the Philippines is for Capital One Philippines Service Corp. (COPSSC).
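The observability work described in this listing revolves around emitting traces and metrics for LLM calls. As a rough illustration only (not Capital One's actual stack), a minimal sketch using the OpenTelemetry Python SDK wraps a model call in a span and attaches token counts as attributes; the call_llm function and the attribute names are hypothetical placeholders.

```python
# Minimal sketch: tracing a hypothetical LLM call with OpenTelemetry (Python).
# Assumes the opentelemetry-sdk package is installed; call_llm() and the
# attribute names are illustrative placeholders, not a real provider API.
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor

provider = TracerProvider()
provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)
tracer = trace.get_tracer("genai.observability.sketch")

def call_llm(prompt: str) -> dict:
    # Placeholder for a real model call; returns fake usage numbers.
    return {"text": "ok", "prompt_tokens": len(prompt.split()), "completion_tokens": 5}

def traced_generation(prompt: str) -> str:
    with tracer.start_as_current_span("llm.generate") as span:
        span.set_attribute("llm.prompt_tokens_estimate", len(prompt.split()))
        result = call_llm(prompt)
        span.set_attribute("llm.completion_tokens", result["completion_tokens"])
        return result["text"]

if __name__ == "__main__":
    traced_generation("Summarize today's platform health report.")
```

In a real platform the console exporter would typically be swapped for an OTLP exporter feeding a Prometheus/Grafana-style backend.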
How We Hire: We take finding great coworkers pretty seriously.
Step 1 - Apply: It only takes a few minutes to complete our application and assessment.
Step 2 - Screen and Schedule: If your application is a good match, you'll hear from one of our recruiters to set up a screening interview.
Step 3 - Interview(s): Now's your chance to learn about the job, show us who you are, share why you would be a great addition to the team, and determine if Capital One is the place for you.
Step 4 - Decision: The team will discuss; if it's a good fit for us and you, we'll make it official!
How to Pick the Perfect Career Opportunity: Overwhelmed by a tough career choice? Read these tips from Devon Rollins, Senior Director of Cyber Intelligence, to help you accept the right offer with confidence.
Your wellbeing is our priority: our benefits and total compensation package is designed for the whole person, caring for both you and your family. Healthy Body, Healthy Mind: you have options and we have the tools to help you decide which health plans best fit your needs. Save Money, Make Money: secure your present, plan for your future, and reduce expenses along the way. Time, Family and Advice: options for your time, opportunities for your family, and advice along the way. It's time to BeWell.
Career Journey: here's how the team fits together. We're big on growth and knowing who and how coworkers can best support you.
Posted 5 days ago
2.0 years
0 Lacs
Gurugram, Haryana
On-site
Location Gurugram, Haryana, India Category Corporate Job Id GGN00002168 Tech Ops / Maintenance - Management & Administrative Job Type Full-Time Posted Date 07/24/2025 Achieving our goals starts with supporting yours. Grow your career, access top-tier health and wellness benefits, build lasting connections with your team and our customers, and travel the world using our extensive route network. Come join us to create what's next. Let's define tomorrow, together. Description At United, we have some of the best aircraft in the world. Our Technical Operations team is full of aircraft maintenance technicians, engineers, planners, ground equipment and facilities professionals, and supply chain teams that help make sure they're well taken care of and ready to get our customers to their desired destinations. If you're ready to work on our planes, join our Tech Ops experts and help keep our fleet in tip-top shape. Job overview and responsibilities United's Maintenance & Engineering operation collects mountains of data, including maintenance plans, log pages, task signoffs, schedule reliability performance, aircraft routing, part availability, and more. The Tech Ops Business Intelligence team is tasked with delivering the right information to the right people in the right format at the right time, all with the goal of enabling better operational decisions that improve United's flight completion rate, on-time performance, productivity, and cost. This includes performance trends looking backward, real-time operational status, and expectations looking forward. The team has five core responsibilities: data design and validation, data analysis, KPI design, dashboard creation, and automation. Support the design of meaningful metrics that indicate operational health and inform operational decisions Generate high-quality operational dashboards and reports for Tech Ops leadership, front-line management, and individual business teams throughout the organization Curate tables and views that serve as the "single source of truth" for United's Tech Ops data Continuously interface with business groups throughout Tech Ops to understand organizational needs and design solutions Support the automation of existing manual reports and processes to improve operational throughput Document the Tech Ops data landscape, maintain an inventory of reports, and plan for report consolidation, elimination, and/or improvement This position is offered on local terms and conditions. Expatriate assignments and sponsorship for employment visas, even on a time-limited visa status, will not be awarded. This position is for United Airlines Business Services Pvt. Ltd - a wholly owned subsidiary of United Airlines Inc. Qualifications What's needed to succeed (Minimum Qualifications): Bachelor's degree in a quantitative field like Math, Statistics, Operations Research, Computer Science, Engineering, or a related field required At least 2 years of experience in analytics/reporting required Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy Understanding of data structures, relationships, and efficient transformations Knowledge and application of data visualization best practices Familiarity with writing complex queries and procedures using both traditional and modern technologies/languages (e.g., SQL, Python, Spark) Data visualization skills using one or more reporting tools (e.g., Spotfire, Tableau, ggplot2) to produce meaningful, elegant dashboards Experience with JavaScript, D3, HTML, CSS / front-end development Ability to learn what a business team does, then design a data/technology solution that connects business processes with quantifiable outcomes Must be legally authorized to work in India for any employer without sponsorship Must be fluent in English (written and spoken) Successful completion of an interview is required to meet job qualifications Reliable, punctual attendance is an essential function of the position What will help you propel from the pack (Preferred Qualifications): Master's degree in a quantitative field preferred Airline experience or knowledge of airline operations preferred Familiarity with various parts of the data ecosystem (acquisition, engineering, storage, management, analysis, visualization, and deployment) preferred Exposure to statistical and analytical methods preferred
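Since the role leans on SQL/Python plus visualization to track completion rate and on-time performance, here is a minimal, hypothetical pandas sketch of the KPI aggregation step that would feed such a dashboard; the column names, sample values, and the 15-minute threshold are assumptions, not United's actual schema.

```python
# Minimal sketch: computing two Tech Ops style KPIs with pandas.
# The DataFrame columns (fleet, completed, dep_delay_minutes) are hypothetical
# placeholders for illustration only.
import pandas as pd

flights = pd.DataFrame({
    "fleet": ["B737", "B737", "B787", "B787", "A320"],
    "completed": [True, True, True, False, True],
    "dep_delay_minutes": [5, 22, 0, None, 48],
})

kpis = (
    flights.groupby("fleet")
    .agg(
        completion_rate=("completed", "mean"),
        # rough on-time proxy: share of recorded departures within 15 minutes
        on_time_rate=("dep_delay_minutes", lambda s: (s.dropna() <= 15).mean()),
        flights=("completed", "size"),
    )
    .reset_index()
)
print(kpis)
```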
Posted 5 days ago
0.0 - 8.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
Designation: Assistant Manager Experience: 5 to 8 years Location: Chennai, Tamil Nadu, India (CHN) Job Description: 5+ years of experience working in web, product, marketing, or other related analytics fields to solve marketing/product business problems 4+ years of experience in designing and executing experiments (A/B and multivariate) with a deep understanding of the statistics behind hypothesis testing Proficiency in alternative A/B testing methods like difference-in-differences (DiD), synthetic control, and other causal inference techniques 5+ years of technical proficiency in SQL, Python, or R and data visualization tools like Tableau 5+ years of experience in manipulating and analyzing large, complex datasets (e.g., clickstream data), constructing data pipelines (ETL), and working with big data technologies (e.g., Redshift, Spark, Hive, BigQuery), cloud platform solutions, and visualization tools like Tableau 3+ years of experience in web analytics, analyzing website traffic patterns and conversion funnels 5+ years of experience in building ML models (e.g., regression, clustering, trees) for personalization applications Demonstrated ability to drive strategy, execution, and insights for AI-native experiences across the development lifecycle (ideation, discovery, experimentation, scaling) Outstanding communication skills with both technical and non-technical audiences Ability to tell stories with data, influence business decisions at a leadership level, and provide solutions to business problems Ability to manage multiple projects simultaneously to meet objectives and key deadlines Responsibilities: Drive measurement strategy and lead the end-to-end process of A/B testing for areas of web optimization such as landing pages, user funnel, navigation, checkout, product lineup, pricing, search, and monetization opportunities Analyze web user behavior at both visitor and session level using clickstream data by anchoring to key web metrics, and identify user behavior through engagement and pathing analysis Leverage AI/GenAI tools for automating tasks and building custom implementations Use data, strategic thinking, and advanced scientific methods, including predictive modeling, to enable data-backed decision making for Intuit at scale Measure performance and impact of various product releases Demonstrate strategic thinking and systems thinking to solve business problems and influence strategic decisions using data storytelling Partner with GTM, Product, Engineering, and Design teams to drive analytics projects end to end Build models to identify patterns in traffic and user behavior to inform acquisition strategies and optimize for business outcomes Skills: 5 to 8 years in the DA domain: web, product, marketing, A/B testing methods like DiD and synthetic control, constructing data pipelines (ETL), big data technologies (e.g., Redshift, Spark, Hive, BigQuery), SQL, Python or R and Tableau, web analytics, analyzing website traffic patterns and conversion funnels, ML models (e.g., regression, clustering, trees), managerial skills Job Snapshot Updated Date 24-07-2025 Job ID J_3934 Location Chennai, Tamil Nadu, India Experience 5 - 8 Years Employee Type Permanent
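For the A/B-testing emphasis above, a minimal sketch of the underlying hypothesis test is a two-sided, two-proportion z-test on conversion rates, implemented directly with scipy; the traffic and conversion numbers below are made up for illustration.

```python
# Minimal sketch: two-proportion z-test for an A/B conversion experiment.
# The conversion counts are made-up illustrative numbers, not real data.
from math import sqrt
from scipy.stats import norm

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)           # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * norm.sf(abs(z))                       # two-sided p-value
    return p_a, p_b, z, p_value

p_a, p_b, z, p_value = two_proportion_ztest(conv_a=480, n_a=10_000, conv_b=540, n_b=10_000)
print(f"control={p_a:.3%} variant={p_b:.3%} z={z:.2f} p={p_value:.4f}")
```

Methods like DiD or synthetic control come into play when randomization is not possible; the z-test above only covers the plain randomized A/B case.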
Posted 5 days ago
5.0 years
0 Lacs
Bengaluru, Karnataka
On-site
At Takeda, we are guided by our purpose of creating better health for people and a brighter future for the world. Every corporate function plays a role in making sure we — as a Takeda team — can discover and deliver life-transforming treatments, guided by our commitment to patients, our people and the planet. People join Takeda because they share in our purpose. And they stay because we're committed to an inclusive, safe and empowering work environment that offers exceptional experiences and opportunities for everyone to pursue their own ambitions. Job ID R0158759 Date posted 07/24/2025 Location Bengaluru, Karnataka I understand that my employment application process with Takeda will commence and that the information I provide in my application will be processed in line with Takeda's Privacy Notice and Terms of Use. I further attest that all information I submit in my employment application is true to the best of my knowledge. Job Description The Future Begins Here At Takeda, we are leading digital evolution and global transformation. By building innovative solutions and future-ready capabilities, we are meeting the needs of patients, our people, and the planet. Bengaluru, the city which is India's epicenter of innovation, has been selected to be home to Takeda's recently launched Innovation Capability Center. We invite you to join our digital transformation journey. In this role, you will have the opportunity to boost your skills and become the heart of an innovative engine that is contributing to global impact and improvement. At Takeda's ICC we Unite in Diversity. Takeda is committed to creating an inclusive and collaborative workplace, where individuals are recognized for their backgrounds and the abilities they bring to our company. We are continuously improving our collaborators' journey in Takeda, and we welcome applications from all qualified candidates. Here, you will feel welcomed, respected, and valued as an important contributor to our diverse team. The Opportunity: The Data Engineer will work directly with architects and product owners on the delivery of data pipelines and platforms for structured and unstructured data as part of a transformational data program. This data program will include an integrated data flow with end-to-end control of data, internalization of numerous systems and processes, broad enablement of automation and near-time data access, efficient data review and query, and enablement of disruptive technologies for next-generation trial designs and insight derivation. We are primarily looking for people who love taking complex data and making it easy to use. As a Data Engineer you will provide leadership to develop and execute highly complex and large-scale data structures and pipelines to organize, collect, and standardize data to generate insights and address reporting needs, and you will interpret and integrate advanced techniques to ingest structured and unstructured data across a complex ecosystem. Delivery & Business Accountabilities: Build and maintain technical solutions required for optimal ingestion, transformation, and loading of data from a wide variety of data sources and large, complex data sets, with a focus on clinical and operational data Develop data profiling and data quality methodologies and embed them into the processes involved in transforming data across the systems.
Manages and influences the data pipeline and analysis approaches, using different technologies, big data preparation, programming, and loading, as well as initial exploration in the process of searching for and finding data patterns. Uses data science input and requests, translating these from data exploration over large (billions of records) and unstructured data sets into mathematical algorithms, and uses various tooling, from programming languages to newer tools (artificial intelligence and machine learning), to find data patterns and to build and optimize models. Leads and implements ongoing tests in the search for solutions in data modelling, collects and prepares training data, tunes the data, and optimizes algorithm implementations to test, scale, and deploy future models. Conducts and facilitates analytical assessments, conceptualizing business needs and translating them into analytical opportunities. Leads the development of technical roadmaps and approaches for data analyses to find patterns, design data models, and scale models to a managed production environment within the current or an evolving technical landscape. Influences and manages data exploration from analysis to scalable models, works independently, and decides quickly on transfers in complex data analysis and modelling. Skills and Qualifications: Bachelor's degree or higher in a quantitative discipline such as Statistics, Mathematics, Engineering, Computer Science, Econometrics, or information sciences such as business analytics or informatics 5+ years of experience working in a data engineering role in an enterprise environment Strong experience with ETL/ELT design and implementations in the context of large, disparate, and complex datasets Demonstrated experience with a variety of relational database and data warehousing technologies such as AWS Redshift, Athena, RDS, BigQuery Demonstrated experience with big data processing systems and distributed computing technology such as Databricks, Spark, SageMaker, Kafka, Tidal/Airflow, etc. Demonstrated experience with DevOps tools such as GitLab, Terraform, Ansible, Chef, etc. Experience with developing solutions on cloud computing services and infrastructure in the data and analytics space Solution-oriented enabler mindset Prior experience with data engineering projects and teams at an enterprise level Preferred: Understanding or application of machine learning and/or deep learning Significant experience in an analytical role in the healthcare industry preferred WHAT TAKEDA ICC INDIA CAN OFFER YOU: Takeda is certified as a Top Employer, not only in India, but also globally. No investment we make pays greater dividends than taking good care of our people. At Takeda, you take the lead on building and shaping your own career. Joining the ICC in Bengaluru will give you access to high-end technology, continuous training and a diverse and inclusive network of colleagues who will support your career growth. BENEFITS: It is our priority to provide competitive compensation and a benefit package that bridges your personal life with your professional career. Amongst our benefits are: Competitive Salary + Performance Annual Bonus Flexible work environment, including hybrid working Comprehensive Healthcare Insurance Plans for self, spouse, and children Group Term Life Insurance and Group Accident Insurance programs.
Employee Assistance Program Broad Variety of learning platforms Diversity, Equity, and Inclusion Programs Reimbursements – Home Internet & Mobile Phone Employee Referral Program Leaves – Paternity Leave (4 Weeks), Maternity Leave (up to 26 weeks), Bereavement Leave (5 days) ABOUT ICC IN TAKEDA: Takeda is leading a digital revolution. We’re not just transforming our company; we’re improving the lives of millions of patients who rely on our medicines every day. As an organization, we are committed to our cloud-driven business transformation and believe the ICCs are the catalysts of change for our global organization. #Li-Hybrid Locations IND - Bengaluru Worker Type Employee Worker Sub-Type Regular Time Type Full time
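As a loose illustration of the ingest-transform-load work this role describes (not Takeda's actual pipeline), a minimal PySpark step might standardize a raw clinical extract and write partitioned Parquet for downstream warehouse loads; the storage paths and column names are hypothetical.

```python
# Minimal sketch: a PySpark ingest-and-standardize step of the ETL/ELT kind
# described above. Paths and column names (subject_id, visit_date) are
# hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("clinical-ingest-sketch").getOrCreate()

raw = (
    spark.read.option("header", True)
    .csv("s3://example-bucket/raw/clinical_visits/")   # assumed landing path
)

clean = (
    raw.dropDuplicates(["subject_id", "visit_date"])
    .withColumn("visit_date", F.to_date("visit_date", "yyyy-MM-dd"))
    .filter(F.col("subject_id").isNotNull())
)

# Partitioned Parquet output for downstream warehouse loads (e.g. Redshift Spectrum).
clean.write.mode("overwrite").partitionBy("visit_date").parquet(
    "s3://example-bucket/curated/clinical_visits/"
)
```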
Posted 5 days ago
0.0 years
0 Lacs
Bengaluru, Karnataka
On-site
At Takeda, we are guided by our purpose of creating better health for people and a brighter future for the world. Every corporate function plays a role in making sure we — as a Takeda team — can discover and deliver life-transforming treatments, guided by our commitment to patients, our people and the planet. People join Takeda because they share in our purpose. And they stay because we're committed to an inclusive, safe and empowering work environment that offers exceptional experiences and opportunities for everyone to pursue their own ambitions. Job ID R0152063 Date posted 07/24/2025 Location Bengaluru, Karnataka I understand that my employment application process with Takeda will commence and that the information I provide in my application will be processed in line with Takeda's Privacy Notice and Terms of Use. I further attest that all information I submit in my employment application is true to the best of my knowledge. Job Description About the role: The Platform Engineer is responsible for building large-scale systems and creating robust and scalable platforms used by every team. This includes building the data transport, collection, and orchestration layers. These efforts enable access to business and user behavior insights. How you will contribute: Build and maintain high-performance, fault-tolerant, secure, and scalable platforms Partner with architects and business leaders to design and build robust services using the storage layer, streaming, and batch data Think through the long-term impacts of key design decisions and handle failure scenarios Form a holistic understanding of tools, key business concepts (data tables), and the data and team dependencies Help drive the storage layer and API feature roadmap and be responsible for the overall engineering (design, implementation, and testing) Build self-service platforms to empower users Lead development of high-leverage projects and capabilities of the platform Skills and qualifications: Utilizes DevSecOps tools and methodologies to enhance security and operational processes Designs and implements effective data models to improve data accessibility and utility Applies Agile and SDLC methodologies to optimize the software development life cycle Understands and manipulates data structures and algorithms to solve complex problems Works with distributed data technologies such as Spark and Hadoop for large-scale data processing Integrates multiple systems, ensuring seamless data flow and functionality Proficient in data engineering programming languages including SQL, Python, Scala, Java, and C++ Deploys and manages applications on cloud platforms using tools like Kubernetes As an entry-level professional, you will tackle challenges within a focused and manageable scope. Your role is pivotal in applying core theories and concepts to practical scenarios, reflecting a seamless transition from academic excellence to professional application. You will harness standard methodologies to evaluate situations and data, cultivating a budding understanding of industry practices. Typically, this role requires a bachelor's or college degree or equivalent professional experience. Your role is characterized by growth and learning, and your journey within Takeda will evolve, fostering valuable internal relationships. BENEFITS: It is our priority to provide competitive compensation and a benefit package that bridges your personal life with your professional career.
Amongst our benefits are: Competitive Salary + Performance Annual Bonus Flexible work environment, including hybrid working Comprehensive Healthcare Insurance Plans for self, spouse, and children Group Term Life Insurance and Group Accident Insurance programs Health & Wellness programs, including annual health screenings and weekly health sessions for employees Employee Assistance Program 3 days of leave every year for voluntary service, in addition to humanitarian leave Broad variety of learning platforms Diversity, Equity, and Inclusion Programs Reimbursements – Home Internet & Mobile Phone Employee Referral Program Company-Provided Transport: available at scheduled times for 2nd-shift employees for a smooth commute to the office and back home Security Escort for Drop-Off: female employees receive a security escort to their designated drop-off point after the shift for safety Shift Allowance: an additional shift allowance will be provided for hours worked outside regular working hours Food/Meal: meals will be provided for 2nd-shift employees Leaves – Paternity Leave (4 weeks), Maternity Leave (up to 26 weeks), Bereavement Leave (5 calendar days) ABOUT ICC IN TAKEDA: Takeda is leading a digital revolution. We're not just transforming our company; we're improving the lives of millions of patients who rely on our medicines every day. As an organization, we are committed to our cloud-driven business transformation and believe the ICCs are the catalysts of change for our global organization. Locations IND - Bengaluru Worker Type Employee Worker Sub-Type Regular Time Type Full time
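The data transport and collection layers mentioned in this role typically pair a message bus with a stream processor. A minimal sketch, assuming Kafka and Spark Structured Streaming with a hypothetical broker and topic (and the spark-sql-kafka connector on the classpath), could look like this:

```python
# Minimal sketch: Spark Structured Streaming reading a Kafka topic, of the
# transport/collection flavor described above. The broker address and topic
# name are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("event-transport-sketch").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")   # assumed broker
    .option("subscribe", "user-behavior-events")        # assumed topic
    .load()
    .select(F.col("key").cast("string"), F.col("value").cast("string"), "timestamp")
)

# Console sink for illustration; a real pipeline would write to the storage layer.
query = (
    events.writeStream.format("console")
    .outputMode("append")
    .option("truncate", False)
    .start()
)
query.awaitTermination()
```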
Posted 5 days ago
0.0 - 8.0 years
0 Lacs
Bengaluru, Karnataka
On-site
About Us Observe.AI is transforming customer service with AI agents that speak, think, and act like your best human agents—helping enterprises automate routine customer calls and workflows, support agents in real time, and uncover powerful insights from every interaction. With Observe.AI, businesses boost automation, deliver faster, more consistent 24/7 service, and build stronger customer loyalty. Trusted by brands like Accolade, Prudential, Concentrix, Cox Automotive, and Included Health, Observe.AI is redefining how businesses connect with customers—driving better experiences and lasting relationships at every touchpoint. The Opportunity We are looking for a Senior Data Engineer with strong hands-on experience in building scalable data pipelines and real-time processing systems. You will be part of a high-impact team focused on modernizing our data architecture, enabling self-serve analytics, and delivering high-quality data products. This role is ideal for engineers who love solving complex data challenges, have a growth mindset, and are excited to work on both batch and streaming systems. What you'll be doing: Build and maintain real-time and batch data pipelines using tools like Kafka, Spark, and Airflow. Contribute to the development of a scalable LakeHouse architecture using modern data formats such as Delta Lake, Hudi, or Iceberg. Optimize data ingestion and transformation workflows across cloud platforms (AWS, GCP, or Azure). Collaborate with Analytics and Product teams to deliver data models, marts, and dashboards that drive business insights. Support data quality, lineage, and observability using modern practices and tools. Participate in Agile processes (Sprint Planning, Reviews) and contribute to team knowledge sharing and documentation. Contribute to building data products for inbound (ingestion) and outbound (consumption) use cases across the organization. Who you are: 5-8 years of experience in data engineering or backend systems with a focus on large-scale data pipelines. Hands-on experience with streaming platforms (e.g., Kafka) and distributed processing tools (e.g., Spark or Flink). Working knowledge of LakeHouse formats (Delta/Hudi/Iceberg) and columnar storage like Parquet. Proficient in building pipelines on AWS, GCP, or Azure using managed services and cloud-native tools. Experience in Airflow or similar orchestration platforms. Strong in data modeling and optimizing data warehouses like Redshift, BigQuery, or Snowflake. Exposure to real-time OLAP tools like ClickHouse, Druid, or Pinot. Familiarity with observability tools such as Grafana, Prometheus, or Loki. Some experience integrating data with MLOps tools like MLflow, SageMaker, or Kubeflow. Ability to work with Agile practices using JIRA and Confluence and to participate in engineering ceremonies. Compensation, Benefits and Perks Excellent medical insurance options and free online doctor consultations Yearly privilege and sick leaves as per the Karnataka S&E Act Generous holidays (national and festive), recognition, and parental leave policies Learning & Development fund to support your continuous learning journey and professional development Fun events to build culture across the organization Flexible benefit plans for tax exemptions (e.g., meal card, PF) Our Commitment to Inclusion and Belonging Observe.AI is an Equal Employment Opportunity employer that proudly pursues and hires a diverse workforce.
Observe AI does not make hiring or employment decisions on the basis of race, color, religion or religious belief, ethnic or national origin, nationality, sex, gender, gender identity, sexual orientation, disability, age, military or veteran status, or any other basis protected by applicable local, state, or federal laws or prohibited by Company policy. Observe.AI also strives for a healthy and safe workplace and strictly prohibits harassment of any kind. We welcome all people. We celebrate diversity of all kinds and are committed to creating an inclusive culture built on a foundation of respect for all individuals. We seek to hire, develop, and retain talented people from all backgrounds. Individuals from non-traditional backgrounds, historically marginalized or underrepresented groups are strongly encouraged to apply. If you are ambitious, make an impact wherever you go, and you're ready to shape the future of Observe.AI, we encourage you to apply.
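Airflow orchestration of batch pipelines, as listed in this role, usually boils down to a DAG of dependent tasks. A minimal sketch with placeholder ingest and transform callables follows; the DAG id, schedule, and task bodies are illustrative assumptions, not Observe.AI's actual pipeline.

```python
# Minimal sketch: an Airflow 2.x-style DAG of the batch-orchestration kind
# mentioned above, with placeholder ingest/transform callables.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest_interactions(**_):
    print("pull raw interaction batches from object storage")  # placeholder step

def build_marts(**_):
    print("transform into analytics marts / lakehouse tables")  # placeholder step

with DAG(
    dag_id="interaction_batch_pipeline_sketch",   # hypothetical DAG id
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest_interactions", python_callable=ingest_interactions)
    transform = PythonOperator(task_id="build_marts", python_callable=build_marts)
    ingest >> transform
```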
Posted 5 days ago
0.0 - 3.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Job description Schneider Electric is looking for an AWS data cloud engineer with a minimum of 5 years of experience in AWS data lake implementation. The role is responsible for creating and managing data ingestion and transformation and making data ready for consumption in the analytical layer of the data lake, for managing and monitoring the data quality of the data lake using Informatica PowerCenter, and for creating dashboards from the analytical layer of the data lake using Tableau or Power BI. Your Role We are looking for strong AWS Data Engineers who are passionate about Cloud technology. Your responsibilities are: Design and Develop Data Pipelines: Create robust pipelines to ingest, process, and transform data, ensuring it is ready for analytics and reporting. Implement ETL/ELT Processes: Develop Extract, Transform, Load (ETL) or Extract, Load, Transform (ELT) workflows to seamlessly move data from source systems to Data Warehouses, Data Lakes, and Lake Houses using open-source and AWS tools. Implement data quality rules, perform data profiling to assess source data quality, identify data anomalies, and create data quality scorecards using Informatica PowerCenter. Design Data Solutions: Leverage your analytical skills to design innovative data solutions that address complex business requirements and drive decision-making. Interact with product owners to understand the needs for data ingestion and data quality rules. Adopt DevOps Practices (optional skill): Utilize DevOps methodologies and tools for continuous integration and deployment (CI/CD), infrastructure as code (IaC), and automation to streamline and enhance our data engineering processes. Qualifications Your Skills and Experience A minimum of 3 to 5 years of experience in AWS data lake implementation. A minimum of 2 to 3 years of knowledge of Informatica PowerCenter. Proficiency with AWS Tools: Demonstrable experience using AWS Glue, AWS Lambda, Amazon Kinesis, Amazon EMR, Amazon Athena, Amazon DynamoDB, Amazon CloudWatch, Amazon SNS, and AWS Step Functions. Understanding of Spark, Hive, Kafka, Kinesis, Spark Streaming, and Airflow. Understanding of relational databases like Oracle, SQL Server, and MySQL. Programming Skills: Strong experience with modern programming languages such as Python and Java. Expertise in Data Storage Technologies: In-depth knowledge of Data Warehouse and database technologies and Big Data ecosystem technologies such as AWS Redshift, AWS RDS, and Hadoop. Experience with AWS Data Lakes: Proven experience working with AWS data lakes on AWS S3 to store and process both structured and unstructured data sets. Expertise in developing Business Intelligence dashboards in Tableau or Power BI is a plus. Good knowledge of a project and portfolio management suite of tools is a plus. Should be well versed in Agile implementation principles; knowledge of SAFe (Scaled Agile) principles is a plus. About Us Schneider Electric™ creates connected technologies that reshape industries, transform cities and enrich lives. Our 144,000 employees thrive in more than 100 countries. From the simplest of switches to complex operational systems, our technology, software and services improve the way our customers manage and automate their operations. Help us deliver solutions that ensure Life Is On everywhere, for everyone and at every moment: https://youtu.be/NlLJMv1Y7Hk. Great people make Schneider Electric a great company.
We seek out and reward people for putting the customer first, being disruptive to the status quo, embracing different perspectives, continuously learning, and acting like owners. We want our employees to reflect the diversity of the communities in which we operate. We welcome people as they are, creating an inclusive culture where all forms of diversity are seen as a real value for the company. We’re looking for people with a passion for success — on the job and beyond. See what our people have to say about working for Schneider Electric: https://youtu.be/6D2Av1uUrzY Our EEO statement : Schneider Electric aspires to be the most inclusive and caring company in the world, by providing equitable opportunities to everyone, everywhere, and ensuring all employees feel uniquely valued and safe to contribute their best. We mirror the diversity of the communities in which we operate and we ‘embrace different’ as one of our core values. We believe our differences make us stronger as a company and as individuals and we are committed to championing inclusivity in everything we do. This extends to our Candidates and is embedded in our Hiring Practices. You can find out more about our commitment to Diversity, Equity and Inclusion here and our DEI Policy here Schneider Electric is an Equal Opportunity Employer. It is our policy to provide equal employment and advancement opportunities in the areas of recruiting, hiring, training, transferring, and promoting all qualified individuals regardless of race, religion, color, gender, disability, national origin, ancestry, age, military status, sexual orientation, marital status, or any other legally protected characteristic or conduct Primary Location : IN-Karnataka-Bangalore Schedule : Full-time Unposting Date : Ongoing
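For the consumption-layer work this posting describes, querying the analytical layer of an S3 data lake is often done through Athena. A minimal boto3 sketch, with a hypothetical region, database, table, and results bucket, might look like this:

```python
# Minimal sketch: running an Athena query over an S3 data lake's analytical
# layer with boto3. Region, database, table, and output location are
# hypothetical placeholders.
import time
import boto3

athena = boto3.client("athena", region_name="ap-south-1")  # assumed region

start = athena.start_query_execution(
    QueryString="SELECT site_id, COUNT(*) AS readings FROM energy_readings GROUP BY site_id",
    QueryExecutionContext={"Database": "analytics_layer"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
query_id = start["QueryExecutionId"]

# Poll until the query finishes, then fetch the first page of results.
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    print(f"fetched {len(rows) - 1} result rows")  # the first row is the header
```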
Posted 5 days ago
5.0 years
0 Lacs
Hyderabad, Telangana
On-site
Job Title: Databricks Developer / Data Engineer Duration: 12 Months with Possible Extension Location: Hyderabad, Telangana (Hybrid), 1-2 days onsite at client location Job Summary: We are seeking a highly skilled Databricks Developer / Data Engineer with 5+ years of experience in building scalable data pipelines, managing large datasets, and optimizing data workflows in cloud environments. The ideal candidate will have hands-on expertise in Azure Databricks, Azure Data Factory, and other Azure-native services, playing a key role in enabling data-driven decision-making across the organization. Key Responsibilities: Design, develop, and maintain scalable ETL/ELT pipelines for data ingestion, transformation, and integration Work with both structured and unstructured data from a variety of internal and external sources Collaborate with data analysts, scientists, and engineers to ensure data quality, integrity, and availability Build and manage data lakes, data warehouses, and data models (Azure Databricks, Azure Data Factory, Snowflake, etc.) Optimize performance of large-scale batch and real-time processing systems Implement data governance, metadata management, and data lineage practices Monitor and troubleshoot pipeline issues; perform root cause analysis and proactive resolution Automate data validation and quality checks Ensure compliance with data privacy, security, and regulatory requirements Maintain thorough documentation of architecture, data workflows, and processes Mandatory Qualifications: 5+ years of hands-on experience with: Azure Blob Storage, Azure Data Lake Storage, Azure SQL Database, Azure Logic Apps, Azure Data Factory, Azure Databricks, Azure ML, Azure DevOps Services, Azure API Management, Webhooks Intermediate-level proficiency in Python scripting and PySpark Basic understanding of Power BI and visualization functionalities Technical Skills & Experience Required: Proficient in SQL and working with both relational and non-relational databases (e.g., SQL, PostgreSQL, MongoDB, Cassandra) Hands-on experience with Apache Spark, Hadoop, and Hive for big data processing Proficiency in building scalable data pipelines using Azure Data Factory and Azure Databricks Solid knowledge of cloud-native tools: Delta Lake, Azure ML, Azure DevOps Understanding of data modeling, OLAP/OLTP systems, and data warehousing best practices Experience with CI/CD pipelines, version control with Git, and working with Azure Repos Knowledge of data security, privacy policies, and compliance frameworks Excellent problem-solving, troubleshooting, and analytical skills
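Since the role calls for automating data validation and quality checks in Azure Databricks pipelines, here is a minimal, hypothetical PySpark sketch of such a gate before a Delta write; the storage paths, column names, and thresholds are assumptions, and the delta format requires a Databricks or delta-spark environment.

```python
# Minimal sketch: simple automated data-quality checks before a Delta write,
# in the spirit of the validation step described above. Input path, columns,
# and thresholds are hypothetical; assumes a Databricks / delta-spark setup.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-check-sketch").getOrCreate()

df = spark.read.parquet("abfss://curated@examplelake.dfs.core.windows.net/orders/")

total = df.count()
null_ids = df.filter(F.col("order_id").isNull()).count()
dupes = total - df.dropDuplicates(["order_id"]).count()

# Each check maps to (observed ratio, maximum allowed ratio).
checks = {
    "null_order_id_ratio": (null_ids / total if total else 0.0, 0.001),
    "duplicate_order_id_ratio": (dupes / total if total else 0.0, 0.0),
}
failures = {name: observed for name, (observed, limit) in checks.items() if observed > limit}

if failures:
    raise ValueError(f"Data-quality checks failed: {failures}")

df.write.format("delta").mode("append").save(
    "abfss://curated@examplelake.dfs.core.windows.net/orders_validated/"
)
```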
Posted 5 days ago