5.0 - 9.0 years
0 Lacs
maharashtra
On-site
Join our fast-paced Corporate Oversight and Governance Technology AI/ML team at J.P. Morgan, where your skills and passion will drive innovation and make a significant impact. We offer unparalleled opportunities for career growth and a collaborative environment where you can thrive and contribute to meaningful projects. As an Applied AI ML Lead within the Corporate Oversight and Governance Technology team, you will solve challenging and impactful problems across a wide range of Corporate functions. You will study complex business problems and apply advanced algorithms to develop, test, and evaluate AI/ML applications or models. Working with the firm's rich data pool using Python/Spark via AWS, you will derive business insights and present them to non-technical audiences, contributing to stronger business programs and improved efficiency.
Job Responsibilities:
- Proactively develop an understanding of key business problems and processes.
- Execute tasks throughout the model development process, including data wrangling/analysis, model training, testing, and selection.
- Implement optimization strategies to fine-tune generative models for specific NLP use cases, ensuring high-quality outputs in summarization and text generation.
- Conduct thorough evaluations of generative models (e.g., GPT-4), iterate on model architectures, and implement improvements to enhance overall performance in NLP applications.
- Implement monitoring mechanisms to track model performance in real time and ensure model reliability.
- Communicate AI/ML/LLM/GenAI capabilities and results to both technical and non-technical audiences.
- Generate structured and meaningful insights from data analysis and modeling exercises and present them in a format appropriate to the audience.
- Collaborate with other data scientists and machine learning engineers to deploy machine learning solutions.
- Carry out ad hoc and periodic analyses as required by business stakeholders, the model risk function, and other groups.
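The model development process described above (data wrangling, training, testing, and selection) can be sketched in miniature. This is an illustrative, self-contained Python example with toy data and two deliberately simple candidate models, not J.P. Morgan code:

```python
# Illustrative sketch of a minimal model-development loop: train two
# candidate models, evaluate each on a holdout set, and select the one
# with the lower mean squared error. All data and models are toys.

def mse(y_true, y_pred):
    """Mean squared error between two equal-length sequences."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def mean_model(train_y):
    """Baseline: always predict the training mean."""
    mean = sum(train_y) / len(train_y)
    return lambda x: mean

def scaled_model(train_x, train_y):
    """Toy regression: least-squares slope through the origin."""
    slope = sum(x * y for x, y in zip(train_x, train_y)) / sum(x * x for x in train_x)
    return lambda x: slope * x

# Toy data: y is roughly 2*x.
train_x, train_y = [1, 2, 3, 4], [2.1, 3.9, 6.2, 7.8]
test_x, test_y = [5, 6], [10.1, 12.2]

candidates = {
    "mean": mean_model(train_y),
    "scaled": scaled_model(train_x, train_y),
}
# Evaluate each candidate on the holdout set.
scores = {name: mse(test_y, [m(x) for x in test_x]) for name, m in candidates.items()}
best = min(scores, key=scores.get)  # model selection step
```

In practice the same loop runs with real estimators (scikit-learn, XGBoost) and cross-validation rather than a single holdout split.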
Required Qualifications, Capabilities, and Skills:
- Formal training or certification in applied AI/ML concepts and 5+ years of applied experience.
- Proficiency in programming languages like Python for model development, experimentation, and integration with the OpenAI API.
- Experience with machine learning frameworks, libraries, and APIs, such as TensorFlow, PyTorch, scikit-learn, and the OpenAI API.
- Experience in building AI/ML models on structured and unstructured data, along with model explainability and model monitoring.
- Solid understanding of the fundamentals of statistics, machine learning (e.g., classification, regression, time series, deep learning, reinforcement learning), and generative model architectures, particularly GANs and VAEs.
- Experience with a broad range of analytical toolkits, such as SQL, Spark, scikit-learn, and XGBoost.
- Experience with graph analytics and neural networks (PyTorch).
- Excellent problem-solving, communication (verbal and written), and teamwork skills.
Preferred Qualifications, Capabilities, and Skills:
- Expertise in building AI/ML models on structured and unstructured data, along with model explainability and model monitoring.
- Expertise in designing and implementing pipelines using Retrieval-Augmented Generation (RAG).
- Familiarity with the financial services industry.
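The Retrieval-Augmented Generation (RAG) pipelines listed under preferred qualifications center on a retrieval step: find the document most relevant to a query and splice it into the prompt. A minimal sketch of that step, using bag-of-words cosine similarity in place of the dense embeddings and vector store a production RAG system would use (all document text and names here are invented):

```python
# Hedged sketch of the retrieval step in a RAG pipeline. Production
# systems use dense embeddings and a vector database; this toy version
# ranks documents by cosine similarity of word-count vectors.
import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list) -> str:
    """Return the document most similar to the query."""
    qv = Counter(query.lower().split())
    return max(docs, key=lambda d: cosine(qv, Counter(d.lower().split())))

docs = [
    "model monitoring tracks drift in production",
    "spark jobs process the firm's structured data",
    "summarization models generate concise text",
]
context = retrieve("how do we monitor model drift", docs)
# Augmentation step: ground the generator's answer in the retrieved text.
prompt = f"Answer using this context: {context}\nQuestion: how do we monitor model drift"
```

The `prompt` would then be sent to a generative model (e.g., via the OpenAI API); the retrieval scoring is the part this sketch actually implements.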
Posted 4 days ago
15.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Summary: Are you an outcome-oriented problem solver? Do you enjoy working on transformation strategies for global clients? Have you led large business transformation programs for global clients? Does working in an inclusive and collaborative environment spark your interest? Then Accenture Strategy and Consulting is the right place for you to explore limitless possibilities. As part of our Operations and Process Transformation Center of Excellence, you will help organizations reimagine and transform their processes for tomorrow, with a positive impact on the business, society, and the planet. While we are housed within Supply Chain and Operations from a reporting point of view, we are function agnostic and work across enterprise-wide processes, including Finance and Accounting and Human Resources. Together, let's innovate, build competitive advantage, and improve business and societal outcomes in an ever-changing, ever-challenging world. We are seeking highly skilled and experienced leaders to drive our Business Transformation programs. The ideal candidate will have a strong background in management consulting, with a proven track record of successfully leading and delivering large-scale transformation projects. This role requires excellent leadership, strategic thinking, client stakeholder management, and communication skills to drive change and achieve business objectives.
Roles & Responsibilities:
- Lead and manage business transformation programs, ensuring alignment with organizational goals and objectives.
- Work closely with clients to understand their key priorities, shape the transformation roadmap, and lead the innovation agenda for clients.
- Design and deliver the transformation roadmap and business case, aligning with client goals and objectives.
- Deploy Accenture's standard methods, tools, and assets to drive standardization during the implementation.
- Develop and implement strategies to drive business process improvements, enhance operational efficiency, and achieve cost savings.
- Conceptualize and implement GenAI, AI, analytics, and automation products and solutions to drive innovation in client processes.
- Collaborate with cross-functional teams, including Consulting, Operations, and client account teams, to ensure successful project execution.
- Identify and mitigate risks associated with transformation initiatives.
- Provide leadership and guidance to project teams, fostering a culture of continuous improvement and innovation.
- Monitor and report on project progress, ensuring timely delivery and achievement of key milestones.
- Engage with senior stakeholders to communicate project status, challenges, and successes.
Professional & Technical Skills:
- MBA or equivalent advanced degree preferred, from a Tier 1 or Tier 2 business school.
- Minimum of 15 years of experience in management consulting, an internal consulting team, or a similar role with a focus on business transformation.
- Proven experience in leading large-scale transformation projects, preferably in a consulting environment.
- Strong analytical and problem-solving skills, with the ability to think strategically and drive change.
- Prior experience in enabling GenAI, AI, and RPA technologies in client processes is highly preferred.
- Experience in authoring business cases for large business transformation programs.
- Excellent communication and interpersonal skills, with the ability to influence and engage stakeholders at all levels.
- Demonstrated ability to manage multiple projects simultaneously and deliver results in a fast-paced environment.
- Proficiency in project management tools and methodologies.
- Ability to adopt and deploy new methods and approaches with a focus on value.
Posted 4 days ago
3.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Tesco India • Bengaluru, Karnataka, India • Hybrid • Full-Time • Permanent • Apply by 01-Aug-2025 About the role Refer to responsibilities What is in it for you At Tesco, we are committed to providing the best for you. As a result, our colleagues enjoy a unique, differentiated, market-competitive reward package, based on current industry practices, for all the work they put into serving our customers, communities, and planet a little better every day. Our Tesco Rewards framework consists of three pillars: Fixed Pay, Incentives, and Benefits. Total Rewards offered at Tesco is determined by four principles: simple, fair, competitive, and sustainable.
- Salary - Your fixed pay is the guaranteed pay as per your contract of employment.
- Performance Bonus - Opportunity to earn an additional compensation bonus based on performance, paid annually.
- Leave & Time-off - Colleagues are entitled to 30 days of leave (18 days of Earned Leave, 12 days of Casual/Sick Leave) and 10 national and festival holidays, as per the company's policy.
- Making Retirement Tension-Free - In addition to statutory retirement benefits, Tesco enables colleagues to participate in voluntary programmes like NPS and VPF.
- Health is Wealth - Tesco promotes programmes that support a culture of health and wellness, including insurance for colleagues and their families. Our medical insurance provides coverage for dependents, including parents or in-laws.
- Mental Wellbeing - We offer mental health support through self-help tools, community groups, ally networks, face-to-face counselling, and more for both colleagues and dependents.
- Financial Wellbeing - Through our financial literacy partner, we offer one-to-one financial coaching at discounted rates, as well as salary advances on earned wages upon request.
- Save As You Earn (SAYE) - Our SAYE programme allows colleagues to transition from being employees to Tesco shareholders through a structured 3-year savings plan.
- Physical Wellbeing - Our green campus promotes physical wellbeing with facilities that include a cricket pitch, football field, and badminton and volleyball courts, along with indoor games, encouraging a healthier lifestyle.
You will be responsible for Job Summary: Build solutions for real-world problems in workforce management for retail. You will work with a team of highly skilled developers and product managers throughout the entire software development life cycle of the products we own. In this role you will be responsible for designing, building, and maintaining our big data pipelines. Your primary focus will be on developing data pipelines using available technologies. In this job, I'm accountable for: following our Business Code of Conduct, always acting with integrity and due diligence, and these specific risk responsibilities:
- Represent Talent Acquisition in all forums/seminars pertaining to process, compliance, and audit.
- Perform other miscellaneous duties as required by management.
- Drive CI culture, implementing CI projects and innovation within the team.
- Design and implement scalable and reliable data processing pipelines using Spark/Scala/Python and the Hadoop ecosystem.
- Develop and maintain ETL processes to load data into our big data platform.
- Optimize Spark jobs and queries to improve performance and reduce processing time.
- Work with product teams to communicate and translate needs into technical requirements.
- Design and develop monitoring tools and processes to ensure data quality and availability.
- Collaborate with other teams to integrate data processing pipelines into larger systems.
- Deliver high-quality code and solutions, bringing solutions into production.
- Perform code reviews to optimise the technical performance of data pipelines.
- Continually look for how we can evolve and improve our technology, processes, and practices.
- Lead group discussions on system design and architecture.
- Manage and coach individuals, providing regular feedback and career development support aligned with business goals.
- Allocate and oversee team workload effectively, ensuring timely and high-quality outputs.
- Define and streamline team workflows, ensuring consistent adherence to SLAs and data governance practices.
- Monitor and report key performance indicators (KPIs) to drive continuous improvement in delivery efficiency and system uptime.
- Oversee resource allocation and prioritization, aligning team capacity with project and business demands.
Key people and teams I work with in and outside of Tesco: TBS & Tesco Senior Management; TBS Reporting Team; Tesco UK / ROI / Central Europe; business stakeholders. People, budgets and other resources I am accountable for in my job: any other accountabilities assigned by the business.
Skills relevant for this job: ETL, YARN, Spark, Hive, Hadoop, PySpark/Python, Linux/Unix/Shell environments, query optimisation. Good to have: Kafka, REST API/reporting tools.
Experience relevant for this job: 7+ years of experience in building and maintaining big data platforms using Spark/Scala. Strong knowledge of distributed computing principles and big data technologies such as Hadoop, Spark, and streaming. Experience with ETL processes and data modelling. Problem-solving and troubleshooting skills. Working knowledge of Oozie/Airflow. Experience in writing unit test cases and shell scripting. Ability to work independently and as part of a team in a fast-paced environment.
You will need Refer to responsibilities About us Tesco in Bengaluru is a multi-disciplinary team serving our customers, communities, and planet a little better every day across markets. Our goal is to create a sustainable competitive advantage for Tesco by standardising processes, delivering cost savings, enabling agility through technological solutions, and empowering our colleagues to do even more for our customers.
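The ETL and data-quality responsibilities above follow a common pattern: transform raw records, reject rows that fail validation, and feed the rejection count into monitoring. A toy pure-Python sketch of that pattern (in production this would be a Spark job; the field names and data are hypothetical):

```python
# Illustrative ETL step with a data-quality gate. Rows that fail
# validation are dropped and counted, so the rejection rate can be
# tracked by a monitoring tool downstream.

RAW = [
    {"store": "BLR-01", "sales": "1200", "date": "2025-07-01"},
    {"store": "BLR-02", "sales": None,   "date": "2025-07-01"},   # bad row
    {"store": "BLR-03", "sales": "950",  "date": "2025-07-01"},
]

def transform(row):
    """Cast types; return None for rows failing validation."""
    if row["sales"] is None:
        return None
    return {"store": row["store"], "sales": int(row["sales"]), "date": row["date"]}

clean = [t for t in (transform(r) for r in RAW) if t is not None]
rejected = len(RAW) - len(clean)          # data-quality signal for monitoring
total_sales = sum(r["sales"] for r in clean)
```

The same shape appears in PySpark as a `map` followed by a `filter`, with the rejected count tracked via an accumulator or a separate quarantine table.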
With cross-functional expertise, a wide network of teams, and strong governance, we reduce complexity, thereby offering high-quality services for our customers. Tesco in Bengaluru, established in 2004 to enable standardisation and build centralised capabilities and competencies, makes the experience better for our millions of customers worldwide and simpler for over 3,30,000 colleagues. Tesco Business Solutions: Established in 2017, Tesco Business Solutions (TBS) has evolved from a single entity traditional shared services in Bengaluru, India (from 2004) to a global, purpose-driven solutions-focused organisation. TBS is committed to driving scale at speed and delivering value to the Tesco Group through the power of decision science. With over 4,400 highly skilled colleagues globally, TBS supports markets and business units across four locations in the UK, India, Hungary, and the Republic of Ireland. The organisation underpins everything that the Tesco Group does, bringing innovation, a solutions mindset, and agility to its operations and support functions, building winning partnerships across the business. TBS's focus is on adding value and creating impactful outcomes that shape the future of the business. TBS creates a sustainable competitive advantage for the Tesco Group by becoming the partner of choice for talent, transformation, and value creation.
Posted 4 days ago
0.0 years
0 Lacs
Gachibowli, Hyderabad, Telangana
On-site
Location: IN - Hyderabad Telangana Goodyear Talent Acquisition Representative: Katrena Calimag-Rupera Sponsorship Available: No Relocation Assistance Available: No STAFF DIGITAL SOFTWARE ENGINEER – Data Engineer Are you interested in an exciting opportunity to help shape the user experience and design front-end applications for data-driven digital products that drive better process performance across a global company? The Data Driven Engineering and Global Information Technology groups at the Goodyear Technology India Center, Hyderabad, India are looking for a dynamic individual with a strong background in data engineering and infrastructure to partner with data scientists and information technology specialists, as well as our global technology and operations teams, to derive valuable insights from our expansive data sources and help develop data-driven solutions for important business applications across the company. Since its inception, the Data Science portfolio of projects continues to grow and includes areas of tire manufacturing, operations, business, and technology. The people in our Data Science group come from a broad range of backgrounds: Mathematics, Statistics, Cognitive Linguistics, Astrophysics, Biology, Computer Science, Mechanical, Electrical, Chemical, and Industrial Engineering, and of course - Data Science. This diverse group works together to develop innovative tools and methods for simulating, modeling, and analyzing complex processes throughout our company. We’d like you to help us build the next generation of data-driven applications for the company and be a part of the Information Technology and Data Driven Engineering teams. What You Will Do We think you’ll be excited about having opportunities to: Design and build robust, scalable, and efficient data pipelines and ETL processes to support analytics, data science, and digital products.
Collaborate with cross-functional teams to understand data requirements and implement solutions that integrate data from diverse sources. Lead the development, management, and optimization of cloud-based data infrastructure using platforms such as AWS, Azure, or GCP. Architect and maintain highly available and performant relational database systems (e.g., PostgreSQL, MySQL) and NoSQL systems (e.g., MongoDB, DynamoDB). Partner with data scientists to ensure efficient and secure data access for modeling, experimentation, and production deployment. Build and maintain data services and APIs to facilitate access to curated datasets across internal applications and teams. Implement DevOps and DataOps practices including CI/CD for data workflows, infrastructure as code, containerization (Docker), and orchestration (Kubernetes). Learn about the tire industry and tire manufacturing processes from subject matter experts. Be a part of cross-functional teams working together to deliver impactful results. What We Expect Bachelor’s degree in computer science or a similar technical field; preferred: Master’s degree in computer science or a similar field 5 or more years of experience designing and maintaining data pipelines, cloud-based data systems, and production-grade data workflows. Experience with the following technology groups: Strong experience in Python, Java, or other languages for data engineering and scripting. Deep knowledge of relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB, DynamoDB), including query optimization and schema design. Experience designing and deploying solutions on cloud platforms like AWS (e.g., S3, Redshift, RDS), Azure, or GCP. Familiarity with data modeling, data warehousing, and distributed data processing frameworks (e.g., Apache Spark, Airflow, dbt). Understanding of RESTful APIs and integration of data services with applications. 
Hands-on experience with CI/CD tools (e.g., GitHub Actions, Jenkins), Docker, Kubernetes, and infrastructure-as-code frameworks. Solid grasp of software engineering best practices, including code versioning, testing, and performance optimization. Good teamwork skills - ability to work in a team environment and deliver results on time. Strong communication skills - capable of conveying information concisely to diverse audiences. Goodyear is an Equal Employment Opportunity and Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to that individual's race, color, religion or creed, national origin or ancestry, sex (including pregnancy), sexual orientation, gender identity, age, physical or mental disability, ethnicity, citizenship, or any other characteristic protected by law. Goodyear is one of the world’s largest tire companies. It employs about 68,000 people and manufactures its products in 53 facilities in 20 countries around the world. Its two Innovation Centers in Akron, Ohio and Colmar-Berg, Luxembourg strive to develop state-of-the-art products and services that set the technology and performance standard for the industry. For more information about Goodyear and its products, go to www.goodyear.com/corporate
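The relational-database skills the posting asks for (schema design, query optimization, parameterized access) can be illustrated with Python's standard-library sqlite3 standing in for PostgreSQL or MySQL; the table, index, and data below are invented for the example:

```python
# Minimal, self-contained illustration of schema design plus a
# parameterized aggregate query. sqlite3 is used here only so the
# example runs anywhere; the same SQL patterns apply to PostgreSQL/MySQL.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE sensor_readings (
        plant  TEXT NOT NULL,
        metric TEXT NOT NULL,
        value  REAL NOT NULL
    )""")
# Composite index supports the WHERE clause below -- the
# query-optimization angle of the qualification.
conn.execute("CREATE INDEX idx_plant_metric ON sensor_readings (plant, metric)")
conn.executemany(
    "INSERT INTO sensor_readings VALUES (?, ?, ?)",
    [("HYD", "temp", 71.5), ("HYD", "temp", 72.5), ("AKR", "temp", 68.0)],
)
# Parameterized query: placeholders prevent SQL injection and let the
# database reuse the prepared plan.
avg_temp, = conn.execute(
    "SELECT AVG(value) FROM sensor_readings WHERE plant = ? AND metric = ?",
    ("HYD", "temp"),
).fetchone()
```

On a production PostgreSQL instance the equivalent check would be `EXPLAIN ANALYZE` confirming the index is actually used for the filtered aggregate.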
Posted 4 days ago
15.0 years
0 Lacs
Hyderabad, Telangana
On-site
Principal Software Engineering Manager Hyderabad, Telangana, India Date posted Jul 25, 2025 Job number 1824112 Work site Microsoft on-site only Travel 0-25 % Role type People Manager Profession Software Engineering Discipline Software Engineering Employment type Full-Time Overview The E+D Growth Team's role is to help grow our user and customer base so we can fulfill Microsoft's mission of empowering every person and organization on the planet to achieve more. We do this through Product-Led Growth motions that we develop, facilitate, and partner with teams throughout Microsoft to deliver so we can bring more of Microsoft's software - across Microsoft 365, Windows, and elsewhere - to more users and convert those users into customers. We work with every segment of the market, including consumers and businesses of all sizes, helping to facilitate improved engagement, retention, and acquisition for the wide array of products inside of the Experiences and Devices organization. Lead the next wave of growth for Microsoft's most transformative products. As part of the E+D Growth team, you will help define and deliver our Product-Led Growth (PLG) strategy across Windows, Office, and beyond, crafting magical, AI-powered experiences that hundreds of millions of people rely on every day. As a Principal Software Engineering Manager, you will play a critical role in driving the adoption and monetization of Microsoft 365 Copilot through Product-Led Growth methodologies. This role requires a strategic thinker with a deep understanding of product development, user experience, experimentation, and data-driven decision-making. We are builders, explorers, and connectors, and we are looking for a like-minded Software Engineering Manager who thrives on driving big ideas from spark to scale. Microsoft’s mission is to empower every person and every organization on the planet to achieve more.
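The experimentation and data-driven decision-making this role centers on typically reduces to comparing conversion between a control and a treatment group. A hedged sketch using a two-proportion z-test, with invented numbers and the conventional 1.96 threshold for roughly 95% confidence (not Microsoft's actual experimentation stack):

```python
# Illustrative A/B experiment evaluation: two-proportion z-test on
# conversion rates. All counts are hypothetical.
import math

def z_score(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-statistic for conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Control: 400/10,000 converted; treatment (a hypothetical upsell flow):
# 480/10,000 converted.
z = z_score(400, 10_000, 480, 10_000)
ship_it = abs(z) > 1.96   # significant at ~95% -> data-driven ship decision
```

Real PLG experimentation adds guardrail metrics, sequential testing corrections, and segment analysis on top of this basic significance check.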
As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond. Qualifications Required Qualifications: Bachelor's Degree in Computer Science, or related technical discipline AND 15+ years technical engineering experience with coding in languages including, but not limited to, C, C++, C#, Java, JavaScript, or Python OR equivalent experience. 5+ years of experience operating online services. 5+ years of people management experience. Preferred Qualifications: Bachelor's Degree in Computer Science OR related technical field AND 15+ years technical engineering experience with coding in languages including, but not limited to, C, C++, C#, Java, JavaScript, OR Python OR Master's Degree in Computer Science or related technical field AND 10+ years technical engineering experience with coding in languages including, but not limited to, C, C++, C#, Java, JavaScript, or Python OR equivalent experience. Exceptional skills in influencing and aligning diverse stakeholders across product, design, marketing, research, and business disciplines. Ability to think strategically while diving deep into details: you can balance big-picture vision with day-to-day execution. Experience working with AI/ML-powered experiences, platform services, or large-scale subscription businesses is a plus. Passion for customer-centric innovation, operational excellence, and building inclusive, high-performance team cultures. People management experience at a big tech company. #DPG #ExDGrowth #IDCMicrosoft Responsibilities Guides partnership with appropriate stakeholders (e.g., project manager, technical lead) to determine user requirements within and across teams.
Guides the team and leads identification of dependencies and the development of design documents for a product, application, service, or platform. Optimizes, debugs, refactors, and reuses code to improve performance, maintainability, effectiveness, and return on investment (ROI). Guides the team to drive multiple groups' project plans, release plans, and work items in coordination with appropriate stakeholders (e.g., project managers). Guides the team and acts as an expert for Designated Responsible Individual (DRI) duties, and monitors other engineers across product lines, working on call to monitor the system/product/service for degradation, downtime, or interruptions. Leads product development and scaling to customer requirements, applies best practices for meeting scaling needs and performance expectations, and holds accountability for products that do not meet expectations. Benefits/perks listed below may vary depending on the nature of your employment with Microsoft and the country where you work. Industry leading healthcare Educational resources Discounts on products and services Savings and investments Maternity and paternity leave Generous time away Giving programs Opportunities to network and connect Microsoft is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
Posted 4 days ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Apache Spark Good to have skills : NA Minimum 3 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the applications function smoothly and efficiently. You will also engage in problem-solving discussions and contribute innovative ideas to enhance application performance and user experience. Roles & Responsibilities: - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute to providing solutions to work-related problems. - Assist in the documentation of application processes and workflows. - Engage in code reviews to ensure quality and adherence to best practices. Professional & Technical Skills: - Must Have Skills: Proficiency in Apache Spark. - Strong understanding of data processing and transformation techniques. - Experience with distributed computing frameworks. - Familiarity with cloud platforms and services. - Ability to troubleshoot and optimize application performance. Additional Information: - The candidate should have a minimum of 3 years of experience in Apache Spark. - This position is based at our Chennai office. - 15 years of full-time education is required.
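The data processing and transformation techniques the role requires follow Spark's map/reduce model. Below is a pure-Python analogue of the classic reduceByKey word count, so it runs without a cluster; in PySpark the same pipeline would be roughly `rdd.flatMap(str.split).map(lambda w: (w, 1)).reduceByKey(add)`:

```python
# Pure-Python analogue of a Spark word count, kept in two explicit
# phases to mirror how Spark distributes the work across executors.
from collections import defaultdict

lines = ["spark processes data", "spark scales data pipelines"]

# "map" phase: emit (word, 1) pairs, as flatMap + map would in Spark.
pairs = [(word, 1) for line in lines for word in line.split()]

# "reduceByKey" phase: sum the counts per word. Spark performs this
# per-partition first, then shuffles and merges across the cluster.
counts = defaultdict(int)
for word, n in pairs:
    counts[word] += n
```

The point of Spark is that both phases parallelize: the map runs independently per partition, and reduceByKey only shuffles partial sums rather than raw pairs.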
Posted 4 days ago
6.0 years
0 Lacs
India
On-site
What You'll Do Avalara, Inc. (www.Avalara.com) is the leading provider of cloud-based software that delivers a broad array of compliance solutions related to sales tax and other transactional taxes. We are building cloud-based tax compliance solutions to handle every transaction in the world. Every transaction you make, physical or digital, has a unique and nuanced tax calculation that accompanies it. We do those, and we want to do all of them. Avalara is building the global cloud compliance platform, and the Build and Deployment Tooling Team helps enable the development of this platform. Our engineering teams are diverse in their engineering practices, culture, and background. We create the systems that allow them to produce quality products at an increasing pace. As a member of the team, you will take part in architecting the tooling that lowers the barriers for development. You will report to the Manager, Site Reliability Engineer. This might be a good fit for you if helping people do their best work resonates with you; you love platform engineering; you want to build cool things with cool people; you love building high-impact tools and software that everyone depends on; and you love automating everything. What Your Responsibilities Will Be Some areas of work are: creating tools that smooth the journey from idea to running in production, and learning and promoting best practices related to the build, test, and deployment of software. What You’ll Need To Be Successful Qualifications Software Engineering: Understand software engineering fundamentals and have experience developing software among a team of engineers. Experience with testing practices. Build Automation: Experience getting artifacts in many languages packaged and tested so that they can be trusted to go into production. Automatically. Release Automation: Experience getting artifacts running in production. Automatically.
Observability: Experience developing service level indicators and objectives, instrumenting software, and building meaningful alerts. Troubleshooting: Experience tracking down the technical causes of failures in distributed software. Containers/Container Orchestration Systems: An understanding of how to manage container-based systems, especially on Kubernetes. Artificial Intelligence: A grounding in infrastructure for, and the use of, agentic systems. Infrastructure-as-Code: Experience deploying and maintaining infrastructure-as-code tools such as Terraform and Pulumi. Technical Writing: We will need to build documentation and diagrams for other engineering teams. Customer Satisfaction: Experience ensuring that code meets all functionality and acceptance criteria for customer satisfaction (our customers are other engineering teams and Avalara customers). GO: Our tooling is developed in Go. Distributed Computing: Experience architecting distributed services across regions and clouds. GitLab: Experience working with, managing, and deploying. Artifactory: Experience working with, managing, and deploying. Technical Writing: Write technical documents that people love and adore. Open Source: Build side projects or contribute to other open-source projects. Experience Minimum 6 years of experience in a SaaS environment Bachelor's degree in computer science or equivalent Participate in an on-call rotation Experience with a data warehouse like Snowflake, Redshift, or Spark How We’ll Take Care Of You Total Rewards In addition to a great compensation package, paid time off, and paid parental leave, many Avalara employees are eligible for bonuses. Health & Wellness Benefits vary by location but generally include private medical, life, and disability insurance. Inclusive culture and diversity Avalara strongly supports diversity, equity, and inclusion, and is committed to integrating them into our business practices and our organizational culture.
We also have a total of 8 employee-run resource groups, each with senior leadership and exec sponsorship. What You Need To Know About Avalara We’re defining the relationship between tax and tech. We’ve already built an industry-leading cloud compliance platform, processing over 54 billion customer API calls and over 6.6 million tax returns a year. Our growth is real - we're a billion-dollar business - and we’re not slowing down until we’ve achieved our mission - to be part of every transaction in the world. We’re bright, innovative, and disruptive, like the orange we love to wear. It captures our quirky spirit and optimistic mindset. It shows off the culture we’ve designed, that empowers our people to win. We’ve been different from day one. Join us, and your career will be too. We’re An Equal Opportunity Employer Supporting diversity and inclusion is a cornerstone of our company — we don’t want people to fit into our culture, but to enrich it. All qualified candidates will receive consideration for employment without regard to race, color, creed, religion, age, gender, national origin, disability, sexual orientation, US Veteran status, or any other factor protected by law. If you require any reasonable adjustments during the recruitment process, please let us know.
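The observability qualification above ("service level indicators and goals ... meaningful alerts") usually reduces to error-budget arithmetic. A minimal sketch in Python (the function names and the 25%-remaining alert threshold are illustrative assumptions, not Avalara's policy):

```python
def error_budget_remaining(slo: float, total_requests: int, failed_requests: int) -> float:
    """Fraction of the window's error budget still unspent.

    slo: availability target, e.g. 0.999 means 0.1% of requests may fail.
    """
    budget = (1.0 - slo) * total_requests  # failures the SLO permits
    if budget == 0:
        return 0.0
    return max(0.0, 1.0 - failed_requests / budget)

def should_alert(slo: float, total: int, failed: int, threshold: float = 0.25) -> bool:
    """Page when less than `threshold` of the budget remains (a deliberately
    simple stand-in for a real burn-rate alerting policy)."""
    return error_budget_remaining(slo, total, failed) < threshold

# A 99.9% SLO over 1,000,000 requests allows 1,000 failures.
print(round(error_budget_remaining(0.999, 1_000_000, 400), 6))  # 0.6
print(should_alert(0.999, 1_000_000, 900))                      # True
```

Alerting on budget burn rather than raw error counts is what makes an alert "meaningful": it fires on trajectory toward the objective, not on noise.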
Posted 4 days ago
0 years
0 Lacs
Chandigarh, India
On-site
About Us: Planet Spark is reshaping the EdTech landscape by equipping kids and young adults with future-ready skills like public speaking, and more. We're on a mission to spark curiosity, creativity, and confidence in learners worldwide. If you're passionate about meaningful impact, growth, and innovation—you're in the right place. Location: Gurgaon (On-site) Experience Level: Entry to Early Career (Freshers welcome!) Shift Options: Domestic | Middle East | International Working Days: 5 days/week (Wednesday & Thursday off) | Weekend availability required Target Joiners: Any (Bachelor’s or Master’s) 🔥 What You'll Be Owning (Your Impact): Lead Activation: Engage daily with high-intent leads through dynamic channels—calls, video consults, and more. Sales Funnel Pro: Own the full sales journey—from first hello to successful enrollment. Consultative Selling: Host personalized video consultations with parents/adult learners, pitch trial sessions, and resolve concerns with clarity and empathy. Target Slayer: Consistently crush weekly revenue goals and contribute directly to Planet Spark’s growth engine. Client Success: Ensure a smooth onboarding experience and transition for every new learner. Upskill Mindset: Participate in hands-on training, mentorship, and feedback loops to constantly refine your game. 💡 Why Join Sales at Planet Spark? Only Warm Leads: Skip the cold calls—our leads already know us and have completed a demo session. High-Performance Culture: Be part of a fast-paced, energetic team that celebrates success and rewards hustle. Career Fast-Track: Unlock rapid promotions, performance bonuses, and leadership paths. Top-Notch Training: Experience immersive onboarding, live role-plays, and access to ongoing L&D programs. Rewards & Recognition: Weekly shoutouts, cash bonuses, and exclusive events to celebrate your wins. Make Real Impact: Help shape the minds of tomorrow while building a powerhouse career today. 
🎯 What You Bring to the Table: Communication Powerhouse: You can build trust and articulate ideas clearly in both spoken and written formats. Sales-Driven: You know how to influence decisions, navigate objections, and close deals with confidence. Empathy First: You genuinely care about clients’ goals and tailor your approach to meet them. Goal-Oriented: You’re self-driven, proactive, and hungry for results. Tech Fluent: Comfortable using CRMs, video platforms, and productivity tools. ✨ What’s in It for You? 💼 High-growth sales career with serious earning potential 🌱 Continuous upskilling in EdTech, sales, and communication 🧘 Supportive culture that values growth and well-being 🎯 Opportunity to work at the cutting edge of education innovation
Posted 4 days ago
1.0 - 5.0 years
0 Lacs
chennai, tamil nadu
On-site
You should have at least 3 years of hands-on experience in data modeling, ETL processes, developing reporting systems, and data engineering using tools such as ETL, Big Query, SQL, Python, or Alteryx. Additionally, you should possess advanced knowledge in SQL programming and database management. Moreover, you must have a minimum of 3 years of solid experience working with Business Intelligence reporting tools like Power BI, Qlik Sense, Looker, or Tableau, along with a good understanding of data warehousing concepts and best practices. Excellent problem-solving and analytical skills are essential for this role, as well as being detail-oriented with strong communication and collaboration skills. The ability to work both independently and as part of a team is crucial for success in this position. Preferred skills include experience with GCP cloud services such as BigQuery, Cloud Composer, Dataflow, CloudSQL, Looker, Looker ML, Data Studio, and GCP QlikSense. Strong SQL skills and proficiency in various BI/Reporting tools to build self-serve reports, analytic dashboards, and ad-hoc packages leveraging enterprise data warehouses are also desired. Moreover, having at least 1 year of experience in Python and Hive/Spark/Scala/JavaScript is preferred. Additionally, you should have a solid understanding of consuming data models, developing SQL, addressing data quality issues, proposing BI solution architecture, articulating best practices in end-user visualizations, and development delivery experience. Furthermore, it is important to have a good grasp of BI tools, architectures, and visualization solutions, coupled with an inquisitive and proactive approach to learning new tools and techniques. Strong oral, written, and interpersonal communication skills are necessary, and you should be comfortable working in a dynamic environment where problems are not always well-defined.
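The SQL-plus-BI-reporting work described above is, at its core, writing aggregates that feed dashboard tiles. A self-contained sketch using Python's stdlib sqlite3 as a stand-in for a warehouse like BigQuery (table and column names are invented for illustration):

```python
import sqlite3

# In-memory stand-in for a warehouse table (BigQuery, Snowflake, etc.).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, product TEXT, revenue REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("South", "A", 100.0), ("South", "B", 50.0), ("North", "A", 70.0)],
)

# The kind of aggregate that backs a Power BI / Looker / Tableau tile.
rows = conn.execute(
    """
    SELECT region, SUM(revenue) AS total
    FROM sales
    GROUP BY region
    ORDER BY total DESC
    """
).fetchall()

print(rows)  # [('South', 150.0), ('North', 70.0)]
```

The same GROUP BY/ORDER BY shape scales from this toy table to an enterprise warehouse; only the engine and the data volume change.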
Posted 4 days ago
1.0 - 5.0 years
0 Lacs
pune, maharashtra
On-site
We are seeking talented Python Developers who are passionate about creating machine learning solutions using Python and are interested in collaborating with a US Startup. If you are enthusiastic about gaining experience in a startup environment and have the flexibility to work from any location with a stable internet connection, then this opportunity is perfect for you. Whether you prefer working from a vacation destination or a rural setting, remote work is the norm, eliminating the need for long commutes or frequent in-person meetings. If you are ready to work diligently and enjoy life to the fullest, we welcome you to join our team. As part of the application process, candidates must complete a pre-screening behavioral assessment form for consideration. Requirements: - A Bachelor's or Master's degree in Statistics/Math, Computer Science, Finance/Economics, Computer Engineering, or a related quantitative field (Ph.D. candidates are encouraged to apply) - Proficiency in Python Web Development with Flask - Knowledge of SQL, Unix, Docker, Git, and relational databases - Strong analytical, design, problem-solving, and debugging skills - Ability to work independently from a home office without constant supervision - Solid experience in DevOps and deployment pipelines for software deployment to servers (on-premise hosting and Azure) - Experience in Analytics/Machine Learning projects, including familiarity with scikit-learn/Spark libraries and other relevant machine learning packages in web servers - Understanding of software design patterns and best practices in software engineering - Flexible schedule with most work hours concentrated in the evenings post-college (approximately 3-5 hours daily) Responsibilities include: - Writing Python code with a focus on scalability, supportability, and maintainability - Software development, configuration, and customization - Diagnosing and resolving production issues - Enhancing and developing the company's technology suite through
collaboration with development teams to identify application requirements - Assessing and prioritizing client feature requests Qualities we value in candidates: - Reliability, independence, and the ability to multitask - Honesty and transparency - Team player who enjoys collaborating and working in a team environment - Effective communication skills to convey goals to team members - Self-starter capable of taking ownership of projects and tasks - Passion for delivering exceptional products and experiences to customers with a strong sense of ownership - Willingness to experiment with new tools, techniques, and approaches even in the face of failure Desirable qualifications: - Pursuing or completed MS/Ph.D. in Computing, Physics, or other STEM fields - Curiosity and eagerness to learn about Analytics/Machine Learning - Previous experience in Financial Services is a bonus Benefits: - Remote-first company allowing 100% remote work to suit your schedule - Flexible working hours - Competitive stipend/salary Please note: - We maintain a zero-tolerance policy towards plagiarism in the screening test - Submissions must be sent as a zip attachment via email; other forms of submission will be automatically rejected - Candidates from top schools such as Pune University, Mumbai University, NIT, IISER, TIFR, IIT, ISI, or leading institutions in the USA/UK will be given preference.
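The "Python Web Development with Flask" requirement above rests on WSGI, the protocol Flask implements under the hood. A minimal stdlib-only sketch (the route and handler are invented for illustration; this is not Flask itself, just the interface it builds on):

```python
from wsgiref.util import setup_testing_defaults

def app(environ, start_response):
    """A minimal WSGI application: the callable Flask constructs for you
    when you register @app.route handlers."""
    path = environ.get("PATH_INFO", "/")
    body = f"Hello from {path}".encode("utf-8")
    start_response("200 OK", [("Content-Type", "text/plain"),
                              ("Content-Length", str(len(body)))])
    return [body]

# Exercise the app without running a server, the way test clients do.
environ = {}
setup_testing_defaults(environ)
environ["PATH_INFO"] = "/health"

captured = {}
def start_response(status, headers):
    captured["status"] = status

result = b"".join(app(environ, start_response))
print(captured["status"], result)  # 200 OK b'Hello from /health'
```

Being able to drive the app as a plain callable is also what makes Flask apps easy to unit-test without network I/O.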
Posted 4 days ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
Join as a Big Data Engineer at Barclays and lead the evolution of the digital landscape to drive innovation and excellence. Utilize cutting-edge technology to revolutionize digital offerings and ensure unparalleled customer experiences. To succeed in this role, you should possess the following essential skills: - Full Stack Software Development for large-scale, mission-critical applications. - Proficiency in distributed big data systems like Spark, Hive, Kafka streaming, Hadoop, Airflow. - Expertise in Scala, Java, Python, J2EE technologies, Microservices, Spring, Hibernate, REST APIs. - Experience with n-tier web application development and frameworks such as Spring Boot, Spring MVC, JPA, Hibernate. - Familiarity with version control systems, particularly Git; GitHub Copilot experience is a bonus. - Proficient in API Development using SOAP or REST, JSON, and XML. - Hands-on experience in developing back-end applications with multi-process and multi-threaded architectures. - Skilled in building scalable microservices solutions using integration design patterns, Dockers, Containers, and Kubernetes. - Knowledge of DevOps practices including CI/CD, Test Automation, Build Automation using tools like Jenkins, Maven, Chef, Git, Docker. - Experience with data processing in cloud environments like Azure or AWS. - Essential experience in Data Product development and Agile development methodologies like SCRUM. - Result-oriented with strong analytical and problem-solving skills. - Excellent verbal and written communication and presentation skills. Your primary responsibilities will include: - Developing and delivering high-quality software solutions using industry-aligned programming languages, frameworks, and tools, ensuring scalability, maintainability, and performance optimization. - Collaborating cross-functionally with product managers, designers, and engineers to define software requirements, devise solution strategies, and align with business objectives. 
- Promoting a culture of code quality and knowledge sharing through participation in code reviews and industry technology communities. - Ensuring secure coding practices to protect data and mitigate vulnerabilities, along with effective unit testing practices for proper code design and reliability. As a Big Data Engineer at Barclays, you will play a crucial role in designing, developing, and enhancing software to provide business, platform, and technology capabilities for customers and colleagues. You will contribute to technical excellence, continuous improvement, and risk mitigation while adhering to Barclays' values of Respect, Integrity, Service, Excellence, and Stewardship, and embodying the Barclays Mindset of Empower, Challenge, and Drive.
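The "multi-process and multi-threaded architectures" and Kafka-streaming skills listed above share one core pattern: producers and consumers decoupled by a bounded buffer. A stdlib-Python sketch of that pattern (toy data; a real system would use Kafka topics or a thread pool, not this exact code):

```python
import queue
import threading

# A bounded queue decouples producers from consumers: the same
# back-pressure idea a Kafka topic provides between services.
events = queue.Queue(maxsize=100)
results = []

def producer(n):
    for i in range(n):
        events.put(i)
    events.put(None)  # sentinel: no more events

def consumer():
    while True:
        item = events.get()
        if item is None:
            break
        results.append(item * 2)  # stand-in for real processing

t_prod = threading.Thread(target=producer, args=(5,))
t_cons = threading.Thread(target=consumer)
t_prod.start(); t_cons.start()
t_prod.join(); t_cons.join()

print(results)  # [0, 2, 4, 6, 8]
```

The bounded `maxsize` is the important design choice: a fast producer blocks rather than exhausting memory, which is the back-pressure behavior streaming systems are built around.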
Posted 4 days ago
4.0 - 8.0 years
0 Lacs
karnataka
On-site
At Allstate, great things happen when our people work together to protect families and their belongings from life's uncertainties. For over 90 years, our innovative drive has kept us a step ahead of our customers' evolving needs, from advocating for safety measures like seat belts and airbags to leading in pricing sophistication, telematics, and device and identity protection. This role is responsible for leading the use of data to make decisions. You will develop and execute new machine learning predictive modeling algorithms, code tools using machine learning/predictive modeling for business decisions, integrate new data to improve modeling results, and find solutions to business problems through machine learning/predictive modeling. In addition, you will manage projects of small to medium complexity. We are seeking a Data Scientist to apply machine learning and advanced analytics to solve complex business problems. The ideal candidate will possess technical expertise, business acumen, and a passion for solving high-impact problems. Your responsibilities will include developing machine learning models, integrating new data sources, and delivering solutions that enhance decision-making. You will collaborate with cross-functional teams to translate insights into action, from design to deployment. Key Responsibilities: - Design, build, and validate statistical and machine learning models for key business problems. - Perform data exploration and analysis to uncover insights and improve model performance. - Communicate findings to stakeholders, collaborate with teams to ensure solutions are adopted. - Stay updated on modeling techniques, tools, and technologies, integrating innovative approaches. - Lead data science initiatives from planning to delivery, ensuring measurable business impact. - Provide mentorship to junior team members and lead technical teams as required.
Must-Have Skills: - 4 to 8 years of experience in applied data science, delivering business value through machine learning. - Proficiency in Python with experience in libraries like scikit-learn, pandas, NumPy, and TensorFlow or PyTorch. - Strong foundation in statistical analysis, regression modeling, classification techniques, and more. - Hands-on experience with building and deploying machine learning models in cloud environments. - Ability to translate complex business problems into structured data science problems. - Strong communication, stakeholder management, analytical, and problem-solving skills. - Proactive in identifying opportunities for data-driven decision-making. - Experience in Agile or Scrum-based project environments. Preferred Skills: - Experience with Large Language Models (LLMs) and transformer architectures. - Experience with production-grade ML platforms and orchestration tools for scaling models. Primary Skills: Business Case Analyses, Data Analytics, Predictive Analytics, Predictive Modeling, Waterfall Project Management. Shift Time: Shift B (India). Recruiter Info: Annapurna Jha, email: ajhat@allstate.com. About Allstate: The Allstate Corporation is a leading insurance provider in the US, with operations in multiple countries, including India. Allstate India is a strategic business services arm focusing on technology, innovation, and operational excellence. Learn more about Allstate India here.
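The "regression modeling" foundation called for above starts with ordinary least squares, which has a closed form for a single feature. A stdlib-Python sketch (toy noise-free data; production work would use scikit-learn or statsmodels rather than hand-rolled math):

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b in closed form:
    slope = cov(x, y) / var(x), intercept from the means."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

# Noise-free toy data generated from y = 3x + 1, so the fit recovers (3, 1).
a, b = fit_line([0, 1, 2, 3], [1, 4, 7, 10])
print(round(a, 6), round(b, 6))  # 3.0 1.0
```

Classification and the larger model families in the posting generalize this same idea: choose parameters that minimize a loss over observed data.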
Posted 4 days ago
6.0 - 10.0 years
0 Lacs
noida, uttar pradesh
On-site
The ideal candidate for the position will have the responsibility of designing, developing, and maintaining an optimal data pipeline architecture. You will be required to monitor incidents, perform root cause analysis, and implement appropriate actions to solve issues related to abnormal job execution and data corruption conditions. Additionally, you will automate jobs, notifications, and reports to improve efficiency. You should possess the ability to optimize existing queries, reverse engineer for data research and analysis, and calculate the impact of issues on the downstream side for effective communication. Supporting failures, data quality issues, and ensuring environment health will also be part of your role. Furthermore, you will maintain ingestion and pipeline runbooks, portfolio summaries, and DBAR, while enabling infrastructure changes, enhancements, and updates roadmap. Building the infrastructure for optimal extraction, transformation, and loading data from various sources using big data technologies, python, or web-based APIs will be essential. You will participate in code reviews with peers, have excellent communication skills for understanding and conveying requirements effectively. As a candidate, you are expected to have a Bachelor's degree in Engineering/Computer Science or a related quantitative field. Technical skills required include a minimum of 8 years of programming experience with python and SQL, experience with massively parallel processing systems like Spark or Hadoop, and a minimum of 6-7 years of hands-on experience with GCP, BigQuery, Dataflow, Data Warehousing, Data modeling, Apache Beam, and Cloud Storage. Proficiency in source code control systems (GIT) and CI/CD processes, involvement in designing, prototyping, and delivering software solutions within the big data ecosystem, and hands-on experience in generative AI models are also necessary. 
You should be able to perform code reviews to ensure code meets acceptance criteria, have experience with Agile development methodologies and tools, and work towards improving data governance and quality to enhance data reliability. EXL Analytics offers a dynamic and innovative environment where you will collaborate with experienced analytics consultants. You will gain insights into various business aspects, develop effective teamwork and time-management skills, and receive training in analytical tools and techniques. Our mentoring program provides guidance and coaching to every employee, fostering personal and professional growth. The opportunities for growth and development at EXL Analytics are limitless, setting the stage for a successful career within the company and beyond.
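The data-quality and data-corruption monitoring duties described above usually take the form of a validation gate between extract and load. A minimal stdlib-Python sketch (column names and rules are invented for illustration; real pipelines would use a framework and log the bad rows for root-cause analysis):

```python
def validate_batch(rows, required=("id", "amount")):
    """Split a batch into good and bad rows before loading: the kind of
    data-quality gate that keeps a corrupt extract out of the warehouse."""
    good, bad = [], []
    for row in rows:
        missing = [c for c in required if row.get(c) is None]
        if missing:
            bad.append((row, f"missing: {', '.join(missing)}"))
        elif row["amount"] < 0:
            bad.append((row, "negative amount"))
        else:
            good.append(row)
    return good, bad

batch = [
    {"id": 1, "amount": 10.0},
    {"id": 2, "amount": None},
    {"id": 3, "amount": -4.0},
]
good, bad = validate_batch(batch)
print(len(good), len(bad))  # 1 2
```

Keeping the rejected rows with a reason string, rather than silently dropping them, is what makes the downstream-impact analysis the posting mentions possible.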
Posted 4 days ago
3.0 - 7.0 years
0 Lacs
chennai, tamil nadu
On-site
The Senior Data Engineer position at Annalect within the Technology team involves building products on cloud-based data infrastructure while collaborating with a team that shares a passion for technology, design, development, and data integration. Your main responsibilities will include designing, building, testing, and deploying data transfers across various cloud environments such as Azure, GCP, AWS, and Snowflake. You will also be tasked with developing data pipelines, monitoring, maintaining, and optimizing them. Writing at-scale data transformations using SQL and Python will be a crucial part of your role. Additionally, you will be expected to conduct code reviews and provide mentorship to junior developers. To excel in this position, you should possess a keen curiosity for understanding the business requirements driving the engineering needs. An enthusiasm for exploring new technologies and bringing innovative ideas to the team is highly valued. A minimum of 3 years of experience in SQL, Python, and Linux is required, along with familiarity with Snowflake, AWS, GCP, and Azure cloud environments. Intellectual curiosity, self-motivation, and a genuine passion for technology are essential attributes for success in this role. For this role, a degree in Computer Science, Engineering, or equivalent practical experience is preferred. Experience with big data, infrastructure setup, and working with relational databases like Postgres, MySQL, and MSSQL is advantageous. Familiarity with data processing tools such as Hadoop, Hive, Spark, and Redshift is beneficial as a significant amount of time will be dedicated to building and optimizing data transformations. The ability to independently manage projects from concept to implementation and maintenance is a key requirement. 
Working at Annalect comes with various perks, including a vibrant and collaborative work environment with engaging social and learning activities, a generous vacation policy, extended time off during the holiday season, and the advantage of being part of a global company while maintaining a startup-like flexibility and pace. The role offers the opportunity to work with a modern stack and environment, enabling continuous learning and experimentation with cutting-edge technologies to drive innovation.
Posted 4 days ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
Remote
About The Role We are seeking an experienced Data Engineer with deep hands-on expertise in AWS, Azure Databricks, Snowflake, and modern data engineering practices to join our growing Data & AI Engineering team. The ideal candidate is a strategic thinker who can design scalable platforms, drive robust data solutions, and support high-impact AI/GenAI projects from the ground up. Key Responsibilities Working experience of 3 years in Data engineering. Design, build, and optimize scalable data pipelines using modern frameworks and orchestration tools. Develop and maintain ETL/ELT workflows using AWS, Azure Databricks, Airflow, and Azure Data Factory. Manage and model data in Snowflake to support advanced analytics and machine learning use cases. Collaborate with analytics, product, and engineering teams to align data solutions with business goals. Ensure high standards for data quality, governance, and pipeline performance. Mentor junior engineers and help lead a high-performing data and platform engineering team. Lead and support GenAI platform initiatives, including building reusable libraries, integrating vector databases, and developing LLM-based pipelines. Build components of agentic frameworks using Python, Spring AI, and deploy them on AWS EKS. Establish and manage CI/CD pipelines using Jenkins. Drive ML Ops and model deployment workflows to ensure reliable and scalable AI solution delivery. Required Qualifications Proven hands-on experience with Azure Databricks, Snowflake, Airflow, and Python. Strong proficiency in SQL, Spark, Spark Streaming, and modern data orchestration frameworks. Solid understanding of data modeling, ETL best practices, and performance optimization. Experience in cloud-native environments (AWS and/or Azure). Strong hands-on expertise in AWS EKS, CI/CD (Jenkins), and ML Ops/model deployment workflows. Ability to lead, mentor, and collaborate effectively across cross-functional teams. 
Preferred Qualifications Experience with Search Platforms such as Elasticsearch, SOLR, OpenSearch, or Vespa. Familiarity with Spring Boot microservices and EKS-based deployments. Background in Recommender Systems, with leadership roles in AI/ML projects. Expertise in GenAI platform engineering, including LLMs, RAG architecture, Vector Databases, and agentic design. Proficiency in Python, Java, Spring AI, and enterprise-grade software development. Ability to build platform-level solutions with a focus on reusability, runtime libraries, and scalability. What We Offer A unique opportunity to build and scale cutting-edge AI and data platforms that drive meaningful business outcomes. A collaborative, growth-oriented work culture with room for ownership and innovation. Competitive compensation and a comprehensive benefits package. Flexible hybrid/remote work model to support work-life balance. Work Location: Chennai - Hybrid/Remote.
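The RAG and vector-database expertise listed above boils down to ranking stored embeddings by similarity to a query embedding. A toy stdlib-Python sketch (document ids and the three-dimensional vectors are invented; real systems use model-generated embeddings indexed in OpenSearch, Vespa, or a dedicated vector database):

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Toy "vector database": document id -> embedding.
index = {
    "refund-policy": [0.9, 0.1, 0.0],
    "shipping-times": [0.1, 0.9, 0.2],
    "api-auth": [0.0, 0.2, 0.9],
}

def retrieve(query_vec, k=2):
    """Rank documents by cosine similarity: the retrieval half of RAG.
    The top-k documents would then be stuffed into the LLM prompt."""
    ranked = sorted(index, key=lambda d: cosine(index[d], query_vec), reverse=True)
    return ranked[:k]

print(retrieve([1.0, 0.0, 0.1]))  # ['refund-policy', 'shipping-times']
```

Everything else in a RAG pipeline (chunking, prompt assembly, generation) hangs off this retrieval step, which is why the vector index is the architectural centerpiece.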
Posted 4 days ago
0 years
0 Lacs
Greater Kolkata Area
On-site
About Us: Planet Spark is reshaping the EdTech landscape by equipping kids and young adults with future-ready skills like public speaking, and more. We're on a mission to spark curiosity, creativity, and confidence in learners worldwide. If you're passionate about meaningful impact, growth, and innovation—you're in the right place. Location: Gurgaon (On-site) Experience Level: Entry to Early Career (Freshers welcome!) Shift Options: Domestic | Middle East | International Working Days: 5 days/week (Wednesday & Thursday off) | Weekend availability required Target Joiners: Any (Bachelor’s or Master’s) 🔥 What You'll Be Owning (Your Impact): Lead Activation: Engage daily with high-intent leads through dynamic channels—calls, video consults, and more. Sales Funnel Pro: Own the full sales journey—from first hello to successful enrollment. Consultative Selling: Host personalized video consultations with parents/adult learners, pitch trial sessions, and resolve concerns with clarity and empathy. Target Slayer: Consistently crush weekly revenue goals and contribute directly to Planet Spark’s growth engine. Client Success: Ensure a smooth onboarding experience and transition for every new learner. Upskill Mindset: Participate in hands-on training, mentorship, and feedback loops to constantly refine your game. 💡 Why Join Sales at Planet Spark? Only Warm Leads: Skip the cold calls—our leads already know us and have completed a demo session. High-Performance Culture: Be part of a fast-paced, energetic team that celebrates success and rewards hustle. Career Fast-Track: Unlock rapid promotions, performance bonuses, and leadership paths. Top-Notch Training: Experience immersive onboarding, live role-plays, and access to ongoing L&D programs. Rewards & Recognition: Weekly shoutouts, cash bonuses, and exclusive events to celebrate your wins. Make Real Impact: Help shape the minds of tomorrow while building a powerhouse career today. 
🎯 What You Bring to the Table: Communication Powerhouse: You can build trust and articulate ideas clearly in both spoken and written formats. Sales-Driven: You know how to influence decisions, navigate objections, and close deals with confidence. Empathy First: You genuinely care about clients’ goals and tailor your approach to meet them. Goal-Oriented: You’re self-driven, proactive, and hungry for results. Tech Fluent: Comfortable using CRMs, video platforms, and productivity tools. ✨ What’s in It for You? 💼 High-growth sales career with serious earning potential 🌱 Continuous upskilling in EdTech, sales, and communication 🧘 Supportive culture that values growth and well-being 🎯 Opportunity to work at the cutting edge of education innovation
Posted 4 days ago
1.0 - 5.0 years
0 Lacs
noida, uttar pradesh
On-site
Your journey at Crowe starts here with the opportunity to build a meaningful and rewarding career. At Crowe, you are trusted to deliver results and make an impact while having the flexibility to balance work with life moments. Your well-being is cared for, and your career is nurtured in an inclusive environment where everyone has equitable access to opportunities for growth and leadership. With over 80 years of history, Crowe has excelled in delivering excellent service through innovation across audit, tax, and consulting groups. As a Data Engineer at Crowe, you will provide critical integration infrastructure for analytical support and solution development for the broader Enterprise using market-leading tools and methodologies. Your expertise in API integration, pipelines or notebooks, programming languages (Python, Spark, T-SQL), dimensional modeling, and advanced data engineering techniques will be key in creating and delivering robust solutions and data products. You will be responsible for designing, developing, and maintaining the Enterprise Analytics Platform to support data-driven decision-making across the organization. Success in this role requires a strong interest and passion in data analytics, ETL/ELT best practices, critical thinking, problem-solving, as well as excellent interpersonal, communication, listening, and presentation skills. The Data team strives for an unparalleled client experience and will look to you to promote success and enhance the firm's image firmwide. To qualify for this role, you should have a Bachelor's degree in computer science, Data Analytics, Data/Information Science, Information Systems, Mathematics (or related fields), along with specific years of experience in SQL, data warehousing concepts, programming languages, managing projects, and utilizing tools like Microsoft Power BI, Delta Lake, or Apache Spark. It is preferred that you have hands-on experience or certification with Microsoft Fabric. 
Upholding Crowe's values of Care, Trust, Courage, and Stewardship is essential in this position, as we expect all team members to act ethically and with integrity at all times. Crowe offers a comprehensive benefits package to its employees and provides an inclusive culture that values diversity. You will have the opportunity to work with a Career Coach who will guide you in your career goals and aspirations. Crowe, a subsidiary of Crowe LLP (U.S.A.), a public accounting, consulting, and technology firm, is part of Crowe Global, one of the largest global accounting networks in the world.

Crowe does not accept unsolicited candidates, referrals, or resumes from any staffing agency or third-party paid service. Referrals, resumes, or candidates submitted without a pre-existing agreement will be considered the property of Crowe.
Posted 4 days ago
4.0 - 8.0 years
0 Lacs
karnataka
On-site
We are looking for a skilled Data Engineer to join our team, working on end-to-end data engineering and data science use cases. The ideal candidate will have strong expertise in Python or Scala, Spark (Databricks), and SQL, building scalable and efficient data pipelines on Azure.

Responsibilities:
- Design, build, and maintain scalable ETL/ELT data pipelines using Azure Data Factory, Databricks, and Spark.
- Develop and optimize data workflows using SQL and Python or Scala for large-scale data processing and transformation.
- Implement performance tuning and optimization strategies for data pipelines and Spark jobs to ensure efficient data handling.
- Collaborate with data engineers to support feature engineering, model deployment, and end-to-end data engineering workflows.
- Ensure data quality and integrity by implementing validation, error-handling, and monitoring mechanisms.
- Work with structured and unstructured data using technologies such as Delta Lake and Parquet within a Big Data ecosystem.
- Contribute to MLOps practices, including integrating ML pipelines, managing model versioning, and supporting CI/CD processes.

Primary skills required:
- Data engineering and cloud proficiency in the Azure Data Platform (Data Factory, Databricks)
- Strong skills in SQL and either Python or Scala for data manipulation
- Experience with ETL/ELT pipelines and data transformations
- Familiarity with Big Data technologies (Spark, Delta Lake, Parquet)
- Expertise in data pipeline optimization and performance tuning
- Experience in feature engineering and model deployment
- Strong troubleshooting and problem-solving skills
- Experience with data quality checks and validation
Nice-to-have skills include:
- Exposure to NLP, time-series forecasting, and anomaly detection
- Familiarity with data governance frameworks and compliance practices
- AI/ML basics, including ML & MLOps integration
- Experience supporting ML pipelines with efficient data workflows
- Knowledge of MLOps practices (CI/CD, model monitoring, versioning)

At Tesco, we are committed to providing the best for our colleagues. Total Rewards offered at Tesco are determined by four principles: simple, fair, competitive, and sustainable. Colleagues are entitled to 30 days of leave (18 days of Earned Leave, 12 days of Casual/Sick Leave) and 10 national and festival holidays. Tesco promotes programs supporting health and wellness, including insurance for colleagues and their families, mental health support, financial coaching, and physical wellbeing facilities on campus.

Tesco in Bengaluru is a multi-disciplinary team serving customers, communities, and the planet. The goal is to create a sustainable competitive advantage for Tesco by standardizing processes, delivering cost savings, enabling agility through technological solutions, and empowering colleagues. The Tesco Technology team consists of over 5,000 experts spread across the UK, Poland, Hungary, the Czech Republic, and India, dedicated to various roles including Engineering, Product, Programme, Service Desk and Operations, Systems Engineering, Security & Capability, Data Science, and others.
Posted 4 days ago
5.0 - 9.0 years
0 Lacs
chandigarh
On-site
You will be joining the Microsoft Security organization, where security is a top priority due to increasing digital threats, regulatory scrutiny, and complex estate environments. Microsoft Security aims to make the world a safer place by providing end-to-end security solutions to empower users, customers, and developers. As a Senior Data Scientist, you will be instrumental in enhancing our security posture by developing innovative models to detect and predict security threats. This role requires a deep understanding of data science, machine learning, and cybersecurity, along with the ability to analyze large datasets and collaborate with security experts to address emerging threats and vulnerabilities.

Your responsibilities will include understanding complex cybersecurity and business problems, translating them into well-defined data science problems, and building scalable solutions. You will develop and deploy production-grade AI/ML systems for real-time threat detection, analyze large datasets to identify security risks, and collaborate with security experts to incorporate domain knowledge into models. Additionally, you will lead the design and implementation of data-driven security solutions, mentor junior data scientists, and communicate findings to stakeholders.

To qualify for this role, you should have experience in developing and deploying machine learning models for security applications, preferably in a Big Data or cybersecurity environment. You should be familiar with the Azure tech stack, have knowledge of anomaly detection and fraud detection, and possess expertise in programming languages such as Python, R, or Scala. A Doctorate or Master's degree in a related field, along with 5+ years of data science experience, is preferred. Strong analytical, problem-solving, and communication skills are essential, as well as proficiency in machine learning frameworks and cybersecurity principles.
Preferred qualifications include additional experience in developing machine learning models for security applications, familiarity with data science workloads on the Azure tech stack, and contributions to the field of data science or cybersecurity. Your ability to drive large-scale system designs, think creatively, and translate complex data into actionable insights will be crucial in this role.
Posted 4 days ago
12.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Big Data Architect

Experience: 12-18 years
Location:

Position Summary:
We are looking for candidates with hands-on experience in Big Data and cloud technologies.

Must-have technical skills:
- 10+ years of experience
- Expertise in designing and developing applications using Big Data, DWH, cloud technologies, ETL, and SQL
- Hands-on expertise with Spark and Hadoop ecosystem components
- Hands-on experience with any of the major clouds (AWS/Azure/GCP)
- Good knowledge of shell scripting and Java/Python
- Passionate about exploring new technologies

Good-to-have skills:
- Good knowledge of migration projects on Hadoop
- Good knowledge of a workflow engine such as Oozie or Autosys
- Good knowledge of Agile development
- An automation-oriented approach

Responsibilities:
- Define the Data Warehouse modernization approach and strategy for the customer
- Align the customer on the overall approach and solution
- Design systems to meet performance SLAs
- Resolve technical queries and issues for the team
- Work with the team to establish an end-to-end migration approach for one use case so that the team can replicate it for other iterations

This role is work-from-office (WFO) all 5 days.

Please share the details below along with your updated CV if you would like to take your candidature ahead for this role:
Current CTC:
Expected CTC:
Notice Period:
Reason for job change:
(ref:hirist.tech)
Posted 4 days ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Summary:
We are seeking a highly motivated AI/ML Engineer to design, develop, and deploy machine learning solutions that solve real-world problems. The ideal candidate should have strong foundations in machine learning algorithms, Python programming, and experience with model development, data pipelines, and production deployment in cloud or on-prem environments.

Responsibilities:
- Design and implement machine learning models and AI solutions for business use cases.
- Build and optimize data preprocessing pipelines for training and inference.
- Train, evaluate, and fine-tune supervised, unsupervised, and deep learning models.
- Collaborate with data engineers, product teams, and software developers.
- Deploy ML models into production using APIs, Docker, or cloud-native tools.
- Monitor model performance and retrain/update models as needed.
- Document model architectures, experiments, and performance metrics.
- Research and stay updated on new AI/ML trends and tools.

Required skills and experience:
- Strong programming skills in Python (NumPy, Pandas, Scikit-learn, etc.).
- Experience with deep learning frameworks like TensorFlow, Keras, or PyTorch.
- Solid understanding of machine learning algorithms, data structures, and statistics.
- Experience with NLP, computer vision, or time series analysis is a plus.
- Familiarity with tools like Jupyter, MLflow, or Weights & Biases.
- Understanding of Docker, Git, and RESTful APIs.
- Experience with cloud platforms such as AWS, GCP, or Azure.
- Strong problem-solving and communication skills.

Nice to have:
- Experience with MLOps tools and concepts (CI/CD for ML, model monitoring).
- Familiarity with big data tools (Spark, Hadoop).
- Knowledge of FastAPI, Flask, or Streamlit for ML API development.
- Understanding of transformer models (e.g., BERT, GPT) or LLMs.

Education:
- Bachelor's or Master's degree in Computer Science, Data Science, AI, or a related field.
- Certifications in Machine Learning/AI (e.g., Google ML Engineer, AWS ML Specialty) are a plus.
(ref:hirist.tech)
Posted 4 days ago
0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
About Us: Planet Spark is reshaping the EdTech landscape by equipping kids and young adults with future-ready skills like public speaking, and more. We're on a mission to spark curiosity, creativity, and confidence in learners worldwide. If you're passionate about meaningful impact, growth, and innovation—you're in the right place. Location: Gurgaon (On-site) Experience Level: Entry to Early Career (Freshers welcome!) Shift Options: Domestic | Middle East | International Working Days: 5 days/week (Wednesday & Thursday off) | Weekend availability required Target Joiners: Any (Bachelor’s or Master’s) 🔥 What You'll Be Owning (Your Impact): Lead Activation: Engage daily with high-intent leads through dynamic channels—calls, video consults, and more. Sales Funnel Pro: Own the full sales journey—from first hello to successful enrollment. Consultative Selling: Host personalized video consultations with parents/adult learners, pitch trial sessions, and resolve concerns with clarity and empathy. Target Slayer: Consistently crush weekly revenue goals and contribute directly to Planet Spark’s growth engine. Client Success: Ensure a smooth onboarding experience and transition for every new learner. Upskill Mindset: Participate in hands-on training, mentorship, and feedback loops to constantly refine your game. 💡 Why Join Sales at Planet Spark? Only Warm Leads: Skip the cold calls—our leads already know us and have completed a demo session. High-Performance Culture: Be part of a fast-paced, energetic team that celebrates success and rewards hustle. Career Fast-Track: Unlock rapid promotions, performance bonuses, and leadership paths. Top-Notch Training: Experience immersive onboarding, live role-plays, and access to ongoing L&D programs. Rewards & Recognition: Weekly shoutouts, cash bonuses, and exclusive events to celebrate your wins. Make Real Impact: Help shape the minds of tomorrow while building a powerhouse career today. 
🎯 What You Bring to the Table: Communication Powerhouse: You can build trust and articulate ideas clearly in both spoken and written formats. Sales-Driven: You know how to influence decisions, navigate objections, and close deals with confidence. Empathy First: You genuinely care about clients’ goals and tailor your approach to meet them. Goal-Oriented: You’re self-driven, proactive, and hungry for results. Tech Fluent: Comfortable using CRMs, video platforms, and productivity tools. ✨ What’s in It for You? 💼 High-growth sales career with serious earning potential 🌱 Continuous upskilling in EdTech, sales, and communication 🧘 Supportive culture that values growth and well-being 🎯 Opportunity to work at the cutting edge of education innovation
Posted 4 days ago
6.0 - 10.0 years
0 Lacs
kolkata, west bengal
On-site
We are looking for an experienced professional with strong mathematical and statistical expertise, as well as a natural curiosity and creative mindset to uncover hidden opportunities within data. Your primary goal will be to realize the full potential of the data by asking questions, connecting dots, and thinking innovatively.

Responsibilities:
- Design and implement scalable and efficient data storage solutions using Snowflake.
- Write, optimize, and troubleshoot SQL queries within the Snowflake environment.
- Provide forward-thinking solutions in the data engineering and analytics space.
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements.
- Identify gaps in existing pipelines and resolve issues.
- Develop data models to meet reporting needs by working closely with the business.
- Assist team members in resolving technical challenges.
- Engage in technical discussions with client architects and team members.
- Orchestrate data pipelines in a scheduler via Airflow.
- Integrate Snowflake with various data sources and third-party tools.

Skills and qualifications:
- Bachelor's and/or Master's degree in Computer Science or equivalent experience.
- Minimum 7 years of experience in Data & Analytics with strong communication and presentation skills.
- At least 6 years of experience in Snowflake implementations and large-scale, end-to-end data warehouse implementations.
- Databricks certified architect.
- Proficiency in SQL and scripting languages (e.g., Python, Spark, PySpark) for data manipulation and automation.
- Solid understanding of cloud platforms (AWS, Azure, GCP) and their integration with data tools.
- Familiarity with data governance and data management practices.
- Exposure to Data Sharing, Unity Catalog, dbt, replication tools, and performance tuning will be advantageous.
About Tredence: Tredence focuses on delivering powerful insights into profitable actions by combining business analytics, data science, and software engineering. We work with leading companies worldwide, providing prediction and optimization solutions at scale. Headquartered in the San Francisco Bay Area, we serve clients in the US, Canada, Europe, and Southeast Asia. Tredence is an equal opportunity employer that values diversity and is dedicated to fostering an inclusive environment for all employees. To learn more about us, visit our website: [Tredence Website](https://www.tredence.com/)
Posted 4 days ago
2.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Equifax is seeking creative, high-energy, and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.

What you'll do:
- Perform general application development activities, including unit testing, code deployment to the development environment, and technical documentation.
- Work on one or more projects, making contributions to unfamiliar code written by team members.
- Participate in the estimation process, use case specifications, reviews of test plans and test cases, requirements, and project planning.
- Diagnose and resolve performance issues.
- Document code/processes so that any other developer is able to dive in with minimal effort.
- Develop and operate high-scale applications from the backend to the UI layer, focusing on operational excellence, security, and scalability.
- Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.).
- Work across teams to integrate our systems with existing internal systems, Data Fabric, and the CSA Toolset.
- Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality.
- Participate in a tight-knit engineering team employing agile software development practices.
- Triage product or system issues and debug/track/resolve them by analyzing the sources of issues and their impact on network or service operations and quality.
- Write, debug, and troubleshoot code in mainstream open source technologies.
- Lead the effort for Sprint deliverables and solve problems of medium complexity.

What experience you need:
- Bachelor's degree or equivalent experience
- 2+ years of experience working with software design and the Java, Python, and JavaScript programming languages and SQL
- 2+ years of experience with software build management tools like Maven or Gradle
- 2+ years of experience with HTML, CSS, and frontend/web development
- 2+ years of experience with software testing, performance, and quality engineering techniques and strategies
- 2+ years of experience with cloud technology: GCP, AWS, or Azure

What could set you apart:
- Knowledge or experience with Apache Beam for stream and batch data processing.
- Familiarity with big data tools and technologies like Apache Kafka, Hadoop, or Spark.
- Experience with containerization and orchestration tools (e.g., Docker, Kubernetes).
- Exposure to data visualization tools or platforms.
Posted 4 days ago