14,700 ML Jobs - Page 50

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

4.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Job Category: Automation Testing | Job Type: Full Time | Job Location: Bangalore | Experience: 4+ Years | Skills: API Testing, Automation, Cypress, JavaScript, QA

Position Overview
We are looking for an experienced Senior Test Engineer specializing in automation testing to join our team. The ideal candidate will have a strong background in manual and automated testing with tools such as Cypress, Selenium, and Appium, along with expertise in API testing. As a creative engineering company, we value innovation, collaboration, and an automation-first mindset.

Responsibilities
- Deliver high-quality QA outputs across manual and automation testing activities.
- Engage with clients daily through stand-ups, sprint planning, retrospectives, and other collaborative sessions.
- Manage manual and automation testing efforts to ensure optimal coverage and efficiency.
- Address client queries and concerns effectively, ensuring satisfaction.
- Keep stakeholders updated on work status through clear and regular communication.

Desired Profile
Technical Skills:
- Strong proficiency in automation testing using Cypress (mandatory), Selenium, and Appium.
- Proficient in JavaScript (mandatory) and Core Java for developing automation scripts.
- Solid experience in API testing (manual and automated) using Postman and REST Assured.
- Familiarity with frameworks and tools such as TestNG, Jenkins, Git/Bitbucket, and BDD practices.
- Skilled in test management tools such as JIRA, TestRail, or TestLink.
- Extensive knowledge of QA strategies, methodologies, and best practices.

Soft Skills:
- Strong team player with excellent communication and problem-solving skills.
- Self-driven and capable of managing projects independently.
- Proactive in identifying issues and proposing solutions.
- Demonstrates an automation-first approach and a KPI-driven mindset.

Preferred Skills (Good to Have)
- Experience with QMetry automation frameworks.
- Knowledge of Flutter app automation.
- Familiarity with Low Code/No Code automation tools.
- Experience in testing cloud-based applications or IoT devices.
- Exposure to testing AI/ML models.

If you are passionate about automation testing and eager to work in a collaborative and challenging environment, apply now!

Posted 4 days ago

Apply

10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


We are a technology-led healthcare solutions provider, driven by our purpose to enable healthcare organizations to be future-ready. We offer accelerated, global growth opportunities for talent that's bold, industrious, and nimble. With Indegene, you gain a unique career experience that celebrates entrepreneurship and is guided by passion, innovation, collaboration, and empathy. To explore exciting opportunities at the convergence of healthcare and technology, check out www.careers.indegene.com

Looking to jump-start your career? We understand how important the first few years of your career are; they create the foundation of your entire professional journey. At Indegene, we promise you a differentiated career experience. You will not only work at the exciting intersection of healthcare and technology but also be mentored by some of the most brilliant minds in the industry. We offer a global fast-track career where you can grow along with Indegene's high-speed growth.

We are purpose-driven. We enable healthcare organizations to be future-ready, and our customer obsession is our driving force. We ensure that our customers achieve what they truly want. We are bold in our actions, nimble in our decision-making, and industrious in the way we work. If this excites you, then apply below.

We are seeking a Competency Leader to lead the design, development, and deployment of advanced AI and Generative AI (GenAI) solutions that deliver significant business value to our pharma clients. The ideal candidate will have extensive hands-on experience with AI/ML frameworks, cloud platforms, and MLOps, coupled with a deep understanding of GenAI technologies.

Key Responsibilities:
- Design, develop, and optimize AI/ML models for practical applications, ensuring high performance, scalability, and reliability.
- Innovate using advanced GenAI technologies (e.g., LLMs, RAG, OpenAI APIs) to address complex business challenges.
- Implement and manage end-to-end MLOps pipelines, including CI/CD, model monitoring, retraining workflows, and versioning.
- Architect and deploy scalable, cost-effective AI/ML solutions on cloud platforms (AWS, Azure).
- Collaborate with cross-functional stakeholders to align AI solutions with business objectives.
- Mentor and guide junior engineers, fostering innovation and adherence to best practices in AI/ML development and deployment.
- Develop and train Generative AI models.
- Perform data analysis and prepare data for AI model training.
- Integrate AI models with Snowflake, AWS, and other systems.

Required Knowledge:
- 10+ years of experience in AI/ML development and deployment, including leadership roles.
- Good knowledge of machine learning and Generative AI, especially content generation using AWS Bedrock and OpenAI-based models.
- Proficiency with GenAI tools and technologies such as LLMs, Retrieval-Augmented Generation (RAG), OpenAI APIs, LangChain, Streamlit, and vector databases.
- Good experience building scalable (Gen)AI applications on AWS.
- Strong background in and understanding of vector databases.
- Good knowledge of Python for data science, as well as Streamlit for rapid deployment of prototypes.
- Experience working in an agile environment.
- Experience in the setup and usage of CI/CD pipelines, as well as writing software in a test-driven fashion.
- Experience building (Gen)AI solutions on Snowflake is a plus.
- Experience building Agentic AI solutions is a plus.

Key Skills: Python, AI/ML expertise, Deep Learning, NLP, MLOps, Cloud Platforms (AWS, Azure), GenAI (LLM, RAG, OpenAI), LangChain, Streamlit, API Architecture, Leadership, Business-Aligned Problem Solving, Prompt Engineering, Knowledge of Vector Databases.

EQUAL OPPORTUNITY
Indegene is proud to be an Equal Opportunity Employer and is committed to a culture of inclusion and diversity. We do not discriminate on the basis of race, religion, sex, colour, age, national origin, pregnancy, sexual orientation, physical ability, or any other characteristic. All employment decisions, from hiring to separation, will be based on business requirements and the candidate's merit and qualifications. We are an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, colour, religion, sex, national origin, gender identity, sexual orientation, disability status, protected veteran status, or any other characteristics.

Posted 4 days ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


About the Company
Clarifai is a leading compute orchestration AI platform specializing in computer vision and generative AI. We empower organizations to transform unstructured image, video, text, and audio data into actionable insights, significantly faster and more accurately than manual processes. Founded in 2013 by Matt Zeiler, Ph.D., Clarifai has been at the forefront of AI innovation since achieving top-five placements in the 2013 ImageNet Challenge. Our diverse, globally distributed team operates across the United States, Canada, Estonia, Argentina, and India. We have secured $100M in funding, including a $60M Series C round, backed by industry leaders such as Menlo Ventures, Union Square Ventures, Lux Capital, NEA, LDV Capital, Corazon Capital, Google Ventures, NVIDIA, Qualcomm, and Osage. Clarifai is proud to be an equal-opportunity workplace committed to building and maintaining a diverse and inclusive team.

Key Responsibilities
- Identify trending open-source AI models with strong community adoption, import them into the Clarifai Community, and validate them across real-world use cases.
- Create clear, engaging previews and demos, both technical and non-technical, that showcase model capabilities.
- Collaborate with Marketing to promote new models and generate compelling content around them.
- Engage with the open-source AI community to build relationships with original model authors and increase backlink visibility.
- Develop lightweight Python-based demos and utilities to highlight model performance and usability.

Impact
As an ML Community Ops Engineer, you will directly contribute to growing Clarifai's model ecosystem by adding cutting-edge AI models and making them accessible to users. Your work will expand Clarifai's reach, improve discoverability, and ensure our platform remains at the forefront of open-source AI. By bridging engineering, marketing, and community engagement, you'll help solidify Clarifai's presence in the AI developer ecosystem.

Requirements
- Strong experience developing, fine-tuning, and evaluating machine learning models, including familiarity with model architectures and key evaluation metrics.
- Expertise in deep learning frameworks (e.g., PyTorch, TensorFlow, JAX) and architectures such as transformers and CNNs.
- Actively follows AI and ML trends, staying current with emerging models, benchmarks, and communities.
- Proficiency in Python, with the ability to write clean, efficient code for ML workflows and data pipelines.
- Experience working with cloud platforms (e.g., AWS, GCP, Azure) for model deployment and compute orchestration.
- Solid software engineering fundamentals, including Git, modular design, and code testing.
- Practical experience with data preprocessing, feature engineering, and analysis of large datasets.

Posted 4 days ago

Apply

4.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Job Description
The Business Analytics Analyst is a developing professional role. Applies specialty-area knowledge in monitoring, assessing, analyzing, and/or evaluating processes and data. Identifies policy gaps and formulates policies. Interprets data and makes recommendations. Researches and interprets factual information. Identifies inconsistencies in data or results, defines business issues, and formulates recommendations on policies, procedures, or practices. Integrates established disciplinary knowledge within own specialty area with a basic understanding of related industry practices. Good understanding of how the team interacts with others in accomplishing the objectives of the area. Develops working knowledge of industry practices and standards. Limited but direct impact on the business through the quality of the tasks/services provided. Impact of the job holder is restricted to own team.

What do we do?
The TTS Analytics team provides analytical insights to the Product, Pricing, Client Experience, and Sales functions within the global Treasury & Trade Services business. The team works on business problems focused on driving acquisitions, cross-sell, revenue growth, and improvements in client experience. The team extracts relevant insights, identifies business opportunities, converts business problems into analytical frameworks, and uses big data tools and AI/ML techniques to drive data-driven business outcomes in collaboration with business and product partners.

Role Description
- The role will be Business Analytics Analyst (C10) in the TTS Analytics team, reporting to the AVP leading the team.
- This role will be a key contributor to ideation on analytical projects to tackle strategic business priorities.
- The role will involve working on multiple analyses through the year on business problems across the client life cycle (acquisition, engagement, client experience, and retention) for the TTS business.
- This will involve leveraging multiple analytical approaches, tools, and techniques, working on multiple data sources (client profile and engagement data, transactions and revenue data, digital data, unstructured data such as call transcripts, etc.) to provide data-driven insights to business partners and functional stakeholders.
- Identifies data patterns and trends, and provides insights to enhance business decision-making capability in business planning, process improvement, solution assessment, etc.
- Recommends actions for future developments and strategic business opportunities, as well as enhancements to operational policies.
- Translates data into consumer or customer behavioral insights to drive targeting and segmentation strategies, and communicates all findings clearly and effectively to business partners and senior leaders.
- Continuously improves processes and strategies by exploring and evaluating new data sources, tools, and capabilities.
- Works closely with internal business partners in building, implementing, tracking, and improving decision strategies.
- Appropriately assesses risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients, and assets, by driving compliance with applicable laws, rules, and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct, and business practices, and escalating, managing, and reporting control issues with transparency.

Qualifications
Experience:
- Bachelor's degree with 4+ years of experience in data analytics, or Master's degree with 2+ years of experience in data analytics.
- Identifying and resolving business problems (around sales/marketing strategy optimization, pricing optimization, client experience, cross-sell, and retention), preferably in the financial services industry.
- Utilizing text data to derive business value by leveraging different NLP techniques (mid- to expert-level prior experience is a must).
- Leveraging and developing analytical tools and methods to identify patterns, trends, and outliers in data.
- Applying predictive modeling techniques to a wide range of business problems.
- Working with data from different sources, with different complexities, both structured and unstructured.

Skills:
Analytical Skills:
- Strong logical reasoning and problem-solving ability.
- Proficient in converting business problems into analytical tasks, and analytical findings into business insights.
- Proficient in formulating analytical methodology and identifying trends and patterns in data.
- Able to work hands-on to retrieve and manipulate data from big data environments.

Tools and Platforms:
- Expert knowledge of Python, SQL, PySpark, and related tools.
- Proficient in MS Excel and PowerPoint.

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.

------------------------------------------------------
Job Family Group: Decision Management
------------------------------------------------------
Job Family: Business Analysis
------------------------------------------------------
Time Type: Full time
------------------------------------------------------
Most Relevant Skills
Please see the requirements listed above.
------------------------------------------------------
Other Relevant Skills
For complementary skills, please see above and/or contact the recruiter.
------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.

Posted 4 days ago

Apply

4.0 - 8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Job Details
Category: Data Science
Location: Bangalore
Experience Level: 4-8 Years

Position Description
We are looking for a Data Engineer who will play a pivotal role in transforming raw data into actionable intelligence through sophisticated data pipelines and machine learning deployment frameworks. They will collaborate across functions to understand business objectives, engineer data solutions, and ensure robust AI/ML model deployment and monitoring. This role is ideal for someone passionate about data science, MLOps, and building scalable data ecosystems in cloud environments.

Key Responsibilities
Data Engineering & Data Science:
- Preprocess structured and unstructured data to prepare for AI/ML model development.
- Apply strong skills in feature engineering, data augmentation, and normalization techniques.
- Manage and manipulate data using SQL, NoSQL, and cloud-based data storage solutions such as Azure Data Lake.
- Design and implement efficient ETL pipelines, data wrangling, and data transformation strategies.

Model Deployment & MLOps:
- Deploy ML models into production using Azure Machine Learning (Azure ML) and Kubernetes.
- Implement MLOps best practices, including CI/CD pipelines, model versioning, and monitoring frameworks.
- Design mechanisms for model performance monitoring, alerting, and retraining.
- Utilize containerization technologies (Docker/Kubernetes) to support deployment and scalability.

Business & Analytics Insights:
- Work closely with stakeholders to understand business KPIs and decision-making frameworks.
- Analyze large datasets to identify trends, patterns, and actionable insights that inform business strategies.
- Develop data visualizations using tools like Power BI, Tableau, and Matplotlib to communicate insights effectively.
- Conduct A/B testing and evaluate model performance using metrics such as precision, recall, F1-score, MSE, and RMSE, along with model validation techniques.

Desired Profile
- Proven experience in data engineering, AI/ML data preprocessing, and model deployment.
- Strong expertise in working with both structured and unstructured datasets.
- Hands-on experience with SQL, NoSQL databases, and cloud data platforms (e.g., Azure Data Lake).
- Deep understanding of MLOps practices, containerization (Docker/Kubernetes), and production-level model deployment.

Technical Skills
- Proficient in ETL pipeline creation, data wrangling, and transformation methods.
- Strong experience with Azure ML, Kubernetes, and other cloud-based deployment technologies.
- Excellent knowledge of data visualization tools (Power BI, Tableau, Matplotlib).
- Expertise in model evaluation and testing techniques, including A/B testing and performance metrics.

Soft Skills
- Strong analytical mindset with the ability to solve complex data-related problems.
- Ability to collaborate with cross-functional teams to understand business needs and provide actionable insights.
- Clear communication skills to convey technical details to non-technical stakeholders.

If you are passionate about working in a collaborative and challenging environment, apply now!

Posted 4 days ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Sonatype is the software supply chain security company. We provide the world’s best end-to-end software supply chain security solution, combining the only proactive protection against malicious open source, the only enterprise grade SBOM management and the leading open source dependency management platform. This empowers enterprises to create and maintain secure, quality, and innovative software at scale. As founders of Nexus Repository and stewards of Maven Central, the world’s largest repository of Java open-source software, we are software pioneers and our open source expertise is unmatched. We empower innovation with an unparalleled commitment to build faster, safer software and harness AI and data intelligence to mitigate risk, maximize efficiencies, and drive powerful software development. More than 2,000 organizations, including 70% of the Fortune 100 and 15 million software developers, rely on Sonatype to optimize their software supply chains. The Opportunity We’re looking for a Senior Data Engineer to join our growing Data Platform team. You’ll play a key role in designing and scaling the infrastructure and pipelines that power analytics, machine learning, and business intelligence across Sonatype. You’ll work closely with stakeholders across product, engineering, and business teams to ensure data is reliable, accessible, and actionable. This role is ideal for someone who thrives on solving complex data challenges at scale and enjoys building high-quality, maintainable systems. 
What You'll Do
- Design, build, and maintain scalable data pipelines and ETL/ELT processes
- Architect and optimize data models and storage solutions for analytics and operational use
- Collaborate with data scientists, analysts, and engineers to deliver trusted, high-quality datasets
- Own and evolve parts of our data platform (e.g., Airflow, dbt, Spark, Redshift, or Snowflake)
- Implement observability, alerting, and data quality monitoring for critical pipelines
- Drive best practices in data engineering, including documentation, testing, and CI/CD
- Contribute to the design and evolution of our next-generation data lakehouse architecture

Minimum Qualifications
- 5+ years of experience as a Data Engineer or in a similar backend engineering role
- Strong programming skills in Python, Scala, or Java
- Hands-on experience with HBase or similar NoSQL columnar stores
- Hands-on experience with distributed data systems such as Spark, Kafka, or Flink
- Proficient in writing complex SQL and optimizing queries for performance
- Experience building and maintaining robust ETL/ELT (data warehousing) pipelines in production
- Familiarity with workflow orchestration tools (Airflow, Dagster, or similar)
- Understanding of data modeling techniques (star schema, dimensional modeling, etc.)

Bonus Points
- Experience working with Databricks, dbt, Terraform, or Kubernetes
- Familiarity with streaming data pipelines or real-time processing
- Exposure to data governance frameworks and tools
- Experience supporting data products or ML pipelines in production
- Strong understanding of data privacy, security, and compliance best practices

Why You'll Love Working Here
- Data with purpose: work on problems that directly impact how the world builds secure software
- Modern tooling: leverage the best of open-source and cloud-native technologies
- Collaborative culture: join a passionate team that values learning, autonomy, and impact

At Sonatype, we value diversity and inclusivity.
We offer perks such as parental leave, diversity and inclusion working groups, and flexible working practices to allow our employees to show up as their whole selves. We are an equal-opportunity employer, and we do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status. If you have a disability or special need that requires accommodation, please do not hesitate to let us know.

Posted 4 days ago

Apply

3.0 years

0 Lacs

Shaikpet, Telangana, India

On-site


Position: Customer Support Engineer
Company: Launch Ventures (for Talkingshops.com)
Location: Hyderabad, Telangana (onsite)

About Launch Ventures
Launch Ventures is a niche, award-winning technology firm that co-creates products alongside domain experts, ranging from early-stage startups to Fortune 500 enterprises. We've built globally scaled products, some of which have attracted investment from Google. Our work spans modern technologies including AI/ML, IoT, Blockchain, Cloud, and full-stack web/mobile applications. We take pride not just in writing great code, but in launching ventures that matter. Our culture emphasises product ownership, technical craftsmanship, and long-term impact. Talkingshops.com is one of our fastest-growing ventures. It's a next-generation WhatsApp Commerce platform designed to empower small and mid-sized businesses to sell more effectively, communicate seamlessly, and operate smarter.

About the Role
As a Customer Support Engineer, you will be the first line of communication between our customers and the product. This is not just a support role: you'll act as a trusted partner to users by troubleshooting issues, guiding them through solutions, and ensuring a seamless experience on the platform. Your contributions will directly influence product adoption, satisfaction, and retention. This role is ideal for someone who thrives on interacting with people, is comfortable with technology, and is driven by the satisfaction of resolving issues quickly and effectively.

Key Responsibilities
Customer Assistance:
- Respond promptly and professionally to customer inquiries via phone, email, chat, or ticketing systems.
- Act as a point of contact for troubleshooting product issues, onboarding queries, and general user guidance.

Technical Troubleshooting:
- Diagnose and resolve issues related to the Talkingshops.com platform, including product configurations, integrations (e.g., WhatsApp Business API, payment gateways), and user access problems.
- Assist users in resolving connectivity issues, API errors, or data sync problems with platforms such as Shopify, WooCommerce, and others.

Documentation and Knowledge Sharing:
- Maintain detailed records of customer interactions, reported issues, troubleshooting steps, and resolutions in the CRM.
- Create and contribute to internal knowledge bases and customer-facing support articles or FAQs.

Issue Escalation & Collaboration:
- Work closely with the engineering and product teams to escalate unresolved or complex issues with complete context.
- Provide feedback from users to help improve product usability and customer satisfaction.

Process & Quality Improvements:
- Recommend process improvements or automation opportunities to enhance support quality and reduce turnaround times.
- Help refine onboarding and support playbooks for faster, more consistent customer issue resolution.

Customer Experience Management:
- Build rapport with customers and ensure a high degree of empathy and clarity in communication.
- Monitor support KPIs (response time, resolution time, CSAT scores) and strive for continuous improvement.

Requirements
What We're Looking For:
- Educational background: Bachelor's degree in any discipline (a technical or computer science background is preferred).
- Experience: 1–3 years in a customer support, technical support, or client services role. Prior experience supporting SaaS, eCommerce, or B2B platforms is a strong advantage. Hands-on experience handling phone-based queries is essential.
- Skills: Strong communication skills (clear, concise, and empathetic); ability to explain technical concepts in simple, non-technical language; comfortable working with support tools like Freshdesk, Zendesk, HubSpot, or similar CRMs; familiarity with WhatsApp commerce tools, APIs, or payment integrations is a plus; multilingual communication (especially regional Indian languages) is a bonus.
Benefits
Why Join Us:
- Opportunity to work on a high-impact product serving small and growing businesses.
- Dynamic, startup-like environment with the stability and mentorship of an experienced leadership team.
- Learn and grow across customer success, product thinking, and technical troubleshooting.
- Flat hierarchy, transparent communication, and a supportive team culture.
- Competitive salary, benefits, and opportunities for growth within the company.

Posted 4 days ago

Apply

2.0 - 4.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Organization Overview
The ML Solutions team within Markets Ops Technology is dedicated to developing solutions using Artificial Intelligence, Machine Learning, and Generative AI. The team is a leader in creating new ideas and innovative, ground-breaking technology solutions for Markets Operations and other lines of business. We work closely with our clients and business partners to progress solutions from ideation to production by leveraging entrepreneurial spirit and technological excellence.

Job Description
The ML Solutions team is seeking a highly skilled and experienced Data Analytics/Machine Learning Engineer with a strong background in developing and deploying AI/ML models and GenAI-based solutions. In this hands-on role, you will be responsible for taking AI models from concept to production, utilizing traditional AI, generative AI, and Large Language Models. You will collaborate closely with business partners and stakeholders to introduce and embrace these advanced technologies, elevating client experiences, delivering value to our customers, and ensuring regulatory compliance through cutting-edge technology solutions.

Key Responsibilities:
- Hands-On Execution and Delivery: Actively contribute to the development and delivery of AI solutions, driving innovation and excellence within the team. Take a hands-on approach to ensure AI models are successfully deployed into production environments, meeting high-quality standards and performance benchmarks.
- Quality Control: Ensure the quality and performance of generative AI models, conducting rigorous testing and evaluation.
- Research and Development: Participate in research activities to explore and advance state-of-the-art generative AI techniques. Stay actively engaged in monitoring ongoing research efforts, keeping abreast of emerging trends, and ensuring that the Generative AI team remains at the forefront of the field.
- Cross-Functional Collaboration: Collaborate effectively with various teams, including product managers, engineers, and data scientists, to integrate AI technologies into products and services.

Experience: 2-4 years of relevant experience.

Skills & Qualifications:
- Strong hands-on experience in Machine Learning, delivering complex solutions to production.
- Experience with Generative AI technologies is essential.
- Knowledge of NLP and Named Entity Recognition.
- In-depth knowledge of deep learning and Generative AI frameworks such as LangChain, LangGraph, CrewAI, or similar.
- Experience with open-source frameworks/libraries/APIs such as Hugging Face Transformers, spaCy, Pandas, scikit-learn, and NumPy.
- Experience using Machine Learning/Deep Learning libraries: XGBoost, LightGBM, TensorFlow, PyTorch.
- Proficiency in Python software development, following object-oriented design patterns and best practices.
- Strong background in mathematics: linear algebra, probability, statistics, and optimization.
- Experience with model evaluation and scoring using frameworks such as MLflow.
- Experience with Docker containers and authoring Dockerfiles; experience with Kubernetes (K8s) is a plus.
- Experience with Postgres and vector databases is a plus.
- Excellent problem-solving skills and the ability to think creatively.
- Strong communication and collaboration skills, with the ability to work effectively with cross-functional teams.
- Publications and contributions to the AI research community are a plus.
- Bachelor's/Master's degree or equivalent experience in Computer Science, Data Science, Statistics, or a related field.

------------------------------------------------------
Job Family Group: Technology
------------------------------------------------------
Job Family: Applications Development
------------------------------------------------------
Time Type: Full time
------------------------------------------------------
Most Relevant Skills
Please see the requirements listed above.
------------------------------------------------------ Other Relevant Skills For complementary skills, please see above and/or contact the recruiter. ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.

Posted 4 days ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site


Who We Are
Bynd is redefining financial intelligence through advanced AI, transforming how leading investment banks, private equity firms, and equity researchers globally analyze and act upon critical information. Our founding team includes a Partner from Apollo ($750B AUM) and AI engineers from UIUC, IIT, and other top-tier institutions. Operating as both a research lab and a product company, we build cutting-edge retrieval systems and AI-driven workflow automation for knowledge-intensive financial tasks.

Role Overview
As an AI Intern at Bynd, you'll work at the intersection of cutting-edge GenAI systems and rigorous classical ML evaluation methodologies. Your primary responsibility will be to build and refine evaluation pipelines for our existing AI-driven financial intelligence systems. You'll collaborate closely with the founding team and top financial domain experts to ensure our models are not only powerful but also measurable, explainable, and reliable. If you're excited by the idea of working hands-on with state-of-the-art LLMs, experimenting with RAG systems, and building frameworks that make AI outputs trustworthy and actionable, this role is made for you.

Responsibilities
• Design, implement, and iterate on evaluation pipelines for existing AI/ML systems, particularly GenAI-based and RAG-based architectures.
• Develop test sets, metrics, and validation frameworks aligned with financial use cases.
• Analyze model performance (both quantitative and qualitative) to uncover insights, gaps, and opportunities for improvement.
• Work alongside full-stack and ML engineers to integrate evaluation systems into CI/CD workflows.
• Assist in data collection, benchmark tasks, and A/B testing setups for LLM responses.
• Stay up-to-date with academic and industry advancements in evaluation frameworks, prompt testing, and trustworthy AI.
Preferred:
• Prior hands-on experience with GenAI systems (e.g., OpenAI, Claude, Mistral, etc.), including prompt design and retrieval-augmented generation (RAG).
• Solid understanding of classical ML concepts like training-validation splits, overfitting, data leakage, and cross-validation.
• Familiarity with tools such as Weights & Biases, LangSmith, or custom logging/benchmarking suites.
• Comfort with Python, evaluation libraries (e.g., sklearn, evaluate, bert-score, BLEU/ROUGE, etc.), and backend integration.
• Experience working with unstructured financial data (PDFs, tables, earnings reports, etc.) is a massive plus.

What We're Looking For
We're looking for a fast learner with deep intellectual curiosity and strong fundamentals. You should be comfortable reasoning through ambiguity, rapidly testing hypotheses, and communicating technical decisions with clarity. You're someone who thinks not just about building intelligent systems, but about how we measure intelligence meaningfully. This is an opportunity to work closely with a high-caliber founding team and ship impactful systems used by decision-makers at global financial institutions. If you're passionate about building AI that works, and works reliably, come build with us.
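The evaluation libraries this posting lists (e.g., evaluate, bert-score, BLEU/ROUGE) all reduce an LLM output and a gold reference to an overlap-based score. As a rough, purely illustrative sketch (not Bynd's actual code; the function and sample data below are invented for illustration), a ROUGE-1-style unigram-overlap F1 and a batch-scoring loop look like this:

```python
from collections import Counter

def rouge1_f1(candidate: str, reference: str) -> float:
    """ROUGE-1-style unigram-overlap F1 between a model output and a reference."""
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum((cand & ref).values())  # clipped unigram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

# Score a small batch of (model output, gold reference) pairs
pairs = [
    ("revenue grew 10 percent year over year", "revenue grew 10 percent year over year"),
    ("profit fell sharply in q3", "net profit declined sharply in the third quarter"),
]
scores = [rouge1_f1(c, r) for c, r in pairs]
```

In practice a library implementation (with stemming, n-gram variants, and significance testing) would replace this hand-rolled metric; the sketch only shows the shape of the batch-scoring loop an evaluation pipeline automates.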

Posted 4 days ago

Apply

25.0 years

0 Lacs

Tondiarpet, Tamil Nadu, India

On-site


The Company
PayPal has been revolutionizing commerce globally for more than 25 years. Creating innovative experiences that make moving money, selling, and shopping simple, personalized, and secure, PayPal empowers consumers and businesses in approximately 200 markets to join and thrive in the global economy.

We operate a global, two-sided network at scale that connects hundreds of millions of merchants and consumers. We help merchants and consumers connect, transact, and complete payments, whether they are online or in person. PayPal is more than a connection to third-party payment networks. We provide proprietary payment solutions accepted by merchants that enable the completion of payments on our platform on behalf of our customers.

We offer our customers the flexibility to use their accounts to purchase and receive payments for goods and services, as well as the ability to transfer and withdraw funds. We enable consumers to exchange funds more safely with merchants using a variety of funding sources, which may include a bank account, a PayPal or Venmo account balance, PayPal and Venmo branded credit products, a credit card, a debit card, certain cryptocurrencies, or other stored value products such as gift cards, and eligible credit card rewards. Our PayPal, Venmo, and Xoom products also make it safer and simpler for friends and family to transfer funds to each other.

We offer merchants an end-to-end payments solution that provides authorization and settlement capabilities, as well as instant access to funds and payouts. We also help merchants connect with their customers, process exchanges and returns, and manage risk. We enable consumers to engage in cross-border shopping and merchants to extend their global reach while reducing the complexity and friction involved in enabling cross-border trade. Our beliefs are the foundation for how we conduct business every day.
We live each day guided by our core values of Inclusion, Innovation, Collaboration, and Wellness. Together, our values ensure that we work together as one global team with our customers at the center of everything we do, and they push us to ensure we take care of ourselves, each other, and our communities.

Job Description Summary:
PayPal seeks an accomplished Senior Director of Engineering with a robust background in AI & Machine Learning (ML) engineering to lead the global AI Feature Engineering team, reporting directly to the VP of AI Technology. In partnership with other leaders within this org, the successful candidate will be responsible for driving the development and execution of PayPal's AI feature engineering lifecycle strategy and platform delivery, and for leading the global team responsible for building the tools and capabilities. This position oversees the architecture, innovation, and delivery of production feature platforms, including batch, online, graph-based, vector, and near real-time engines. The ideal candidate will possess the deep technical expertise, strategic vision, and leadership abilities needed to drive advancements in AI feature engineering. We foster a culture of experimentation and consistently strive for improvement and learning. You will work in a collaborative, trusting, and thought-provoking environment that encourages diversity of thought and innovative solutions that serve the best interests of our customers.

Job Description:
Job Responsibilities:
Lead the global AI Feature Engineering team, driving the development and execution of PayPal's AI feature engineering lifecycle strategy and platform delivery, and leading the global team responsible for building the tools and capabilities.
Provide leadership in feature engineering architecture to develop robust platforms for real-time, near real-time, and batch processing, and for integration with vector/graph database technologies.
Provide hands-on thought leadership in building on-premise and cloud-based feature data platforms for the Risk, Consumer, and Merchant segments.
Prototype and deploy cutting-edge feature engineering tools to optimize AI workflows, including building prototypes for demonstration or illustration purposes for peer groups, business partners, or senior leaders.
Influence senior business and technology stakeholders across the organization to promote adoption and alignment with technology strategies.
Define and execute strategies for feature discovery, lifecycle optimization, and maintaining high-performance feature stores.
Partner with business stakeholders, model training teams, AI infrastructure groups, and product owners to streamline feature integration, resolve production issues, and define firmwide technology strategies and roadmaps.
Establish systems for data health monitoring and alerting on solution events related to performance, scalability, availability, and reliability.
Drive innovation to maximize the business benefits of adopting market-leading technology capabilities by rapidly evaluating, piloting, and scaling new innovations where appropriate.
Build, mentor, and inspire a global team of engineers, and partner with product managers and MLOps specialists to deliver operating excellence and achieve organizational goals.
Lead the organization in implementing self-service capabilities for rapid feature creation and deployment by data scientists and engineers.
Demonstrate the ability to build global teams and a passion for fostering an innovative culture.

Qualifications/Education/Experience/Skills:
Bachelor's/Master's degree in Computer Science, AI/ML, Software Engineering, or a related field, with 15+ years of experience.
Proven expertise in feature engineering lifecycle management using on-premise or cloud-based tools and technologies.
Strong technical proficiency in big data platforms, cloud architecture (e.g., AWS, GCP), and vector/graph databases.
Experience deploying large-scale ML pipelines with high reliability and uptime.
Background with Machine Learning frameworks, real-time systems, and Big Data technologies such as Hadoop.
Effective people management skills, with the ability to attract, retain, and inspire a global high-performance engineering team.
Effective cross-functional collaboration skills, bridging technical and business stakeholders in dynamic environments.
Strong communication and strategic influencing skills, with an executive presence. Must be able to translate and distill complex data concepts for different audiences concisely.
Ability to juggle multiple priorities and effectively deliver in a fast-paced, dynamic environment.
Comfortable working in an Agile and collaborative environment.

Behaviors:
Consistently demonstrates PayPal's core values, which include Collaboration and Inclusion.
Partnership: organizational awareness; engages widely with peers and senior leadership across PayPal.
Innovate: leads effectively in a dynamic, fast-changing environment; challenges convention.
Deliver Stand-out Results: develops decisive strategies that deliver tangible results.
Execution: focuses the business area on strategic priorities; operates clear, simple systems.
Enable and Grow Talent: builds capability and a succession pipeline.
Leadership: contributes to the business leadership agenda; sets the business area vision.

For the majority of employees, PayPal's balanced hybrid work model offers 3 days in the office for effective in-person collaboration and 2 days at your choice of either the PayPal office or your home workspace, ensuring that you equally have the benefits and conveniences of both locations.

Our Benefits:
At PayPal, we're committed to building an equitable and inclusive global economy. And we can't do this without our most important asset: you. That's why we offer benefits to help you thrive in every stage of life.
We champion your financial, physical, and mental health by offering valuable benefits and resources to help you care for the whole you. We have great benefits, including a flexible work environment, employee share options, health and life insurance, and more. To learn more about our benefits, please visit https://www.paypalbenefits.com

Who We Are: To learn more about our culture and community, visit https://about.pypl.com/who-we-are/default.aspx

Commitment to Diversity and Inclusion
PayPal provides equal employment opportunity (EEO) to all persons regardless of age, color, national origin, citizenship status, physical or mental disability, race, religion, creed, gender, sex, pregnancy, sexual orientation, gender identity and/or expression, genetic information, marital status, status with regard to public assistance, veteran status, or any other characteristic protected by federal, state, or local law. In addition, PayPal will provide reasonable accommodations for qualified individuals with disabilities. If you are unable to submit an application because of incompatible assistive technology or a disability, please contact us at paypalglobaltalentacquisition@paypal.com.

Belonging at PayPal:
Our employees are central to advancing our mission, and we strive to create an environment where everyone can do their best work with a sense of purpose and belonging. Belonging at PayPal means creating a workplace with a sense of acceptance and security where all employees feel included and valued. We are proud to have a diverse workforce reflective of the merchants, consumers, and communities that we serve, and we continue to take tangible actions to cultivate inclusivity and belonging at PayPal.

For any general requests for consideration of your skills, please join our Talent Community. We know the confidence gap and imposter syndrome can get in the way of meeting spectacular candidates. Please don't hesitate to apply.

REQ ID R0128071
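The feature-platform pattern this posting describes, a batch path that materializes feature values and an online path that serves them at low latency, can be sketched in miniature. This is a toy illustration only, not PayPal's architecture; the class, feature names, and sample data below are all hypothetical:

```python
from collections import defaultdict
from typing import Any, Callable, Dict, List

class MiniFeatureStore:
    """Toy feature store: registered feature definitions, batch
    materialization over raw events, and online key-based lookup."""

    def __init__(self) -> None:
        self.definitions: Dict[str, Callable[[List[dict]], Any]] = {}
        self.online: Dict[str, Dict[str, Any]] = defaultdict(dict)

    def register(self, name: str, fn: Callable[[List[dict]], Any]) -> None:
        self.definitions[name] = fn

    def materialize(self, entity_id: str, events: List[dict]) -> None:
        # "Batch" path: compute every registered feature for one entity
        for name, fn in self.definitions.items():
            self.online[entity_id][name] = fn(events)

    def get_features(self, entity_id: str) -> Dict[str, Any]:
        # "Online" serving path: low-latency lookup at inference time
        return dict(self.online[entity_id])

store = MiniFeatureStore()
store.register("txn_count", len)
store.register("total_amount", lambda evts: sum(e["amount"] for e in evts))
store.materialize("merchant_42", [{"amount": 10.0}, {"amount": 25.5}])
features = store.get_features("merchant_42")
```

A production platform of the kind described above replaces the in-memory dictionary with batch pipelines, streaming engines, and a dedicated online store, but the split between feature definition, materialization, and serving is the same.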

Posted 4 days ago

Apply

0 years

0 Lacs

Greater Kolkata Area

On-site


Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Microsoft
Management Level: Senior Associate

Job Description & Summary: At PwC, our people in integration and platform architecture focus on designing and implementing seamless integration solutions and robust platform architectures for clients. They enable efficient data flow and optimise technology infrastructure for enhanced business performance. Those in solution architecture at PwC will design and implement innovative technology solutions to meet clients' business needs. You will leverage your experience in analysing requirements and developing technical designs to enable the successful delivery of solutions.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
" Job Description & Summary: We are looking for experienced AI/ML developers with: • strong analytical and problem-solving abilities • willingness to learn latest AI technologies and adapt to changing project requirements • Understanding of the core AI algorithms • ability to prioritize tasks and manage time effectively to meet deadlines • good verbal and written communication skills • ability to work collaboratively in a team setting Responsibilities: • Design, develop, and deploy AI/ML models in Azure App Services and Azure Kubernetes Service (AKS) • Implement and optimize AI solutions using frameworks like Lang-Chain, Semantic Kernel, and PyTorch • Develop and maintain applications using Microsoft .NET framework, integrating with Microsoft Teams and Graph API *Perform prompt engineering, tuning, and optimization of Large Language Models • Handle data preprocessing and feature engineering for both structured and unstructured data • Implement CI/CD pipelines using Azure DevOps • Manage data storage solutions using Azure Blob Storage • Collaborate with team members to ensure best practices in AI/ML development Document AI models, code, and processes clearly. Mandatory skill sets: Essential Skills: Strong proficiency in Python, TypeScript, and/or C# • Experience with AI frameworks including Lang-Chain, Semantic Kernel, PyTorch, and Scikit-learn • Expertise in Azure Cognitive Services, Azure Machine Learning, and Azure Bot Services Strong knowledge of NLP, LLMs, Vectorization, and Prompt Engineering • Experience with Azure Text Analytics • Proficiency in data engineering using Azure Data Factory and Azure SQL Database Experience with Azure DevOps and version control • Understanding of cloud-based AI/ML deployment • Experience with Microsoft Teams and Graph API integration • Knowledge of data preprocessing and feature engineering techniques Preferred skill sets: · Desirable Skills: 1. AI-900: Azure AI Fundamentals (Must Have) · Good to have: 1. 
DP-100: Azure Data Scientist Associate (Preferred) Years of experience required: 3+ Yrs Education qualification: BTech/BE/MTech from reputed institution/university as per the hiring norms Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Master of Engineering, Bachelor of Engineering Degrees/Field of Study preferred: Certifications (if blank, certifications not specified) Required Skills AI Programming Optional Skills Accepting Feedback, Accepting Feedback, Active Listening, Amazon Web Services (AWS), Analytical Thinking, Architectural Engineering, Brainstorm Facilitation, Business Impact Analysis (BIA), Business Process Modeling, Business Requirements Analysis, Business Systems, Business Value Analysis, Cloud Strategy, Communication, Competitive Advantage, Competitive Analysis, Conducting Research, Creativity, Embracing Change, Emotional Regulation, Empathy, Enterprise Architecture, Enterprise Integration, Evidence-Based Practice (EBP), Feasibility Studies {+ 41 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Not Specified Available for Work Visa Sponsorship? No Government Clearance Required? No Job Posting End Date

Posted 4 days ago

Apply


8.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Our Company
Changing the world through digital experiences is what Adobe's all about. We give everyone, from emerging artists to global brands, everything they need to design and deliver exceptional digital experiences! We're passionate about empowering people to create beautiful and powerful images, videos, and apps, and to transform how companies interact with customers across every screen. We're on a mission to hire the very best and are committed to creating exceptional employee experiences where everyone is respected and has access to equal opportunity. We realize that new ideas can come from everywhere in the organization, and we know the next big idea could be yours!

Position Summary:
The Adobe Firefly Gen AI Models and Services group seeks passionate machine learning engineers to deliver groundbreaking generative AI experiences. In this role, you'll:
Optimize and scale state-of-the-art generative AI models.
Deploy AI-driven creative tools that empower millions globally.
Solve real-world challenges in AI scalability and production readiness.

Job Responsibilities
Help architect and optimize large-scale foundation model pipelines in Generative AI.
Design and develop the GenAI backend services for Firefly, creating GPU-optimized, efficient model pipelines that power the generative AI features on the Firefly website, PPro, Photoshop, Illustrator, Express, Stock, and other applications/surfaces.
Collaborate with outstanding applied researchers and engineers to bring ideas to production.
Provide technical leadership and mentorship for junior team members.
Explore and research new and emerging ML and MLOps technologies to continuously improve Adobe's GenAI engineering effectiveness and efficiency.
Review and provide feedback on features, technology, architecture, designs, and test strategies.

What you'll need to succeed
Master's or Ph.D.
in Computer Science, AI/ML, or related fields, or a B.Tech with strong experience in AI/ML.
8+ years of experience.
Excellent communication and technical leadership skills.
Experience working with a large number of contributors on time-sensitive, business-critical GenAI projects.
Experience with the latest generative AI technologies, such as GANs, diffusion models, and transformer models.
Strong hands-on experience with large-scale GenAI model pipelines and/or shipping ML features.
Strong collaboration skills. Experience working in a matrix organization, driving alignment, drawing conclusions, and getting things done.

Preferred experience:
Experience training or optimizing models (CUDA, Triton, TRT, AOT).
Experience converting models from frameworks like PyTorch and TensorFlow to other target formats to ensure compatibility and optimized performance across different platforms.
Good publication record in Computer Science, AI/ML, or related fields.

#FireflyGenAI

Adobe is proud to be an Equal Employment Opportunity employer. We do not discriminate based on gender, race or color, ethnicity or national origin, age, disability, religion, sexual orientation, gender identity or expression, veteran status, or any other applicable characteristics protected by law. Learn more about our vision here. Adobe aims to make Adobe.com accessible to any and all users. If you have a disability or special need that requires accommodation to navigate our website or complete the application process, email accommodations@adobe.com or call (408) 536-3015.

Posted 4 days ago

Apply

8.0 years

0 Lacs

Noida, Uttar Pradesh, India

Remote


Our Company
Changing the world through digital experiences is what Adobe's all about. We give everyone, from emerging artists to global brands, everything they need to design and deliver exceptional digital experiences! We're passionate about empowering people to create beautiful and powerful images, videos, and apps, and to transform how companies interact with customers across every screen. We're on a mission to hire the very best and are committed to creating exceptional employee experiences where everyone is respected and has access to equal opportunity. We realize that new ideas can come from everywhere in the organization, and we know the next big idea could be yours!

The Opportunity
Are you ready to contribute to Adobe's ambitious mission of empowering the next generation of creators? We're looking for an outstanding Computer Scientist 2 (Full Stack) to join our Adobe Firefly client team. This is an exceptional opportunity to craft next-generation AI/ML-powered creative tools that will reach millions worldwide. As a key player in our highly collaborative team, you will work closely with internal product teams and collaborators to architect and maintain user-facing experiences. Your diligent, meticulous approach, coupled with your ability to mentor and your proactive mindset, will drive tangible results in our team-oriented culture.
What You'll Do
Help establish architecture and quality coding practices for the Adobe Firefly client platform.
Define long-term solutions for component-based architecture using functional programming.
Work closely with the design team, product management, and our internal clients to translate early ideas into interactive prototypes.
Engage with customers to identify problems, conduct A/B tests, and refine workflows.
Continuously expand your knowledge and skills to stay ahead of the latest development, test, and deployment methodologies.

What you need to succeed
8+ years of professional experience developing interactive web applications, preferably in the creative tool domain.
B.Tech or higher in Computer Science, or equivalent experience.
Excellent computer science fundamentals and a good understanding of the design and performance of algorithms.
Deep knowledge of cloud-native applications and of building and deploying web applications or interactive sites using React.
High proficiency in TypeScript or JavaScript (ES6+), Java, Spring Boot, Python, and distributed services design.
Passionate about quality of work, persistent, and uncompromising.
Proficient in writing code that is reliable, maintainable, secure, and performant.
Knowledge of AWS services and/or Azure, Docker, Jenkins, Splunk.
Proficiency in Test-Driven Development (TDD) and a functional programming style.
Confidence to be an opinionated, pragmatic developer, especially in writing high-performance, reliable, and maintainable code.
Ability to perform independently in a hybrid or remote-first work environment, with strong written and verbal communication skills.

Bonus Qualifications
Experience with Continuous Integration/Continuous Deployment (CI/CD).
Exposure to generative AI models, including text-to-image and large language models.
Experience with video or similar multi-track non-linear editors.
Experience in UX design, design systems, or close collaboration with design teams.
Knowledge of modern web technologies, such as WASM,
WebGPU, canvas rendering, security, asynchrony, and performance optimization.
Exposure to PyTorch/TensorFlow.
Knowledge of data technologies like Cosmos DB, Kafka, SQS, Redis, MongoDB, Solr, DynamoDB, and Elasticsearch.

Adobe is proud to be an Equal Employment Opportunity employer. We do not discriminate based on gender, race or color, ethnicity or national origin, age, disability, religion, sexual orientation, gender identity or expression, veteran status, or any other applicable characteristics protected by law. Learn more about our vision here. Adobe aims to make Adobe.com accessible to any and all users. If you have a disability or special need that requires accommodation to navigate our website or complete the application process, email accommodations@adobe.com or call (408) 536-3015.

Posted 4 days ago

Apply

8.0 years

0 Lacs

Noida, Uttar Pradesh, India

Remote


Our Company
Changing the world through digital experiences is what Adobe's all about. We give everyone, from emerging artists to global brands, everything they need to design and deliver exceptional digital experiences! We're passionate about empowering people to create beautiful and powerful images, videos, and apps, and to transform how companies interact with customers across every screen. We're on a mission to hire the very best and are committed to creating exceptional employee experiences where everyone is respected and has access to equal opportunity. We realize that new ideas can come from everywhere in the organization, and we know the next big idea could be yours!

The Adobe Firefly client team is seeking a senior JavaScript/TypeScript engineer to contribute to an exciting, high-profile effort to establish Adobe as the best creativity tools provider. We seek to empower the next generation of creators everywhere by turning creative intent into creative success. We are crafting new AI/ML-powered tools empowering self-expression and collaboration across the digital landscape. While implementing this far-reaching strategy, we are focusing on using product-driven development to drive rapid iteration and to continuously deliver measurable impact. This initiative is an outstanding opportunity to shape emerging next-generation products reaching millions of creators worldwide.

The Opportunity
What are we looking for in an ideal lead front-end developer? You will be joining a highly collaborative team of application and front-end engineers working closely with internal product teams and stakeholders. Your primary role is to architect and maintain the user-facing experience for Adobe Firefly. You have a user-centric, detail-oriented approach, invite constructive collaboration, naturally strive to be a mentor, and always work with a bias towards action.
Most importantly you enjoy independently solving complex problems, have a deep empathy for customers, and drive tangible results in a team oriented culture. What You'll Do Help establish architecture and quality coding practices for the Adobe Firefly client platform Define long-term solutions for component based architecture using functional programming Work closely with the design team, product management and our internal clients translating early ideas into interactive prototypes Engage with customers to identify problems, A|B test solutions, and refine workflows Expand your knowledge and skills to stay ahead of the latest development, test, and deployment methodologies What you need to succeed 8+ years of professional experience developing interactive web applications, preferably in the creative tool domain B.Tech or higher in Computer Science, or equivalent experience Well established practice of building and deploying web applications or interactive sites using React High proficiency in TypeScript or JavaScript (ES6+) Fluent with Test Driven Development (TDD) Fluent in functional programming style Confidence to be an opinionated, pragmatic developer - especially in the areas of writing high-performance, reliable and maintainable code Ability to perform independently in a hybrid or remote first work environment supported by competent written and verbal communication skills Bonus Qualifications Experience with Continuous Integration/Continuous Deployment (CI/CD) Exposure to generative AI models, including text-to-image and large language models Experience with video or similar multi-track non-linear editors Experience in UX design, design systems or close collaboration with design teams Knowledge of modern web technologies, for example, WASM, WebGPU and canvas rendering, security, asynchrony and performance optimization Adobe is proud to be an Equal Employment Opportunity employer. 

Posted 4 days ago

Apply

2.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site


Kovai.co is a catalyst sparking a revolution in the world of enterprise software and B2B SaaS. We are a technology powerhouse delivering best-in-class enterprise software and game-changing SaaS solutions across industries. At Kovai.co, we're rewriting the B2B landscape by empowering over 2,500 businesses worldwide with our award-winning SaaS solutions. Our Products Biztalk360 Turbo360 Document360 “UK headquarters. Indian innovation. Global impact.” Our journey has been nothing short of remarkable, having witnessed exponential growth and profitability right from our inception. We are on track towards $30 million in annual revenue – and we're just getting started. Kovai.co is fueled by a tribe of thoughtful helpers, obsessed with empowering customers, uplifting colleagues, and igniting our own journeys. Redefining tech is our game. Are you in? Join Kovai.co – where passion meets purpose. What's the job: Data Scientist What You’ll Do On The Job Understand business problems and formulate hypotheses Collect and interpret data, analyze results, and identify patterns and trends in large data sets Research and develop advanced statistical and machine learning models for analysis of large-scale, high-dimensional data Build custom machine learning models for various use cases such as recommendation, insight generation, forecasting, anomaly detection, and so on Perform data quality checks for quality assurance Build data pipelines to supply quality data to machine learning models in the production environment Support innovative analytical thinking & solutions that result in improved business performance Write reports based on insights gained from exploratory data analysis and articulate them to the product owner Apply well-established machine learning frameworks and industry best practices Who’ll be a good fit: Proven working experience of 2+ years as a Data Scientist Strong in mathematics - statistics, probability, linear algebra, time series Proficiency in
working with large datasets and data wrangling using SQL (Structured Query Language) Familiar with tools such as Jupyter, TensorFlow, Python, Azure ML, Prophet (open-source software), MS SQL Server Working knowledge of applied machine learning techniques such as regression, clustering, forecasting of time-series data and classification models Proficiency in LSTM, hypothesis testing and supervised/unsupervised learning Familiarity with all aspects of the analytical cycle such as data wrangling and visualisation, feature engineering, and model building Write scalable scripts to fetch or modify data from API endpoints Exposure to cloud technology is preferred (Azure) Excellent critical thinking, verbal, and written communication skills Equal Opportunities Kovai.co is committed to building a workforce that reflects the richness of our society. We believe in fostering a culture of belonging and respect for all. Kovai.co stands firmly against discrimination, ensuring equal opportunity for everyone to build a successful career.

Posted 4 days ago

Apply

0 years

0 Lacs

Bhubaneswar, Odisha, India

On-site


Way of Working - Office/Field - Employees will work full-time from their base location Swiggy is India's leading on-demand delivery platform with a tech-first approach to logistics and a solution-first approach to consumer demands. With a presence in 500 cities across India, partnerships with hundreds of thousands of restaurants, an employee base of over 5000, a 2 lakh+ strong independent fleet of Delivery Executives, we deliver unparalleled convenience driven by continuous innovation. Built on the back of robust ML technology and fueled by terabytes of data processed every day, Swiggy offers a fast, seamless and reliable delivery experience for millions of customers across India. From starting out as a hyperlocal food delivery service in 2014, to becoming a logistics hub of excellence today, our capabilities result not only in lightning-fast delivery for customers, but also in a productive and fulfilling experience for our employees. With Swiggy's New Supply and the recent launches of Swiggy Instamart, Swiggy Genie, and Guiltfree, we are consistently making waves in the market, while continually growing the opportunities we offer our people. Role – Sales Manager - Alcohol Job Responsibilities Serve as the primary point of contact for assigned client accounts, understanding their goals, needs, and challenges. Develop account strategies to overcome the challenges and action plans to meet client objectives and maximize account growth as per the target Conduct regular F2F business reviews with clients, discussing performance, identifying areas for improvement, and presenting new opportunities. Track and analyze account performance, sales data, account funnel, and market trends to identify opportunities and challenges. Drive business growth for newly onboarded partners by working on their basic hygiene and health metrics Deliver Incremental Revenue from the assigned clients through monetization and commercial improvements. 
Deliver incremental counter share for all assigned clients by strategic planning to dominate market share. Maintain strong relationships with alcohol owners and deliver best-in-class alcohol NPS. Collaborate with internal teams to coordinate and deliver exceptional service to clients, addressing any issues or concerns promptly. Generate leads and proactively approach potential clients, presenting our value proposition and securing new partnerships. Desired Candidate Graduate with excellent communication skills. Good working knowledge and experience of e-commerce activities and all online marketing channels Confident, pleasing and a go-getter personality Effective communication skills Attitude & Aptitude for Sales Should be a team player, working alongside people from all walks of life. Analytical, good Excel skills. Leadership and Influencing skills: Identifies, builds, and uses a wide network of contacts with people at all levels, internally and externally. Achieves good results through a well-planned approach. Initiative & Flexibility: Recognizes the need to adapt to change & implement appropriate solutions. Be able to identify opportunities and recommend/influence change to increase the effectiveness and success of campaigns. Creativity & Initiative: Demonstrates creativity & originality in their work and has the personal drive and initiative to bring about change and help drive the business forward. Being the face of Swiggy in the market and standing up for the values we believe in.
Key Skills Required P&L Understanding Market Research and Intelligence Customer Lifetime Value Business Development Data Logic Data Interpretation Data Visualization MS Excel Data Analysis Result Orientation Managing Relationships Conflict Management Problem-Solving "We are an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, colour, religion, sex, disability status, or any other characteristic protected by the law"

Posted 4 days ago

Apply

0 years

0 Lacs

Raipur, Chhattisgarh, India

On-site


Way of Working - Office/Field - Employees will work full-time from their base location
Role – Sales Manager I Job Responsibilities Serve as the primary point of contact for assigned client accounts, understanding their goals, needs, and challenges Develop account strategies to overcome the challenges and action plans to meet client objectives and maximize account growth as per the target Conduct regular F2F business reviews with clients, discussing performance, identifying areas for improvement, and presenting new opportunities Track and analyze account performance, sales data, account funnel, and market trends to identify opportunities and challenges Drive business growth for newly onboarded partners by working on their basic hygiene and health metrics Deliver incremental revenue from the assigned clients through monetization and commercial improvements Deliver incremental counter share for all assigned clients by strategic planning to dominate market share Maintain strong relationships with restaurant owners and deliver best-in-class restaurant NPS Collaborate with internal teams to coordinate and deliver exceptional service to clients, addressing any issues or concerns promptly Generate leads and proactively approach potential clients, presenting our value proposition and securing new partnerships Desired Candidate Graduate with excellent communication skills. Good working knowledge and experience of e-commerce activities and all online marketing channels Confident, pleasing and a go-getter personality Effective communication skills Attitude & Aptitude for Sales Should be a team player, working alongside people from all walks of life Analytical, good Excel skills Leadership and Influencing skills: Identifies, builds, and uses a wide network of contacts with people at all levels, internally and externally. Achieves good results through a well-planned approach Initiative & Flexibility: Recognizes the need to adapt to change & implement appropriate solutions.
Be able to identify opportunities and recommend/influence change to increase the effectiveness and success of campaigns Creativity & Initiative: Demonstrates creativity & originality in their work and has the personal drive and initiative to bring about change and help drive the business forward. Being the face of Swiggy in the market and standing up for the values we believe in Key Skills Required P&L Understanding Market Research and Intelligence Customer Lifetime Value Business Development Data Logic Data Interpretation Data Visualization MS Excel Data Analysis Result Orientation Managing Relationships Conflict Management Problem-Solving "We are an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, colour, religion, sex, disability status, or any other characteristic protected by the law"

Posted 4 days ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Brand & Advertising; Internal Communications; External Communications; Customer Experience Design; Research; Product Development; Solution Development; Pricing; Vision/Strategic Planning; Strategic Forecasting & Analysis; Marketing Operations; Customer Analytics To provide advice and guidance at regional/divisional level and effectively manage a team of professionals and/or subject matter experts. Responsibilities may include interfacing with Corporate-level management. Grade: 15 "Please note that the Job will close at 12am on the Posting Close date, so please submit your application prior to the Close Date" Judgement & Decision Making Skills; Planning & Organizing Skills; Influencing & Persuasion Skills; Presentation Skills; Leadership Skills What Your Main Responsibilities Are Subject Matter Expert and Consultations: Excellent commercial and pricing acumen - review deals with multiple lenses, make business sense of numbers, and articulate and converge the pricing story with the E2E customer journey at FedEx. Previous regional/global deal pricing experience will be preferred. Strong negotiation and communication skills - explain the rationale and defend price levels when reviewing/analysing Domestic and Global deals (across OpCos and Market Segments). Consult on and develop an efficient and effective approval framework, processes and innovative solutions to increase pricing analyst effectiveness, fast deal turnaround, and strong yield management - measuring and actions - across both strategic pricing and contract management (pricing implementation). Be agile; allocate and prioritize resources effectively to manage the global/regional pricing demands. Support analysis of Global Pricing deals (across FedEx Operating Companies and functions) Consult on and develop efficient and effective tools, processes and innovative solutions to increase analyst effectiveness, faster implementation & execution, and strong yield management – measuring and actions.
Allocates and prioritizes resources effectively to manage the US, global and regional demands Stakeholder Management and delivering business proposition: Needs to have strong global stakeholder management skills to engage with sales and marketing stakeholders across all levels. Good financial acumen in understanding P&Ls is essential Good understanding of pricing concepts and terminologies (yields/margins etc.) is critical. Engages with the EU and Intl pricing leadership as and when necessary in socializing and justifying the analyses and recommendations to all stakeholders. Good understanding of system management, E2E pricing systems, agreement generation, contract management, pricing audit & compliance, etc. Data Modelling To Enhance Business Value Network, understand and leverage the existing decision science & pricing analytics ecosystem to support data science and fact-based analysis to develop business justifications and make relevant marketing analysis/optimization recommendations to improve ROI, and pricing analysis/recommendations on global pricing policy and procedures to achieve the business plan. Functional/Product Support Creates or engineers integrative solutions for regional and central functions in all stages of cross-functional development, i.e. Ideation to Launch measurement. Ensure Compliance Requirements Are Adhered To Acts with fiscal responsibility in all aspects of the job, agreed parameters & in compliance with all relevant regulatory & legislative requirements of the division and FedEx standard processes.
What We Are Looking For Essential elements of the Job: Collaborating with Global stakeholders (Memphis and other regions) Creating a shared vision Aligning with US-global demand / creating a roadmap Consultative and delivery-oriented mindset Agility in planning and execution with a highly empowered and innovative team Leading the Mumbai team as well as virtual teams across groups effectively Coaching and mentoring the team to innovation and flawless execution Hiring and talent development Strong communication and transparent working style with stakeholders Problem solving and change agent with strong project management capabilities Personal awareness of culture and casting the cultural shadow of strong leadership to the team Strong understanding of E2E pricing systems, analytical tools, visualizations, web applications & online report development, system setup and testing, etc. Critical technical skills such as VBA, SQL, SAS, Power BI, etc. Good understanding of system/tool transformation capabilities and automation capabilities such as RPA, etc. Other Experience & Exposure (Good To Have) Advanced Analytics, Business Intelligence, Data Engineering & Optimization, Data Modelling, Tools & Application Development, Visualization and Report Development, Systems & Engineering, AI, ML, Data Science FedEx was built on a philosophy that puts people first, one we take seriously. We are an equal opportunity/affirmative action employer and we are committed to a diverse, equitable, and inclusive workforce in which we enforce fair treatment, and provide growth opportunities for everyone.
All qualified applicants will receive consideration for employment regardless of age, race, color, national origin, genetics, religion, gender, marital status, pregnancy (including childbirth or a related medical condition), physical or mental disability, or any other characteristic protected by applicable laws, regulations, and ordinances. Our Company FedEx is one of the world's largest express transportation companies and has consistently been selected as one of the top 10 World’s Most Admired Companies by "Fortune" magazine. Every day FedEx delivers for its customers with transportation and business solutions, serving more than 220 countries and territories around the globe. We can serve this global network due to our outstanding team of FedEx team members, who are tasked with making every FedEx experience outstanding. Our Philosophy The People-Service-Profit philosophy (P-S-P) describes the principles that govern every FedEx decision, policy, or activity. FedEx takes care of our people; they, in turn, deliver the impeccable service demanded by our customers, who reward us with the profitability necessary to secure our future. The essential element in making the People-Service-Profit philosophy such a positive force for the company is where we close the circle, and return these profits back into the business, and invest back in our people. Our success in the industry is attributed to our people. Through our P-S-P philosophy, we have a work environment that encourages team members to be innovative in delivering the highest possible quality of service to our customers. We care for their well-being, and value their contributions to the company. Our Culture Our culture is important for many reasons, and we intentionally bring it to life through our behaviors, actions, and activities in every part of the world. The FedEx culture and values have been a cornerstone of our success and growth since we began in the early 1970s.
While other companies can copy our systems, infrastructure, and processes, our culture makes us unique and is often a differentiating factor as we compete and grow in today’s global marketplace.

Posted 4 days ago

Apply

10.0 years

15 - 17 Lacs

India

Remote


Note: This is a remote role with occasional office visits. Candidates from Mumbai or Pune will be preferred. About The Company A fast-growing enterprise technology consultancy operating at the intersection of cloud computing, big-data engineering and advanced analytics. The team builds high-throughput, real-time data platforms that power AI, BI and digital products for Fortune 500 clients across finance, retail and healthcare. By combining Databricks Lakehouse architecture with modern DevOps practices, they unlock insight at petabyte scale while meeting stringent security and performance SLAs. Role & Responsibilities Architect end-to-end data pipelines (ingestion → transformation → consumption) using Databricks, Spark and cloud object storage. Design scalable data warehouses/marts that enable self-service analytics and ML workloads. Translate logical data models into physical schemas; own database design, partitioning and lifecycle management for cost-efficient performance. Implement, automate and monitor ETL/ELT workflows, ensuring reliability, observability and robust error handling. Tune Spark jobs and SQL queries, optimizing cluster configurations and indexing strategies to achieve sub-second response times. Provide production support and continuous improvement for existing data assets, championing best practices and mentoring peers. Skills & Qualifications Must-Have 6–10 years building production-grade data platforms, including 3+ years of hands-on Apache Spark/Databricks experience. Expert proficiency in PySpark, Python and advanced SQL, with a track record of performance-tuning distributed jobs. Demonstrated ability to model data warehouses/marts and orchestrate ETL/ELT pipelines with tools such as Airflow or dbt. Hands-on with at least one major cloud platform (AWS or Azure) and modern lakehouse/data-lake patterns. Strong problem-solving skills, DevOps mindset and commitment to code quality; comfortable mentoring fellow engineers.
Preferred Deep familiarity with the AWS analytics stack (Redshift, Glue, S3) or the broader Hadoop ecosystem. Bachelor’s or Master’s degree in Computer Science, Engineering or a related field. Experience building streaming pipelines (Kafka, Kinesis, Delta Live Tables) and real-time analytics solutions. Exposure to ML feature stores, MLOps workflows and data-governance/compliance frameworks. Relevant professional certifications (Databricks, AWS, Azure) or notable open-source contributions. Benefits & Culture Highlights Remote-first & flexible hours with 25+ PTO days and comprehensive health cover. Annual training budget & certification sponsorship (Databricks, AWS, Azure) to fuel continuous learning. Inclusive, impact-focused culture where engineers shape the technical roadmap and mentor a vibrant data community. Skills: data modeling, big data technologies, team leadership, agile methodologies, performance tuning, data, AWS, Airflow

Posted 4 days ago

Apply

10.0 years

15 - 17 Lacs

India

Remote

Linkedin logo

Note: This is a remote role with occasional office visits. Candidates from Mumbai or Pune will be preferred About The Company A fast-growing enterprise technology consultancy operating at the intersection of cloud computing, big-data engineering and advanced analytics . The team builds high-throughput, real-time data platforms that power AI, BI and digital products for Fortune 500 clients across finance, retail and healthcare. By combining Databricks Lakehouse architecture with modern DevOps practices, they unlock insight at petabyte scale while meeting stringent security and performance SLAs. Role & Responsibilities Architect end-to-end data pipelines (ingestion → transformation → consumption) using Databricks, Spark and cloud object storage. Design scalable data warehouses/marts that enable self-service analytics and ML workloads. Translate logical data models into physical schemas; own database design, partitioning and lifecycle management for cost-efficient performance. Implement, automate and monitor ETL/ELT workflows, ensuring reliability, observability and robust error handling. Tune Spark jobs and SQL queries, optimizing cluster configurations and indexing strategies to achieve sub-second response times. Provide production support and continuous improvement for existing data assets, championing best practices and mentoring peers. Skills & Qualifications Must-Have 6–10 years building production-grade data platforms, including 3 years+ hands-on Apache Spark/Databricks experience. Expert proficiency in PySpark, Python and advanced SQL, with a track record of performance-tuning distributed jobs. Demonstrated ability to model data warehouses/marts and orchestrate ETL/ELT pipelines with tools such as Airflow or dbt. Hands-on with at least one major cloud platform (AWS or Azure) and modern lakehouse / data-lake patterns. Strong problem-solving skills, DevOps mindset and commitment to code quality; comfortable mentoring fellow engineers. 
Preferred Deep familiarity with the AWS analytics stack (Redshift, Glue, S3) or the broader Hadoop ecosystem. Bachelor’s or Master’s degree in Computer Science, Engineering or a related field. Experience building streaming pipelines (Kafka, Kinesis, Delta Live Tables) and real-time analytics solutions. Exposure to ML feature stores, MLOps workflows and data-governance/compliance frameworks. Relevant professional certifications (Databricks, AWS, Azure) or notable open-source contributions. Benefits & Culture Highlights Remote-first & flexible hours with 25+ PTO days and comprehensive health cover. Annual training budget & certification sponsorship (Databricks, AWS, Azure) to fuel continuous learning. Inclusive, impact-focused culture where engineers shape the technical roadmap and mentor a vibrant data community Skills: data modeling,big data technologies,team leadership,agile methodologies,performance tuning,data,aws,airflow

Posted 4 days ago

Apply

10.0 years

15 - 17 Lacs

India

Remote

Linkedin logo

Note: This is a remote role with occasional office visits. Candidates from Mumbai or Pune will be preferred About The Company A fast-growing enterprise technology consultancy operating at the intersection of cloud computing, big-data engineering and advanced analytics . The team builds high-throughput, real-time data platforms that power AI, BI and digital products for Fortune 500 clients across finance, retail and healthcare. By combining Databricks Lakehouse architecture with modern DevOps practices, they unlock insight at petabyte scale while meeting stringent security and performance SLAs. Role & Responsibilities Architect end-to-end data pipelines (ingestion → transformation → consumption) using Databricks, Spark and cloud object storage. Design scalable data warehouses/marts that enable self-service analytics and ML workloads. Translate logical data models into physical schemas; own database design, partitioning and lifecycle management for cost-efficient performance. Implement, automate and monitor ETL/ELT workflows, ensuring reliability, observability and robust error handling. Tune Spark jobs and SQL queries, optimizing cluster configurations and indexing strategies to achieve sub-second response times. Provide production support and continuous improvement for existing data assets, championing best practices and mentoring peers. Skills & Qualifications Must-Have 6–10 years building production-grade data platforms, including 3 years+ hands-on Apache Spark/Databricks experience. Expert proficiency in PySpark, Python and advanced SQL, with a track record of performance-tuning distributed jobs. Demonstrated ability to model data warehouses/marts and orchestrate ETL/ELT pipelines with tools such as Airflow or dbt. Hands-on with at least one major cloud platform (AWS or Azure) and modern lakehouse / data-lake patterns. Strong problem-solving skills, DevOps mindset and commitment to code quality; comfortable mentoring fellow engineers. 
Preferred Deep familiarity with the AWS analytics stack (Redshift, Glue, S3) or the broader Hadoop ecosystem. Bachelor’s or Master’s degree in Computer Science, Engineering or a related field. Experience building streaming pipelines (Kafka, Kinesis, Delta Live Tables) and real-time analytics solutions. Exposure to ML feature stores, MLOps workflows and data-governance/compliance frameworks. Relevant professional certifications (Databricks, AWS, Azure) or notable open-source contributions. Benefits & Culture Highlights Remote-first & flexible hours with 25+ PTO days and comprehensive health cover. Annual training budget & certification sponsorship (Databricks, AWS, Azure) to fuel continuous learning. Inclusive, impact-focused culture where engineers shape the technical roadmap and mentor a vibrant data community Skills: data modeling,big data technologies,team leadership,agile methodologies,performance tuning,data,aws,airflow

Posted 4 days ago

Apply

10.0 years

15 - 17 Lacs

India

Remote

Linkedin logo

Note: This is a remote role with occasional office visits. Candidates from Mumbai or Pune will be preferred.

About The Company
A fast-growing enterprise technology consultancy operating at the intersection of cloud computing, big-data engineering, and advanced analytics. The team builds high-throughput, real-time data platforms that power AI, BI, and digital products for Fortune 500 clients across finance, retail, and healthcare. By combining Databricks Lakehouse architecture with modern DevOps practices, they unlock insight at petabyte scale while meeting stringent security and performance SLAs.

Role & Responsibilities
Architect end-to-end data pipelines (ingestion → transformation → consumption) using Databricks, Spark, and cloud object storage.
Design scalable data warehouses/marts that enable self-service analytics and ML workloads.
Translate logical data models into physical schemas; own database design, partitioning, and lifecycle management for cost-efficient performance.
Implement, automate, and monitor ETL/ELT workflows, ensuring reliability, observability, and robust error handling.
Tune Spark jobs and SQL queries, optimizing cluster configurations and indexing strategies to achieve sub-second response times.
Provide production support and continuous improvement for existing data assets, championing best practices and mentoring peers.

Skills & Qualifications

Must-Have
6–10 years building production-grade data platforms, including 3+ years of hands-on Apache Spark/Databricks experience.
Expert proficiency in PySpark, Python, and advanced SQL, with a track record of performance-tuning distributed jobs.
Demonstrated ability to model data warehouses/marts and orchestrate ETL/ELT pipelines with tools such as Airflow or dbt.
Hands-on with at least one major cloud platform (AWS or Azure) and modern lakehouse/data-lake patterns.
Strong problem-solving skills, a DevOps mindset, and a commitment to code quality; comfortable mentoring fellow engineers.

Preferred
Deep familiarity with the AWS analytics stack (Redshift, Glue, S3) or the broader Hadoop ecosystem.
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
Experience building streaming pipelines (Kafka, Kinesis, Delta Live Tables) and real-time analytics solutions.
Exposure to ML feature stores, MLOps workflows, and data-governance/compliance frameworks.
Relevant professional certifications (Databricks, AWS, Azure) or notable open-source contributions.

Benefits & Culture Highlights
Remote-first & flexible hours with 25+ PTO days and comprehensive health cover.
Annual training budget & certification sponsorship (Databricks, AWS, Azure) to fuel continuous learning.
Inclusive, impact-focused culture where engineers shape the technical roadmap and mentor a vibrant data community.

Skills: data modeling, big data technologies, team leadership, agile methodologies, performance tuning, data, aws, airflow
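The ingestion → transformation → consumption pattern this role centers on can be sketched in plain Python. This is a hedged, library-free illustration only: the function names, field names, and sample records are hypothetical, and a real implementation would run on Spark/Databricks against cloud object storage as the posting requires.

```python
# Minimal ETL sketch: ingestion -> transformation -> consumption.
# Illustrative stand-in for a Spark/Databricks pipeline; all names are hypothetical.

def ingest(raw_rows):
    """Ingestion: collect raw records, skipping malformed ones (robust error handling)."""
    valid = []
    for row in raw_rows:
        if isinstance(row, dict) and "order_id" in row and "amount" in row:
            valid.append(row)
    return valid

def transform(rows):
    """Transformation: normalize types and fill derived/default fields."""
    out = []
    for r in rows:
        out.append({
            "order_id": str(r["order_id"]),
            "amount": float(r["amount"]),
            "region": r.get("region", "unknown").lower(),
        })
    return out

def consume(rows):
    """Consumption: aggregate into a mart-style view (total amount per region)."""
    totals = {}
    for r in rows:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]
    return totals

raw = [
    {"order_id": 1, "amount": "10.5", "region": "APAC"},
    {"order_id": 2, "amount": 4.5, "region": "apac"},
    {"bad": "record"},                    # malformed -> dropped at ingestion
    {"order_id": 3, "amount": 7.0},       # region defaults to "unknown"
]
marts = consume(transform(ingest(raw)))
print(marts)  # {'apac': 15.0, 'unknown': 7.0}
```

In a real pipeline each stage would be a separate, independently monitored job (e.g. an Airflow task or Databricks workflow step), which is what makes the reliability and observability requirements above tractable.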

Posted 4 days ago



0 years

10 - 12 Lacs

India

Remote


Note: This is a remote role with occasional office visits. Candidates from Mumbai or Pune will be preferred.

About The Company
Operating at the forefront of cloud analytics, big-data platform engineering, and enterprise AI, our teams design mission-critical data infrastructure for global clients across finance, retail, telecom, and emerging tech. We build distributed ingestion pipelines on Azure & Databricks, unlock real-time insights with Spark/Kafka, and automate delivery through modern DevOps so businesses can act on high-fidelity data, fast.

Role & Responsibilities
Engineer robust data pipelines: build scalable batch & streaming workflows with Apache Spark, Kafka, and Azure Data Factory/Databricks.
Implement Delta Lakehouse layers: design the bronze-silver-gold medallion architecture to guarantee data quality and lineage.
Automate CI/CD for ingestion: create Git-based workflows, containerized builds, and automated testing to ship reliable code.
Craft clean, test-driven Python: develop modular PySpark/Pandas services, enforce SOLID principles, and maintain git-versioned repos.
Optimize performance & reliability: profile jobs, tune clusters, and ensure SLAs for throughput, latency, and cost.
Collaborate in Agile squads: partner with engineers, analysts, and consultants to translate business questions into data solutions.

Skills & Qualifications

Must-Have
1–2 years hands-on with Apache Spark or Kafka and Python (PySpark/Pandas/Polars).
Experience building Delta Lake / medallion architectures on Azure or Databricks.
Proven ability to design event-driven pipelines and write unit/integration tests.
Git-centric workflow knowledge plus CI/CD tooling (GitHub Actions, Azure DevOps).

Preferred
Exposure to SQL/relational & NoSQL stores and hybrid lakehouse integrations.
STEM/computer-science degree or equivalent foundation in algorithms and OOP.

Benefits & Culture Highlights
Flexible, remote-first teams: outcome-driven culture with quarterly hackathons and dedicated learning budgets.
Growth runway: clear promotion paths from Associate to Senior Engineer, backed by certified Azure & Databricks training.
Inclusive collaboration: small, empowered Agile squads that value knowledge-sharing, mentorship, and transparent feedback.

Skills: modern javascript, cloud, vector databases, angular, pipelines, ci, containerization, ml, aws, langchain, shell scripting, mlops, performance testing, knowledge-graph design (RDF/OWL/SPARQL), data, feature engineering, ci/cd, python, aws services (sagemaker, bedrock, lambda), synthetic-data augmentation, generative ai, data-cataloging, metadata management, lineage, data governance
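The bronze-silver-gold medallion layering named in the responsibilities above can be sketched with plain Python data structures. This is an illustrative, hedged sketch only: in practice each layer would be a Delta Lake table on Azure/Databricks, and the record fields (`event_id`, `device`, `value`) are hypothetical.

```python
# Medallion-architecture sketch: bronze (raw) -> silver (clean) -> gold (aggregated).
# Plain-Python stand-in for Delta Lake tables; names and data are hypothetical.

def to_silver(bronze):
    """Silver layer: validate, deduplicate by event_id, and normalize raw bronze records."""
    seen = set()
    silver = []
    for rec in bronze:
        event_id = rec.get("event_id")
        if event_id is None or event_id in seen or rec.get("value") is None:
            continue  # drop malformed or duplicate rows, preserving data quality
        seen.add(event_id)
        silver.append({"event_id": event_id,
                       "device": str(rec.get("device", "unknown")),
                       "value": float(rec["value"])})
    return silver

def to_gold(silver):
    """Gold layer: business-level aggregate (average value per device)."""
    by_device = {}
    for rec in silver:
        by_device.setdefault(rec["device"], []).append(rec["value"])
    return {device: sum(vals) / len(vals) for device, vals in by_device.items()}

bronze = [
    {"event_id": "a1", "device": "sensor-1", "value": 10},
    {"event_id": "a1", "device": "sensor-1", "value": 10},   # duplicate -> dropped
    {"event_id": "a2", "device": "sensor-1", "value": 20},
    {"event_id": "a3", "device": "sensor-2", "value": None}, # malformed -> dropped
    {"event_id": "a4", "device": "sensor-2", "value": 5},
]
gold = to_gold(to_silver(bronze))
print(gold)  # {'sensor-1': 15.0, 'sensor-2': 5.0}
```

The design point the layering buys is lineage: bronze is kept immutable, so silver and gold can always be rebuilt, and quality rules live in one place between the layers.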

Posted 4 days ago

