
3827 Databricks Jobs - Page 7

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Overview

We are PepsiCo. PepsiCo is one of the world's leading food and beverage companies, with more than $79 billion in net revenue and a global portfolio of diverse and beloved brands. We have a complementary food and beverage portfolio that includes 22 brands that each generate more than $1 billion in annual retail sales. PepsiCo's products are sold in more than 200 countries and territories around the world. PepsiCo's strength is its people. We are over 250,000 game changers, mountain movers and history makers, located around the world and united by a shared set of values and goals. We believe that acting ethically and responsibly is not only the right thing to do, but also the right thing to do for our business. At PepsiCo, we aim to deliver top-tier financial performance over the long term by integrating sustainability into our business strategy, leaving a positive imprint on society and the environment. We call this Winning with pep+ (PepsiCo Positive). For more information on PepsiCo and the opportunities it holds, visit www.pepsico.com.

PepsiCo Data Analytics & AI Overview

With data deeply embedded in our DNA, PepsiCo Data, Analytics and AI (DA&AI) transforms data into consumer delight. We build and organize business-ready data that allows PepsiCo's leaders to solve their problems with the highest degree of confidence. Our platform of data products and services ensures data is activated at scale. This enables new revenue streams, deeper partner relationships, new consumer experiences, and innovation across the enterprise.

The Data Science Pillar in DA&AI is the organization to which Data Scientists and ML Engineers report within the broader D+A organization. DS also leads, facilitates, and collaborates with the larger DS community in PepsiCo; provides the talent for the development and support of DS components and their life cycle within DA&AI products; and supports "pre-engagement" activities as requested and validated by the DA&AI prioritization framework.

Data Scientist - Gurugram and Hyderabad

The role will work on developing Machine Learning (ML) and Artificial Intelligence (AI) projects. The specific scope of this role is to develop ML solutions in support of ML/AI projects using big analytics toolsets in a CI/CD environment. Analytics toolsets may include DS tools, Spark, Databricks, and other technologies offered by Microsoft Azure or open-source toolsets. This role will also help automate the end-to-end cycle with Machine Learning Services and Pipelines.

Responsibilities
- Deliver key Advanced Analytics/Data Science projects within time and budget, particularly around DevOps/MLOps and the Machine Learning models in scope.
- Collaborate with data engineers and ML engineers to understand data and models and leverage various advanced analytics capabilities.
- Ensure on-time, on-budget delivery that satisfies project requirements while adhering to enterprise architecture standards.
- Use big data technologies to help process data and build scaled data pipelines (batch to real time).
- Automate the end-to-end ML lifecycle with Azure Machine Learning and Azure/AWS/GCP pipelines (see the sketch after this listing).
- Set up cloud alerts, monitors, dashboards, and logging, and troubleshoot machine learning infrastructure.
- Automate ML model deployments.

Qualifications
- Minimum 3 years of hands-on work experience in data science / machine learning.
- Minimum 3 years of SQL experience.
- Experience in DevOps and Machine Learning (ML), with hands-on experience with one or more cloud service providers.
- BE/BS in Computer Science, Math, Physics, or other technical fields.
- Data science: hands-on experience and strong knowledge of building supervised and unsupervised machine learning models.
- Programming skills: hands-on experience in statistical programming languages like Python and database query languages like SQL.
- Statistics: good applied statistical skills, including knowledge of statistical tests, distributions, regression, and maximum likelihood estimators.
- Cloud: experience in Databricks and ADF is desirable.
- Familiarity with Spark, Hive, and Pig is an added advantage.
- Model deployment experience is a plus.
- Experience with version control systems like GitHub and CI/CD tools.
- Experience in exploratory data analysis.
- Knowledge of MLOps/DevOps and deploying ML models is required.
- Experience using MLflow, Kubeflow, etc. is preferred.
- Experience executing and contributing to MLOps automation infrastructure is good to have.
- Exceptional analytical and problem-solving skills.
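To make the MLOps requirement concrete, here is a minimal sketch of experiment tracking and model registration with MLflow, which the qualifications name explicitly. It assumes a scikit-learn classifier; the experiment and registered-model names are hypothetical placeholders, not PepsiCo's.

```python
# Minimal MLflow lifecycle sketch: log params/metrics and register a model.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

mlflow.set_experiment("demand-forecast-poc")  # hypothetical experiment name

with mlflow.start_run():
    model = RandomForestClassifier(n_estimators=100, random_state=42)
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    mlflow.log_param("n_estimators", 100)
    mlflow.log_metric("accuracy", acc)
    # Registering the model makes it visible to downstream deployment pipelines.
    mlflow.sklearn.log_model(model, "model",
                             registered_model_name="demand-forecast")
```

Runs logged this way can then be promoted through CI/CD stages, which is one common way the "automate the end-to-end ML lifecycle" responsibility is implemented in practice.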

Posted 2 days ago

Apply

7.0 - 10.0 years

20 - 25 Lacs

Pune

Work from Office

Source: Naukri

Role & Responsibilities
- Architect, build, and tune Snowflake data warehouses and ELT pipelines (SQL, Streams, Tasks, UDFs, stored procedures) to meet complex commercial-analytics workloads (see the sketch after this listing).
- Integrate diverse pharma data sources (Veeva, Salesforce, IQVIA, Symphony, RWD, patient-services feeds) via Fivetran, ADF, or Python-based frameworks, ensuring end-to-end data quality.
- Establish robust data models (star, snowflake, Data Vault) optimized for sales reporting, market-share analytics, and AI/ML use cases.
- Drive governance and compliance (HIPAA, GDPR, GxP) through fine-grained access controls, masking, lineage, and metadata management.
- Lead code reviews, mentor engineers, and resolve performance bottlenecks while right-sizing compute for cost efficiency.
- Partner with business stakeholders to translate commercial objectives into scalable data solutions and actionable insights.

Skills & Qualifications

Must-Have
- 7+ years of data engineering/warehousing experience, including 4+ years of hands-on Snowflake design and development.
- Expert-level SQL plus strong data-modeling (Dimensional, Data Vault) and ETL/ELT optimisation skills.
- Proficiency in Python (or similar) for automation, API integrations, and orchestration.
- Proven governance/security acumen within regulated industries (HIPAA, GDPR, PII).
- Bachelor's in Computer Science, Engineering, or Information Systems (Master's preferred).
- Strong client-facing communication and problem-solving ability in fast-paced, agile environments.

Preferred
- Direct experience with pharma commercial datasets (sales, CRM, claims, MDM, adherence KPIs).
- Cloud-platform depth (AWS, Azure, or GCP) and familiarity with tools such as Matillion/dbt/Airflow and Git.
- Snowflake certifications (SnowPro Core/Advanced) plus Tableau, Power BI, or Qlik connectivity know-how.
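A minimal sketch of the Streams and Tasks pattern this listing references, using the snowflake-connector-python package. The account credentials, table, schema, and task names are illustrative assumptions, not details from the posting.

```python
# Change-data-capture with a Snowflake Stream, merged on a schedule by a Task.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # hypothetical
    user="etl_user",        # hypothetical
    password="***",
    warehouse="TRANSFORM_WH",
    database="COMMERCIAL",
    schema="RAW",
)
cur = conn.cursor()

# Capture row-level changes on a raw sales table.
cur.execute("CREATE OR REPLACE STREAM RAW_SALES_STREAM ON TABLE RAW_SALES")

# A task that folds those changes into a reporting table every 15 minutes,
# but only when the stream actually contains data.
cur.execute("""
    CREATE OR REPLACE TASK MERGE_SALES_TASK
      WAREHOUSE = TRANSFORM_WH
      SCHEDULE = '15 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('RAW_SALES_STREAM')
    AS
      INSERT INTO ANALYTICS.SALES_FACT
      SELECT * FROM RAW_SALES_STREAM WHERE METADATA$ACTION = 'INSERT'
""")
cur.execute("ALTER TASK MERGE_SALES_TASK RESUME")  # tasks are created suspended
conn.close()
```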

Posted 2 days ago

Apply

10.0 - 13.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Source: LinkedIn

Years of Experience: 10-13 years
Location: Navi Mumbai (local or relocating candidates)
Notice Period: immediate joiners preferred, 30 days maximum
Budget: 60 LPA

Job Description:

We are looking for an experienced AI/ML Lead to spearhead the design, development, and deployment of intelligent solutions across cloud environments. The ideal candidate will have a strong background in deep learning, big data processing, and AI integrations, with proven experience applying OCR and prompt engineering in real-world enterprise use cases. This role requires both technical depth and team leadership to deliver scalable, production-grade models and collaborate across business functions.

Key Responsibilities:
• Lead AI/ML solution design and delivery across cloud platforms (Azure ML, AWS SageMaker).
• Build, train, and deploy advanced models using deep learning frameworks like TensorFlow and PyTorch.
• Apply OCR techniques (with CNNs) for document image understanding and automation (see the sketch after this listing).
• Drive prompt engineering for AI integrations (e.g., Copilot, OpenAI, IBM Watson).
• Optimize model pipelines for performance and scalability within Azure Synapse and cloud-based data warehouses.
• Collaborate across departments to apply AI in business intelligence and enterprise data architecture use cases.
• Manage and mentor a team of junior AI engineers and analysts.
• Work on structured and unstructured data pipelines using tools like Databricks, Spark, and Dask.
• Utilize and integrate Amazon EC2, ECR, S3, and Redshift, and manage model training and deployment in cloud environments.

Must-Have Skills:
• Azure ML, AWS (SageMaker, CodeGuru, EC2, ECR, S3)
• OCR with CNNs
• Deep learning: TensorFlow, PyTorch
• Prompt engineering with OpenAI/Copilot
• Big data: Databricks, Spark, Dask
• Business intelligence and AI integrations
• Team leadership and client communication
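A minimal sketch of the "OCR with CNN" skill the posting lists: a small PyTorch network that classifies single-character image crops, the building block of classical OCR pipelines. The architecture, class count, and input size are illustrative assumptions, not the employer's model.

```python
# Tiny convolutional character classifier for 32x32 grayscale crops.
import torch
import torch.nn as nn

class CharCNN(nn.Module):
    def __init__(self, num_classes: int = 36):  # e.g. digits 0-9 plus A-Z
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        # After two 2x pools, a 32x32 input becomes 8x8 with 64 channels.
        self.classifier = nn.Linear(64 * 8 * 8, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

# One character crop in, class logits out.
logits = CharCNN()(torch.randn(1, 1, 32, 32))
print(logits.shape)  # torch.Size([1, 36])
```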

Posted 2 days ago

Apply

5.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site

Source: LinkedIn

The Opportunity

Under general supervision, provide analysis, insight, and recommendations by synthesizing information for various product categories and geographies, using competitive analyses, analytical modeling, and knowledge of the product portfolio and customer programs. Manage the pricing process for the sales team, including assessing, approving, and loading all pricing requests. Review, research, and analyze pricing to make recommendations on price enhancements, SKU rationalization, and margin opportunities. Process quotation requests for stock standard items and special-order (customer-specific) items.

As a Senior Data Scientist, you will leverage advanced statistical methods and machine learning techniques to analyze complex datasets, generate insights, and solve business problems. You will be responsible for designing, developing, and deploying supervised/unsupervised ML models, maintaining ML pipelines, performing model maintenance activities, and collaborating with stakeholders to ensure the data strategy aligns with business objectives.

What We're Looking For

The ideal candidate will:
- Collect, clean, and analyze large datasets to generate insights and recommendations.
- Perform data analysis and provide decision points based on it.
- Develop and implement machine learning models to solve business problems.
- Maintain ML pipelines and fix production issues within the stipulated time.
- Work on model maintenance and improve model performance when necessary.
- Create reusable utilities for data processing, feature engineering, etc.
- Conduct hypothesis testing and statistical analyses to understand data trends.
- Work with junior Data Scientists: perform technical reviews and provide technical support.
- Communicate findings to non-technical stakeholders through reports and visualizations.
- Collaborate with engineers, product teams, and business stakeholders to translate business goals into data-driven solutions.

Certifications (Good to Have)
- ML/AI certifications
- Cloud certifications such as AWS, Azure, or GCP
- Databricks or Snowflake certification

Experience
- Bachelor's or Master's degree in a quantitative field such as Data Science, Statistics, Computer Science, or Engineering.
- 5+ years of experience with a product development company as a senior ML/AI engineer: model development, model deployment, and model maintenance.
- 5+ years of experience with machine learning and deep learning libraries (e.g., scikit-learn, TensorFlow, PyTorch), having developed, deployed, and maintained 5+ projects in production (see the pipeline sketch after this listing).
- 5+ years of experience and proficiency in programming languages such as Python, PySpark, and Java.
- Strong hands-on experience in SQL or a data manipulation/query language.
- Worked on production-grade/POC projects, including generative AI and LLMs.
- Strong analytical skills and proficiency in statistical analysis and data visualization.
- Good communication and comprehension skills to explain ML algorithms to stakeholders in a non-technical manner.
- Performed technical reviews and provided technical support to junior data scientists.
- 3+ years of experience with version control tools such as Git, GitHub, and GitLab.

How You Will Thrive and Create an Impact

With your analytical mindset and critical thinking, you will have the opportunity to participate in project development activities and suggest data-driven, AI-based solutions that will be the foundation of the project's success. You will have the opportunity to lead ML projects from solution design through deployment to production. With your proficient coding skills, you will develop robust ML pipelines that are modular, scalable, and reusable. You will interact with business stakeholders and get first-hand feedback on ML model deliveries. You will work with great minds who focus on delivering quality products.

Disclaimer

The above statements are intended to describe the general nature and level of work being performed by employees assigned to this classification. They are not intended to be construed as an exhaustive list of all responsibilities, duties, and skills required of employees assigned to this position. Avantor is proud to be an equal opportunity employer.

Why Avantor?

Dare to go further in your career. Join our global team of 14,000+ associates whose passion for discovery and determination to overcome challenges relentlessly advances life-changing science. The work we do changes people's lives for the better. It brings new patient treatments and therapies to market, giving a cancer survivor the chance to walk his daughter down the aisle. It enables medical devices that help a little boy hear his mom's voice for the first time. Outcomes such as these create unlimited opportunities for you to contribute your talents, learn new skills, and grow your career at Avantor. We are committed to helping you on this journey through our diverse, equitable, and inclusive culture, which includes learning experiences to support your career growth and success. At Avantor, dare to go further and see how the impact of your contributions sets science in motion to create a better world. Apply today!

EEO Statement

We are an Equal Employment/Affirmative Action employer and VEVRAA Federal Contractor. We do not discriminate in hiring on the basis of sex, gender identity, sexual orientation, race, color, religious creed, national origin, physical or mental disability, protected Veteran status, or any other characteristic protected by federal, state/province, or local law. If you need a reasonable accommodation for any part of the employment process, please contact us by email at recruiting@avantorsciences.com and let us know the nature of your request and your contact information. Requests for accommodation will be considered on a case-by-case basis. Please note that only inquiries concerning a request for reasonable accommodation will be responded to from this email address.

3rd Party Non-solicitation Policy

By submitting candidates without having been formally assigned and contracted for a specific job requisition by Avantor, or by failing to comply with the Avantor recruitment process, you forfeit any fee on the submitted candidates, regardless of your usual terms and conditions. Avantor works with a preferred supplier list and will take the initiative to engage recruitment agencies based on its needs; it will not accept any form of solicitation.
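A minimal sketch of the modular, reusable ML pipeline this posting describes, using scikit-learn (named in the requirements). The feature names and estimator choice are hypothetical.

```python
# Bundle preprocessing and model so train/serve logic stays identical,
# which is what makes the pipeline reusable across projects.
from sklearn.compose import ColumnTransformer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

numeric = ["order_value", "days_since_last_purchase"]  # hypothetical features
categorical = ["region", "product_category"]

preprocess = ColumnTransformer([
    ("num", StandardScaler(), numeric),
    ("cat", OneHotEncoder(handle_unknown="ignore"), categorical),
])

pipeline = Pipeline([
    ("preprocess", preprocess),
    ("model", LogisticRegression(max_iter=1000)),
])
# Usage: pipeline.fit(X_train, y_train); pipeline.predict(X_test)
```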

Posted 2 days ago

Apply

7.0 - 12.0 years

15 - 25 Lacs

Coimbatore, Bengaluru

Hybrid

Source: Naukri

Job Purpose:

Assist with the development and maintenance of software solutions for new and existing projects. Deep technical skills are required, along with an ability to understand how all pieces fit together and are validated in a complex, distributed system.

Duties and Responsibilities:
1. Develop software solutions by studying information needs, conferring with users, studying systems flow, data usage, and work processes, investigating problem areas, and following the software development lifecycle.
2. Determine operational feasibility by evaluating analysis, problem definition, requirements, solution development, and proposed solutions.
3. Improve operations by conducting systems analysis and recommending changes in guidelines and procedures.
4. Update job knowledge by studying state-of-the-art development tools, programming techniques, and computing equipment; participating in educational opportunities; reading professional publications; maintaining personal networks; and participating in professional organizations.
5. Actively participate hands-on in product development and roadmap definition.
6. Develop prototypes to prove a solution's business value against the product requirements.
7. Represent the technical viewpoint for various technologies during strategic planning.
8. Respond to stakeholder requirements from a requirements and technology standpoint, and discuss concepts, solutions, technical feasibility, and risks with them.
9. Support developers by providing advice, coaching, and educational opportunities.
10. Participate in knowledge-sharing code reviews.
11. Adhere to the Code of Conduct and be familiar with all compliance policies and procedures.

Experience Required:
1. Seven-plus years of experience in software development.
2. Healthcare experience / RCM applications knowledge / project experience preferred.
3. Experience working with a global team in a team-oriented, collaborative environment.
4. Experience with agile development.
5. Immediate joiners / short-notice-period candidates preferred.

Required skills and knowledge:
1. Advanced coding skills in C#, .NET.
2. Strong working knowledge of SQL, REST, and Angular / React / Node.
3. Experience in cloud-native development using Azure / AWS / GCP, containerization, GenAI, agentic AI, and ML algorithms.
4. Expertise in Git, CI/CD, Terraform, containerization, ADFS, MS Entra ID, and Agile methodologies.
5. Knowledge of application logging, security, authentication, and authorization.
6. Object-oriented programming and design principles.

Preferred skills and knowledge:
1. Knowledge of modern data technologies, e.g., Delta Lake, Azure Data Lake, Blob Storage, NoSQL databases, Databricks, and PySpark / Scala / Spark SQL (see the sketch after this listing).
2. Ability to solve problems quickly and completely.
3. Ability to multi-task and stay organized in a dynamic work environment.
4. Possesses a positive attitude and the ability to think outside the box.
5. Understands and anticipates possible failures in a growing system and knows how to prevent them.
6. Utilizes source control with multiple concurrent branches.
7. Must possess hands-on technical skills, along with an ability to work independently or under limited supervision and guidance.
8. Ability to write routine reports and correspondence.
9. Ability to communicate effectively, verbally and in writing.
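A minimal sketch of the Delta Lake / PySpark skills listed under "Preferred skills and knowledge": writing and reading a Delta table. The path, schema, and claims example are illustrative assumptions; on Databricks the Delta configuration shown here is already preconfigured.

```python
# Write a small DataFrame as a Delta table, then read it back.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("delta-demo")
         # Assumes the Delta Lake package is on the classpath (as on Databricks).
         .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
         .config("spark.sql.catalog.spark_catalog",
                 "org.apache.spark.sql.delta.catalog.DeltaCatalog")
         .getOrCreate())

claims = spark.createDataFrame(
    [(1, "submitted", 120.0), (2, "paid", 450.5)],
    ["claim_id", "status", "amount"],
)

# Delta adds ACID transactions, upserts, and time travel on top of Parquet.
claims.write.format("delta").mode("overwrite").save("/tmp/claims_delta")
spark.read.format("delta").load("/tmp/claims_delta").show()
```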

Posted 2 days ago

Apply

2.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site

Source: LinkedIn

The Opportunity

Works independently under close supervision to provide analysis, insight, and recommendations by synthesizing information for various product categories and geographies, using competitive analyses, analytical modeling, and knowledge of the product portfolio and customer programs. Manages the pricing process for the sales team, including assessing, approving, and loading all pricing requests. Reviews, researches, and analyzes pricing to make recommendations on price enhancements, SKU rationalization, and margin opportunities. Processes quotation requests for stock standard items and special-order (customer-specific) items.

As a Data Scientist, you will leverage advanced statistical methods and machine learning techniques to analyze complex datasets, generate insights, and solve business problems. You will be responsible for developing and deploying ML models, performing deep-dive analyses, and collaborating with stakeholders to ensure the data strategy aligns with business objectives.

What We're Looking For

The ideal candidate will:
- Collect, clean, and analyze large datasets to generate insights and recommendations.
- Develop and implement machine learning models to solve business problems.
- Maintain ML pipelines and fix production issues within the stipulated time.
- Work on model maintenance and improve model performance when necessary.
- Conduct hypothesis testing and statistical analyses to understand data trends.
- Communicate findings to non-technical stakeholders through reports and visualizations.
- Collaborate with engineers, product teams, and business stakeholders to translate business goals into data-driven solutions.

Certifications (Good to Have)
- ML/AI certifications
- Cloud certifications such as AWS, Azure, or GCP
- Databricks or Snowflake certification

Experience
- Bachelor's or Master's degree in a quantitative field such as Data Science, Statistics, Computer Science, or Engineering.
- 2+ years of experience with a product development company as an ML/AI engineer.
- 2+ years of experience and proficiency in programming languages such as Python, PySpark, and Java.
- Strong hands-on experience in SQL or a data manipulation/query language.
- 2+ years of experience with machine learning libraries (e.g., scikit-learn, TensorFlow, PyTorch).
- Good knowledge of NLP, transformers, LLMs, and generative AI.
- Strong analytical skills and proficiency in statistical analysis and data visualization.
- Good communication and comprehension skills to explain ML algorithms to stakeholders in a non-technical manner.
- Good team player with the ability to work in cross-functional teams.

How You Will Thrive and Create an Impact

With your analytical mindset and critical thinking, you will have the opportunity to participate in project development activities and suggest data-driven, AI-based solutions that will be the foundation of the project's success. With your proficient coding skills, you will develop robust ML pipelines that are modular, scalable, and reusable. You will work with great minds who focus on delivering quality products.

Disclaimer

The above statements are intended to describe the general nature and level of work being performed by employees assigned to this classification. They are not intended to be construed as an exhaustive list of all responsibilities, duties, and skills required of employees assigned to this position. Avantor is proud to be an equal opportunity employer.

Why Avantor?

Dare to go further in your career. Join our global team of 14,000+ associates whose passion for discovery and determination to overcome challenges relentlessly advances life-changing science. The work we do changes people's lives for the better. It brings new patient treatments and therapies to market, giving a cancer survivor the chance to walk his daughter down the aisle. It enables medical devices that help a little boy hear his mom's voice for the first time. Outcomes such as these create unlimited opportunities for you to contribute your talents, learn new skills, and grow your career at Avantor. We are committed to helping you on this journey through our diverse, equitable, and inclusive culture, which includes learning experiences to support your career growth and success. At Avantor, dare to go further and see how the impact of your contributions sets science in motion to create a better world. Apply today!

EEO Statement

We are an Equal Employment/Affirmative Action employer and VEVRAA Federal Contractor. We do not discriminate in hiring on the basis of sex, gender identity, sexual orientation, race, color, religious creed, national origin, physical or mental disability, protected Veteran status, or any other characteristic protected by federal, state/province, or local law. If you need a reasonable accommodation for any part of the employment process, please contact us by email at recruiting@avantorsciences.com and let us know the nature of your request and your contact information. Requests for accommodation will be considered on a case-by-case basis. Please note that only inquiries concerning a request for reasonable accommodation will be responded to from this email address.

3rd Party Non-solicitation Policy

By submitting candidates without having been formally assigned and contracted for a specific job requisition by Avantor, or by failing to comply with the Avantor recruitment process, you forfeit any fee on the submitted candidates, regardless of your usual terms and conditions. Avantor works with a preferred supplier list and will take the initiative to engage recruitment agencies based on its needs; it will not accept any form of solicitation.

Posted 2 days ago

Apply

0 years

0 Lacs

India

Remote

Source: LinkedIn

About Us

Our leading SaaS-based Global Employment Platform™ enables clients to expand into over 180 countries quickly and efficiently, without the complexities of establishing local entities. At G-P, we're dedicated to breaking down barriers to global business and creating opportunities for everyone, everywhere. Our diverse, remote-first teams are essential to our success. We empower our Dream Team members with flexibility and resources, fostering an environment where innovation thrives and every contribution is valued and celebrated. The work you do here will positively impact lives around the world. We stand by our promise: Opportunity Made Possible. In addition to competitive compensation and benefits, we invite you to join us in expanding your skills and helping to reshape the future of work. At G-P, we assist organizations in building exceptional global teams in days, not months, streamlining the hiring, onboarding, and management process to unlock growth potential for all.

About The Position

As a Senior Engineering Manager at Globalization Partners, you will be responsible for both technical leadership and people management. This includes contributing to architectural discussions, decisions, and execution, as well as managing and developing a team of Data Engineers of different experience levels.

What You Can Expect To Do
- Own the strategic direction and execution of initiatives across our Data Platform, aligning technical vision with business goals. Guide teams through architectural decisions, delivery planning, and execution of complex programs that advance our platform capabilities.
- Lead and grow high-performing engineering teams responsible for the full data and analytics stack, from ingestion (ETL and streaming) through transformation, storage, and consumption, ensuring quality, reliability, and performance at scale.
- Partner cross-functionally with product managers, architects, engineering leaders, and stakeholders from Cloud Engineering and other business domains to shape product and platform capabilities, translating business needs into actionable engineering plans.
- Drive delivery excellence by setting clear expectations, removing blockers, and ensuring engineering teams progress efficiently toward milestones while maintaining technical integrity.
- Ensure adoption and consistency of platform standards and best practices, including shared components, reusable libraries, and scalable data patterns.
- Support technical leadership across teams by fostering a strong culture of engineering excellence, security, and operational efficiency. Guide technical leads in maintaining high standards in architecture, development, and testing.
- Contribute to strategic planning, including the evolution of the data platform roadmap, migration strategies, and long-term technology investments aligned with company goals.
- Champion agile methodologies and DevOps practices, driving continuous improvement in team collaboration, delivery cycles, and operational maturity.
- Mentor and develop engineering talent, creating an environment where individuals can thrive through coaching, feedback, and growth opportunities. Promote a culture of innovation, accountability, and psychological safety.
- Challenge data-platform quality and performance by building and monitoring quality KPIs and fostering a quality-first culture.

What We Are Looking For
- Proven experience leading geographically distributed engineering teams in the design and delivery of complex data and analytics platforms.
- Strong technical foundation with hands-on experience in modern data architectures, handling structured and unstructured data, and programming in Python, capable of guiding teams and reviewing design and code at a high level when necessary.
- Proficiency in SQL and relational database technologies, with the ability to guide data modeling and performance optimization discussions.
- In-depth understanding of ETL processes and data integration strategies, with practical experience overseeing data ingestion (batch and streaming), transformation, and quality assurance initiatives.
- Familiarity with commercial data platforms (e.g., Databricks, Snowflake) and cloud-native data warehouses (e.g., Redshift, BigQuery), including trade-offs and best practices in enterprise environments.
- Working knowledge of data governance and cataloging solutions, such as Atlan, Alation, Informatica, or Collibra, and experience supporting enterprise data stewardship efforts.
- Deep understanding of data quality, experience building quality processes, and usage of tools like Monte Carlo.
- Understanding of machine learning and AI workloads, including the orchestration of data pipelines for model training and deployment in both batch and streaming contexts.
- Strong analytical and problem-solving skills, with the ability to drive root-cause analysis, evaluate architectural trade-offs, and support decision-making in ambiguous or fast-changing environments.
- Exceptional communication skills, with a track record of clear and effective collaboration across technical and non-technical stakeholders. Fluent in English, both verbal and written, with the ability to influence at all levels of the organization.
- Bachelor's degree in Computer Science or a related field; advanced degrees or equivalent professional experience are a plus.

We will consider for employment all qualified applicants who meet the inherent requirements for the position. Please note that background checks are required, and this may include criminal record checks.

G-P. Global Made Possible.

G-P is a proud Equal Opportunity Employer, and we are committed to building and maintaining a diverse, equitable, and inclusive culture that celebrates authenticity. We prohibit discrimination and harassment against employees or applicants on the basis of race, color, creed, religion, national origin, ancestry, citizenship status, age, sex or gender (including pregnancy, childbirth, and pregnancy-related conditions), gender identity or expression (including transgender status), sexual orientation, marital status, military service and veteran status, physical or mental disability, genetic information, or any other legally protected status. G-P is also committed to providing reasonable accommodations to individuals with disabilities. If you need an accommodation due to a disability during the interview process, please contact us at careers@g-p.com.

Posted 2 days ago

Apply

5.0 - 10.0 years

0 Lacs

India

On-site

Source: LinkedIn

About Oportun

Oportun (Nasdaq: OPRT) is a mission-driven fintech that puts its 2.0 million members' financial goals within reach. With intelligent borrowing, savings, and budgeting capabilities, Oportun empowers members with the confidence to build a better financial future. Since inception, Oportun has provided more than $16.6 billion in responsible and affordable credit, saved its members more than $2.4 billion in interest and fees, and helped its members save an average of more than $1,800 annually. Oportun has been certified as a Community Development Financial Institution (CDFI) since 2009.

Working at Oportun

Working at Oportun means enjoying a differentiated experience of being part of a team that fosters a diverse, equitable, and inclusive culture where we all feel a sense of belonging and are encouraged to share our perspectives. This inclusive culture is directly connected to our organization's performance and ability to fulfill our mission of delivering affordable credit to those left out of the financial mainstream. We celebrate and nurture our inclusive culture through our employee resource groups.

Company Overview

At Oportun, we are on a mission to foster financial inclusion for all by providing affordable and responsible lending solutions to underserved communities. As a purpose-driven financial technology company, we believe in empowering our customers with access to responsible credit that can positively transform their lives. Our relentless commitment to innovation and data-driven practices has positioned us as a leader in the industry, and we are actively seeking exceptional individuals to join our team as a Senior Software Engineer to play a critical role in driving positive change.

Position Overview

We are seeking a highly skilled Platform Engineer with expertise in building self-serve platforms that combine real-time ML deployment and advanced data engineering capabilities. This role requires a blend of cloud-native platform engineering, data pipeline development, and deployment expertise. The ideal candidate will have a strong background in implementing data workflows and building platforms that enable self-serve ML pipelines with seamless deployments.

Responsibilities

Platform Engineering
- Design and build self-serve platforms that support real-time ML deployment and robust data engineering workflows.
- Create APIs and backend services using Python and FastAPI to manage and monitor ML workflows and data pipelines (see the sketch after this listing).

Real-Time ML Deployment
- Implement platforms for real-time ML inference using tools like AWS SageMaker and Databricks.
- Enable model versioning, monitoring, and lifecycle management with observability tools such as New Relic.

Data Engineering
- Build and optimise ETL/ELT pipelines for data preprocessing, transformation, and storage using PySpark and Pandas.
- Develop and manage feature stores to ensure consistent, high-quality data for ML model training and deployment.
- Design scalable, distributed data pipelines on platforms like AWS, integrating tools such as DynamoDB, PostgreSQL, MongoDB, and MariaDB.

CI/CD and Automation
- Build CI/CD pipelines using Jenkins, GitHub Actions, and other tools for automated deployments and testing.
- Automate data validation and monitoring processes to ensure high-quality and consistent data workflows.

Documentation and Collaboration
- Create and maintain detailed technical documentation, including high-level and low-level architecture designs.
- Collaborate with cross-functional teams to gather requirements and deliver solutions that align with business goals.
- Participate in Agile processes such as sprint planning, daily standups, and retrospectives using tools like Jira.

Required Qualifications
- 5-10 years of experience in IT, including 5-8 years in platform/backend engineering and 1 year in DevOps and data engineering roles.
- Hands-on experience with real-time ML model deployment and data engineering workflows.

Technical Skills
- Strong expertise in Python and experience with Pandas, PySpark, and FastAPI.
- Proficiency in container orchestration tools such as Kubernetes (K8s) and Docker.
- Advanced knowledge of AWS services like SageMaker, Lambda, DynamoDB, EC2, and S3.
- Proven experience building and optimizing distributed data pipelines using Databricks and PySpark.
- Solid understanding of databases such as MongoDB, DynamoDB, MariaDB, and PostgreSQL.
- Proficiency with CI/CD tools like Jenkins, GitHub Actions, and related automation frameworks.
- Hands-on experience with observability tools like New Relic for monitoring and troubleshooting.

We are proud to be an Equal Opportunity Employer and consider all qualified applicants for employment opportunities without regard to race, age, color, religion, gender, national origin, disability, sexual orientation, veteran status, or any other category protected by the laws or regulations in the locations where we operate. California applicants can find a copy of Oportun's CCPA Notice here: https://oportun.com/privacy/california-privacy-notice/.

We will never request personal identifiable information (bank, credit card, etc.) before you are hired. We do not charge you for pre-employment fees such as background checks, training, or equipment. If you think you have been a victim of fraud by someone posing as us, please report your experience to the FBI's Internet Crime Complaint Center (IC3).
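A minimal sketch of a FastAPI service of the kind described under Platform Engineering: a scoring endpoint plus a health check for the observability stack. The feature schema and scoring logic are stand-ins for a real registered model.

```python
# Tiny ML-serving API: POST features, get a score; GET /health for liveness.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="ml-inference")  # hypothetical service name

class Features(BaseModel):
    income: float
    loan_amount: float
    tenure_months: int

def score(features: Features) -> float:
    # Stand-in for a real model loaded from, e.g., MLflow or SageMaker.
    return min(1.0, features.loan_amount / max(features.income, 1.0))

@app.post("/predict")
def predict(features: Features) -> dict:
    return {"risk_score": score(features)}

@app.get("/health")
def health() -> dict:
    # Liveness endpoint for the monitoring/observability stack.
    return {"status": "ok"}
```

Run locally with `uvicorn app:app` and the endpoint is ready for a load balancer and New Relic-style monitoring to sit in front of it.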

Posted 2 days ago

Apply

2.0 - 5.0 years

0 Lacs

India

On-site

Source: LinkedIn

About Oportun

Oportun (Nasdaq: OPRT) is a mission-driven fintech that puts its 2.0 million members' financial goals within reach. With intelligent borrowing, savings, and budgeting capabilities, Oportun empowers members with the confidence to build a better financial future. Since inception, Oportun has provided more than $16.6 billion in responsible and affordable credit, saved its members more than $2.4 billion in interest and fees, and helped its members save an average of more than $1,800 annually. Oportun has been certified as a Community Development Financial Institution (CDFI) since 2009.

Working at Oportun

Working at Oportun means enjoying a differentiated experience of being part of a team that fosters a diverse, equitable, and inclusive culture where we all feel a sense of belonging and are encouraged to share our perspectives. This inclusive culture is directly connected to our organization's performance and ability to fulfill our mission of delivering affordable credit to those left out of the financial mainstream. We celebrate and nurture our inclusive culture through our employee resource groups.

Position Overview

We are seeking a Senior Product Analyst to join our Analytics function: a high-impact, cross-functional team that supports decision-making across operations, acquisition, servicing, and communications. In this highly visible role, you will collaborate with product, engineering, marketing, and risk teams to identify business opportunities, design experiments, assess performance, and support strategic initiatives across the customer lifecycle.

Responsibilities
- Lead strategic workstreams across servicing, communications, acquisition, and operations.
- Design and execute A/B tests to evaluate initiatives such as vendor performance, funnel optimization, feature launches, and communication strategies (see the sketch after this listing).
- Develop and refine communication strategies, tailored to where customers are in their lifecycle, to improve engagement and drive business outcomes.
- Evaluate and recommend tools, platforms, and processes to improve operational and customer-facing outcomes.
- Develop and maintain dashboards, KPIs, and reports to measure initiatives and inform stakeholder decisions.
- Collaborate cross-functionally to ensure implementation of recommendations and track business outcomes.
- Build and deliver clear, action-oriented presentations that drive stakeholder buy-in and strategic decision-making.

Requirements
- Bachelor's degree in a quantitative discipline such as Statistics, Mathematics, Economics, Computer Science, Engineering, or a related field.
- 2 to 5 years of analytics experience, ideally within fintech or product-based firms.
- Strong proficiency in SQL. Familiarity with Python and ETL pipeline setup in Databricks is a plus.
- Experience with visualization tools such as Domo, Power BI, or similar.
- Demonstrated experience working with clickstream data and analyzing product funnels.
- Ability to thrive in a fast-paced environment with strong ownership and execution.
- Excellent written and verbal communication skills, including the ability to craft compelling presentations.
- Strong interpersonal skills with a track record of strong collaboration across cross-functional teams.
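A minimal sketch of the A/B-test evaluation step this posting describes, using a two-proportion z-test from statsmodels. The conversion counts are invented for illustration.

```python
# Compare conversion rates between control and variant with a z-test.
from statsmodels.stats.proportion import proportions_ztest

conversions = [460, 520]    # control, variant (hypothetical counts)
visitors = [10000, 10000]

z_stat, p_value = proportions_ztest(conversions, visitors)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")
# A p-value below the chosen significance level (e.g. 0.05) suggests the
# variant's conversion rate genuinely differs from control, rather than
# the gap being sampling noise.
```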

Posted 2 days ago

Apply

7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Role: Business Analyst - Investment Domain
Experience: 7-14 years
Location: Hyderabad
Skills: business analysis, core BA, any BI tool, any cloud (AWS, Azure, GCP, or Snowflake); investment domain experience is mandatory
Please share your resume with jyothsna.g@technogenindia.com.

Job Description:

Experience:
• Bachelor's degree in Finance, Economics, or a related discipline.
• 10+ years of experience as a BSA or in a similar role on data analytics or technology projects.
• 5+ years of domain experience in asset management, investment management, insurance, or financial services.
• Familiarity with Investment Operations concepts such as Critical Data Elements (CDEs), data traps, and reconciliation workflows.
• Working knowledge of data engineering principles: ETL/ELT, data lakes, and data warehousing.
• Proficiency in BI and analytics tools such as Power BI, Tableau, MicroStrategy, and SQL.
• Excellent communication, analytical thinking, and stakeholder engagement skills.
• Experience working in Agile/Scrum environments with cross-functional delivery teams.

The Ideal Qualifications

Technical Skills:
• Proven track record of analytical and problem-solving skills.
• In-depth knowledge of investment data platforms, including GoldenSource, NeoXam, RIMES, JPM Fusion, etc.
• Expertise in cloud data technologies such as Snowflake, Databricks, and AWS/GCP/Azure data services.
• Strong understanding of data governance frameworks, metadata management, and data lineage.
• Familiarity with regulatory requirements and compliance standards in the investment management industry.
• Hands-on experience with IBORs such as BlackRock Aladdin, CRD, Eagle STAR (ABOR), Eagle PACE, and Eagle DataMart.
• Familiarity with investment data platforms such as GoldenSource, FINBOURNE, NeoXam, RIMES, and JPM Fusion.
• Experience with cloud data platforms like Snowflake and Databricks.
• Background in data governance, metadata management, and data lineage frameworks.

Soft Skills:
• Exceptional communication and interpersonal skills.
• Ability to influence and motivate teams without direct authority.
• Excellent time management and organizational skills, with the ability to prioritize multiple initiatives.
• Ability to lead cross-functional teams and manage complex projects.

Posted 2 days ago

Apply

6.0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site

Source: LinkedIn

Are you ready to write your next chapter?

Make your mark at one of the biggest names in payments. With proven technology, we process the largest volume of payments in the world, driving the global economy every day. When you join Worldpay, you join a global community of experts and changemakers, working to reinvent an industry by constantly evolving how we work and making the way millions of people pay easier, every day.

What makes a Worldpayer? It's simple: Think, Act, Win. We stay curious, always asking the right questions to be better every day, finding creative solutions to simplify the complex. We're dynamic: every Worldpayer is empowered to make the right decisions for their customers. And we're determined, always staying open, winning and failing as one.

We're looking for a Senior AWS Databricks Admin to join our Big Data team to help us unleash the potential of every business. Are you ready to make your mark? Then you sound like a Worldpayer.

About The Team

We are seeking a talented and experienced Senior AWS Databricks Platform Admin to join our dynamic team. The ideal candidate will have at least 6 years of hands-on experience in Databricks practices, cloud infrastructure management, automation, and CI/CD pipeline development. This role will involve collaborating with development, operations, and quality assurance teams to streamline and optimize our software delivery processes.

What You Will Be Doing
- Run Databricks at the workspace and account level (see the sketch after this listing).
- Set up and advise on architecture and scaling for the Databricks environment, including administering and configuring workspaces and installing libraries.
- Monitor and manage the Databricks workspace, clusters, and tenants, including workspace creation, user management, cloud resources, account usage, and jobs.
- Ensure high availability and performance of the Databricks environment.
- Implement and maintain Databricks clusters, including auto-scaling, configuration, and tuning.
- Manage user access and permissions, ensuring data security and compliance with company policies.
- Implement and monitor security controls, including encryption, authentication, and network security.
- Conduct regular security audits and vulnerability assessments.
- Manage data ingestion and ETL processes within Databricks.
- Integrate Databricks with other data sources and tools, such as data lakes, warehouses, and BI tools.
- Provide technical support and assistance to data scientists and analysts using Databricks.
- Support the development and deployment of machine learning models and data science workflows.
- Facilitate the use of R and other statistical tools within the Databricks environment.
- Automate routine tasks and processes using scripting and automation tools.
- Maintain comprehensive documentation for configurations, procedures, and troubleshooting steps.

What You Bring
- 5+ years of experience working as a Databricks administrator/architect.
- Proficiency in cloud platforms such as AWS, Azure, or Google Cloud.
- Strong fluency with Python or Java.
- Experience serving as the Databricks account owner, including security and privacy setup, marketplace plugins, and integration with other tools.
- Experience with Unity Catalog migration, workspaces, and audit logs.
- Strong experience with Amazon Web Services (AWS) accounts and high-level usage monitoring.
- Proficiency in scripting languages such as Python, Bash, or PowerShell.
- Experience with CI/CD tools like Jenkins, GitLab CI, CircleCI, or similar.
- Familiarity with monitoring and logging tools (e.g., Prometheus, Grafana, ELK stack).
- Excellent problem-solving skills and attention to detail.
- Strong communication skills and the ability to work collaboratively in a team environment.
- Experience with the AWS CLI and networking.
- Experience architecting and maintaining high-availability production systems.
- Experience developing monitoring architecture and implementing monitoring agents, dashboards, escalations, and alerts.
- Knowledge of security controls for the public cloud (encryption of data in motion/at rest and key management).
- Demonstrated knowledge and hands-on experience with AWS alerting/monitoring tools.
- Experience with infrastructure-as-code (IaC) tools such as Terraform.

Added Bonus If You Have
- AWS certification
- Databricks certification

Where You'll Own It

You'll own it in our modern Bangalore hub. With hubs in the heart of city centers and tech capitals, things move fast in APAC. We pride ourselves on being an agile and dynamic collective, collaborating with different teams and offices across the globe.

Worldpay Perks: What We'll Bring For You

We know it's bigger than just your career. It's your life, and your world. That's why we offer global benefits and programs to support you at every stage. Here's a taste of what you can expect:
- A competitive salary and benefits.
- Time to support charities and give back to your community.
- Parental leave policy.
- Global recognition platform.
- Virgin Pulse access.
- Global employee assistance program.

What Makes a Worldpayer

At Worldpay, we take our Values seriously, and we live them every day: Think like a customer, Act like an owner, and Win as a team.
- Curious. Humble. Creative. We ask the right questions, listening and learning to get better every day. We simplify the complex and we're always looking to create a bigger impact for our colleagues and customers.
- Empowered. Accountable. Dynamic. We stay agile, using our initiative, taking calculated risks to progress. Never standing still, never settling, we work at pace to achieve our goals. We champion our ideas and stay flexible to make them happen. We know that every action adds up.
- Determined. Inclusive. Open. Unlocking potential means working as one global community. Our work spans borders, and we stay united by our purpose. We collaborate, always encouraging others to perform at their best, welcoming new perspectives.

Does this sound like you? Then you sound like a Worldpayer. Apply now to write the next chapter in your career. We can't wait to hear from you. To find out more about working with us, find us on LinkedIn.

Privacy Statement

Worldpay is committed to protecting the privacy and security of all personal information that we process in order to provide services to our clients. For specific information on how Worldpay protects personal information online, please see the Online Privacy Notice.

Sourcing Model

Recruitment at Worldpay works primarily on a direct sourcing model; a relatively small portion of our hiring is through recruitment agencies. Worldpay does not accept resumes from recruitment agencies which are not on the preferred supplier list and is not responsible for any related fees for resumes submitted to job postings, our employees, or any other part of our company. #pridepass
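A minimal sketch of routine workspace administration with the databricks-sdk Python package, assuming credentials are supplied via the standard DATABRICKS_HOST / DATABRICKS_TOKEN environment variables. It simply inventories clusters and jobs, the kind of usage monitoring this role describes; names and IDs come from whatever workspace you point it at.

```python
# Inventory a Databricks workspace: clusters and their states, plus jobs.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # picks up host/token from the environment

# List clusters for capacity and usage monitoring.
for cluster in w.clusters.list():
    print(cluster.cluster_name, cluster.state)

# List scheduled jobs to spot orphaned or runaway workloads.
for job in w.jobs.list():
    print(job.job_id, job.settings.name)
```

The same client exposes users, permissions, and Unity Catalog objects, so scripts like this are a natural base for the automation and audit tasks listed above.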

Posted 2 days ago

Apply

2.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

The Opportunity

Works independently under close supervision to provide analysis, insight, and recommendations by synthesizing information for various product categories and geographies, using competitive analyses, analytical modeling, and knowledge of the product portfolio and customer programs. Manages the pricing process for the sales team, including assessing, approving, and loading all pricing requests. Reviews, researches, and analyzes pricing to make recommendations on price enhancements, SKU rationalization, and margin opportunities. Processes quotation requests for stock standard items and special-order (customer-specific) items.

As a Data Scientist, you will leverage advanced statistical methods and machine learning techniques to analyze complex datasets, generate insights, and solve business problems. You will be responsible for developing and deploying ML models, performing deep-dive analyses, and collaborating with stakeholders to ensure the data strategy aligns with business objectives.

What We're Looking For

The ideal candidate will:
- Collect, clean, and analyze large datasets to generate insights and recommendations.
- Develop and implement machine learning models to solve business problems.
- Maintain ML pipelines and fix production issues within the stipulated time.
- Work on model maintenance and improve model performance when necessary.
- Conduct hypothesis testing and statistical analyses to understand data trends.
- Communicate findings to non-technical stakeholders through reports and visualizations.
- Collaborate with engineers, product teams, and business stakeholders to translate business goals into data-driven solutions.

Certifications (Good to Have)
- ML/AI certifications
- Cloud certifications such as AWS, Azure, or GCP
- Databricks or Snowflake certification

Experience
- Bachelor's or Master's degree in a quantitative field such as Data Science, Statistics, Computer Science, or Engineering.
- 2+ years of experience with a product development company as an ML/AI engineer.
- 2+ years of experience and proficiency in programming languages such as Python, PySpark, and Java.
- Strong hands-on experience in SQL or a data manipulation/query language.
- 2+ years of experience with machine learning libraries (e.g., scikit-learn, TensorFlow, PyTorch).
- Good knowledge of NLP, transformers, LLMs, and generative AI.
- Strong analytical skills and proficiency in statistical analysis and data visualization.
- Good communication and comprehension skills to explain ML algorithms to stakeholders in a non-technical manner.
- Good team player with the ability to work in cross-functional teams.

How You Will Thrive and Create an Impact

With your analytical mindset and critical thinking, you will have the opportunity to participate in project development activities and suggest data-driven, AI-based solutions that will be the foundation of the project's success. With your proficient coding skills, you will develop robust ML pipelines that are modular, scalable, and reusable. You will work with great minds who focus on delivering quality products.

Disclaimer

The above statements are intended to describe the general nature and level of work being performed by employees assigned to this classification. They are not intended to be construed as an exhaustive list of all responsibilities, duties, and skills required of employees assigned to this position. Avantor is proud to be an equal opportunity employer.

Why Avantor?

Dare to go further in your career. Join our global team of 14,000+ associates whose passion for discovery and determination to overcome challenges relentlessly advances life-changing science. The work we do changes people's lives for the better. It brings new patient treatments and therapies to market, giving a cancer survivor the chance to walk his daughter down the aisle. It enables medical devices that help a little boy hear his mom's voice for the first time. Outcomes such as these create unlimited opportunities for you to contribute your talents, learn new skills, and grow your career at Avantor. We are committed to helping you on this journey through our diverse, equitable, and inclusive culture, which includes learning experiences to support your career growth and success. At Avantor, dare to go further and see how the impact of your contributions sets science in motion to create a better world. Apply today!

EEO Statement

We are an Equal Employment/Affirmative Action employer and VEVRAA Federal Contractor. We do not discriminate in hiring on the basis of sex, gender identity, sexual orientation, race, color, religious creed, national origin, physical or mental disability, protected Veteran status, or any other characteristic protected by federal, state/province, or local law. If you need a reasonable accommodation for any part of the employment process, please contact us by email at recruiting@avantorsciences.com and let us know the nature of your request and your contact information. Requests for accommodation will be considered on a case-by-case basis. Please note that only inquiries concerning a request for reasonable accommodation will be responded to from this email address.

3rd Party Non-solicitation Policy

By submitting candidates without having been formally assigned and contracted for a specific job requisition by Avantor, or by failing to comply with the Avantor recruitment process, you forfeit any fee on the submitted candidates, regardless of your usual terms and conditions. Avantor works with a preferred supplier list and will take the initiative to engage recruitment agencies based on its needs; it will not accept any form of solicitation.

Posted 2 days ago

Apply

3.0 - 5.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Source: LinkedIn

Description
Supports, develops, and maintains a data and analytics platform. Effectively and efficiently processes, stores, and makes data available to analysts and other consumers. Works with business and IT teams to understand requirements and best leverage technologies to enable agile data delivery at scale. Note: although the role category in the GPP is listed as Remote, the requirement is for a hybrid work model.

Key Responsibilities
Oversee the development and deployment of end-to-end data ingestion pipelines using Azure Databricks, Apache Spark, and related technologies.
Design high-performance, resilient, and scalable data architectures for data ingestion and processing.
Provide technical guidance and mentorship to a team of data engineers.
Collaborate with data scientists, business analysts, and stakeholders to integrate various data sources into the data lake/warehouse.
Optimize data pipelines for speed, reliability, and cost efficiency in an Azure environment.
Enforce and advocate for best practices in coding standards, version control, testing, and documentation.
Work with Azure services such as Azure Data Lake Storage, Azure SQL Data Warehouse, Azure Synapse Analytics, and Azure Blob Storage.
Implement data validation and data quality checks to ensure consistency, accuracy, and integrity.
Identify and resolve complex technical issues proactively.
Develop reliable, efficient, and scalable data pipelines with monitoring and alert mechanisms.
Use agile development methodologies, including DevOps, Scrum, and Kanban.

Technical Skills
Expertise in Spark, including optimization, debugging, and troubleshooting.
Proficiency in Azure Databricks for distributed data processing.
Strong coding skills in Python and Scala for data processing.
Experience with SQL for handling large datasets.
Knowledge of data formats such as Iceberg, Parquet, ORC, and Delta Lake.
Understanding of cloud infrastructure and architecture principles, especially within Azure.

Leadership & Soft Skills
Proven ability to lead and mentor a team of data engineers.
Excellent communication and interpersonal skills.
Strong organizational skills with the ability to manage multiple tasks and priorities.
Ability to work in a fast-paced, constantly evolving environment.
Strong problem-solving, analytical, and troubleshooting abilities.
Ability to collaborate effectively with cross-functional teams.

Competencies
System Requirements Engineering: Uses appropriate methods to translate stakeholder needs into verifiable requirements.
Collaborates: Builds partnerships and works collaboratively to meet shared objectives.
Communicates Effectively: Delivers clear, multi-mode communications tailored to different audiences.
Customer Focus: Builds strong customer relationships and delivers customer-centric solutions.
Decision Quality: Makes good and timely decisions to keep the organization moving forward.
Data Extraction: Performs ETL activities and transforms data for consumption by downstream applications.
Programming: Writes and tests computer code, with version control and build automation.
Quality Assurance Metrics: Uses measurement science to assess solution effectiveness.
Solution Documentation: Documents information for improved productivity and knowledge transfer.
Solution Validation Testing: Ensures solutions meet design and customer requirements.
Data Quality: Identifies, understands, and corrects data flaws.
Problem Solving: Uses systematic analysis to address and resolve issues.
Values Differences: Recognizes the value that diverse perspectives bring to an organization.

Preferred Knowledge & Experience
Exposure to Big Data open-source technologies (Spark, Scala/Java, MapReduce, Hive, HBase, Kafka, etc.).
Experience with SQL and working with large datasets.
Clustered compute cloud-based implementation experience.
Familiarity with developing applications requiring large file movement in a cloud-based environment.
Exposure to Agile software development and analytical solutions.
Exposure to IoT technology.

Qualifications
Education: Bachelor’s or Master’s degree in Computer Science, Information Technology, Engineering, or a related field.
Experience: 3 to 5 years of experience in data engineering or a related field.
Strong hands-on experience with Azure Databricks, Apache Spark, Python/Scala, CI/CD, Snowflake, and Qlik for data processing.
Experience working with multiple file formats like Parquet, Delta, and Iceberg.
Knowledge of Kafka or similar streaming technologies.
Experience with data governance and data security in Azure.
Proven track record of building large-scale data ingestion and ETL pipelines in cloud environments.
Deep understanding of Azure Data Services.
Experience with CI/CD pipelines, version control (Git), Jenkins, and agile methodologies.
Familiarity with data lakes, data warehouses, and modern data architectures.
Experience with Qlik Replicate (optional).

Job: Systems/Information Technology
Organization: Cummins Inc.
Role Category: Remote
Job Type: Exempt - Experienced
ReqID: 2411635
Relocation Package: No
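To ground the Databricks bullets above, here is a minimal PySpark sketch of an ingestion step with a simple data-quality gate of the kind this role describes. The storage path and table name are invented for illustration and are not from the posting:

from pyspark.sql import SparkSession, functions as F

# Hypothetical landing path and target table, for illustration only.
RAW_PATH = "abfss://raw@exampleaccount.dfs.core.windows.net/orders/"
TARGET_TABLE = "lakehouse.bronze_orders"

spark = SparkSession.builder.appName("orders-ingest").getOrCreate()

df = spark.read.format("json").load(RAW_PATH)

# Data-quality gate: refuse to land a batch that has null business keys.
null_keys = df.filter(F.col("order_id").isNull()).count()
if null_keys > 0:
    raise ValueError(f"{null_keys} rows with null order_id; aborting load")

# Append with an audit column so consumers can trace each batch.
(df.withColumn("_ingested_at", F.current_timestamp())
   .write.format("delta")
   .mode("append")
   .saveAsTable(TARGET_TABLE))

A production pipeline would typically add schema enforcement, a quarantine table for rejected rows, and alerting on the failure path.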

Posted 2 days ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

AWS Data Engineer
Primary skillsets: AWS, PySpark, SQL, Databricks, Python
Secondary skillsets: any ETL tool, GitHub, DevOps (CI/CD)
Experience: 3-4 years
Degree in computer science, engineering, or similar fields

Mandatory Skill Set: Python, PySpark, SQL, and AWS, with experience designing, developing, testing, and supporting data pipelines and applications.
3+ years' working experience in data integration and pipeline development.
3+ years' experience with AWS Cloud on data integration with a mix of Apache Spark, Glue, Kafka, Kinesis, and Lambda in S3, Redshift, RDS, and MongoDB/DynamoDB ecosystems. Databricks and Redshift experience is a major plus.
3+ years' experience using SQL in the development of data warehouse projects/applications (Oracle & SQL Server).
Strong hands-on experience in Python development, especially PySpark, in an AWS Cloud environment.
Strong experience with SQL and NoSQL databases such as MySQL, Postgres, DynamoDB, and Elasticsearch.
Workflow management tools like Airflow.
AWS cloud services: RDS, AWS Lambda, AWS Glue, AWS Athena, EMR (equivalent tools in the GCP stack will also suffice).
Good to Have: Snowflake, Palantir Foundry
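Since the listing names Airflow for workflow management, a minimal two-task DAG sketch follows (assuming Airflow 2.x; the DAG id, schedule, and task bodies are placeholders, not part of the posting):

from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_to_s3(**context):
    # Placeholder: pull the day's records from RDS/an API and stage them in S3.
    ...

def run_spark_transform(**context):
    # Placeholder: kick off the Glue/EMR Spark job that transforms the staged data.
    ...

with DAG(
    dag_id="daily_sales_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract = PythonOperator(task_id="extract_to_s3", python_callable=extract_to_s3)
    transform = PythonOperator(task_id="spark_transform", python_callable=run_spark_transform)
    extract >> transform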

Posted 2 days ago

Apply

0 years

0 Lacs

Ahmedabad, Gujarat, India

Remote

Linkedin logo

Job Title: Sr. Node.js Developer
Location: Ahmedabad, Gujarat
Job Type: Full Time
Department: MEAN Stack

About Simform: Simform is a premier digital engineering company specializing in Cloud, Data, AI/ML, and Experience Engineering to create seamless digital experiences and scalable products. Simform is a strong partner for Microsoft, AWS, Google Cloud, and Databricks. With a presence in 5+ countries, Simform primarily serves North America, the UK, and the Northern European market. Simform takes pride in being one of the most reputed employers in the region, having created a thriving work culture with a high work-life balance that gives a sense of freedom and opportunity to grow.

Role Overview: We are looking for a seasoned Senior Node.js Developer who not only possesses extensive backend expertise but also demonstrates proficiency in system design, cloud services, microservices architecture, and containerization. Additionally, a good understanding of the frontend tech stack, in order to support frontend developers, is highly valued.

Key Responsibilities:
Develop reusable, testable, maintainable, and scalable code with a focus on unit testing.
Implement robust security measures and data protection mechanisms across projects.
Champion the implementation of design patterns such as Test-Driven Development (TDD) and Behavior-Driven Development (BDD).
Actively participate in architecture design sessions and sprint planning meetings, contributing valuable insights.
Lead code reviews, providing insightful comments and guidance to team members.
Mentor team members, assisting in debugging complex issues and providing optimal solutions.

Required Skills & Qualifications:
Excellent written and verbal communication skills.
Experience: 4+ years.
Advanced knowledge of JavaScript and TypeScript, including core concepts and best practices.
Extensive experience in developing highly scalable services and APIs using various protocols.
Proficiency in data modeling and optimizing database performance in both SQL and NoSQL databases.
Hands-on experience with PostgreSQL and MongoDB, leveraging technologies like TypeORM, Sequelize, or Knex.
Proficiency in working with frameworks like NestJS, LoopBack, Express, and other TypeScript-based frameworks.
Strong familiarity with unit testing libraries such as Jest, Mocha, and Chai.
Expertise in code versioning using Git or Bitbucket.
Practical experience with Docker for building and deploying microservices.
Strong command of Linux, including familiarity with server configurations.
Familiarity with queuing protocols and asynchronous messaging systems.

Preferred Qualifications:
Experience with frontend JavaScript concepts and frameworks such as ReactJS.
Proficiency in designing and implementing cloud architectures, particularly on AWS services.
Knowledge of GraphQL and its associated libraries like Apollo and Prisma.
Hands-on experience with deployment pipelines and CI/CD processes.
Experience with document, key/value, or other non-relational database systems like Elasticsearch, Redis, and DynamoDB.
Ability to build AI-centric applications and work with machine learning models, Langchain, vector databases, embeddings, etc.
Why Join Us:
Young team, thriving culture.
Flat-hierarchical, friendly, engineering-oriented, and growth-focused culture.
Well-balanced learning and growth opportunities.
Free health insurance.
Office facilities with a game zone, in-office kitchen with affordable lunch service, and free snacks.
Sponsorship for certifications/events and library service.
Flexible work timing, leaves for life events, WFH, and hybrid options.

Posted 2 days ago

Apply

10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

JOB DESCRIPTION

Key Responsibilities
Lead Inventory and Pricing Optimization Initiatives: Design and implement predictive and prescriptive models to optimize inventory placement, demand forecasting, pricing optimization, clearance models, etc.
Advanced Modeling & Machine Learning: Apply techniques such as time series forecasting, probabilistic modeling, optimization algorithms (e.g., MIP), and reinforcement learning to solve inventory and pricing-related problems.
Cross-functional Collaboration: Partner with stakeholders across supply chain, merchandising, and technology to align machine learning/analytics initiatives with business goals and operational constraints.
Innovation & Thought Leadership: Identify emerging trends and technologies in supply chain and pricing optimization; evaluate and prototype novel and pragmatic solutions to complex problems.
Mentoring & Leadership: Provide technical mentorship to other data scientists and contribute to the overall impactful growth of the organization.
Effective Communication: Convey complex quantitative analyses, analytic methodologies, and findings in a clear, concise, and actionable manner.
Operational Impact: Translate data insights into actionable recommendations that directly influence decisions around inventory planning, replenishment, and distribution.

What We're Looking For
PhD or Master’s degree in a quantitative field from a reputed college (e.g., Operations Research, Statistics, Computer Science)
10+ years of industry experience in data science, with a strong focus on inventory, pricing, and supply chain optimization
Deep understanding of inventory theory, supply chain planning, stochastic modeling, forecasting, and optimization techniques
Proficiency in Python, SQL, and one or more optimization libraries (e.g., Gurobi, Pyomo)
Strong background in ML model development and experience implementing large models in production
Experience with large-scale data platforms (e.g., Spark, Snowflake, Databricks)
Strong business acumen with the ability to communicate complex technical ideas to non-technical stakeholders

Preferred Qualifications
Experience in retail, e-commerce, or manufacturing supply chains
Familiarity with WMS and/or network optimization tools (Optilogic, Llamasoft)
Experience working in Agile or cross-functional product teams
Prior experience leading technical teams or projects

About Us
Fanatics is building a leading global digital sports platform. We ignite the passions of global sports fans and maximize the presence and reach for our hundreds of sports partners globally by offering products and services across Fanatics Commerce, Fanatics Collectibles, and Fanatics Betting & Gaming, allowing sports fans to Buy, Collect, and Bet. Through the Fanatics platform, sports fans can buy licensed fan gear, jerseys, lifestyle and streetwear products, headwear, and hardgoods; collect physical and digital trading cards, sports memorabilia, and other digital assets; and bet as the company builds its Sportsbook and iGaming platform. Fanatics has an established database of over 100 million global sports fans; a global partner network with approximately 900 sports properties, including major national and international professional sports leagues, players associations, teams, colleges, college conferences and retail partners, 2,500 athletes and celebrities, and 200 exclusive athletes; and over 2,000 retail locations, including its Lids retail stores.
Our more than 22,000 employees are committed to relentlessly enhancing the fan experience and delighting sports fans globally. About The Team Fanatics Commerce is a leading designer, manufacturer, and seller of licensed fan gear, jerseys, lifestyle and streetwear products, headwear, and hardgoods. It operates a vertically-integrated platform of digital and physical capabilities for leading sports leagues, teams, colleges, and associations globally – as well as its flagship site, www.fanatics.com. Fanatics Commerce has a broad range of online, sports venue, and vertical apparel partnerships worldwide, including comprehensive partnerships with leading leagues, teams, colleges, and sports organizations across the world—including the NFL, NBA, MLB, NHL, MLS, Formula 1, and Australian Football League (AFL); the Dallas Cowboys, Golden State Warriors, Paris Saint-Germain, Manchester United, Chelsea FC, and Tokyo Giants; the University of Notre Dame, University of Alabama, and University of Texas; the International Olympic Committee (IOC), England Rugby, and the Union of European Football Associations (UEFA). At Fanatics Commerce, we infuse our BOLD Leadership Principles in everything we do: Build Championship Teams Obsessed with Fans Limitless Entrepreneurial Spirit Determined and Relentless Mindset
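For a flavor of the inventory theory this role draws on, the classic newsvendor model sets the order quantity at the critical fractile of the demand distribution. This is an illustrative textbook calculation with made-up numbers, not Fanatics' methodology:

from scipy.stats import norm

# Illustrative inputs: forecast demand and unit economics for one item-season.
mean_demand, sd_demand = 1200.0, 250.0
unit_cost, price, salvage = 18.0, 45.0, 6.0

underage = price - unit_cost   # profit lost per unit of unmet demand
overage = unit_cost - salvage  # loss per unsold unit

# Order up to the demand quantile where expected underage and overage costs balance.
critical_fractile = underage / (underage + overage)
order_qty = norm.ppf(critical_fractile, loc=mean_demand, scale=sd_demand)

print(f"critical fractile = {critical_fractile:.3f}, order quantity = {order_qty:.0f}")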

Posted 2 days ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. We know what it takes to lead UPS into tomorrow—people with a unique combination of skill + passion. If you have the qualities and drive to lead yourself or teams, there are roles ready to cultivate your skills and take you to the next level.

Job Description

Job Summary
This position provides input and support for full systems life cycle management activities (e.g., analyses, technical requirements, design, coding, testing, implementation of systems and applications software, etc.). He/She performs tasks within planned durations and established deadlines. This position collaborates with teams to ensure effective communication and support the achievement of objectives. He/She provides knowledge, development, maintenance, and support for applications.

Responsibilities
Generates application documentation.
Contributes to systems analysis and design.
Designs and develops moderately complex applications.
Contributes to integration builds.
Contributes to maintenance and support.
Monitors emerging technologies and products.

Technical Skills
Cloud Platforms: Azure (Databricks, Data Factory, Data Lake Storage, Synapse Analytics).
Data Processing: Databricks (PySpark, Spark SQL), Apache Spark.
Programming Languages: Python, SQL.
Data Engineering Tools: Delta Lake, Azure Data Factory, Apache Airflow.
Other: Git, CI/CD.

Professional Experience
Design and implementation of a scalable data lakehouse on Azure Databricks, optimizing data ingestion, processing, and analysis for improved business insights.
Develop and maintain efficient data pipelines using PySpark and Spark SQL for extracting, transforming, and loading (ETL) data from diverse sources (Azure and GCP).
Develop SQL stored procedures for data integrity. Ensure data accuracy and consistency across all layers.
Implement Delta Lake for ACID transactions and data versioning, ensuring data quality and reliability.
Create frameworks using Databricks and Data Factory to process incremental data for external vendors and applications.
Implement Azure Functions to trigger and manage data processing workflows.
Design and implement data pipelines to integrate various data sources and manage Databricks workflows for efficient data processing.
Conduct performance tuning and optimization of data processing workflows.
Provide technical support and troubleshooting for data processing issues.
Experience with successful migrations from legacy data infrastructure to Azure Databricks, improving scalability and cost savings.
Collaborate with data scientists and analysts to build interactive dashboards and visualizations on Databricks for data exploration and analysis.
Effective oral and written management communication skills.

Qualifications
Minimum 5 years of relevant experience.
Bachelor's degree or international equivalent in Computer Science, Information Systems, Mathematics, Statistics, or a related field.

Employee Type: Permanent
UPS is committed to providing a workplace free of discrimination, harassment, and retaliation.
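The Delta Lake and incremental-processing bullets above typically reduce to an atomic MERGE on Databricks. A minimal sketch follows, with hypothetical staging paths and table names:

from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical incremental batch staged by an upstream Data Factory copy activity.
updates = spark.read.format("parquet").load("/mnt/staging/shipments_delta/")

target = DeltaTable.forName(spark, "silver.shipments")

# MERGE is an atomic upsert: matched rows update in place, new rows insert,
# and readers never observe a half-applied batch.
(target.alias("t")
    .merge(updates.alias("s"), "t.shipment_id = s.shipment_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute())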

Posted 2 days ago

Apply

10.0 years

0 Lacs

Telangana

On-site

GlassDoor logo

Citco Bank is seeking an experienced Data Scientist to lead our growing analytics team. The ideal candidate will combine strong technical expertise, business acumen, and proven management experience to develop and implement sophisticated machine learning models and data-driven solutions. This role will be crucial in driving data-based decision-making across our banking and financial services operations, working closely with cross-functional teams to deliver actionable insights and innovative solutions.

Lead the development and implementation of advanced machine learning models and statistical analyses to solve complex business problems
Collaborate with business stakeholders to understand requirements and translate them into analytical solutions
Manage a team of business intelligence experts
Work closely with the technology teams in implementing new techniques into the Citco Bank environments
Develop predictive models for risk assessment, fraud detection, customer behavior analysis, or forecasting
Create and maintain detailed documentation of methodologies, models, and processes
Design and build scalable data pipelines and ETL workflows
Present findings and recommendations to senior management and stakeholders
Monitor model performance and implement improvements as needed
Ensure compliance with banking regulations and data governance policies

Education: Master's degree or Ph.D. in Data Science, Statistics, Computer Science, Mathematics, or a related field. Professional certifications in relevant technologies or methodologies are a plus.

Experience: 10+ years of experience in data science, with at least 4 years in the banking/financial services sector. Proven track record of successfully implementing machine learning models in production. Minimum 3 years of experience managing and leading data science teams. Demonstrated experience in building and developing high-performing teams.

Technical Skills: Advanced knowledge of machine learning algorithms and statistical modeling. Strong expertise in Python, R, or similar programming languages. Proficiency in SQL and experience with big data technologies (Hadoop, Spark, Databricks). Experience with deep learning frameworks. Knowledge of cloud platforms (AWS, Azure, or GCP). Expertise in data visualization tools (Tableau, Power BI, Qlik). Strong understanding of version control systems (Git). Experience with MLOps and model deployments.

Business Skills: Proven people management and leadership abilities. Experience in resource planning and team capacity management. Excellent problem-solving and analytical thinking abilities. Strong communication and presentation skills. Ability to translate complex technical concepts to non-technical stakeholders. Experience in agile development methodologies. Understanding of the banking and financial services domain.

Domain Knowledge: Understanding of risk management, compliance, products, and regulatory requirements in banking. Knowledge of financial markets and instruments is a plus.
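As an illustration of the fraud-detection work mentioned above (a toy sketch with hypothetical features and file names, not Citco's models), a baseline classifier in scikit-learn might look like this:

import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Hypothetical transaction extract; real models add engineered behavioral
# and network features, plus the governance controls the posting mentions.
df = pd.read_parquet("transactions.parquet")
X = df[["amount", "hour", "merchant_risk", "txn_count_24h"]]
y = df["is_fraud"]

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, stratify=y, test_size=0.2, random_state=42
)

model = GradientBoostingClassifier().fit(X_tr, y_tr)
print("AUC:", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))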

Posted 2 days ago

Apply

5.0 years

5 - 8 Lacs

Hyderābād

Remote

GlassDoor logo

About the role: The Data Intelligence Center of Excellence is looking for a high-performing Senior Data Scientist to support Blackbaud customers through the creation and maintenance of intelligent data products. Additionally, the senior data scientist will collaborate with team members on research and thought leadership initiatives.

What you’ll do:
Use statistical techniques to manage and analyze large volumes of complex data to generate compelling insights, including predictive modeling, storytelling, and data visualization
Integrate data from multiple sources to create dashboards and other end-user reports
Interact with internal customers to identify and define topics for research and experimentation
Contribute to white papers, presentations, and conferences as needed
Communicate insights and findings from analyses to product, service, and business managers
Work with the data science team to automate and streamline modeling processes
Manage standard tables and programs within the data science infrastructure, providing updates as needed
Maintain updated documentation of products and processes
Participate in team planning and backlog grooming for the data science roadmap

What you'll bring: We are seeking a Data Scientist with 5+ years of hands-on experience demonstrating strong proficiency in the following areas:
2+ years of machine learning and/or statistics experience
2+ years of experience with data analysis in Python, R, SQL, Spark, or similar
Comfortable asking questions and performing in-depth research when given vague or incomplete specifications
Confidence to learn product functionality on own initiative via back-end research, online training resources, product manuals, and developer forums
Experience with Databricks and Databricks SQL Analytics is a plus

Blackbaud is a digital-first company which embraces a flexible remote or hybrid work culture. Blackbaud supports hiring and career development for all roles from the location you are in today! Blackbaud is proud to be an equal opportunity employer and is committed to maintaining an inclusive work environment. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, physical or mental disability, age, or veteran status or any other basis protected by federal, state, or local law.
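The multi-source integration described above often starts as a simple join-and-aggregate step. A hedged pandas sketch follows, with hypothetical donor-data extracts (not Blackbaud's actual schemas):

import pandas as pd

# Hypothetical extracts: giving history and email engagement per donor.
gifts = pd.read_csv("gifts.csv", parse_dates=["gift_date"])
email = pd.read_csv("email_engagement.csv")

# One row per donor, combining both sources for a dashboard or model input.
summary = (
    gifts.groupby("donor_id")
         .agg(total_given=("amount", "sum"),
              last_gift=("gift_date", "max"),
              gift_count=("amount", "size"))
         .merge(email.set_index("donor_id"),
                left_index=True, right_index=True, how="left")
         .reset_index()
)

summary.to_parquet("donor_summary.parquet")  # feeds the reporting layer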

Posted 2 days ago

Apply

10.0 years

1 - 2 Lacs

Hyderābād

On-site

GlassDoor logo

Join the Chief Data and Analytics Organization to architect and evolve the firmwide Data Mesh, AI/ML and GenAI, and Data Governance platform. As a Senior Principal Architect at JPMorgan Chase within the Enterprise Technology function, you provide expertise to enhance and develop architecture platforms based on modern cloud-based technologies, as well as support the adoption of strategic global solutions. Leverage your advanced architecture capabilities and collaborate with colleagues across the organization to drive best-in-class outcomes.

The Data Management group is part of the Chief Analytics and Data Office at JPMorgan Chase. It operates firmwide, liaising with every line of business to deliver innovative data management products, services, and value-add solutions. Our vision is to continuously evolve using best-in-class tools and technologies in managing vast amounts of data to enable AI/ML initiatives throughout the firm and enhance our ability to provide market-leading capabilities. Our philosophy is to function in a product operating model to iteratively discover, design, develop, and deliver great outcomes to our customers quickly and securely. Our team delivers comprehensive data management solutions for making JPMC’s data discoverable, understandable, observable, trustworthy, accessible, and interoperable for authorized personas to drive BI and AI/ML initiatives with agility and velocity. It includes centers of excellence for multiple product domains, such as Data Lake Services, Data Migration Services, Unified Data Catalog and Controls, Entitlements Services, and Data Analytics & Business Intelligence services, to drive digital transformation, cloud adoption, and AI/ML capabilities throughout the firm.

Job Responsibilities
Advises cross-functional teams on technology selection to achieve target state architecture and decisions on improvements to current technologies
Develops multi-year roadmaps aligned with business and architecture strategy and priorities
Creates complex and scalable coding frameworks using appropriate software design
Collaborates with product owners, engineers, clients, and vendors to establish and maintain platform objectives and target state architecture
Creates and evolves strategic architecture that enables our clients to achieve business objectives (requires experience with C4, DDD, data modeling, and knowledge of SDLC practices)
Focuses on integrated “user experience” to enforce a systems-thinking approach as part of the platform delivery
Organizes and manages solution libraries that focus on the end-to-end data lifecycle, from data origination through data management, data governance, and data consumption
Develops secure and high-quality production code, and reviews and debugs code written by others
Serves as the function’s go-to subject matter expert
Contributes to the development of technical methods in specialized fields in line with the latest product development methodologies
Creates durable, reusable software frameworks that improve velocity and quality of output across teams and functions
Champions the firm’s culture of diversity, equity, inclusion, and respect

Required Qualifications, Capabilities, and Skills
Formal training or certification on architecture concepts and 10+ years of applied experience
Experience applying expertise and new methods to determine solutions for complex architecture problems in one or more technical disciplines
Hands-on practical experience delivering system design, application development, testing, and operational stability
Expertise in one or more programming languages, such as Python or Java
Expert-level experience with AWS or other public cloud providers
Expert level with Databricks, Snowflake, Starburst, Trino, AWS Data, AWS Analytics, and AWS AI/ML
Hands-on experience delivering BI and AI/ML solutions for business stakeholders and data scientists
Influencer with a proven record of successfully driving change and transformation across organizational boundaries
Ability to present and effectively communicate to senior leaders and executives
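The "durable, reusable software frameworks" bullet is abstract, so here is one common shape such a framework can take: a small pipeline-step abstraction that teams subclass rather than re-plumb. This is purely illustrative, not JPMC's internal design:

from abc import ABC, abstractmethod
from typing import Any

class PipelineStep(ABC):
    """One reusable unit of work; teams subclass this instead of re-plumbing."""

    @abstractmethod
    def run(self, payload: Any) -> Any: ...

class Pipeline:
    """Chains steps so cross-cutting concerns (logging, retries) live in one place."""

    def __init__(self, steps: list[PipelineStep]):
        self.steps = steps

    def run(self, payload: Any = None) -> Any:
        for step in self.steps:
            payload = step.run(payload)  # each step validates, transforms, or publishes
        return payload

# Usage: Pipeline([Extract(), Validate(), PublishToCatalog()]).run()
# where Extract, Validate, and PublishToCatalog are team-specific subclasses.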

Posted 2 days ago

Apply

10.0 years

1 - 2 Lacs

Hyderābād

On-site

GlassDoor logo

JOB DESCRIPTION

Join the Chief Data and Analytics Organization to architect and evolve the firmwide Data Mesh, AI/ML and GenAI, and Data Governance platform. As a Senior Principal Architect at JPMorgan Chase within the Enterprise Technology function, you provide expertise to enhance and develop architecture platforms based on modern cloud-based technologies, as well as support the adoption of strategic global solutions. Leverage your advanced architecture capabilities and collaborate with colleagues across the organization to drive best-in-class outcomes.

The Data Management group is part of the Chief Analytics and Data Office at JPMorgan Chase. It operates firmwide, liaising with every line of business to deliver innovative data management products, services, and value-add solutions. Our vision is to continuously evolve using best-in-class tools and technologies in managing vast amounts of data to enable AI/ML initiatives throughout the firm and enhance our ability to provide market-leading capabilities. Our philosophy is to function in a product operating model to iteratively discover, design, develop, and deliver great outcomes to our customers quickly and securely. Our team delivers comprehensive data management solutions for making JPMC’s data discoverable, understandable, observable, trustworthy, accessible, and interoperable for authorized personas to drive BI and AI/ML initiatives with agility and velocity. It includes centers of excellence for multiple product domains, such as Data Lake Services, Data Migration Services, Unified Data Catalog and Controls, Entitlements Services, and Data Analytics & Business Intelligence services, to drive digital transformation, cloud adoption, and AI/ML capabilities throughout the firm.

Job Responsibilities
Advises cross-functional teams on technology selection to achieve target state architecture and decisions on improvements to current technologies
Develops multi-year roadmaps aligned with business and architecture strategy and priorities
Creates complex and scalable coding frameworks using appropriate software design
Collaborates with product owners, engineers, clients, and vendors to establish and maintain platform objectives and target state architecture
Creates and evolves strategic architecture that enables our clients to achieve business objectives (requires experience with C4, DDD, data modeling, and knowledge of SDLC practices)
Focuses on integrated “user experience” to enforce a systems-thinking approach as part of the platform delivery
Organizes and manages solution libraries that focus on the end-to-end data lifecycle, from data origination through data management, data governance, and data consumption
Develops secure and high-quality production code, and reviews and debugs code written by others
Serves as the function’s go-to subject matter expert
Contributes to the development of technical methods in specialized fields in line with the latest product development methodologies
Creates durable, reusable software frameworks that improve velocity and quality of output across teams and functions
Champions the firm’s culture of diversity, equity, inclusion, and respect

Required Qualifications, Capabilities, and Skills
Formal training or certification on architecture concepts and 10+ years of applied experience
Experience applying expertise and new methods to determine solutions for complex architecture problems in one or more technical disciplines
Hands-on practical experience delivering system design, application development, testing, and operational stability
Expertise in one or more programming languages, such as Python or Java
Expert-level experience with AWS or other public cloud providers
Expert level with Databricks, Snowflake, Starburst, Trino, AWS Data, AWS Analytics, and AWS AI/ML
Hands-on experience delivering BI and AI/ML solutions for business stakeholders and data scientists
Influencer with a proven record of successfully driving change and transformation across organizational boundaries
Ability to present and effectively communicate to senior leaders and executives

ABOUT US
JPMorganChase, one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses and many of the world’s most prominent corporate, institutional and government clients under the J.P. Morgan and Chase brands. Our history spans over 200 years and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing and asset management. We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants’ and employees’ religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation.

ABOUT THE TEAM
Our professionals in our Corporate Functions cover a diverse range of areas from finance and risk to human resources and marketing. Our corporate teams are an essential part of our company, ensuring that we’re setting our businesses, clients, customers and employees up for success.

Posted 2 days ago

Apply

15.0 years

0 Lacs

Hyderābād

On-site

GlassDoor logo

Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: Business Agility
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs. Additionally, you will monitor and optimize data workflows to enhance performance and reliability, ensuring that data is accessible and actionable for stakeholders.

Roles & Responsibilities:
- Need Databricks resource with Azure cloud experience
- Expected to perform independently and become an SME
- Required active participation/contribution in team discussions
- Contribute to providing solutions to work-related problems
- Collaborate with data architects and analysts to design scalable data solutions
- Implement best practices for data governance and security throughout the data lifecycle

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Databricks Unified Data Analytics Platform
- Good-to-Have Skills: Experience with Business Agility
- Strong understanding of data modeling and database design principles
- Experience with data integration tools and ETL processes
- Familiarity with cloud platforms and services related to data storage and processing

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform
- This position is based at our Pune office
- A 15-year full-time education is required
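On Azure Databricks, the incremental ETL this role centers on is commonly built with Auto Loader, which ingests only files it has not seen before. A minimal sketch follows (Databricks-specific APIs; the landing and checkpoint paths are invented):

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical landing zone and checkpoint locations.
SOURCE = "abfss://landing@exampleaccount.dfs.core.windows.net/events/"
CHECKPOINT = "/mnt/checkpoints/events/"

# Auto Loader tracks which files it has already ingested, so each run
# picks up only new arrivals: the core of an incremental pipeline.
stream = (spark.readStream
    .format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", CHECKPOINT)
    .load(SOURCE))

(stream.writeStream
    .option("checkpointLocation", CHECKPOINT)
    .trigger(availableNow=True)  # drain what's new, then stop (incremental batch)
    .toTable("bronze.events"))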

Posted 2 days ago

Apply

0 years

10 Lacs

Hyderābād

On-site

GlassDoor logo

Job Description: We are seeking a skilled Data Engineer to join our team, with a dual focus on infrastructure maintenance and seamless onboarding of data views. The ideal candidate will play a key role in ensuring stable data platform operations while enabling efficient data integration, especially in the context of complex upstream changes and fiscal year transitions.

Key Responsibilities:
- Perform infrastructure maintenance, including:
  - Azure subscription management
  - Azure infrastructure and platform operations
  - ETL pipeline monitoring
  - Source path validation and updates
  - Proactive issue identification and resolution
- Manage data onboarding activities, including:
  - Integration of new data sources and views
  - Adjustments for FY rollover and evolving upstream systems
- Collaborate with cross-functional teams to align platform readiness with the business

Required Skills & Qualifications:
Education: Bachelor's or master's degree in computer science, engineering, information systems, or a related technical field.
Programming Languages: Proficiency in one or more programming languages commonly used in data engineering (e.g., Python, Java, Scala, SQL).
Database Expertise: Strong knowledge of database systems, data modeling techniques, and advanced SQL proficiency. Experience with NoSQL databases is often a plus.
ETL Tools & Concepts: Solid understanding of ETL/ELT processes and experience with relevant tools (e.g., Apache Airflow, Talend, Databricks, Azure Data Factory).

Job Type: Full-time
Pay: From ₹1,000,000.00 per year
Schedule: Day shift
Work Location: In person
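"Source path validation" can be as simple as probing the expected ADLS paths before the nightly run. A hedged sketch using the azure-storage-file-datalake SDK follows; the account, container, and paths are invented for illustration:

from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

# Hypothetical account and container; credentials resolve from the environment.
service = DataLakeServiceClient(
    account_url="https://exampleaccount.dfs.core.windows.net",
    credential=DefaultAzureCredential(),
)
fs = service.get_file_system_client("landing")

# Check each expected source path before the nightly ETL run; upstream
# renames (e.g., an FY rollover in the folder name) are a common failure cause.
expected = ["sales/FY2025/", "inventory/current/"]
for path in expected:
    try:
        next(iter(fs.get_paths(path=path, max_results=1)))
    except Exception:
        print(f"ALERT: source path missing or empty: {path}")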

Posted 2 days ago

Apply

7.0 - 9.0 years

6 - 7 Lacs

Thiruvananthapuram

On-site

GlassDoor logo

7 - 9 Years | 1 Opening | Trivandrum

Role description
1. Production monitoring and troubleshooting in on-prem ETL and AWS environments
2. Working experience using ETL DataStage along with DB2
3. Awareness of tools such as Dynatrace, AppDynamics, Postman, and AWS CI/CD
4. Software code development experience in ETL batch processing and AWS cloud
5. Software code management, repository updates, and reuse
6. Implementation and/or configuration, management, and maintenance of software
7. Implementation and configuration of SaaS and public, private, and hybrid cloud-based PaaS solutions
8. Integration of SaaS and PaaS solutions with Data Warehouse Application Systems, including SaaS and PaaS upgrade management
9. Configuration, maintenance, and support for the entire DWA Application Systems landscape, including but not limited to supporting DWA Application Systems components and tasks required to deliver business processes and functionality (e.g., logical layers of databases, data marts, logical and physical data warehouses, middleware, interfaces, shell scripts, massive data transfer and uploads, web development, mobile app development, web services, and APIs)
10. DWA Application Systems support for day-to-day changes and business continuity and for addressing key business, regulatory, legal, or fiscal requirements
11. Support for all third-party specialized DWA Application Systems
12. DWA Application Systems configuration and collaboration with the infrastructure service supplier required to provide application access to external/third parties
13. Integration with internal and external systems (e.g., direct application interfaces, logical middleware configuration, and application program interface (API) use and development)
14. Collaboration with third-party suppliers such as the infrastructure service supplier and enterprise public cloud providers
15. Documentation and end-user training for new functionality
16. All activities required to support business process application functionality and to deliver the required application and business functions to end users in an integrated service delivery model across the DWA application development lifecycle (e.g., plan, deliver, run); maintain data quality and run batch schedules; operations and maintenance
17. Deploy code to all environments (Prod, UAT, Performance, SIT, etc.)
18. Address all open tickets within the SLA

CDK (TypeScript), CFT (YAML)
Nice to have: GitHub; scripting (Bash/sh); security-minded, best practices known; Python; Databricks & Snowflake

Skills: Databricks, DataStage, CloudOps, production support

About UST
UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world’s best companies to make a real impact through transformation. Powered by technology, inspired by people, and led by purpose, UST partners with their clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into their clients’ organizations. With over 30,000 employees in 30 countries, UST builds for boundless impact—touching billions of lives in the process.

Posted 2 days ago

Apply

6.0 - 7.0 years

4 - 6 Lacs

Gurgaon

On-site

GlassDoor logo

Job Purpose
We are looking for a skilled and versatile Data Engineer with hands-on experience in Databricks and backend development using either .NET or FastAPI. The ideal candidate will be responsible for building scalable data pipelines, integrating APIs, and enabling data-driven applications in a cloud-based environment.

Desired Skills and Experience
- 6-7 years' experience as a purely hands-on Databricks engineer
- Strong understanding of full-stack application development
- Experience within the pharma domain is a bonus
- Programming Languages: Python, PySpark
- Tools: Databricks, MLflow
- Able to work independently, without the need for close supervision, and collaboratively as part of cross-team efforts
- Experience with delivering projects within an agile environment

Key Responsibilities
- Own and deliver bug fixes and feature development in Databricks
- Work independently on Python/PySpark development
- Own the development and maintenance of Databricks workflows
- Deploy external machine learning models in Databricks, with a good understanding of the Databricks AI/ML stack
- Hands-on experience integrating Databricks data with .NET/FastAPI APIs
- Hands-on experience migrating data across Databricks clusters
- Experience handling data quality issues on the Databricks platform
- Able to deploy Databricks models in the code orange Unity Catalog
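Integrating Databricks data with a FastAPI backend, as this role requires, usually goes through the Databricks SQL connector. A minimal sketch follows, with hypothetical connection settings, route, and table name:

from fastapi import FastAPI, HTTPException
from databricks import sql  # pip install databricks-sql-connector
import os

app = FastAPI()

def query_warehouse(statement: str):
    # Hypothetical connection settings, read from the environment; in practice
    # they come from a Databricks SQL warehouse's connection details tab.
    with sql.connect(
        server_hostname=os.environ["DATABRICKS_HOST"],
        http_path=os.environ["DATABRICKS_HTTP_PATH"],
        access_token=os.environ["DATABRICKS_TOKEN"],
    ) as conn:
        with conn.cursor() as cur:
            cur.execute(statement)
            return cur.fetchall()

@app.get("/metrics/{batch_id}")
def batch_metrics(batch_id: int):
    # batch_id is typed as int by FastAPI, so interpolation here is safe;
    # string inputs should go through the connector's parameter binding instead.
    rows = query_warehouse(
        f"SELECT metric, value FROM gold.batch_metrics WHERE batch_id = {batch_id}"
    )
    if not rows:
        raise HTTPException(status_code=404, detail="batch not found")
    return [{"metric": m, "value": v} for m, v in rows]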

Posted 2 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies