5.0 - 8.0 years
6 - 10 Lacs
Pune
Hybrid
Position: Cloud Data Engineer. Mandatory Skills: Cloud-PaaS-GCP-Google Cloud Platform. Experience Required: 5-8 years. Additional Experience: 8-13 years. Work Location: Wipro, PAN India. Work Arrangement: Hybrid model with 3 days in the Wipro office. Job Description: - Strong expertise in SQL. - Proficient in Python. - Excellent knowledge of any cloud technology (AWS, Azure, GCP, etc.), with a preference for GCP. - Familiarity with PySpark is preferred.
Posted 3 days ago
4.0 - 6.0 years
7 - 12 Lacs
Hyderabad
Work from Office
As a Senior Cloud Data Platform (GCP) Engineer at Incedo, you will be responsible for managing and optimizing the Google Cloud Platform environment, ensuring its performance, scalability, and security. You will work closely with data analysts and data scientists to develop data pipelines and run data science experiments. You will be skilled in cloud computing platforms such as AWS or Azure and have experience with big data technologies such as Hadoop or Spark. You will be responsible for configuring and optimizing the GCP environment, ensuring that data pipelines are efficient and accurate, and troubleshooting any issues that arise. You will also work with the security team to ensure that the GCP environment is secure and complies with relevant regulations. Roles & Responsibilities: Designing, developing and deploying cloud-based data platforms using Google Cloud Platform (GCP). Integrating and processing large amounts of structured and unstructured data from various sources. Implementing and optimizing ETL processes and data pipelines. Developing and maintaining security and access controls. Collaborating with other teams to ensure the consistency and integrity of data. Troubleshooting and resolving data platform issues. Technical Skills Requirements: In-depth knowledge of GCP services and tools such as Google Cloud Storage, Google BigQuery, and Google Cloud Dataflow. Experience in building scalable and reliable data pipelines using GCP services, Apache Beam, and related big data technologies (see the sketch below). Familiarity with cloud-based infrastructure and deployment, specifically on GCP. Strong knowledge of programming languages such as Python, Java, and SQL. Must have excellent communication skills and be able to communicate complex technical information to non-technical stakeholders in a clear and concise manner. Must understand the company's long-term vision and align with it. Provide leadership, guidance, and support to team members, ensuring the successful completion of tasks and promoting a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team. Qualifications: 4-6 years of work experience in a relevant field. B.Tech/B.E/M.Tech or MCA degree from a reputed university. A computer science background is preferred.
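The Dataflow/Apache Beam pipeline work described above can be illustrated with a minimal sketch. This is not Incedo's actual code: the project ID, bucket, dataset, table, column names and schema are placeholder assumptions, and a real pipeline would add validation, error handling and monitoring.

```python
# Hypothetical Apache Beam pipeline (Python SDK): read CSV from Cloud Storage,
# parse rows, and load them into BigQuery via the Dataflow runner.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_line(line: str) -> dict:
    # Assumes a simple "event_id,amount" CSV layout purely for illustration.
    event_id, amount = line.split(",")
    return {"event_id": event_id, "amount": float(amount)}


options = PipelineOptions(
    runner="DataflowRunner",
    project="example-project",             # placeholder project
    region="us-central1",
    temp_location="gs://example-bucket/temp",
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadCsv" >> beam.io.ReadFromText("gs://example-bucket/raw/events.csv",
                                            skip_header_lines=1)
        | "Parse" >> beam.Map(parse_line)
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "example-project:analytics.events",
            schema="event_id:STRING,amount:FLOAT",
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```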
Posted 3 days ago
4.0 - 6.0 years
6 - 10 Lacs
Gurugram
Work from Office
As a Senior Big Data Platform Engineer at Incedo, you will be responsible for designing and implementing big data platforms to support large-scale data integration projects. You will work with data architects and data engineers to define the platform architecture and build the necessary infrastructure. You will be skilled in big data technologies such as Hadoop, Spark, and Kafka and have experience with cloud computing platforms such as AWS or Azure. You will be responsible for ensuring the performance, scalability, and security of the big data platform and troubleshooting any issues that arise. Roles & Responsibilities: Designing, developing and maintaining large-scale big data platforms using technologies like Hadoop, Spark and Kafka. Creating and managing data warehouses, data lakes and data marts. Implementing and optimizing ETL processes and data pipelines. Developing and maintaining security and access controls. Troubleshooting and resolving big data platform issues. Collaborating with other teams to ensure the consistency and integrity of data. Technical Skills Requirements: Experience with big data processing technologies such as Apache Hadoop, Apache Spark, or Apache Kafka. Understanding of distributed computing concepts such as MapReduce, Spark RDDs, or Apache Flink data streams. Familiarity with big data storage solutions such as HDFS, Amazon S3, or Azure Data Lake Storage. Knowledge of big data processing frameworks such as Apache Hive, Apache Pig, or Apache Impala. Must have excellent communication skills and be able to communicate complex technical information to non-technical stakeholders in a clear and concise manner. Must understand the company's long-term vision and align with it. Provide leadership, guidance, and support to team members, ensuring the successful completion of tasks and promoting a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team. Qualifications: 4-6 years of work experience in a relevant field. B.Tech/B.E/M.Tech or MCA degree from a reputed university. A computer science background is preferred.
Posted 3 days ago
3.0 - 8.0 years
6 - 10 Lacs
Gurugram
Work from Office
Role Description: Understands the process flow and its impact on the project module outcome. Works on coding assignments for specific technologies based on the project requirements and available documentation. Debugs basic software components and identifies code defects. Focuses on building depth in project-specific technologies. Expected to develop domain knowledge along with technical skills. Communicates effectively with team members, project managers and clients, as required. A proven high performer and team player, with the ability to take the lead on projects. Responsibilities: Design and create S3 buckets and folder structures (raw, cleansed_data, output, script, temp-dir, spark-ui). Develop AWS Lambda functions (Python/Boto3) to download the Bhav Copy via REST API and ingest it into S3. Author and maintain AWS Glue Spark jobs to partition data by scrip, year and month and to convert CSV to Parquet with Snappy compression (a sketch of such a job follows below). Configure and run AWS Glue Crawlers to populate the Glue Data Catalog. Write and optimize AWS Athena SQL queries to generate business-ready datasets. Monitor, troubleshoot and tune data workflows for cost and performance. Document architecture, code and operational runbooks. Collaborate with analytics and downstream teams to understand requirements and meet SLAs. Technical Skills: 3+ years of hands-on experience with AWS data services (S3, Lambda, Glue, Athena). PostgreSQL basics. Proficiency in SQL and data partitioning strategies. Experience with the Parquet file format and compression techniques (Snappy). Ability to configure Glue Crawlers and manage the AWS Glue Data Catalog. Understanding of serverless architecture and best practices in security, encryption and cost control. Good documentation, communication and problem-solving skills. Qualifications: 3-5 years of work experience in a relevant field. B.Tech/B.E/M.Tech or MCA degree from a reputed university. A computer science background is preferred.
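As a rough illustration of the Glue Spark job described above (CSV to Parquet with Snappy compression, partitioned by scrip, year and month), here is a minimal plain-PySpark sketch. Bucket names, the date format, and column names such as SYMBOL and TIMESTAMP are assumptions for illustration; a production AWS Glue job would typically wrap this logic in a GlueContext and add job bookmarks, logging and error handling.

```python
# Hypothetical sketch: convert raw Bhav Copy CSVs in S3 to partitioned,
# Snappy-compressed Parquet. Paths and column names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("bhavcopy-csv-to-parquet").getOrCreate()

raw = (spark.read
       .option("header", "true")
       .csv("s3://example-bucket/raw/bhavcopy/"))

cleansed = (raw
            .withColumn("trade_date", F.to_date("TIMESTAMP", "dd-MMM-yyyy"))
            .withColumn("year", F.year("trade_date"))
            .withColumn("month", F.month("trade_date")))

(cleansed.write
 .mode("append")
 .partitionBy("SYMBOL", "year", "month")   # partition by scrip, year and month
 .option("compression", "snappy")
 .parquet("s3://example-bucket/cleansed_data/bhavcopy/"))
```

A Glue Crawler pointed at the cleansed_data prefix would then register the partitions in the Glue Data Catalog so that Athena can query the resulting tables.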
Posted 3 days ago
6.0 - 11.0 years
8 - 12 Lacs
Bengaluru
Work from Office
About the job: Zeta Global is looking for an experienced ML Engineer with industry-proven, hands-on experience of delivering machine learning models to production to solve business problems. To be a good fit to join our AI/ML team, you should ideally: Be a thought leader who can work with cross-functional partners to foster a data-driven organization. Be a strong team player, with experience contributing to a large project as part of a collaborative team effort. Have extensive knowledge of and expertise with machine learning best practices and industry standards. Be able to design and execute experiments to measure the impact of new features and machine learning models on business outcomes. Empower the product and engineering teams to make data-driven decisions. What you need to succeed: 4-6 years of proven, hands-on industry experience as a Machine Learning Engineer. Strong command of Python and SQL; experience with distributed computing frameworks like Spark or Hadoop is a significant plus. A bachelor's degree in Computer Science, Mathematics, Physics, Statistics, Operations Research, or a related field is preferred. Solid experience with ML infrastructure (including model deployment, evaluation, data processing, and debugging), with a track record of building scalable ML solutions from business requirements. Proficient in applied LLMs and NLP, with experience developing solutions using frameworks such as LangChain and LangSmith, and backend tools like FastAPI. Hands-on experience with LLMs like GPT or Claude is highly desirable. Strong background in building, deploying, and integrating AI agents using LLMs is a big plus. Comfortable with software engineering best practices, including version control, unit/integration testing, and setting up CI/CD pipelines for reliable ML deployment. Committed to high code quality, with an emphasis on thorough code reviews and testing. Excellent communication skills, with the ability to collaborate across technical and non-technical teams and clearly present outcomes to business stakeholders. Our technology runs on the Zeta Marketing Platform, which powers end-to-end marketing programs for some of the world's leading brands. With expertise encompassing all digital marketing channels (Email, Display, Social, Search and Mobile), Zeta orchestrates acquisition and engagement programs that deliver results that are scalable, repeatable and sustainable.
Posted 3 days ago
11.0 - 12.0 years
19 - 25 Lacs
Hyderabad
Work from Office
As a Principal Engineer - Data Science and Modeling at Incedo, you will be responsible for developing and deploying predictive models and machine learning algorithms to support business decision-making. You will work with data scientists, data engineers, and business analysts to understand business requirements and develop data-driven solutions. You will be skilled in programming languages such as Python or R and have experience with data science tools such as TensorFlow or Keras. You will be responsible for ensuring that models are accurate, efficient, and scalable. Roles & Responsibilities: Developing and implementing machine learning models and algorithms to solve complex business problems. Conducting data analysis and modeling using statistical and data analysis tools. Collaborating with other teams to ensure the consistency and integrity of data. Providing guidance and mentorship to junior data science and modeling specialists. Presenting findings and recommendations to stakeholders. Fostering a collaborative and supportive work environment, promoting open communication and teamwork. Demonstrating strong leadership skills, with the ability to inspire and motivate team members to perform at their best. Technical Skills Requirements: Proficiency in statistical analysis techniques such as regression analysis, hypothesis testing, or time-series analysis. Knowledge of machine learning algorithms and techniques such as supervised learning, unsupervised learning, or reinforcement learning. Experience with data wrangling and data cleaning techniques using tools such as Python, R, or SQL. Understanding of big data technologies such as Hadoop, Spark, or Hive. Must have excellent communication skills and be able to communicate complex technical information to non-technical stakeholders in a clear and concise manner. Must understand the company's long-term vision and align with it. Should be able to identify problems, generate effective solutions, and troubleshoot issues that may arise while working on complex projects. Must display adaptability to changing circumstances, new technology, and shifting priorities. Qualifications: 11-12 years of work experience in a relevant field. B.Tech/B.E/M.Tech or MCA degree from a reputed university. A computer science background is preferred.
Posted 3 days ago
5.0 - 10.0 years
7 - 11 Lacs
Bengaluru
Work from Office
Senior Software Engineer - Data: We're seeking a Senior Software Engineer or a Lead Software Engineer to join one of our Data Layer teams. As the name implies, the Data Layer is at the core of all things data at Zeta. Our responsibilities include: Developing and maintaining the Zeta Identity Graph platform, which collects billions of behavioural, demographic, location and transactional signals to power people-based marketing. Ingesting vast amounts of identity and event data from our customers and partners. Facilitating data transfers across systems. Ensuring the integrity and health of our datasets. And much more. As a member of this team, the data engineer will be responsible for designing and expanding our existing data infrastructure, enabling easy access to data, supporting complex data analyses, and automating optimization workflows for business and marketing operations. Essential Responsibilities: As a Senior Software Engineer or a Lead Software Engineer, your responsibilities will include: Building, refining, tuning, and maintaining our real-time and batch data infrastructure. Daily use of technologies such as Spark, Airflow, Snowflake, Hive, Scylla, Django, FastAPI, etc. (an illustrative scheduling sketch follows below). Maintaining data quality and accuracy across production data systems. Working with Data Engineers to optimize data models and workflows. Working with Data Analysts to develop ETL processes for analysis and reporting. Working with Product Managers to design and build data products. Working with our DevOps team to scale and optimize our data infrastructure. Participating in architecture discussions, influencing the roadmap, and taking ownership of and responsibility for new projects. Participating in the on-call rotation in their respective time zones (being available by phone or email in case something goes wrong). Desired Characteristics: Minimum 5-10 years of software engineering experience. Proven long-term experience with, and enthusiasm for, distributed data processing at scale, and eagerness to learn new things. Expertise in designing and architecting distributed, low-latency, scalable solutions in either cloud or on-premises environments. Exposure to the whole software development lifecycle, from inception to production and monitoring. Fluency in Python, or solid experience in Scala or Java. Proficiency with relational databases and advanced SQL. Expert in the use of services like Spark and Hive. Experience with web frameworks such as Flask or Django. Experience with schedulers such as Apache Airflow, Apache Luigi, Chronos, etc. Experience with Kafka or other stream message processing solutions. Experience using cloud services (AWS) at scale. Experience with agile software development processes. Excellent interpersonal and communication skills. Nice to have: Experience with large-scale / multi-tenant distributed systems. Experience with columnar / NoSQL databases such as Vertica, Snowflake, HBase, Scylla, or Couchbase. Experience with real-time streaming frameworks such as Flink or Storm. Experience with open table formats such as Iceberg, Hudi or Delta Lake.
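Since the role lists Airflow alongside Spark for batch orchestration, here is a small hedged sketch of what such a scheduled workflow might look like. The DAG id, scripts and spark-submit command are invented for illustration (and this assumes Airflow 2.4+ where the `schedule` argument is available); it is not Zeta's actual pipeline code.

```python
# Hypothetical daily batch DAG: submit a Spark ingestion job, then run a
# data-quality check. All task commands and paths are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="identity_graph_daily_batch",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest = BashOperator(
        task_id="spark_ingest",
        bash_command=(
            "spark-submit --deploy-mode cluster "
            "s3://example-bucket/jobs/ingest_events.py --date {{ ds }}"
        ),
    )
    quality_check = BashOperator(
        task_id="data_quality_check",
        bash_command="python /opt/jobs/check_event_counts.py --date {{ ds }}",
    )

    ingest >> quality_check
```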
Posted 3 days ago
8+ years
0 Lacs
Mumbai, Maharashtra, India
On-site
Job Description Associate Director – New Products Introduction Our organization is a global health care leader with a diversified portfolio of prescription medicines, vaccines and animal health products. The difference between potential and achievement lies in the spark that fuels innovation and inventiveness; this is the space where our organization has codified its 125-year legacy. Our success is backed by ethical integrity, forward momentum, and an inspiring mission to achieve new milestones in global healthcare. As our organization in India moves towards accelerating access and rapidly scaling the commercial business, we are seeking a New Products Lead. The NPI Lead will play a strong role in supporting the India Leadership team, ensuring that the assets in our company's product pipeline are evaluated from a commercial perspective and readied for timely Launch & Access in India. Key Responsibilities Pipeline Management Proactively identify, assess, and prioritize new pharmaceutical products and candidates within our company's global pipeline that have the potential to meet the specific healthcare needs and market opportunities in India. This involves conducting rigorous scientific and commercial evaluations, aligning product attributes with local epidemiology, regulatory requirements, and patient demographics. Monitor the progress of these products through various development stages, ensuring timely decision-making to optimize the portfolio for maximum impact and return on investment. Stakeholder Engagement Establish and nurture strong, collaborative relationships with a diverse range of stakeholders at market, regional and global levels, including regional business leaders like APLT, global product teams, clinical development and regulatory affairs teams, and external partners such as healthcare providers and key opinion leaders. Facilitate transparent and effective communication channels to ensure alignment on strategic priorities. Provide regular, data-driven updates and strategic insights to senior leadership, highlighting the potential value, risks, and opportunities within the pipeline. Act as a trusted advisor by delivering actionable recommendations based on comprehensive analysis and market intelligence. Market Landscape Analysis Conduct in-depth analyses of the Indian pharmaceutical market landscape, including competitive products, emerging therapies, regulatory trends, payer environments, and patient access challenges. Develop a nuanced understanding of therapeutic areas relevant to the pipeline, identifying unmet medical needs and potential barriers to entry. Utilize this intelligence to inform and refine product positioning, pricing strategies, and market access plans, ensuring that the product strategy is both competitive and aligned with broader healthcare system goals. Financial Modeling Design and execute detailed financial models to evaluate the commercial viability and forecast the financial performance of new product candidates. Incorporate variables such as market size, pricing scenarios, reimbursement pathways, cost of goods, marketing expenses, and competitive dynamics. Use these models to support investment decisions, resource allocation, and strategic planning, providing clear, quantitative justifications for product prioritization and launch readiness. Cross-Functional Collaboration Support cross-functional teams including marketing, sales, regulatory affairs, medical affairs, and supply chain in coordinating product development and launch activities.
Assist in organizing workshops, strategy sessions, and project tracking to help keep teams aligned on goals, timelines, and deliverables. Help gather and incorporate market insights and customer feedback into product development efforts to contribute to effective and market-driven launch plans. Strategy Development Develop and maintain a clear, strategic agenda for all product-related discussions and decision-making forums. Ensure that these agendas are focused on addressing the most critical unmet medical needs in the Indian market while aligning with our company's overarching corporate goals and values. Prepare detailed briefing materials, prioritize discussion topics, and guide meetings to foster productive dialogue and consensus-building among stakeholders. Continuously revisit and adjust the agenda based on evolving market conditions, pipeline developments, and organizational priorities. Qualifications & Experience Graduate/postgraduate from a top-tier engineering or business school. 8+ years of experience in a project management, strategy and/or analytics role, ideally in a healthcare corporate, a startup and/or a leading consulting firm. Skills Shaping the future of our company's new product development landscape. Delivering in an increasingly complex industry landscape, navigating multiple evolving challenges. Execution Excellence Project execution, attention to detail and coordination skills. Proven track record of "getting things done". Strong analytical and problem-solving skills with experience in Excel analysis/modeling. Excellent communication and interpersonal skills, with the ability to engage and influence stakeholders at all levels – regional and global. Proven ability to work collaboratively in cross-functional teams. Analytical thinking: the ability to problem solve, think holistically about organizational solutions to complex problems, and create order and simplicity out of complex problems. We are a research-driven biopharmaceutical company. Our mission is built on the simple premise that if we "follow the science", great medicines can make a significant impact on our world. And we believe that a research-driven enterprise dedicated to world-class science can succeed by inventing medicine and vaccine innovations that make a difference for patients across the globe. Current Employees apply HERE Current Contingent Workers apply HERE Search Firm Representatives Please Read Carefully Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs / resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails.
Employee Status: Regular. Flexible Work Arrangements: Hybrid. Required Skills: Adaptability, Business, Content Creation, Cross-Functional Teamwork, Customer Engagement, Data Analysis, Digital Marketing, Direct Marketing, Email Marketing, Innovation, Marketing Automation, Marketing Communications Planning, Product Management, Product Planning, Search Engine Marketing (SEM), Social Media Operations, Software Product Management, Strategic Thinking, Vendor Management. Job Posting End Date: 08/1/2025. A job posting is effective until 11:59:59 PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day BEFORE the job posting end date. Requisition ID: R357809
Posted 3 days ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Reference # 319538BR Job Type Full Time Your role Are you fascinated by advanced analytics leveraging the latest Big Data technologies? We're looking for a Senior Data Engineer to design, develop and lead the implementation of data products for analytics and business intelligence systems. In this role, you will: be involved in the end-to-end development lifecycle of data engineering projects, from conceptualization to deployment, ensuring high-quality and scalable solutions; recognize opportunities to reuse existing data flows and optimize them for performance; lead the build of data ingestion and transformations that align with business needs; ensure the quality, security, reliability, and compliance of our solutions by applying our digital principles and implementing both functional and non-functional requirements; work on data management - build, test, and maintain data pipelines; foster a culture of continuous learning and a mindset of finding ways to improve, automate and reduce time to market. Your team You'll be part of the WMA STAAT Analytics team in Pune. We are developing cutting-edge digital solutions to help internal and external clients make better decisions with data. You will be part of a growing team that is focused on blending solutions covering digital marketing, technology, optimization and analytics to drive business results and digital transformation. You'll be working in a global team, with colleagues in the United Kingdom, the United States and India. Your expertise: understanding of data modelling, data warehousing principles and lakehouse architecture; excellent hands-on/programming skills in big data analytics on Azure cloud and on-prem technologies - Spark, Python, Azure Synapse, Data Factory, Azure Databricks; advanced working knowledge of SQL with experience in DWH/ETL implementation and understanding of ETL/ELT design patterns; good understanding of columnar and/or time-series data design patterns and performance tuning techniques; understanding of the impact on the organisation of emerging trends in data tools, analysis techniques and data usage; understanding of data/information security principles to ensure compliant handling and management of data; experience in application development, including analyzing stories, writing code, implementing automated tests, contributing to release and iteration planning and developing the working practices of the team; a conceptual understanding of software development, including the full project lifecycle, from working on multiple substantial projects using agile development methodologies. About Us UBS is the world's largest and the only truly global wealth manager. We operate through four business divisions: Global Wealth Management, Personal & Corporate Banking, Asset Management and the Investment Bank. Our global reach and the breadth of our expertise set us apart from our competitors. We have a presence in all major financial centers in more than 50 countries. How We Hire We may request you to complete one or more assessments during the application process. Learn more Join us At UBS, we know that it's our people, with their diverse skills, experiences and backgrounds, who drive our ongoing success. We're dedicated to our craft and passionate about putting our people first, with new challenges, a supportive team, opportunities to grow and flexible working options when possible. Our inclusive culture brings out the best in our employees, wherever they are on their career journey. We also recognize that great work is never done alone.
That’s why collaboration is at the heart of everything we do. Because together, we’re more than ourselves. We’re committed to disability inclusion and if you need reasonable accommodation/adjustments throughout our recruitment process, you can always contact us. Disclaimer / Policy Statements UBS is an Equal Opportunity Employer. We respect and seek to empower each individual and support the diverse cultures, perspectives, skills and experiences within our workforce.
Posted 3 days ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Reference # 320031BR Job Type Full Time Your role Are you fascinated by advanced analytics leveraging the latest Big Data technologies? We're looking for a Senior Data Engineer to design, develop and lead the implementation of data products for analytics and business intelligence systems. In this role, you will: be involved in the end-to-end development lifecycle of data engineering projects, from conceptualization to deployment, ensuring high-quality and scalable solutions; recognize opportunities to reuse existing data flows and optimize them for performance; lead the build of data ingestion and transformations that align with business needs; ensure the quality, security, reliability, and compliance of our solutions by applying our digital principles and implementing both functional and non-functional requirements; work on data management - build, test, and maintain data pipelines; foster a culture of continuous learning and a mindset of finding ways to improve, automate and reduce time to market. Your team You'll be part of the WMA STAAT Analytics team in Pune. We are developing cutting-edge digital solutions to help internal and external clients make better decisions with data. You will be part of a growing team that is focused on blending solutions covering digital marketing, technology, optimization and analytics to drive business results and digital transformation. You'll be working in a global team, with colleagues in the United Kingdom, the United States and India. Your expertise: understanding of data modelling, data warehousing principles and lakehouse architecture; excellent hands-on/programming skills in big data analytics on Azure cloud and on-prem technologies - Spark, Python, Azure Synapse, Data Factory, Azure Databricks; advanced working knowledge of SQL with experience in DWH/ETL implementation and understanding of ETL/ELT design patterns; good understanding of columnar and/or time-series data design patterns and performance tuning techniques; understanding of the impact on the organization of emerging trends in data tools, analysis techniques and data usage; understanding of data/information security principles to ensure compliant handling and management of data; ability to communicate effectively with technical and non-technical stakeholders; experience in application development, including analyzing stories, writing code, implementing automated tests, contributing to release and iteration planning and developing the working practices of the team. About Us UBS is the world's largest and the only truly global wealth manager. We operate through four business divisions: Global Wealth Management, Personal & Corporate Banking, Asset Management and the Investment Bank. Our global reach and the breadth of our expertise set us apart from our competitors. We have a presence in all major financial centers in more than 50 countries. How We Hire We may request you to complete one or more assessments during the application process. Learn more Disclaimer / Policy Statements UBS is an Equal Opportunity Employer. We respect and seek to empower each individual and support the diverse cultures, perspectives, skills and experiences within our workforce.
Posted 3 days ago
10.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Title: Data Scientist Location: [Insert Location] Experience: 5–10 years (flexible based on expertise) Employment Type: Full-Time Compensation: [Insert Budget / Competitive as per industry standards] About the Role: We are looking for a highly skilled and innovative Data Scientist with deep expertise in Machine Learning, AI, and Cloud Technologies to join our dynamic analytics team. The ideal candidate will have hands-on experience in NLP, LLMs, Computer Vision, and advanced statistical techniques, along with the ability to lead cross-functional teams and drive data-driven strategies in a fast-paced environment. Key Responsibilities: Develop and deploy end-to-end machine learning pipelines including data preprocessing, modeling, evaluation, and production deployment. Work on cutting-edge AI/ML applications such as LLM fine-tuning, NLP, Computer Vision, Hybrid Recommendation Systems, and RAG/CAG techniques. Leverage platforms like AWS (SageMaker, EC2) and Databricks for scalable model development and deployment. Handle data at scale using Spark, Python, SQL, and integrate with NoSQL and Vector Databases (Neo4j, Cassandra). Design interactive dashboards and visualizations using Tableau for actionable insights. Collaborate with cross-functional stakeholders to translate business problems into analytical solutions. Guide data curation efforts and ensure high-quality training datasets for supervised and unsupervised learning. Lead initiatives around AutoML, XGBoost, Topic Modeling (LDA/LSA), Doc2Vec, and Object Detection & Tracking. Drive agile practices including Sprint Planning, Resource Allocation, and Change Management. Communicate results and recommendations effectively to executive leadership and business teams. Mentor junior team members and foster a culture of continuous learning and innovation. Technical Skills Required: Programming: Python, SQL, Spark. Machine Learning & AI: NLP, LLMs, Deep Learning, Computer Vision, Hybrid Recommenders. Techniques: RAG, CAG, LLM fine-tuning, Statistical Modeling, AutoML, Doc2Vec. Data Platforms: AWS (SageMaker, EC2), Databricks. Databases: SQL, NoSQL, Neo4j, Cassandra, Vector DBs. Visualization Tools: Tableau. Certifications (Preferred): IBM Data Science Specialization; Deep Learning Nanodegree (Udacity); SAFe® DevOps Practitioner; Certified Agile Scrum Master. Professional Competencies: Proven experience in team leadership, stakeholder management, and strategic planning. Strong cross-functional collaboration and ability to drive alignment across product, engineering, and analytics teams. Excellent problem-solving, communication, and decision-making skills. Ability to manage conflict resolution, negotiation, and performance optimization within teams.
Posted 3 days ago
50.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Your Team Responsibilities The Data Technology team at MSCI is responsible for meeting the data requirements across various business areas, including Index, Analytics, and Sustainability. Our team collates data from multiple sources such as vendors (e.g., Bloomberg, Reuters), website acquisitions, and web scraping (e.g., financial news sites, company websites, exchange websites, filings). This data can be in structured or semi-structured formats. We normalize the data, perform quality checks, assign internal identifiers, and release it to downstream applications. Your Key Responsibilities As data engineers, we build scalable systems to process data in various formats and volumes, ranging from megabytes to terabytes. Our systems perform quality checks, match data across various sources, and release it in multiple formats. We leverage the latest technologies, sources, and tools to process the data. Some of the exciting technologies we work with include Snowflake, Databricks, and Apache Spark. Your Skills And Experience That Will Help You Excel Core Java, Spring Boot, Apache Spark, Spring Batch, Python. Exposure to SQL databases like Oracle, MySQL, and Microsoft SQL Server is a must. Any experience/knowledge/certification in cloud technology, preferably Microsoft Azure or Google Cloud Platform, is good to have. Exposure to non-SQL databases like Neo4j or document databases is also good to have. About MSCI What we offer you Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing. Flexible working arrangements, advanced technology, and collaborative workspaces. A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results. A global network of talented colleagues, who inspire, support, and share their expertise to innovate and deliver for our clients. Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro and tailored learning opportunities for ongoing skills development. Multi-directional career paths that offer professional growth and development through new challenges, internal mobility and expanded roles. We actively nurture an environment that builds a sense of inclusion, belonging and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum. At MSCI we are passionate about what we do, and we are inspired by our purpose – to power better investment decisions. You'll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry. MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process. MSCI Inc. is an equal opportunity employer.
It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries. To all recruitment agencies MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes. Note on recruitment scams We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try and elicit personal information from job seekers. Read our full note on careers.msci.com
Posted 3 days ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About Us As a Fortune 50 company with more than 400,000 team members worldwide, Target is an iconic brand and one of America's leading retailers. Working at Target means the opportunity to help all families discover the joy of everyday life. Caring for our communities is woven into who we are, and we invest in the places we collectively live, work and play. We prioritize relationships, fuel and develop talent by creating growth opportunities, and succeed as one Target team. At our core, our purpose is ingrained in who we are, what we value, and how we work. It's how we care, grow, and win together. Position Overview We are seeking a highly skilled Senior Backend Engineer with expertise in distributed systems and extensive knowledge of Spring Boot to join our dynamic team. The ideal candidate will be responsible for designing, developing, and maintaining high-performance, scalable distributed systems using Java and related technologies. As a Senior Backend Engineer, you will play a crucial role in architecting and implementing solutions that meet the demands of our rapidly growing platform. Core responsibilities of this job are described within this job description. Job duties may change at any time due to business needs. Responsibilities Design & build high-quality, scalable, and resilient distributed systems with Java and Spring Boot. Collaborate with product, architecture, and UX partners to gather requirements, define architecture, and translate them into working software. Apply best practices for distributed systems—fault tolerance, horizontal scalability, performance tuning, and observability. Write clean, maintainable code and comprehensive tests that follow established standards. Review code & mentor junior engineers, offering constructive feedback and coaching. Troubleshoot production issues, isolate root causes, and deliver timely fixes with minimal user impact. Stay current on emerging tech, industry trends, and Java/Spring ecosystem updates, sharing insights with the team. Participate fully in Agile rituals (sprint planning, daily stand-ups, retros) and deliver work within committed time frames. Partner with DevOps to automate CI/CD pipelines and optimise runtime performance in Kubernetes. Contribute to architecture reviews, documenting designs, decisions, and runbooks. About You Bachelor's degree in Computer Science (or equivalent practical experience). 3+ years of professional software development. Deep proficiency with Spring Boot (Core, MVC/WebFlux, Data/JPA, Security). Hands-on experience with microservices, REST APIs, and messaging (Kafka or RabbitMQ). Solid grasp of distributed-systems principles (CAP theorem, consensus, data consistency). Familiar with cloud-native deployment on Kubernetes/Docker. Skilled in both relational (MySQL/Postgres) and NoSQL (MongoDB/Redis) data stores, including schema design and tuning. Working knowledge of modern engineering workflows: Git, CI/CD, automated testing, and observability. Strong problem-solving, communication, and cross-functional collaboration skills. Proven track record of shipping high-quality software in fast-moving environments. Nice to have: Apache Spark, Hadoop, or other big-data platforms.
Posted 3 days ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About Us As a Fortune 50 company with more than 400,000 team members worldwide, Target is an iconic brand and one of America's leading retailers. Joining Target means promoting a culture of mutual care and respect and striving to make the most meaningful and positive impact. Becoming a Target team member means joining a community that values different voices and lifts each other up. Here, we believe your unique perspective is important, and you'll build relationships by being authentic and respectful. Overview About TII At Target, we have a timeless purpose and a proven strategy. And that hasn't happened by accident. Some of the best minds from different backgrounds come together at Target to redefine retail in an inclusive learning environment that values people and delivers world-class outcomes. That winning formula is especially apparent in Bengaluru, where Target in India operates as a fully integrated part of Target's global team and has more than 4,000 team members supporting the company's global strategy. Data & Analytics Behind one of the world's best-loved brands is a uniquely capable and brilliant team of data scientists, engineers and analysts. The Target Data & Analytics team creates the tools and data products to sustainably educate and enable our business partners to make great data-based decisions at Target. We help develop the technology that personalizes the guest experience, from product recommendations to relevant ad content. We're also the source of the data and analytics behind Target's Internet of Things (IoT) applications, fraud detection, supply chain optimization and demand forecasting. We play a key role in identifying the test-and-measure or A/B test opportunities that continuously help Target improve the guest experience, whether they love to shop in stores or at Target.com. About You Four-year degree or equivalent experience. 3+ years as a Data Analyst with strong academic performance in a quantitative field, or strong equivalent experience [add any specific analyst experience needed here]. Advanced SQL experience writing complex queries. Intermediate proficiency with Python or R. Solid problem-solving and analytical skills, data curiosity, data mining, and data creation and consolidation. Support conclusions with a clear, understandable story that leverages descriptive statistics, basic inferential statistics, and data visualizations. Willingness to ask questions about business objectives and the measurement needs for a project workstream, and ability to measure objectives & key results. Excellent communication skills with the ability to speak to both business and technical teams, and translate ideas between them. Knowledge of A/B testing methods, time series, S&OP planning, and forecasting models, including statistical analysis. Experience in analytics tools such as: SQL, Excel, Hadoop, Hive, Spark, Python, R, Domo, Adobe Analytics (or Google Analytics) and/or equivalent technologies. Useful Links- Life at Target- https://india.target.com/ Benefits- https://india.target.com/life-at-target/workplace/benefits Culture- https://india.target.com/life-at-target/belonging
Posted 3 days ago
3.0 - 5.0 years
1 - 4 Lacs
Pune, Bengaluru
Work from Office
Job Title: Data Analyst Experience: 3-5 Years Location: Pune and Bangalore Job Description: Candidates must be based in Pune or Bangalore; relocation to these cities is not expected for this role. PySpark and Banking domain experience are mandatory for the Data Analyst role, with at least 3 years of experience required. Candidates should be able to join within a maximum of 20 days.
Posted 3 days ago
5.0 years
0 Lacs
India
Remote
Established in 2004, OLIVER is the world's first and only specialist in designing, building, and running bespoke in-house agencies and marketing ecosystems for brands. We partner with over 300 clients in 40+ countries and counting. Our unique model drives creativity and efficiency, allowing us to deliver tailored solutions that resonate deeply with audiences. As a part of The Brandtech Group, we're at the forefront of leveraging cutting-edge AI technology to revolutionise how we create and deliver work. Our AI solutions enhance efficiency, spark creativity, and drive insightful decision-making, empowering our teams to produce innovative and impactful results. Job Title: Producer Role: Freelancer Duration: 3-6 months Work Timings: 9am-6pm SGT Location: India (Remote) About the role: We are seeking a highly skilled and versatile Integrated Producer & Project Manager to join our Delivery Video Team. This hybrid role combines creative leadership with strategic project oversight, perfect for someone who thrives at the intersection of storytelling and operational excellence. In this role, you will oversee the end-to-end delivery of video projects, ensuring they are delivered on time, within budget, and to the highest creative standard. You will act as a key liaison between clients, internal teams, and external partners, championing smooth workflows and inspiring great work. What you will be doing: Oversee the full video production lifecycle, from concept to delivery. Combine creative direction with strong project management to ensure high-quality outcomes. Act as a strategic partner to clients and a motivating leader for internal teams. Own and manage the full lifecycle of video projects, including pre-production, production, and post-production phases. Interrogate client briefs and translate them into actionable project plans that align with creative and business objectives. Develop and maintain detailed project timelines, budgets, and resource allocations, ensuring alignment across all stakeholders. Oversee production activities, including shoot planning, on-set supervision (where required), and post-production workflows. Ensure creative excellence and consistency, conducting quality checks at every stage. Act as the primary point of contact for clients, managing expectations, leading feedback rounds, and providing clear project updates. Identify risks early and proactively implement mitigation plans to keep projects on track. Support and mentor team members, fostering a collaborative, solutions-focused environment. Manage vendor relationships, negotiate contracts, and oversee external partner contributions when needed. Track and report on project performance, continuously optimising processes for greater efficiency and impact. What you need to be great in this role: Proactive and solutions-oriented mindset with a calm, adaptable approach under pressure. Strong leadership and motivational skills with a collaborative spirit. Excellent interpersonal and communication skills: clear, confident, and diplomatic. Passionate about creativity and storytelling, with a commitment to high-quality work. 3–5 years of proven experience in video production and project management, ideally within a creative agency, production company, or in-house content team. Strong understanding of video production workflows, including scripting, shooting, editing, and post-production. Demonstrated ability to manage multiple projects simultaneously in a fast-paced environment.
Experience working with regional or global brands is a plus. Proficiency in budgeting, scheduling, and resource management tools. Familiarity with Adobe Creative Suite or similar creative software is advantageous. Experience working with cross-functional and remote teams. Passion for and curiosity about AI and new technologies. An understanding and knowledge of AI tools is beneficial, but the ability to learn and digest the benefits and features of AI tools is critical. Req ID: 14011 Our values shape everything we do: Be Ambitious to succeed Be Imaginative to push the boundaries of what's possible Be Inspirational to do groundbreaking work Be always learning and listening to understand Be Results-focused to exceed expectations Be actively pro-inclusive and anti-racist across our community, clients and creations OLIVER, a part of the Brandtech Group, is an equal opportunity employer committed to creating an inclusive working environment where all employees are encouraged to reach their full potential, and individual differences are valued and respected. All applicants shall be considered for employment without regard to race, ethnicity, religion, gender, sexual orientation, gender identity, age, neurodivergence, disability status, or any other characteristic protected by local laws. OLIVER has set ambitious environmental goals around sustainability, with science-based emissions reduction targets. Collectively, we work towards our mission, embedding sustainability into every department and through every stage of the project lifecycle.
Posted 3 days ago
2.0 - 5.0 years
4 - 7 Lacs
Bengaluru
Work from Office
ECMS Req # 533704. Relevant and total years of experience: 4-5 years relevant, 6+ years total. Detailed job description - skill set: Strong Data Engineer with 5+ years of experience in Databricks, PySpark and SQL. Design and develop data processing pipelines and analytics solutions using Databricks (a brief illustrative sketch follows below). Architect scalable and efficient data models and storage solutions on the Databricks platform. Collaborate with architects and other teams to migrate the current solution to Databricks. Optimize the performance and reliability of Databricks clusters and jobs to meet SLAs and business requirements. Apply best practices for data governance, security, and compliance on the Databricks platform. Mentor junior engineers and provide technical guidance. Stay current with emerging technologies and trends in data engineering and analytics to drive continuous improvement. Strong understanding of distributed computing principles and experience with big data technologies such as Apache Spark. Experience with cloud platforms such as AWS, Azure, or GCP, and their associated data services. Proven track record of delivering scalable and reliable data solutions in a fast-paced environment. Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills with the ability to work effectively in cross-functional teams. Good to have: experience with containerization technologies such as Docker and Kubernetes; knowledge of DevOps practices for automated deployment and monitoring of data pipelines. Mandatory Skills: As in the detailed description above - strong Data Engineer with 5+ years of experience in Databricks, PySpark and SQL; Databricks pipeline design and development; scalable data models and storage solutions; migration to Databricks; performance optimization; data governance, security and compliance; mentoring; distributed computing with Apache Spark; and cloud platforms (AWS, Azure, or GCP) with their associated data services. Vendor billing range in local currency (per day): INR 9000/day. Work location: Bangalore, Hyderabad (preferred). Notice period: 15 days. WFO/WFH/Hybrid: WFO / Hybrid. BG check before or after onboarding: Post onboarding. Skill Matrix: Databricks - 4-5 yrs; PySpark - 4-5 yrs; Spark - 4-5 yrs; SQL - 4-5 yrs; AWS - 2+ yrs.
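To make the Databricks/PySpark pipeline work concrete, here is a minimal hedged sketch of an incremental load that upserts cleaned records into a Delta table. Table and column names are hypothetical and not taken from the requisition; a real job would add schema enforcement, data-quality checks and monitoring.

```python
# Hypothetical Databricks job cell: deduplicate a staging table and merge it
# into a Delta target. All names are placeholders for illustration only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()   # provided automatically on Databricks

# Read the latest staging data and normalise it.
updates = (spark.read.table("staging.orders_raw")
           .withColumn("order_ts", F.to_timestamp("order_ts"))
           .dropDuplicates(["order_id"]))

# Upsert into the analytics Delta table keyed on order_id.
target = DeltaTable.forName(spark, "analytics.orders")

(target.alias("t")
 .merge(updates.alias("s"), "t.order_id = s.order_id")
 .whenMatchedUpdateAll()
 .whenNotMatchedInsertAll()
 .execute())
```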
Posted 3 days ago
4.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Job Description We have an exciting opportunity for you to leverage data science methods to deliver actionable insights across the Card business. As a Data Science Associate within the Card Data and Analytics team, you will leverage skills in building insights from advanced analytics, Gen AI / LLM tools, analysis, data querying, and extracting insights from big data to support our Credit Card business. This role is a hands-on mix of consulting know-how, analytical proficiency in statistics, data science, and machine learning/AI, proficiency in SQL/Python programming, and visualization methods and technologies. Job Responsibilities Leverage your knowledge and analytical skills to uncover novel use cases of Big Data analytics for the Credit Card business. Support development of data science / AIML use cases for the Card business. Help partners in the Card business define their business problems and scope analytical solutions. Build an in-depth understanding of the Card domain and available data assets. Research, design, implement, and evaluate analytical approaches and models. Perform ad-hoc exploratory analyses and data mining tasks on diverse datasets. Communicate findings and obstacles to stakeholders to drive delivery to market. Required Qualifications, Capabilities, And Skills Bachelor's degree in a relevant quantitative/analytical field (e.g., Statistics, Economics, Applied Math, Operations Research, or another data science field) required. 4+ years of hands-on experience with data analytics; experience evaluating complex business problems and devising recommendations. Exceptional analytical, quantitative, problem-solving, and communication skills. Excellent leadership, consultative partnering, and collaboration across teams. Knowledge of statistical software packages (e.g., Python) and data querying languages (e.g., SQL). Experience across a broad range of modern analytics tools (e.g., Snowflake, Databricks, SQL, Spark, Python). Preferred Qualifications, Capabilities, And Skills Understanding of the key drivers within the credit card P&L is preferred. Financial services background preferred, but not required. Master's degree or equivalent in an analytical field. ABOUT US JPMorganChase, one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses and many of the world's most prominent corporate, institutional and government clients under the J.P. Morgan and Chase brands. Our history spans over 200 years and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing and asset management. We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants' and employees' religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation.
About The Team Our Consumer & Community Banking division serves our Chase customers through a range of financial services, including personal banking, credit cards, mortgages, auto financing, investment advice, small business loans and payment processing. We’re proud to lead the U.S. in credit card sales and deposit growth and have the most-used digital solutions – all while ranking first in customer satisfaction. The CCB Data & Analytics team responsibly leverages data across Chase to build competitive advantages for the businesses while providing value and protection for customers. The team encompasses a variety of disciplines from data governance and strategy to reporting, data science and machine learning. We have a strong partnership with Technology, which provides cutting edge data and analytics infrastructure. The team powers Chase with insights to create the best customer and business outcomes.
Posted 3 days ago
6.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Position Overview
Job Title: Techno Functional Engineer, AVP
Location: Pune, India
Role Description
A Techno Functional Engineer is responsible for designing, developing and delivering an end-to-end working product for business users, based on a given broader business requirement, by applying techno-functional expertise drawn from both technical and functional experience/knowledge, so as to accomplish business goals efficiently and reliably in the areas of data sourcing, data collation, data transformation, data modelling, aggregation/calculation, and ultimately delivering an enterprise-grade, scalable reporting solution.
Key Responsibilities Of This Role Include
Leading the activities that translate data-sourcing-to-reporting requirements from business users into system-specific functional specifications.
Active participation and/or contribution in the design of solution components with the business, with an innovative solution-generation mindset.
Investigating re-use, ensuring that solutions are fit for purpose, reliable, maintainable, and can be integrated successfully into the overall functional solution and environment, with clear, robust and well-tested deployments.
Adopting industry IT best practices and the best tools for data consolidation, data modelling, data transformation, metadata management (rules) and reporting (such as Python, NumPy, Pandas, Tableau, BusinessObjects, SQL/PLSQL and other data management/reporting tools), leading to scalable, traceable, high-quality and timely delivery of an effective reporting solution.
Adhering to the IT roadmap and plans, and implementing new technologies/solutions in alignment with the bank's architecture blueprint, including devising plans to transition from legacy to target state.
Managing end-to-end delivery towards realizing the expected business objectives and outcomes (from a reporting delivery perspective).
Actively looking for opportunities to improve the design at the outset and monitoring the performance of components, by applying sound design and the learning gathered from feedback and observation.
Serving as a functional specialist and as the day-to-day leader of work efforts in this area within a data and reporting discipline.
What We'll Offer You
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
Best in class leave policy
Gender neutral parental leaves
100% reimbursement under childcare assistance benefit (gender neutral)
Sponsorship for industry-relevant certifications and education
Employee Assistance Program for you and your family members
Comprehensive hospitalization insurance for you and your dependents
Accident and term life insurance
Complimentary health screening for 35 yrs. and above
Your Key Responsibilities
Uses awareness of banking/financial industry initiatives to work with stakeholders to align with the strategy and roadmap, while supporting the development of market-driven, sustainable business, process and data/reporting architectures and/or solutions.
Supports the implementation of common architecture and strategies and applies bank-wide standards and best practices.
Supports the implementation of optimum architectures of technology solutions and drives analysts, designers, and engineers in technology teams to design, build, test and deliver high-quality software solutions to meet business needs.
Acts as functional lead/expert for a domain, its applications and technology, and completes highly complex functional/non-functional specification documentation and designs.
Completes and elaborates complex functional designs in accordance with the defined principles, blueprints, standards, patterns, etc.
Designs and conceptualizes new, complex business solution options with proof-of-concepts and articulates identified impacts and risks.
Reviews testing requirements including test plans, test cases, test data, and interface testing between different applications.
Works with engineers to resolve complex functional issues arising from integration / user acceptance testing.
Provides thought leadership in the development of new models/techniques for delivering step change in business processes.
Defines guiding principles for industrialised, high-performance designs for large-volume data sets and business process solutions, aligned to the SLAs expected by our users.
Acts as virtual supervisor for analysis and design work within a project/programme regionally.
Reviews and provides feedback on functional solutions and performs quality assurance of project deliverables.
Drives design challenges, implements key designs and design building blocks, leverages design practices and insists on design patterns being used by engineers.
Translates and reviews logical data design at various stages of the data journey.
Works with engineers to prioritise, troubleshoot and resolve reported bugs / issues / CRs (change requests) on applications.
Drives data discovery, sourcing, modelling, and analytics to support the creation of data flows and models. This includes researching and profiling data sources in data categories of expertise.
Plays an active and leading role in relevant Communities of Practice, such as the Business Functional Analysis Community of Practice and other Design/Architecture-related Communities of Practice.
Undertakes peer reviews and reviews solution designs and architectures, taking into consideration specific business, usability and functional constraints, requirements and dependencies.
Your Skills And Experience
Fluent in English (written/verbal).
12+ years of experience managing teams on complex and sizeable global IT change projects under varying project/programme environments (waterfall, scrum/agile) and tight timelines.
Strong working experience (6+ years) interpreting multiple end-to-end local and group regulatory requirements (such as Basel, statistical regulatory reporting and MI reporting) across financial, risk and transactional reporting.
Experience with best-practice methods in data analysis, functional analysis, data modelling and data principles in a banking or financial reporting subject area.
Experience working on local regulatory reporting requirements for regions (around MAS, APRA, RBI, Basel/ECB) and/or financial reporting visualization.
Experience in data/reporting tools such as Tableau, SAP BusinessObjects, and other BI tools.
Experience in data management tools such as Python (Pandas, NumPy), Informatica/Spark (optional), SQL, PL/SQL, Oracle/Hive, and other databases.
Experience working with business requirements through data transformation and reporting application design.
Experience in a financial domain (capital markets, transaction banking, and wealth management) and/or related support functions.
Experience owning programme backlogs, driving programme increments and release content via prioritised features, and establishing feature acceptance criteria.
Advanced analytical and problem-solving experience and the ability to independently identify issues, introduce new concepts, provide innovative insights/solutions and oversee their delivery.
High degree of accuracy and attention to detail.
Strong planning skills; highly organised, with the ability to prioritise key deliverables across several work streams.
Excellent communication and documentation/presentation skills.
Self-motivated, with the flexibility to work autonomously, coupled with the ability to work in virtual teams and matrix/global organisations, including appreciation of different cultures while collaborating and sharing.
Strong ability to influence and motivate other team members and stakeholders through strong dialogue, facilitation and persuasiveness.
Strong leadership and ability to advance decisions and escalations.
Profound knowledge of methods and tooling for the Business Functional Analysis profession and/or the techno-functional discipline (software development life cycle).
Advanced knowledge of MS Office products.
Experience handling banking products' and/or financials data in a regulatory or financial reporting setting or industry, including stakeholder needs, competitor and solution awareness within your own area of expertise.
Preferable if you have experience in some of the below as well:
Experience working on regulatory reporting vendor packages such as Axiom, and recent data/reporting technologies such as Looker, Spark/Spark SQL, Trino, etc.
Experience in methods/practices of UX (user experience) for UI development
Agile methodology delivery experience
Education/Certification
Degree from an accredited college or university, with a preference for Computer Science (or IT-related), Business Analysis or Data certification (and/or relevant work experience).
Key Business Competencies (Proficiency Level 1 to 5)
Business Strategy: P4 - Advanced
Change Leadership: P4 - Advanced
Communication: P3 - Experienced
Industry Knowledge: P4 - Advanced
Innovation: P3 - Experienced
Managing Complexity: P4 - Advanced
Key Technical Competencies (Proficiency Level 1 to 5)
Business Analysis: P5 - Expert
Process Development and Management: P4 - Advanced
Quality Management: P4 - Advanced
How We'll Support You
Training and development to help you excel in your career
Coaching and support from experts in your team
A culture of continuous learning to aid progression
A range of flexible benefits that you can tailor to suit your needs
About Us And Our Teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
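Since the role calls out Python (Pandas, NumPy) and SQL for data transformation and reporting, below is a minimal, hedged Pandas sketch of the kind of sourcing-to-aggregation step such a reporting pipeline might contain. The file name, column names and grouping keys are illustrative assumptions, not Deutsche Bank specifics.

```python
# Minimal sketch of a data consolidation / aggregation step for a
# regulatory-style report using pandas. All names below (trades.csv,
# desk, notional, risk_weight) are hypothetical placeholders.
import pandas as pd

# Sourcing: assume a flat extract has already been produced upstream.
trades = pd.read_csv("trades.csv", parse_dates=["trade_date"])

# Transformation: derive a simple risk-weighted exposure per row.
trades["rwa"] = trades["notional"] * trades["risk_weight"]

# Aggregation: roll up to the reporting grain (desk x month).
report = (
    trades
    .assign(report_month=trades["trade_date"].dt.to_period("M"))
    .groupby(["desk", "report_month"], as_index=False)
    .agg(total_notional=("notional", "sum"),
         total_rwa=("rwa", "sum"),
         trade_count=("trade_date", "count"))
)

# Delivery: write the aggregated view for a downstream reporting layer.
report.to_csv("desk_monthly_report.csv", index=False)
```

In practice the same shape of logic would sit behind a traceable, metadata-driven pipeline feeding Tableau or BusinessObjects rather than a one-off CSV.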
Posted 3 days ago
5.0 - 8.0 years
7 - 10 Lacs
Pune
Work from Office
Role Purpose
The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations and developing technical capability within the Production Specialists.
Do
Oversee and support the process by reviewing daily transactions against performance parameters
Review the performance dashboard and the scores for the team
Support the team in improving performance parameters by providing technical support and process guidance
Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions
Ensure standard processes and procedures are followed to resolve all client queries
Resolve client queries as per the SLAs defined in the contract
Develop understanding of the process/product for the team members to facilitate better client interaction and troubleshooting
Document and analyze call logs to spot the most frequently occurring trends and prevent future problems
Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution
Ensure all product information and disclosures are given to clients before and after the call/email requests
Avoid legal challenges by monitoring compliance with service agreements
Handle technical escalations through effective diagnosis and troubleshooting of client queries
Manage and resolve technical roadblocks/escalations as per SLA and quality requirements
If unable to resolve an issue, escalate it to TA & SES in a timely manner
Provide product support and resolution to clients by performing question diagnosis while guiding users through step-by-step solutions
Troubleshoot all client queries in a user-friendly, courteous and professional manner
Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business
Organize ideas and effectively communicate oral messages appropriate to listeners and situations
Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs
Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client
Mentor and guide Production Specialists on improving technical knowledge
Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
Develop and conduct trainings (triages) within products for Production Specialists as per target
Inform the client about the triages being conducted
Undertake product trainings to stay current with product features, changes and updates
Enroll in product-specific and any other trainings per client requirements/recommendations
Identify and document the most common problems and recommend appropriate resolutions to the team
Update job knowledge by participating in self-learning opportunities and maintaining personal networks
Mandatory Skills: Apache Spark
Experience: 5-8 Years.
Posted 3 days ago
5.0 - 8.0 years
7 - 10 Lacs
Bengaluru
Work from Office
Role Purpose
The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations and developing technical capability within the Production Specialists.
Do
Oversee and support the process by reviewing daily transactions against performance parameters
Review the performance dashboard and the scores for the team
Support the team in improving performance parameters by providing technical support and process guidance
Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions
Ensure standard processes and procedures are followed to resolve all client queries
Resolve client queries as per the SLAs defined in the contract
Develop understanding of the process/product for the team members to facilitate better client interaction and troubleshooting
Document and analyze call logs to spot the most frequently occurring trends and prevent future problems
Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution
Ensure all product information and disclosures are given to clients before and after the call/email requests
Avoid legal challenges by monitoring compliance with service agreements
Handle technical escalations through effective diagnosis and troubleshooting of client queries
Manage and resolve technical roadblocks/escalations as per SLA and quality requirements
If unable to resolve an issue, escalate it to TA & SES in a timely manner
Provide product support and resolution to clients by performing question diagnosis while guiding users through step-by-step solutions
Troubleshoot all client queries in a user-friendly, courteous and professional manner
Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business
Organize ideas and effectively communicate oral messages appropriate to listeners and situations
Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs
Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client
Mentor and guide Production Specialists on improving technical knowledge
Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
Develop and conduct trainings (triages) within products for Production Specialists as per target
Inform the client about the triages being conducted
Undertake product trainings to stay current with product features, changes and updates
Enroll in product-specific and any other trainings per client requirements/recommendations
Identify and document the most common problems and recommend appropriate resolutions to the team
Update job knowledge by participating in self-learning opportunities and maintaining personal networks
Mandatory Skills: Snowflake
Experience: 5-8 Years.
Posted 3 days ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
We Are:
Drive technology innovations that shape the way we live and connect. Our technology drives the Era of Pervasive Intelligence, where smart tech and AI are seamlessly woven into daily life. From self-driving cars and health-monitoring smartwatches to renewable energy systems that efficiently distribute clean power, Synopsys creates high-performance silicon chips that help build a healthier, safer, and more sustainable world.
Apprenticeship Experience:
At Synopsys, apprentices dive into real-world projects, gaining hands-on experience while collaborating with our passionate teams worldwide—and having fun in the process! You'll have the freedom to share your ideas, unleash your creativity, and explore your interests. This is your opportunity to bring your solutions to life and work with cutting-edge technology that shapes not only the future of innovation but also your own career path. Join us and start shaping your future today!
Mission Statement:
Our mission is to fuel today's innovations and spark tomorrow's creativity. Together, we embrace a growth mindset, empower one another, and collaborate to achieve our shared goals. Every day, we live by our values of Integrity, Excellence, Leadership, and Passion, fostering an inclusive culture where everyone can thrive—both at work and beyond.
What You'll Be Doing:
Troubleshooting software programs in Emulation.
Managing R&D SW regressions.
Creating validation suites for feature enhancements.
Learning and exploring new technologies.
Networking with internal and external personnel on assigned tasks.
What You'll Need:
Should be a fresh graduate engineer in Computer Science or Electronics (2025/2024).
Knowledge of coding (C/C++) and scripting (Perl, Python).
Understanding of Data Structures and Basic Operating Systems Concepts.
Knowledge of Verilog/VHDL and EDA tools is a plus.
Key Program Facts:
Program Length: 12 months
Location: Noida, India
Working Model: In-office
Full-Time/Part-Time: Full-time
Start Date: August/September 2025
Equal Opportunity Statement:
Synopsys is committed to creating an inclusive workplace and is an equal opportunity employer. We welcome all qualified applicants to apply, regardless of age, color, family or medical leave, gender identity or expression, marital status, disability, race and ethnicity, religion, sexual orientation, or any other characteristic protected by local laws. If you need assistance or a reasonable accommodation during the application process, please reach out to us.
Posted 3 days ago
5.0 - 10.0 years
6 - 10 Lacs
Bengaluru
Work from Office
The Zeta Marketing Platform (ZMP) is a machine learning/AI-powered, multi-tenant customer acquisition and CRM platform. The backend developer will work on the core machine learning recommendation engine as well as a highly distributed event pipeline and stack that ingests many thousands of events per second. Please find the job description below.
As a senior member of the Software Engineering team, you will join the group responsible for designing, developing, and owning the distributed systems CRM platform for Zeta. You will collaborate with your fellow Engineers and Product Managers to develop a roadmap and subsequent projects to build the next-generation comprehensive, multichannel marketing solution that unifies and unlocks data across digital touch points, driving return on marketing investment. This position will be responsible for working on the design and development of the next generation of our big data Hub platform, working on technology that enables access to the core of our CRM efforts. You should have deep knowledge of distributed systems and cloud architecture. You will need extensive design and development experience and a passion for working with large amounts of data.
Job Description:
Your Impact:
Implement the next version of our big data CRM platform
Dexterously own balancing of production feature support, feature delivery, and retirement of technical debt
Be excited by building reliable, self-healing services with robust error handling
Experience designing, developing, debugging, and operating resilient distributed systems that run across hundreds of compute nodes in multiple data centers
Capable of driving and delivering thin slices of end-to-end functionality on a regular cadence with data-driven feedback loops
Required Skills / Experience:
5+ years of software design and development experience
B.S. or M.S. in Computer Science or related field
Strong programming skills (preferably in Python) with specific focus on parallel and multithreaded programming
Experience with multi-tenant architectures (IaaS, PaaS, SaaS)
Experience with building distributed systems and highly scalable web applications
Experience building RESTful web services and microservices
Experience with Agile development methodology and Test-Driven Development
Experience with a variety of datastores (Kafka, PostgreSQL, Redis, Memcached, CouchDB)
Desired Skills / Experience:
Experience with AWS, OpenStack or Azure for scaling web and mobile application backend infrastructure
Knowledge of open-source distributed automation frameworks (e.g., Docker, Kubernetes, Rundeck, STAF/STAX, Chef, Puppet)
Knowledge of Big Data technologies (e.g., HBase, Spark, Aerospike, Elasticsearch)
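To illustrate the event-pipeline aspect of this role, here is a minimal, hedged Python sketch of a Kafka consumer that processes a stream of marketing events. It uses the open-source kafka-python client, and the topic name, broker address and event fields are illustrative assumptions rather than Zeta's actual stack.

```python
# Minimal sketch of a Kafka event consumer in Python (kafka-python client).
# Topic name, broker address and event schema are hypothetical placeholders.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "crm-events",                              # assumed topic name
    bootstrap_servers=["localhost:9092"],      # assumed broker address
    group_id="crm-event-processors",
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # Route by an assumed "type" field; a real pipeline would fan out
    # to downstream handlers, datastores, or further topics.
    if event.get("type") == "email_open":
        print(f"user {event.get('user_id')} opened campaign {event.get('campaign_id')}")
    else:
        print(f"unhandled event type: {event.get('type')}")
```

At the event rates the posting mentions, a production consumer would typically run as a scaled-out consumer group with batching and error handling rather than a single blocking loop like this sketch.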
Posted 3 days ago
2.0 - 4.0 years
6 - 11 Lacs
Bengaluru
Work from Office
Zeta Global is looking for an experienced Machine Learning Engineer with industry-proven, hands-on experience delivering machine learning models to production to solve business problems.
To be a good fit for our AI/ML team, you should ideally:
Be a thought leader who can work with cross-functional partners to foster a data-driven organisation.
Be a strong team player, with experience contributing to a large project as part of a collaborative team effort.
Have extensive knowledge of and expertise with machine learning engineering best practices and industry standards.
Empower the product and engineering teams to make data-driven decisions.
What you need to succeed:
2 to 4 years of proven experience as a Machine Learning Engineer in a professional setting.
Proficiency in any programming language (Python preferable).
Prior experience in building and deploying machine learning systems.
Experience with containerization: Docker & Kubernetes.
Experience with AWS cloud services like EKS, ECS, EMR, Lambda, and others.
Fluency with workflow management tools like Airflow or dbt.
Familiarity with distributed batch compute technologies such as Spark.
Experience with modern data warehouses like Snowflake or BigQuery.
Knowledge of MLflow, Feast, and Terraform is a plus.
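Because the posting emphasizes delivering models to production with tools such as MLflow, here is a minimal, hedged Python sketch of logging a trained model and its metrics with MLflow. The experiment name, dataset and model choice are illustrative assumptions rather than Zeta's actual workflow.

```python
# Minimal sketch of tracking a model with MLflow. The experiment name,
# data and model are hypothetical placeholders for illustration only.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a real training dataset.
X, y = make_classification(n_samples=5_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

mlflow.set_experiment("demo-propensity-model")  # assumed experiment name

with mlflow.start_run():
    model = RandomForestClassifier(n_estimators=200, random_state=42)
    model.fit(X_train, y_train)

    auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])

    mlflow.log_param("n_estimators", 200)
    mlflow.log_metric("test_auc", auc)
    mlflow.sklearn.log_model(model, artifact_path="model")
```

In a production setting, a run like this would typically be triggered from an Airflow task and the logged artifact promoted through a model registry before deployment.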
Posted 3 days ago
10.0 - 15.0 years
35 - 40 Lacs
Bengaluru
Work from Office
Position Summary... What you'll do...
About Team: This is the team which builds reusable technologies that aid in acquiring customers, and onboarding and empowering merchants, besides ensuring a seamless experience for both of these stakeholders. We also optimize tariffs and assortment, adhering to the Walmart philosophy - Everyday Low Cost. In addition to ushering in affordability, we also create personalized experiences for customers the omnichannel way, across all channels - in-store, on the mobile app and websites.
Marketplace is the gateway to domestic and international third-party sellers; we enable them to manage their end-to-end onboarding, catalog management, order fulfilment, and return & refund management.
Our team is responsible for the design, development, and operations of large-scale distributed systems by leveraging cutting-edge technologies in web/mobile, cloud, big data & AI/ML. We interact with multiple teams across the company to provide scalable, robust technical solutions.
What you'll do:
We are looking for a Staff Engineer with deep expertise in Apache Spark, Kafka and Java who can lead the architecture, design, and development of scalable products from scratch. You will be a key contributor to our platform, solving data-intensive challenges, influencing architectural decisions, and mentoring other engineers.
Design and develop highly scalable and performant systems using Apache Spark, Kafka, Java, and distributed computing principles.
Lead end-to-end product development from ideation, system design, development, and deployment to monitoring.
Architect solutions that are resilient, fault-tolerant, and can scale horizontally.
Collaborate with product managers, architects, and cross-functional teams to deliver high-quality features.
Evaluate and adopt new technologies and frameworks where appropriate.
Define and enforce engineering best practices, including code reviews, testing, and CI/CD pipelines.
Guide and mentor junior and mid-level engineers within the team.
Own the technical roadmap and contribute to the long-term strategy for the data and platform team.
What you'll bring:
10+ years of software development experience with strong programming skills in Java.
4+ years of hands-on experience working with Apache Spark (core, SQL, streaming).
Proven experience in building and scaling products from scratch.
Solid understanding of distributed systems, data pipelines, and system design principles.
Strong background in algorithms, data structures, and design patterns.
Experience with data storage technologies like HDFS, Hive, Parquet, or similar.
Hands-on experience with containerization and orchestration tools (Docker, Kubernetes) is a plus.
Familiarity with cloud platforms (AWS/GCP/Azure) is a strong advantage.
Demonstrated ability to lead technical initiatives and drive high-impact decisions.
Excellent problem-solving, communication, and mentoring skills.
Preferred Qualifications:
Experience in leading cross-functional teams or technical pods.
Knowledge of data governance, lineage, and observability tools.
Exposure to stream processing frameworks like Flink, etc.
Exposure to Netflix Conductor.
Exposure to Google BigQuery is a plus.
Opportunity to drive technical strategy and innovation.
About Walmart Global Tech
Imagine working in an environment where one line of code can make life easier for hundreds of millions of people. That's what we do at Walmart Global Tech.
We're a team of software engineers, data scientists, cybersecurity experts and service professionals within the world's leading retailer who make an epic impact and are at the forefront of the next retail disruption. People are why we innovate, and people power our innovations. We are people-led and tech-empowered.
Flexible, hybrid work
Benefits
Belonging
At Walmart, our vision is everyone included. By fostering a workplace culture where everyone is, and feels, included, everyone wins. Our associates and customers reflect the makeup of all 19 countries where we operate. By making Walmart a welcoming place where all people feel like they belong, we're able to engage associates, strengthen our business, improve our ability to serve customers, and support the communities where we operate.
Equal Opportunity Employer
Walmart, Inc., is an Equal Opportunity Employer - By Choice. We believe we are best equipped to help our associates, customers and the communities we serve live better when we really know them. That means understanding, respecting and valuing unique styles, experiences, identities, ideas and opinions while being welcoming of all people.
Minimum Qualifications...
Option 1: Bachelor's degree in computer science, computer engineering, computer information systems, software engineering, or related area and 4 years' experience in software engineering or related area.
Option 2: 6 years' experience in software engineering or related area.
Preferred Qualifications...
Master's degree in Computer Science, Computer Engineering, Computer Information Systems, Software Engineering, or related area and 2 years' experience in software engineering or related area.
Primary Location...
BLOCK-1, PRESTIGE TECH PACIFIC PARK, SY NO. 38/1, OUTER RING ROAD KADUBEESANAHALLI, India
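Given the posting's focus on Apache Spark and Kafka for large-scale pipelines, here is a minimal, hedged Spark Structured Streaming sketch that reads events from Kafka and maintains a running count per event type. The broker address, topic and JSON schema are illustrative assumptions, and although the posting leans toward Java, the sketch is written in Python purely as a conceptual illustration.

```python
# Minimal sketch: Spark Structured Streaming reading from Kafka.
# Broker address, topic name and event schema are hypothetical placeholders.
# Requires the spark-sql-kafka connector package on the Spark classpath.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StringType, StructField, StructType

spark = SparkSession.builder.appName("kafka-event-counts").getOrCreate()

event_schema = StructType([
    StructField("event_type", StringType()),
    StructField("seller_id", StringType()),
])

raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")  # assumed broker
    .option("subscribe", "marketplace-events")             # assumed topic
    .load()
)

# Kafka values arrive as bytes; parse them as JSON into typed columns.
events = raw.select(
    F.from_json(F.col("value").cast("string"), event_schema).alias("e")
).select("e.*")

counts = events.groupBy("event_type").count()

query = (
    counts.writeStream
    .outputMode("complete")
    .format("console")
    .start()
)
query.awaitTermination()
```

A production job would checkpoint state and write to a durable sink rather than the console, but the read-parse-aggregate shape is the same.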
Posted 3 days ago