
5704 Databricks Jobs - Page 24

JobPe aggregates listings for easy access, but you apply directly on the employer's job portal.

3.0 - 6.0 years

9 - 10 Lacs

Gurgaon

On-site

Location: Gurgaon - Haryana, India
Additional Locations: Bangalore - Karnataka, India - Manyata Park Outer Ring Road; IN_Chennai_RMZ One Paramount_HII; IN_Coimbatore_Echo Point Plaza_HCS; IN_Hyderabad_AWFIS Space Solutions_HCS; Pune - Maharashtra, India - Rajiv Gandhi Infotech Park
Job Family: Engineering
Worker Type Reference: Regular - Permanent
Pay Rate Type: Salary
Career Level: T3(A)
Job ID: R-46021-2025

Description & Requirements

Introduction: A Career at HARMAN - Harman Tech Solutions (HTS)
We’re a global, multi-disciplinary team that’s putting the innovative power of technology to work and transforming tomorrow. At HARMAN HTS, you solve challenges by creating innovative solutions. Combine the physical and digital, making technology a more dynamic force to solve challenges and serve humanity’s needs. Empower the company to create new digital business models, enter new markets, and improve customer experiences.

What You Will Do
- Develop and execute test scripts to validate data pipelines, transformations, and integrations.
- Formulate and maintain test strategies, including smoke, performance, functional, and regression testing, to ensure data processing and ETL jobs meet requirements.
- Collaborate with development teams to assess changes in data workflows and update test cases to preserve data integrity.
- Design and run tests for data validation, storage, and retrieval using Azure services like Data Lake, Synapse, and Data Factory, adhering to industry standards.
- Continuously enhance automated tests as new features are developed, ensuring timely delivery per defined quality standards.
- Participate in data reconciliation and verify Data Quality frameworks to maintain data accuracy, completeness, and consistency across the platform.
- Share knowledge and best practices by collaborating with business analysts and technology teams to document testing processes and findings.
- Communicate testing progress effectively with stakeholders, highlighting issues or blockers and ensuring alignment with business objectives.
- Maintain a comprehensive understanding of the Azure Data Lake platform's data landscape to ensure thorough testing coverage.

What You Need
- 3-6 years of QA experience with a strong focus on Big Data testing, particularly in Data Lake environments on Azure.
- Proficiency in Azure Data Factory, Azure Synapse Analytics, and Databricks for big data processing and scaled data quality checks.
- Proficiency in SQL, with the ability to write and optimize both simple and complex queries for data validation and testing.
- Proficiency in PySpark, with experience in data manipulation and transformation and a demonstrated ability to write and execute test scripts for data processing and validation.
- Hands-on experience with functional and system integration testing in big data environments, ensuring seamless data flow and accuracy across multiple systems.
- Ability to design and execute test cases in a behaviour-driven development environment.
- Fluency in Agile methodologies, with active participation in Scrum ceremonies and a strong understanding of Agile principles.
- Familiarity with tools like Jira, including experience with X-Ray or Jira Zephyr for defect management and test case management.
- Proven experience working on high-traffic, large-scale software products, ensuring data quality, reliability, and performance under demanding conditions.

What Makes You Eligible
- Willingness to travel up to 25%, domestic and international, if required.
- Successful completion of a background investigation as a condition of employment.

What We Offer
- Access to employee discounts on world-class HARMAN/Samsung products (JBL, Harman Kardon, AKG, etc.).
- Professional development opportunities through HARMAN University’s business and leadership academies.
- Flexible work schedule with a culture encouraging work-life integration and collaboration in a global environment.
- An inclusive and diverse work environment that fosters and encourages professional and personal development.
- Tuition reimbursement.
- “Be Brilliant” employee recognition and rewards program.

You Belong Here
HARMAN is committed to making every employee feel welcomed, valued, and empowered. No matter what role you play, we encourage you to share your ideas, voice your distinct perspective, and bring your whole self with you, all within a support-minded culture that celebrates what makes each of us unique. We also recognize that learning is a lifelong pursuit and want you to flourish. We proudly offer added opportunities for training, development, and continuing education, further empowering you to live the career you want.

About HARMAN: Where Innovation Unleashes Next-Level Technology
Ever since the 1920s, we’ve been amplifying the sense of sound. Today, that legacy endures, with integrated technology platforms that make the world smarter, safer, and more connected. Across automotive, lifestyle, and digital transformation solutions, we create innovative technologies that turn ordinary moments into extraordinary experiences. Our renowned automotive and lifestyle solutions can be found everywhere, from the music we play in our cars and homes to venues that feature today’s most sought-after performers, while our digital transformation solutions serve humanity by addressing the world’s ever-evolving needs and demands. Marketing our award-winning portfolio under 16 iconic brands, such as JBL, Mark Levinson, and Revel, we set ourselves apart by exceeding the highest engineering and design standards for our customers, our partners, and each other. If you’re ready to innovate and do work that makes a lasting impact, join our talent community today!

Important Notice: Recruitment Scams
Please be aware that HARMAN recruiters will always communicate with you from an '@harman.com' email address. We will never ask for payments, banking, credit card, or personal financial information, or access to your LinkedIn/email account, during the screening, interview, or recruitment process.
If you are asked for such information, or receive communication from an email address not ending in '@harman.com' about a job with HARMAN, please cease communication immediately and report the incident to harmancareers@harman.com.

HARMAN is proud to be an Equal Opportunity employer. HARMAN strives to hire the best qualified candidates and is committed to building a workforce representative of the diverse marketplaces and communities of our global colleagues and customers. All qualified applicants will receive consideration for employment without regard to race, religion, color, national origin, gender (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics. HARMAN attracts, hires, and develops employees based on merit, qualifications, and job-related performance. (www.harman.com)
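The data-reconciliation work this posting describes can be pictured with a minimal sketch. All dataset, key, and field names below are hypothetical, and in practice this logic would run in PySpark against Data Lake tables rather than Python lists:

```python
# Illustrative reconciliation of a source dataset against the pipeline's
# target output, checking completeness and consistency on a key column.
# (Hypothetical data; a real test would read Spark DataFrames instead.)

def reconcile(source_rows, target_rows, key):
    """Return keys that are missing, unexpected, or mismatched in the target."""
    src = {row[key]: row for row in source_rows}
    tgt = {row[key]: row for row in target_rows}
    missing = sorted(src.keys() - tgt.keys())      # rows lost by the pipeline
    unexpected = sorted(tgt.keys() - src.keys())   # rows with no source record
    mismatched = sorted(
        k for k in src.keys() & tgt.keys() if src[k] != tgt[k]
    )                                              # rows altered in transit
    return {"missing": missing, "unexpected": unexpected, "mismatched": mismatched}

source = [{"id": 1, "amount": 100}, {"id": 2, "amount": 250}, {"id": 3, "amount": 75}]
target = [{"id": 1, "amount": 100}, {"id": 2, "amount": 999}]

report = reconcile(source, target, key="id")
# report flags id 3 as missing and id 2 as mismatched
```

A regression suite of the kind the role describes would assert that all three lists are empty after each ETL run.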

Posted 1 week ago

Apply

5.0 - 8.0 years

9 - 10 Lacs

Gurgaon

Remote

Location: Gurgaon - Haryana, India
Additional Locations: Bangalore - Karnataka, India - Manyata Park Outer Ring Road; IN_Chennai_RMZ One Paramount_HII; IN_Coimbatore_Echo Point Plaza_HCS; IN_Hyderabad_AWFIS Space Solutions_HCS; Pune - Maharashtra, India - Rajiv Gandhi Infotech Park
Job Family: Engineering
Worker Type Reference: Regular - Permanent
Pay Rate Type: Salary
Career Level: T3(A)
Job ID: R-45175-2025

Description & Requirements

Introduction: A Career at HARMAN Digital Transformation Solutions (DTS)
We’re a global, multi-disciplinary team that’s putting the innovative power of technology to work and transforming tomorrow. At HARMAN DTS, you solve challenges by creating innovative solutions. Combine the physical and digital, making technology a more dynamic force to solve challenges and serve humanity’s needs. Work at the convergence of cross-channel UX, cloud, insightful data, IoT, and mobility. Empower companies to create new digital business models, enter new markets, and improve customer experiences.

What You Will Do
- Develop and execute test scripts to validate data pipelines, transformations, and integrations.
- Formulate and maintain test strategies, including smoke, performance, functional, and regression testing, to ensure data processing and ETL jobs meet requirements.
- Collaborate with development teams to assess changes in data workflows and update test cases to preserve data integrity.
- Design and run tests for data validation, storage, and retrieval using Azure services like Data Lake, Synapse, and Data Factory, adhering to industry standards.
- Continuously enhance automated tests as new features are developed, ensuring timely delivery per defined quality standards.
- Participate in data reconciliation and verify Data Quality frameworks to maintain data accuracy, completeness, and consistency across the platform.
- Share knowledge and best practices by collaborating with business analysts and technology teams to document testing processes and findings.
- Communicate testing progress effectively with stakeholders, highlighting issues or blockers and ensuring alignment with business objectives.
- Maintain a comprehensive understanding of the Azure Data Lake platform's data landscape to ensure thorough testing coverage.

What You Need to Be Successful
- 5-8 years of QA experience with a strong focus on Big Data testing, particularly in Data Lake environments on Azure.
- Proficiency in Azure Data Factory, Azure Synapse Analytics, and Databricks for big data processing and scaled data quality checks.
- Proficiency in SQL, with the ability to write and optimize both simple and complex queries for data validation and testing.
- Proficiency in PySpark, with experience in data manipulation and transformation and a demonstrated ability to write and execute test scripts for data processing and validation.
- Hands-on experience with functional and system integration testing in big data environments, ensuring seamless data flow and accuracy across multiple systems.
- Ability to design and execute test cases in a behaviour-driven development environment.
- Fluency in Agile methodologies, with active participation in Scrum ceremonies and a strong understanding of Agile principles.
- Familiarity with tools like Jira, including experience with X-Ray or Jira Zephyr for defect management and test case management.
- Proven experience working on high-traffic, large-scale software products, ensuring data quality, reliability, and performance under demanding conditions.

What We Offer
- Flexible work environment, allowing for full-time remote work globally for positions that can be performed outside a HARMAN or customer location.
- Access to employee discounts on world-class HARMAN and Samsung products (JBL, Harman Kardon, AKG, etc.).
- Extensive training opportunities through our own HARMAN University.
- Competitive wellness benefits.
- Tuition reimbursement.
- “Be Brilliant” employee recognition and rewards program.
- An inclusive and diverse work environment that fosters and encourages professional and personal development.

You Belong Here
HARMAN is committed to making every employee feel welcomed, valued, and empowered. No matter what role you play, we encourage you to share your ideas, voice your distinct perspective, and bring your whole self with you, all within a support-minded culture that celebrates what makes each of us unique. We also recognize that learning is a lifelong pursuit and want you to flourish. We proudly offer added opportunities for training, development, and continuing education, further empowering you to live the career you want.

About HARMAN: Where Innovation Unleashes Next-Level Technology
Ever since the 1920s, we’ve been amplifying the sense of sound. Today, that legacy endures, with integrated technology platforms that make the world smarter, safer, and more connected. Across automotive, lifestyle, and digital transformation solutions, we create innovative technologies that turn ordinary moments into extraordinary experiences. Our renowned automotive and lifestyle solutions can be found everywhere, from the music we play in our cars and homes to venues that feature today’s most sought-after performers, while our digital transformation solutions serve humanity by addressing the world’s ever-evolving needs and demands. Marketing our award-winning portfolio under 16 iconic brands, such as JBL, Mark Levinson, and Revel, we set ourselves apart by exceeding the highest engineering and design standards for our customers, our partners, and each other. If you’re ready to innovate and do work that makes a lasting impact, join our talent community today!

HARMAN is an Equal Opportunity/Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or protected veteran status. HARMAN offers a great work environment, challenging career opportunities, professional training, and competitive compensation. (www.harman.com)
Important Notice: Recruitment Scams
Please be aware that HARMAN recruiters will always communicate with you from an '@harman.com' email address. We will never ask for payments, banking, credit card, or personal financial information, or access to your LinkedIn/email account, during the screening, interview, or recruitment process. If you are asked for such information, or receive communication from an email address not ending in '@harman.com' about a job with HARMAN, please cease communication immediately and report the incident to harmancareers@harman.com.

HARMAN is proud to be an Equal Opportunity employer. HARMAN strives to hire the best qualified candidates and is committed to building a workforce representative of the diverse marketplaces and communities of our global colleagues and customers. All qualified applicants will receive consideration for employment without regard to race, religion, color, national origin, gender (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics. HARMAN attracts, hires, and develops employees based on merit, qualifications, and job-related performance. (www.harman.com)

Posted 1 week ago

Apply

16.0 years

2 - 6 Lacs

Gurgaon

On-site

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data, and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits, and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities:
- Design and develop applications and services running on Azure, with a strong emphasis on Azure Databricks, ensuring optimal performance, scalability, and security
- Build and maintain data pipelines using Azure Databricks and other Azure data integration tools
- Write, read, and debug Spark, Scala, and Python code to process and analyze large datasets
- Write extensive queries in SQL and Snowflake
- Implement security and access control measures, and regularly audit the Azure platform and infrastructure to ensure compliance
- Create, understand, and validate the design and estimated effort for a given module/task, and be able to justify it
- Apply solid troubleshooting skills to resolve issues across different technologies and environments
- Implement and adhere to best engineering practices: design, unit testing, functional test automation, continuous integration, and delivery
- Maintain code quality by writing clean, maintainable, and testable code
- Monitor performance and optimize resources to ensure cost-effectiveness and high availability
- Define and document best practices and strategies for application deployment and infrastructure maintenance
- Provide technical support and consultation for infrastructure questions
- Help develop, manage, and monitor continuous integration and delivery systems
- Take accountability and ownership of features and teamwork
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or reassignment to different work locations, changes in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary, or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

Required Qualifications:
- B.Tech or MCA (16+ years of formal education)
- Overall 7+ years of experience
- 5+ years of experience writing advanced-level SQL
- 3+ years of experience in Azure (ADF), Databricks, and DevOps
- 3+ years of experience architecting, designing, developing, and implementing cloud solutions on Azure
- 2+ years of experience writing, reading, and debugging Spark, Scala, and Python code
- Proficiency in programming languages and scripting tools
- Understanding of cloud data storage and database technologies such as SQL and NoSQL
- Familiarity with DevOps practices and tools, such as continuous integration and continuous deployment (CI/CD) and Terraform
- Proven ability to collaborate with multidisciplinary teams of business analysts, developers, data scientists, and subject-matter experts
- Proven proactive approach to spotting problems, areas for improvement, and performance bottlenecks
- Proven excellent communication, writing, and presentation skills
- Experience interacting with international customers to gather requirements and convert them into solutions using relevant skills

Preferred Qualifications:
- Experience and skills with Snowflake
- Knowledge of AI/ML or LLMs (GenAI)
- Knowledge of the US healthcare domain and experience with healthcare data

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone.
We believe everyone–of every race, gender, sexuality, age, location and income–deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes — an enterprise priority reflected in our mission.
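The advanced SQL data-quality work this role calls for can be sketched with a self-contained example. The `claims` table, columns, and validation rules below are invented for illustration, and sqlite3 stands in for the Snowflake or Azure SQL targets the posting actually names:

```python
# A hedged sketch of rule-based SQL data validation: each WHERE branch
# encodes one quality rule, and any returned row is a defect to triage.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE claims (claim_id INTEGER PRIMARY KEY, member_id TEXT, amount REAL);
    INSERT INTO claims VALUES (1, 'M001', 120.0), (2, 'M002', -5.0),
                              (3, NULL, 80.0), (4, 'M003', 60.0);
""")

failures = conn.execute("""
    SELECT claim_id,
           CASE WHEN amount < 0 THEN 'negative_amount'
                ELSE 'missing_member_id' END AS rule
    FROM claims
    WHERE amount < 0 OR member_id IS NULL
    ORDER BY claim_id
""").fetchall()
# claims 2 and 3 fail, each tagged with the rule it violated
```

In production, queries like this would be parameterized per table and scheduled alongside the Databricks pipelines they validate.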

Posted 1 week ago

Apply

15.0 years

0 Lacs

Bhubaneshwar

On-site

Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-Have Skills: Databricks Unified Data Analytics Platform
Good-to-Have Skills: NA
Minimum Experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application development aligns with business objectives, overseeing project timelines, and facilitating communication among stakeholders. You will also engage in problem-solving activities, providing guidance and support to your team while ensuring that best practices are followed throughout the development process. Your role will be pivotal in driving the success of application initiatives and fostering a collaborative environment.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate training and development opportunities for team members to enhance their skills.
- Monitor project progress and implement necessary adjustments to meet deadlines.

Professional & Technical Skills:
- Must-have: proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration and ETL processes.
- Experience with cloud computing platforms and services.
- Familiarity with data governance and compliance standards.
- Ability to analyze and interpret complex data sets.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Bhubaneswar office.
- 15 years of full-time education is required.

Posted 1 week ago

Apply

1.0 years

3 - 9 Lacs

Noida

On-site

Senior Data & Applied Scientist
Noida, Uttar Pradesh, India
Date posted: Jul 14, 2025
Job number: 1844835
Work site: Microsoft on-site only
Travel: None
Role type: Individual Contributor
Profession: Research, Applied, & Data Sciences
Discipline: Data Science
Employment type: Full-Time

Overview
Do you want to be on the leading edge of using big data and help drive engineering and product decisions for the biggest productivity software on the planet? Office Product Group (OPG) has embarked on a mission to delight our customers by using data-informed engineering to develop compelling products and services. OPG is looking for an experienced professional with a passion for delivering business value with data insights and analytics to join our team as a Data & Applied Scientist.

We are looking for a strong Senior Data Scientist with a proven track record of solving large, complex data analysis problems in a real-world software product development setting. Ideal candidates should be able to take a business or engineering problem from a Product Manager or Engineering leader and translate it into a data problem. This includes all the steps to identify and deeply understand potential data sources, conduct the appropriate analysis to reveal actionable insights, and then operationalize the metrics or solution into PowerBI dashboards. You will be delivering results through innovation and persistence when similar candidates have given up.

Microsoft’s mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond.
Qualifications
Required Qualifications:
- Doctorate in Data Science, Mathematics, Statistics, Econometrics, Economics, Operations Research, Computer Science, or a related field AND 1+ year(s) of data-science experience (e.g., managing structured and unstructured data, applying statistical techniques and reporting results), OR
- Master's degree in Data Science, Mathematics, Statistics, Econometrics, Economics, Operations Research, Computer Science, or a related field AND 3+ years of data-science experience (e.g., managing structured and unstructured data, applying statistical techniques and reporting results), OR
- Bachelor's degree in Data Science, Mathematics, Statistics, Econometrics, Economics, Operations Research, Computer Science, or a related field AND 5+ years of data-science experience (e.g., managing structured and unstructured data, applying statistical techniques and reporting results), OR
- Equivalent experience.
- 2+ years of customer-facing, project-delivery, professional services, and/or consulting experience.

Preferred Qualifications:
- 7+ years of experience programming with Python/R and hands-on experience using technologies such as SQL, Kusto, Databricks, Spark, etc.
- 7+ years of experience working with data exploration and data visualization tools like PowerBI or similar.
- Ability to communicate complex ideas and concepts to leadership and deliver results.
- Comfort manipulating and analyzing complex, high-dimensional data from varying sources to solve difficult problems.
- Bachelor's or higher degree in Computer Science, Statistics, Mathematics, Physics, Engineering, or related disciplines.

Responsibilities
Dashboard Development and Maintenance: Design, build, and maintain interactive dashboards and reports in PowerBI to visualize key business metrics and insights. Work closely with stakeholders to understand their data visualization needs and translate business requirements into technical specifications.
Data Extraction and Analysis: Perform ad-hoc data extraction and analysis from various data sources, including SQL databases, cloud-based data storage solutions, and external APIs. Ensure data accuracy and integrity in reporting and analysis. Deliver high-impact analysis to diagnose and drive business-critical insights that guide product and business development.

Metric Development and Tracking: Be the SME who understands the landscape of what data (telemetry) is and should be captured. Advise feature teams on telemetry best practices to ensure business needs for data are met. Collaborate with product owners and other stakeholders to define and track key performance indicators (KPIs) and other relevant metrics for business performance. Identify trends and insights in the data to support decision-making processes.

User Journey and Funnel Analysis: Assist product owners in mapping out user journeys and funnels to understand user behavior and identify opportunities for feature improvement. Develop and implement ML models to analyze user journeys and funnels. Utilize a variety of techniques to uncover patterns in user behavior that can help improve the product.

Forecasting and Growth Analysis: Support the forecasting of key results (KRs) and growth metrics through data analysis and predictive modeling. Provide insights and recommendations to help drive strategic planning and execution.

Benefits/perks listed below may vary depending on the nature of your employment with Microsoft and the country where you work.
- Industry-leading healthcare
- Educational resources
- Discounts on products and services
- Savings and investments
- Maternity and paternity leave
- Generous time away
- Giving programs
- Opportunities to network and connect

Microsoft is an equal opportunity employer.
All qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
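The funnel analysis this role describes boils down to counting how many users complete each successive stage of a journey. A minimal sketch, with hypothetical event names and an invented three-stage funnel:

```python
# Count users reaching each stage of an ordered funnel. A user "reaches"
# a stage only if they have completed every prior stage as well.
# (Stage names and the event log are illustrative, not real telemetry.)

FUNNEL = ["open_app", "create_doc", "share_doc"]

def funnel_counts(user_events):
    """Return (stage, users_reached) pairs for each funnel stage in order."""
    counts = []
    for i, stage in enumerate(FUNNEL):
        required = set(FUNNEL[: i + 1])   # this stage plus all earlier ones
        reached = sum(1 for events in user_events.values()
                      if required <= set(events))
        counts.append((stage, reached))
    return counts

events = {
    "u1": ["open_app", "create_doc", "share_doc"],
    "u2": ["open_app", "create_doc"],
    "u3": ["open_app"],
}
result = funnel_counts(events)
# counts fall from 3 to 2 to 1 as users drop out at each stage
```

Dividing adjacent counts gives per-stage conversion rates, the usual input to the drop-off analysis and dashboards the posting mentions.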

Posted 1 week ago

Apply

5.0 years

0 Lacs

Andhra Pradesh

Remote

ABOUT EVERNORTH:
Evernorth℠ exists to elevate health for all, because we believe health is the starting point for human potential and progress. As champions for affordable, predictable, and simple health care, we solve the problems others don’t, won’t, or can’t. Our innovation hub in India will allow us to work with the right talent, expand our global footprint, improve our competitive stance, and better deliver on our promises to stakeholders. We are passionate about making healthcare better by delivering world-class solutions that make a real difference. We are always looking upward. And that starts with finding the right talent to help us get there.

Software Engineering Lead Analyst

Position Overview
The job profile for this position is Software Engineering Developer - Lead Analyst, which is a Band 3+ Contributor Career Track role. Excited to grow your career? We value our talented employees, and whenever possible we strive to help one of our associates grow professionally before recruiting new talent to our open positions. If you think the open position you see is right for you, we encourage you to apply! Our people make all the difference in our success.

Join our dynamic team as a Data Engineering Developer Lead Analyst, where you'll leverage your expertise in cutting-edge technologies to drive innovation and strategic influence. As a key contributor, you'll design, build, test, and deliver large-scale software applications and platforms, collaborating closely with IT and business teams to own and drive major deliverables. You'll automate processes using Databricks and Azure, and work in a team that values innovation and cloud-first, self-service-first, and automation-first mindsets. You’ll need to demonstrate strong analytical and technical skills to positively influence data engineering product delivery. Engage with internal and external stakeholders to build solutions as part of Enterprise Data Engineering.
Your proven domain expertise and exceptional problem-solving skills will be crucial in understanding technical concepts, addressing business issues, and breaking down large problems into manageable ones.

Responsibilities
The candidate will be responsible for delivering business needs end to end, from requirements through development into production. Through a hands-on engineering approach in the Databricks environment, this individual will deliver data engineering toolchains, platform capabilities, and reusable patterns. The applicant will be responsible for following software engineering best practices with an automation-first approach and a continuous learning and improvement mindset, and will ensure adherence to enterprise architecture direction and architectural standards. The applicant should be able to collaborate in a high-performing team environment, with an ability to influence and be influenced by others.

Qualifications
Required Skills:
Cloud-based security principles and protocols like OAuth2, JWT, data encryption, hashing data, secret management, etc.
Expertise in big data technologies such as Spark, Hadoop, Databricks, Snowflake, EMR, Azure Data Factory
Good understanding of Kafka, Kafka Streams, Spark Structured Streaming, and configuration-driven data transformation and curation
Expertise in building cloud-native microservices, containers, Kubernetes, and platform-as-a-service technologies such as OpenShift, Cloud Foundry
Experience in multi-cloud software-as-a-service products such as Databricks, Snowflake
Experience in Infrastructure-as-Code (IaC) tools such as Terraform
Experience in messaging systems such as Apache ActiveMQ, WebSphere MQ, Apache Artemis, Kafka, Azure Service Bus
Experience in API and microservices stacks such as Spring Boot, Quarkus
Expertise in cloud technologies such as Azure Data Factory, Lambda, S3, Elasticsearch, API Gateway, CloudFront
Experience with one or more of the following programming and scripting languages – Python, Scala, JVM-based languages, or JavaScript – and the ability to pick up new languages
Experience in building CI/CD pipelines using Jenkins, GitHub Actions, Azure DevOps
Strong expertise with source code management and its best practices
Proficiency in self-testing of applications, unit testing, use of mock frameworks, and test-driven development (TDD)
Knowledge of the Behavior-Driven Development (BDD) approach
Ability to perform detailed analysis of business problems and technical environments
Strong oral and written communication skills
Ability to think strategically, implement iteratively, and estimate the financial impact of design/architecture alternatives
Continuous focus on ongoing learning and development
Expertise in Agile software development principles and patterns
Expertise in building streaming, batch, and event-driven architectures and data pipelines

Required Experience & Education:
5 to 7 years of experience in software engineering, building data engineering pipelines, middleware and API development, and automation
More than 1 year of experience in Databricks
within an Azure environment
Data engineering experience
Bachelor's degree in Statistics, Mathematics, Computer Science, Economics, or a related field; Master's degree preferred.

Location & Hours of Work (Specify whether the position is remote, hybrid, in-office and where the role is located as well as the required hours of work)

Equal Opportunity Statement
Evernorth is an Equal Opportunity Employer actively encouraging and supporting organization-wide involvement of staff in diversity, equity, and inclusion efforts to educate, inform and advance both internal practices and external work with diverse client populations.

About Evernorth Health Services
Evernorth Health Services, a division of The Cigna Group, creates pharmacy, care and benefit solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention and treatment of illness and disease more accessible to millions of people. Join us in driving growth and improving lives.
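The required skills above include hashing data and secret management. As a minimal, hedged illustration (not Evernorth's implementation), the following stdlib-only sketch contrasts a fast hash, suited to data fingerprinting, with a salted, iterated hash, suited to credential storage; the record contents are invented:

```python
import hashlib
import hmac
import os

# Fast hash: good for data fingerprinting / change detection, NOT for passwords.
record = b'{"member_id": 1001, "plan": "A"}'  # invented example payload
fingerprint = hashlib.sha256(record).hexdigest()

# Salted, iterated hash: appropriate for storing credentials.
password = b"s3cret"
salt = os.urandom(16)
stored = hashlib.pbkdf2_hmac("sha256", password, salt, 600_000)

# Verification recomputes with the same salt and compares in constant time.
candidate = hashlib.pbkdf2_hmac("sha256", password, salt, 600_000)
assert hmac.compare_digest(stored, candidate)

print(fingerprint[:16], len(stored))
```

In practice the salt and iteration count are stored alongside the derived key, and production systems would source secrets from a vault rather than hard-coding them.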

Posted 1 week ago

Apply

0 years

9 - 10 Lacs

Calcutta

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Manager

Job Description & Summary
At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Responsibilities:
· Requirement gathering and analysis
· Design of data architecture and data models to ingest data
· Experience with different databases like Synapse, SQL DB, Snowflake, etc.
· Design and implement data pipelines using Azure Data Factory, Databricks, Synapse
· Create and manage Azure SQL Data Warehouses and Azure Cosmos DB databases
· Extract, transform, and load (ETL) data from various sources into Azure Data Lake Storage
· Implement data security and governance measures
· Monitor and optimize data pipelines for performance and efficiency
· Troubleshoot and resolve data engineering issues
· Hands-on experience with Azure Functions and other components like real-time streaming
· Oversee Azure billing processes, conducting analyses to ensure cost-effectiveness and efficiency in data operations
· Provide optimized solutions for any problem related to data engineering
· Ability to work with a variety of sources like relational DBs, APIs, file systems, real-time streams, CDC, etc.
· Strong knowledge of Databricks and Delta tables

Mandatory skill sets: SQL, ADF, ADLS, Synapse, PySpark, Databricks, data modelling
Preferred skill sets: PySpark, Databricks
Years of experience required: 7 – 10 yrs
Education qualification: B.Tech/MCA and MBA
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Master of Business Administration, Bachelor of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Structured Query Language (SQL)
Optional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Coaching and Feedback, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion {+ 21 more}
Desired Languages (If blank, desired languages not specified)
Travel Requirements
Available for Work Visa Sponsorship?
Government Clearance Required?
Job Posting End Date
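The responsibilities above follow a standard extract-transform-load shape. As a toy, stdlib-only sketch of that pattern (table and column names are invented for illustration; a real pipeline here would run on Azure Data Factory or Databricks rather than sqlite3):

```python
import sqlite3

# Extract: rows from a hypothetical source system.
source_rows = [
    ("2024-01-05", "IN-001", "1,250.50"),
    ("2024-01-06", "IN-002", "980.00"),
    ("2024-01-06", "IN-002", "980.00"),  # duplicate to be dropped
]

# Transform: deduplicate on invoice number and cast amounts to numeric.
seen, clean = set(), []
for day, invoice, amount in source_rows:
    if invoice in seen:
        continue
    seen.add(invoice)
    clean.append((day, invoice, float(amount.replace(",", ""))))

# Load: write into a warehouse-style fact table.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE fact_invoice (day TEXT, invoice TEXT, amount REAL)")
db.executemany("INSERT INTO fact_invoice VALUES (?, ?, ?)", clean)

total = db.execute("SELECT SUM(amount) FROM fact_invoice").fetchone()[0]
print(total)  # 2230.5
```

The same three stages map onto an ADF copy activity (extract/load) plus a Databricks notebook (transform) in the stack this role describes.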

Posted 1 week ago

Apply

3.0 years

0 Lacs

Calcutta

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Microsoft
Management Level: Associate

Job Description & Summary
At PwC, our people in software and product innovation focus on developing cutting-edge software solutions and driving product innovation to meet the evolving needs of clients. These individuals combine technical experience with creative thinking to deliver innovative software products and solutions. Those in software engineering at PwC will focus on developing innovative software solutions to drive digital transformation and enhance business performance. In this field, you will use your knowledge to design, code, and test cutting-edge applications that revolutionise industries and deliver exceptional user experiences.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Responsibilities:
1. Must have a minimum of 3 years of experience in data modelling and data visualization with Microsoft Power BI.
2. Can develop and design Power BI dashboards and publish them to the Power BI service. Good knowledge of data gateways.
3. Must have a strong background in writing DAX, SQL queries and Python.
4. Can work independently to perform data analysis and build visualizations.
5. Good to have exposure to Azure Data Factory, Power Automate, Databricks.
6. Good communication skills and a team player.
7.
Should have PL-300 / DA-100 certification cleared

Mandatory skill sets: Power BI Developer
Preferred skill sets: Azure Data Factory, Power Automate, Databricks
Years of experience required: 2 – 4 years
Education qualification: B.E./B.Tech/MCA
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor of Technology, Bachelor of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Azure Data Factory, Microsoft Power Automate, Power BI
Optional Skills: Well Being
Desired Languages (If blank, desired languages not specified)
Travel Requirements
Available for Work Visa Sponsorship?
Government Clearance Required?
Job Posting End Date

Posted 1 week ago

Apply

8.0 - 10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Overview
PepsiCo Data BI & Integration Platforms is seeking a mid-level Cloud Platform technology leader responsible for overseeing the deployment and maintenance of big data and analytics cloud infrastructure projects on Azure/AWS for its Europe, AMESA, and APAC PepsiCo businesses and Global Business Services. The ideal candidate will have hands-on experience with Azure/AWS services: Infrastructure as Code (IaC), platform provisioning & administration, cloud network design, cloud security principles, and automation.

Responsibilities: Cloud Infrastructure & Automation
Implement and support application migration, modernization, and transformation projects, leveraging cloud-native technologies and methodologies.
Implement cloud infrastructure policies, standards, and best practices, ensuring the cloud environment adheres to security and regulatory requirements.
Design, deploy and optimize cloud-based infrastructure using Azure/AWS services that meet the performance, availability, scalability, and reliability needs of our applications and services.
Drive troubleshooting of cloud infrastructure issues, ensuring timely resolution and root cause analysis by partnering with the global cloud center of excellence, enterprise application teams, and PepsiCo premium cloud partners (Microsoft, AWS).
Establish and maintain effective communication and collaboration with internal and external stakeholders, including business leaders, developers, customers, and vendors.
Develop Infrastructure as Code (IaC) to automate provisioning and management of cloud resources.
Write and maintain scripts for automation and deployment using PowerShell, Python, or the Azure/AWS CLI.
Work with stakeholders to document architectures, configurations, and best practices.
Knowledge of cloud security principles around data protection, identity and access management (IAM), compliance and regulation, threat detection and prevention, disaster recovery, and business continuity.
Qualifications
Bachelor’s degree in Computer Science.
At least 8 to 10 years of experience in IT cloud infrastructure, architecture and operations, including security, with at least 6 years in a technical leadership role.
Strong knowledge of cloud architecture, design, and deployment principles and practices, including microservices, serverless, containers, and DevOps.
Deep expertise in Azure/AWS big data & analytics technologies, including Databricks, real-time data ingestion, data warehouses, serverless ETL, NoSQL databases, DevOps, Kubernetes, virtual machines, web/function apps, and monitoring and security tools.
Deep expertise in Azure/AWS networking and security fundamentals, including network endpoints & network security groups, firewalls, external/internal DNS, load balancers, virtual networks and subnets.
Proficient in scripting and automation tools, such as PowerShell, Python, Terraform, and Ansible.
Excellent problem-solving, analytical, and communication skills, with the ability to explain complex technical concepts to non-technical audiences.
Certifications in Azure/AWS platform administration, networking and security are preferred.
Strong self-organization, time management and prioritization skills.
A high level of attention to detail, excellent follow-through, and reliability.
Strong collaboration, teamwork and relationship-building skills across multiple levels and functions in the organization.
Ability to listen, establish rapport, and build credibility as a strategic partner vertically within the business unit or function, as well as with leadership and functional teams.
Strategic thinker focused on business-value results that utilize technical solutions.
Strong communication skills in writing, speaking, and presenting.
Capable of working effectively in a multi-tasking environment.
Fluent in English.
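Responsibilities such as enforcing cloud infrastructure policies with IaC often reduce to automated checks over resource definitions. As a small, hypothetical sketch (the resource dicts and required tags are invented; in practice the input might be a parsed Terraform plan or an Azure Resource Graph export):

```python
# Hypothetical resource definitions, e.g. parsed from a Terraform plan JSON.
resources = [
    {"name": "vm-etl-01", "tags": {"owner": "data-eng", "env": "prod"}},
    {"name": "sa-raw-data", "tags": {"owner": "data-eng"}},           # missing env
    {"name": "kv-secrets", "tags": {"env": "prod", "owner": "sec"}},
]

REQUIRED_TAGS = {"owner", "env"}  # invented policy for illustration

def non_compliant(resources):
    """Return (name, missing_tags) for every resource lacking a required tag."""
    report = []
    for r in resources:
        missing = REQUIRED_TAGS - set(r.get("tags", {}))
        if missing:
            report.append((r["name"], sorted(missing)))
    return report

print(non_compliant(resources))  # [('sa-raw-data', ['env'])]
```

A check like this would typically run in a CI/CD pipeline gate so that non-compliant resources never reach the cloud environment.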

Posted 1 week ago

Apply

3.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

At Arctic Wolf, we're redefining the cybersecurity landscape. With our employee Pack members spread out globally, we are committed to setting new industry standards. Our accomplishments speak for themselves, from our recognition in the Forbes Cloud 100, CNBC Disruptor 50, Fortune Future 50, and Fortune Cyber 60 to winning the 2024 CRN Products of the Year award. We’re proud to be named a Leader in the IDC MarketScape for Worldwide Managed Detection and Response Services and to have earned a Customers' Choice distinction from Gartner Peer Insights. Our Aurora Platform also received CRN’s Products of the Year award in the inaugural Security Operations Platform category. Join a company that’s not only leading, but also shaping, the future of security operations.

Our mission is simple: End Cyber Risk. We’re looking for a Developer to be part of making this happen.

About The Team
Our team is at the forefront of integrating cybersecurity operations with advanced data intelligence. We transform complex security telemetry into actionable insights by leveraging cutting-edge AI technologies and scalable data platforms like Databricks. Our work supports multiple security products and internal teams, providing the foundation for advanced threat intelligence. We are evolving our capabilities by developing intelligent cybersecurity data processing AI agents to enhance automation, efficiency, and analytical capabilities in threat detection and response.

About The Role
As a Developer, you will build AI-powered platforms that transform massive security telemetry into actionable intelligence. Your expertise in Python, SQL, Large Language Models (LLMs), and API development will drive sophisticated, data-driven security solutions. You will collaborate closely with engineering and security research teams to develop AI-driven cybersecurity data processing agents that automate, enrich, and analyze security data for enhanced threat detection and response.
Responsibilities
Build natural language interfaces for querying complex security data.
Design APIs and integration layers for AI-powered security analytics platforms.
Use Large Language Models (LLMs) for data enrichment and analysis to build AI-powered cybersecurity workflows.
Design and implement intelligent agents to streamline threat intelligence analysis.
Collaborate with security researchers, threat analysts, and engineers to enhance data-driven security capabilities.
Ensure high-quality software development through rigorous testing, documentation, and best practices.

Required Qualifications
3+ years of experience in software development, preferably in cybersecurity telemetry workflows.
Strong proficiency in Python and SQL for advanced data processing.
Experience integrating Large Language Models (LLMs) using LangGraph and building AI agents.
Expertise in Databricks and Apache Spark for scalable data analytics.
Hands-on experience developing REST APIs for data services and integrations.
Expertise in cloud-based data processing on AWS.
Experience with modern development methodologies such as Agile and Kanban.
Bachelor’s degree in Computer Science, Engineering, or equivalent technical experience.

Preferred Qualifications
Experience with cybersecurity telemetry (logs, alerts, network traffic, SIEM data, etc.) and a strong understanding of data engineering principles.
Familiarity with threat intelligence frameworks (MITRE ATT&CK, etc.).
Deep expertise in security data analysis and experience with AI-driven security analytics platforms.
Ability to translate complex security challenges into AI-powered solutions.
Advanced degree in Computer Science, Cybersecurity, or a related field.

Why Arctic Wolf?
At Arctic Wolf, we foster a collaborative and inclusive work environment that thrives on diversity of thought, background, and culture.
This is reflected in our multiple awards, including Top Workplace USA (2021-2024), Best Places to Work – USA (2021-2024), Great Place to Work – Canada (2021-2024), Great Place to Work – UK (2024), and Kununu Top Company – Germany (2024). Our commitment to bold growth and shaping the future of security operations is matched by our dedication to customer satisfaction, with over 7,000 customers worldwide and more than 2,000 channel partners globally. As we continue to expand globally and enhance our technology, Arctic Wolf remains the most trusted name in the industry. Our Values Arctic Wolf recognizes that success comes from delighting our customers, so we work together to ensure that happens every day. We believe in diversity and inclusion, and truly value the unique qualities and unique perspectives all employees bring to the organization. And we appreciate that—by protecting people’s and organizations’ sensitive data and seeking to end cyber risk— we get to work in an industry that is fundamental to the greater good. We celebrate unique perspectives by creating a platform for all voices to be heard through our Pack Unity program. We encourage all employees to join or create a new alliance. See more about our Pack Unity here. We also believe and practice corporate responsibility, and have recently joined the Pledge 1% Movement, ensuring that we continue to give back to our community. We know that through our mission to End Cyber Risk we will continue to engage and give back to our communities. All wolves receive compelling compensation and benefits packages, including: Equity for all employees Flexible annual leave, paid holidays and volunteer days Training and career development programs Comprehensive private benefits plan including medical insurance for you and your family, life insurance (3x compensation), and personal accident insurance. 
Fertility support and paid parental leave Arctic Wolf is an Equal Opportunity Employer and considers applicants for employment without regard to race, colour, religion, sex, orientation, national origin, age, disability, genetics, or any other basis forbidden under federal, provincial, or local law. Arctic Wolf is committed to fostering a welcoming, accessible, respectful, and inclusive environment ensuring equal access and participation for people with disabilities. As such, we strive to make our entire employee experience as accessible as possible and provide accommodations as required for candidates and employees with disabilities and/or other specific needs where possible. Please let us know if you require any accommodations by emailing recruiting@arcticwolf.com.
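The Developer role above centers on turning raw security telemetry into structured records. As a minimal, hypothetical illustration of that first parsing step (the log format and field names are invented, not Arctic Wolf's; real telemetry would arrive from a SIEM feed):

```python
import re
from collections import Counter

# Hypothetical auth-log lines standing in for a telemetry feed.
LOG_LINES = [
    "2024-03-01T10:02:11Z sshd FAILED user=root src=203.0.113.9",
    "2024-03-01T10:02:14Z sshd FAILED user=root src=203.0.113.9",
    "2024-03-01T10:05:02Z sshd ACCEPTED user=alice src=198.51.100.7",
]

PATTERN = re.compile(
    r"(?P<ts>\S+) sshd (?P<result>FAILED|ACCEPTED) user=(?P<user>\S+) src=(?P<src>\S+)"
)

def parse(lines):
    """Yield structured records; silently skip lines that do not match."""
    for line in lines:
        m = PATTERN.match(line)
        if m:
            yield m.groupdict()

records = list(parse(LOG_LINES))
failures_by_src = Counter(r["src"] for r in records if r["result"] == "FAILED")
print(failures_by_src)  # Counter({'203.0.113.9': 2})
```

Records in this shape are what downstream enrichment (including LLM-based analysis) and aggregation in Spark/Databricks would consume.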

Posted 1 week ago

Apply

8.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Senior Infrastructure Architect

Would being part of a digital transformation excite you? Are you passionate about infrastructure security?

Join our digital transformation team
We operate at the heart of the digital transformation of our business. Our team is responsible for the cybersecurity, architecture and data protection of our global organization. We advise on the design and validation of all systems, infrastructure, technologies and data protection.

Partner with the best
As a Senior Infrastructure Architect, you will be responsible for:
Participating in domain technical and business discussions relative to future architecture direction.
Assisting in the analysis, design and development of a roadmap and implementation based upon a current vs. future state in a cohesive architecture viewpoint.
Gathering and analyzing data and developing architectural requirements at the project level.
Participating in the infrastructure architecture governance model.
Supporting design and deployment of infrastructure solutions meeting standardization, consolidation, TCO, security, regulatory compliance and application system qualities for different businesses.
Researching and evaluating emerging technology, industry and market trends to assist in project development and/or operational support activities.
Coaching and mentoring team members.

Fuel your passion
To be successful in this role you will:
Have a Bachelor's degree and a minimum of 8 years of professional experience.
Have experience in Azure infrastructure services and automating deployments.
Have experience working in DevOps and Databricks.
Have hands-on experience working with database technologies, including ETL tools such as Databricks Workflows using PySpark / Python, and an ability to learn new technologies.
Have strong proficiency in writing and optimizing SQL queries and working with databases.
Have skilled-level expertise in the design of computing, network or storage solutions to meet business application system qualities.
Understand technical and business discussions relative to future architecture direction aligning with business goals.
Understand concepts of setting and driving architecture direction.
Be familiar with elements of gathering architecture requirements.
Understand architecture standards concepts to apply to project work.

Work in a way that works for you
We recognize that everyone is different and that the way in which people want to work and deliver at their best is different for everyone too. In this role, we can offer the following flexible working patterns:
Working remotely from home or any other work location
Flexibility in your work schedule to help fit in around life!
Talk to us about your desired flexible working options when you apply

Working with us
Our people are at the heart of what we do at Baker Hughes. We know we are better when all of our people are developed, engaged and able to bring their whole authentic selves to work. We invest in the health and well-being of our workforce, train and reward talent and develop leaders at all levels to bring out the best in each other.

About Us
We are an energy technology company that provides solutions to energy and industrial customers worldwide. Built on a century of experience and conducting business in over 120 countries, our innovative technologies and services are taking energy forward – making it safer, cleaner and more efficient for people and the planet.

Join Us
Are you seeking an opportunity to make a real difference in a company that values innovation and progress? Join us and become part of a team of people who will challenge and inspire you! Let’s come together and take energy forward. Baker Hughes Company is an Equal Opportunity Employer.
Employment decisions are made without regard to race, color, religion, national or ethnic origin, sex, sexual orientation, gender identity or expression, age, disability, protected veteran status or other characteristics protected by law. R139742

Posted 1 week ago

Apply

9.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Description
Databricks Engineer: 9+ years of total experience, with 5 years of relevant experience in the mandatory skills.

Mandatory Skills: Databricks, Hadoop, Python, Spark, Spark SQL, PySpark, Airflow and IBM StreamSets

Required Skills & Experience:
Develop data engineering and ML pipelines in Databricks and different AWS services, including S3, EC2, API, RDS, Kinesis/Kafka and Lambda, to build serverless applications.
Solid understanding of Databricks fundamentals/architecture, with hands-on experience in setting up Databricks clusters and working in Databricks modules (Data Engineering, ML and SQL Warehouse).
Knowledge of the medallion architecture, DLT and Unity Catalog within Databricks.
Experience in migrating data from on-prem Hadoop to Databricks/AWS.
Understanding of core AWS services, their uses, and AWS architecture best practices.
Hands-on experience in different domains, like database architecture, business intelligence, machine learning, advanced analytics, big data, etc.
Solid knowledge of Airflow.
Solid knowledge of CI/CD pipelines in AWS technologies.
Application migration of RDBMS, Java/Python applications, model code, Elastic, etc.
Solid programming background in Scala, Python, SQL.

Work Location: Bangalore
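The posting references the medallion (bronze/silver/gold) architecture. On Databricks this is implemented with Spark DataFrames and Delta tables; as a dependency-free sketch of the idea only, here is the bronze-to-silver step (raw ingestion to cleaned, deduplicated records) in plain Python, with invented fields:

```python
# Bronze: raw events as ingested, including duplicates and bad rows.
bronze = [
    {"id": "1", "ts": "2024-04-01", "amount": "10.0"},
    {"id": "1", "ts": "2024-04-01", "amount": "10.0"},  # duplicate
    {"id": "2", "ts": "2024-04-01", "amount": "oops"},  # unparseable, quarantined
    {"id": "3", "ts": "2024-04-02", "amount": "7.5"},
]

def to_silver(rows):
    """Clean and deduplicate bronze rows; return (silver, quarantine)."""
    silver, quarantine, seen = [], [], set()
    for row in rows:
        key = (row["id"], row["ts"])
        if key in seen:
            continue
        try:
            amount = float(row["amount"])
        except ValueError:
            quarantine.append(row)  # keep bad rows for inspection, never drop silently
            continue
        seen.add(key)
        silver.append({"id": row["id"], "ts": row["ts"], "amount": amount})
    return silver, quarantine

silver, quarantine = to_silver(bronze)
print(len(silver), len(quarantine))  # 2 1
```

The gold layer would then aggregate the silver records into business-facing tables; in a real pipeline DLT expectations and Delta `MERGE` handle the dedup/quarantine logic shown here by hand.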

Posted 1 week ago

Apply

8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Reference #: 314307BR
Job Type: Full Time

Your role
We are seeking a hands-on Data Engineer in the Group Compliance & Regulatory Governance (GCRG) Technology team to support a data integration, reporting & analytics platform, with a longer-term opportunity to contribute to our strategic Compliance Analytics and Reporting Platform.

Your team
You will be a key member of an expanding Data, Analytics & Reporting team, which is part of the Group Compliance & Regulatory Governance (GCRG) Technology team. We are a small, friendly bunch who take pride in the quality of work that we do. As a team, we provide data analytics and reporting solutions on top of a big data platform in close collaboration with other delivery teams, business partners and other global engineering teams.

Your expertise
A degree-level education, preferably in Computer Science (bachelor’s or master’s degree).
Ideally 8+ years of hands-on design and development experience in several of the relevant technology areas, preferably in a cloud environment (Lakehouse architecture, Azure Data Lake, Scala, Databricks, Spark/Spark SQL, Spring Boot, ReactJS, Kubernetes, Postgres).
Ideally 5+ years of hands-on experience in distributed processing using Databricks, Python/Apache Spark and Kafka, and leveraging the Airflow scheduler/executor framework.
Ideally 2+ years of hands-on programming experience in Scala (must have), Python & Java (preferred).
Coding skills in Java, SQL, Spring Boot, and other Java-based frameworks (added advantage).
Experience working on Microsoft Azure DevOps topics (added advantage).
Coding skills in UI development using ReactJS (optional).
Experience working with Agile development methodologies and delivering within Azure DevOps, with automated testing on tools used to support CI and release management.
Strong communication skills, both to management and teams.
Background in compliance and risk management.
A collaborative approach towards problem-solving, working closely with other colleagues in the global team, and sensitivity towards diversity.
Business knowledge in the Compliance and Risk Management areas.

About Us
UBS is the world’s largest and the only truly global wealth manager. We operate through four business divisions: Global Wealth Management, Personal & Corporate Banking, Asset Management and the Investment Bank. Our global reach and the breadth of our expertise set us apart from our competitors. We have a presence in all major financial centers in more than 50 countries.

How We Hire
We may request you to complete one or more assessments during the application process. Learn more

Disclaimer / Policy Statements
UBS is an Equal Opportunity Employer. We respect and seek to empower each individual and support the diverse cultures, perspectives, skills and experiences within our workforce.

Posted 1 week ago

Apply

4.0 - 6.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Company Description
Impetus Technologies enables the Intelligent Enterprise™ with innovative data engineering, cloud, and enterprise AI services. Recognized as an AWS Advanced Consulting Partner, Elite Databricks Consulting Partner, Data & AI Solutions Microsoft Partner, and Elite Snowflake Services Partner, Impetus offers a suite of cutting-edge IT services and solutions to drive innovation and transformation for businesses across various industries. With a proven track record with Fortune 500 clients, Impetus drives growth, enhances efficiency, and ensures a competitive edge through continuous innovation and flawless delivery.

Role Description
This is a full-time on-site role for a Senior BI Engineer. The Senior BI Engineer will be responsible for developing data models, building and maintaining ETL processes, creating and managing data warehouses, and developing dashboards. The role also involves conducting data analysis to generate insights and support business decisions, ensuring data quality and compliance, and collaborating with various teams to meet business requirements.

Experience: 4 to 6 years

Qualifications
Proficiency in data modeling and data warehousing
Strong skills in databases/data warehouses such as Oracle, MySQL, DB2, Databricks or Snowflake, with expertise in writing complex SQL queries
Sound knowledge of and experience in developing Power BI semantic models, or the equivalent for any relevant BI tool
Strong Extract, Transform, Load (ETL) skills
Experience in creating and managing dashboards
Excellent analytical skills
Good communication and teamwork abilities
Ability to work in a fast-paced, dynamic environment
Experience in the IT or consulting industry is a plus
Bachelor's degree in technology, Master of Computer Applications, or a related field

Posted 1 week ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

About The Role Grade Level (for internal use): 09 The Team: The team works in an Agile environment and adheres to all basic principles of Agile. As a Quality Engineer, you will work with a team of intelligent, ambitious, and hard-working software professionals. The team is independent in driving all decisions and responsible for the architecture, design and development of our products with high quality. The Impact Achieve individual objectives and contribute to the achievement of team objectives. Work on problems of moderate scope where analysis of situations or data requires a review of a variety of factors. ETL testing of feeds from various servers (Oracle, SQL Server, Hive, Databricks) using different testing strategies to ensure data quality, consistency, and timeliness. Achieve the above intelligently and economically using QA best practices. What Is In It For You Be part of a successful team that delivers top-priority projects directly contributing to the Company’s strategy. This is the place to enhance your testing skills while adding value to the business. As an experienced member of the team, you will have the opportunity to own and drive a project end to end and collaborate with developers, business analysts and product managers who are experts in their domains, which can help you build multiple skillsets. Responsibilities As a Quality Engineer, you are responsible for: Defining Quality Metrics: Defining quality standards and metrics for the current project/product. Working with all stakeholders to ensure that the quality metrics are reviewed, closed, and agreed upon. Create a list of milestones and checkpoints and set measurable criteria to check the quality on a timely basis. Defining Testing Strategies: Defining processes for the test plan and the several phases of the testing cycle. Planning and scheduling several milestones and tasks like alpha and beta testing.
Ensuring all development tasks meet quality criteria through test planning, test execution, quality assurance and issue tracking. Work closely on the deadlines of the project. Keep raising the bar and standards of all the quality processes with every project. Think of continuous innovation. Managing Risks: Understanding and defining areas to calculate the overall risk to the project. Creating strategies to mitigate those risks and take necessary measures to control them. Communicating and creating awareness among all stakeholders of the various risks. Understand and review the current risks and escalate. Process Improvements: Challenge yourself continuously to move towards automation for all daily work and help others with automation. Create milestones for yearly improvement projects and set measurable targets. Work with the development team to ensure that the quality engineers get apt support like automation hooks or debug builds wherever and whenever possible. Basic Qualifications What we are looking for: Bachelor's/PG degree in Computer Science, Information Systems or equivalent. 3-6 years of intensive experience in database and ETL testing. Experience in running queries, data management, managing large data sets and dealing with databases. Strong in creating SQL queries that can parse and validate business rules/calculations. Experience in writing complex SQL scripts, stored procedures, and integration packages. Experience in tuning and improving DB performance of complex enterprise-class applications. Develop comprehensive test strategies, test plans and test cases to test big data implementations. Proficient with software development lifecycle (SDLC) methodologies like Agile, QA methodologies, defect management systems, and documentation. Good at setting quality standards in various new testing technologies in the industry.
Good at identifying and defining areas to calculate the overall risk to the project and creating strategies to mitigate those risks and escalate as necessary. Excellent analytical and communication skills are essential, with strong verbal and writing proficiencies. Preferred Qualifications Strong in ETL and Big Data Testing Proficiency in SQL About S&P Global Market Intelligence At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence. What’s In It For You? Our Purpose Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in.
We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our Benefits Include Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring And Opportunity At S&P Global At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions.
Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. Recruitment Fraud Alert If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, “pre-employment training” or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here. Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group) Job ID: 314958 Posted On: 2025-07-15 Location: Gurgaon, India
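For candidates preparing for the SQL-based business-rule validation and data reconciliation this Quality Engineer role emphasizes, here is a minimal stdlib-only sketch using an in-memory SQLite database. Table names, columns, and the rule being checked are purely illustrative, not taken from the posting.

```python
import sqlite3

# Illustrative source/target tables for an ETL reconciliation check
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE source_orders (id INTEGER, amount REAL);
CREATE TABLE target_orders (id INTEGER, amount REAL);
INSERT INTO source_orders VALUES (1, 100.0), (2, 250.5), (3, 75.0);
INSERT INTO target_orders VALUES (1, 100.0), (2, 250.5), (3, 80.0);
""")

# Row-count reconciliation between source and target
src_count = cur.execute("SELECT COUNT(*) FROM source_orders").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM target_orders").fetchone()[0]

# Value-level reconciliation: rows whose amounts disagree after the load
mismatches = cur.execute("""
    SELECT s.id, s.amount, t.amount
    FROM source_orders s
    JOIN target_orders t ON s.id = t.id
    WHERE s.amount <> t.amount
""").fetchall()

print(src_count == tgt_count)  # True: counts match
print(mismatches)              # [(3, 75.0, 80.0)]: one value drifted
```

In a real engagement the same pattern runs against Oracle, SQL Server, Hive, or Databricks connections, with the rule queries generated from documented business calculations.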

Posted 1 week ago

Apply

3.0 - 8.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

TCS HIRING !!! Role : Data Scientist Required Technical Skill Set : Data Science Experience: 3-8 years Locations: Kolkata, Hyderabad, Bangalore, Chennai, Pune Job Description: Must-Have: Proficiency in Python or R for data analysis and modeling. Strong understanding of machine learning algorithms (regression, classification, clustering, etc.). Experience with SQL and working with relational databases. Hands-on experience with data wrangling, feature engineering, and model evaluation techniques. Experience with data visualization tools like Tableau, Power BI, or matplotlib/seaborn. Strong understanding of statistics and probability. Ability to translate business problems into analytical solutions. Good-to-Have: Experience with deep learning frameworks (TensorFlow, Keras, PyTorch). Knowledge of big data platforms (Spark, Hadoop, Databricks). Experience deploying models using MLflow, Docker, or cloud platforms (AWS, Azure, GCP). Familiarity with NLP, computer vision, or time series forecasting. Exposure to MLOps practices for model lifecycle management. Understanding of data privacy and governance concepts.
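As a refresher on the regression fundamentals this role lists, here is a tiny stdlib-only sketch of ordinary least squares for simple linear regression. The data is made up; real work would use scikit-learn or statsmodels rather than hand-rolled formulas.

```python
# Closed-form OLS fit for y = a + b*x (illustrative data, stdlib only)
def fit_ols(xs, ys):
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    # slope = covariance(x, y) / variance(x)
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
        (x - mx) ** 2 for x in xs
    )
    a = my - b * mx  # intercept from the means
    return a, b

xs = [1, 2, 3, 4, 5]
ys = [2.1, 4.0, 6.2, 8.1, 9.9]  # roughly y = 2x
a, b = fit_ols(xs, ys)
print(round(a, 2), round(b, 2))  # intercept near 0, slope near 2
```

The same covariance-over-variance intuition carries over to interpreting coefficients in the multivariate models the role would actually use.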

Posted 1 week ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About us: Where elite tech talent meets world-class opportunities! At Xenon7, we work with leading enterprises and innovative startups on exciting, cutting-edge projects that leverage the latest technologies across various domains of IT including Data, Web, Infrastructure, AI, and many others. Our expertise in IT solutions development and on-demand resources allows us to partner with clients on transformative initiatives, driving innovation and business growth. Whether it's empowering global organizations or collaborating with trailblazing startups, we are committed to delivering advanced, impactful solutions that meet today's most complex challenges. We are building a community of top-tier experts and we're opening the doors to an exclusive group of exceptional AI & ML Professionals ready to solve real-world problems and shape the future of intelligent systems. Structured Onboarding Process We ensure every member is aligned and empowered: Screening - We review your application and experience in Data & AI, ML engineering, and solution delivery Technical Assessment - 2-step technical assessment process that includes an interactive problem-solving test, and a verbal interview about your skills and experience Matching you to Opportunity - We explore how your skills align with ongoing projects and innovation tracks Who We're Looking For As a Data Analyst, you will work closely with business stakeholders, data engineers, and data scientists to analyze large datasets, build scalable queries and dashboards, and provide deep insights that guide strategic decisions. You'll use Databricks for querying, transformation, and reporting across Delta Lake and other data sources, helping stakeholders act on data with confidence.
Requirements 6+ years of experience in data analysis, BI, or analytics roles Strong experience with Databricks Notebooks, SQL, and Delta Lake Proficiency in writing complex SQL queries (joins, CTEs, window functions) Experience with data profiling, data validation, and root-cause analysis Comfortable working with large-scale datasets and performance tuning Solid understanding of data modeling concepts and ETL workflows Experience with business intelligence tools (e.g., Power BI, Tableau) Familiarity with Unity Catalog and data access governance (a plus) Exposure to Python or PySpark for data wrangling (a plus) Benefits At Xenon7, we're not just building AI systems—we're building a community of talent with the mindset to lead, collaborate, and innovate together. Ecosystem of Opportunity: You'll be part of a growing network where client engagements, thought leadership, research collaborations, and mentorship paths are interconnected. Whether you're building solutions or nurturing the next generation of talent, this is a place to scale your influence Collaborative Environment: Our culture thrives on openness, continuous learning, and engineering excellence. You'll work alongside seasoned practitioners who value smart execution and shared growth Flexible & Impact-Driven Work: Whether you're contributing from a client project, innovation sprint, or open-source initiative, we focus on outcomes—not hours. Autonomy, ownership, and curiosity are encouraged here Talent-Led Innovation: We believe communities are strongest when built around real practitioners. Our Innovation Community isn't just a knowledge-sharing forum—it's a launchpad for members to lead new projects, co-develop tools, and shape the direction of AI itself
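The "complex SQL queries (joins, CTEs, window functions)" requirement above can be illustrated with a small, self-contained example. This sketch uses Python's built-in sqlite3 module (window functions need SQLite 3.25+, bundled with modern Python) with made-up sales data; in Databricks the identical SQL would run in a notebook against Delta tables.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (region TEXT, month TEXT, amount REAL);
INSERT INTO sales VALUES
  ('APAC', '2024-01', 100), ('APAC', '2024-02', 150),
  ('EMEA', '2024-01', 200), ('EMEA', '2024-02', 120);
""")

# CTE aggregates per region/month; the window function then computes a
# running total within each region, ordered by month.
query = """
WITH monthly AS (
    SELECT region, month, SUM(amount) AS total
    FROM sales
    GROUP BY region, month
)
SELECT region, month, total,
       SUM(total) OVER (PARTITION BY region ORDER BY month) AS running_total
FROM monthly
ORDER BY region, month
"""
rows = conn.execute(query).fetchall()
for r in rows:
    print(r)
```

The `PARTITION BY` clause restarts the running total for each region, which is the kind of per-entity cumulative metric these analyst dashboards typically surface.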

Posted 1 week ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

We are hiring - Data Engineering Specialist About the client Our client is a British multinational general insurance company headquartered in London, England, with major operations in the United Kingdom, Ireland, Scandinavia and Canada. It provides insurance products and services in more than 100 countries through a network of local partners. Key Responsibilities: The purpose of the Data Engineer role is to build and unit test code for Projects and Programmes on the Azure Cloud Data and Analytics Platform. Analyse business requirements and support/create designs for requirements. Build and deploy new/changed data mappings, sessions, and workflows on the Azure Cloud Platform - the key focus area is Azure Databricks. Develop performant code. Perform ETL routine performance tuning, troubleshooting, support, and capacity estimation. Conduct thorough testing of ETL code changes to ensure quality deliverables. Provide day-to-day support and mentoring to end users who are interacting with the data. Profile and understand large amounts of available source data, including structured data; analyse defects and provide fixes. Provide release notes for deployments. Support release activities. Problem-solving attitude. Keep up to date with new skills - develop technology skills in other areas of the Platform. Skills & Experience Required: Experienced in ETL tools and data projects. Recent Azure experience - strong knowledge of Azure Databricks (Python/SQL). Good knowledge of SQL & Python. Strong analytical skills. Azure DevOps knowledge. Experience with Azure Databricks and Logic Apps would be highly desirable. Experience with Python programming would be highly desirable. Experience with Azure Functions would be a plus. Interested candidates can apply by sharing their resume at techcareers@invokhr.com or apply via the LinkedIn job post.

Posted 1 week ago

Apply

5.0 - 10.0 years

17 - 32 Lacs

Kochi

Hybrid

We are conducting a weekday walk-in drive in Kochi from 15th July to 21st July 2025 (weekdays only). Venue : Neudesic, an IBM Company, 3rd Floor, Block A, Prestige Cyber Green Phase 1, Smart City, Kakkanad, Ernakulam, Kerala 682030 Time : 2 PM - 6 PM Date : 28 June 2025, Saturday Experience : 5+ yrs Mode of Interview : In-Person Only for candidates who can join within 30 days. Azure Data Engineer Skills required : SQL, Python, PySpark, Azure Data Factory, Azure Data Lake Gen2, Azure Databricks, Azure Synapse, NoSQL DBs, Data Warehouses, GenAI (desirable) Strong data engineering skills in data cleansing, transformation, enrichment, semantic analytics, real-time analytics, ML/DL (desirable), streaming, data modeling, and data management.
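The data cleansing and transformation skills this drive screens for boil down to patterns like the following stdlib-only sketch: trim and normalize fields, parse dates, and reject rows that fail validation. The field names and rules are illustrative; in practice this logic would run as PySpark transformations in Azure Databricks or Data Factory.

```python
from datetime import datetime

# Toy raw extract with typical quality problems (illustrative data)
raw_rows = [
    {"customer": "  Acme Corp ", "signup": "2024-03-01"},
    {"customer": "Beta Ltd", "signup": "not-a-date"},
    {"customer": "", "signup": "2024-05-20"},
]

def cleanse(rows):
    clean = []
    for row in rows:
        name = row["customer"].strip()  # normalize whitespace
        try:
            signup = datetime.strptime(row["signup"], "%Y-%m-%d").date()
        except ValueError:
            continue  # unparseable date: reject the row
        if not name:
            continue  # empty customer name: reject the row
        clean.append({"customer": name, "signup": signup})
    return clean

cleaned = cleanse(raw_rows)
print(cleaned)  # only the Acme Corp row survives validation
```

Rejected rows would normally be routed to a quarantine table with a reason code rather than silently dropped, so data quality metrics stay auditable.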

Posted 1 week ago

Apply

8.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Position Overview: ShyftLabs is seeking an experienced Databricks Architect to lead the design, development, and optimization of big data solutions using the Databricks Unified Analytics Platform. This role requires deep expertise in Apache Spark, SQL, Python, and cloud platforms (AWS/Azure/GCP). The ideal candidate will collaborate with cross-functional teams to architect scalable, high-performance data platforms and drive data-driven innovation. ShyftLabs is a growing data product company that was founded in early 2020 and works primarily with Fortune 500 companies. We deliver digital solutions built to accelerate business growth across various industries by focusing on creating value through innovation. Job Responsibilities Architect, design, and optimize big data and AI/ML solutions on the Databricks platform. Develop and implement highly scalable ETL pipelines for processing large datasets. Lead the adoption of Apache Spark for distributed data processing and real-time analytics. Define and enforce data governance, security policies, and compliance standards. Optimize data lakehouse architectures for performance, scalability, and cost-efficiency. Collaborate with data scientists, analysts, and engineers to enable AI/ML-driven insights. Oversee and troubleshoot Databricks clusters, jobs, and performance bottlenecks. Automate data workflows using CI/CD pipelines and infrastructure-as-code practices. Ensure data integrity, quality, and reliability across all data processes. Basic Qualifications: Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field. 8+ years of hands-on experience in data engineering, with at least 5+ years as a Databricks architect working with Apache Spark. Proficiency in SQL, Python, or Scala for data processing and analytics. Extensive experience with cloud platforms (AWS, Azure, or GCP) for data engineering. Strong knowledge of ETL frameworks, data lakes, and Delta Lake architecture.
Hands-on experience with CI/CD tools and DevOps best practices. Familiarity with data security, compliance, and governance best practices. Strong problem-solving and analytical skills in a fast-paced environment. Preferred Qualifications: Databricks certifications (e.g., Databricks Certified Data Engineer, Spark Developer). Hands-on experience with MLflow, Feature Store, or Databricks SQL. Exposure to Kubernetes, Docker, and Terraform. Experience with streaming data architectures (Kafka, Kinesis, etc.). Strong understanding of business intelligence and reporting tools (Power BI, Tableau, Looker). Prior experience working with retail, e-commerce, or ad-tech data platforms. We are proud to offer a competitive salary alongside a strong insurance package. We pride ourselves on the growth of our employees, offering extensive learning and development resources.

Posted 1 week ago

Apply

3.5 years

0 Lacs

Kochi, Kerala, India

On-site

Job Title: Data Scientist Management Level: Location: Kochi, Coimbatore, Trivandrum Must have skills: Big Data, Python or R Good to have skills: Scala, SQL Job Summary A Data Scientist is expected to be hands-on, delivering end-to-end projects undertaken in the Analytics space. They must have a proven ability to drive business results with their data-based insights. They must be comfortable working with a wide range of stakeholders and functional teams. The right candidate will have a passion for discovering solutions hidden in large data sets and working with stakeholders to improve business outcomes. Responsibilities Roles and Responsibilities Identify valuable data sources and collection processes Supervise preprocessing of structured and unstructured data Analyze large amounts of information to discover trends and patterns for the insurance industry. Build predictive models and machine-learning algorithms Combine models through ensemble modeling Present information using data visualization techniques Collaborate with engineering and product development teams Hands-on knowledge of implementing various AI algorithms and best-fit scenarios Has worked on Generative AI based implementations Professional And Technical Skills 3.5-5 years’ experience in Analytics systems/program delivery, including implementation experience on at least 2 Big Data or Advanced Analytics projects Experience using statistical computer languages (R, Python, SQL, Pyspark, etc.) to manipulate data and draw insights from large data sets; familiarity with Scala, Java or C++ Knowledge of a variety of machine learning techniques (clustering, decision tree learning, artificial neural networks, etc.) and their real-world advantages/drawbacks Knowledge of advanced statistical techniques and concepts (regression, properties of distributions, statistical tests and proper usage, etc.)
and experience with their applications. Hands-on experience in Azure/AWS analytics platforms (3+ years). Experience using variations of Databricks or similar analytical applications in AWS/Azure. Experience using business intelligence tools (e.g. Tableau) and data frameworks (e.g. Hadoop). Strong mathematical skills (e.g. statistics, algebra). Excellent communication and presentation skills. Experience deploying data pipelines in production based on Continuous Delivery practices. Additional Information Multi-industry domain experience Expert in Python, Scala, SQL Knowledge of Tableau/Power BI or similar self-service visualization tools Interpersonal and team skills should be top-notch Nice to have: leadership experience in the past About Our Company | Accenture
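The "combine models through ensemble modeling" responsibility above can be sketched in a few lines: the simplest ensemble averages the predictions of several models. The per-model scores here are made up, and real pipelines would use scikit-learn's VotingRegressor/VotingClassifier or gradient-boosted ensembles rather than hand-rolled averaging.

```python
# Averaging ensemble: combine per-model prediction lists element-wise
def ensemble_mean(predictions):
    # predictions: list of per-model prediction lists, all the same length
    n_models = len(predictions)
    return [sum(vals) / n_models for vals in zip(*predictions)]

# Illustrative scores from three hypothetical models on three samples
model_a = [0.2, 0.8, 0.6]
model_b = [0.4, 0.6, 0.5]
model_c = [0.3, 0.7, 0.7]
combined = ensemble_mean([model_a, model_b, model_c])
print(combined)  # per-sample averages near [0.3, 0.7, 0.6]
```

Averaging reduces variance when the individual models make uncorrelated errors, which is the core argument for ensembles over any single model.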

Posted 1 week ago

Apply

5.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Looking for Immediate Joiners. Location: Mumbai Experience: 5+ years As a Senior Databricks Administrator, you will be responsible for the setup, configuration, administration, and optimization of the Databricks Platform on AWS. This role will play a critical part in managing secure, scalable, and high-performing Databricks environments, with a strong focus on governance, user access management, cost optimization, and platform operations. You will collaborate closely with engineering, infrastructure, and compliance teams to ensure that the Databricks platform meets enterprise data and regulatory requirements. Must-have Skills 6+ years of experience in Databricks administration on AWS or multi-cloud environments. Deep understanding of Databricks workspace architecture, Unity Catalog, and cluster configuration best practices. Strong experience in managing IAM policies, SCIM integration, and access provisioning workflows. Hands-on experience with monitoring, cost optimization, and governance of large-scale Databricks deployments. Hands-on experience with infrastructure-as-code (Terraform) and CI/CD pipelines. Experience with ETL orchestration and collaboration with engineering teams (Databricks Jobs, Workflows, Airflow).

Posted 1 week ago

Apply

4.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Description - Data Analyst About the Role We’re looking for a data-savvy, hands-on analyst who’s comfortable working with Python and Power BI and has a basic understanding of Databricks. You don’t need to be an expert, but you should be curious, eager to learn, and comfortable working with data from multiple systems. This is not just a pricing support role; it’s a key position in shaping how our Databricks environment connects to business reporting across the company. You’ll help the Market Intelligence and Global Pricing teams bridge technical data flows with business reporting and support the continued development of a Python-based pricing algorithm (developed with a top consultancy). The ideal candidate has medium-level Python skills, can confidently navigate Power BI, and is open to learning more about Databricks, ETL, and data science. AI tools like ChatGPT or Copilot are part of your workflow, helping you write or troubleshoot scripts and think creatively about solutions. Key Responsibilities Work with data stored in Databricks, including maintaining and improving a Python-based pricing allocation model (random forest + clustering) and building additional data pipelines as needed for business reporting. Use and adapt Python scripts to transform data from Databricks, ERP systems, and other internal sources into clean, BI-ready datasets. Build and maintain Power BI dashboards using multiple data sources: ERP (M3, BPCS), Excel, and customer inputs. Help manage basic ETL workflows: extract, clean, and transform data even without a dedicated data engineer. Define, configure, and register new data sources (e.g. Excel, CSVs) in Power BI and create structured, reusable models. Collaborate with business and IT stakeholders to identify and transform the right data within Databricks for dashboarding and reporting needs. Translate business questions into scalable datasets and intuitive Power BI dashboards for marketing, pricing, and executive teams.
Gradually support more advanced analytics: clustering, A/B testing, predictive modeling, and performance diagnostics. Who You Are Comfortable working with Python (medium level) and able to write, debug, or adapt data scripts. Confident using AI tools like ChatGPT or GitHub Copilot to create or fix Python code, with judgment on when to refine it manually. Familiar with or open to learning Databricks. Basic experience is a plus, and our IT team will help you develop further. Skilled in Power BI, especially with data modeling, DAX, and combining structured and unstructured sources. Comfortable navigating ERP systems, data lakes, and warehouse environments, even when data isn’t clean or standardized. Able to work independently to troubleshoot data issues and build end-to-end reporting solutions. A strong communicator who can explain technical results to commercial or non-technical audiences. Curious, proactive, and eager to grow your skillset in data science, predictive analytics, and data engineering over time. Preferred Qualifications Bachelor’s or master’s degree in Data Science, Engineering, Business Analytics, Economics, or a related field. 4+ years of experience in a relevant field. Working knowledge of SQL, DAX, or similar query tools. Experience or interest in machine learning, clustering algorithms, random forest models, A/B testing, and hypothesis testing. Understanding of pricing, commercial analytics, or the mining/heavy equipment industry is a plus but not required. Why Join Us? You’ll be part of a growing analytics function at a key moment of transformation. This role gives you the chance to shape how we manage data from raw ERP or customer files to dynamic dashboards in Power BI. With hands-on exposure to Databricks, Python, and AI-assisted scripting, you’ll work across multiple domains and play a visible role in how we make data-driven decisions.
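The A/B testing and hypothesis-testing experience this role asks for reduces, in the simplest pricing scenario, to comparing two conversion rates. A minimal stdlib-only sketch of a two-proportion z-test follows; the sample sizes and conversion counts are invented, and production analysis would use scipy.stats or statsmodels.

```python
import math

# Two-proportion z-test, e.g. comparing conversion under two price points
def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # pooled proportion under the null hypothesis of equal rates
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # two-sided p-value from the standard normal CDF (via erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Illustrative test: 120/1000 conversions vs 150/1000
z, p = two_proportion_ztest(conv_a=120, n_a=1000, conv_b=150, n_b=1000)
print(round(z, 2), round(p, 4))
```

With these made-up numbers the difference sits right around the conventional 5% significance threshold, a useful reminder that borderline results deserve a power analysis before acting on them.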
Epiroc is a global productivity partner for mining and construction customers, and accelerates the transformation toward a sustainable society. With ground-breaking technology, Epiroc develops and provides innovative and safe equipment, such as drill rigs, rock excavation and construction equipment and tools for surface and underground applications. The company also offers world-class service and other aftermarket support as well as solutions for automation, digitalization and electrification. Epiroc is based in Stockholm, Sweden, had revenues of more than SEK 60 billion in 2023, and has around 18 200 passionate employees supporting and collaborating with customers in around 150 countries. Learn more at www.epiroc.com.

Posted 1 week ago

Apply

0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Veeam, the #1 global market leader in data resilience, believes businesses should control all their data whenever and wherever they need it. Veeam provides data resilience through data backup, data recovery, data portability, data security, and data intelligence. Based in Seattle, Veeam protects over 550,000 customers worldwide who trust Veeam to keep their businesses running. Join us as we move forward together, growing, learning, and making a real impact for some of the world’s biggest brands. The future of data resilience is here - go fearlessly forward with us. This APJ-based role is focused on supporting the multitude of existing enablement programs for our internal and channel selling programs. With a drive for analysis, the successful candidate will have a passion for using data to inform decisions. Utilising strong skills in reporting and metrics, they will be responsible for guiding the creation and utilisation of content. The successful candidate will be involved in identifying, planning and shaping end-user enablement materials and activities. Working closely with the cross-functional APJ sales, technical, and marketing teams, it requires a candidate who is curious about data, ROI and the measurement of program success. As part of the APJ sales acceleration team this role reports to the APJ Sales Acceleration Senior Director and would be office-based 3 days a week (Tue-Thur), as well as including international travel to regions including India, Korea, Australia and Southeast Asia as needed. Responsibilities Be involved in the creation/delivery/execution and reporting of sales-based programs such as (but not limited to) webinars, partner competency programs, sales training microlearning. Measure the efficacy / ROI of enablement programs.
Provide detailed analysis of what works and why and use these findings to guide further programs. Create and maintain accurate data required for event attendance, such as launchpad and sales training. Provide detailed reporting on progress with tools such as Tableau, Excel, Monday.com and others. Actively manage the onboarding process. Experience in working with procurement departments for logistics on hotel-based events. Evaluate existing programs to ensure their quality and effectiveness. Communicate weekly with enablement, marketing, and sales stakeholders. Be the APJ leader in maintaining content repositories on platforms such as Cornerstone. Qualifications Familiarity with sales methodologies and their adaptation into a sales environment. Awareness of or experience with Salesforce.com or similar CRM preferred. Awareness of or experience with collaboration tools like MS Teams, WebEx, etc. Facilitation & coaching experience. Creation/maintenance of Monday.com boards Advanced Excel skills Advanced Tableau / Databricks skills Familiarity with basic AI concepts such as LLMs, token weighting, etc. Experience in DISC Solid ROI research credentials Proven Veeam portfolio knowledge. Demonstrated experience in either a partner or partner ecosystem training role Excellent communication and interpersonal skills. Proven record of driving programs and projects independently with success. Exceptional organization skills with the ability to manage multiple projects simultaneously. Ability to adapt in a fast-paced work environment; must be a high-energy, motivated self-starter. Able to travel as needed (up to 30%) internationally. Veeam Software is an equal opportunity employer and does not tolerate discrimination in any form on the basis of race, color, religion, gender, age, national origin, citizenship, disability, veteran status or any other classification protected by federal, state or local law.
All your information will be kept confidential. Please note that any personal data collected from you during the recruitment process will be processed in accordance with our Recruiting Privacy Notice. The Privacy Notice sets out the basis on which the personal data collected from you, or that you provide to us, will be processed by us in connection with our recruitment processes. By applying for this position, you consent to the processing of your personal data in accordance with our Recruiting Privacy Notice.

Posted 1 week ago

Apply

7.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Computer Vision & Machine Learning Lead Engineer

We are seeking a Computer Vision & Machine Learning Engineer with strong software development expertise who can architect and develop ML-based solutions for computer vision applications and deploy them at scale.

Minimum qualifications:
Bachelor’s or Master’s degree in Computer Science, Electrical Engineering, Information Systems, or a related field.
7+ years of extensive software development experience in Python and PyTorch, including reading/debugging code in Python, C++ and Shell.
4+ years of experience working directly on ML-based solutions, preferably convolutional neural networks applied to computer vision problems.
Proficiency in software design and architecture and object-oriented programming.
Experience working with Docker or similar containerization frameworks, along with container orchestration.
Experience with Linux/Unix or similar systems, from the kernel to the shell, file systems, and client-server protocols.
Experience troubleshooting and resolving technical issues in an application tech stack, including AI/ML.
Solid understanding of common SQL and NoSQL databases.
Experience working with AWS or similar platforms.
Strong communication skills and ability to work effectively in a team.

Preferred qualifications:
Experience working with distributed clusters and multi-node environments.
Familiarity with the basics of web technologies and computer networking.
AWS certifications or similar.
Formal academic background in Machine Learning.
Experience working with large image datasets (100K+ images).

Responsibilities
Architect and develop machine-learning-based computer vision algorithms for various applications.
Deliver software and solutions while meeting all quality standards.
Design, implement and optimize machine learning training and inference pipelines and algorithms on cloud or on-prem hardware.
Understand functional and non-functional requirements of features and break down tasks for the team.
Take ownership of delivery for self as well as the team.
Collaborate closely with product owners and domain/technology experts to integrate and validate software within a larger system.
Engage with internal teams and provide support to teams located in North America and Europe.

Base skillsets: Python, PyTorch, one of the cloud platforms (AWS/GCP/Azure), Linux, Docker, databases.
Optional skillsets: Databricks, MLOps, CI/CD.
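For candidates gauging the convolutional-neural-network fundamentals this role asks about, here is a minimal, stdlib-only Python sketch of valid-mode 2D cross-correlation, the core operation inside a CNN layer. It is purely illustrative and not the employer's code; in practice a framework primitive such as PyTorch's `torch.nn.Conv2d` would be used instead.

```python
def conv2d_valid(image, kernel):
    """Valid-mode 2D cross-correlation (no padding, stride 1).

    image and kernel are lists of rows of numbers; the output has
    shape (H - kH + 1, W - kW + 1), as in a CNN layer without padding.
    """
    kh, kw = len(kernel), len(kernel[0])
    h, w = len(image), len(image[0])
    out = []
    for i in range(h - kh + 1):
        row = []
        for j in range(w - kw + 1):
            # Sum of elementwise products over the kernel window.
            acc = 0.0
            for di in range(kh):
                for dj in range(kw):
                    acc += image[i + di][j + dj] * kernel[di][dj]
            row.append(acc)
        out.append(row)
    return out

# A tiny image with a vertical edge; the kernel responds to
# left-to-right intensity increases, so it fires along the edge.
img = [[0, 0, 1],
       [0, 0, 1],
       [0, 0, 1]]
k = [[-1, 1],
     [-1, 1]]
print(conv2d_valid(img, k))  # -> [[0.0, 2.0], [0.0, 2.0]]
```

Real pipelines vectorize this over batches and channels on GPU hardware, but the window-sliding arithmetic is exactly what frameworks optimize.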

Posted 1 week ago

Apply