
1515 Databricks Jobs - Page 22

JobPe aggregates results for easy access, but applications are submitted directly on the original job portal.

6.0 - 10.0 years

10 - 20 Lacs

Chennai

Work from Office

Do you love leading data-driven transformations and mentoring teams in building scalable data platforms? We're looking for a Data Tech Lead to drive innovation, architecture, and execution across our data ecosystem.

Your Role:
- Lead the design and implementation of modern data architecture, ETL/ELT pipelines, and data lakes/warehouses
- Set technical direction and mentor a team of talented data engineers
- Collaborate with product, analytics, and engineering teams to translate business needs into data solutions
- Define and enforce data modeling standards, governance, and naming conventions
- Take ownership of the end-to-end data lifecycle: ingestion, transformation, storage, access, and monitoring
- Evaluate and implement the right cloud/on-prem tools and frameworks
- Troubleshoot and resolve complex data challenges while optimizing for performance and cost
- Contribute to documentation, design blueprints, and knowledge sharing

We're Looking for Someone With:
- Proven experience leading data engineering or data platform teams
- Expertise in designing scalable data architectures and modern data stacks
- Strong hands-on experience with cloud platforms (AWS/Azure/GCP) and big data tools
- Proficiency in Python, SQL, Spark, Databricks, or similar tools
- A passion for clean code, performance tuning, and high-impact delivery
- Strong communication, collaboration, and leadership skills
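Roles like the one above center on owning the end-to-end data lifecycle: ingestion, transformation, and loading. As a purely illustrative sketch (not taken from any posting; all function and field names are hypothetical), a minimal staged ETL flow in Python might look like this, with each stage a separate function so it can be tested and monitored independently:

```python
# Minimal ETL sketch: each stage is a pure function (names hypothetical).

def extract(rows):
    """Simulate ingestion: return raw records from a source."""
    return list(rows)

def transform(records):
    """Normalize fields and drop records missing the key 'id'."""
    cleaned = []
    for rec in records:
        if rec.get("id") is None:
            continue  # enforce a basic data-quality rule
        cleaned.append({"id": rec["id"], "amount": float(rec.get("amount", 0))})
    return cleaned

def load(records, sink):
    """Append validated records to a destination (here, a list)."""
    sink.extend(records)
    return len(records)

source = [{"id": 1, "amount": "10.5"}, {"id": None}, {"id": 2}]
warehouse = []
loaded = load(transform(extract(source)), warehouse)
```

In a real Spark or Databricks pipeline the same separation of stages applies; only the execution engine changes.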

Posted 1 month ago

Apply

2.0 - 4.0 years

6 - 10 Lacs

Pune

Work from Office

The headlines
Job Title: Data Consultant (Delivery)
Start Date: Mid-July 2025
Location: Hybrid; 2 days a week on-site in our office in Creaticity Mall, Shashtrinagar, Yerawada
Salary: ₹700,000 - ₹2,100,000/annum

A bit about the role
We're looking for passionate Data Consultants to join our Delivery team: a thriving and fast-growing community of some of the industry's best cloud data engineers at all levels, ranging from interns and graduates up to seasoned experts. In this role, you'll combine technical expertise with strong commercial and client-facing skills. You'll get the unique opportunity to work with advanced tools and methodologies, develop innovative solutions, and play an integral part in delivering value to our clients. In a culture that values growth, mentorship, and technical excellence, this is the perfect opportunity for a data engineer looking to make a real impact within an international, industry-leading consultancy.

What you'll be doing
- Delivering high-quality data solutions by successfully managing development tasks with minimal guidance
- Working with industry-leading technologies such as Snowflake, Matillion, Power BI, and Databricks, with a focus on mastering at least one toolset while expanding your expertise in others
- Building trusted relationships with clients, managing expectations, and finding opportunities to add value beyond project scope
- Contributing to internal knowledge-sharing, delivering presentations, training sessions, and thought leadership content
- Driving business impact by engaging with stakeholders, understanding business challenges, and translating them into data-driven solutions
- Leading investigations, client workshops, and demonstrations to showcase technical expertise and problem-solving skills
- Balancing multiple priorities effectively, knowing when to escalate issues and when to push forward with solutions independently
- Helping shape the Snap Analytics team by mentoring junior consultants and sharing your expertise with others

What you'll need to succeed
- Technical Expertise: Strong experience in SQL, data modelling, and ETL processes. Exposure to tools like Snowflake, Matillion, Databricks, or Power BI is highly desirable.
- A Problem-Solving Mindset: The ability to identify multiple solutions, analyse trade-offs, and confidently propose the best approach.
- Client Engagement Skills: Strong communication and stakeholder management abilities, ensuring seamless collaboration with clients at all levels.
- Analytical Thinking: The capability to evaluate data solutions critically and proactively identify opportunities for optimisation.
- Ownership & Initiative: Self-motivated and accountable, with a proactive approach to learning and personal development.
- A 'Team Player' Mentality: Willingness to contribute to internal initiatives, support colleagues, and help grow Snap Analytics as a company.

So, what's in it for you?
- A chance to work with the latest cloud data platforms, shaping enterprise-scale data solutions
- Support on your journey towards technical certifications and leadership roles
- A collaborative and supportive culture in which we believe in knowledge-sharing, teamwork, and helping each other succeed
- The opportunity to write blogs, contribute to industry discussions, and become a recognised expert in your field
- A rewarding compensation package with opportunities for progression

About Snap Analytics
We're a high-growth data analytics consultancy on a mission to help enterprise businesses unlock the full potential of their data. With offices in the UK, India, and South Africa, we specialise in cutting-edge cloud analytics solutions, transforming complex data challenges into actionable business insights. We partner with some of the biggest brands worldwide to modernise their data platforms, enabling smarter decision-making through Snowflake, Matillion, Databricks, and other cloud technologies. Our approach is customer-first, innovation-driven, and results-focused, delivering impactful solutions with speed and precision. At Snap, we're not just consultants; we're problem-solvers, engineers, and strategists who thrive on tackling complex data challenges. Our culture is built on collaboration, continuous learning, and pushing boundaries, ensuring our people grow just as fast as our business. Join us and be part of a team that's shaping the future of data analytics!

Posted 1 month ago

Apply

6.0 - 11.0 years

35 - 50 Lacs

Pune, Gurugram, Delhi / NCR

Hybrid

Role: Snowflake Data Engineer
Mandatory Skills: #Snowflake, #Azure, #DataFactory, SQL, Python, #DBT / #Databricks
Location (Hybrid): Bangalore, Hyderabad, Chennai, Pune, Gurugram & Noida
Budget: Up to 50 LPA
Notice: Immediate to 30 days serving notice
Experience: 6-11 years

Key Responsibilities:
- Design and develop ETL/ELT pipelines using Azure Data Factory, Snowflake, and DBT.
- Build and maintain data integration workflows from various data sources to Snowflake.
- Write efficient and optimized SQL queries for data extraction and transformation.
- Work with stakeholders to understand business requirements and translate them into technical solutions.
- Monitor, troubleshoot, and optimize data pipelines for performance and reliability.
- Maintain and enforce data quality, governance, and documentation standards.
- Collaborate with data analysts, architects, and DevOps teams in a cloud-native environment.

Must-Have Skills:
- Strong experience with Azure cloud platform services.
- Proven expertise in Azure Data Factory (ADF) for orchestrating and automating data pipelines.
- Proficiency in SQL for data analysis and transformation.
- Hands-on experience with Snowflake and SnowSQL for data warehousing.
- Practical knowledge of DBT (Data Build Tool) for transforming data in the warehouse.
- Experience working in cloud-based data environments with large-scale datasets.

Good-to-Have Skills:
- Experience with Azure Data Lake, Azure Synapse, or Azure Functions.
- Familiarity with Python or PySpark for custom data transformations.
- Understanding of CI/CD pipelines and DevOps for data workflows.
- Exposure to data governance, metadata management, or data catalog tools.
- Knowledge of business intelligence tools (e.g., Power BI, Tableau) is a plus.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
- 5+ years of experience in data engineering roles using Azure and Snowflake.
- Strong problem-solving, communication, and collaboration skills.
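Pipelines built with Azure Data Factory feeding Snowflake, as described above, commonly use an incremental (high-water-mark) load rather than full reloads. As an illustrative sketch only (the field names and dates are invented, and the real implementation would live in ADF/Snowflake rather than plain Python), the core idea is:

```python
# Incremental-load sketch: pull only rows newer than the last
# recorded watermark, then persist the new watermark for the next run.
# All names and values here are hypothetical.

def incremental_extract(rows, watermark):
    """Return rows with updated_at strictly after the watermark,
    plus the new watermark value to persist."""
    fresh = [r for r in rows if r["updated_at"] > watermark]
    new_watermark = max((r["updated_at"] for r in fresh), default=watermark)
    return fresh, new_watermark

table = [
    {"id": 1, "updated_at": "2025-01-01"},
    {"id": 2, "updated_at": "2025-02-01"},
    {"id": 3, "updated_at": "2025-03-01"},
]

batch, wm = incremental_extract(table, "2025-01-15")
```

ISO-8601 date strings compare correctly as plain strings, which keeps the sketch dependency-free; a production pipeline would store the watermark in a control table.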

Posted 1 month ago

Apply

6.0 - 11.0 years

35 - 50 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Role: Snowflake Data Engineer
Mandatory Skills: #Snowflake, #Azure, #DataFactory, SQL, Python, #DBT / #Databricks
Location (Hybrid): Bangalore, Hyderabad, Chennai, Pune, Gurugram & Noida
Budget: Up to 50 LPA
Notice: Immediate to 30 days serving notice
Experience: 6-11 years

Key Responsibilities:
- Design and develop ETL/ELT pipelines using Azure Data Factory, Snowflake, and DBT.
- Build and maintain data integration workflows from various data sources to Snowflake.
- Write efficient and optimized SQL queries for data extraction and transformation.
- Work with stakeholders to understand business requirements and translate them into technical solutions.
- Monitor, troubleshoot, and optimize data pipelines for performance and reliability.
- Maintain and enforce data quality, governance, and documentation standards.
- Collaborate with data analysts, architects, and DevOps teams in a cloud-native environment.

Must-Have Skills:
- Strong experience with Azure cloud platform services.
- Proven expertise in Azure Data Factory (ADF) for orchestrating and automating data pipelines.
- Proficiency in SQL for data analysis and transformation.
- Hands-on experience with Snowflake and SnowSQL for data warehousing.
- Practical knowledge of DBT (Data Build Tool) for transforming data in the warehouse.
- Experience working in cloud-based data environments with large-scale datasets.

Good-to-Have Skills:
- Experience with Azure Data Lake, Azure Synapse, or Azure Functions.
- Familiarity with Python or PySpark for custom data transformations.
- Understanding of CI/CD pipelines and DevOps for data workflows.
- Exposure to data governance, metadata management, or data catalog tools.
- Knowledge of business intelligence tools (e.g., Power BI, Tableau) is a plus.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
- 5+ years of experience in data engineering roles using Azure and Snowflake.
- Strong problem-solving, communication, and collaboration skills.

Posted 1 month ago

Apply

6.0 - 10.0 years

8 - 18 Lacs

Chennai

Work from Office

Role Overview
Are you passionate about building scalable data systems and working on cutting-edge cloud technologies? We're looking for a Senior Data Engineer to join our team and play a key role in transforming raw data into powerful insights.

What You'll Do:
- Design, develop, and optimize scalable ETL/ELT pipelines and data integration workflows
- Build and maintain data lakes, warehouses, and real-time streaming pipelines
- Work with both structured and unstructured data, ensuring clean, usable datasets for analytics & ML
- Collaborate with analytics, product, and engineering teams to implement robust data models
- Ensure best practices around data quality, governance, lineage, and security
- Code in Python, SQL, and PySpark, and work on Databricks
- Operate in AWS environments using Redshift, Glue, and S3
- Continuously monitor and optimize pipeline performance
- Document workflows and contribute to engineering standards

What We're Looking For:
- Strong hands-on experience in modern data engineering tools & platforms
- Cloud-first mindset with expertise in the AWS data stack
- Solid programming skills and a passion for building high-performance data systems
- Excellent communication & collaboration skills
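"Ensuring clean, usable datasets" in roles like this usually means running rule-based quality checks before publishing data. The sketch below is purely illustrative (rule names, fields, and thresholds are assumptions, not from the posting) and shows the shape of such a quality gate in plain Python:

```python
# Data-quality gate sketch: rule-based checks run before a dataset
# is published downstream. Field and rule names are hypothetical.

def check_not_null(records, field):
    """Pass if every record has a non-null value for the field."""
    return all(r.get(field) is not None for r in records)

def check_unique(records, field):
    """Pass if no two records share a value for the field."""
    values = [r[field] for r in records]
    return len(values) == len(set(values))

def run_quality_gate(records):
    """Return a dict of rule name -> pass/fail, suitable for monitoring."""
    return {
        "order_id_not_null": check_not_null(records, "order_id"),
        "order_id_unique": check_unique(records, "order_id"),
    }

data = [{"order_id": 1}, {"order_id": 2}, {"order_id": 2}]
report = run_quality_gate(data)
```

In practice teams often reach for frameworks (e.g. expectation libraries or warehouse constraints) instead of hand-rolled checks, but the report-of-rules pattern is the same.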

Posted 1 month ago

Apply

8.0 - 13.0 years

16 - 20 Lacs

Bengaluru

Work from Office

About the Role
Tech lead with BI and analytics experience; hands-on with GenAI code-assist tools and API development.

Experience: 8-15 years

Required Skills
- Python
- ClickHouse
- Databricks
- Azure Cloud
- Containerization technologies

Posted 1 month ago

Apply

5.0 - 9.0 years

2 - 3 Lacs

Ahmedabad

Work from Office

*instinctools is a software development company that provides custom software solutions for businesses of all sizes. Our team works closely with clients to understand their specific needs and provide personalized solutions that meet their business requirements. *instinctools is looking for a Senior Data Engineer for one of our clients.

Our Client is one of the top 5 global management consulting firms, considered to be among the most prestigious in the world. Hundreds of customers from the Fortune 500, including the largest global financial institutions, the world's top media companies, technology companies and federal government agencies, rely on our Client's proven platform and services.

The project is a dynamic solution empowering companies to optimize promotional activities for maximum impact. It collects and validates data, analyzes promotion effectiveness, plans calendars, and integrates seamlessly with existing systems. The tool enhances vendor collaboration, negotiates better deals, and employs machine learning to optimize promotional plans, enabling companies to make informed decisions and maximize return on investment.

Stack on the project: Databricks, SQL, Spark, AWS, Python.

Tasks:
- Build and optimize data pipelines using Databricks, SQL, and Apache Spark.
- Design and implement scalable data processing systems.
- Manage and optimize data pipelines.
- Ensure the quality and efficiency of data flows.

Our expectations of the ideal candidate:
- 5+ years of experience as a Data Engineer.
- Deep expertise in big data technologies, particularly Databricks, SQL, and Spark.
- Very strong SQL skills.
- Experience in data modeling and ETL processes.
- Experience with analytics engineering is a plus.
- Experience with DBT, AWS, Python.

Soft Skills:
- A problem-solving style is preferred over years of experience
- Ability to clarify requirements with the customer
- Willingness to pair with other engineers when solving complex issues
- Good communication skills
- English: Upper-Intermediate or higher

We offer:
- Flexible working time (from an Indian location)
- A professional and ambitious team
- Learning opportunities, seminars and conferences, and time for exploring new technologies
- Co-funding for language courses (English)
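The project above analyzes promotion effectiveness. As a toy illustration of the kind of aggregation such a pipeline might compute (the data, field names, and metric are invented for this sketch, not taken from the client's system), comparing average sales with and without a promotion flag could look like:

```python
# Promotion-effectiveness sketch: average sales grouped by promo flag.
# All data and field names are hypothetical.
from collections import defaultdict

def avg_sales_by_promo(rows):
    """Return {promo_flag: average sales} over the input rows."""
    totals = defaultdict(lambda: [0.0, 0])  # flag -> [sum, count]
    for r in rows:
        acc = totals[r["on_promo"]]
        acc[0] += r["sales"]
        acc[1] += 1
    return {flag: s / n for flag, (s, n) in totals.items()}

rows = [
    {"on_promo": True, "sales": 120.0},
    {"on_promo": True, "sales": 80.0},
    {"on_promo": False, "sales": 50.0},
]
averages = avg_sales_by_promo(rows)
```

On Databricks the same group-and-aggregate would typically be a Spark `groupBy().avg()` over a much larger dataset; the plain-Python version just makes the logic visible.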

Posted 1 month ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Project Role: Software Development Engineer
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary:
As a Software Development Engineer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code for multiple clients. Your day will involve collaborating with team members to ensure the successful implementation of software solutions, performing maintenance and enhancements, and contributing to the overall development process. You will be responsible for delivering high-quality code while adhering to best practices and project timelines, ensuring that the applications meet the needs of the clients effectively.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
- Conduct code reviews to ensure adherence to coding standards and best practices.

Professional & Technical Skills:
- Must-have: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data processing and analytics workflows.
- Experience with cloud-based data solutions and architectures.
- Familiarity with programming languages such as Python or Scala.
- Knowledge of data integration techniques and ETL processes.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based in Hyderabad.
- 15 years of full-time education is required.

Posted 1 month ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Coimbatore

Work from Office

Project Role: Software Development Engineer
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary:
As a Software Development Engineer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code for multiple clients. Your day will involve collaborating with team members to perform maintenance and enhancements, ensuring that the applications meet the evolving needs of users while adhering to best practices in software development. You will also be responsible for troubleshooting issues and implementing solutions that enhance the overall functionality and performance of the applications.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to foster their professional growth and development.
- Continuously evaluate and improve development processes to enhance team efficiency.

Professional & Technical Skills:
- Must-have: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration and ETL processes.
- Experience with cloud computing platforms and services.
- Familiarity with programming languages such as Python or Scala.
- Knowledge of data warehousing concepts and practices.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Coimbatore office.
- 15 years of full-time education is required.

Posted 1 month ago

Apply

7.0 - 10.0 years

6 - 10 Lacs

Noida

Work from Office

Staff Software Engineer (Data Engineer)

R1 RCM India is proud to be recognized amongst India's Top 50 Best Companies to Work For 2023 by the Great Place To Work Institute. We are committed to transforming the healthcare industry with our innovative revenue cycle management services. Our goal is to make healthcare simpler and enable efficiency for healthcare systems, hospitals, and physician practices. With over 30,000 employees globally, we are about 14,000 strong in India with offices in Delhi NCR, Hyderabad, Bangalore, and Chennai. Our inclusive culture ensures that every employee feels valued, respected, and appreciated with a robust set of employee benefits and engagement activities.

R1 RCM Inc. is a leading provider of technology-enabled revenue cycle management services which transform and solve challenges across health systems, hospitals and physician practices. Headquartered in Chicago, R1 is a publicly traded organization with employees throughout the US and multiple India locations. Our mission is to be the one trusted partner to manage revenue, so providers and patients can focus on what matters most. Our priority is always to do what is best for our clients, patients and each other. With our proven and scalable operating model, we complement a healthcare organization's infrastructure, quickly driving sustainable improvements to net patient revenue and cash flows while reducing operating costs and enhancing the patient experience.

Position summary
We are seeking a Staff Data Engineer with 7-10 years of experience to join our Data Platform team. This role will report to the Manager of Data Engineering and be involved in the planning, design, and implementation of our centralized data warehouse solution for ETL, reporting and analytics across all applications within the company.

Key duties & responsibilities
- Collaborate with and across Agile teams to design, develop, test, implement, and support technical solutions.
- Work with other teams with deep experience in ETL processes, distributed microservices, and data science domains to understand how to centralize their data.
- Share your passion for experimenting with and learning new technologies.
- Perform thorough data analysis, uncover opportunities, and address business problems.

Qualification
B.E/B.Tech/MCA or equivalent professional degree

Experience, Skills and Knowledge
- Deep knowledge of and experience working with SSIS and T-SQL
- Experienced in Azure Data Factory, Azure Databricks & Azure Data Lake
- Experience working with a language like Python/Scala
- Experience working with SQL and NoSQL database systems such as MongoDB
- Experience in distributed system architecture design
- Experience with cloud environments (Azure preferred)
- Experience with acquiring and preparing data from primary and secondary disparate data sources (real-time preferred)
- Experience working on large-scale data product implementations, responsible for technical delivery, mentoring and managing peer engineers
- Experience working with Databricks preferred
- Experience working with agile methodology preferred
- Healthcare industry experience preferred

Key competency profile
- Spot new opportunities by anticipating change and planning accordingly
- Find ways to better serve customers and patients
- Be accountable for customer service of the highest quality
- Create connections across teams by valuing differences and including others
- Own your development by implementing and sharing your learnings
- Motivate each other to perform at our highest level
- Help people improve by learning from successes and failures
- Work the right way by acting with integrity and living our values every day
- Succeed by proactively identifying problems and solutions for yourself and others

Working in an evolving healthcare setting, we use our shared expertise to deliver innovative solutions. Our fast-growing team has opportunities to learn and grow through rewarding interactions, collaboration and the freedom to explore professional interests. Our associates are given valuable opportunities to contribute, to innovate and create meaningful work that makes an impact in the communities we serve around the world. We also offer a culture of excellence that drives customer success and improves patient care. We believe in giving back to the community and offer a competitive benefits package. To learn more, visit r1rcm.com. Visit us on Facebook.

Posted 1 month ago

Apply

7.0 - 10.0 years

6 - 10 Lacs

Noida

Work from Office

R1 RCM India is proud to be recognized amongst India's Top 50 Best Companies to Work For 2023 by the Great Place To Work Institute. We are committed to transforming the healthcare industry with our innovative revenue cycle management services. Our goal is to make healthcare simpler and enable efficiency for healthcare systems, hospitals, and physician practices. With over 30,000 employees globally, we are about 14,000 strong in India with offices in Delhi NCR, Hyderabad, Bangalore, and Chennai. Our inclusive culture ensures that every employee feels valued, respected, and appreciated with a robust set of employee benefits and engagement activities.

We are seeking a Staff Data Engineer with 7-10 years of experience to join our Data Platform team. This role will report to the Manager of Data Engineering and be involved in the planning, design, and implementation of our centralized data warehouse solution for ETL, reporting and analytics across all applications within the company.

Experience, Skills and Knowledge
- Deep knowledge of and experience working with Scala and Spark
- Experienced in Azure Data Factory, Azure Databricks, Azure Synapse Analytics and Azure Data Lake
- Experience working in full-stack development in .Net & Angular
- Experience working with SQL and NoSQL database systems such as MongoDB and Couchbase
- Experience in distributed system architecture design
- Experience with cloud environments (Azure preferred)
- Experience with acquiring and preparing data from primary and secondary disparate data sources (real-time preferred)
- Experience working on large-scale data product implementations, responsible for technical delivery, mentoring and managing peer engineers
- Experience working with Databricks is preferred
- Experience working with agile methodology is preferred
- Healthcare industry experience is preferred

Job Responsibilities:
- Collaborate with and across Agile teams to design, develop, test, implement, and support technical solutions.
- Work with other teams with deep experience in ETL processes, distributed microservices, and data science domains to understand how to centralize their data.
- Share your passion for experimenting with and learning new technologies.
- Perform thorough data analysis, uncover opportunities, and address business problems.

Working in an evolving healthcare setting, we use our shared expertise to deliver innovative solutions. Our fast-growing team has opportunities to learn and grow through rewarding interactions, collaboration and the freedom to explore professional interests. Our associates are given valuable opportunities to contribute, to innovate and create meaningful work that makes an impact in the communities we serve around the world. We also offer a culture of excellence that drives customer success and improves patient care. We believe in giving back to the community and offer a competitive benefits package. To learn more, visit r1rcm.com. Visit us on Facebook.

Posted 1 month ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Navi Mumbai

Work from Office

Project Role: Software Development Engineer
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary:
As a Software Development Engineer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code across multiple clients. Your day will involve collaborating with team members to perform maintenance and enhancements, ensuring that the applications meet the evolving needs of users and stakeholders. You will also be responsible for developing new features and functionalities, contributing to the overall success of the projects you are involved in, and ensuring high-quality deliverables through rigorous testing and validation processes.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Continuously evaluate and improve development processes to increase efficiency.

Professional & Technical Skills:
- Must-have: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration and ETL processes.
- Experience with cloud computing platforms and services.
- Familiarity with programming languages such as Python or Scala.
- Ability to work with large datasets and perform data analysis.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Databricks Unified Data Analytics Platform.
- This position is based in Mumbai.
- 15 years of full-time education is required.

Posted 1 month ago

Apply

2.0 - 7.0 years

5 - 9 Lacs

Noida

Work from Office

Who we are: R1 is a leading provider of technology-driven solutions that help hospitals and health systems to manage their financial systems and improve patients experience. We are the one company that combines the deep expertise of a global workforce of revenue cycle professionals with the industry's most advanced technology platform, encompassing sophisticated analytics, Al, intelligent automation and workflow orchestration. R1 is a place where we think boldly to create opportunities for everyone to innovate and grow. A place where we partner with purpose through transparency and inclusion. We are a global community of engineers, front-line associates, healthcare operators, and RCM experts that work together to go beyond for all those we serve. Because we know that all this adds up to something more, a place where we're all together better. R1 India is proud to be recognized amongst Top 25 Best Companies to Work For 2024, by the Great Place to Work Institute. This is our second consecutive recognition on this prestigious Best Workplaces list, building on the Top 50 recognition we achieved in 2023. Our focus on employee wellbeing and inclusion and diversity is demonstrated through prestigious recognitions with R1 India being ranked amongst Best in Healthcare, amongst Top 50 Best Workplaces for Millennials, Top 50 for Women, Top 25 for Diversity and Inclusion and Top 10 for Health and Wellness. We are committed to transform the healthcare industry with our innovative revenue cycle management services. Our goal is to make healthcare work better for all by enabling efficiency for healthcare systems, hospitals, and physician practices. With over 30,000 employees globally, we are about 17,000+ strong in India with presence in Delhi NCR, Hyderabad, Bangalore, and Chennai. Our inclusive culture ensures that every employee feels valued, respected, and appreciated with a robust set of employee benefits and engagement activities. 
About the role: Needs to work closely and communicate effectively with internal and external stakeholders in an ever-changing, rapid growth environment with tight deadlines. This role involves analyzing healthcare data and model on proprietary tools. Be able to take up new initiatives independently and collaborate with external and internal stakeholders. Be a strong team player. Be able to create and define SOPs, TATs for ongoing and upcoming projects. What will you need: Graduate in any discipline (preferably via regular attendance) from a recognized educational institute with good academic track record Should have Live hands-on experience of at-least 2 year in Advance Analytical Tool (Power BI, Tableau, SQL) should have solid understanding of SSIS (ETL) with strong SQL & PL SQL Connecting to data sources, importing data and transforming data for Business Intelligence. Should have expertise in DAX & Visuals in Power BI and live Hand-On experience on end-to-end project Strong mathematical skills to help collect, measure, organize and analyze data. Interpret data, analyze results using advance analytical tools & techniques and provide ongoing reports Identify, analyze, and interpret trends or patterns in complex data sets Ability to communicate with technical and business resources at many levels in a manner that supports progress and success. Ability to understand, appreciate and adapt to new business cultures and ways of working. Demonstrates initiative and works independently with minimal supervision. Working in an evolving healthcare setting, we use our shared expertise to deliver innovative solutions. Our fast-growing team has opportunities to learn and grow through rewarding interactions, collaboration and the freedom to explore professional interests. Our associates are given valuable opportunities to contribute, to innovate and create meaningful work that makes an impact in the communities we serve around the world. 
We also offer a culture of excellence that drives customer success and improves patient care. We believe in giving back to the community and offer a competitive benefits package. To learn more, visit r1rcm.com or find us on Facebook.
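The BI analyst listing above asks for hands-on SQL and for importing and transforming data for business intelligence. As a hedged illustration (the table and column names are invented for this sketch, not R1's schema), here is the kind of reporting aggregation a Power BI or Tableau dataset is typically built on:

```python
import sqlite3

# Hypothetical reporting transform: raw claim rows aggregated into a
# per-state summary suitable for a BI dashboard. All names are invented.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (claim_id INTEGER, state TEXT, amount REAL, status TEXT)")
conn.executemany(
    "INSERT INTO claims VALUES (?, ?, ?, ?)",
    [(1, "TX", 120.0, "paid"), (2, "TX", 80.0, "denied"),
     (3, "CA", 200.0, "paid"), (4, "CA", 50.0, "paid")],
)

# Aggregate raw claims into a reporting-friendly shape.
rows = conn.execute("""
    SELECT state,
           COUNT(*) AS total_claims,
           ROUND(SUM(CASE WHEN status = 'paid' THEN amount ELSE 0 END), 2) AS paid_amount
    FROM claims
    GROUP BY state
    ORDER BY state
""").fetchall()
print(rows)  # [('CA', 2, 250.0), ('TX', 2, 120.0)]
```

The same conditional-sum pattern carries over directly to a DAX measure or a PL/SQL aggregate in the tools the listing names.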

Posted 1 month ago

Apply

4.0 - 6.0 years

3 - 7 Lacs

Noida

Work from Office

R1 RCM India is proud to be recognized amongst India's Top 50 Best Companies to Work For 2023 by the Great Place To Work Institute. We are committed to transforming the healthcare industry with our innovative revenue cycle management services. Our goal is to make healthcare simpler and enable efficiency for healthcare systems, hospitals, and physician practices. With over 30,000 employees globally, we are about 14,000 strong in India, with offices in Delhi NCR, Hyderabad, Bangalore, and Chennai. Our inclusive culture ensures that every employee feels valued, respected, and appreciated, with a robust set of employee benefits and engagement activities.
We are seeking a Data Engineer with 4-6 years of experience to join our Data Platform team. This role reports to the Manager of Data Engineering and is involved in the planning, design, and implementation of our centralized data warehouse solution for ETL, reporting, and analytics across all applications within the company.
Requirements:
- Deep knowledge of and experience working with Scala and Spark
- Experience with Azure Data Factory, Azure Databricks, Azure Synapse Analytics, and Azure Data Lake
- Experience with full-stack development in .NET and Angular
- Experience working with SQL and NoSQL database systems such as MongoDB and Couchbase
- Experience in distributed system architecture design
- Experience with cloud environments (Azure preferred)
- Experience acquiring and preparing data from disparate primary and secondary data sources (real-time preferred)
- Experience on large-scale data product implementations, responsible for technical delivery and for mentoring and managing peer engineers
- Experience working with Databricks preferred
- Experience working with agile methodology preferred
- Healthcare industry experience preferred
Job Responsibilities:
- Collaborate with and across Agile teams to design, develop, test, implement, and support technical solutions
- Work with teams with deep experience in ETL processes, distributed microservices, and data science domains to understand how to centralize their data
- Share your passion for experimenting with and learning new technologies
- Perform thorough data analysis, uncover opportunities, and address business problems
Working in an evolving healthcare setting, we use our shared expertise to deliver innovative solutions. Our fast-growing team has opportunities to learn and grow through rewarding interactions, collaboration, and the freedom to explore professional interests. Our associates are given valuable opportunities to contribute, innovate, and create meaningful work that makes an impact in the communities we serve around the world. We also offer a culture of excellence that drives customer success and improves patient care. We believe in giving back to the community and offer a competitive benefits package. To learn more, visit r1rcm.com or find us on Facebook.
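The data engineering role above centers on pulling disparate application sources into one centralized warehouse schema. A toy, stdlib-only Python sketch of that normalization idea follows; in the actual role this would be a Spark/Databricks job fed by Azure Data Factory, and every source shape and field name here is an invented assumption:

```python
from datetime import date

# Two hypothetical upstream feeds with different shapes (names invented).
billing = [{"pat_id": "P1", "svc_date": "2024-03-01", "amt": "150.50"}]
emr = [{"patientId": "P1", "serviceDate": date(2024, 3, 1), "charge": 150.5}]

def normalize_billing(r):
    """Map a billing-feed record onto the shared warehouse schema."""
    return {"patient_id": r["pat_id"],
            "service_date": date.fromisoformat(r["svc_date"]),
            "amount": float(r["amt"]),
            "source": "billing"}

def normalize_emr(r):
    """Map an EMR-feed record onto the same shared schema."""
    return {"patient_id": r["patientId"],
            "service_date": r["serviceDate"],
            "amount": float(r["charge"]),
            "source": "emr"}

# Union the per-source feeds into one warehouse-style table.
centralized = [normalize_billing(r) for r in billing] + [normalize_emr(r) for r in emr]
print(len(centralized), centralized[0]["patient_id"])  # 2 P1
```

In Spark the per-source `normalize_*` functions become per-source `select`/`withColumn` transforms, and the list concatenation becomes a DataFrame union.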

Posted 1 month ago

Apply

1.0 - 4.0 years

3 - 7 Lacs

Noida

Work from Office

Key duties & responsibilities:
- Collaborate with and across Agile teams to design, develop, test, implement, and support technical solutions
- Work with teams with deep experience in ETL processes, distributed microservices, and data science domains to understand how to centralize their data
- Share your passion for experimenting with and learning new technologies
- Perform thorough data analysis, uncover opportunities, and address business problems
Qualification: B.E/B.Tech/MCA or equivalent professional degree
Experience, Skills and Knowledge:
- Deep knowledge of and experience working with SSIS and T-SQL
- Experience with Azure Data Factory, Azure Databricks, and Azure Data Lake
- Experience working with a language such as Python or Scala
- Experience working with SQL and NoSQL database systems such as MongoDB
- Experience in distributed system architecture design
- Experience with cloud environments (Azure preferred)
- Experience acquiring and preparing data from disparate primary and secondary data sources (real-time preferred)
- Experience on large-scale data product implementations, responsible for technical delivery and for mentoring and managing peer engineers
- Experience working with Databricks preferred
- Experience working with agile methodology preferred
- Healthcare industry experience preferred
Key competency profile:
- Spot new opportunities by anticipating change and planning accordingly
- Find ways to better serve customers and patients; be accountable for customer service of the highest quality
- Create connections across teams by valuing differences and including others
- Own your development by implementing and sharing your learnings
- Motivate each other to perform at our highest level
- Help people improve by learning from successes and failures
- Work the right way by acting with integrity and living our values every day
- Succeed by proactively identifying problems and solutions for yourself and others
Working in an evolving healthcare setting, we use our shared expertise to deliver innovative solutions. Our fast-growing team has opportunities to learn and grow through rewarding interactions, collaboration, and the freedom to explore professional interests. Our associates are given valuable opportunities to contribute, innovate, and create meaningful work that makes an impact in the communities we serve around the world. We also offer a culture of excellence that drives customer success and improves patient care. We believe in giving back to the community and offer a competitive benefits package. To learn more, visit r1rcm.com or find us on Facebook.

Posted 1 month ago

Apply

10.0 - 15.0 years

9 - 14 Lacs

Bengaluru

Work from Office

Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together. Primary Responsibility: Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so Required Qualifications: 10+ years of experience in a similar role, working with Python, data validation, DB2, Snowflake, and Databricks Proven track record of designing and deploying high-quality, scalable, and reliable data solutions Experience with data validation techniques and methodologies to ensure the accuracy and integrity of data Hands-on experience with DB2 for database management and operations Experience working with Snowflake for cloud-based data warehousing solutions Proficiency in Databricks for big data processing and analytics on a unified data platform Proficiency in Python programming with a solid understanding of its ecosystems and frameworks Ability to participate in daily stand-up meetings and project planning sessions. 
Ability to collaborate with cross-functional teams to understand business requirements and design data solutions Ability to write, test, and deploy software solutions Ability to conduct data validation to ensure the accuracy and quality of data Ability to monitor data quality and implement processes to ensure data integrity Ability to perform data analysis and troubleshoot data-related issues Ability to provide technical support to team members and resolve technical issues Preferred Qualification: Experience working with large-scale data-driven applications is a plus At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone – of every race, gender, sexuality, age, location and income – deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes – an enterprise priority reflected in our mission.
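Since the role above emphasizes Python alongside data validation, here is a minimal, hedged sketch of rule-based record validation of the kind the responsibilities describe; the rules and record fields are illustrative assumptions, not Optum's actual checks:

```python
# Hypothetical per-field validation rules (names and thresholds invented).
RULES = {
    "member_id": lambda v: isinstance(v, str) and v != "",
    "age": lambda v: isinstance(v, int) and 0 <= v <= 120,
    "plan": lambda v: v in {"HMO", "PPO", "EPO"},
}

def validate(record):
    """Return the list of field names that fail their rule."""
    return [field for field, rule in RULES.items() if not rule(record.get(field))]

good = {"member_id": "M100", "age": 42, "plan": "PPO"}
bad = {"member_id": "", "age": 130, "plan": "HMO"}
print(validate(good), validate(bad))  # [] ['member_id', 'age']
```

In practice the same rule table would run over rows pulled from DB2 or Snowflake, with failures routed to a data-quality monitoring process.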

Posted 1 month ago

Apply

8.0 - 12.0 years

20 - 25 Lacs

Pune

Work from Office

Designation: Big Data Lead/Architect. Location: Pune. Experience: 8-10 years. Notice period: immediate joiner / 15-30 days. Reports to: Product Engineering Head.
Job Overview: We are looking to hire a talented big data engineer to develop and manage our company’s Big Data solutions. In this role, you will be required to design and implement Big Data tools and frameworks, implement ELT processes, collaborate with development teams, build cloud platforms, and maintain the production system. To ensure success as a big data engineer, you should have in-depth knowledge of Hadoop technologies, excellent project management skills, and high-level problem-solving skills. A top-notch big data engineer understands the needs of the company and institutes scalable data solutions for its current and future needs.
Responsibilities:
- Meeting with managers to determine the company’s Big Data needs
- Developing big data solutions on AWS using Apache Spark, Databricks, Delta Tables, EMR, Athena, Glue, Hadoop, etc.
- Loading disparate data sets and conducting pre-processing services using Athena, Glue, Spark, etc.
- Collaborating with the software research and development teams
- Building cloud platforms for the development of company applications
- Maintaining production systems
Requirements:
- 8-10 years of experience as a big data engineer
- Proficiency with Python and PySpark
- In-depth knowledge of Hadoop, Apache Spark, Databricks, Delta Tables, and AWS data analytics services
- Extensive experience with Delta Tables and the JSON and Parquet file formats
- Good to have: experience with AWS data analytics services such as Athena, Glue, Redshift, and EMR
- Familiarity with data warehousing is a plus
- Knowledge of NoSQL and RDBMS databases
- Good communication skills
- Ability to solve complex data processing and transformation problems

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office

Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 5 years of experience is required. Educational Qualification: 15 years of full-time education.
Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. You will play a crucial role in shaping the data platform components.
Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for the immediate team and across multiple teams
- Lead the implementation of data platform components
- Ensure data platform scalability and performance
- Conduct regular data platform audits
- Stay updated on emerging data platform technologies
Professional & Technical Skills:
- Must-have skills: proficiency in Databricks Unified Data Analytics Platform
- Strong understanding of cloud-based data platforms
- Experience with data integration and data modeling
- Hands-on experience with data pipeline orchestration tools
- Knowledge of data security and compliance standards
Additional Information:
- The candidate should have a minimum of 5 years of experience in the Databricks Unified Data Analytics Platform
- This position is based at our Bengaluru office
- 15 years of full-time education is required

Posted 1 month ago

Apply

0.0 - 2.0 years

5 - 9 Lacs

Mumbai

Work from Office

About NCR Atleos. Title: BI Analyst. Location: Mumbai.
Position Summary: NCR Corporation, a Fortune 500 company, is looking for Business Intelligence Analysts in Mumbai, India. The ideal candidates must have excellent analytical and logical skills. We are looking for candidates who speak fluent English.
Key Areas of Responsibility:
- Provide ongoing analytics, metric optimization, and reports for the business and leadership/management to assist in day-to-day business decisions
- Resolve any issues that arise within key analytics from a development standpoint, and provide support in resolving data integrity issues
- Develop scorecards and dashboards for KPI tracking, enabling sound business decisions informed by accurate and reliable information
- Collaborate with cross-functional teams within the organization to obtain and develop accurate data for business models and metrics, and communicate it with NCR internal and external customers worldwide
- Analyze data trends, seasonality, and other random effects to quickly identify issues or best practices, and assist program subject matter experts in getting to root cause analysis
- Ensure contractual obligations with customers are met on a work order and systems basis; coordinate activities associated with product/service resolution issues
Qualifications:
- Able to learn at a rapid pace, with the desire to consistently go the extra mile
- Bachelor's degree and 0-2 years of related experience
- Ability to determine what data trends mean; being able to analyze the data is crucial
- Good communication (verbal and written), presentation, and organization skills; ability to communicate complex analytical findings clearly and concisely
- Sound knowledge of Microsoft Office (Access, Excel, PowerPoint, and macros) and basic SQL; the candidate should be able to query or write a script
- Knowledge of tools such as Fabric, SAP BusinessObjects, Power BI, and Databricks is a plus
EEO Statement: NCR Atleos is an equal-opportunity employer. It is NCR Atleos policy to hire, train, promote, and pay associates based on their job-related qualifications, ability, and performance, without regard to race, color, creed, religion, national origin, citizenship status, sex, sexual orientation, gender identity/expression, pregnancy, marital status, age, mental or physical disability, genetic information, medical condition, military or veteran status, or any other factor protected by law.
Statement to Third Party Agencies: To all recruitment agencies: NCR Atleos only accepts resumes from agencies on the NCR Atleos preferred supplier list. Please do not forward resumes to our applicant tracking system, NCR Atleos employees, or any NCR Atleos facility. NCR Atleos is not responsible for any fees or charges associated with unsolicited resumes.

Posted 1 month ago

Apply

9.0 - 11.0 years

13 - 17 Lacs

Pune

Work from Office

Educational qualification: Bachelor of Engineering / Bachelor of Technology. Service Line: Enterprise Package Application Services.
Responsibilities: A day in the life of an Infoscion: as part of the Infosys consulting team, your primary role would be to lead the engagement effort of providing high-quality, value-adding consulting solutions to customers at different stages, from problem definition to diagnosis to solution design, development, and deployment. You will review the proposals prepared by consultants, provide guidance, and analyze the solutions defined for the client's business problems to identify any potential risks and issues. You will identify change management requirements and propose a structured approach to the client for managing the change using multiple communication mechanisms. You will also coach and create a vision for the team, provide subject matter training for your focus areas, and motivate and inspire team members through effective and timely feedback and recognition for high performance. You will be a key contributor to unit-level and organizational initiatives, with the objective of providing high-quality, value-adding consulting solutions to customers while adhering to the guidelines and processes of the organization. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Additional Responsibilities:
- Design, develop, and maintain scalable data pipelines on Databricks using PySpark
- Collaborate with data analysts and scientists to understand data requirements and deliver solutions
- Optimize and troubleshoot existing data pipelines for performance and reliability
- Ensure data quality and integrity across various data sources
- Implement data security and compliance best practices
- Monitor data pipeline performance and conduct necessary maintenance and updates
- Document data pipeline processes and technical specifications
Location of posting: Infosys Ltd. is committed to ensuring you have the best experience throughout your journey with us. We currently have open positions in a number of locations across India: Bangalore, Pune, Hyderabad, Mysore, Kolkata, Chennai, Chandigarh, Trivandrum, Indore, Nagpur, Mangalore, Noida, Bhubaneswar, Coimbatore, Mumbai, Jaipur, Hubli, and Vizag. While we work in accordance with business requirements, we shall strive to offer you the location of your choice, where possible.
Technical and Professional:
- 9+ years of experience in data engineering
- Proficiency with Databricks and Apache Spark
- Strong SQL skills and experience with relational databases
- Experience with big data technologies (e.g., Hadoop, Kafka)
- Knowledge of data warehousing concepts and ETL processes
- Experience with CI/CD tools, particularly Jenkins
- Excellent problem-solving and analytical skills
- Solid understanding of big data fundamentals
- Familiarity with cloud platforms (e.g., AWS, Azure)
- Experience with version control systems (e.g., Bitbucket)
- Understanding of DevOps principles and tools (e.g., CI/CD, Jenkins)
- Databricks certification is a plus
Preferred Skills: Technology-Big Data-Big Data - ALL; Technology-Cloud Integration-Azure Data Factory (ADF); Technology-Cloud Platform-AWS Data Analytics-AWS Data Exchange

Posted 1 month ago

Apply

3.0 - 5.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Educational qualification: Bachelor of Engineering. Service Line: Data & Analytics Unit.
Responsibilities:
- Base SAS certified professional
- Develop, implement, and optimize analytical models using SAS and SQL
- Strong knowledge of SAS DI, SAS EG, and SAS BI tools
- Analyze large datasets to derive actionable insights and support business decision-making
- Design, implement, and maintain ETL workflows to extract, transform, and load data efficiently
- Develop advanced SAS programs using SAS Macros for automation and data processing
- Troubleshoot and optimize SAS code for performance improvements
- Work on data warehousing projects to enable efficient data storage and retrieval
- Basic knowledge of Unix scripts
JL5B (8+ years of experience):
- Advanced SAS certified with 5+ years; worked on the execution of a minimum of 2 SAS migration projects
- Strong knowledge of SAS DI, SAS EG, and SAS BI tools
- SAS migration projects: migrating SAS scripts to Python, PySpark, Databricks, ADF, Snowflake, etc.
- Good knowledge of the SAS Viya 3.4 and 4.0 platforms
- Good knowledge of SAS SMC, LSF, and other schedulers; basic knowledge of SAS administration and SAS Grid
- Good knowledge of Unix commands and scripts
- Handle case studies and complex data scenarios, ensuring data quality and integrity
- Collaborate with the data engineering team to build and manage robust data pipelines
- Present findings and insights clearly to both technical and non-technical stakeholders
- Work closely with teams across departments to gather requirements and deliver solutions
Technical and Professional: minimum 5+ years in the analytics domain, with a strong portfolio of relevant projects; proficiency in SAS, SAS Macros, and SQL; hands-on experience with ETL processes and tools; knowledge of data engineering concepts and data warehousing best practices.
Preferred Skills: Technology-Reporting Analytics & Visualization-SAS Enterprise Guide
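The listing above involves migrating SAS scripts to Python and PySpark. As a small, hedged illustration of one common migration step, the snippet below replaces a SAS-style grouped aggregation (the kind a PROC SQL with GROUP BY produces) with plain Python; the data and column names are invented:

```python
from collections import defaultdict

# Invented sample data standing in for a SAS dataset.
rows = [
    {"region": "North", "sales": 100.0},
    {"region": "North", "sales": 50.0},
    {"region": "South", "sales": 75.0},
]

# Equivalent of: PROC SQL; SELECT region, SUM(sales) FROM rows GROUP BY region;
totals = defaultdict(float)
for r in rows:
    totals[r["region"]] += r["sales"]

print(dict(totals))  # {'North': 150.0, 'South': 75.0}
```

On Databricks the same step would typically become `df.groupBy("region").agg(sum("sales"))` in PySpark rather than a Python loop.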

Posted 1 month ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 3 years of experience is required. Educational Qualification: 15 years of full-time education.
Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will play a crucial role in the development and implementation of software solutions.
Roles & Responsibilities:
- Expected to perform independently and become an SME
- Active participation and contribution in team discussions is required
- Contribute to providing solutions to work-related problems
- Collaborate with cross-functional teams to define, design, and ship new features
- Develop high-quality software design and architecture
- Identify, prioritize, and execute tasks in the software development life cycle
- Conduct software analysis, programming, testing, and debugging
- Troubleshoot and resolve issues in existing software applications
Professional & Technical Skills:
- Must-have skills: proficiency in Databricks Unified Data Analytics Platform
- Strong understanding of cloud-based data analytics platforms
- Experience in developing and deploying scalable applications
- Knowledge of data modeling and database design
- Hands-on experience with data integration and ETL processes
Additional Information:
- The candidate should have a minimum of 3 years of experience in the Databricks Unified Data Analytics Platform
- This position is based at our Bengaluru office
- 15 years of full-time education is required

Posted 1 month ago

Apply

3.0 - 5.0 years

5 - 9 Lacs

Navi Mumbai

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must have skills: Collibra Data Quality & Observability
Good to have skills: Collibra Data Governance
Minimum 7.5 years of experience is required. Educational Qualification: 15 years of full-time education.
Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are functioning optimally. You will also engage in problem-solving activities, providing support and enhancements to existing applications while maintaining a focus on quality and efficiency.
Key Responsibilities:
- Configure and implement Collibra Data Quality (CDQ) rules, workflows, dashboards, and data quality scoring metrics
- Collaborate with data stewards, data owners, and business analysts to define data quality KPIs and thresholds
- Develop data profiling and rule-based monitoring using CDQ's native rule engine or integrations (e.g., with Informatica, Talend, or BigQuery)
- Build and maintain data quality dashboards and issue management workflows within Collibra
- Integrate CDQ with Collibra Data Intelligence Cloud for end-to-end governance visibility
- Drive root cause analysis and remediation plans for data quality issues
- Support metadata and lineage enrichment to improve data traceability
- Document standards, rule logic, and DQ policies in the Collibra Catalog
- Conduct user training and promote data quality best practices across teams
Required Skills and Experience:
- 3+ years of experience in data quality, metadata management, or data governance
- Hands-on experience with the Collibra Data Quality & Observability (CDQ) platform
- Knowledge of Collibra Data Intelligence Cloud, including Catalog, Glossary, and Workflow Designer
- Proficiency in SQL and an understanding of data profiling techniques
- Experience integrating CDQ with enterprise data sources (Snowflake, BigQuery, Databricks, etc.)
- Familiarity with data governance frameworks and data quality dimensions (accuracy, completeness, consistency, etc.)
- Excellent analytical, problem-solving, and communication skills
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Collibra Data Quality & Observability
- This position is based in Mumbai
- 15 years of full-time education is required
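The Collibra role above revolves around data quality dimensions such as completeness and consistency. Here is a hedged sketch of computing two such column-level scores in plain Python; the metrics mirror common DQ dimensions, but this is an illustration, not CDQ's actual rule engine:

```python
def completeness(values):
    """Share of non-null values in a column (a common DQ dimension)."""
    return sum(v is not None for v in values) / len(values)

def uniqueness(values):
    """Share of distinct values among the non-null entries."""
    non_null = [v for v in values if v is not None]
    return len(set(non_null)) / len(non_null) if non_null else 0.0

# Invented sample column: one null, one duplicated ID.
customer_ids = ["C1", "C2", "C2", None]
print(completeness(customer_ids))  # 0.75
print(uniqueness(customer_ids))    # 2 distinct among 3 non-null values
```

In a CDQ deployment these scores would typically be produced by profiling rules against sources like Snowflake or Databricks and surfaced on a data quality dashboard.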

Posted 1 month ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 7.5 years of experience is required. Educational Qualification: 15 years of full-time education.
Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, participate in team meetings, and contribute to the overall success of projects by leveraging your expertise in application development.
Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for the immediate team and across multiple teams
- Facilitate knowledge-sharing sessions to enhance team capabilities
- Monitor project progress and ensure timely delivery of application features
Professional & Technical Skills:
- Must-have skills: proficiency in Databricks Unified Data Analytics Platform
- Strong understanding of data integration and ETL processes
- Experience with cloud-based data solutions and architectures
- Familiarity with programming languages such as Python or Scala
- Ability to work with data visualization tools to present insights effectively
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Databricks Unified Data Analytics Platform
- This position is based at our Bengaluru office
- 15 years of full-time education is required

Posted 1 month ago

Apply

10.0 - 15.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Responsibilities:
- Design, develop, and maintain data pipelines and ETL processes using Databricks
- Manage and optimize data solutions on cloud platforms such as Azure and AWS
- Implement big data processing workflows using PySpark
- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver effective solutions
- Ensure data quality and integrity through rigorous testing and validation
- Optimize and tune big data solutions for performance and scalability
- Stay updated with the latest industry trends and technologies in big data and cloud computing
Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field
- Proven experience as a Big Data Engineer or in a similar role
- Strong proficiency in Databricks and cloud platforms (Azure/AWS)
- Expertise in PySpark and big data processing
- Experience with data modeling, ETL processes, and data warehousing
- Familiarity with cloud services and infrastructure
- Excellent problem-solving skills and attention to detail
- Strong communication and teamwork abilities
Preferred Qualifications:
- Experience with other big data technologies and frameworks
- Knowledge of machine learning frameworks and libraries
- Certification in cloud platforms or big data technologies

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies