
1401 Databricks Jobs - Page 17

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

7.0 - 10.0 years

13 - 23 Lacs

Pune, Chennai, Bengaluru

Work from Office

Expertise in Terraform and ARM templates for infrastructure automation. Strong proficiency with Azure Databricks – clusters, jobs, workspace, Unity Catalog. Proficiency in using GitHub / GitHub Actions for CI/CD and infrastructure management.

Posted 3 weeks ago

Apply

6.0 - 11.0 years

13 - 23 Lacs

Chennai, Bengaluru, Delhi / NCR

Work from Office

Role & responsibilities
Position: Sr. Azure DevOps Engineer
Work Mode: Hybrid (Work Timing: 12 to 9 PM)
Location: Pan India
Notice Period: Immediate, or June joiners for the first week of July
Primary Skills: Experience in a DevOps role, Databricks, Terraform, Ansible, APIs

Job Description
Provisioning and configuring Azure components, including Data, App Services, Azure Kubernetes Service, Azure AI/ML, and other components
Developing Terraform scripts and ARM templates
Utilizing GitHub and Azure DevOps for code and infrastructure deployments
Managing and registering APIs using an API gateway (Kong)
Understanding the internals of Azure Databricks and Unity Catalog
Troubleshooting Azure platform issues
(Nice-to-have) Snowflake provisioning and configuration skills

Additionally, excellent communication and collaboration skills are non-negotiable for these roles.

Regards, HR Manager

Posted 3 weeks ago

Apply

8.0 - 12.0 years

35 - 40 Lacs

Bengaluru

Hybrid

Key Responsibilities:
Develop and maintain scalable data pipelines in Databricks to support the migration of data from Oracle-based Data Warehouse (DWH) and Operational Data Store (ODS) systems.
Analyze and understand existing Oracle schemas, stored procedures, and data transformation logic.
Translate PL/SQL logic into PySpark/Databricks SQL.
Develop and maintain Delta Lake-based datasets with appropriate partitioning, indexing, and optimization strategies.
Perform data validation and reconciliation post-migration, including row count, data integrity, and accuracy checks.
Build reusable and modular data ingestion and transformation frameworks using Databricks Notebooks, Jobs, and Workflows.
Collaborate with DevOps teams for CI/CD pipeline integration and efficient deployment.
Optimize performance of existing pipelines and queries in Databricks.
Document architecture, transformation logic, and data lineage clearly for operational transparency.
Review existing ETL code in Pentaho, TIBCO, and Perl to support legacy ETL understanding and potential migration to PySpark/Databricks SQL.

Mandatory Skills & Experience:
6 to 8 years of experience in data engineering, with strong experience in Oracle DWH/ODS environments.
Minimum 3+ years of hands-on experience in Databricks (including PySpark, SQL, Delta Lake, Workflows).
Strong understanding of Lakehouse architecture, cloud data platforms, and big data processing.
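The post-migration validation step described above (row counts, data integrity, and accuracy checks) can be sketched in plain Python. This is a minimal illustration, not the client's actual framework; the sample rows and the XOR-of-hashes checksum approach are assumptions chosen so the idea runs without a Spark cluster.

```python
import hashlib

def table_checksum(rows):
    """Order-independent checksum over rows: XOR of per-row SHA-256 digests,
    so source and target extracts can be compared without sorting."""
    digest = 0
    for row in rows:
        canonical = "|".join(str(v) for v in row)
        digest ^= int(hashlib.sha256(canonical.encode()).hexdigest(), 16)
    return digest

def reconcile(source_rows, target_rows):
    """Compare row counts and content checksums between a source (e.g. Oracle)
    extract and a migrated target (e.g. Delta Lake) extract."""
    return {
        "source_count": len(source_rows),
        "target_count": len(target_rows),
        "count_match": len(source_rows) == len(target_rows),
        "checksum_match": table_checksum(source_rows) == table_checksum(target_rows),
    }

# Hypothetical sample: counts agree, but one value drifted during migration.
source = [(1, "alice", 100.0), (2, "bob", 250.5)]
target = [(1, "alice", 100.0), (2, "bob", 250.0)]
print(reconcile(source, target))
```

In a real migration the two row sets would come from JDBC and Delta Lake reads; the comparison logic stays the same.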

Posted 3 weeks ago

Apply

4.0 - 8.0 years

20 - 24 Lacs

Pune

Work from Office

Immediate/early joiners preferred. Proven expertise in building ML and GenAI apps using RAG, agentic AI, NLP, and LLMs (LangChain, LlamaIndex). Strong in Python, TensorFlow, PyTorch, Azure, Databricks, MLOps, and secure data pipelines for all data types.
Benefits: Gratuity, health insurance

Posted 3 weeks ago

Apply

7.0 - 10.0 years

20 - 25 Lacs

Bengaluru

Work from Office

We are looking for an experienced Change Manager to lead a variety of regional and global change initiatives. Utilizing the tenets of PMI, you will lead cross-functional initiatives that transform the way we run our operations. If you like to solve complex problems, have a get-things-done attitude, and are looking for a highly visible, dynamic role where your voice is heard and your experience is appreciated, come talk to us.

Your key responsibilities
Responsible for change management planning, execution, and reporting, adhering to governance standards and ensuring transparency around progress status.
Using data to tell the story, maintain risk management controls and monitor and communicate initiative risks.
Collaborate with other departments as required to execute on timelines and meet strategic goals.
As part of the larger team, accountable for the delivery and adoption of the global change portfolio, including but not limited to business case development and analysis, reporting, measurement of adoption success, and continuous improvement.
As required, participate in Working Group and Steering Committee meetings to achieve the right level of decision making and progress transparency, establishing strong partnerships and collaborative relationships with various stakeholder groups to remove constraints to success and carry lessons forward to future projects.
As required, develop and document end-to-end roles and responsibilities, including process flows, operating procedures, and required controls, and gather and document business requirements (user stories), including liaising with end users and performing analysis of gathered data.
Heavily involved in the product development journey.

Your skills and experience
Overall experience of at least 7-10 years leading complex change programs/projects, communicating and driving transformation initiatives using the tenets of PMI in a highly matrixed environment.
Banking/finance/regulated industry experience, of which at least 2 years in the change/transformation space or associated with change/transformation initiatives, is a plus.
Knowledge of client lifecycle processes and procedures and experience with KYC data structures/data flows is preferred.
Experience working with management reporting is preferred.
Bachelor's degree.

Posted 3 weeks ago

Apply

2.0 - 5.0 years

7 - 11 Lacs

Noida

Work from Office

SpireHub Softwares is looking for a Backend Developer (LangGraph, AWS, Databricks, MongoDB, AI/ML) to join our dynamic team and embark on a rewarding career journey.
Analyzing business requirements and translating them into technical specifications
Designing and implementing scalable and efficient backend systems, including databases, APIs, and server-side logic
Writing clean, maintainable, and efficient code that adheres to industry best practices and standards
Collaborating with front-end developers, designers, and stakeholders to ensure the smooth delivery of projects
Implementing security and data protection measures to ensure the confidentiality and integrity of sensitive information
Testing and debugging applications to ensure they are functioning correctly and fixing any issues that arise
Monitoring performance and optimizing backend systems to ensure they run efficiently and meet SLAs

Posted 3 weeks ago

Apply

9.0 - 12.0 years

1 - 2 Lacs

Hyderabad

Remote

Job Title: Data Architect
Location: Remote
Employment Type: Full-Time
Reports to: Lead Data Strategist

About Client / Project: The client is a specialist data strategy and AI consultancy that empowers businesses to unlock tangible value from their data assets. We specialize in developing comprehensive data strategies tailored to address core business and operational challenges. By combining strategic advisory with hands-on implementation, we ensure data becomes a true driver of business growth, operational efficiency, and competitive advantage for our clients. As a solutions-focused and forward-thinking consultancy, we help organizations transform their data capabilities using modern technology, reduce costs, and accelerate business growth by aligning every initiative directly with our clients' core business objectives.

Role Overview: We are seeking a highly experienced Data Architect to lead the design and implementation of scalable data architectures for global clients across industries. You will define enterprise-grade data platforms leveraging cloud-native technologies and modern data frameworks.

Key Responsibilities
Design and implement cloud-based data architectures (GCP, AWS, Azure, Snowflake, Redshift, Databricks, or Hadoop)
Develop conceptual, logical, and physical data models
Define data flows, ETL/ELT pipelines, and ingestion strategies
Design and maintain data catalogs, metadata, and domain structures
Establish data architecture standards, reference models, and blueprints
Oversee data lineage, traceability, and audit readiness
Guide integration of AI/ML pipelines and analytics solutions
Ensure data privacy, protection, and compliance (e.g., GDPR, HIPAA)
Collaborate closely with Engineers, Analysts, and Strategists

Required Skills & Qualifications
8+ years of experience in data architecture or enterprise data platform roles
Deep experience with at least two major cloud platforms (AWS, Azure, GCP)
Proven hands-on work with modern data platforms: Snowflake, Databricks, Redshift, Hadoop
Strong understanding of data warehousing, data lakes, and lakehouse architecture
Advanced proficiency in SQL, Python, Spark, and/or Scala
Experience with data cataloging and metadata tools (e.g., Informatica, Collibra, Alation)
Knowledge of data governance frameworks and regulatory compliance
Strong documentation, stakeholder communication, and architectural planning skills
Bachelor's degree in Computer Science, Engineering, or a related field (Master's preferred)

Posted 3 weeks ago

Apply

7.0 - 12.0 years

15 - 18 Lacs

Pune

Work from Office

8+ years of experience in Python, SQL, Spark, and/or Scala
Deep experience with cloud data services: GCP, AWS, Azure
Snowflake, Databricks, or the Hadoop ecosystem
Apache Airflow, Prefect, or Luigi
Kafka, Kinesis, or other streaming technologies
Benefits: Flexi working, work from home, health insurance, life insurance, retention bonus, leave encashment, gratuity, provident fund, course reimbursements

Posted 3 weeks ago

Apply

4.0 - 8.0 years

12 - 22 Lacs

Hyderabad

Work from Office

AWS Data Engineer
Employment Type: Full-Time
Work Mode: Work From Office
Job Location: Hyderabad
Walk-in Date: 5 July 2025 (Saturday), 11:00 am to 3:00 pm
Years of experience: 4 to 12 years (with a minimum of 4 years of relevant experience)
Notice: Immediate to 30 days
Skillset: Python, PySpark, SQL, AWS, Databricks; Airflow good to have
Venue: Agilisium Consulting, RMZ Spire, Tower 100, Hitec City, Hyderabad
Registration link below; kindly complete the registration process, as it is mandatory for participation.
https://web.agilisium.com/hyderabad/july-5th

Posted 3 weeks ago

Apply

4.0 - 9.0 years

8 - 12 Lacs

Chennai

Remote

Expertise in ADF, Azure Databricks and Python. The ideal candidate will be responsible for developing and optimizing data pipelines, integrating cloud data services, and building scalable data processing workflows in the Azure ecosystem.

Posted 3 weeks ago

Apply

4.0 - 8.0 years

12 - 20 Lacs

Hyderabad

Work from Office

Share your updated CV to salveen.shaik@covasant.com
Job Title: Azure Data Engineer with Databricks
Location: Hyderabad (WFO)
Experience: 5 to 8 years
Job Type: Full-time
Client Industry: [e.g., Healthcare, BFSI]
Job Level: Senior Engineer / Lead / Architect

Role Overview: We are looking for a skilled Azure Data Engineer with expertise in Databricks to join our high-performing data and AI team for a critical client engagement. The ideal candidate will have strong hands-on experience in building scalable data pipelines, data transformation, and real-time data processing using Azure Data Services and Databricks. You will work closely with cross-functional teams including data scientists, architects, business analysts, and client stakeholders to design and implement end-to-end data solutions in a cloud-native environment.

Key Responsibilities:
Design, develop, and deploy end-to-end data pipelines using Azure Databricks, Azure Data Factory, and Azure Synapse Analytics.
Perform data ingestion, data wrangling, and ETL/ELT processes from various structured and unstructured data sources (e.g., APIs, on-prem databases, flat files).
Optimize and tune Spark-based jobs and Databricks notebooks for performance and scalability.
Implement best practices for CI/CD, code versioning, and testing in a Databricks environment using DevOps pipelines.
Design data lake and data warehouse solutions using Delta Lake and Synapse Analytics.
Ensure data security, governance, and compliance using Azure-native tools (e.g., Azure Purview, Key Vault, RBAC).
Collaborate with data scientists to enable feature engineering and model training within Databricks.
Write efficient SQL and PySpark code for data transformation and analytics.
Monitor and maintain existing data pipelines and troubleshoot issues in a production environment.
Document technical solutions, architecture diagrams, and data lineage as part of delivery.

Mandatory Skills & Technologies:
Azure Cloud Services: Azure Data Factory, Azure Databricks, Azure Synapse Analytics, Azure Data Lake Storage (Gen2), Azure Key Vault, Azure Functions, Azure Monitor
Databricks Platform: Delta Lake, Databricks Notebooks, Job Clusters, MLflow (optional), Unity Catalog
Programming Languages: PySpark, SQL, Python
Data Pipelines: ETL/ELT pipeline design and orchestration
Version Control & DevOps: Git, Azure DevOps, CI/CD pipelines
Data Modeling: Star/snowflake schema, dimensional modeling
Performance Tuning: Spark job optimization, data partitioning strategies
Data Governance & Security: Azure Purview, RBAC, data masking

Nice to Have:
Experience with Kafka, Event Hub, or other real-time streaming platforms
Exposure to Power BI or other visualization tools
Knowledge of Terraform or ARM templates for infrastructure as code
Experience in MLOps and integration with MLflow for model lifecycle management

Certifications (Good to Have):
Microsoft Certified: Azure Data Engineer Associate
Databricks Certified Data Engineer Associate / Professional
DP-203: Data Engineering on Microsoft Azure

Soft Skills:
Strong communication and client interaction skills
Analytical thinking and problem-solving
Agile mindset with familiarity with Scrum/Kanban
Team player with mentoring ability for junior engineers

Posted 3 weeks ago

Apply

6.0 - 10.0 years

10 - 20 Lacs

Chennai

Work from Office

Do you love leading data-driven transformations and mentoring teams in building scalable data platforms? We're looking for a Data Tech Lead to drive innovation, architecture, and execution across our data ecosystem.

Your Role:
Lead the design and implementation of modern data architecture, ETL/ELT pipelines, and data lakes/warehouses
Set technical direction and mentor a team of talented data engineers
Collaborate with product, analytics, and engineering teams to translate business needs into data solutions
Define and enforce data modeling standards, governance, and naming conventions
Take ownership of the end-to-end data lifecycle: ingestion, transformation, storage, access, and monitoring
Evaluate and implement the right cloud/on-prem tools and frameworks
Troubleshoot and resolve complex data challenges while optimizing for performance and cost
Contribute to documentation, design blueprints, and knowledge sharing

We're Looking For Someone With:
Proven experience in leading data engineering or data platform teams
Expertise in designing scalable data architectures and modern data stacks
Strong hands-on experience with cloud platforms (AWS/Azure/GCP) and big data tools
Proficiency in Python, SQL, Spark, Databricks, or similar tools
A passion for clean code, performance tuning, and high-impact delivery
Strong communication, collaboration, and leadership skills

Posted 3 weeks ago

Apply

2.0 - 4.0 years

6 - 10 Lacs

Pune

Work from Office

The headlines
Job Title: Data Consultant (Delivery)
Start Date: Mid-July 2025
Location: Hybrid; 2 days a week on-site in our office in Creaticity Mall, Shashtrinagar, Yerawada
Salary: ₹700,000 to ₹2,100,000/annum

A bit about the role
We're looking for passionate Data Consultants to join our Delivery team: a thriving and fast-growing community of some of the industry's best cloud data engineers at all levels, ranging from interns and graduates up to seasoned experts. In this role, you'll combine technical expertise with strong commercial and client-facing skills. You'll get the unique opportunity to work with advanced tools and methodologies, develop innovative solutions, and play an integral part in delivering value to our clients. In a culture that values growth, mentorship, and technical excellence, this is the perfect opportunity for a data engineer looking to make a real impact within an international, industry-leading consultancy.

What you'll be doing
Delivering high-quality data solutions by successfully managing development tasks with minimal guidance
Working with industry-leading technologies such as Snowflake, Matillion, Power BI, and Databricks, with a focus on mastering at least one toolset while expanding your expertise in others
Building trusted relationships with clients, managing expectations, and finding opportunities to add value beyond project scope
Contributing to internal knowledge-sharing, delivering presentations, training sessions, and thought leadership content
Driving business impact by engaging with stakeholders, understanding business challenges, and translating them into data-driven solutions
Leading investigations, client workshops, and demonstrations to showcase technical expertise and problem-solving skills
Balancing multiple priorities effectively, knowing when to escalate issues and when to push forward with solutions independently
Helping shape the Snap Analytics team by mentoring junior consultants and sharing your expertise with others

What you'll need to succeed
Technical Expertise: Strong experience in SQL, data modelling, and ETL processes; exposure to tools like Snowflake, Matillion, Databricks, or Power BI is highly desirable
A Problem-Solving Mindset: The ability to identify multiple solutions, analyse trade-offs, and confidently propose the best approach
Client Engagement Skills: Strong communication and stakeholder management abilities, ensuring seamless collaboration with clients at all levels
Analytical Thinking: The capability to evaluate data solutions critically and proactively identify opportunities for optimisation
Ownership & Initiative: Self-motivated and accountable, with a proactive approach to learning and personal development
A 'Team Player' Mentality: Willingness to contribute to internal initiatives, support colleagues, and help grow Snap Analytics as a company

So, what's in it for you?
A chance to work with the latest cloud data platforms, shaping enterprise-scale data solutions
Support on your journey towards technical certifications and leadership roles
A collaborative and supportive culture in which we believe in knowledge-sharing, teamwork, and helping each other succeed
The opportunity to write blogs, contribute to industry discussions, and become a recognised expert in your field
A rewarding compensation package with opportunities for progression

About Snap Analytics
We're a high-growth data analytics consultancy on a mission to help enterprise businesses unlock the full potential of their data. With offices in the UK, India, and South Africa, we specialise in cutting-edge cloud analytics solutions, transforming complex data challenges into actionable business insights. We partner with some of the biggest brands worldwide to modernise their data platforms, enabling smarter decision-making through Snowflake, Matillion, Databricks, and other cloud technologies. Our approach is customer-first, innovation-driven, and results-focused, delivering impactful solutions with speed and precision.

At Snap, we're not just consultants; we're problem-solvers, engineers, and strategists who thrive on tackling complex data challenges. Our culture is built on collaboration, continuous learning, and pushing boundaries, ensuring our people grow just as fast as our business. Join us and be part of a team that's shaping the future of data analytics!

Posted 3 weeks ago

Apply

6.0 - 11.0 years

35 - 50 Lacs

Pune, Gurugram, Delhi / NCR

Hybrid

Role: Snowflake Data Engineer
Mandatory Skills: #Snowflake, #Azure, #DataFactory, SQL, Python, #DBT / #Databricks
Location (Hybrid): Bangalore, Hyderabad, Chennai, Pune, Gurugram & Noida
Budget: Up to 50 LPA
Notice: Immediate to 30 days serving notice
Experience: 6-11 years

Key Responsibilities:
Design and develop ETL/ELT pipelines using Azure Data Factory, Snowflake, and DBT.
Build and maintain data integration workflows from various data sources to Snowflake.
Write efficient and optimized SQL queries for data extraction and transformation.
Work with stakeholders to understand business requirements and translate them into technical solutions.
Monitor, troubleshoot, and optimize data pipelines for performance and reliability.
Maintain and enforce data quality, governance, and documentation standards.
Collaborate with data analysts, architects, and DevOps teams in a cloud-native environment.

Must-Have Skills:
Strong experience with Azure cloud platform services.
Proven expertise in Azure Data Factory (ADF) for orchestrating and automating data pipelines.
Proficiency in SQL for data analysis and transformation.
Hands-on experience with Snowflake and SnowSQL for data warehousing.
Practical knowledge of DBT (Data Build Tool) for transforming data in the warehouse.
Experience working in cloud-based data environments with large-scale datasets.

Good-to-Have Skills:
Experience with Azure Data Lake, Azure Synapse, or Azure Functions.
Familiarity with Python or PySpark for custom data transformations.
Understanding of CI/CD pipelines and DevOps for data workflows.
Exposure to data governance, metadata management, or data catalog tools.
Knowledge of business intelligence tools (e.g., Power BI, Tableau) is a plus.

Qualifications:
Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
5+ years of experience in data engineering roles using Azure and Snowflake.
Strong problem-solving, communication, and collaboration skills.
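The ELT flow this posting describes (land raw data first, then transform it with SQL inside the warehouse, as DBT would materialize a model) can be sketched with the stdlib sqlite3 module standing in for Snowflake. Table names, columns, and sample rows are invented for illustration.

```python
import sqlite3

# An in-memory database stands in for a Snowflake warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (order_id INTEGER, amount REAL, status TEXT)")

# Load step: land raw records as-is (hypothetical sample data).
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, 120.0, "complete"), (2, 80.0, "cancelled"), (3, 200.0, "complete")],
)

# Transform step: a SQL model built on top of the raw layer,
# analogous to a DBT model materialized as a table.
conn.execute("""
    CREATE TABLE fct_orders AS
    SELECT order_id, amount
    FROM raw_orders
    WHERE status = 'complete'
""")

total = conn.execute("SELECT SUM(amount) FROM fct_orders").fetchone()[0]
print(total)  # 320.0
```

In the real stack, Azure Data Factory would handle the load step and DBT would own the SELECT-based transform; the raw-then-model layering is the same.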

Posted 3 weeks ago

Apply

6.0 - 11.0 years

35 - 50 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Role: Snowflake Data Engineer
Mandatory Skills: #Snowflake, #Azure, #DataFactory, SQL, Python, #DBT / #Databricks
Location (Hybrid): Bangalore, Hyderabad, Chennai, Pune, Gurugram & Noida
Budget: Up to 50 LPA
Notice: Immediate to 30 days serving notice
Experience: 6-11 years

Key Responsibilities:
Design and develop ETL/ELT pipelines using Azure Data Factory, Snowflake, and DBT.
Build and maintain data integration workflows from various data sources to Snowflake.
Write efficient and optimized SQL queries for data extraction and transformation.
Work with stakeholders to understand business requirements and translate them into technical solutions.
Monitor, troubleshoot, and optimize data pipelines for performance and reliability.
Maintain and enforce data quality, governance, and documentation standards.
Collaborate with data analysts, architects, and DevOps teams in a cloud-native environment.

Must-Have Skills:
Strong experience with Azure cloud platform services.
Proven expertise in Azure Data Factory (ADF) for orchestrating and automating data pipelines.
Proficiency in SQL for data analysis and transformation.
Hands-on experience with Snowflake and SnowSQL for data warehousing.
Practical knowledge of DBT (Data Build Tool) for transforming data in the warehouse.
Experience working in cloud-based data environments with large-scale datasets.

Good-to-Have Skills:
Experience with Azure Data Lake, Azure Synapse, or Azure Functions.
Familiarity with Python or PySpark for custom data transformations.
Understanding of CI/CD pipelines and DevOps for data workflows.
Exposure to data governance, metadata management, or data catalog tools.
Knowledge of business intelligence tools (e.g., Power BI, Tableau) is a plus.

Qualifications:
Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
5+ years of experience in data engineering roles using Azure and Snowflake.
Strong problem-solving, communication, and collaboration skills.

Posted 3 weeks ago

Apply

6.0 - 10.0 years

8 - 18 Lacs

Chennai

Work from Office

Role Overview
Are you passionate about building scalable data systems and working on cutting-edge cloud technologies? We're looking for a Senior Data Engineer to join our team and play a key role in transforming raw data into powerful insights.

What You'll Do:
Design, develop, and optimize scalable ETL/ELT pipelines and data integration workflows
Build and maintain data lakes, warehouses, and real-time streaming pipelines
Work with both structured and unstructured data, ensuring clean, usable datasets for analytics & ML
Collaborate with analytics, product, and engineering teams to implement robust data models
Ensure best practices around data quality, governance, lineage, and security
Code in Python, SQL, and PySpark, and work on Databricks
Operate in AWS environments using Redshift, Glue, and S3
Continuously monitor and optimize pipeline performance
Document workflows and contribute to engineering standards

What We're Looking For:
Strong hands-on experience in modern data engineering tools & platforms
Cloud-first mindset with expertise in the AWS data stack
Solid programming skills and a passion for building high-performance data systems
Excellent communication & collaboration skills
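"Ensuring clean, usable datasets" in practice often means a small set of reusable quality gates run before a dataset is published. The sketch below is a generic illustration under invented column rules, not this team's actual checks.

```python
def run_quality_checks(rows, required, unique_key):
    """Return simple data-quality results for a batch of records:
    nulls in required fields and duplicate values of a unique key,
    the kind of gate a pipeline might enforce before publishing."""
    null_violations = [
        (i, col)
        for i, row in enumerate(rows)
        for col in required
        if row.get(col) in (None, "")
    ]
    seen, dup_keys = set(), set()
    for row in rows:
        k = row.get(unique_key)
        (dup_keys if k in seen else seen).add(k)
    return {
        "nulls": null_violations,
        "duplicate_keys": sorted(dup_keys),
        "passed": not null_violations and not dup_keys,
    }

# Hypothetical event batch with one null and one duplicate key.
events = [
    {"event_id": "a1", "user": "u1"},
    {"event_id": "a2", "user": None},   # null violation
    {"event_id": "a1", "user": "u3"},   # duplicate key
]
print(run_quality_checks(events, required=["event_id", "user"], unique_key="event_id"))
```

The same shape of check scales up naturally to PySpark (null counts and groupBy-key counts over a DataFrame) when the batch no longer fits in memory.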

Posted 3 weeks ago

Apply

8.0 - 13.0 years

16 - 20 Lacs

Bengaluru

Work from Office

About the Role
Tech lead with BI and analytics experience; hands-on with GenAI code-assist tools and API development.

Qualifications
Experience: 8-15 Years

Required Skills
Python
ClickHouse
Databricks
Azure Cloud
Containerization technologies

Posted 3 weeks ago

Apply

5.0 - 9.0 years

2 - 3 Lacs

Ahmedabad

Work from Office

*instinctools is a software development company that provides custom software solutions for businesses of all sizes. Our team works closely with clients to understand their specific needs and provide personalized solutions that meet their business requirements. *instinctools is looking for a Senior Data Engineer for one of our clients.

Our client is one of the top 5 global management consulting firms, considered to be among the most prestigious in the world. Hundreds of customers from the Fortune 500, including the largest global financial institutions, the world's top media companies, technology companies, and federal government agencies, rely on our client's proven platform and services.

The project is a dynamic solution empowering companies to optimize promotional activities for maximum impact. It collects and validates data, analyzes promotion effectiveness, plans calendars, and integrates seamlessly with existing systems. The tool enhances vendor collaboration, negotiates better deals, and employs machine learning to optimize promotional plans, enabling companies to make informed decisions and maximize return on investment.

Stack on the project: Databricks, SQL, Spark, AWS, Python.

Tasks:
Build and optimize data pipelines using Databricks, SQL, and Apache Spark.
Design and implement scalable data processing systems.
Manage and optimize data pipelines.
Ensure the quality and efficiency of data flows.

Our expectations of the ideal candidate:
5+ years of experience as a Data Engineer.
Deep expertise in big data technologies, particularly Databricks, SQL, and Spark.
Very strong SQL skills.
Experience in data modeling and ETL processes.
Experience with analytics engineering is a plus.
Experience with DBT, AWS, and Python.

Soft Skills:
A problem-solving style valued over years of experience
Ability to clarify requirements with the customer
Willingness to pair with other engineers when solving complex issues
Good communication skills
English: Upper-Intermediate or higher

We offer:
Flexible working time (from an Indian location)
A professional and ambitious team
Learning opportunities, seminars and conferences, and time for exploring new technologies
Co-funding for language courses (English)

Posted 3 weeks ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Project Role: Software Development Engineer
Project Role Description: Analyze, design, code, and test multiple components of application code across one or more clients. Perform maintenance, enhancements, and/or development work.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Software Development Engineer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code for multiple clients. Your day will involve collaborating with team members to ensure the successful implementation of software solutions, performing maintenance and enhancements, and contributing to the overall development process. You will be responsible for delivering high-quality code while adhering to best practices and project timelines, ensuring that the applications meet the needs of the clients effectively.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
- Conduct code reviews to ensure adherence to coding standards and best practices.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data processing and analytics workflows.
- Experience with cloud-based data solutions and architectures.
- Familiarity with programming languages such as Python or Scala.
- Knowledge of data integration techniques and ETL processes.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based in Hyderabad.
- 15 years of full-time education is required.

Posted 3 weeks ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Coimbatore

Work from Office

Project Role: Software Development Engineer
Project Role Description: Analyze, design, code, and test multiple components of application code across one or more clients. Perform maintenance, enhancements, and/or development work.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Software Development Engineer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code for multiple clients. Your day will involve collaborating with team members to perform maintenance and enhancements, ensuring that the applications meet the evolving needs of users while adhering to best practices in software development. You will also be responsible for troubleshooting issues and implementing solutions that enhance the overall functionality and performance of the applications.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to foster their professional growth and development.
- Continuously evaluate and improve development processes to enhance team efficiency.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration and ETL processes.
- Experience with cloud computing platforms and services.
- Familiarity with programming languages such as Python or Scala.
- Knowledge of data warehousing concepts and practices.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Coimbatore office.
- 15 years of full-time education is required.

Posted 3 weeks ago

Apply

7.0 - 10.0 years

6 - 10 Lacs

Noida

Work from Office

Staff Software Engineer (Data Engineer)

R1 RCM India is proud to be recognized amongst India's Top 50 Best Companies to Work For™ 2023 by the Great Place To Work Institute. We are committed to transforming the healthcare industry with our innovative revenue cycle management services. Our goal is to make healthcare simpler and enable efficiency for healthcare systems, hospitals, and physician practices. With over 30,000 employees globally, we are about 14,000 strong in India with offices in Delhi NCR, Hyderabad, Bangalore, and Chennai. Our inclusive culture ensures that every employee feels valued, respected, and appreciated, with a robust set of employee benefits and engagement activities.

R1 RCM Inc. is a leading provider of technology-enabled revenue cycle management services which transform and solve challenges across health systems, hospitals and physician practices. Headquartered in Chicago, R1 is a publicly traded organization with employees throughout the US and multiple India locations. Our mission is to be the one trusted partner to manage revenue, so providers and patients can focus on what matters most. Our priority is to always do what is best for our clients, patients and each other. With our proven and scalable operating model, we complement a healthcare organization's infrastructure, quickly driving sustainable improvements to net patient revenue and cash flows while reducing operating costs and enhancing the patient experience.

Position summary
We are seeking a Staff Data Engineer with 7-10 years of experience to join our Data Platform team. This role will report to the Manager of Data Engineering and be involved in the planning, design, and implementation of our centralized data warehouse solution for ETL, reporting and analytics across all applications within the company.

Key duties & responsibilities
- Collaborate with and across Agile teams to design, develop, test, implement, and support technical solutions.
- Work with other teams with deep experience in ETL processes, distributed microservices, and data science domains to understand how to centralize their data.
- Share your passion for experimenting with and learning new technologies.
- Perform thorough data analysis, uncover opportunities, and address business problems.

Qualification
B.E/B.Tech/MCA or equivalent professional degree

Experience, Skills and Knowledge
- Deep knowledge and experience working with SSIS and T-SQL
- Experienced in Azure Data Factory, Azure Databricks and Azure Data Lake
- Experience working with a language such as Python or Scala
- Experience working with SQL and NoSQL database systems such as MongoDB
- Experience in distributed system architecture design
- Experience with cloud environments (Azure preferred)
- Experience with acquiring and preparing data from primary and secondary disparate data sources (real-time preferred)
- Experience working on large-scale data product implementation, responsible for technical delivery, mentoring and managing peer engineers
- Experience working with Databricks preferred
- Experience working with agile methodology preferred
- Healthcare industry experience preferred

Key competency profile
- Spot new opportunities by anticipating change and planning accordingly.
- Find ways to better serve customers and patients.
- Be accountable for customer service of the highest quality.
- Create connections across teams by valuing differences and including others.
- Own your development by implementing and sharing your learnings.
- Motivate each other to perform at our highest level.
- Help people improve by learning from successes and failures.
- Work the right way by acting with integrity and living our values every day.
- Succeed by proactively identifying problems and solutions for yourself and others.

Working in an evolving healthcare setting, we use our shared expertise to deliver innovative solutions. Our fast-growing team has opportunities to learn and grow through rewarding interactions, collaboration and the freedom to explore professional interests. Our associates are given valuable opportunities to contribute, to innovate and create meaningful work that makes an impact in the communities we serve around the world. We also offer a culture of excellence that drives customer success and improves patient care. We believe in giving back to the community and offer a competitive benefits package. To learn more, visit r1rcm.com. Visit us on Facebook.
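A recurring task behind the ETL and warehouse-centralization duties described above is reconciling data between a source system and the target warehouse. As a minimal illustration only, here is a stdlib-only Python sketch of a row-count reconciliation check; the table names and counts are hypothetical, and a real pipeline would pull these figures from T-SQL, SSIS, or Databricks queries rather than literals:

```python
def reconcile_row_counts(source_counts, target_counts):
    """Compare per-table row counts from a source system and a target
    warehouse; return only the tables whose counts do not match."""
    mismatches = {}
    for table, src in source_counts.items():
        tgt = target_counts.get(table, 0)
        if src != tgt:
            mismatches[table] = {"source": src, "target": tgt, "diff": src - tgt}
    return mismatches

# Hypothetical counts, as might be read from information_schema views
source = {"claims": 120_000, "patients": 45_210, "payments": 98_004}
target = {"claims": 120_000, "patients": 45_198, "payments": 98_004}

print(reconcile_row_counts(source, target))
# {'patients': {'source': 45210, 'target': 45198, 'diff': 12}}
```

In practice this kind of check runs after each load and gates downstream reporting jobs; it catches dropped or duplicated rows long before a business user notices a wrong dashboard.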

Posted 3 weeks ago

Apply

7.0 - 10.0 years

6 - 10 Lacs

Noida

Work from Office

R1 RCM India is proud to be recognized amongst India's Top 50 Best Companies to Work For™ 2023 by the Great Place To Work Institute. We are committed to transforming the healthcare industry with our innovative revenue cycle management services. Our goal is to make healthcare simpler and enable efficiency for healthcare systems, hospitals, and physician practices. With over 30,000 employees globally, we are about 14,000 strong in India with offices in Delhi NCR, Hyderabad, Bangalore, and Chennai. Our inclusive culture ensures that every employee feels valued, respected, and appreciated, with a robust set of employee benefits and engagement activities.

We are seeking a Staff Data Engineer with 7-10 years of experience to join our Data Platform team. This role will report to the Manager of Data Engineering and be involved in the planning, design, and implementation of our centralized data warehouse solution for ETL, reporting and analytics across all applications within the company.

Experience, Skills and Knowledge
- Deep knowledge and experience working with Scala and Spark.
- Experienced in Azure Data Factory, Azure Databricks, Azure Synapse Analytics and Azure Data Lake.
- Experience working in full stack development in .Net and Angular.
- Experience working with SQL and NoSQL database systems such as MongoDB and Couchbase.
- Experience in distributed system architecture design.
- Experience with cloud environments (Azure preferred).
- Experience with acquiring and preparing data from primary and secondary disparate data sources (real-time preferred).
- Experience working on large-scale data product implementation, responsible for technical delivery, mentoring and managing peer engineers.
- Experience working with Databricks is preferred.
- Experience working with agile methodology is preferred.
- Healthcare industry experience is preferred.

Job Responsibilities:
- Collaborate with and across Agile teams to design, develop, test, implement, and support technical solutions.
- Work with other teams with deep experience in ETL processes, distributed microservices, and data science domains to understand how to centralize their data.
- Share your passion for experimenting with and learning new technologies.
- Perform thorough data analysis, uncover opportunities, and address business problems.

Working in an evolving healthcare setting, we use our shared expertise to deliver innovative solutions. Our fast-growing team has opportunities to learn and grow through rewarding interactions, collaboration and the freedom to explore professional interests. Our associates are given valuable opportunities to contribute, to innovate and create meaningful work that makes an impact in the communities we serve around the world. We also offer a culture of excellence that drives customer success and improves patient care. We believe in giving back to the community and offer a competitive benefits package. To learn more, visit r1rcm.com. Visit us on Facebook.
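The duty of centralizing data from teams with disparate systems usually begins with normalizing each source's records to a shared warehouse schema before loading. The following stdlib-only Python sketch is purely illustrative; all field names and mappings are invented, and a production pipeline would express this as Spark or Databricks transformations:

```python
def normalize_record(record, field_map, defaults=None):
    """Rename source-specific fields to a shared warehouse schema,
    starting from optional default values for missing target fields."""
    out = dict(defaults or {})
    for src_field, tgt_field in field_map.items():
        if src_field in record:
            out[tgt_field] = record[src_field]
    return out

# Two hypothetical upstream systems with different field names
billing_row = {"pat_id": "P1", "amt": 250.0}
billing_map = {"pat_id": "patient_id", "amt": "amount"}

emr_row = {"patient_identifier": "P1", "visit_dt": "2024-05-01"}
emr_map = {"patient_identifier": "patient_id", "visit_dt": "visit_date"}

print(normalize_record(billing_row, billing_map))
# {'patient_id': 'P1', 'amount': 250.0}
```

Keeping the field map as data rather than code is a common design choice here: adding a new upstream system then means adding a mapping, not rewriting the transform.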

Posted 3 weeks ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Navi Mumbai

Work from Office

Project Role: Software Development Engineer
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Software Development Engineer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code across multiple clients. Your day will involve collaborating with team members to perform maintenance and enhancements, ensuring that the applications meet the evolving needs of users and stakeholders. You will also be responsible for developing new features and functionalities, contributing to the overall success of the projects you are involved in, and ensuring high-quality deliverables through rigorous testing and validation processes.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Continuously evaluate and improve development processes to increase efficiency.

Professional & Technical Skills:
- Must-have skills: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration and ETL processes.
- Experience with cloud computing platforms and services.
- Familiarity with programming languages such as Python or Scala.
- Ability to work with large datasets and perform data analysis.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Databricks Unified Data Analytics Platform.
- This position is based in Mumbai.
- A 15 years full time education is required.

Posted 3 weeks ago

Apply

2.0 - 7.0 years

5 - 9 Lacs

Noida

Work from Office

Who we are:
R1 is a leading provider of technology-driven solutions that help hospitals and health systems manage their financial systems and improve the patient experience. We are the one company that combines the deep expertise of a global workforce of revenue cycle professionals with the industry's most advanced technology platform, encompassing sophisticated analytics, AI, intelligent automation and workflow orchestration. R1 is a place where we think boldly to create opportunities for everyone to innovate and grow. A place where we partner with purpose through transparency and inclusion. We are a global community of engineers, front-line associates, healthcare operators, and RCM experts that work together to go beyond for all those we serve. Because we know that all this adds up to something more, a place where we're all together better. R1 India is proud to be recognized amongst the Top 25 Best Companies to Work For 2024 by the Great Place to Work Institute. This is our second consecutive recognition on this prestigious Best Workplaces list, building on the Top 50 recognition we achieved in 2023. Our focus on employee wellbeing, inclusion and diversity is demonstrated through prestigious recognitions, with R1 India being ranked amongst the Best in Healthcare, Top 50 Best Workplaces for Millennials, Top 50 for Women, Top 25 for Diversity and Inclusion and Top 10 for Health and Wellness. We are committed to transforming the healthcare industry with our innovative revenue cycle management services. Our goal is to make healthcare work better for all by enabling efficiency for healthcare systems, hospitals, and physician practices. With over 30,000 employees globally, we are about 17,000+ strong in India with presence in Delhi NCR, Hyderabad, Bangalore, and Chennai. Our inclusive culture ensures that every employee feels valued, respected, and appreciated, with a robust set of employee benefits and engagement activities.

About the role:
You will need to work closely and communicate effectively with internal and external stakeholders in an ever-changing, rapid-growth environment with tight deadlines. This role involves analyzing healthcare data and modeling on proprietary tools. You should be able to take up new initiatives independently, collaborate with external and internal stakeholders, and be a strong team player. You should also be able to create and define SOPs and TATs for ongoing and upcoming projects.

What will you need:
- Graduate in any discipline (preferably via regular attendance) from a recognized educational institute with a good academic track record.
- At least 2 years of live hands-on experience with advanced analytical tools (Power BI, Tableau, SQL).
- Solid understanding of SSIS (ETL) with strong SQL and PL/SQL: connecting to data sources, importing data and transforming data for Business Intelligence.
- Expertise in DAX and visuals in Power BI, with hands-on experience on an end-to-end project.
- Strong mathematical skills to help collect, measure, organize and analyze data.
- Interpret data, analyze results using advanced analytical tools and techniques, and provide ongoing reports.
- Identify, analyze, and interpret trends or patterns in complex data sets.
- Ability to communicate with technical and business resources at many levels in a manner that supports progress and success.
- Ability to understand, appreciate and adapt to new business cultures and ways of working.
- Demonstrates initiative and works independently with minimal supervision.

Working in an evolving healthcare setting, we use our shared expertise to deliver innovative solutions. Our fast-growing team has opportunities to learn and grow through rewarding interactions, collaboration and the freedom to explore professional interests. Our associates are given valuable opportunities to contribute, to innovate and create meaningful work that makes an impact in the communities we serve around the world. We also offer a culture of excellence that drives customer success and improves patient care. We believe in giving back to the community and offer a competitive benefits package. To learn more, visit r1rcm.com. Visit us on Facebook.

Posted 3 weeks ago

Apply

4.0 - 6.0 years

3 - 7 Lacs

Noida

Work from Office

R1 RCM India is proud to be recognized amongst India's Top 50 Best Companies to Work For™ 2023 by the Great Place To Work Institute. We are committed to transforming the healthcare industry with our innovative revenue cycle management services. Our goal is to make healthcare simpler and enable efficiency for healthcare systems, hospitals, and physician practices. With over 30,000 employees globally, we are about 14,000 strong in India with offices in Delhi NCR, Hyderabad, Bangalore, and Chennai. Our inclusive culture ensures that every employee feels valued, respected, and appreciated, with a robust set of employee benefits and engagement activities.

We are seeking a Data Engineer with 4-6 years of experience to join our Data Platform team. This role will report to the Manager of Data Engineering and be involved in the planning, design, and implementation of our centralized data warehouse solution for ETL, reporting and analytics across all applications within the company.

Experience, Skills and Knowledge
- Deep knowledge and experience working with Scala and Spark.
- Experienced in Azure Data Factory, Azure Databricks, Azure Synapse Analytics and Azure Data Lake.
- Experience working in full stack development in .Net and Angular.
- Experience working with SQL and NoSQL database systems such as MongoDB and Couchbase.
- Experience in distributed system architecture design.
- Experience with cloud environments (Azure preferred).
- Experience with acquiring and preparing data from primary and secondary disparate data sources (real-time preferred).
- Experience working on large-scale data product implementation, responsible for technical delivery, mentoring and managing peer engineers.
- Experience working with Databricks is preferred.
- Experience working with agile methodology is preferred.
- Healthcare industry experience is preferred.

Job Responsibilities
- Collaborate with and across Agile teams to design, develop, test, implement, and support technical solutions.
- Work with other teams with deep experience in ETL processes, distributed microservices, and data science domains to understand how to centralize their data.
- Share your passion for experimenting with and learning new technologies.
- Perform thorough data analysis, uncover opportunities, and address business problems.

Working in an evolving healthcare setting, we use our shared expertise to deliver innovative solutions. Our fast-growing team has opportunities to learn and grow through rewarding interactions, collaboration and the freedom to explore professional interests. Our associates are given valuable opportunities to contribute, to innovate and create meaningful work that makes an impact in the communities we serve around the world. We also offer a culture of excellence that drives customer success and improves patient care. We believe in giving back to the community and offer a competitive benefits package. To learn more, visit r1rcm.com. Visit us on Facebook.

Posted 3 weeks ago

Apply