
1301 Databricks Jobs - Page 14

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the employer's job portal.

1.0 - 4.0 years

3 - 7 Lacs

Noida

Work from Office

Key duties & responsibilities:
- Collaborate with and across Agile teams to design, develop, test, implement, and support technical solutions.
- Work with teams that have deep experience in ETL processes, distributed microservices, and data science to understand how to centralize their data.
- Share your passion for experimenting with and learning new technologies.
- Perform thorough data analysis, uncover opportunities, and address business problems.

Qualification: B.E/B.Tech/MCA or equivalent professional degree.

Experience, skills, and knowledge:
- Deep knowledge of and experience working with SSIS and T-SQL.
- Experienced in Azure Data Factory, Azure Databricks, and Azure Data Lake.
- Experience with a language such as Python or Scala.
- Experience with SQL and NoSQL database systems such as MongoDB.
- Experience in distributed system architecture design.
- Experience with cloud environments (Azure preferred).
- Experience acquiring and preparing data from primary and secondary disparate data sources (real-time preferred).
- Experience on large-scale data product implementations, responsible for technical delivery and for mentoring and managing peer engineers.
- Experience with Databricks preferred.
- Experience with agile methodology preferred.
- Healthcare industry experience preferred.

Key competency profile:
- Spot new opportunities by anticipating change and planning accordingly.
- Find ways to better serve customers and patients; be accountable for customer service of the highest quality.
- Create connections across teams by valuing differences and including others.
- Own your development by implementing and sharing your learnings.
- Motivate each other to perform at our highest level.
- Help people improve by learning from successes and failures.
- Work the right way by acting with integrity and living our values every day.
- Succeed by proactively identifying problems and solutions for yourself and others.

Working in an evolving healthcare setting, we use our shared expertise to deliver innovative solutions. Our fast-growing team has opportunities to learn and grow through rewarding interactions, collaboration, and the freedom to explore professional interests. Our associates are given valuable opportunities to contribute, to innovate, and to create meaningful work that makes an impact in the communities we serve around the world. We also offer a culture of excellence that drives customer success and improves patient care. We believe in giving back to the community and offer a competitive benefits package. To learn more, visit r1rcm.com or visit us on Facebook.
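
For a sense of the day-to-day work this role describes, here is a minimal sketch of an ingestion step on Azure Databricks: read raw CSVs from an ADLS Gen2 container and land a cleaned copy as a Delta table. The storage account, container, paths, and table names are hypothetical placeholders, not details from the posting.

```python
# Minimal sketch: centralize raw source data into a Delta staging table.
# All names (storage account, container, table) are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("adls-ingest").getOrCreate()

raw_path = "abfss://raw@examplestorage.dfs.core.windows.net/claims/*.csv"

claims = (
    spark.read
    .option("header", True)
    .option("inferSchema", True)
    .csv(raw_path)
)

# Light cleanup before centralizing: normalize column names, drop exact duplicates.
cleaned = (
    claims.toDF(*[c.strip().lower().replace(" ", "_") for c in claims.columns])
    .dropDuplicates()
)

cleaned.write.format("delta").mode("overwrite").saveAsTable("staging.claims")
```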

Posted 2 weeks ago

Apply

10.0 - 15.0 years

9 - 14 Lacs

Bengaluru

Work from Office

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data, and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits, and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibility:
Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary, or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications:
- 10+ years of experience in a similar role, working with Python, data validation, DB2, Snowflake, and Databricks
- Proven track record of designing and deploying high-quality, scalable, and reliable data solutions
- Experience with data validation techniques and methodologies to ensure the accuracy and integrity of data
- Hands-on experience with DB2 for database management and operations
- Experience working with Snowflake for cloud-based data warehousing solutions
- Proficiency in Databricks for big data processing and analytics on a unified data platform
- Proficiency in Python programming with a solid understanding of its ecosystems and frameworks
- Ability to participate in daily stand-up meetings and project planning sessions
- Ability to collaborate with cross-functional teams to understand business requirements and design data solutions
- Ability to write, test, and deploy software solutions
- Ability to conduct data validation to ensure the accuracy and quality of data
- Ability to monitor data quality and implement processes to ensure data integrity
- Ability to perform data analysis and troubleshoot data-related issues
- Ability to provide technical support to team members and resolve technical issues

Preferred Qualification:
- Experience working with large-scale data-driven applications

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
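
Since the posting leans heavily on data validation, here is an illustrative sketch of the kind of check it calls for: simple PySpark helpers for null rates and source-to-target reconciliation. The table names are hypothetical, and real pipelines would log results rather than raise immediately.

```python
# Illustrative data-validation helpers; table names are hypothetical.
from pyspark.sql import SparkSession, DataFrame
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()

def null_rate(df: DataFrame, col: str) -> float:
    """Fraction of rows where `col` is NULL."""
    total = df.count()
    nulls = df.filter(F.col(col).isNull()).count()
    return nulls / total if total else 0.0

def assert_row_count_matches(source: DataFrame, target: DataFrame) -> None:
    """Reconciliation check: source and target row counts must agree."""
    s, t = source.count(), target.count()
    if s != t:
        raise ValueError(f"Row count mismatch: source={s}, target={t}")

members_stg = spark.table("staging.members")  # hypothetical staging table
members = spark.table("dw.members")           # hypothetical warehouse table

assert_row_count_matches(members_stg, members)
if null_rate(members, "member_id") > 0.0:
    raise ValueError("member_id must never be NULL")
```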

Posted 2 weeks ago

Apply

8.0 - 12.0 years

20 - 25 Lacs

Pune

Work from Office

Designation: Big Data Lead/Architect
Location: Pune
Experience: 8-10 years
Notice Period: immediate joiner / 15-30 days notice
Reports To: Product Engineering Head

Job Overview:
We are looking to hire a talented big data engineer to develop and manage our company's Big Data solutions. In this role, you will design and implement Big Data tools and frameworks, implement ELT processes, collaborate with development teams, build cloud platforms, and maintain the production system. To succeed as a big data engineer, you should have in-depth knowledge of Hadoop technologies, excellent project management skills, and high-level problem-solving skills. A top-notch Big Data Engineer understands the needs of the company and institutes scalable data solutions for its current and future needs.

Responsibilities:
- Meeting with managers to determine the company's Big Data needs.
- Developing big data solutions on AWS using Apache Spark, Databricks, Delta Tables, EMR, Athena, Glue, Hadoop, etc.
- Loading disparate data sets and conducting pre-processing using Athena, Glue, Spark, etc.
- Collaborating with the software research and development teams.
- Building cloud platforms for the development of company applications.
- Maintaining production systems.

Requirements:
- 8-10 years of experience as a big data engineer.
- Proficient with Python and PySpark.
- In-depth knowledge of Hadoop, Apache Spark, Databricks, Delta Tables, and AWS data analytics services.
- Extensive experience with Delta Tables and the JSON and Parquet file formats.
- Good to have: experience with AWS data analytics services such as Athena, Glue, Redshift, and EMR.
- Familiarity with data warehousing is a plus.
- Knowledge of NoSQL and RDBMS databases.
- Good communication skills.
- Ability to solve complex data processing and transformation problems.
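
A minimal sketch of the load-and-pre-process step this role describes: pull raw JSON from S3 and write a date-partitioned Delta table. It assumes a Databricks-on-AWS cluster with S3 access already configured; the bucket names, paths, and `event_ts` column are hypothetical.

```python
# Sketch: raw JSON on S3 -> partitioned Delta table. Names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("s3-to-delta").getOrCreate()

events = spark.read.json("s3://example-raw-bucket/events/2025/*/*.json")

# Derive a partition column from the event timestamp.
daily = events.withColumn("event_date", F.to_date("event_ts"))

(daily.write
 .format("delta")
 .mode("append")
 .partitionBy("event_date")
 .save("s3://example-lake-bucket/delta/events"))
```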

Posted 2 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office

Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. You will play a crucial role in shaping the data platform components.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for your immediate team and across multiple teams.
- Lead the implementation of data platform components.
- Ensure data platform scalability and performance.
- Conduct regular data platform audits.
- Stay updated on emerging data platform technologies.

Professional & Technical Skills:
- Must-have: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of cloud-based data platforms.
- Experience with data integration and data modeling.
- Hands-on experience with data pipeline orchestration tools.
- Knowledge of data security and compliance standards.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 2 weeks ago

Apply

0.0 - 2.0 years

5 - 9 Lacs

Mumbai

Work from Office

About NCR Atleos
Title: BI Analyst
Location: Mumbai

Position Summary:
NCR Corporation, a Fortune 500 company, is looking for Business Intelligence Analysts in Mumbai, India. The ideal candidates must have excellent analytical and logical skills and speak fluent English.

Key Areas of Responsibility:
- Provide ongoing analytics, metric optimization, and reports for the business and leadership/management to assist in day-to-day business decisions.
- Resolve any issues that arise within key analytics from a development standpoint; provide support in resolving data integrity issues.
- Develop scorecards and dashboards for KPI tracking, enabling sound business decisions informed by accurate and reliable information.
- Collaborate with cross-functional teams within the organization to obtain and develop accurate data for the business models and metrics, and communicate it with NCR internal/external customers worldwide.
- Analyze data trends, seasonality, and other random effects to quickly identify issues or best practices; assist program subject matter experts with root cause analysis.
- Ensure contractual obligations with customers are met on a work order and systems basis; coordinate activities associated with product/service resolution issues.

Qualifications:
- Able to learn at a rapid pace; desire to consistently go the extra mile.
- Bachelor's degree and 0-2 years of related experience.
- Ability to determine what data trends mean; being able to analyze the data is crucial.
- Good communication (verbal and written), presentation, and organization skills; ability to communicate complex analytical findings clearly and concisely.
- Sound knowledge of Microsoft Office (Access, Excel, PowerPoint) and macros, plus basic SQL; candidates should be able to query or write a script.
- Knowledge of tools such as Fabric, SAP Business Objects, Power BI, and Databricks is a plus.

EEO Statement: NCR Atleos is an equal-opportunity employer. It is NCR Atleos policy to hire, train, promote, and pay associates based on their job-related qualifications, ability, and performance, without regard to race, color, creed, religion, national origin, citizenship status, sex, sexual orientation, gender identity/expression, pregnancy, marital status, age, mental or physical disability, genetic information, medical condition, military or veteran status, or any other factor protected by law.

Statement to Third Party Agencies: NCR Atleos only accepts resumes from agencies on the NCR Atleos preferred supplier list. Please do not forward resumes to our applicant tracking system, NCR Atleos employees, or any NCR Atleos facility. NCR Atleos is not responsible for any fees or charges associated with unsolicited resumes.

Posted 2 weeks ago

Apply

9.0 - 11.0 years

13 - 17 Lacs

Pune

Work from Office

Educational Qualification: Bachelor of Engineering, Bachelor of Technology
Service Line: Enterprise Package Application Services

Responsibilities:
A day in the life of an Infoscion: As part of the Infosys consulting team, your primary role would be to lead the engagement effort of providing high-quality and value-adding consulting solutions to customers at different stages, from problem definition to diagnosis to solution design, development, and deployment. You will review the proposals prepared by consultants, provide guidance, and analyze the solutions defined for the client's business problems to identify any potential risks and issues. You will identify change management requirements and propose a structured approach to the client for managing change using multiple communication mechanisms. You will also coach and create a vision for the team, provide subject matter training for your focus areas, and motivate and inspire team members through effective and timely feedback and recognition for high performance. You will be a key contributor to unit-level and organizational initiatives, providing high-quality, value-adding consulting solutions to customers while adhering to the guidelines and processes of the organization. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Design, develop, and maintain scalable data pipelines on Databricks using PySpark.
- Collaborate with data analysts and scientists to understand data requirements and deliver solutions.
- Optimize and troubleshoot existing data pipelines for performance and reliability.
- Ensure data quality and integrity across various data sources.
- Implement data security and compliance best practices.
- Monitor data pipeline performance and conduct necessary maintenance and updates.
- Document data pipeline processes and technical specifications.

Location of posting: Infosys Ltd. is committed to ensuring you have the best experience throughout your journey with us. We currently have open positions in a number of locations across India - Bangalore, Pune, Hyderabad, Mysore, Kolkata, Chennai, Chandigarh, Trivandrum, Indore, Nagpur, Mangalore, Noida, Bhubaneswar, Coimbatore, Mumbai, Jaipur, Hubli, and Vizag. While we work in accordance with business requirements, we shall strive to offer you the location of your choice, where possible.

Technical and Professional Requirements:
- 9+ years of experience in data engineering.
- Proficiency with Databricks and Apache Spark.
- Strong SQL skills and experience with relational databases.
- Experience with big data technologies (e.g., Hadoop, Kafka).
- Knowledge of data warehousing concepts and ETL processes.
- Experience with CI/CD tools, particularly Jenkins.
- Excellent problem-solving and analytical skills.
- Solid understanding of big data fundamentals and experience with Apache Spark.
- Familiarity with cloud platforms (e.g., AWS, Azure).
- Experience with version control systems (e.g., Bitbucket).
- Understanding of DevOps principles and tools (e.g., CI/CD, Jenkins).
- Databricks certification is a plus.

Preferred Skills: Technology-Big Data-Big Data - ALL; Technology-Cloud Integration-Azure Data Factory (ADF); Technology-Cloud Platform-AWS Data Analytics-AWS Data Exchange

Posted 2 weeks ago

Apply

3.0 - 5.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Educational Qualification: Bachelor of Engineering
Service Line: Data & Analytics Unit

Responsibilities:
- Base SAS certified professional.
- Develop, implement, and optimize analytical models using SAS and SQL.
- Strong knowledge of SAS DI, SAS EG, and SAS BI tools.
- Analyze large datasets to derive actionable insights and support business decision-making.
- Design, implement, and maintain ETL workflows to extract, transform, and load data efficiently.
- Develop advanced SAS programs using SAS Macros for automation and data processing.
- Troubleshoot and optimize SAS code for performance improvements.
- Work on data warehousing projects to enable efficient data storage and retrieval.
- Basic knowledge of Unix scripts.

JL5B (8+ years of experience):
- Advanced SAS certified, with 5+ years of experience and execution of a minimum of two SAS migration projects.
- Strong knowledge of SAS DI, SAS EG, and SAS BI tools.
- SAS migration projects: converting SAS scripts to Python, PySpark, Databricks, ADF, Snowflake, etc. (a simple example of such a translation follows this listing).
- Good knowledge of the SAS Viya 3.4 and 4.0 platforms.
- Good knowledge of SAS SMC, LSF, and other schedulers; basic knowledge of SAS administration and SAS Grid.
- Good knowledge of Unix commands and scripts.
- Handle case studies or complex data scenarios, ensuring data quality and integrity.
- Collaborate with the data engineering team to build and manage robust data pipelines.
- Present findings and insights clearly to both technical and non-technical stakeholders.
- Work closely with teams across departments to gather requirements and deliver solutions.

Technical and Professional Requirements:
- Minimum 5+ years in the analytics domain, with a strong portfolio of relevant projects.
- Proficiency in SAS, SAS Macros, and SQL.
- Hands-on experience in ETL processes and tools.
- Knowledge of data engineering concepts and data warehousing best practices.

Preferred Skills: Technology-Reporting Analytics & Visualization-SAS Enterprise Guide
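
As a hedged sketch of the SAS-to-PySpark translation work this listing mentions: a typical aggregation step rewritten from SAS into PySpark. Both the SAS original (shown as a comment) and the dataset and column names are hypothetical illustrations, not material from the posting.

```python
# Hedged sketch of a typical SAS-to-PySpark migration step.
# Hypothetical SAS original:
#
#   proc summary data=work.sales nway;
#     class region product;
#     var amount;
#     output out=work.sales_sum sum=total_amount mean=avg_amount;
#   run;
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("sas-migration").getOrCreate()

sales = spark.table("work.sales")  # hypothetical staging table

# groupBy + agg reproduces the CLASS/VAR/OUTPUT behaviour of PROC SUMMARY.
sales_sum = (
    sales.groupBy("region", "product")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.avg("amount").alias("avg_amount"),
    )
)
sales_sum.write.mode("overwrite").saveAsTable("work.sales_sum")
```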

Posted 2 weeks ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will play a crucial role in the development and implementation of software solutions.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Develop high-quality software design and architecture.
- Identify, prioritize, and execute tasks in the software development life cycle.
- Conduct software analysis, programming, testing, and debugging.
- Troubleshoot and resolve issues in existing software applications.

Professional & Technical Skills:
- Must-have: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of cloud-based data analytics platforms.
- Experience in developing and deploying scalable applications.
- Knowledge of data modeling and database design.
- Hands-on experience with data integration and ETL processes.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 2 weeks ago

Apply

3.0 - 5.0 years

5 - 9 Lacs

Navi Mumbai

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Collibra Data Quality & Observability
Good-to-have skills: Collibra Data Governance
Minimum 7.5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are functioning optimally. You will also engage in problem-solving activities, providing support and enhancements to existing applications while maintaining a focus on quality and efficiency.

Key Responsibilities:
- Configure and implement Collibra Data Quality (CDQ) rules, workflows, dashboards, and data quality scoring metrics.
- Collaborate with data stewards, data owners, and business analysts to define data quality KPIs and thresholds.
- Develop data profiling and rule-based monitoring using CDQ's native rule engine or integrations (e.g., with Informatica, Talend, or BigQuery).
- Build and maintain data quality dashboards and issue management workflows within Collibra.
- Integrate CDQ with Collibra Data Intelligence Cloud for end-to-end governance visibility.
- Drive root cause analysis and remediation plans for data quality issues.
- Support metadata and lineage enrichment to improve data traceability.
- Document standards, rule logic, and DQ policies in the Collibra Catalog.
- Conduct user training and promote data quality best practices across teams.

Required Skills and Experience:
- 3+ years of experience in data quality, metadata management, or data governance.
- Hands-on experience with the Collibra Data Quality & Observability (CDQ) platform.
- Knowledge of Collibra Data Intelligence Cloud, including Catalog, Glossary, and Workflow Designer.
- Proficiency in SQL and understanding of data profiling techniques.
- Experience integrating CDQ with enterprise data sources (Snowflake, BigQuery, Databricks, etc.).
- Familiarity with data governance frameworks and data quality dimensions (accuracy, completeness, consistency, etc.).
- Excellent analytical, problem-solving, and communication skills.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Collibra Data Quality & Observability.
- This position is based in Mumbai.
- 15 years of full-time education is required.

Posted 2 weeks ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, participate in team meetings, and contribute to the overall success of projects by leveraging your expertise in application development.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for your immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must-have: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration and ETL processes.
- Experience with cloud-based data solutions and architectures.
- Familiarity with programming languages such as Python or Scala.
- Ability to work with data visualization tools to present insights effectively.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 2 weeks ago

Apply

10.0 - 15.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Responsibilities:
- Design, develop, and maintain data pipelines and ETL processes using Databricks.
- Manage and optimize data solutions on cloud platforms such as Azure and AWS.
- Implement big data processing workflows using PySpark.
- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver effective solutions.
- Ensure data quality and integrity through rigorous testing and validation.
- Optimize and tune big data solutions for performance and scalability.
- Stay updated with the latest industry trends and technologies in big data and cloud computing.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Big Data Engineer or in a similar role.
- Strong proficiency in Databricks and cloud platforms (Azure/AWS).
- Expertise in PySpark and big data processing.
- Experience with data modeling, ETL processes, and data warehousing.
- Familiarity with cloud services and infrastructure.
- Excellent problem-solving skills and attention to detail.
- Strong communication and teamwork abilities.

Preferred Qualifications:
- Experience with other big data technologies and frameworks.
- Knowledge of machine learning frameworks and libraries.
- Certification in cloud platforms or big data technologies.
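
On the "optimize and tune" point, here is an illustrative tuning pass of the sort such roles involve: broadcast a small dimension table to avoid shuffling the large fact table, and cache an intermediate that several aggregates reuse. Table and column names are hypothetical.

```python
# Illustrative Spark tuning sketch; table and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("tuning-sketch").getOrCreate()

orders = spark.table("lake.orders")        # large fact table
customers = spark.table("lake.customers")  # small dimension table

# Broadcast the small side so the join avoids a full shuffle of `orders`.
enriched = orders.join(F.broadcast(customers), "customer_id")

# Cache only because the intermediate feeds multiple downstream aggregates.
enriched.cache()

by_region = enriched.groupBy("region").agg(F.sum("amount").alias("revenue"))
by_month = enriched.groupBy(
    F.date_trunc("month", "order_ts").alias("month")
).agg(F.count("*").alias("orders"))

by_region.show()
by_month.show()
```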

Posted 2 weeks ago

Apply

10.0 - 13.0 years

12 - 15 Lacs

Bengaluru

Work from Office

About the Opportunity

Job Type: Application, 31 July 2025
Title: Principal Data Engineer (Associate Director)
Department: ISS
Location: Bangalore
Reports To: Head of Data Platform - ISS
Grade: 7

Department Description:
The ISS Data Engineering Chapter is an engineering group comprised of three sub-chapters - Data Engineers, Data Platform and Data Visualisation - that supports the ISS Department. Fidelity is embarking on several strategic programmes of work that will create a data platform to support the next evolutionary stage of our investment process. These programmes span asset classes and include Portfolio and Risk Management, Fundamental and Quantitative Research, and Trading.

Purpose of your role:
This role sits within the ISS Data Platform Team. The Data Platform team is responsible for building and maintaining the platform that enables the ISS business to operate. This role is appropriate for a Lead Data Engineer capable of taking ownership of, and delivering, a subsection of the wider data platform.

Key Responsibilities:
- Design, develop and maintain scalable data pipelines and architectures to support data ingestion, integration and analytics.
- Be accountable for technical delivery and take ownership of solutions.
- Lead a team of senior and junior developers, providing mentorship and guidance.
- Collaborate with enterprise architects, business analysts and stakeholders to understand data requirements, validate designs and communicate progress.
- Drive technical innovation within the department to increase code reusability, code quality and developer productivity.
- Challenge the status quo by bringing the very latest data engineering practices and techniques.

Essential Skills and Experience

Core Technical Skills:
- Expert in leveraging cloud-based data platform (Snowflake, Databricks) capabilities to create an enterprise lakehouse.
- Advanced expertise with the AWS ecosystem and experience using core AWS data services such as Lambda, EMR, MSK, Glue and S3.
- Experience designing event-based or streaming data architectures using Kafka.
- Advanced expertise in Python and SQL. Open to expertise in Java/Scala, but enterprise experience of Python is required.
- Expert in designing, building and using CI/CD pipelines to deploy infrastructure (Terraform) and pipelines with test automation.
- Data security and performance optimization: experience implementing data access controls to meet regulatory requirements.
- Experience using both RDBMS (Oracle, Postgres, MSSQL) and NoSQL (DynamoDB, OpenSearch, Redis) offerings.
- Experience implementing CDC ingestion.
- Experience using orchestration tools (Airflow, Control-M, etc.).

Bonus Technical Skills:
- Strong experience in containerisation and deploying applications to Kubernetes.
- Strong experience in API development using Python-based frameworks like FastAPI.

Key Soft Skills:
- Problem-solving: leadership experience in problem-solving and technical decision-making.
- Communication: strong in strategic communication and stakeholder engagement.
- Project management: experienced in overseeing project lifecycles, working with Project Managers to manage resources.

Feel rewarded:
For starters, we'll offer you a comprehensive benefits package. We'll value your wellbeing and support your development. And we'll be as flexible as we can about where and when you work, finding a balance that works for all of us. It's all part of our commitment to making you feel motivated by the work you do and happy to be part of our team.
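
On the CDC ingestion skill the listing asks for, here is a hedged sketch of a CDC apply step on Delta Lake: merge a batch of change records (insert/update/delete operations) into a target table with the delta-spark `DeltaTable` API. The paths, key column, and `op` flag convention are hypothetical.

```python
# Hedged CDC-apply sketch using Delta Lake MERGE; all names are hypothetical.
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

spark = SparkSession.builder.appName("cdc-merge").getOrCreate()

# Change feed with an `op` column: 'I' insert, 'U' update, 'D' delete.
changes = spark.read.format("delta").load("s3://example-bucket/cdc/accounts_changes")
target = DeltaTable.forPath(spark, "s3://example-bucket/lake/accounts")

(target.alias("t")
 .merge(changes.alias("c"), "t.account_id = c.account_id")
 .whenMatchedDelete(condition="c.op = 'D'")
 .whenMatchedUpdateAll(condition="c.op = 'U'")
 .whenNotMatchedInsertAll(condition="c.op = 'I'")
 .execute())
```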
For more about our work, our approach to dynamic working and how you could build your future here, visit careers.fidelityinternational.com.

Posted 2 weeks ago

Apply

9.0 - 14.0 years

8 - 12 Lacs

Hyderabad

Work from Office

Responsibilities:
- Develop and maintain a metadata-driven generic ETL framework for automating ETL code.
- Design, build, and optimize ETL/ELT pipelines using Databricks (PySpark/SQL) on AWS.
- Ingest data from a variety of structured and unstructured sources (APIs, RDBMS, flat files, streaming).
- Develop and maintain robust data pipelines for batch and streaming data using Delta Lake and Spark Structured Streaming.
- Implement data quality checks, validations, and logging mechanisms.
- Optimize pipeline performance, cost, and reliability.
- Collaborate with data analysts, BI, and business teams to deliver fit-for-purpose datasets.
- Support data modeling efforts (star and snowflake schemas, denormalized-table approaches) and assist with data warehousing initiatives.
- Work with orchestration tools such as Databricks Workflows to schedule and monitor pipelines.
- Follow best practices for version control, CI/CD, and collaborative development.

Skills:
- Hands-on experience in ETL/data engineering roles.
- Strong expertise in Databricks (PySpark, SQL, Delta Lake); Databricks Data Engineer certification preferred.
- Experience with Spark optimization, partitioning, caching, and handling large-scale datasets.
- Proficiency in SQL and scripting in Python or Scala.
- Solid understanding of data lakehouse/medallion architectures and modern data platforms.
- Experience working with cloud storage systems like AWS S3.
- Familiarity with DevOps practices: Git, CI/CD, Terraform, etc.
- Strong debugging, troubleshooting, and performance-tuning skills.
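
A minimal sketch of the streaming leg described here: Spark Structured Streaming from a landing zone into a Delta table, with a checkpoint for exactly-once progress tracking and a simple validity filter as the data-quality gate. All paths and the schema are hypothetical.

```python
# Streaming sketch: JSON landing zone -> Delta, with a basic quality gate.
# Paths and schema are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("stream-to-delta").getOrCreate()

raw = (
    spark.readStream
    .format("json")
    .schema("event_id STRING, event_ts TIMESTAMP, amount DOUBLE")
    .load("s3://example-bucket/landing/events/")
)

# Data-quality check: drop records missing a key or a timestamp.
valid = raw.filter(F.col("event_id").isNotNull() & F.col("event_ts").isNotNull())

(valid.writeStream
 .format("delta")
 .option("checkpointLocation", "s3://example-bucket/checkpoints/events")
 .outputMode("append")
 .start("s3://example-bucket/delta/events"))
```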

Posted 2 weeks ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: B.E. or B.Tech

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will play a crucial role in developing solutions that align with the organization's needs and goals.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to analyze, design, and develop new features.
- Implement and maintain software applications to meet client needs.
- Troubleshoot and debug applications to enhance performance and user experience.

Professional & Technical Skills:
- Must-have: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of cloud-based data analytics platforms.
- Experience in developing and deploying scalable applications.
- Knowledge of data modeling and database design principles.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- A B.E. or B.Tech degree is required.

Posted 2 weeks ago

Apply

9.0 - 14.0 years

5 - 8 Lacs

Bengaluru

Work from Office

Kafka Data Engineer

We are looking for a Data Engineer to build and manage data pipelines that support batch and streaming data solutions. The role requires expertise in creating seamless data flows across platforms such as a Data Lake/Lakehouse in Cloudera, Azure Databricks, and Kafka, for both batch and stream data pipelines.

Responsibilities:
- Develop, test, and maintain data pipelines (batch and stream) using Cloudera, Spark, Kafka, and Azure services such as ADF, Cosmos DB, Databricks, and NoSQL databases (e.g., MongoDB).
- Apply strong programming skills in Spark, Python or Scala, and SQL.
- Optimize data pipelines to improve speed, performance, and reliability, ensuring that data is available to data consumers as required.
- Create ETL pipelines for downstream consumers by transforming data as per business logic.
- Work closely with Data Architects and Data Analysts to align data solutions with business needs and ensure the accuracy and accessibility of data.
- Implement data validation checks and error-handling processes to maintain high data quality and consistency across data pipelines.
- Bring strong analytical and problem-solving skills, with a focus on optimizing data flows and addressing impacts in the data pipeline.

Qualifications:
- 8+ years of IT experience, with at least 5+ years in data engineering and cloud-based data platforms.
- Strong experience with Cloudera (or any data lake), Confluent/Apache Kafka, and Azure Data Services (ADF, Databricks, Cosmos DB).
- Deep knowledge of NoSQL databases (Cosmos DB, MongoDB) and data modeling for performance and scalability.
- Proven expertise in designing and implementing batch and streaming data pipelines using Databricks, Spark, or Kafka.
- Experience creating scalable, reliable, and high-performance data solutions with robust data governance policies.
- Strong collaboration skills to work with stakeholders, mentor junior data engineers, and translate business needs into actionable solutions.
- Bachelor's or master's degree in computer science, IT, or a related field.
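
A hedged sketch of the kind of stream pipeline this role describes: read a Kafka topic with Spark Structured Streaming, parse JSON payloads, and write to Delta. It assumes the spark-sql-kafka connector is available on the cluster; the broker address, topic, schema, and paths are hypothetical.

```python
# Kafka -> Delta streaming sketch; broker, topic, and schema are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kafka-stream").getOrCreate()

schema = "order_id STRING, customer_id STRING, amount DOUBLE, ts TIMESTAMP"

orders = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")
    .option("subscribe", "orders")
    .load()
    # Kafka delivers bytes; cast the value and parse the JSON payload.
    .select(F.from_json(F.col("value").cast("string"), schema).alias("o"))
    .select("o.*")
)

(orders.writeStream
 .format("delta")
 .option("checkpointLocation", "/mnt/checkpoints/orders")
 .start("/mnt/delta/orders"))
```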

Posted 2 weeks ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Pune

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: B.E. or B.Tech

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will work on developing solutions to enhance business operations and streamline processes.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to analyze business requirements and translate them into technical solutions.
- Develop and implement software solutions to meet business needs.
- Conduct code reviews and ensure code quality and best practices are followed.
- Troubleshoot and debug applications to optimize performance and enhance user experience.
- Stay updated with the latest technologies and trends in application development.

Professional & Technical Skills:
- Must-have: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data processing and analytics.
- Experience with cloud-based data platforms like AWS or Azure.
- Knowledge of programming languages such as Python, Java, or Scala.
- Hands-on experience in building and deploying applications using the Databricks platform.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Pune office.
- A B.E. or B.Tech degree is required.

Posted 2 weeks ago

Apply

8.0 - 13.0 years

5 - 9 Lacs

Hyderabad

Work from Office

1. Data Engineer - Azure Data Services
2. Data Modelling - NoSQL and SQL
3. Good understanding of Spark and Spark Streaming
4. Hands-on with Python / Pandas / Data Factory / Cosmos DB / Databricks / Event Hubs / Stream Analytics
5. Knowledge of medallion architecture, data vaults, data marts, etc.
6. Preferably Azure Data Associate exam certified.
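
On point 5, here is an illustrative bronze-to-silver step of a medallion architecture: take raw (bronze) records as ingested and publish a cleaned, conformed silver table. Table and column names are hypothetical.

```python
# Medallion sketch: bronze (raw) -> silver (cleaned); names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("medallion").getOrCreate()

bronze = spark.table("bronze.device_readings")

silver = (
    bronze
    .dropDuplicates(["device_id", "reading_ts"])          # idempotent re-runs
    .filter(F.col("reading_value").isNotNull())           # basic quality gate
    .withColumn("reading_date", F.to_date("reading_ts"))  # conformed column
)

silver.write.format("delta").mode("overwrite").saveAsTable("silver.device_readings")
```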

Posted 2 weeks ago

Apply

7.0 - 9.0 years

8 - 12 Lacs

Bengaluru

Work from Office

Educational Qualification: Bachelor of Engineering
Service Line: Data & Analytics Unit

Responsibilities:
- Preferred proficiency with statistical analysis tools (e.g., SAS, SPSS, R, and Python).
- Hands-on experience in statistical techniques and deep learning, such as decision trees, segmentation, logistic and multiple regression, and others.
- Experience with forecasting, campaign management, and customer segmentation is desirable.
- Strong Excel, Access, and PowerPoint skills.
- Familiarity with database query tools (SQL).
- Basic understanding of data warehouse architecture.
- Understanding of various analytical tools, platforms, and frameworks.

Technical and Professional Requirements:
- Data science and machine learning.
- Good to have: PySpark/Databricks.

Preferred Skills: Technology-Data Science-Machine Learning

Posted 2 weeks ago

Apply

12.0 - 17.0 years

13 - 18 Lacs

Hyderabad

Work from Office

1. Data Engineer - Azure Data Services
2. Data Modelling - NoSQL and SQL
3. Good understanding of Spark and Spark Streaming
4. Hands-on with Python / Pandas / Data Factory / Cosmos DB / Databricks / Event Hubs / Stream Analytics
5. Knowledge of medallion architecture, data vaults, data marts, etc.
6. Preferably Azure Data Associate exam certified.

Posted 2 weeks ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Kolkata

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop innovative solutions and ensure seamless application functionality.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Develop and implement software solutions to meet business requirements.
- Collaborate with cross-functional teams to analyze and address technical issues.
- Conduct code reviews and provide feedback to enhance application performance.
- Stay updated on emerging technologies and trends in application development.
- Assist in troubleshooting and resolving application-related issues.

Professional & Technical Skills:
- Must-have: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data analytics and data processing techniques.
- Experience with cloud-based data platforms like AWS or Azure.
- Knowledge of programming languages such as Python, Java, or Scala.
- Hands-on experience in developing and deploying applications using the Databricks platform.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Kolkata office.
- 15 years of full-time education is required.

Posted 2 weeks ago

Apply

3.0 - 5.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Educational Qualification: Bachelor of Engineering
Service Line: Data & Analytics Unit

Responsibilities:
- Responsible for successful delivery of MLOps solutions and services in client consulting environments.
- Define key business problems to be solved; formulate high-level solution approaches and identify data to solve those problems; develop, analyze, draw conclusions, and present to the client.
- Assist clients with operationalization metrics to track performance of ML models.
- Agile trained to manage team effort and track through JIRA.
- High-impact communication: assess the target audience's needs, prepare and practice a logical flow, answer audience questions appropriately, and stick to the timeline.

Additional Responsibilities:
- Master's degree in Computer Science/Engineering, with relevant experience in the MLOps/cloud field.
- Domain experience in capital markets, banking, risk and compliance, etc.; exposure to US/overseas markets is preferred.
- Azure certified: DP-100, AZ-900/AI-900.
- Domain/technical/tools knowledge: object-oriented programming, coding standards, architecture and design patterns, config management, package management, logging, documentation.
- Experience in test-driven development and in using Pytest frameworks, Git version control, and REST APIs.
- Azure ML best practices in environment management, runtime configurations (Azure ML and Databricks clusters), and alerts.
- Experience designing and implementing ML systems and pipelines, and MLOps practices.
- Exposure to event-driven orchestration and online model deployment.
- Contribute towards establishing best practices in MLOps systems development.
- Proficiency with data analysis tools (e.g., SQL, R, and Python).
- High-level understanding of database concepts/reporting and data science concepts.
- Hands-on experience working with client IT/business teams to gather business requirements and convert them into requirements for the development team.
- Experience in managing client relationships and developing business cases for opportunities.
- Azure AZ-900 certification with an understanding of Azure architecture is a plus.

Technical and Professional Requirements:
- Expertise in cloud technologies, specifically MS Azure and its services, with hands-on coding.
- Python programming: expert and experienced (4-5 years).
- DevOps: working knowledge with implementation experience on a minimum of 1-2 projects.
- Hands-on MS Azure cloud knowledge.
- Understand and take requirements on operationalization of ML models from data scientists.
- Help the team with ML pipelines from creation to execution.
- List Azure services required for deployment, including Azure Databricks and Azure DevOps setup.
- Guide the team on coding standards (e.g., flake8) and help debug pipeline failures.
- Engage with business stakeholders with status updates on the progress of development and issue fixes.
- Drive automation, technology, and process improvement for the deployed projects.
- Set up standards related to coding, pipelines, and documentation.
- Adhere to KPIs/SLAs for pipeline runs and execution.
- Research new topics, services, and enhancements in cloud technologies.

Preferred Skills: Technology-Machine Learning-Python
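
As a hedged sketch of the model-operationalization loop this role supports: train a small model and record its parameters, metrics, and artifact with MLflow, which is available on Azure Databricks clusters. The data here is synthetic and the run name is a hypothetical placeholder; a real pipeline would train on client data and register the model for deployment.

```python
# MLOps tracking sketch with MLflow; data is synthetic, names hypothetical.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1_000, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run(run_name="baseline-logreg"):
    model = LogisticRegression(max_iter=500)
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))

    # Record everything needed to reproduce and compare this run.
    mlflow.log_param("max_iter", 500)
    mlflow.log_metric("accuracy", acc)
    mlflow.sklearn.log_model(model, "model")
```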

Posted 2 weeks ago

Apply

7.0 - 12.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Responsibilities:
- Design, develop, and maintain data pipelines and ETL processes using Databricks.
- Manage and optimize data solutions on cloud platforms such as Azure and AWS.
- Implement big data processing workflows using PySpark.
- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver effective solutions.
- Ensure data quality and integrity through rigorous testing and validation.
- Optimize and tune big data solutions for performance and scalability.
- Stay updated with the latest industry trends and technologies in big data and cloud computing.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Big Data Engineer or in a similar role.
- Strong proficiency in Databricks and cloud platforms (Azure/AWS).
- Expertise in PySpark and big data processing.
- Experience with data modeling, ETL processes, and data warehousing.
- Familiarity with cloud services and infrastructure.
- Excellent problem-solving skills and attention to detail.
- Strong communication and teamwork abilities.

Preferred Qualifications:
- Experience with other big data technologies and frameworks.
- Knowledge of machine learning frameworks and libraries.

Posted 2 weeks ago

Apply

7.0 - 12.0 years

4 - 8 Lacs

Pune

Work from Office

1. ETL: Hands-on experience building data pipelines. Proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend, and Informatica.
2. Big Data: Experience with big data platforms such as Hadoop, Hive, or Snowflake for data storage and processing.
3. Data Warehousing & Database Management: Understanding of data warehousing concepts and of relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design.
4. Data Modeling & Design: Good exposure to data modeling techniques; design, optimization, and maintenance of data models and data structures.
5. Languages: Proficient in one or more programming languages commonly used in data engineering, such as Python, Java, or Scala.
6. DevOps: Exposure to concepts and enablers - CI/CD platforms, version control, automated quality control management.

Ab Initio: Experience developing CoOp graphs and the ability to tune them for performance. Demonstrable knowledge across the full suite of Ab Initio toolsets, e.g., GDE, ExpressIT, Data Profiler, ConductIT, ControlCenter, and ContinuousFlows.
Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc. Demonstrable understanding of the underlying architectures and trade-offs.
Data Quality & Controls: Exposure to data validation, cleansing, enrichment, and data controls.
Containerization: Fair understanding of containerization platforms like Docker and Kubernetes.
File Formats: Exposure to working with event/file/table formats such as Avro, Parquet, Protobuf, Iceberg, and Delta.
Others: Basics of job schedulers like Autosys; basics of entitlement management. Certification in any of the above topics would be an advantage.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

10 - 14 Lacs

Ahmedabad

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: Microsoft Azure Analytics Services
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: BE

Summary: As an Application Lead for Packaged Application Development, you will be responsible for designing, building, and configuring applications using Microsoft Azure Analytics Services. Your typical day will involve leading the effort to deliver high-quality applications, acting as the primary point of contact for the project team, and ensuring timely delivery of project milestones.

Roles & Responsibilities:
- Lead the effort to design, build, and configure applications using Microsoft Azure Analytics Services.
- Act as the primary point of contact for the project team, ensuring timely delivery of project milestones.
- Collaborate with cross-functional teams to ensure the successful delivery of high-quality applications.
- Provide technical guidance and mentorship to team members, ensuring adherence to best practices and standards.

Professional & Technical Skills:
- Must-have: Strong experience with Microsoft Azure Analytics Services, plus Databricks and PySpark skills.
- Good to have: Experience with other Azure services such as Azure Data Factory, Azure Databricks, and Azure Synapse Analytics.
- Experience in designing, building, and configuring applications using Microsoft Azure Analytics Services.
- Strong understanding of data warehousing concepts and best practices.
- Experience with ETL processes and tools such as SSIS or Azure Data Factory.
- Experience with SQL and NoSQL databases.
- Experience with Agile development methodologies.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Microsoft Azure Analytics Services.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering high-quality applications.
- This position is based at our Bengaluru office.

Posted 2 weeks ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Pune

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: Databricks Unified Data Analytics Platform, Engineering Expertise, Data Lakehouse Development
Good-to-have skills: Google BigQuery
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application development aligns with business objectives, overseeing project timelines, and facilitating communication among stakeholders to drive project success. You will also engage in problem-solving activities, providing guidance and support to your team while ensuring that best practices are followed throughout the development process. Your role will be pivotal in shaping the direction of application projects and ensuring that they meet the needs of the organization and its clients.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for your immediate team and across multiple teams.
- Facilitate training and development opportunities for team members to enhance their skills.
- Monitor project progress and implement necessary adjustments to ensure timely delivery.

Professional & Technical Skills:
- Must-have: Proficiency in Databricks Unified Data Analytics Platform.
- Good to have: Experience with Google BigQuery.
- Strong understanding of data integration and ETL processes.
- Experience with cloud-based data solutions and architecture.
- Familiarity with data governance and compliance standards.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Databricks Unified Data Analytics Platform.
- This position is based in Pune.
- 15 years of full-time education is required.

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
