7.0 - 12.0 years
20 - 27 Lacs
Bengaluru
Work from Office
TECHNICAL SKILLS AND EXPERIENCE
Most important:
- 7+ years of professional experience as a data engineer, with at least 4 of those years using cloud technologies.
- Proven experience building ETL or ELT data pipelines with Databricks on either Azure or AWS using PySpark (a brief sketch follows below).
- Strong experience with the Microsoft Azure data stack (Databricks, Data Lake Storage Gen2, ADF, etc.).
- Strong SQL skills and proficiency in Python, adhering to standards such as PEP 8.
- Proven experience with unit testing and appropriate testing methodologies, using libraries such as Pytest, Great Expectations, or similar.
- Demonstrable experience with CI/CD, including release and test automation tools and processes such as Azure DevOps, Terraform, and PowerShell or Bash scripting, or similar.
- Strong understanding of data modeling, data warehousing, and OLAP concepts.
- Excellent technical documentation skills.
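To illustrate the pipeline-plus-testing combination this listing asks for, here is a minimal sketch of a PySpark transform paired with a Pytest-style unit test. The function, table, and column names (clean_orders, order_id, amount) are hypothetical, not taken from the posting.

```python
# A minimal sketch, assuming a simple orders dataset with hypothetical columns.
from pyspark.sql import SparkSession, functions as F

def clean_orders(df):
    """Drop malformed rows, standardise the amount type, and de-duplicate."""
    return (
        df.dropna(subset=["order_id", "amount"])
          .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
          .dropDuplicates(["order_id"])
    )

# Pytest-style unit test running against a local Spark session.
def test_clean_orders_removes_duplicates_and_nulls():
    spark = SparkSession.builder.master("local[1]").getOrCreate()
    df = spark.createDataFrame(
        [("o1", "10.5"), ("o1", "10.5"), ("o2", None)],
        ["order_id", "amount"],
    )
    result = clean_orders(df)
    assert result.count() == 1  # duplicate collapsed, null amount dropped
```

The same transform function can be reused unchanged inside a Databricks job, which is what makes this style of test cheap to run in CI.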
Posted 6 days ago
8.0 - 10.0 years
11 - 15 Lacs
Bengaluru
Work from Office
Databricks Architect
Should have a minimum of 10+ years of experience.
Must-have skills: Databricks, Delta Lake, PySpark or Scala Spark, Unity Catalog.
Good-to-have skills: Azure and/or AWS cloud.
Hands-on exposure in:
- Strong experience using Databricks as a lakehouse solution
- Establishing the Databricks Lakehouse architecture
- Ingesting and transforming batch and streaming data on the Databricks Lakehouse Platform (a brief sketch follows below)
- Orchestrating diverse workloads for the full lifecycle, including Delta Live Tables, PySpark, etc.
Mandatory Skills: Databricks - Data Engineering. Experience: 8-10 Years.
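For readers unfamiliar with Delta Live Tables, here is a minimal sketch of a bronze-to-silver DLT pipeline of the kind described. The table names, landing path, and ingest_ts column are assumptions; DLT code runs inside a Databricks pipeline, where the spark session is provided implicitly.

```python
# A minimal DLT sketch; paths and column names are hypothetical.
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw events ingested as a streaming bronze table")
def bronze_events():
    return (
        spark.readStream.format("cloudFiles")        # Auto Loader; `spark` is provided by DLT
             .option("cloudFiles.format", "json")
             .load("/mnt/landing/events")            # hypothetical landing path
    )

@dlt.table(comment="Cleaned silver table derived from bronze")
@dlt.expect_or_drop("valid_id", "event_id IS NOT NULL")  # data-quality expectation
def silver_events():
    return dlt.read_stream("bronze_events").withColumn(
        "ingest_date", F.to_date("ingest_ts")        # assumes an ingest_ts column
    )
```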
Posted 6 days ago
10.0 - 14.0 years
30 - 37 Lacs
Noida
Hybrid
Required Qualifications:
- Undergraduate degree or equivalent experience
- 5+ years of work experience with Big Data skills
- 5+ years of experience managing a team
- 5+ years of work experience in people management
- 3+ years of work experience with Azure Cloud skills
- Experience with or knowledge of Azure Cloud, Databricks, Terraform, CI/CD, Spark, Scala, Java, HBase, Hive, Sqoop, GitHub, Jenkins, Elasticsearch, Grafana, UNIX, SQL, OpenShift, Kubernetes, Oozie, etc.
- Solid technical knowledge of and work experience with Big Data and Azure Cloud
Primary Responsibilities:
- Design and develop large-scale data processing systems, using expertise in big data technologies to ensure that systems are efficient, scalable, and secure
- Ensure that developed systems run smoothly; monitor system performance, diagnose and troubleshoot issues, and make changes to optimize performance
- Process, clean, and integrate large data sets from various sources to ensure that data is accurate, complete, and consistent (a brief sketch follows after this listing)
- Work closely with cross-functional teams, including data scientists, analysts, and business stakeholders, to ensure the systems developed meet the organization's requirements and support its goals
- Collaborate closely with senior stakeholders to understand business requirements and translate them into technical requirements for the development team
- Plan and document comprehensive technical specifications for features or system design, ensuring a clear roadmap for development and implementation
- Design, build, and configure applications to meet business process and application requirements, leveraging technical expertise and problem-solving skills
- Direct the development team in all aspects of the software development life cycle, including design, development, coding, testing, and debugging, to deliver high-quality solutions
- Write testable, scalable, and efficient code, leading by example and setting coding standards for the team
- Conduct code reviews and provide constructive feedback to ensure code quality and adherence to best practices
- Mentor and guide junior team members, fostering their professional growth and encouraging the adoption of industry best practices
- Ensure software quality standards are met by enforcing code standards, conducting rigorous testing, and implementing continuous improvement processes
- Stay updated with the latest technologies and industry trends, continuously enhancing technical skills and driving innovation within the development team
- Set and communicate team priorities that support the broader organization's goals; align strategy, processes, and decision-making across teams
- Set clear expectations with individuals based on their level and role, aligned to the broader organization's goals; meet regularly with individuals to discuss performance and development and provide feedback and coaching
- Develop the long-term technical vision and roadmap within, and often beyond, the scope of your teams; evolve the roadmap to meet anticipated future requirements and infrastructure needs
- Identify, navigate, and overcome technical and organizational barriers that may stand in the way of delivery
- Constantly improve the processes and practices around development and delivery
- Always think customer first, striving to outperform their expectations
- Work effectively with Product Managers, Program Managers, and other stakeholders to ensure the customer benefits from the work
- Foster and facilitate Agile methodologies globally and work in an agile environment using Scrum or Kanban
- Work with Program Managers/leads to consume the product backlog and generate technical designs
- Lead by example on the design and development of platform features
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary, or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.
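The processing, cleaning, and integration responsibility above can be pictured as a small PySpark step. The sketch below uses hypothetical table and column names purely for illustration.

```python
# A minimal clean-and-integrate sketch; table and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

customers = spark.table("raw.customers")
orders = spark.table("raw.orders")

integrated = (
    customers.dropDuplicates(["customer_id"])            # remove duplicate customers
             .join(orders, "customer_id", "left")        # integrate the two sources
             .withColumn("order_amount",                 # make missing amounts explicit
                         F.coalesce(F.col("order_amount"), F.lit(0.0)))
)

integrated.write.mode("overwrite").saveAsTable("curated.customer_orders")
```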
Posted 6 days ago
0.0 - 5.0 years
0 Lacs
Pune
Remote
The candidate must be proficient in Python and its libraries and frameworks, and comfortable with data modeling, PySpark, MySQL concepts, Power BI, and AWS/Azure concepts. Experience in optimizing large transactional databases, data visualization tools, Databricks, and FastAPI.
Posted 6 days ago
7.0 - 12.0 years
16 - 31 Lacs
Pune, Delhi / NCR, Mumbai (All Areas)
Hybrid
Job Title: Lead Data Engineer
Job Summary
The Lead Data Engineer will provide technical expertise in the analysis, design, development, rollout, and maintenance of data integration initiatives. This role contributes to implementation methodologies and best practices, and works on project teams to analyse, design, develop, and deploy business intelligence / data integration solutions supporting a variety of customer needs. The position oversees a team of Data Integration Consultants at various levels, ensuring their success on projects, goals, trainings, and initiatives through mentoring and coaching. It provides technical expertise in needs identification, data modelling, data movement, and transformation mapping (source to target; a small illustration follows at the end of this listing), automation, and testing strategies, translating business needs into technical solutions that adhere to established data guidelines and approaches from a business unit or project perspective, whilst leveraging best-fit technologies (e.g., cloud, Hadoop, NoSQL) to address business and environmental challenges. Works with stakeholders to identify and define self-service analytic solutions, dashboards, actionable enterprise business intelligence reports, and business intelligence best practices. Responsible for repeatable, lean, and maintainable enterprise BI design across organizations. Effectively partners with the client team. We expect leadership not only in the conventional sense but also within the team: candidates should exhibit qualities such as innovation, critical thinking, optimism/positivity, communication, time management, collaboration, problem-solving, acting independently, knowledge sharing, and approachability.
Responsibilities:
- Design, develop, test, and deploy data integration processes (batch or real-time) using tools such as Microsoft SSIS, Azure Data Factory, Databricks, Matillion, Airflow, Sqoop, etc.
- Create functional and technical documentation, e.g. ETL architecture documentation, unit testing plans and results, data integration specifications, data testing plans, etc.
- Take a consultative approach with business users, asking questions to understand the business need and deriving the data flow and the conceptual, logical, and physical data models from those needs.
- Perform data analysis to validate data models and confirm the ability to meet business needs.
- May serve as project or DI lead, overseeing multiple consultants from various competencies.
- Stay current with emerging and changing technologies to recommend and implement beneficial technologies and approaches for data integration.
- Ensure proper execution/creation of methodology, training, templates, resource plans, and engagement review processes.
- Coach team members to ensure understanding of projects and tasks, providing effective feedback (critical and positive) and promoting growth opportunities when appropriate.
- Coordinate and consult with the project manager, client business staff, client technical staff, and project developers on data architecture best practices and anything else data-related at the project or business unit level.
- Architect, design, develop, and set direction for enterprise self-service analytic solutions, business intelligence reports, visualisations, and best-practice standards. Toolsets include, but are not limited to, SQL Server Analysis and Reporting Services, Microsoft Power BI, Tableau, and Qlik.
- Work with the report team to identify, design, and implement a reporting user experience that is consistent and intuitive across environments and report methods, defines security, and meets usability and scalability best practices.
Required Qualifications:
- 10 years of industry implementation experience with data integration tools such as AWS services (Redshift, Athena, Lambda, Glue, S3), ETL, etc.
- 5-8 years of management experience required
- 5-8 years of consulting experience preferred
- Minimum of 5 years of data architecture, data modelling, or similar experience
- Bachelor's degree or equivalent experience; Master's degree preferred
- Strong data warehousing, OLTP systems, data integration, and SDLC experience
- Strong experience in orchestration, with working experience in cloud-native / third-party ETL data load orchestration (e.g. Data Factory, HDInsight, Data Pipeline, Cloud Composer, or similar)
- Understanding of and experience with major data architecture philosophies (Dimensional, ODS, Data Vault, etc.)
- Understanding of modern data warehouse capabilities and technologies such as real-time, cloud, and Big Data
- Understanding of on-premises and cloud infrastructure architectures (e.g. Azure, AWS, GCP)
- Strong experience in the Agile process (Scrum cadences, roles, deliverables), working experience in Azure DevOps, JIRA, or similar, and experience in CI/CD using one or more code management platforms
- Strong Databricks experience required, including creating notebooks in PySpark
- Experience using major data modelling tools (examples: ERwin, ER/Studio, PowerDesigner, etc.)
- Experience with major database platforms (e.g. SQL Server, Oracle, Azure Data Lake, Hadoop, Azure Synapse/SQL Data Warehouse, Snowflake, Redshift, etc.)
- 3-5 years of development experience in decision support / business intelligence environments utilizing tools such as SQL Server Analysis and Reporting Services, Microsoft Power BI, Tableau, Looker, etc.
Preferred Skills & Experience:
- Knowledge of and working experience with data integration processes, such as data warehousing, EAI, etc.
- Experience in providing estimates for data integration projects, including testing, documentation, and implementation
- Ability to analyse business requirements as they relate to data movement and transformation processes; research, evaluation, and recommendation of alternative solutions
- Ability to provide technical direction to other team members, including contractors and employees
- Ability to contribute to conceptual data modelling sessions to accurately define business processes, independently of data structures, and then combine the two
- Proven experience leading team members, directly or indirectly, in completing high-quality major deliverables with superior results
- Demonstrated ability to serve as a trusted advisor who builds influence with client management beyond simply EDM
- Can create documentation and presentations such that they "stand on their own"
- Can advise sales on the evaluation of data integration efforts for new or existing client work
- Can contribute to internal/external data integration proofs of concept
- Demonstrates the ability to create new and innovative solutions to problems that have not been encountered before
- Ability to work independently on projects as well as collaborate effectively across teams
- Must excel in a fast-paced, agile environment where critical thinking and strong problem-solving skills are required for success
- Strong team-building, interpersonal, analytical, and problem identification and resolution skills
- Experience working with multi-level business communities
- Can effectively utilise SQL and/or an available BI tool to validate and elaborate business rules
- Demonstrates an understanding of EDM architectures and applies this knowledge in collaborating with the team to design effective solutions to business problems/issues
- Effectively influences and, at times, oversees business and data analysis activities to ensure sufficient understanding and quality of data
- Demonstrates a complete understanding of, and utilises, DSC methodology documents to efficiently complete assigned roles and associated tasks
- Deals effectively with all team members and builds strong working relationships/rapport with them
- Understands and leverages a multi-layer semantic model to ensure scalability, durability, and supportability of the analytic solution
- Understands modern data warehouse concepts (real-time, cloud, Big Data) and how to enable such capabilities from a reporting and analytics standpoint
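The source-to-target transformation mapping mentioned in the summary above can be made concrete with a small sketch: a mapping specification driving a PySpark transform. All names here (the staging table, columns, and target table) are hypothetical.

```python
# A mapping-driven transform sketch; the spec and all names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# target_column -> SQL expression over the source schema (the "source to target" spec)
mapping = {
    "customer_key": "CAST(cust_id AS BIGINT)",
    "full_name":    "CONCAT(first_nm, ' ', last_nm)",
    "load_date":    "CURRENT_DATE()",
}

source = spark.table("staging.customers")
target = source.selectExpr(*[f"{expr} AS {col}" for col, expr in mapping.items()])
target.write.mode("append").saveAsTable("dw.dim_customer")
```

Keeping the mapping as data rather than code makes the same loader reusable across many source-to-target documents.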
Posted 1 week ago
4.0 - 9.0 years
10 - 20 Lacs
Kolkata, Pune, Bengaluru
Work from Office
About Client: Hiring for one of the most prestigious multinational corporations!
Job Description
Job Title: Azure Data Engineer
Qualification: Any Graduate or above
Relevant Experience: 4 to 10 yrs
Must-Have Skills: Azure, ADB, PySpark
Roles and Responsibilities:
- Strong experience in the implementation and management of a lakehouse using Databricks and the Azure tech stack (ADLS Gen2, ADF, Azure SQL).
- Strong hands-on expertise with SQL, Python, Apache Spark, and Delta Lake (an upsert sketch follows below).
- Proficiency in data integration techniques, ETL processes, and data pipeline architectures.
- Demonstrable experience using Git and building CI/CD pipelines for code management.
- Develop and maintain technical documentation for the platform.
- Ensure the platform is developed with software engineering, data analytics, and data security practices in mind.
- Develop and optimize data processing and data storage systems, ensuring high performance, reliability, and security.
- Experience working in Agile methodology and well versed in using ADO Boards for sprint deliveries.
- Excellent communication skills; able to communicate technical and business concepts clearly, both verbally and in writing.
- Ability to work in a team environment and collaborate effectively at all levels by sharing ideas and knowledge.
Location: Kolkata, Pune, Mumbai, Bangalore, BBSR
Notice period: Immediate / 90 days
Shift Timing: General Shift
Mode of Interview: Virtual
Mode of Work: WFO
Thanks & Regards
Bhavana B
Black and White Business Solutions Pvt. Ltd.
Bangalore, Karnataka, India
Direct Number: 8067432454
bhavana.b@blackwhite.in | www.blackwhite.in
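A common Delta Lake task in this kind of role is an idempotent upsert from a landing zone into a curated table. Below is a minimal sketch using the delta-spark MERGE API; the path, table, and key column are assumptions.

```python
# An idempotent Delta MERGE sketch; path, table, and key names are hypothetical.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Incoming batch landed in ADLS (hypothetical path)
updates = spark.read.format("parquet").load("/mnt/adls/bronze/customers")

target = DeltaTable.forName(spark, "silver.customers")
(
    target.alias("t")
          .merge(updates.alias("s"), "t.customer_id = s.customer_id")
          .whenMatchedUpdateAll()      # refresh existing rows
          .whenNotMatchedInsertAll()   # insert new rows
          .execute()
)
```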
Posted 1 week ago
12.0 - 16.0 years
1 - 1 Lacs
Hyderabad
Remote
We're Hiring: Azure Data Factory (ADF) Developer - Hyderabad
Location: Onsite at Canopy One Office, Hyderabad / Remote
Type: Full-time / Part-time / Contract | Offshore role | Must be available to work in Eastern Time Zone (EST)
We're looking for an experienced ADF Developer to join our offshore team supporting a major client. This role focuses on building robust data pipelines using Azure Data Factory (ADF) and working closely with client stakeholders on transformation logic and data movement.
Key Responsibilities
- Design, build, and manage ADF data pipelines
- Implement transformations and aggregations based on the mappings provided (a sketch follows below)
- Work with data from the bronze (staging) area, pre-loaded via Boomi
- Collaborate with client-side data managers (based in EST) to deliver clean, reliable datasets
Requirements
- Proven hands-on experience with Azure Data Factory
- Strong understanding of ETL workflows and data transformation
- Familiarity with data staging / bronze-layer concepts
- Willingness to work Eastern Time Zone (EST) hours
Preferred Qualifications
- Knowledge of Kimball Data Warehousing (a huge advantage!)
- Experience working in an offshore coordination model
- Exposure to Boomi is a plus
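ADF pipelines themselves are authored as JSON definitions or in the visual designer, so instead of inventing pipeline JSON, the sketch below shows in PySpark the shape of the aggregation step such a pipeline would orchestrate over a bronze table. The table, column names, and grouping are hypothetical.

```python
# A bronze-to-silver aggregation sketch; all names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

bronze = spark.table("bronze.sales_staging")   # pre-loaded upstream (via Boomi here)

daily_sales = (
    bronze.withColumn("sale_date", F.to_date("sale_ts"))
          .groupBy("sale_date", "store_id")
          .agg(F.sum("amount").alias("total_amount"),
               F.countDistinct("order_id").alias("order_count"))
)

daily_sales.write.mode("overwrite").saveAsTable("silver.daily_sales")
```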
Posted 1 week ago
5.0 - 7.0 years
18 - 25 Lacs
Hyderabad
Work from Office
Databricks, Azure, BigQuery (need to be good with SQL), Python; familiarity with data science concepts or implementations.
Posted 1 week ago
3.0 - 8.0 years
6 - 14 Lacs
Ahmedabad
Work from Office
Role & responsibilities
- Develop modern data warehouse solutions using Databricks and the AWS/Azure stack
- Provide forward-thinking solutions in the data engineering and analytics space
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements
- Triage issues to find gaps in existing pipelines and fix them
- Work with the business to understand reporting-layer needs and develop data models to fulfil them
- Help junior team members resolve issues and technical challenges
- Drive technical discussions with the client architect and team members
- Orchestrate the data pipelines in a scheduler via Airflow (see the sketch after this list)
Preferred candidate profile
- Bachelor's and/or master's degree in computer science, or equivalent experience
- Deep understanding of star and snowflake dimensional modelling
- Strong knowledge of data management principles
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture
- Hands-on experience in SQL, Python, and Spark (PySpark)
- Must have experience in the AWS/Azure stack
- Desirable to have ETL with batch and streaming (Kinesis)
- Experience in building ETL / data warehouse transformation processes
- Experience with Apache Kafka for streaming / event-based data
- Experience with other open-source big data products, e.g. Hadoop (incl. Hive, Pig, Impala)
- Experience with open-source non-relational / NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j)
- Experience working with structured and unstructured data, including imaging and geospatial data
- Experience working in a DevOps environment with tools such as Terraform, CircleCI, Git
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
- Databricks Certified Data Engineer Associate/Professional certification (desirable)
- Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects
- Experience working in Agile methodology
- Strong verbal and written communication skills
- Strong analytical and problem-solving skills with high attention to detail
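As a minimal sketch of the Airflow orchestration mentioned above, here is a two-task DAG (Airflow 2.4+ `schedule` argument). The DAG id, schedule, and the stand-in callables are assumptions; in practice the tasks would trigger Databricks jobs or Spark transforms.

```python
# A minimal Airflow DAG sketch; dag_id, schedule, and callables are hypothetical.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def run_step(step: str):
    print(f"would trigger the {step} job here (e.g. a Databricks run)")

with DAG(
    dag_id="dw_nightly_load",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(
        task_id="extract",
        python_callable=run_step,
        op_args=["extract"],
    )
    transform = PythonOperator(
        task_id="transform",
        python_callable=run_step,
        op_args=["transform"],
    )
    extract >> transform  # transform runs only after extract succeeds
```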
Posted 1 week ago
5.0 - 10.0 years
9 - 19 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Key Responsibilities:
- Work on client projects to deliver AWS, PySpark, and Databricks based data engineering and analytics solutions
- Build and operate very large data warehouses or data lakes
- ETL optimization, designing, coding, and tuning big data processes using Apache Spark
- Build data pipelines and applications to stream and process datasets at low latencies (a streaming sketch follows below)
- Show efficiency in handling data: tracking data lineage, ensuring data quality, and improving the discoverability of data
Technical Experience:
- Minimum of 5 years of experience in Databricks engineering solutions on the AWS Cloud platform using PySpark, Databricks SQL, and data pipelines using Delta Lake
- Minimum of 5 years of experience in ETL, Big Data/Hadoop, and data warehouse architecture and delivery
Email at: maya@mounttalent.com
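The low-latency streaming work described above typically uses Spark Structured Streaming. Below is a hedged sketch of a windowed aggregation over a Delta source; the paths, event_ts column, and trigger interval are all assumptions.

```python
# A Structured Streaming sketch; paths, columns, and trigger are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

events = (
    spark.readStream.format("delta").load("/mnt/lake/bronze/events")
         .withWatermark("event_ts", "10 minutes")                 # bound late data
         .groupBy(F.window("event_ts", "1 minute"), "event_type")
         .count()
)

query = (
    events.writeStream.format("delta")
          .option("checkpointLocation", "/mnt/lake/_chk/events_agg")
          .outputMode("append")                    # emit windows once the watermark passes
          .trigger(processingTime="30 seconds")
          .toTable("silver.event_counts")
)
```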
Posted 1 week ago
7.0 - 10.0 years
19 - 27 Lacs
Hyderabad
Hybrid
We are seeking a highly skilled and experienced Data Engineer to join our team. The ideal candidate will have a strong background in programming, data management, and cloud infrastructure, with a focus on designing and implementing efficient data solutions. This role requires a minimum of 5+ years of experience and a deep understanding of Azure services and infrastructure and ETL/ELT solutions.
Key Responsibilities:
- Azure Infrastructure Management: Own and maintain all aspects of Azure infrastructure, recommending modifications to enhance reliability, availability, and scalability.
- Security Management: Manage security aspects of Azure infrastructure, including network, firewall, private endpoints, encryption, PIM, and permissions management using Azure RBAC and Databricks roles.
- Technical Troubleshooting: Diagnose and troubleshoot technical issues in a timely manner, identifying root causes and providing effective solutions.
- Infrastructure as Code: Create and maintain Azure Infrastructure as Code using Terraform and GitHub Actions.
- CI/CD Pipelines: Configure and maintain CI/CD pipelines using GitHub Actions for Azure services such as ADF, Databricks, Storage, and Key Vault.
- Programming Expertise: Use programming languages such as Python to develop and maintain data engineering solutions.
- Generative AI and Language Models: Knowledge of Large Language Models (LLMs) and generative AI is a plus, enabling the integration of advanced AI capabilities into data workflows.
- Real-Time Data Streaming: Use Kafka for real-time data streaming and integration, ensuring efficient data flow and processing (a minimal producer sketch follows below).
- Data Management: Proficiency in Snowflake for data wrangling and management, optimizing data structures for analysis.
- DBT Utilization: Build and maintain data marts and views using dbt, ensuring data is structured for optimal analysis.
- ETL/ELT Solutions: Design ETL/ELT solutions using tools like Azure Data Factory and Azure Databricks, leveraging methodologies to acquire data from various structured or semi-structured source systems.
- Communication: Strong communication skills to explain technical issues and solutions clearly to the Engineering Lead and key stakeholders (as required).
Qualifications:
- Minimum of 5+ years of experience in designing ETL/ELT solutions using tools like Azure Data Factory and Azure Databricks, or Snowflake.
- Expertise in programming languages such as Python.
- Experience with Kafka for real-time data streaming and integration.
- Proficiency in Snowflake for data wrangling and management.
- Proven ability to use dbt to build and maintain data marts and views.
- Experience in creating and maintaining Azure Infrastructure as Code using Terraform and GitHub Actions.
- Ability to configure, set up, and maintain GitHub for various code repositories.
- Experience in creating and configuring CI/CD pipelines using GitHub Actions for various Azure services.
- In-depth understanding of managing the security aspects of Azure infrastructure.
- Strong problem-solving skills and the ability to diagnose and troubleshoot technical issues.
- Excellent communication skills for explaining technical issues and solutions.
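For the Kafka streaming item above, a minimal Python producer using the confluent-kafka client might look like the sketch below. The broker address, topic name, and event payload are placeholders.

```python
# A minimal Kafka producer sketch; broker, topic, and payload are hypothetical.
import json
from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "broker:9092"})  # placeholder broker

def on_delivery(err, msg):
    if err is not None:
        print(f"delivery failed: {err}")

event = {"order_id": "o-123", "amount": 42.5}
producer.produce(
    "orders.events",                           # hypothetical topic
    key=event["order_id"],
    value=json.dumps(event).encode("utf-8"),
    callback=on_delivery,
)
producer.flush()  # block until outstanding messages are delivered
```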
Posted 1 week ago
5.0 - 8.0 years
15 - 25 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
We are looking for an Azure Administrator for Bangalore / Hyderabad / Chennai / Gurgaon.
Experience: 5 to 8 years
Location: Bangalore / Hyderabad / Chennai / Gurgaon
NP: Immediate to 15 days only
Please send your updated resume to the mail ID below: sumanta.majumdar@infinite.com
JD: We are seeking a skilled and proactive Azure Databricks Administrator to manage, monitor, and support our Databricks environment on Microsoft Azure. The ideal candidate will be responsible for system integrations, access control, user support, and CI/CD pipeline administration, ensuring a secure, efficient, and scalable data platform.
Key Responsibilities:
- System Integration & Monitoring: Build, monitor, and support integrations between Databricks and enterprise systems such as LogRhythm, ServiceNow, and AppDynamics. Ensure seamless data flow and alerting mechanisms across integrated platforms.
- Security & Access Management: Administer user and group access to the Databricks environment. Implement and enforce security policies and role-based access control (RBAC).
- User Support & Enablement: Provide initial system support and act as the point of contact (POC) for Databricks users. Assist users with onboarding, workspace setup, and troubleshooting.
- Vendor Coordination: Engage with Databricks vendor support for issue resolution and platform optimization.
- Platform Monitoring & Maintenance: Monitor Databricks usage, performance, and cost. Ensure the platform is up to date with the latest patches and features.
- Database & CI/CD Administration: Manage Databricks database configurations and performance tuning. Administer and maintain CI/CD pipelines for Databricks notebooks and jobs.
Required Skills & Qualifications:
- Proven experience administering Azure Databricks in a production environment.
- Strong understanding of Azure services, data engineering workflows, and DevOps practices.
- Experience with integration tools and platforms such as LogRhythm, ServiceNow, and AppDynamics.
- Proficiency in CI/CD tools (e.g., Azure DevOps, GitHub Actions).
- Familiarity with the Databricks REST APIs, Terraform, or ARM templates is a plus (a REST API sketch follows below).
- Excellent problem-solving, communication, and documentation skills.
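Much of the monitoring work described above can be scripted against the Databricks REST API. The sketch below lists workspace clusters and their states; the host and token come from environment variables and are placeholders.

```python
# A Databricks REST API sketch; host and token are placeholders read from env vars.
import os
import requests

HOST = os.environ["DATABRICKS_HOST"]    # e.g. https://adb-<id>.azuredatabricks.net
TOKEN = os.environ["DATABRICKS_TOKEN"]  # a personal access token

resp = requests.get(
    f"{HOST}/api/2.0/clusters/list",
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=30,
)
resp.raise_for_status()

# Print each cluster's id and current state for a quick health overview.
for cluster in resp.json().get("clusters", []):
    print(cluster["cluster_id"], cluster["state"])
```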
Posted 1 week ago
5.0 - 8.0 years
9 - 19 Lacs
Hyderabad
Work from Office
We are looking for a skilled Data Engineer with strong expertise in Python, PySpark, SQL, AWS, and Databricks to join our data engineering team. The ideal candidate will be responsible for building scalable data pipelines, transforming large datasets, and enabling data-driven decision-making across the organization.
Role & responsibilities
- Data Pipeline Development: Design, build, and maintain scalable data pipelines for ingesting, processing, and transforming large datasets from diverse sources into usable formats.
- Performance Optimization: Optimize data processing and storage systems for cost efficiency and high performance, including managing compute resources and cluster configurations.
- Automation and Workflow Management: Automate data workflows using tools like Airflow, the Databricks APIs, and other orchestration technologies to streamline data ingestion, processing, and reporting tasks.
- Data Quality and Validation: Implement data quality checks, validation rules, and transformation logic to ensure the accuracy, consistency, and reliability of data (a quality-gate sketch follows below).
- Cloud Platform Management: Manage and optimize cloud infrastructure (AWS, Databricks) for data storage, processing, and compute resources, ensuring seamless data operations.
Preferred candidate profile
- Strong proficiency in Python for scripting and data manipulation.
- Hands-on experience with PySpark for distributed data processing.
- Proficient in writing complex SQL queries for large-scale data extraction and transformation.
- Solid understanding of and experience with the AWS cloud ecosystem (especially S3, Glue, EMR, Lambda).
- Knowledge of data warehousing, data lakes, and ETL/ELT processes.
- Familiarity with version control tools like Git and workflow orchestration tools (e.g., Airflow) is a plus.
Location: Hyderabad (Work From Office)
Notice: Immediate or 15 days
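A simple way to enforce the data-quality checks mentioned above is a "quality gate" that fails the pipeline when thresholds are breached. The table name, key column, and 1% null threshold below are illustrative assumptions.

```python
# A data-quality gate sketch; table, column, and thresholds are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.table("staging.transactions")

total = df.count()
null_ids = df.filter(F.col("txn_id").isNull()).count()
dupes = total - df.dropDuplicates(["txn_id"]).count()

# Fail the job loudly instead of letting bad data flow downstream.
if total == 0 or null_ids / total > 0.01 or dupes > 0:
    raise ValueError(
        f"quality gate failed: rows={total}, null_ids={null_ids}, dupes={dupes}"
    )
```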
Posted 1 week ago
8.0 - 13.0 years
25 - 30 Lacs
Chennai
Work from Office
Join us in bringing joy to customer experience. Five9 is a leading provider of cloud contact center software, bringing the power of cloud innovation to customers worldwide. Living our values every day results in our team-first culture and enables us to innovate, grow, and thrive while enjoying the journey together. We celebrate diversity and foster an inclusive environment, empowering our employees to be their authentic selves.
The Data Engineer will help design and implement a Google Cloud Platform (GCP) Data Lake, build scalable data pipelines, and ensure seamless access to data for business intelligence and data science tools. They will support a wide range of projects while collaborating closely with management teams and business leaders. The ideal candidate will have a strong understanding of data engineering principles and data warehousing concepts, and the ability to document technical knowledge into clear processes and procedures.
This position is based out of one of the offices of our affiliate Acqueon Technologies in India, and will adopt the hybrid work arrangements of that location. You will be a member of the Acqueon team with responsibilities supporting Five9 products, collaborating with global teammates based primarily in the United States.
Responsibilities
- Design, implement, and maintain a scalable Data Lake on GCP to centralize structured and unstructured data from various sources (databases, APIs, cloud storage).
- Utilize GCP services including BigQuery, Dataflow, Pub/Sub, and Cloud Storage to optimize and manage data workflows, ensuring scalability, performance, and security (a BigQuery sketch follows below).
- Collaborate closely with data analytics and data science teams to understand data needs, ensuring data is properly prepared for consumption by various systems (e.g. DOMO, Looker, Databricks).
- Implement best practices for data quality, consistency, and governance across all data pipelines and systems, ensuring compliance with internal and external standards.
- Continuously monitor, test, and optimize data workflows to improve performance, cost efficiency, and reliability.
- Maintain comprehensive technical documentation of data pipelines, systems, and architecture for knowledge sharing and future development.
Requirements
- Bachelor's degree in Computer Science, Data Engineering, Data Science, or a related quantitative field (e.g. Mathematics, Statistics, Engineering).
- 3+ years of experience using GCP Data Lake and Storage Services. Certifications in GCP are preferred (e.g. Professional Cloud Developer, Professional Cloud Database Engineer).
- Advanced proficiency with SQL, with experience in writing complex queries, optimizing for performance, and using SQL in large-scale data processing workflows.
- Proficiency in programming languages such as Python, Java, or Scala, with practical experience building data pipelines, automating data workflows, and integrating APIs for data ingestion.
Five9 embraces diversity and is committed to building a team that represents a variety of backgrounds, perspectives, and skills. The more inclusive we are, the better we are. Five9 is an equal opportunity employer.
View our privacy policy, including our privacy notice to California residents, here: https://www.five9.com/pt-pt/legal
Note: Five9 will never request that an applicant send money as a prerequisite for commencing employment with Five9.
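As a minimal illustration of the BigQuery work mentioned above, here is a sketch using the google-cloud-bigquery Python client. The project, dataset, and table names are hypothetical, and Application Default Credentials are assumed to be configured.

```python
# A minimal BigQuery query sketch; project/dataset/table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()  # uses Application Default Credentials

sql = """
    SELECT event_type, COUNT(*) AS events
    FROM `my_project.lake.events`
    GROUP BY event_type
    ORDER BY events DESC
    LIMIT 10
"""

# result() blocks until the job finishes, then iterates over rows.
for row in client.query(sql).result():
    print(row.event_type, row.events)
```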
Posted 1 week ago
8.0 - 13.0 years
25 - 30 Lacs
Chennai
Work from Office
Join us in bringing joy to customer experience. Five9 is a leading provider of cloud contact center software, bringing the power of cloud innovation to customers worldwide. Living our values every day results in our team-first culture and enables us to innovate, grow, and thrive while enjoying the journey together. We celebrate diversity and foster an inclusive environment, empowering our employees to be their authentic selves.
The Data Engineer will help design and implement a Google Cloud Platform (GCP) Data Lake, build scalable data pipelines, and ensure seamless access to data for business intelligence and data science tools. They will support a wide range of projects while collaborating closely with management teams and business leaders. The ideal candidate will have a strong understanding of data engineering principles and data warehousing concepts, and the ability to document technical knowledge into clear processes and procedures.
This position is based out of one of the offices of our affiliate Acqueon Technologies in India, and will adopt the hybrid work arrangements of that location. You will be a member of the Acqueon team with responsibilities supporting Five9 products, collaborating with global teammates based primarily in the United States.
Responsibilities
- Design, implement, and maintain a scalable Data Lake on GCP to centralize structured and unstructured data from various sources (databases, APIs, cloud storage).
- Utilize GCP services including BigQuery, Dataflow, Pub/Sub, and Cloud Storage to optimize and manage data workflows, ensuring scalability, performance, and security.
- Collaborate closely with data analytics and data science teams to understand data needs, ensuring data is properly prepared for consumption by various systems (e.g. DOMO, Looker, Databricks).
- Implement best practices for data quality, consistency, and governance across all data pipelines and systems, ensuring compliance with internal and external standards.
- Continuously monitor, test, and optimize data workflows to improve performance, cost efficiency, and reliability.
- Maintain comprehensive technical documentation of data pipelines, systems, and architecture for knowledge sharing and future development.
Requirements
- Bachelor's degree in Computer Science, Data Engineering, Data Science, or a related quantitative field (e.g. Mathematics, Statistics, Engineering).
- 4+ years of experience using GCP Data Lake and Storage Services. Certifications in GCP are preferred (e.g. Professional Cloud Developer, Professional Cloud Database Engineer).
- Advanced proficiency with SQL, with experience in writing complex queries, optimizing for performance, and using SQL in large-scale data processing workflows.
- Proficiency in programming languages such as Python, Java, or Scala, with practical experience building data pipelines, automating data workflows, and integrating APIs for data ingestion.
Five9 embraces diversity and is committed to building a team that represents a variety of backgrounds, perspectives, and skills. The more inclusive we are, the better we are. Five9 is an equal opportunity employer.
View our privacy policy, including our privacy notice to California residents, here: https://www.five9.com/pt-pt/legal
Note: Five9 will never request that an applicant send money as a prerequisite for commencing employment with Five9.
Posted 1 week ago
5.0 - 9.0 years
7 - 11 Lacs
Gurugram
Work from Office
Dentsply Sirona is the world's largest manufacturer of professional dental products and technologies, with a 130-year history of innovation and service to the dental industry and patients worldwide. Dentsply Sirona develops, manufactures, and markets a comprehensive solutions offering including dental and oral health products as well as other consumable medical devices under a strong portfolio of world-class brands. Dentsply Sirona's products provide innovative, high-quality, and effective solutions to advance patient care and deliver better and safer dentistry. Dentsply Sirona's global headquarters is located in Charlotte, North Carolina, USA. The company's shares are listed in the United States on NASDAQ under the symbol XRAY.
Bringing out the best in people. As advanced as dentistry is today, we are dedicated to making it even better. Our people have a passion for innovation and are committed to applying it to improve dental care. We live and breathe high performance, working as one global team, bringing out the best in each other for the benefit of dental patients and the professionals who serve them. If you want to grow and develop as part of a team that is shaping an industry, then we're looking for the best to join us.
Working at Dentsply Sirona you are able to:
- Develop faster with our commitment to the best professional development.
- Perform better as part of a high-performance, empowering culture.
- Shape an industry with a market leader that continues to drive innovation.
- Make a difference by helping improve oral health worldwide.
Scope
The role has global scope and includes managing and leading data flow and transformation development in the Data Engagement Platform (DEP). This role will lead the work of DS employees as well as contractors.
Key Responsibilities
- Develop and maintain a high-quality data warehouse solution.
- Maintain accurate and complete technical architecture documents.
- Collaborate with BI developers and business analysts for successful development of BI reporting and analysis.
- Work with business groups and technical teams to develop and maintain the data warehouse platform for BI reporting.
- Develop a scalable and maintainable data layer for BI applications to meet business objectives.
- Work in a small, smart, agile team, designing, developing, and owning the full solution for an assigned data area.
- Develop standards, patterns, and best practices for reuse and acceleration.
- Perform maintenance and troubleshooting activities in the Azure data platform.
- Analyze, plan, and develop requirements and standards in reference to scheduled projects.
- Partake in the process of defining clear project deliverables.
- Coordinate the development of standards, patterns, and best practices for reuse and acceleration.
Typical Background
Education: University degree or equivalent in MIS or similar.
Years and type of experience: 5-10 years working with BI and data warehouse solutions.
Key required skills, knowledge, and capabilities:
- Good understanding of business logic and of business needs.
- Some experience with Databricks and dbt is desirable.
- Experience with Azure DevOps code repository, version control, and task management.
- Strong proficiency with SQL and its variation among popular databases.
- Knowledge of best practices when dealing with relational databases; capable of troubleshooting common database issues.
- Knowledge of data design and analysis of BI systems and processes.
- Strong analytical and logical thinking.
- Internationally and culturally aware.
- Communicates well verbally and in writing in English.
Key Leadership Behaviors
Dentsply Sirona managers are expected to successfully demonstrate behaviors aligned with the competency model. See the competencies below, together with key specific behaviors for success.
- Teamwork: defines success in terms of the whole team.
- Customer Focus: is dedicated to meeting the expectations and requirements of internal and external customers, and seeks to make improvements with the customer in mind.
- Strategic Thinking: applies experience, knowledge, and perspective of business and external or global factors to create new perspectives and fresh thinking.
- Talent Management: actively seeks assignments that stretch beyond the comfort zone.
- Integrity: raises potential ethical concerns to the right party.
- Problem Solving: can analyze problems and put together a plan for resolution within their scope of responsibility.
- Drive for Results: can be counted on to reach goals successfully.
- Accountability: acts with a clear sense of ownership.
- Innovation and Creativity: brings creative ideas to work and acts to take advantage of opportunities to improve the business.
- Leading Change: adapts to changing priorities and acts without having the total picture.
Dentsply Sirona is an Equal Opportunity / Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, age, sexual orientation, disability, or protected Veteran status. We appreciate your interest in Dentsply Sirona. If you need assistance with completing the online application due to a disability, please send an accommodation request to careers@dentsplysirona.com. Please be sure to include "Accommodation Request" in the subject.
Posted 1 week ago
2.0 - 5.0 years
4 - 8 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Job Title: Automation Engineer - Databricks
Job Type: Full-time, Contractor
Location: Hybrid - Hyderabad | Pune | Delhi
About Us:
Our mission at micro1 is to match the most talented people in the world with their dream jobs. If you are looking to be at the forefront of AI innovation and work with some of the fastest-growing companies in Silicon Valley, we invite you to apply for a role. By joining the micro1 community, your resume will become visible to top industry leaders, unlocking access to the best career opportunities on the market.
Job Summary:
We are seeking a detail-oriented and innovative Automation Engineer - Databricks to join our customer's team. In this critical role, you will design, develop, and execute automated tests to ensure the quality, reliability, and integrity of data within Databricks environments. If you are passionate about data quality, thrive in collaborative environments, and excel at both written and verbal communication, we'd love to meet you.
Key Responsibilities:
- Design, develop, and maintain robust automated test scripts using Python, Selenium, and SQL to validate data integrity within Databricks environments (a reconciliation-test sketch follows below).
- Execute comprehensive data validation and verification activities to ensure accuracy and consistency across multiple systems, data warehouses, and data lakes.
- Create detailed and effective test plans and test cases based on technical requirements and business specifications.
- Integrate automated tests with CI/CD pipelines to facilitate seamless and efficient testing and deployment processes.
- Work collaboratively with data engineers, developers, and other stakeholders to gather data requirements and achieve comprehensive test coverage.
- Document test cases, results, and identified defects; communicate findings clearly to the team.
- Conduct performance testing to ensure data processing and retrieval meet established benchmarks.
- Provide mentorship and guidance to junior team members, promoting best practices in test automation and data validation.
Required Skills and Qualifications:
- Strong proficiency in Python, Selenium, and SQL for developing test automation solutions.
- Hands-on experience with Databricks, data warehouse, and data lake architectures.
- Proven expertise in automated testing of data pipelines, preferably with tools such as Apache Airflow, dbt test, or similar.
- Proficient in integrating automated tests within CI/CD pipelines on cloud platforms (AWS, Azure preferred).
- Excellent written and verbal communication skills with the ability to translate technical concepts to diverse audiences.
- Bachelor's degree in Computer Science, Information Technology, or a related discipline.
- Demonstrated problem-solving skills and a collaborative approach to teamwork.
Preferred Qualifications:
- Experience implementing security and data protection measures in data-driven applications.
- Ability to integrate user-facing elements with server-side logic for seamless data experiences.
- Demonstrated passion for continuous improvement in test automation processes, tools, and methodologies.
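A common pattern for the Python-plus-SQL validation described above is a Pytest reconciliation test that runs SQL against Databricks via the databricks-sql-connector package. The environment variable names and the bronze/silver table names below are placeholders.

```python
# A SQL reconciliation test sketch; env vars and table names are placeholders.
import os
from databricks import sql  # databricks-sql-connector

def fetch_count(query: str) -> int:
    """Run a COUNT(*) query against a Databricks SQL warehouse."""
    with sql.connect(
        server_hostname=os.environ["DBSQL_HOST"],
        http_path=os.environ["DBSQL_HTTP_PATH"],
        access_token=os.environ["DBSQL_TOKEN"],
    ) as conn:
        with conn.cursor() as cur:
            cur.execute(query)
            return cur.fetchone()[0]

def test_row_counts_match():
    src = fetch_count("SELECT COUNT(*) FROM bronze.orders")
    tgt = fetch_count("SELECT COUNT(*) FROM silver.orders")
    assert src == tgt, f"row count mismatch: {src} != {tgt}"
```

Tests in this shape drop straight into a CI/CD pipeline, which is exactly the integration the role calls for.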
Posted 1 week ago
8.0 - 13.0 years
25 - 30 Lacs
Hyderabad
Work from Office
About us:
Our mission at micro1 is to match the most talented people in the world with their dream jobs. If you are looking to be at the forefront of AI innovation and work with some of the fastest-growing companies in Silicon Valley, we invite you to apply for a role. By joining the micro1 community, your resume will become visible to top industry leaders, unlocking access to the best career opportunities on the market.
Job Summary:
Join our customer's team as a Software Developer and play a pivotal role in building high-impact backend solutions at the forefront of AI and data engineering. This is your chance to work in a collaborative, onsite environment where your technical expertise and communication skills will drive the success of next-generation AI/ML applications.
Key Responsibilities:
- Develop, test, and maintain scalable backend components and microservices using Python and PySpark.
- Build and optimize advanced data pipelines leveraging Databricks and distributed computing platforms.
- Design and administer efficient MySQL databases, focusing on data integrity, availability, and performance.
- Integrate machine learning models into production-grade backend systems powering innovative AI features.
- Collaborate with data scientists and engineering peers to deliver comprehensive, business-driven solutions.
- Monitor, troubleshoot, and enhance system performance using Redis for caching and scalability (a cache-aside sketch follows below).
- Create clear technical documentation and communicate proactively with the team, emphasizing both written and verbal skills.
Required Skills and Qualifications:
- Proficiency in Python for backend development with strong coding standards.
- Practical experience with Databricks and PySpark in live production environments.
- Advanced knowledge of MySQL database design, query optimization, and maintenance.
- Solid foundation in machine learning concepts and deploying ML models in backend systems.
- Experience utilizing Redis for effective caching and state management.
- Outstanding written and verbal communication abilities with strong attention to detail.
- Demonstrated success working collaboratively in a fast-paced onsite setting in Hyderabad.
Preferred Qualifications:
- Background in high-growth AI/ML or complex data engineering projects.
- Familiarity with additional backend technologies or cloud-based platforms.
- Experience mentoring or leading technical teams.
Be a key contributor to our customer's team, delivering backend systems that seamlessly bridge data engineering and AI innovation. We value professionals who thrive on clear communication, technical excellence, and collaborative problem-solving.
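The Redis caching responsibility above is usually implemented as a cache-aside pattern. Below is a hedged sketch using the redis-py client; the key scheme, the 5-minute TTL, and compute_features are illustrative stand-ins, not part of the posting.

```python
# A cache-aside sketch with redis-py; key scheme, TTL, and helper are hypothetical.
import json
import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

def get_features(user_id: str) -> dict:
    key = f"features:{user_id}"
    cached = r.get(key)
    if cached is not None:
        return json.loads(cached)              # cache hit: skip the expensive call
    features = compute_features(user_id)        # expensive lookup (e.g. MySQL/Spark)
    r.setex(key, 300, json.dumps(features))     # cache for 5 minutes
    return features

def compute_features(user_id: str) -> dict:
    # Stand-in for the real backend computation.
    return {"user_id": user_id, "score": 0.87}
```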
Posted 1 week ago
8.0 - 13.0 years
25 - 30 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Job Title: Senior Software Engineer
Job Type: Full-time, Contractor
About Us:
Our mission at micro1 is to match the most talented people in the world with their dream jobs. If you are looking to be at the forefront of AI innovation and work with some of the fastest-growing companies in Silicon Valley, we invite you to apply for a role. By joining the micro1 community, your resume will become visible to top industry leaders, unlocking access to the best career opportunities on the market.
Job Summary:
We are seeking a highly skilled Senior Software Engineer to join one of our top customers, committed to designing and implementing high-performance microservices. The ideal candidate will have extensive experience with Python, FastAPI, task queues, WebSockets, and Kubernetes to build scalable solutions for our platforms. This is an exciting opportunity for those who thrive in challenging environments and have a passion for technology and innovation.
Key Responsibilities:
- Design and develop backend services using Python, with an emphasis on FastAPI for high-performance applications.
- Architect and orchestrate microservices to handle high-concurrency I/O requests efficiently.
- Deploy and manage applications on AWS, ensuring robust and scalable solutions are delivered.
- Implement and maintain messaging queues using Celery, RabbitMQ, or AWS SQS (a FastAPI-plus-Celery sketch follows below).
- Utilize WebSockets and asynchronous programming to enhance system responsiveness and performance.
- Collaborate with cross-functional teams to ensure seamless integration of solutions.
- Continuously improve system reliability, scalability, and performance through innovative design and testing.
Required Skills and Qualifications:
- Proven experience in production deployments with user bases exceeding 10k.
- Expertise in Python and FastAPI, with strong knowledge of microservices architecture.
- Proficiency in working with queues and asynchronous programming.
- Hands-on experience with databases such as Postgres, MongoDB, or Databricks.
- Comprehensive knowledge of Kubernetes for running scalable microservices.
- Exceptional written and verbal communication skills.
- Consistent work history without overlapping roles or career gaps.
Preferred Qualifications:
- Experience with GoLang for microservice development.
- Familiarity with data lake technologies such as Iceberg.
- Understanding of deploying APIs in Kubernetes environments.
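To make the FastAPI-plus-task-queue combination concrete, here is a minimal sketch of an endpoint that enqueues work on Celery and returns immediately. The broker URL, route, and task body are placeholder assumptions.

```python
# A FastAPI + Celery sketch; broker URL, route, and task logic are hypothetical.
from celery import Celery
from fastapi import FastAPI

celery_app = Celery("tasks", broker="amqp://guest@localhost//")  # e.g. RabbitMQ

@celery_app.task
def process_document(doc_id: str) -> str:
    # Long-running work executed by a separate Celery worker process.
    return f"processed {doc_id}"

app = FastAPI()

@app.post("/documents/{doc_id}")
async def submit(doc_id: str):
    result = process_document.delay(doc_id)  # enqueue and return immediately
    return {"task_id": result.id}
```

Offloading the slow step to a worker keeps the HTTP handler fast under high-concurrency I/O, which is the point of pairing FastAPI with a queue.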
Posted 1 week ago
2.0 - 5.0 years
5 - 9 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Job Title: Full Stack Engineer
Job Type: Full-time, Contractor
About Us:
Our mission at micro1 is to match the most talented people in the world with their dream jobs. If you are looking to be at the forefront of AI innovation and work with some of the fastest-growing companies in Silicon Valley, we invite you to apply for a role. By joining the micro1 community, your resume will become visible to top industry leaders, unlocking access to the best career opportunities on the market.
Job Summary:
We are seeking a highly skilled Full Stack Engineer to join our dynamic team, committed to designing and implementing high-performance microservices. The ideal candidate will have extensive experience with Python, FastAPI, task queues, WebSockets, and Kubernetes to build scalable solutions for our platforms. This is an exciting opportunity for those who thrive in challenging environments and have a passion for technology and innovation.
Key Responsibilities:
- Design and develop backend services using Python, with an emphasis on FastAPI for high-performance applications.
- Architect and orchestrate microservices to handle high-concurrency I/O requests efficiently.
- Deploy and manage applications on AWS, ensuring robust and scalable solutions are delivered.
- Implement and maintain messaging queues using Celery, RabbitMQ, or AWS SQS.
- Utilize WebSockets and asynchronous programming to enhance system responsiveness and performance.
- Collaborate with cross-functional teams to ensure seamless integration of solutions.
- Continuously improve system reliability, scalability, and performance through innovative design and testing.
Required Skills and Qualifications:
- Proven experience in production deployments with user bases exceeding 10k.
- Expertise in Python and FastAPI, with strong knowledge of microservices architecture.
- Proficiency in working with queues and asynchronous programming.
- Hands-on experience with databases such as Postgres, MongoDB, or Databricks.
- Comprehensive knowledge of Kubernetes for running scalable microservices.
- Exceptional written and verbal communication skills.
- Consistent work history without overlapping roles or career gaps.
Preferred Qualifications:
- Experience with GoLang for microservice development.
- Familiarity with data lake technologies such as Iceberg.
- Understanding of deploying APIs in Kubernetes environments.
Posted 1 week ago
1.0 - 3.0 years
3 - 7 Lacs
Hyderabad
Work from Office
Job Title: Backend Developer - Python
Job Type: Full-time
Location: On-site, Hyderabad, Telangana, India
Job Summary:
Join the team of one of our top customers as a Backend Developer and help drive scalable, high-performance solutions at the intersection of machine learning and data engineering. You'll collaborate with skilled professionals to design, implement, and maintain backend systems powering advanced AI/ML applications in a dynamic, onsite environment.
Key Responsibilities:
- Develop, test, and deploy robust backend components and microservices using Python and PySpark.
- Implement and optimize data pipelines leveraging Databricks and distributed computing frameworks.
- Design and maintain efficient databases with MySQL, ensuring data integrity and high availability.
- Integrate machine learning models into production-ready backend systems supporting AI-driven features.
- Collaborate closely with data scientists and engineers to deliver end-to-end solutions aligned with business goals.
- Monitor, troubleshoot, and enhance system performance, utilizing Redis for caching and improved scalability.
- Write clear and maintainable documentation, and communicate effectively with team members both verbally and in writing.
Required Skills and Qualifications:
- Proficiency in Python programming for backend development.
- Hands-on experience with Databricks and PySpark in a production environment.
- Strong understanding of MySQL database design, querying, and performance tuning.
- Practical background in machine learning concepts and deploying ML models.
- Experience with Redis for caching and state management.
- Excellent written and verbal communication skills, with keen attention to detail.
- Demonstrated ability to work effectively in an on-site, collaborative setting in Hyderabad.
Preferred Qualifications:
- Previous experience in high-growth AI/ML or data engineering projects.
- Familiarity with additional backend technologies or cloud platforms.
- Demonstrated leadership or mentorship in technical teams.
Posted 1 week ago
5.0 - 9.0 years
13 - 17 Lacs
Gurugram
Work from Office
Dentsply Sirona is the world's largest manufacturer of professional dental products and technologies, with a 130-year history of innovation and service to the dental industry and patients worldwide. Dentsply Sirona develops, manufactures, and markets a comprehensive solutions offering including dental and oral health products as well as other consumable medical devices under a strong portfolio of world-class brands. Dentsply Sirona's products provide innovative, high-quality, and effective solutions to advance patient care and deliver better and safer dentistry. Dentsply Sirona's global headquarters is located in Charlotte, North Carolina, USA. The company's shares are listed in the United States on NASDAQ under the symbol XRAY.
Bringing out the best in people. As advanced as dentistry is today, we are dedicated to making it even better. Our people have a passion for innovation and are committed to applying it to improve dental care. We live and breathe high performance, working as one global team, bringing out the best in each other for the benefit of dental patients, and the professionals who serve them. If you want to grow and develop as part of a team that is shaping an industry, then we're looking for the best to join us.
Working at Dentsply Sirona you are able to:
- Develop faster with our commitment to the best professional development.
- Perform better as part of a high-performance, empowering culture.
- Shape an industry with a market leader that continues to drive innovation.
- Make a difference by helping improve oral health worldwide.
Scope
The role has global scope and includes managing and leading data flow and transformation development in the Data Engagement Platform (DEP). This role will lead the work of DS employees as well as contractors.
Key Responsibilities
- Develop and maintain a high-quality data warehouse solution.
- Maintain accurate and complete technical architecture documents.
- Collaborate with BI developers and business analysts for successful development of BI reporting and analysis.
- Work with business groups and technical teams to develop and maintain the data warehouse platform for BI reporting.
- Develop a scalable and maintainable data layer for BI applications to meet business objectives.
- Work in a small, smart, agile team, designing, developing, and owning the full solution for an assigned data area.
- Develop standards, patterns, and best practices for reuse and acceleration.
- Perform maintenance and troubleshooting activities in the Azure data platform.
- Analyze, plan, and develop requirements and standards in reference to scheduled projects.
- Partake in the process of defining clear project deliverables.
- Coordinate the development of standards, patterns, and best practices for reuse and acceleration.
Typical Background
Education: University degree or equivalent in MIS or similar.
Years and type of experience: 5-10 years working with BI and data warehouse solutions.
Key required skills, knowledge, and capabilities:
- Good understanding of business logic and of business needs.
- Some experience with Databricks and dbt is desirable.
- Experience with Azure DevOps code repository, version control, and task management.
- Strong proficiency with SQL and its variation among popular databases.
- Knowledge of best practices when dealing with relational databases; capable of troubleshooting common database issues.
- Knowledge of data design and analysis of BI systems and processes.
- Strong analytical and logical thinking.
- Internationally and culturally aware.
Communicate well verbally and in writing in English.
Key Leadership Behaviors
Dentsply Sirona managers are expected to successfully demonstrate behaviors aligned with the competency model. The competencies, together with key specific behaviors for success, are:
Teamwork: defines success in terms of the whole team.
Customer Focus: is dedicated to meeting the expectations and requirements of internal and external customers, and seeks to make improvements with the customer in mind.
Strategic Thinking: applies experience, knowledge, and perspective of business and external or global factors to create new perspectives and fresh thinking.
Talent Management: actively seeks assignments that stretch beyond the comfort zone.
Integrity: raises potential ethical concerns to the right party.
Problem Solving: can analyze problems and put together a plan for resolution within their scope of responsibility.
Drive for Results: can be counted on to reach goals successfully.
Accountability: acts with a clear sense of ownership.
Innovation and Creativity: brings creative ideas to work and acts to take advantage of opportunities to improve the business.
Leading Change: adapts to changing priorities and acts without having the total picture.
Dentsply Sirona is an Equal Opportunity/Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, age, sexual orientation, disability, or protected veteran status. We appreciate your interest in Dentsply Sirona. If you need assistance with completing the online application due to a disability, please send an accommodation request to careers@dentsplysirona.com. Please be sure to include “Accommodation Request” in the subject.
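To make the data-layer work described in this posting concrete, here is a minimal PySpark sketch of the kind of curated-table build a Databricks-based warehouse layer might involve. All table and column names (raw_sales, curated.sales_daily, order_ts, and so on) are hypothetical placeholders, not anything specified by the employer.

```python
# Minimal sketch: building a curated BI-facing table on Databricks with PySpark.
# All table and column names here are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Read a raw ingestion table and standardize it for BI consumption.
raw = spark.table("raw_sales")

curated = (
    raw
    .withColumn("order_date", F.to_date("order_ts"))   # normalize timestamps to dates
    .groupBy("order_date", "region")
    .agg(
        F.sum("net_amount").alias("total_net_amount"),  # daily revenue per region
        F.countDistinct("order_id").alias("order_count"),
    )
)

# Persist as a Delta table so BI tools query a stable, documented layer.
curated.write.format("delta").mode("overwrite").saveAsTable("curated.sales_daily")
```

In a dbt-based setup, the same shaping logic would typically live in a SQL model with tests and documentation attached, which is one reason the posting calls out dbt experience alongside Databricks.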
Posted 1 week ago
2.0 - 4.0 years
3 - 7 Lacs
Hyderabad
Work from Office
Job Title: Automation Engineer
Job Type: Full-time, Contractor
About Us:
Our mission at micro1 is to match the most talented people in the world with their dream jobs. If you are looking to be at the forefront of AI innovation and work with some of the fastest-growing companies in Silicon Valley, we invite you to apply for a role. By joining the micro1 community, your resume will become visible to top industry leaders, unlocking access to the best career opportunities on the market.
Job Summary:
We are seeking a detail-oriented and innovative Automation Engineer to join our customer's team. In this critical role, you will design, develop, and execute automated tests to ensure the quality, reliability, and integrity of data within Databricks environments. If you are passionate about data quality, thrive in collaborative environments, and excel at both written and verbal communication, we'd love to meet you.
Key Responsibilities:
Design, develop, and maintain robust automated test scripts using Python, Selenium, and SQL to validate data integrity within Databricks environments.
Execute comprehensive data validation and verification activities to ensure accuracy and consistency across multiple systems, data warehouses, and data lakes.
Create detailed and effective test plans and test cases based on technical requirements and business specifications.
Integrate automated tests with CI/CD pipelines to facilitate seamless and efficient testing and deployment processes.
Work collaboratively with data engineers, developers, and other stakeholders to gather data requirements and achieve comprehensive test coverage.
Document test cases, results, and identified defects; communicate findings clearly to the team.
Conduct performance testing to ensure data processing and retrieval meet established benchmarks.
Provide mentorship and guidance to junior team members, promoting best practices in test automation and data validation.
Required Skills and Qualifications:
Strong proficiency in Python, Selenium, and SQL for developing test automation solutions.
Hands-on experience with Databricks, data warehouse, and data lake architectures.
Proven expertise in automated testing of data pipelines, preferably with tools such as Apache Airflow, dbt test, or similar.
Proficiency in integrating automated tests within CI/CD pipelines on cloud platforms (AWS, Azure preferred).
Excellent written and verbal communication skills, with the ability to translate technical concepts for diverse audiences.
Bachelor’s degree in Computer Science, Information Technology, or a related discipline.
Demonstrated problem-solving skills and a collaborative approach to teamwork.
Preferred Qualifications:
Experience implementing security and data protection measures in data-driven applications.
Ability to integrate user-facing elements with server-side logic for seamless data experiences.
Demonstrated passion for continuous improvement in test automation processes, tools, and methodologies.
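As an illustration of the automated data-integrity testing this role centers on, below is a minimal Pytest sketch of two reconciliation checks run through Spark SQL. The table names (staging.orders, warehouse.orders) and the order_id key are hypothetical; a real suite would typically run many such checks and wire them into the CI/CD pipeline mentioned above.

```python
# Minimal sketch: Pytest data-integrity checks against Databricks tables.
# Table names and key columns are hypothetical placeholders.
import pytest
from pyspark.sql import SparkSession


@pytest.fixture(scope="session")
def spark():
    # In a Databricks job a session already exists; getOrCreate reuses it.
    return SparkSession.builder.getOrCreate()


def test_row_counts_match(spark):
    # Reconcile row counts between the staging and warehouse layers.
    staging = spark.sql("SELECT COUNT(*) AS n FROM staging.orders").first()["n"]
    target = spark.sql("SELECT COUNT(*) AS n FROM warehouse.orders").first()["n"]
    assert staging == target, f"Row count mismatch: {staging} vs {target}"


def test_no_null_keys(spark):
    # Primary-key columns should never be null after a load.
    nulls = spark.sql(
        "SELECT COUNT(*) AS n FROM warehouse.orders WHERE order_id IS NULL"
    ).first()["n"]
    assert nulls == 0, f"{nulls} rows with a null order_id"
```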
Posted 1 week ago
3.0 - 6.0 years
9 - 13 Lacs
Gurugram
Work from Office
Dentsply Sirona is the world’s largest manufacturer of professional dental products and technologies, with a 130-year history of innovation and service to the dental industry and patients worldwide. Dentsply Sirona develops, manufactures, and markets a comprehensive solutions offering, including dental and oral health products as well as other consumable medical devices, under a strong portfolio of world-class brands. Dentsply Sirona’s products provide innovative, high-quality, and effective solutions to advance patient care and deliver better and safer dentistry. Dentsply Sirona’s global headquarters is located in Charlotte, North Carolina, USA. The company’s shares are listed in the United States on NASDAQ under the symbol XRAY.
Bringing Out the Best in People
As advanced as dentistry is today, we are dedicated to making it even better. Our people have a passion for innovation and are committed to applying it to improve dental care. We live and breathe high performance, working as one global team, bringing out the best in each other for the benefit of dental patients and the professionals who serve them. If you want to grow and develop as part of a team that is shaping an industry, then we’re looking for the best to join us.
Working at Dentsply Sirona You Are Able To
Develop faster with our commitment to the best professional development.
Perform better as part of a high-performance, empowering culture.
Shape an industry with a market leader that continues to drive innovation.
Make a difference by helping improve oral health worldwide.
Scope of Role
In the role of Azure Data Engineer, you will have the opportunity to join us and become part of the team that works on the development, enhancement, and maintenance of our Data Engagement Platform (DEP). You will work with advanced analytics and the latest technology and be part of our passionate team. Does this sound like something that would energize you? Then come join us!
Our Global Data and Analytics department handles the collection and streamlining of data into the DEP and the development of BI solutions and reports for the Dentsply Sirona group. The team consists of 20+ members and works cross-functionally, which means that you will interact with many functions such as finance, marketing, sales, commercial, supply, and operations. We use Azure tools together with Databricks and dbt.
Responsibilities
Develop and maintain a high-quality data warehouse solution.
Collaborate with BI developers and business analysts on the successful development of BI reporting and analysis.
Develop a scalable and maintainable data layer for BI applications to meet business objectives.
Work in a small, smart, agile team: design, develop, and own the full solution for an assigned data area.
Perform maintenance and troubleshooting activities in the Azure data platform.
Contribute to accurate and complete technical architecture documents.
Work closely with other members of the Global Data and Analytics team.
Maintain clear and coherent communication, both verbal and written, to understand data requirements.
Additional responsibilities as assigned.
Education
An academic background, with a relevant university degree in Management Information Systems or similar.
Years and Type of Experience
Minimum 5 years of work experience in a BI position.
Experience with Databricks and dbt is desirable.
Experience with the Azure DevOps code repository, version control, and task management.
Strong proficiency with SQL and its variations among popular databases.
Knowledge of best practices when dealing with relational databases.
Key Skills, Knowledge & Capabilities
Capable of troubleshooting common database issues.
Motivated by analyzing and understanding business needs, translating them into technical solutions, and ensuring that both business and technical needs are met.
Strong analytical and logical thinking.
Strong communication skills, both verbal and written.
English language: proficiency in verbal and written communication.
How We Lead the DS Way
Actively articulates and promotes Dentsply Sirona’s vision, mission, and values.
Advocates on behalf of the customer.
Promotes high performance, innovation, and continual improvement.
Consistently meets Company standards, ethics, and compliance requirements.
Communicates clearly and effectively with stakeholders across multiple levels, socio-geographic areas, and areas of functional expertise.
Dentsply Sirona is an Equal Opportunity/Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, age, sexual orientation, disability, or protected veteran status. We appreciate your interest in Dentsply Sirona. If you need assistance with completing the online application due to a disability, please send an accommodation request to careers@dentsplysirona.com. Please be sure to include “Accommodation Request” in the subject.
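To give a flavor of the Azure/Databricks development and maintenance work this posting describes, here is a minimal sketch of an incremental upsert into a Delta table, a common pattern for keeping a warehouse layer current without full reloads. The names used (staging.customer_updates, dep.customers, customer_id) are hypothetical placeholders, not the employer's actual schema.

```python
# Minimal sketch: incremental upsert (MERGE) into a Delta table on Databricks.
# Source and target names are hypothetical placeholders.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# New or changed rows produced by an upstream ingestion step (e.g. ADF-triggered).
updates = spark.table("staging.customer_updates")

target = DeltaTable.forName(spark, "dep.customers")

# MERGE applies changes in place instead of rewriting the whole table.
(
    target.alias("t")
    .merge(updates.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```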
Posted 1 week ago