
5 Data Observability Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

5.0 - 9.0 years

0 Lacs

Haryana

On-site

As a Data Modeller, you will play a crucial role in leading data architecture efforts across enterprise domains such as Sales, Procurement, Finance, Logistics, R&D, and Advanced Planning Systems (SAP/Oracle). Your responsibilities include designing scalable, reusable data models, building data lake foundations, and collaborating with cross-functional teams to deliver robust end-to-end data solutions.

You will work closely with business and product teams to understand processes and translate them into technical specifications. Using methodologies such as Medallion Architecture, EDW, or Kimball, you will design logical and physical data models, source the correct grain of data from authoritative source systems or existing DWHs, and create intermediary data models and physical views for reporting and consumption. You will also implement Data Governance, Data Quality, and Data Observability practices; develop business process maps, user journey maps, and data flow/integration diagrams; and design integration workflows using APIs, FTP/SFTP, web services, and other tools to support large-scale implementation programs spanning multiple projects.

Required technical skills include at least 5 years of experience on data-focused projects; strong expertise in data modelling across logical, physical, dimensional, and vault approaches; and familiarity with enterprise data domains such as Sales, Finance, Procurement, Supply Chain, Logistics, and R&D. Proficiency with Erwin or similar data modeling tools, an understanding of OLTP and OLAP systems, and knowledge of Kimball methodology, Medallion architecture, and modern Data Lakehouse patterns are essential. You should also understand Bronze, Silver, and Gold layer architecture on cloud platforms and be able to read existing data dictionaries and table structures and normalize data tables effectively. Familiarity with cloud data platforms (AWS, Azure, GCP), DevOps/DataOps best practices, Agile methodologies, and end-to-end integration needs and methods is also required.

Preferred experience includes a background in Retail, CPG, or Supply Chain domains, along with exposure to data governance frameworks, data quality tools, and metadata management platforms. In summary, as a Data Modeller you will be a key player in designing and implementing data solutions that drive business success across domains, collaborating with diverse teams to achieve strategic objectives.
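The Bronze/Silver/Gold (Medallion) layering this listing asks for is normally built with Delta tables on Spark or Databricks; purely as an illustration, here is a minimal pure-Python sketch of the idea, with all record shapes and field names invented for the example:

```python
# Minimal Bronze/Silver/Gold (Medallion) sketch in pure Python.
# In a real lakehouse these layers would be Delta tables; the field
# names (order_id, region, amount) are illustrative assumptions.

def to_silver(bronze_rows):
    """Silver layer: clean, standardize, and deduplicate raw Bronze records."""
    seen, silver = set(), []
    for row in bronze_rows:
        key = row.get("order_id")
        if key is None or key in seen:
            continue  # drop malformed rows and duplicates
        seen.add(key)
        silver.append({"order_id": key,
                       "region": row.get("region", "unknown").strip().lower(),
                       "amount": float(row.get("amount", 0))})
    return silver

def to_gold(silver_rows):
    """Gold layer: aggregate to the reporting grain (revenue per region)."""
    totals = {}
    for row in silver_rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]
    return totals

bronze = [
    {"order_id": 1, "region": " North ", "amount": "100.0"},
    {"order_id": 1, "region": "North", "amount": "100.0"},   # duplicate
    {"order_id": 2, "region": "south", "amount": "50.5"},
    {"region": "east", "amount": "10"},                      # malformed: no key
]
gold = to_gold(to_silver(bronze))
print(gold)  # {'north': 100.0, 'south': 50.5}
```

Each layer only reads from the one below it, which is what makes the refined (Gold) tables reproducible from raw history.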

Posted 14 hours ago


3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As a Site Reliability/Data Observability Engineer in the Data & Analytics organization, you will develop and support frameworks that monitor data, data pipelines, usage, user operations, data infrastructure, and compute statistics. This proactive approach ensures data accuracy and reliability, system scalability, and the health and performance of Analytics platforms on GCP/AWS. You will collaborate closely with the Cloud/Data Engineering team to design and implement robust, scalable solutions.

Working at Tyson Foods as a Site Reliability Engineer offers the opportunity to engage with cutting-edge technologies and collaborate with skilled professionals in a dynamic environment. The organization values innovation, growth, and work-life balance, and provides avenues for career advancement and professional development.

Responsibilities include collaborating with cross-functional teams to resolve data and pipeline issues and to optimize system performance, capacity, and cost efficiency. You will implement and manage monitoring, logging, and observability solutions to identify and resolve data and platform issues proactively, and develop strategies for data collection, metric definition, anomaly detection, alerting, data pipeline monitoring, data quality assessment, and visualization. Using automation scripts and open-source or commercial tools, you will implement Data Observability frameworks, streamline operational processes, and improve efficiency, while driving continuous improvement in the reliability, scalability, and performance of the data platform. The role demands strong technical expertise, problem-solving skills, and effective communication.

Qualifications include at least 3 years of experience in Site Reliability Engineering or a similar role, with a focus on data observability, cloud infrastructure, and operations; proficiency with data observability and general observability tooling; expertise in cloud platforms such as GCP, AWS, or Azure; and knowledge of data analytics, data processing, visualization tools, data quality, CI/CD pipelines, and DevOps practices. Strong scripting and programming skills, excellent communication, analytical and troubleshooting abilities, and a degree in computer science, engineering, or a related field are required. A master's degree in a relevant field, experience with containerization and orchestration technologies, proficiency with infrastructure-as-code tools, knowledge of security best practices and compliance requirements in cloud environments, cloud platform certifications, and familiarity with ITIL or other service management frameworks are considered advantageous.
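To illustrate the kind of check a data observability framework automates (the metric, history window, and threshold below are hypothetical, not any company's actual configuration), a pipeline-volume anomaly alert can be as simple as flagging a run whose row count deviates too far from recent history:

```python
from statistics import mean, stdev

def volume_anomaly(history, latest, max_z=3.0):
    """Flag `latest` row count if it lies more than `max_z` standard
    deviations from the mean of recent run volumes (a z-score check)."""
    if len(history) < 2:
        return False  # not enough history to judge
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return latest != mu
    return abs(latest - mu) / sigma > max_z

# Daily row counts for one pipeline (illustrative numbers).
history = [1000, 1020, 980, 1010, 995]
print(volume_anomaly(history, 1005))  # normal run -> False
print(volume_anomaly(history, 200))   # sudden drop -> True
```

Real frameworks apply the same pattern across many metrics at once (freshness, null rates, schema drift) and route the flags to alerting channels.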

Posted 1 day ago


12.0 - 18.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

As a seasoned Manager - Data Engineering with 12-18 years of total experience in data engineering, including 3-5 years in a leadership or managerial role, you will lead complex data platform implementations using Databricks or the Apache data stack. Your key responsibilities include leading high-impact data engineering engagements for global clients, delivering scalable solutions, and driving digital transformation.

You must have hands-on experience with Databricks or the core Apache stack (Spark, Kafka, Hive, Airflow, NiFi, etc.) and expertise in one or more cloud platforms such as AWS, Azure, or GCP, ideally with Databricks on cloud. Strong programming skills in Python, Scala, and SQL are essential, along with experience building scalable data architectures, delta lakehouses, and distributed data processing. Familiarity with modern data governance, cataloging, and data observability tools is also required.

Your role will involve leading the architecture, development, and deployment of modern data platforms using Databricks, Apache Spark, Kafka, Delta Lake, and other big data tools. You will design and implement data pipelines (batch and real-time), data lakehouses, and large-scale ETL frameworks, and own delivery accountability for data engineering programs across BFSI, telecom, healthcare, or manufacturing clients. Collaboration with global stakeholders, product owners, architects, and business teams to understand requirements and deliver data-driven outcomes will be a key part of the job, as will ensuring best practices in DevOps, CI/CD, infrastructure-as-code, data security, and governance. You will manage and mentor a team of 10-25 engineers, conducting performance reviews, capability building, and coaching, and support presales activities including solutioning, technical proposals, and client workshops.

At GlobalLogic, we prioritize a culture of caring and continuous learning and development. You'll have the opportunity to work on interesting and meaningful projects that have real impact, with the balance and flexibility to maintain equilibrium between work and life. As a high-trust organization, integrity is key: you can trust that you are part of a safe, reliable, and ethical global company. GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner known for creating innovative digital products and experiences. As part of our team, you'll collaborate with forward-thinking companies to transform businesses and redefine industries through intelligent products, platforms, and services.
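Delta Lake, mentioned in the listing above, is best known for its MERGE (upsert) semantics in lakehouse pipelines. As a conceptual sketch only (the key and column names are invented, and real Delta MERGE runs distributed on Spark), an upsert keyed on a business identifier behaves roughly like this:

```python
def upsert(target, updates, key="id"):
    """Sketch of Delta-style MERGE semantics: rows whose key already
    exists in `target` are updated, the rest are inserted.
    `target` maps key -> row, like a table with a unique key."""
    for row in updates:
        target[row[key]] = row  # matched -> update, not matched -> insert
    return target

table = {1: {"id": 1, "status": "open"}, 2: {"id": 2, "status": "open"}}
batch = [{"id": 2, "status": "closed"}, {"id": 3, "status": "open"}]
upsert(table, batch)
print(sorted(table))        # [1, 2, 3]
print(table[2]["status"])   # closed
```

Because re-applying the same batch leaves the table unchanged, the operation is idempotent, which is what makes pipeline retries safe.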

Posted 1 month ago


12.0 - 18.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

We are looking for an experienced Manager - Data Engineering with a strong background in Databricks or the Apache data stack to lead the implementation of complex data platforms. In this role, you will oversee impactful data engineering projects for global clients, deliver scalable solutions, and steer digital transformation initiatives.

With 12-18 years of overall experience in data engineering, including 3-5 years in a leadership position, you will need hands-on expertise in either Databricks or the core Apache stack (Spark, Kafka, Hive, Airflow, NiFi, etc.). Proficiency in at least one cloud platform such as AWS, Azure, or GCP, ideally with Databricks on the cloud, is required. Strong programming skills in Python, Scala, and SQL are essential, along with experience constructing scalable data architectures, delta lakehouses, and distributed data processing. Familiarity with modern data governance, cataloging, and data observability tools is also necessary, as is a proven track record of managing delivery in an onshore-offshore or hybrid model, coupled with exceptional communication, stakeholder management, and team mentoring abilities.

As Manager - Data Engineering, your key responsibilities will include leading the design, development, and deployment of modern data platforms using Databricks, Apache Spark, Kafka, Delta Lake, and other big data tools. You will design and implement data pipelines (both batch and real-time), data lakehouses, and large-scale ETL frameworks, and take ownership of delivery accountability for data engineering programs across various industries, collaborating with global stakeholders, product owners, architects, and business teams to drive data-driven outcomes. Ensuring best practices in DevOps, CI/CD, infrastructure-as-code, data security, and governance will be crucial. You will also manage and mentor a team of 10-25 engineers, conducting performance reviews, capability building, and coaching, and support presales activities including solutioning, technical proposals, and client workshops.

At GlobalLogic, we prioritize a culture of caring where people come first. We offer continuous learning and development opportunities to help you grow personally and professionally, along with the chance to work on interesting and meaningful projects that have real impact. With various career areas, roles, and work arrangements, we believe in providing balance between work and life. As a high-trust organization, integrity is at the core of everything we do. GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner known for creating innovative digital products and experiences. Join us in collaborating with forward-thinking companies to transform businesses and redefine industries through intelligent products, platforms, and services.
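Orchestrators like Airflow and NiFi, named in the Apache stack above, run pipeline tasks in dependency order. This toy pure-Python scheduler (the task names are invented for illustration) shows the underlying idea: topologically ordering a DAG of tasks before execution:

```python
def run_order(deps):
    """Return a valid execution order for tasks, given a dict of
    task -> set of upstream tasks (a simple topological sort)."""
    order, done = [], set()
    pending = dict(deps)
    while pending:
        ready = [t for t, ups in pending.items() if ups <= done]
        if not ready:
            raise ValueError("cycle detected in pipeline DAG")
        for task in sorted(ready):   # sorted for deterministic output
            order.append(task)
            done.add(task)
            del pending[task]
    return order

# extract -> transform -> quality check -> load, as a dependency dict.
deps = {"extract": set(),
        "transform": {"extract"},
        "quality_check": {"transform"},
        "load": {"transform", "quality_check"}}
print(run_order(deps))  # ['extract', 'transform', 'quality_check', 'load']
```

Production orchestrators add retries, backfills, and parallel execution of independent tasks on top of this same ordering guarantee.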

Posted 1 month ago


5 - 7 years

10 - 16 Lacs

Pune

Hybrid

Lead Data Engineer

Experience: 7-10 years
Salary: up to INR 25 Lacs per annum
Preferred Notice Period: within 30 days
Shift: 10:00 AM to 7:00 PM IST
Opportunity Type: Hybrid (Pune)
Placement Type: Permanent (Note: this is a requirement for one of Uplers' clients)
Must-have skills: AWS Glue, Databricks, Azure Data Factory, SQL, Python, Data Modelling, ETL
Good-to-have skills: Big Data Pipelines, Data Warehousing

Forbes Advisor (one of Uplers' clients) is looking for a Data Quality Analyst who is passionate about their work, eager to learn and grow, and committed to delivering exceptional results. If you are a team player with a positive attitude and a desire to make a difference, we want to hear from you.

Role Overview

Position: Lead Data Engineer (Databricks)
Location: Pune, Ahmedabad
Required Experience: 7 to 10 years
Preferred: Immediate joiners

Job Overview: We are looking for an accomplished Lead Data Engineer with expertise in Databricks to join our dynamic team. This role is crucial for enhancing our data engineering capabilities and offers the chance to work with advanced technologies, including Generative AI.

Key Responsibilities: Lead the design, development, and optimization of data solutions using Databricks, ensuring they are scalable, efficient, and secure. Collaborate with cross-functional teams to gather and analyse data requirements, translating them into robust data architectures and solutions. Develop and maintain ETL pipelines, leveraging Databricks and integrating with Azure Data Factory as needed. Implement machine learning models and advanced analytics solutions, incorporating Generative AI to drive innovation. Ensure data quality, governance, and security practices are adhered to, maintaining the integrity and reliability of data solutions. Provide technical leadership and mentorship to junior engineers, fostering an environment of learning and growth. Stay updated on the latest trends in data engineering, Databricks, Generative AI, and Azure Data Factory to continually enhance team capabilities.

Qualifications: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. 7 to 10 years of experience in data engineering, with a focus on Databricks. Proven expertise in building and optimizing data solutions using Databricks and integrating with Azure Data Factory or AWS Glue. Proficiency in SQL and programming languages such as Python or Scala. Strong understanding of data modelling, ETL processes, and Data Warehousing/Data Lakehouse concepts. Familiarity with cloud platforms, particularly Azure, and containerization technologies such as Docker. Excellent analytical, problem-solving, and communication skills. Demonstrated leadership ability, with experience mentoring and guiding junior team members.

Preferred Skills: Experience with Generative AI technologies and their applications. Familiarity with other cloud platforms such as AWS or GCP. Knowledge of data governance frameworks and tools.

How to apply for this opportunity (easy 3-step process): 1. Click on Apply and register or log in on our portal. 2. Upload your updated resume and complete the screening form. 3. Increase your chances of being shortlisted and meet the client for an interview.

About Our Client: At Inferenz, our team of innovative technologists and domain experts helps accelerate business growth through digital enablement, navigating industries with data, cloud, and AI services and solutions. We dedicate our resources to increasing efficiency and gaining a competitive advantage by leveraging next-generation technologies.

About Uplers: Our goal is to make hiring and getting hired reliable, simple, and fast. Our role is to help talent find and apply for relevant product and engineering opportunities and progress in their careers. (Note: there are many more opportunities on the portal.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
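The data modelling and warehousing concepts this listing asks for commonly center on Kimball-style star schemas. Purely as an illustration (the table and column names are invented), a fact table joined to a dimension and rolled up for reporting looks like:

```python
# Toy star-schema query: a sales fact table joined to a product
# dimension via a surrogate key, then aggregated by category.
dim_product = {101: {"name": "laptop", "category": "electronics"},
               102: {"name": "desk", "category": "furniture"}}
fact_sales = [{"product_key": 101, "qty": 2, "amount": 2000.0},
              {"product_key": 102, "qty": 1, "amount": 300.0},
              {"product_key": 101, "qty": 1, "amount": 1000.0}]

revenue_by_category = {}
for fact in fact_sales:
    # Join: look up the dimension row for this fact's surrogate key.
    category = dim_product[fact["product_key"]]["category"]
    revenue_by_category[category] = (
        revenue_by_category.get(category, 0.0) + fact["amount"])
print(revenue_by_category)  # {'electronics': 3000.0, 'furniture': 300.0}
```

Keeping measures in the fact table and descriptive attributes in dimensions is what lets the same fact grain serve many different reporting roll-ups.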

Posted 3 months ago
