
1814 Data Architecture Jobs - Page 14

JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

2.0 - 5.0 years

9 - 13 Lacs

Ahmedabad

Work from Office

Role Expectations

Advanced Analytics, Leadership & Cross-functional Collaboration
- Lead cross-functional projects using advanced data modelling and analysis techniques to discover insights that guide strategic decisions and uncover optimization opportunities
- Manage and optimize processes for data intake, validation, mining and engineering, as well as modelling, visualization and communication deliverables
- Develop strategies for effective data analysis and reporting
- Define company-wide metrics and relevant data sources
- Build systems to transform raw data into actionable business insights
- Work closely with business leaders and stakeholders to understand their needs and translate them into functional and technical requirements
- Champion a data-driven culture, promoting the use of insights across the organization for informed decision-making
- Communicate technical complexities clearly to non-technical stakeholders and align diverse teams around common goals
- Lead digital transformation initiatives to foster innovation and modernization

IT Management and Strategic Planning
- Oversee the architectural planning, development, and operation of all IT systems, ensuring their scalability, performance, security, and continuous integration/continuous deployment (CI/CD)
- Evaluate, select, and implement cutting-edge technology platforms and infrastructure to enable business growth and competitive advantage
- Develop and manage an IT budget to optimize resource allocation and ensure ROI on IT investments
- Establish IT policies, standards, and procedures in line with best practices and regulatory compliance
- Drive the talent management lifecycle of the IT team, including hiring, training, coaching, and performance management

Profile We Are Looking At
- Works in/anchors the Analytics team for the DTC business and marketplaces
- A person who lives, breathes and dreams numbers
- Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field; a Master's degree or MBA is a plus
- Minimum of 3-4 years of experience in IT management and/or advanced analytics roles
- Proficiency in advanced analytics techniques (e.g., machine learning) and tools (e.g., Python, R, SQL, Tableau, Hadoop)
- Extensive experience with IT systems architecture, cloud-based solutions (AWS, Google Cloud, Azure), and modern development methodologies (Agile, Scrum, DevOps)
- Proven ability to lead and develop a high-performing team
- Strong communication, strategic thinking, and project management skills
- Familiarity with data privacy standards and regulations (e.g., GDPR, CCPA)
- Experience in creating breakthrough visualizations
- Understanding of RDBMS, data architecture/schemas, data integrations, data models and data flows is a must

Technical (Ideal to Have)
- Exposure to our tech stack: PHP
- Microsoft workflows knowledge
- Experience in the beauty and personal care industry is desirable

Posted 3 weeks ago

Apply

1.0 - 3.0 years

9 - 13 Lacs

Pune

Work from Office

Delivery Manager - Data Engineering (Databricks & Snowflake)

Position: Delivery Manager - Data Engineering
Location: Bavdhan/Baner, Pune
Experience: 7-10 years
Employment Type: Full-time

Job Summary
We are seeking a Delivery Manager - Data Engineering to oversee multiple data engineering projects leveraging Databricks and Snowflake. This role requires strong leadership skills to manage teams, ensure timely delivery, and drive best practices in cloud-based data platforms. The ideal candidate will have deep expertise in data architecture, ETL processes, cloud data platforms, and stakeholder management.

Key Responsibilities

Project & Delivery Management
- Oversee the end-to-end delivery of multiple data engineering projects using Databricks and Snowflake
- Define project scope, timelines, milestones, and resource allocation to ensure smooth execution
- Identify and mitigate risks, ensuring that projects are delivered on time and within budget
- Establish agile methodologies (Scrum, Kanban) to drive efficient project execution

Data Engineering & Architecture Oversight
- Provide technical direction on data pipeline architecture, data lakes, data warehousing, and ETL frameworks
- Ensure optimal performance, scalability, and security of data platforms
- Collaborate with data architects and engineers to design and implement best practices for data processing and analytics

Stakeholder & Client Management
- Act as the primary point of contact for clients, senior management, and cross-functional teams
- Understand business requirements and translate them into technical solutions
- Provide regular status updates and manage client expectations effectively

Team Leadership & People Management
- Lead, mentor, and develop data engineers, architects, and analysts working across projects
- Drive a culture of collaboration, accountability, and continuous learning
- Ensure proper resource planning and capacity management to balance workload effectively

Technology & Process Improvement
- Stay up to date with emerging trends in Databricks, Snowflake, and cloud data technologies
- Continuously improve delivery frameworks, automation, and DevOps for data engineering
- Implement cost-optimization strategies for cloud-based data solutions

Required Skills & Experience

Technical Expertise
- 10+ years of experience in data engineering and delivery management
- Strong expertise in Databricks, Snowflake, and cloud platforms (AWS, Azure, GCP)
- Hands-on experience in ETL, data modeling, and big data processing frameworks (Spark, Delta Lake, Apache Airflow, DBT)
- Understanding of data governance, security, and compliance standards (GDPR, CCPA, HIPAA, etc.)
- Familiarity with SQL, Python, Scala, or Java for data transformation

Project & Team Management
- Proven experience in managing multiple projects simultaneously
- Strong knowledge of Agile, Scrum, and DevOps practices
- Experience in budgeting, forecasting, and resource management

Soft Skills & Leadership
- Excellent communication and stakeholder management skills
- Strong problem-solving and decision-making abilities
- Ability to motivate and lead cross-functional teams effectively

Preferred Qualifications
- Experience with data streaming (Kafka, Kinesis, or Pub/Sub)
- Knowledge of ML- and AI-driven data processing solutions
- Certifications in Databricks, Snowflake, or cloud platforms (AWS/Azure/GCP)

Apply or share your updated CV at hr@anvicybernetics,

Posted 3 weeks ago

Apply

2.0 - 7.0 years

9 - 13 Lacs

Gurugram

Work from Office

We are looking for a skilled Database Specialist to join our team at Squareops, focusing on cloud infrastructure. The ideal candidate will have 2-7 years of experience in database management and cloud computing.

Roles and Responsibility
- Design, implement, and manage databases for cloud infrastructure projects.
- Collaborate with cross-functional teams to identify and prioritize database requirements.
- Develop and maintain database documentation and technical specifications.
- Ensure data security, integrity, and compliance with industry standards.
- Troubleshoot and resolve complex database issues efficiently.
- Optimize database performance and scalability for large-scale applications.

Job Requirements
- Strong knowledge of database management systems and cloud computing platforms.
- Experience with designing and implementing scalable and secure databases.
- Excellent problem-solving skills and attention to detail.
- Ability to work collaboratively in a team environment.
- Strong communication and interpersonal skills.
- Familiarity with database development tools and technologies.

Posted 3 weeks ago

Apply

9.0 - 14.0 years

40 - 55 Lacs

Bengaluru

Work from Office

Roles and responsibilities:
- Collaborate with cross-functional teams to understand data requirements and design scalable and efficient data processing solutions.
- Develop and maintain data pipelines using PySpark and SQL on the Databricks platform.
- Optimise and tune data processing jobs for performance and reliability.
- Implement automated testing and monitoring processes to ensure data quality and reliability.
- Work closely with data scientists, data analysts, and other stakeholders to understand their data needs and provide effective solutions.
- Troubleshoot and resolve data-related issues, including performance bottlenecks and data quality problems.
- Stay up to date with industry trends and best practices in data engineering and Databricks.

Key Requirements:
- 8+ years of experience as a Data Engineer with a focus on Databricks and cloud-based data platforms, including a minimum of 4 years writing unit/end-to-end tests for data pipelines and ETL processes on Databricks.
- Hands-on experience in PySpark programming for data manipulation, transformation, and analysis.
- Strong experience in SQL and writing complex queries for data retrieval and manipulation.
- Experience with Docker for containerising and deploying data engineering applications is good to have.
- Strong knowledge of the Databricks platform and its components, including Databricks notebooks, clusters, and jobs.
- Experience in designing and implementing data models to support analytical and reporting needs will be an added advantage.
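The unit-testing requirement above is easiest to satisfy when pipeline transformation logic is factored into pure functions that can be exercised without a cluster. A minimal, hypothetical sketch in plain Python (function and field names are invented for illustration; the same pattern applies to a PySpark DataFrame transform):

```python
def clean_orders(rows):
    """Hypothetical transform: drop rows missing an order_id,
    deduplicate on order_id, and normalise amounts to float."""
    seen = set()
    out = []
    for row in rows:
        oid = row.get("order_id")
        if oid is None or oid in seen:
            continue  # skip invalid or duplicate records
        seen.add(oid)
        out.append({"order_id": oid, "amount": float(row.get("amount", 0))})
    return out

# Unit test: because the transform is a pure function, it can be
# verified on a handful of in-memory records.
raw = [
    {"order_id": 1, "amount": "10.5"},
    {"order_id": 1, "amount": "10.5"},    # duplicate -> dropped
    {"order_id": None, "amount": "3.0"},  # invalid -> dropped
    {"order_id": 2},                      # missing amount defaults to 0
]
cleaned = clean_orders(raw)
assert [r["order_id"] for r in cleaned] == [1, 2]
assert cleaned[0]["amount"] == 10.5
```

On Databricks the same function would typically be wrapped in a notebook or job, with the assertions living in a test suite run by CI before deployment.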

Posted 3 weeks ago

Apply

7.0 - 9.0 years

8 - 14 Lacs

Agra

Work from Office

Job Summary:
We are looking for a seasoned Tech Anchor with deep expertise in Big Data technologies and Python to lead technical design, development, and mentoring across data-driven projects. This role demands a strong grasp of scalable data architecture, problem-solving capabilities, and hands-on experience with distributed systems and modern data frameworks.

Key Responsibilities:
- Provide technical leadership across Big Data and Python-based projects
- Architect, design, and implement scalable data pipelines and processing systems
- Guide teams on best practices in data modeling, ETL/ELT development, and performance optimization
- Collaborate with data scientists, analysts, and stakeholders to ensure effective data solutions
- Conduct code reviews and mentor junior engineers to improve code quality and skills
- Evaluate and implement new tools and frameworks to enhance data capabilities
- Troubleshoot complex data-related issues and support production deployments
- Ensure compliance with data security and governance standards

Posted 3 weeks ago

Apply

7.0 - 9.0 years

8 - 14 Lacs

Vadodara

Work from Office

Job Summary:
We are looking for a seasoned Tech Anchor with deep expertise in Big Data technologies and Python to lead technical design, development, and mentoring across data-driven projects. This role demands a strong grasp of scalable data architecture, problem-solving capabilities, and hands-on experience with distributed systems and modern data frameworks.

Key Responsibilities:
- Provide technical leadership across Big Data and Python-based projects
- Architect, design, and implement scalable data pipelines and processing systems
- Guide teams on best practices in data modeling, ETL/ELT development, and performance optimization
- Collaborate with data scientists, analysts, and stakeholders to ensure effective data solutions
- Conduct code reviews and mentor junior engineers to improve code quality and skills
- Evaluate and implement new tools and frameworks to enhance data capabilities
- Troubleshoot complex data-related issues and support production deployments
- Ensure compliance with data security and governance standards

Posted 3 weeks ago

Apply

7.0 - 9.0 years

8 - 14 Lacs

Faridabad

Work from Office

Job Summary:
We are looking for a seasoned Tech Anchor with deep expertise in Big Data technologies and Python to lead technical design, development, and mentoring across data-driven projects. This role demands a strong grasp of scalable data architecture, problem-solving capabilities, and hands-on experience with distributed systems and modern data frameworks.

Key Responsibilities:
- Provide technical leadership across Big Data and Python-based projects
- Architect, design, and implement scalable data pipelines and processing systems
- Guide teams on best practices in data modeling, ETL/ELT development, and performance optimization
- Collaborate with data scientists, analysts, and stakeholders to ensure effective data solutions
- Conduct code reviews and mentor junior engineers to improve code quality and skills
- Evaluate and implement new tools and frameworks to enhance data capabilities
- Troubleshoot complex data-related issues and support production deployments
- Ensure compliance with data security and governance standards

Posted 3 weeks ago

Apply

7.0 - 9.0 years

8 - 14 Lacs

Jaipur

Work from Office

Job Summary:
We are looking for a seasoned Tech Anchor with deep expertise in Big Data technologies and Python to lead technical design, development, and mentoring across data-driven projects. This role demands a strong grasp of scalable data architecture, problem-solving capabilities, and hands-on experience with distributed systems and modern data frameworks.

Key Responsibilities:
- Provide technical leadership across Big Data and Python-based projects
- Architect, design, and implement scalable data pipelines and processing systems
- Guide teams on best practices in data modeling, ETL/ELT development, and performance optimization
- Collaborate with data scientists, analysts, and stakeholders to ensure effective data solutions
- Conduct code reviews and mentor junior engineers to improve code quality and skills
- Evaluate and implement new tools and frameworks to enhance data capabilities
- Troubleshoot complex data-related issues and support production deployments
- Ensure compliance with data security and governance standards

Posted 3 weeks ago

Apply

7.0 - 9.0 years

8 - 14 Lacs

Nagpur

Work from Office

Job Summary:
We are looking for a seasoned Tech Anchor with deep expertise in Big Data technologies and Python to lead technical design, development, and mentoring across data-driven projects. This role demands a strong grasp of scalable data architecture, problem-solving capabilities, and hands-on experience with distributed systems and modern data frameworks.

Key Responsibilities:
- Provide technical leadership across Big Data and Python-based projects
- Architect, design, and implement scalable data pipelines and processing systems
- Guide teams on best practices in data modeling, ETL/ELT development, and performance optimization
- Collaborate with data scientists, analysts, and stakeholders to ensure effective data solutions
- Conduct code reviews and mentor junior engineers to improve code quality and skills
- Evaluate and implement new tools and frameworks to enhance data capabilities
- Troubleshoot complex data-related issues and support production deployments
- Ensure compliance with data security and governance standards

Posted 3 weeks ago

Apply

3.0 - 6.0 years

15 - 16 Lacs

Pune

Work from Office

At YASH, we're a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth: bringing real positive change in an increasingly virtual world, and it drives us beyond generational gaps and the disruptions of the future. We are looking to hire MDM professionals in the following areas:

Job Title: Profisee MDM Developer
Experience Required: 4 to 6 years
Employment Type: Full-Time / Contract

Job Summary:
We are seeking a skilled and motivated Profisee MDM professional to join our data management team. The ideal candidate will have hands-on experience with Profisee MDM and a strong understanding of master data management principles, data governance, and data integration. You will play a key role in designing, implementing, and maintaining MDM solutions that support business operations and data quality initiatives.

Key Responsibilities:
- Design and configure Profisee MDM solutions including data models, business rules, workflows, and user interfaces.
- Collaborate with business and IT stakeholders to gather requirements and define master data domains.
- Implement data quality rules and validation processes to ensure data accuracy and consistency.
- Integrate MDM solutions with enterprise systems using ETL tools and APIs.
- Monitor and maintain MDM performance, troubleshoot issues, and optimize configurations.
- Support data governance initiatives and ensure compliance with data standards and policies.
- Provide training and support to end-users and contribute to documentation and best practices.

Required Skills:
- 4-6 years of experience in Master Data Management, with at least 2-3 years of hands-on experience in Profisee MDM.
- Strong understanding of data modeling, data governance, and data quality principles.
- Proficiency in SQL and experience with ETL tools and data integration techniques.
- Familiarity with enterprise data architecture and systems (e.g., ERP, CRM).
- Excellent problem-solving, communication, and collaboration skills.

Required Technical/Functional Competencies
- Domain/Industry Knowledge: Basic knowledge of the customer's business processes and the relevant technology platform or product. Able to prepare process maps, workflows, business cases and simple business models in line with customer requirements with assistance from an SME, and apply industry standards/practices in implementation with guidance from experienced team members.
- Requirement Gathering and Analysis: Working knowledge of requirement management and requirement analysis processes, tools and methodologies. Able to analyse the impact of a requested change, enhancement or defect fix, and identify dependencies or interrelationships among requirements and transition requirements for the engagement.
- Product/Technology Knowledge: Working knowledge of technology product/platform standards and specifications. Able to implement code or configure/customize products and provide inputs into design and architecture adhering to industry standards/practices; able to analyse various frameworks/tools, review code and provide feedback on improvement opportunities.
- Architecture Tools and Frameworks: Basic knowledge of industry architecture tools and frameworks. Able to analyse available tools and frameworks for review by the SME and plan for tool configurations and development.
- Architecture Concepts and Principles: Basic knowledge of architectural elements, SDLC and methodologies. Able to apply various architectural constructs in projects and identify and implement various architectural patterns.
- Analytics Solution Design: High-level awareness of a wide range of core data science/analytics techniques, their advantages, disadvantages, and areas of application.
- Tools & Platform Knowledge: Familiar with a wide range of mainstream commercial and open-source data science/analytics software tools, their constraints, advantages, disadvantages, and areas of application.

Required Behavioral Competencies
- Accountability: Takes responsibility for and ensures accuracy of own work, as well as the work and deadlines of the team.
- Collaboration: Participates in team activities and reaches out to others in the team to achieve common goals.
- Agility: Demonstrates a willingness to accept and embrace differing ideas or perceptions which are beneficial to the organization.
- Customer Focus: Displays awareness of customers' stated needs and gives priority to meeting and exceeding customer expectations at or above expected quality within the stipulated time.
- Communication: Targets communications for the appropriate audience, clearly articulating and presenting his/her position or decision.
- Drives Results: Sets realistic stretch goals for self and others to achieve and exceed defined goals/targets.

Certifications: Good to have.

Our Hyperlearning workplace is grounded upon four principles:
- Flexible work arrangements, free spirit, and emotional positivity
- Agile self-determination, trust, transparency, and open collaboration
- All support needed for the realization of business goals
- Stable employment with a great atmosphere and an ethical corporate culture

Posted 3 weeks ago

Apply

7.0 - 8.0 years

32 - 45 Lacs

Pune

Work from Office

We are looking to add an experienced and enthusiastic Lead Data Scientist to our Jet2 Data Science team in India. Reporting to the Data Science Delivery Manager, the Lead Data Scientist is a key appointment to the Data Science Team, with responsibility for executing the data science strategy and realising the benefits we can bring to the business by combining insights gained from multiple large data sources with the contextual understanding and experience of our colleagues across the business.

In this exciting role, you will be joining an established team of 40+ data science professionals, based across our UK and India bases, who are using data science to understand, automate and optimise key manual business processes, inform our marketing strategy, assess product development and revenue opportunities, and optimise operational costs. As Lead Data Scientist, you will have strong experience in leading data science projects and creating machine learning models, and be able to confidently communicate with and enthuse key business stakeholders.

Roles and Responsibilities
A typical day in your role at Jet2TT:
- Lead a team of data scientists, with responsibility for delivering and managing day-to-day activities.
- The successful candidate will be highly numerate with a statistical background, experienced in using R, Python or a similar statistical analysis package.
- Work with internal teams across the business to identify and collaborate with stakeholders across the wider group.
- Leading and coaching a group of data scientists, you will plan and execute the use of machine learning and statistical modelling tools suited to the identified initiative delivery or discovery problem.
- Analyse the created algorithms and models to understand how changes in metrics in one area of the business could impact other areas, and communicate those analyses to key business stakeholders.
- Identify efficiencies in the use of data across its lifecycle: reducing data redundancy, structuring data to ensure efficient use of time, and ensuring retained data/information provides value to the organisation and remains in line with legitimate business and/or regulatory requirements.
- Your ability to rise above groupthink and see beyond the here and now is matched only by your intellectual curiosity.
- Strong SQL skills and the ability to create clear data visualisations in tools such as Tableau or Power BI will be essential.
- You will also have experience in developing and deploying predictive models using machine learning frameworks, and will have worked with big data technologies.
- As we aim to realise the benefits of cloud technologies, some familiarity with cloud platforms like AWS for data science and storage would be desirable.
- You will be skilled in gathering data from multiple sources and in multiple formats, with knowledge of data warehouse design, logical and physical database design, and the challenges posed by data quality.

Qualifications, Skills and Experience (Candidate Requirements):
- Experience in leading a small to mid-size data science team
- Minimum 7 years of experience in the industry, with 4+ years in data science
- Experience in building and deploying machine learning algorithms, and detailed knowledge of applied statistics
- Good understanding of various data architectures: RDBMS, data warehouse and big data
- Experience of working with regions such as the US, UK, Europe or Australia is a plus
- Liaises with data engineers, technology leaders and business stakeholders
- Working knowledge of an Agile framework is good to have
- Demonstrates willingness to learn
- Mentoring and coaching team members
- Strong delivery performance, working on complex solutions in a fast-paced environment

Posted 3 weeks ago

Apply

3.0 - 4.0 years

17 - 18 Lacs

Bengaluru

Work from Office

KPMG India is looking for an Azure Data Engineer - Consultant to join our dynamic team and embark on a rewarding career journey.
- Assure that data is cleansed, mapped, transformed, and otherwise optimised for storage and use according to business and technical requirements
- Solution design using Microsoft Azure services and other tools
- The ability to automate tasks and deploy production-standard code (with unit testing, continuous integration, versioning etc.)
- Load transformed data into storage and reporting structures in destinations including the data warehouse, high-speed indexes, real-time reporting systems and analytics applications
- Build data pipelines to collectively bring together data
- Other responsibilities include extracting data, troubleshooting and maintaining the data warehouse

Posted 3 weeks ago

Apply

6.0 - 13.0 years

20 - 25 Lacs

Hyderabad

Work from Office

ServiceNow is looking for an experienced Data Platform Architect to join our Customer Journey Data Platform and Insights team. This is a senior-level, hands-on architecture role focused on building a unified, intelligent data platform that brings together signals across the entire customer journey, from product usage and learning engagement to support interactions and community activity. You will play a critical role in designing and delivering a scalable, secure, performant and AI-ready data platform that powers customer understanding, insights generation, and value-driven actions. Experience with the ServiceNow platform, enterprise software, and AI/ML integration is highly preferred.

Key Responsibilities

Platform Architecture & Design
- Design and lead the technical architecture for a unified customer journey data platform leveraging modern data and AI technologies.
- Define data modeling standards, integration blueprints, and lifecycle management practices.
- Architect scalable, distributed data systems including data lakes, data mesh, data fabric, data warehouses, and real-time processing pipelines.
- Create reusable architectural patterns that support real-time and batch processing across structured and unstructured datasets.

Technical Strategy & Leadership
- Partner with engineering, product, and analytics teams to align technical roadmaps with customer insights and business objectives.
- Translate business needs and customer signals into data solutions and architectural strategies.
- Provide thought leadership in evaluating and implementing cutting-edge tools, frameworks, and best practices.
- Drive architectural governance, review processes, and architecture maturity models.

Cloud, Governance, and Security
- Design and optimize cloud-native solutions using Azure, Snowflake, Databricks, and other modern platforms.
- Establish governance frameworks ensuring data quality, privacy, lineage, and compliance across all sources.
- Implement robust data security practices, including RBAC, encryption, and access auditing.

AI/ML & Analytics Integration
- Architect platforms to support AI/ML workflows, including feature stores, model inference pipelines, and feedback loops.
- Work closely with data scientists and ML engineers to embed intelligence across customer journeys.
- Enable data democratization and advanced insights through curated datasets, semantic layers, and analytical products.

ServiceNow Platform Alignment
- Integrate deeply with ServiceNow platform data models, APIs, and event-based systems.
- Architect solutions that enhance customer understanding within IT/DT software domains and workflows powered by ServiceNow.
- Ensure platform extensibility and an API-driven architecture that supports self-service analytics, operational workflows, and personalization.

To be successful in this role you have:
- Experience in leveraging or critically thinking about how to integrate AI into work processes, decision-making, or problem-solving. This may include using AI-powered tools,
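Role-based access control of the kind listed under "Cloud, Governance, and Security" reduces to a mapping from roles to the operations they may perform on each data asset. A minimal, hypothetical Python sketch (role and dataset names are invented for illustration):

```python
# Hypothetical RBAC model: each role is granted a set of actions per dataset.
ROLE_GRANTS = {
    "analyst": {"customer_journey": {"read"}},
    "data_engineer": {"customer_journey": {"read", "write"}},
}

def is_allowed(role: str, dataset: str, action: str) -> bool:
    """Return True only if the role has been granted the action on the dataset;
    unknown roles and datasets are denied by default."""
    return action in ROLE_GRANTS.get(role, {}).get(dataset, set())

assert is_allowed("data_engineer", "customer_journey", "write")
assert not is_allowed("analyst", "customer_journey", "write")
assert not is_allowed("guest", "customer_journey", "read")  # deny by default
```

In practice such checks are enforced by the platform itself (e.g. warehouse grants or a catalog's access policies) rather than in application code; the sketch only illustrates the deny-by-default shape of the model.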

Posted 3 weeks ago

Apply

4.0 - 8.0 years

25 - 30 Lacs

Hyderabad

Work from Office

Job Description
We are seeking a Lead Snowflake Engineer to join our dynamic Data Engineering team. This role will involve owning the architecture, implementation, and optimization of our Snowflake-based data warehouse solutions while mentoring a team of engineers and driving project success. The ideal candidate will bring deep technical expertise in Snowflake, hands-on experience with DBT (Data Build Tool), and a collaborative mindset for working across data, analytics, and business teams.

Key Responsibilities:
- Design and implement scalable and efficient Snowflake data warehouse architectures and ELT pipelines.
- Leverage DBT to build and manage data transformation workflows within Snowflake.
- Lead data modeling efforts to support analytics and reporting needs across the organization.
- Optimize Snowflake performance, including query tuning, resource scaling, and storage usage.
- Collaborate with business stakeholders and data analysts to gather requirements and deliver high-quality data solutions.
- Manage and mentor a team of data engineers; provide technical guidance, code reviews, and career development support.
- Establish and enforce best practices for data engineering, including version control, CI/CD, documentation, and data quality.
- Ensure data solutions are secure, compliant, and aligned with privacy regulations (e.g., GDPR, CCPA).
- Continuously evaluate emerging tools and technologies to enhance our data ecosystem.

Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 6+ years of experience in data engineering, including at least 2+ years of hands-on experience with
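The ELT pattern the responsibilities describe typically lands raw data in the warehouse first and then applies transformations in place, often as idempotent MERGE (upsert) statements of the kind dbt generates for incremental models with a `unique_key`. A hypothetical sketch of rendering such a statement (table and column names are invented; this is not dbt's actual code generator):

```python
def render_merge(target: str, staging: str, key: str, cols: list) -> str:
    """Render an idempotent upsert: update matched rows, insert new ones.
    Re-running the statement on the same staging data changes nothing."""
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in cols)
    col_list = ", ".join([key] + cols)
    src_list = ", ".join(f"s.{c}" for c in [key] + cols)
    return (
        f"MERGE INTO {target} t USING {staging} s ON t.{key} = s.{key} "
        f"WHEN MATCHED THEN UPDATE SET {set_clause} "
        f"WHEN NOT MATCHED THEN INSERT ({col_list}) VALUES ({src_list})"
    )

sql = render_merge("analytics.orders", "staging.orders", "order_id",
                   ["status", "amount"])
assert sql.startswith("MERGE INTO analytics.orders")
assert "WHEN NOT MATCHED THEN INSERT (order_id, status, amount)" in sql
```

With dbt, the same effect comes from declaring `materialized='incremental'` and a `unique_key` on the model, letting the tool emit the warehouse-specific MERGE.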

Posted 3 weeks ago

Apply

8.0 - 13.0 years

20 - 25 Lacs

Ahmedabad

Work from Office

Position Overview
This role is responsible for defining and delivering ZURU's next-generation data architecture, built for global scalability, real-time analytics, and AI enablement. You will lead the unification of fragmented data systems into a cohesive, cloud-native platform that supports advanced business intelligence and decision-making. Sitting at the intersection of data strategy, engineering, and commercial enablement, this role demands both deep technical acumen and strong cross-functional influence. You will drive the vision and implementation of robust data infrastructure, champion governance standards, and embed a culture of data excellence across the organisation.

Position Impact
In the first six months, the Head of Data Architecture will gain a deep understanding of ZURU's operating model, technology stack, and data fragmentation challenges. You'll conduct a comprehensive review of the current architecture, identifying performance gaps, security concerns, and integration challenges across systems like SAP, Odoo, POS, and marketing platforms. By month twelve, you'll have delivered a fully aligned architecture roadmap, implementing cloud-native infrastructure, data governance standards, and scalable models and pipelines to support AI and analytics. You will have stood up a Centre of Excellence for Data, formalised global data team structures, and established yourself as a trusted partner to senior leadership.

What are you Going to do
Lead Global Data Architecture: Own the design, evolution, and delivery of ZURU's enterprise data architecture across cloud and hybrid environments.
Consolidate Core Systems: Unify data sources across SAP, Odoo, POS, IoT, and media into a single analytical platform optimised for business value.
Build Scalable Infrastructure: Architect cloud-native solutions that support both batch and streaming data workflows using tools like Databricks, Kafka, and Snowflake.
Implement Governance Frameworks: Define and enforce enterprise-wide data standards for access control, privacy, quality, security, and lineage.
Enable Metadata Cataloguing: Deploy metadata management and cataloguing tools to enhance data discoverability and self-service analytics.
Operationalise AI/ML Pipelines: Lead data architecture that supports AI/ML initiatives, including demand forecasting, pricing models, and personalisation.
Partner Across Functions: Translate business needs into data architecture solutions by collaborating with leaders in Marketing, Finance, Supply Chain, R&D, and Technology.
Optimise Cloud Cost and Performance: Roll out compute and storage systems that balance cost efficiency, performance, and observability across platforms.
Establish Data Leadership: Build and mentor a high-performing data team across India and NZ, and drive alignment across engineering, analytics, and governance.
Vendor and Tool Strategy: Evaluate external tools and partners to ensure the data ecosystem is future-ready, scalable, and cost-effective.

What are we Looking for
8+ years of experience in data architecture, with 3+ years in a senior or leadership role across cloud or hybrid environments
Proven ability to design and scale large data platforms supporting analytics, real-time reporting, and AI/ML use cases
Hands-on expertise with ingestion, transformation, and orchestration pipelines (e.g. Kafka, Airflow, DBT, Fivetran)
Strong knowledge of ERP data models, especially SAP and Odoo
Experience with data governance, compliance (GDPR/CCPA), metadata cataloguing, and security practices
Familiarity with distributed systems and streaming frameworks like Spark or Flink
Strong stakeholder management and communication skills, with the ability to influence both technical and business teams
Experience building and leading cross-regional data teams

Tools & Technologies
Cloud Platforms: AWS (S3, EMR, Kinesis, Glue), Azure (Synapse, ADLS), GCP
Big Data: Hadoop, Apache Spark, Apache Flink
Streaming: Kafka, Kinesis, Pub/Sub
Orchestration: Airflow, Prefect, Dagster, DBT
Warehousing: Snowflake, Redshift, BigQuery, Databricks Delta
NoSQL: Cassandra, DynamoDB, HBase, Redis
Query Engines: Presto/Trino, Athena
IaC & CI/CD: Terraform, GitLab CI
Monitoring: Prometheus, Grafana, ELK, OpenTelemetry
Security/Governance: IAM, TLS, KMS, Amundsen, DataHub, Collibra, DBT for lineage

What do we Offer
Competitive compensation
5 working days with flexible working hours
Medical insurance for self & family
Training & skill development programs
Work with the global team and make the most of its diverse knowledge
Several discussions over multiple pizza parties
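The orchestration tools this listing names (Airflow, Prefect, Dagster) all schedule tasks as a dependency graph. As a minimal, tool-agnostic illustration of the dependency-ordered execution those schedulers formalize, here is a pure-Python sketch using the standard library's graphlib; the task names and bodies are hypothetical placeholders, not any real pipeline.

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical pipeline steps for illustration only; real deployments
# would declare these as Airflow tasks or DBT models.
def extract():   return "raw"
def transform(): return "clean"
def load():      return "loaded"

# Dependency graph: each task maps to the set of tasks it depends on.
dag = {"transform": {"extract"}, "load": {"transform"}}

def run_pipeline(dag, tasks):
    """Execute tasks in dependency order, mimicking a DAG scheduler."""
    results = {}
    for name in TopologicalSorter(dag).static_order():
        results[name] = tasks[name]()
    return results

results = run_pipeline(dag, {"extract": extract, "transform": transform, "load": load})
```

Production schedulers add retries, backfills, and distributed execution on top of exactly this ordering guarantee.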

Posted 3 weeks ago

Apply

3.0 - 8.0 years

10 - 14 Lacs

Pune

Work from Office

Not Applicable Specialism: Data, Analytics & AI
Management Level: Senior Associate

Summary
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Key Attributes
Hands-on experience in data migration and ETL tools like BODS
LTMC knowledge with SAP functional understanding
Excellent project management and communication skills
End-to-end data migration landscape knowledge
Lead role experience
Domestic client handling experience, i.e. working with clients with domestic onsite
Understanding of data security and data compliance
Agile understanding
Certification (good to have)
Domain knowledge of the manufacturing industry sector

Mandatory skill sets: SAP BODS
Preferred skill sets: SAP BODS
Years of experience required: 3-8 years
Education qualification: BE, B.Tech, MCA, M.Tech
Education Degrees/Field of Study required: Bachelor of Technology, Master of Engineering, Bachelor of Engineering
Required Skills: SAP BO Data Services (BODS), Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 27 more}

Posted 3 weeks ago

Apply

4.0 - 8.0 years

11 - 15 Lacs

Ahmedabad

Work from Office

Not Applicable Specialism: Data, Analytics & AI
Management Level: Senior Associate

Summary
In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.

Roles & Responsibilities
Hands-on experience with Power BI dashboard development; willing to work as an individual contributor
Clear understanding of data warehousing concepts
Work closely with a data engineering team to perform data extraction and transformation processes to create datasets
Good experience with the different categories of DAX functions: time intelligence, filter, date, logical, text, number, and statistical functions
Good experience with visual-level, page-level, report-level, and drill-through filters for filtering data in a report
Experience implementing Row Level Security (RLS) in Power BI
Work with the On-Premises Data Gateway to refresh and schedule refreshes of datasets
Strong data transformation skills in Power Query Editor, with familiarity with the M language
Data modelling knowledge, including joins on multiple tables and creating bridge tables
Knowledge of Power BI Desktop features such as bookmarks, selections, sync slicers, and edit interactions
Knowledge of Power BI Service features such as creating imports, scheduling extract refreshes, and managing subscriptions
Publishing and maintenance of apps in Power BI, including configuring row-level and dashboard-level security in Power BI Service
Experience creating and publishing reports in both web and mobile layouts
Able to perform unit testing, such as functionality testing and data validation
Report performance optimization and troubleshooting
Clear understanding of UI and UX design
Hands-on working experience writing SQL queries
Very good communication skills; must be able to discuss requirements effectively with business owners

Mandatory skill sets: Power BI, DAX
Preferred skill sets: Power BI, DAX
Years of experience required: 4-8 years
Educational qualification: BE, B.Tech, MCA, M.Tech
Education Degrees/Field of Study required: Master of Business Administration, Bachelor of Technology, Bachelor of Engineering
Required Skills: DAX Language, Power BI, Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis {+ 16 more}
Travel Requirements: Available for Work Visa Sponsorship
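The DAX time-intelligence functions this listing calls out (for example TOTALYTD) compute period-to-date aggregates over a date table. DAX itself only runs inside Power BI, so here is a pure-Python sketch of the underlying year-to-date logic, with hypothetical monthly sales figures; the reset-at-year-boundary behaviour is the essence of what the DAX measure encapsulates.

```python
from datetime import date

# Hypothetical monthly sales figures for illustration; in Power BI this
# running total would come from a DAX measure such as TOTALYTD(SUM(...), 'Date'[Date]).
sales = [
    (date(2024, 1, 31), 100),
    (date(2024, 2, 29), 150),
    (date(2024, 3, 31), 120),
]

def year_to_date(rows):
    """Running total that resets at each calendar-year boundary."""
    totals, running, year = [], 0, None
    for d, amount in rows:
        if d.year != year:          # new year: restart the accumulation
            running, year = 0, d.year
        running += amount
        totals.append((d, running))
    return totals

ytd = year_to_date(sales)
```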

Posted 3 weeks ago

Apply

2.0 - 4.0 years

6 - 9 Lacs

Chennai

Work from Office

Not Applicable Specialism: Data, Analytics & AI
Management Level: Senior Associate

Summary
In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes, and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other.

Requirements
Hands-on experience in IICS / Informatica PowerCenter
Demonstrated involvement in end-to-end IICS/IDMC projects
Structured Query Language (SQL) and data warehouse expertise
Experience in Extract, Transform, Load (ETL) testing
Effective communication skills

Key Responsibilities
Design and develop ETL processes using Informatica IICS / Informatica PowerCenter
Collaborate with stakeholders to gather requirements and translate them into technical specifications
Good expertise in IICS Data Integration / Informatica PowerCenter and Application Integration, and Oracle SQL
Implement data integration solutions that ensure data accuracy and consistency
Monitor and optimize existing workflows for performance and efficiency
Troubleshoot and resolve any issues related to data integration and ETL processes
Maintain documentation for data processes and integration workflows

Mandatory skill sets: ETL Informatica
Preferred skill sets: ETL Informatica
Years of experience required: 2-4 years
Education qualification: BTech/MBA/MCA
Education Degrees/Field of Study required: Master of Business Administration, Bachelor of Technology
Required Skills: ETL (Informatica), Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis {+ 16 more}

Posted 3 weeks ago

Apply

2.0 - 4.0 years

5 - 9 Lacs

Hyderabad

Work from Office

About the role
You'll be at the heart of developing and maintaining our sophisticated in-house insurance products built on relational or document databases. You will have the opportunity to join one of our product teams and contribute to the development of functionality which generates real business impact.

About the team
We are a team that believes in engineering excellence and that our leaders should also be engineers themselves. We build applications that are carefully designed, thoughtfully implemented, and surpass the expectations of our users by working together with product owners. Quality and stability are first-class deliverables in everything we do, and we lead by example by embedding high standards into our processes.

Your responsibilities include
Design, develop, deploy, and support sustainable data/solutions architectures, such as design patterns, reference data architecture, and conceptual, logical, and physical data models for both relational and NoSQL databases
Data migration, ingestion, and transfer to and from heterogeneous databases and file types
Performance optimization (query fine-tuning, indexing strategy, etc.)
Support the project team in conducting public cloud data growth and data service consumption assessments and forecasts
Collaborate effectively within a cross-functional team including requirements engineers, QA specialists, and other application engineers
Stay current with emerging technologies and generative AI developments to continuously improve our solutions

About you
You're a naturally curious and thoughtful professional who thrives in a high-performance engineering environment. Your passion for coding is matched by your commitment to delivering business value. You believe in continuous learning, through self-improvement or by absorbing knowledge from those around you, and you're excited to contribute to a team that values technical excellence.

You should bring the following skills and experiences
Proficient in relational and NoSQL databases
Proficient in PL/SQL programming
Strong data model and database design skills for both relational and NoSQL
Experience with seamless data integration using Informatica and Azure Data Factory
Previous public cloud experience, particularly with Microsoft Azure
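The query fine-tuning and indexing-strategy responsibility above can be illustrated with SQLite as a lightweight stand-in for the relational engines in scope; the table and column names below are hypothetical. The point of the index is to turn a full-table scan on the filter column into a B-tree seek.

```python
import sqlite3

# Minimal indexing-strategy sketch using SQLite; schema is illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (id INTEGER PRIMARY KEY, policy_no TEXT, amount REAL)")
conn.executemany("INSERT INTO claims (policy_no, amount) VALUES (?, ?)",
                 [(f"P{i % 100:03d}", float(i)) for i in range(1000)])

# Without an index this lookup scans all 1000 rows; with it, the engine
# seeks directly to the matching policy numbers.
conn.execute("CREATE INDEX idx_claims_policy ON claims (policy_no)")

# EXPLAIN QUERY PLAN confirms the index is actually used.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM claims WHERE policy_no = ?", ("P007",)
).fetchall()
total = conn.execute(
    "SELECT SUM(amount) FROM claims WHERE policy_no = ?", ("P007",)
).fetchone()[0]
```

Checking the plan after adding an index, rather than assuming it is used, is the habit that generalizes to Oracle, SQL Server, and the document stores mentioned above.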

Posted 3 weeks ago

Apply

6.0 - 8.0 years

22 - 27 Lacs

Hyderabad

Work from Office

We are seeking a Lead Snowflake Engineer to join our dynamic Data Engineering team. This role will involve owning the architecture, implementation, and optimization of our Snowflake-based data warehouse solutions while mentoring a team of engineers and driving project success. The ideal candidate will bring deep technical expertise in Snowflake, hands-on experience with DBT (Data Build Tool), and a collaborative mindset for working across data, analytics, and business teams.

Key Responsibilities:
Design and implement scalable and efficient Snowflake data warehouse architectures and ELT pipelines.
Leverage DBT to build and manage data transformation workflows within Snowflake.
Lead data modeling efforts to support analytics and reporting needs across the organization.
Optimize Snowflake performance, including query tuning, resource scaling, and storage usage.
Collaborate with business stakeholders and data analysts to gather requirements and deliver high-quality data solutions.
Manage and mentor a team of data engineers; provide technical guidance, code reviews, and career development support.
Establish and enforce best practices for data engineering, including version control, CI/CD, documentation, and data quality.
Ensure data solutions are secure, compliant, and aligned with privacy regulations (e.g., GDPR, CCPA).
Continuously evaluate emerging tools and technologies to enhance our data ecosystem.

Qualifications:
Bachelor's degree in Computer Science, Information Technology, or related field.
6+ years of experience in data engineering, including at least 2+ years of hands-on experience with Snowflake.
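Incremental, re-runnable loads are central to the DBT-on-Snowflake workflows this role describes. The sketch below shows the idempotent upsert pattern that a Snowflake MERGE statement or a DBT incremental model performs, using SQLite's ON CONFLICT clause as a local stand-in; the table and column names are hypothetical.

```python
import sqlite3

# Idempotent incremental-load sketch; SQLite stands in for Snowflake here.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_customer (id INTEGER PRIMARY KEY, name TEXT, updated_at TEXT)")

def upsert(rows):
    """Insert new rows and update existing ones; re-running a batch is safe."""
    conn.executemany(
        """INSERT INTO dim_customer (id, name, updated_at) VALUES (?, ?, ?)
           ON CONFLICT(id) DO UPDATE SET name = excluded.name,
                                         updated_at = excluded.updated_at""",
        rows,
    )

upsert([(1, "Acme", "2024-01-01"), (2, "Globex", "2024-01-01")])
upsert([(2, "Globex Ltd", "2024-02-01"), (3, "Initech", "2024-02-01")])  # replay-safe

rows = conn.execute("SELECT id, name FROM dim_customer ORDER BY id").fetchall()
```

Idempotence is what lets a failed pipeline run be retried without duplicating or corrupting the target table.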

Posted 3 weeks ago

Apply

8.0 - 12.0 years

35 - 50 Lacs

Hyderabad

Work from Office

Job Description: Senior Data Analyst
Location: Hyderabad, IN - Work from Office
Experience: 7+ Years

Role Summary
We are seeking an experienced and highly skilled Senior Data Analyst to join our team. The ideal candidate will possess deep proficiency in SQL, a strong understanding of data architecture, and working knowledge of the Google Cloud Platform (GCP)-based ecosystem. They will be responsible for turning complex business questions into actionable insights, driving strategic decisions, and helping shape the future of our Product/Operations team. This role requires a blend of technical expertise, analytical rigor, and excellent communication skills to partner effectively with engineering, product, and business leaders.

Key Responsibilities
Advanced Data Analysis: Utilize advanced SQL skills to query, analyze, and manipulate large, complex datasets. Develop and maintain robust, scalable dashboards and reports to monitor key performance indicators (KPIs).
Source Code Management: Effectively manage, version, and collaborate on code using codebase management systems like GitHub. Uphold data integrity, produce reproducible analyses, and foster a collaborative database management environment through best practices in version control and code documentation.
Strategic Insights: Partner with product managers and business stakeholders to define and answer critical business questions. Conduct deep-dive analyses to identify trends, opportunities, and root causes of performance changes.
Data Architecture & Management: Work closely with data engineers to design, maintain, and optimize data schemas and pipelines. Provide guidance on data modeling best practices and ensure data integrity and quality.
Reporting & Communication: Translate complex data findings into clear, concise, and compelling narratives for both technical and non-technical audiences. Present insights and recommendations to senior leadership to influence strategic decision-making.
Project Leadership: Lead analytical projects from end to end, including defining project scope, methodology, and deliverables. Mentor junior analysts, fostering a culture of curiosity and data-driven problem-solving.

Required Skills & Experience
Bachelor's degree in a quantitative field such as Computer Science, Statistics, Mathematics, Economics, or a related discipline.
5+ years of professional experience in a data analysis or business intelligence role.
Expert-level proficiency in SQL, with a proven ability to write complex queries, perform window functions, and optimize queries for performance on massive datasets.
Strong understanding of data architecture, including data warehousing, data modeling (e.g., star/snowflake schemas), and ETL/ELT principles.
Excellent communication and interpersonal skills, with a track record of successfully influencing stakeholders.
Experience with a business intelligence tool such as Tableau, Looker, or Power BI to create dashboards and visualizations.
Experience with internal Google/Alphabet data tools and infrastructure, such as BigQuery, Dremel, or Google-internal data portals.
Experience with statistical analysis, A/B testing, and experimental design.
Familiarity with machine learning concepts and their application in a business context.
A strong sense of curiosity and a passion for finding and communicating insights from data.
Proficiency with scripting languages for data analysis (e.g., Apps Script, Python, or R) would be an added advantage.

Responsibilities
Lead a team of data scientists and analysts to deliver data-driven insights and solutions.
Oversee the development and implementation of data models and algorithms to support new product development.
Provide strategic direction for data science projects, ensuring alignment with business goals.
Collaborate with cross-functional teams to integrate data science solutions into business processes.
Analyze complex datasets to identify trends and patterns that inform business decisions.
Utilize generative AI techniques to develop innovative solutions for product development.
Ensure adherence to ITIL V4 practices in all data science projects.
Develop and maintain documentation for data science processes and methodologies.
Mentor and guide team members to enhance their technical and analytical skills.
Monitor project progress and adjust strategies to meet deadlines and objectives.
Communicate findings and recommendations to stakeholders in a clear and concise manner.
Drive continuous improvement in data science practices and methodologies.
Foster a culture of innovation and collaboration within the data science team.

Qualifications
Strong experience in business analysis and data analysis.
Expertise in generative AI and its applications in product development.
Solid understanding of ITIL V4 practices and their implementation.
Excellent communication and collaboration skills.
Proficiency in managing and leading a team of data professionals.
Commitment to working from the office during day shifts.
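The expert-level SQL this role calls out includes window functions. Here is a small runnable sketch of a partitioned window query using SQLite (window functions require SQLite 3.25 or later, which is bundled with modern Python); the dataset and column names are hypothetical.

```python
import sqlite3

# Window-function sketch; data is illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("east", 10), ("east", 30), ("west", 20), ("west", 5)])

# Rank each order within its region by amount: the window is partitioned
# by region, so ranking restarts for every region.
rows = conn.execute("""
    SELECT region, amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
    FROM orders
    ORDER BY region, rnk
""").fetchall()
```

The same PARTITION BY / ORDER BY shape carries over directly to BigQuery and other warehouses mentioned in the listing.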

Posted 3 weeks ago

Apply

9.0 - 14.0 years

40 - 80 Lacs

Bengaluru

Work from Office

About the Role: We are seeking a highly skilled Data Solutions Architect - Business Intelligence & AI to lead the design and delivery of advanced data solutions. This role requires a seasoned professional with deep technical expertise, consulting experience, and leadership capabilities to drive data transformation initiatives. The ideal candidate will play a pivotal role in architecting scalable data platforms, enabling AI-driven automation, and mentoring a team of data engineers and analysts.

Posted 3 weeks ago

Apply

4.0 - 8.0 years

8 - 12 Lacs

Pune

Work from Office

Piller Soft Technology is looking for a Lead Data Engineer to join our dynamic team and embark on a rewarding career journey.
Designing and developing data pipelines: design and develop data pipelines that move data from various sources to storage and processing systems.
Building and maintaining data infrastructure: build and maintain data infrastructure such as data warehouses, data lakes, and data marts.
Ensuring data quality and integrity: set up data validation processes and implement data quality checks.
Managing data storage and retrieval: design and implement data storage systems, such as NoSQL databases or Hadoop clusters.
Developing and maintaining data models: develop and maintain data models, such as data dictionaries and entity-relationship diagrams, to ensure consistency in data architecture.
Managing data security and privacy: implement security measures, such as access controls and encryption, to protect sensitive data.
Leading and managing a team: lead and manage a team of data engineers, providing guidance and support for their work.
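The data validation processes mentioned above usually reduce to a set of per-row predicate checks. A framework-agnostic sketch of that idea, with illustrative rules and field names (not any specific tool's API):

```python
# Minimal data-quality check sketch; rules and field names are illustrative.
def validate(rows, rules):
    """Return rows failing any rule, as (row_index, rule_name) pairs."""
    failures = []
    for i, row in enumerate(rows):
        for name, check in rules.items():
            if not check(row):
                failures.append((i, name))
    return failures

rules = {
    "id_present":      lambda r: r.get("id") is not None,
    "amount_positive": lambda r: r.get("amount", 0) > 0,
}

rows = [{"id": 1, "amount": 9.5}, {"id": None, "amount": 3.0}, {"id": 3, "amount": -2}]
bad = validate(rows, rules)
```

In a real pipeline the failure list would feed a quarantine table or alerting, rather than being inspected by hand.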

Posted 3 weeks ago

Apply

4.0 - 7.0 years

7 - 11 Lacs

Noida

Work from Office

Design, implement, and maintain data pipelines for processing large datasets, ensuring data availability, quality, and efficiency for machine learning model training and inference.
Collaborate with data scientists to streamline the deployment of machine learning models, ensuring scalability, performance, and reliability in production environments.
Develop and optimize ETL (Extract, Transform, Load) processes, ensuring data flow from various sources into structured data storage systems.
Automate ML workflows using MLOps tools and frameworks (e.g., Kubeflow, MLflow, TensorFlow Extended (TFX)).
Ensure effective model monitoring, versioning, and logging to track performance and metrics in a production setting.
Collaborate with cross-functional teams to improve data architectures and facilitate the continuous integration and deployment of ML models.
Work on data storage solutions, including databases, data lakes, and cloud-based storage systems (e.g., AWS, GCP, Azure).
Ensure data security, integrity, and compliance with data governance policies.
Perform troubleshooting and root cause analysis on production-level machine learning systems.

Skills: Glue, PySpark, AWS services, strong in SQL. Nice to have: Redshift, knowledge of SAS datasets.

Mandatory Competencies
DevOps/Configuration Mgmt - Docker
ETL - AWS Glue
DevOps/Configuration Mgmt - Cloud Platforms - AWS
DevOps/Configuration Mgmt - Containerization (Docker, Kubernetes)
Database - SQL Server - SQL Packages
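Model versioning and logging, as described above, are normally handled by tools like MLflow; the toy registry below only sketches the core idea of tracking runs and selecting the best one by a metric. All names here are hypothetical, not MLflow's API.

```python
import hashlib
import json

# Toy model-registry sketch illustrating versioning and metric logging.
class ModelRegistry:
    def __init__(self):
        self._versions = []

    def register(self, params, metrics):
        """Log a run's params/metrics under a content-derived version id."""
        payload = json.dumps({"params": params, "metrics": metrics}, sort_keys=True)
        version = hashlib.sha256(payload.encode()).hexdigest()[:8]
        self._versions.append({"version": version, "params": params, "metrics": metrics})
        return version

    def best(self, metric):
        """Return the registered run with the highest value of `metric`."""
        return max(self._versions, key=lambda v: v["metrics"][metric])

reg = ModelRegistry()
reg.register({"max_depth": 3}, {"auc": 0.81})
reg.register({"max_depth": 5}, {"auc": 0.86})
best = reg.best("auc")
```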

Posted 3 weeks ago

Apply

4.0 - 7.0 years

7 - 11 Lacs

Noida

Work from Office

Experience in developing and managing dashboards and reports in Tableau
In-depth knowledge and a sound understanding of RDBMS systems, SQL, business intelligence, and data analytics
Excellent analytical skills to forecast and predict trends and insights using past and current data
Knowledge of data architecture, data modelling, data mapping, data analysis, and data visualization
Able to build visually stunning and interactive dashboards in Tableau
Good to have: knowledge of Power BI

Mandatory Competencies
Beh - Communication and collaboration
BI and Reporting Tools - Power BI
BI and Reporting Tools - Tableau
Database - Database Programming - SQL
QA/QE - QA Analytics - Data Analysis

Posted 3 weeks ago

Apply