
310 Data Lake Jobs - Page 10

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

4.0 - 5.0 years

3 - 7 Lacs

Mumbai, Pune, Chennai

Work from Office

Naukri logo

Job Category: IT | Job Type: Full Time | Location: Bangalore / Chennai / Mumbai / Pune | Experience: 4 to 5 years

JD: Azure Data Engineer with QA

Must have: Azure Databricks, Azure Data Factory, Spark SQL

- 4-5 years of development experience in Azure Databricks.
- Strong experience in SQL, including performing quality assurance on Azure Databricks workloads.
- Understand complex data systems by working closely with engineering and product teams.
- Develop scalable and maintainable applications to extract, transform, and load data in various formats to SQL Server, a Hadoop data lake, or other data storage locations.

Kind note: please apply or share your resume only if it matches the above criteria.

Posted 1 month ago

Apply

3.0 - 5.0 years

10 - 15 Lacs

Pune

Work from Office


About the Role: Data Engineer

Core Responsibilities:
- Lead one of the key analytics areas end to end; this is a pure hands-on role.
- Ensure the solutions built meet the required best practices and coding standards.
- Adapt to any new technology if the situation demands.
- Gather requirements with the business and get them prioritized in the sprint cycle.
- Take end-to-end responsibility for the assigned task; ensure quality and timely delivery.

Preference and Experience:
- Strong PySpark, Python, and Java fundamentals.
- Good understanding of data structures.
- Good at SQL queries and optimization.
- Strong fundamentals of OOP programming.
- Good understanding of AWS Cloud and Big Data.
- Nice to have: Data Lake, AWS Glue, Athena, S3, Kinesis, SQL/NoSQL databases.

Academic Qualifications:
- Must be a technical graduate: B.Tech / M.Tech from Tier 1/2 colleges.

Posted 1 month ago

Apply

2.0 - 4.0 years

10 - 14 Lacs

Hyderabad

Work from Office


Overview
As a data engineering lead, you will be the key technical expert overseeing PepsiCo's data product build and operations, driving a strong vision for how data engineering can proactively create a positive impact on the business. You will be empowered to create and lead a strong team of data engineers who build data pipelines into various source systems, land data on the PepsiCo Data Lake, and enable exploration and access for analytics, visualization, machine learning, and product development efforts across the company.

Responsibilities
- Act as a subject matter expert across different digital projects.
- Oversee work with internal clients and external partners to structure and store data into unified taxonomies, linked together with standard identifiers.
- Manage and scale data pipelines from internal and external data sources to support new product launches and drive data quality across data products.
- Build and own the automation and monitoring frameworks that capture metrics and operational KPIs for data pipeline quality and performance.
- Implement best practices around systems integration, security, performance, and data management.
- Empower the business by creating value through increased adoption of data, data science, and the business intelligence landscape.
- Collaborate with internal clients (data science and product teams) to drive solutioning and POC discussions.
- Evolve the architectural capabilities and maturity of the data platform by engaging with enterprise architects and strategic internal and external partners.
- Develop and optimize procedures to productionalize data science models.
- Define and manage SLAs for data products and processes running in production.
- Support large-scale experimentation done by data scientists.
- Prototype new approaches and build solutions at scale.
- Research state-of-the-art methodologies.
- Create documentation for learnings and knowledge transfer.
- Create and audit reusable packages or libraries.

Qualifications
- 7+ years of overall technology experience, including at least 5 years of hands-on software development, data engineering, and systems architecture.
- 4+ years of experience with data lake infrastructure, data warehousing, and data analytics tools.
- 4+ years of experience in SQL optimization and performance tuning, plus development experience in programming languages such as Python, PySpark, and Scala.
- 2+ years of cloud data engineering experience in Azure; fluent with Azure cloud services. Azure certification is a plus.
- Experience with Azure Log Analytics.
- Experience integrating multi-cloud services with on-premises technologies.
- Experience with data modeling, data warehousing, and building high-volume ETL/ELT pipelines.
- Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations.
- Experience building and operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets.
- Experience with at least one MPP database technology such as Redshift, Synapse, or Snowflake.
- Experience running and scaling applications on cloud infrastructure and containerized services like Kubernetes.
- Experience with version control systems like GitHub and with deployment and CI tools.
- Experience with Azure Data Factory, Azure Databricks, and Azure Machine Learning tools.
- Experience with statistical/ML techniques is a plus.
- Experience building solutions in the retail or supply chain space is a plus.
- Understanding of metadata management, data lineage, and data glossaries is a plus.
- Working knowledge of agile development, including DevOps and DataOps concepts.
- Familiarity with business intelligence tools such as Power BI.
- B.Tech/BA/BS in Computer Science, Math, Physics, or another technical field.

Posted 1 month ago

Apply

12.0 - 20.0 years

3 - 7 Lacs

Bengaluru

Work from Office


Job Title: Senior Software Engineer
Experience: 12-20 years
Location: Bangalore

Requirements:
- Strong knowledge and hands-on experience in AWS Databricks.
- Nice to have: worked in the HP ecosystem (FDL architecture).
- Technically strong; able to help the team with any technical issues they face during execution.
- Owns the end-to-end technical deliverables.
- Hands-on Databricks and SQL knowledge.
- Experience with AWS S3, Redshift, EC2, and Lambda services.
- Extensive experience developing and deploying big data pipelines.
- Experience with Azure Data Lake.
- Strong hands-on SQL development / Azure SQL, with an in-depth understanding of SQL optimization and tuning techniques on Redshift.
- Development in notebooks (Jupyter, Databricks, Zeppelin, etc.).
- Development experience in Spark.
- Experience in a scripting language such as Python, plus any other programming language.

Roles and Responsibilities:
- Candidate must have hands-on experience in AWS Databricks.
- Good development experience using Python/Scala, Spark SQL, and DataFrames.
- Hands-on experience with Databricks, Data Lake, and SQL is a must.
- Performance tuning, troubleshooting, and debugging Spark.

Process Skills: Agile (Scrum)
Qualification: Bachelor of Engineering (computer background preferred)

Posted 1 month ago

Apply

2.0 - 5.0 years

3 - 7 Lacs

Pune

Work from Office


Capgemini Invent
Capgemini Invent is the digital innovation, consulting, and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science, and creative design to help CxOs envision and build what's next for their businesses.

Your Role
- Use design thinking and a consultative approach to conceive cutting-edge technology solutions for business problems, mining core insights as a service model.
- Engage with project activities across the information lifecycle: building and managing business data lakes and ingesting data streams to prepare data; developing machine learning and predictive models to analyse data; visualizing data; empowering information consumers with agile data models that enable self-service BI; specializing in business models and architectures across various industry verticals.
- Participate in business requirements / functional specification definition, scope management, and data analysis and design, in collaboration with both business stakeholders and IT teams.
- Document detailed business requirements; develop solution designs and specifications.
- Support and coordinate system implementations through the project lifecycle, working with other teams on a local and global basis.
- Work closely with the solutions architecture team to define the detailed target solution that delivers the business requirements.

Your Profile
- B.E. / B.Tech. + MBA (Systems / Data / Data Science / Analytics / Finance) with a good academic background.
- Strong communication, facilitation, relationship-building, presentation, and negotiation skills.
- A flair for storytelling and the ability to present interesting insights from the data.
- Good soft skills: communication, proactivity, self-learning, etc.
- Flexible to the dynamically changing needs of the industry.
- Good exposure to database management systems; knowledge of the big data ecosystem (e.g., Hadoop) is good to have.
- Hands-on with SQL and good knowledge of NoSQL databases.
- Working knowledge of R/Python is good to have.
- Exposure to or knowledge of one of the cloud ecosystems: Google / AWS / Azure.

What you will love about working here
We recognize the significance of flexible work arrangements. Be it remote work or flexible work hours, you will get an environment that lets you maintain a healthy work-life balance. At the heart of our mission is your career growth; our array of career growth programs and diverse professions is crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies, such as Generative AI.

About Capgemini
Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions, leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, cloud, and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.

Posted 1 month ago

Apply

8.0 - 11.0 years

35 - 37 Lacs

Kolkata, Ahmedabad, Bengaluru

Work from Office


Dear Candidate,

We are hiring a Cloud Architect to design and oversee scalable, secure, and cost-efficient cloud solutions. Great for architects who bridge technical vision with business needs.

Key Responsibilities:
- Design cloud-native solutions using AWS, Azure, or GCP.
- Lead cloud migration and transformation projects.
- Define cloud governance, cost control, and security strategies.
- Collaborate with DevOps and engineering teams on implementation.

Required Skills & Qualifications:
- Deep expertise in cloud architecture and multi-cloud environments.
- Experience with containers, serverless, and microservices.
- Proficiency in Terraform, CloudFormation, or equivalent.
- Bonus: cloud certification (AWS/Azure/GCP Architect).

Soft Skills:
- Strong troubleshooting and problem-solving skills.
- Ability to work independently and in a team.
- Excellent communication and documentation skills.

Note: If interested, please share your updated resume and a preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa
Delivery Manager, Integra Technologies

Posted 1 month ago

Apply

2.0 - 7.0 years

20 - 30 Lacs

Pune

Work from Office


Work mode: Currently remote, but this is not permanent WFH; once the business asks the candidate to come to the office, they must relocate.

Mandatory skills: DE (Data Engineering), Azure, Synapse, SQL, Python, PySpark, ETL, Fabric

Required candidate profile:
- Experience in Python for scripting or data tasks.
- Hands-on experience with SQL and relational databases (SQL Server, PostgreSQL).
- Knowledge of data warehousing concepts (ETL).
- Hands-on experience with Azure data integration tools such as Data Factory, Synapse, Data Lake, and Blob Storage.

Posted 1 month ago

Apply

15.0 - 24.0 years

45 - 70 Lacs

Bhubaneswar, Hyderabad

Work from Office


Job Description: Head of Data and Digital, Delivery (Bourntec Solutions)
Title: Head of Data and Digital, Delivery
Level: Director
Location: Hyderabad / Bhubaneshwar

Role Overview:
The Head of Data and Digital, Delivery is a critical leadership role responsible for the successful delivery of Bourntec Solutions' data and digital projects and services to our clients. This Director-level position will lead and manage a team of delivery managers, data scientists, digital specialists, and engineers, ensuring high-quality, on-time, and within-budget project execution. The ideal candidate will possess a strong technical background in data and digital technologies, coupled with exceptional leadership, delivery management, and client management skills. This role requires a strategic thinker with a proven ability to build and scale delivery teams, optimize processes, and drive client satisfaction.

Key Responsibilities:

Delivery Leadership & Management:
- Provide strategic leadership and direction to the Data and Digital Delivery teams.
- Oversee the end-to-end delivery lifecycle of data analytics, business intelligence, AI/ML, cloud, and other digital transformation projects.
- Manage and mentor a team of delivery managers, data scientists, digital specialists, and engineers, fostering a high-performance and collaborative environment.
- Ensure projects are delivered with high quality, on time, within budget, and according to client specifications.
- Proactively identify and mitigate project risks and issues, escalating appropriately and implementing effective solutions.
- Establish and maintain strong relationships with client stakeholders, acting as a trusted advisor and point of escalation for delivery-related matters.

Operational Excellence & Process Improvement:
- Define and implement best practices, methodologies, and standards for data and digital project delivery.
- Continuously evaluate and optimize delivery processes to improve efficiency, quality, and predictability.
- Establish and track key delivery metrics and KPIs, providing regular reports to senior management.
- Implement and enforce project governance frameworks and compliance standards.
- Drive the adoption of relevant tools and technologies to enhance delivery capabilities.

Team Building & Talent Development:
- Recruit, onboard, and develop top talent within the Data and Digital Delivery teams.
- Foster a culture of continuous learning and professional development within the team.
- Conduct performance reviews, provide feedback, and identify opportunities for growth.
- Build a scalable and agile delivery organization to support the company's growth objectives.

Client Relationship Management:
- Serve as a key point of contact for executive-level client stakeholders regarding project delivery.
- Understand client business needs and ensure delivery aligns with their strategic goals.
- Proactively manage client expectations and ensure high levels of client satisfaction.
- Identify opportunities for expanding services and solutions within existing client engagements.

Strategic Contribution:
- Contribute to the overall strategic planning and direction of Bourntec Solutions' data and digital offerings.
- Stay abreast of the latest trends and advancements in data and digital technologies.
- Collaborate with Sales and Pre-sales teams to develop compelling proposals and solutions for clients.
- Contribute to the development of new service offerings and intellectual property.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, Data Science, or a related field.
- 15+ years of progressive experience in data and digital technology delivery, with at least 5 years in a leadership role managing delivery teams.
- Proven track record of successfully delivering complex data analytics, business intelligence, AI/ML, cloud, or digital transformation projects for enterprise clients.
- Strong technical understanding of data warehousing, data lakes, ETL/ELT processes, data modeling, AI/ML algorithms, cloud platforms (AWS, Azure, GCP), and modern digital technologies.
- Excellent leadership, communication (written and verbal), interpersonal, and presentation skills.
- Strong client management and stakeholder management skills, with the ability to build and maintain trusted relationships.
- Proven ability to build, mentor, and motivate high-performing delivery teams.
- Experience establishing and implementing delivery methodologies, standards, and processes.
- Strong analytical and problem-solving skills with a data-driven approach.
- Experience with project management tools and software.
- Ability to thrive in a fast-paced and dynamic environment.

Preferred Qualifications:
- Relevant certifications in project management (e.g., PMP, Agile certifications) or cloud platforms.
- Experience working in a services-based organization.
- Familiarity with industry-specific data and digital solutions.

Bourntec Solutions is an equal opportunity employer and values diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.

Posted 1 month ago

Apply

8.0 - 12.0 years

10 - 14 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office


We are seeking an experienced Data Architect to join our team, working with our UAE-based client. The ideal candidate will have 8-12 years of hands-on experience in data architecture, with at least 4 years in architectural design and integration. Strong expertise in MS Dynamics and data lake architecture is required, along with proficiency in ETL, data modeling, data integration, and data quality assurance. The candidate should have a strong problem-solving mindset, the ability to handle architectural issues, and experience in troubleshooting. They should also be a proactive contributor and a team player with a flexible attitude. The role requires immediate availability and the ability to work UAE hours. Location: Chennai, Hyderabad, Kolkata, Pune, Ahmedabad, Remote

Posted 1 month ago

Apply

4.0 - 5.0 years

10 - 14 Lacs

Bengaluru

Work from Office


About the Role
We are seeking a highly skilled and experienced Senior Data Scientist to join our data science team. As a Senior Data Scientist, you will play a critical role in driving data-driven decision making across the organization by developing and implementing advanced analytical solutions. You will leverage your expertise in data science, machine learning, and statistical analysis to uncover insights, build predictive models, and solve complex business challenges.

Key Responsibilities
- Develop and implement statistical and machine learning models (e.g., regression, classification, clustering, time series analysis) to address business problems.
- Analyze large and complex datasets to identify trends, patterns, and anomalies.
- Develop predictive models for forecasting, churn prediction, customer segmentation, and other business outcomes.
- Conduct A/B testing and other experiments to optimize business decisions.
- Communicate data insights effectively through visualizations, dashboards, and presentations.
- Develop and maintain interactive data dashboards and reports.
- Present findings and recommendations to stakeholders in a clear and concise manner.
- Work with data engineers to design and implement data pipelines and data warehousing solutions.
- Ensure data quality and integrity throughout the data lifecycle.
- Develop and maintain data pipelines for data ingestion, transformation, and loading.
- Stay up to date with the latest advancements in data science, machine learning, and artificial intelligence.
- Research and evaluate new technologies and tools to improve data analysis and modeling capabilities.
- Explore and implement new data science techniques and methodologies.
- Collaborate effectively with data engineers, business analysts, product managers, and other stakeholders.
- Communicate technical information clearly and concisely to both technical and non-technical audiences.

Qualifications (Essential)
- 4+ years of experience as a Data Scientist or in a related data science role.
- Strong proficiency in statistical analysis, machine learning algorithms, and data mining techniques.
- Experience with programming languages like Python (with libraries like scikit-learn, pandas, NumPy) or R.
- Experience with data visualization tools (e.g., Tableau, Power BI).
- Experience with data warehousing and data lake technologies.
- Excellent analytical, problem-solving, and communication skills.
- Master's degree in Statistics, Mathematics, Computer Science, or a related field.

Posted 1 month ago

Apply

8.0 - 12.0 years

7 - 11 Lacs

Bengaluru

Work from Office


Job Position: Python Lead
Total Experience Required: 6+ years (around 5 years relevant)
Mandatory skills: strong Python coding and development
Good-to-have skills: cloud, SQL, data analysis
Location: Pune (Kharadi), work from office 3 days/week

About the Role:
We are seeking a highly skilled and experienced Python Lead to join our team. The ideal candidate will have strong expertise in Python coding and development, along with good-to-have skills in cloud technologies, SQL, and data analysis.

Key Responsibilities:
- Lead the development of high-quality, scalable, and robust Python applications.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Ensure the performance, quality, and responsiveness of applications.
- Develop RESTful applications using frameworks like Flask, Django, or FastAPI.
- Utilize Databricks, PySpark SQL, and strong data analysis skills to drive data solutions.
- Implement and manage modern data solutions using Azure Data Factory, Data Lake, and Databricks.

Mandatory Skills:
- Proven experience with cloud platforms (e.g., AWS).
- Strong proficiency in Python, PySpark, and R, with familiarity with additional programming languages such as C++, Rust, or Java.
- Expertise in designing ETL architectures for batch and streaming processes, database technologies (OLTP/OLAP), and SQL.
- Experience with Apache Spark and multi-cloud platforms (AWS, GCP, Azure).
- Knowledge of data governance and GxP data contexts; familiarity with the Pharma value chain is a plus.

Good-to-Have Skills:
- Experience with modern data solutions on Azure.
- Knowledge of the principles summarized in the Microsoft Cloud Adoption Framework.
- Additional expertise in SQL and data analysis.

Educational Qualifications: Bachelor's/Master's degree or equivalent with a focus on software engineering.

If you are a passionate Python developer with a knack for cloud technologies and data analysis, we would love to hear from you. Join us in driving innovation and building cutting-edge solutions!

Posted 1 month ago

Apply

4.0 - 7.0 years

3 - 7 Lacs

Bengaluru

Work from Office


Role: Python Developer
Location: Bangalore
Experience: 4-7 years
Employment Type: Full Time, regular working mode
Notice Period: Immediate to 15 days

About the Role:
We are seeking a skilled Python Developer to join our dynamic team and contribute to the development of innovative data-driven solutions. The ideal candidate will have a strong foundation in Python programming, data analysis, and data processing techniques. This role will involve working with various data sources, including Redis, MongoDB, SQL, and Linux, to extract, transform, and analyze data for valuable insights. You will also be responsible for developing and maintaining efficient and scalable data pipelines and visualizations using tools like matplotlib and seaborn. Additionally, experience with web development frameworks such as Flask, FastAPI, or Django, as well as microservices architecture, will be a significant advantage.

Key Responsibilities:
- Design, develop, and maintain efficient and scalable data pipelines to extract, transform, and load (ETL) data from various sources, including Redis, MongoDB, SQL, and Linux.
- Conduct in-depth data analysis and processing using Python libraries and tools to uncover valuable insights and trends.
- Develop and maintain data visualizations using matplotlib, seaborn, or other relevant tools to effectively communicate findings to stakeholders.
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
- Develop and maintain web applications using Python frameworks like Flask, FastAPI, or Django, adhering to best practices and coding standards.
- Design and implement microservices architecture to build scalable and modular systems.
- Troubleshoot and resolve technical issues related to data pipelines, applications, and infrastructure.
- Stay updated with the latest trends and technologies in the data engineering and Python development landscape.

Required Skills and Qualifications:
- Strong proficiency in Python programming, including object-oriented and functional programming concepts.
- Experience with data analysis and processing libraries such as pandas, NumPy, and scikit-learn.
- Familiarity with data storage and retrieval technologies, including Redis, MongoDB, SQL, and Linux.
- Knowledge of data visualization tools like matplotlib and seaborn.
- Experience with web development frameworks such as Flask, FastAPI, or Django.
- Understanding of microservices architecture and principles.
- Excellent problem-solving and analytical skills.
- Ability to work independently and as part of a team.
- Strong communication and interpersonal skills.

Preferred Skills and Qualifications:
- Experience with cloud platforms (AWS, GCP, Azure).
- Knowledge of containerization technologies (Docker, Kubernetes).
- Familiarity with data warehousing and data lake concepts.
- Experience with machine learning and deep learning frameworks (TensorFlow, PyTorch).

Posted 1 month ago

Apply

6.0 - 10.0 years

6 - 10 Lacs

Bengaluru

Work from Office


Job Role: Big Data Engineer
Work Location: Bangalore (CV Raman Nagar)
Experience: 7+ years
Notice Period: Immediate to 30 days
Mandatory Skills: Big Data, Python, SQL, Spark/PySpark, AWS Cloud

JD and Required Skills & Responsibilities:
- Actively participate in all phases of the software development lifecycle, including requirements gathering, functional and technical design, development, testing, roll-out, and support.
- Solve complex business problems by utilizing a disciplined development methodology.
- Produce scalable, flexible, efficient, and supportable solutions using appropriate technologies.
- Analyse the source and target system data; map the transformations that meet the requirements.
- Interact with the client and onsite coordinators during different phases of a project.
- Design and implement product features in collaboration with business and technology stakeholders.
- Anticipate, identify, and solve issues concerning data management to improve data quality.
- Clean, prepare, and optimize data at scale for ingestion and consumption.
- Support the implementation of new data management projects and restructure the current data architecture.
- Implement automated workflows and routines using workflow scheduling tools.
- Understand and use continuous integration, test-driven development, and production deployment frameworks.
- Participate in design, code, and test-plan reviews and dataset implementations performed by other data engineers, in support of maintaining data engineering standards.
- Analyze and profile data for the purpose of designing scalable solutions.
- Troubleshoot straightforward data issues and perform root cause analysis to proactively resolve product issues.

Required Skills:
- 5+ years of relevant experience developing data and analytics solutions.
- Experience building data lake solutions leveraging one or more of AWS EMR, S3, Hive, and PySpark.
- Experience with relational SQL.
- Experience with scripting languages such as Python.
- Experience with source control tools such as GitHub and the related development process.
- Experience with workflow scheduling tools such as Airflow.
- In-depth knowledge of AWS Cloud (S3, EMR, Databricks).
- A passion for data solutions.
- A strong problem-solving and analytical mindset.
- Working experience in the design, development, and testing of data pipelines.
- Experience working with Agile teams.
- Able to influence and communicate effectively, both verbally and in writing, with team members and business stakeholders.
- Able to quickly pick up new programming languages, technologies, and frameworks.
- Bachelor's degree in Computer Science.

Posted 1 month ago

Apply

5.0 - 6.0 years

7 - 12 Lacs

Hyderabad

Work from Office


About the Role
We are seeking a highly skilled and experienced Senior Azure Databricks Engineer to join our dynamic data engineering team. As a Senior Azure Databricks Engineer, you will play a critical role in designing, developing, and implementing data solutions on the Azure Databricks platform. You will be responsible for building and maintaining high-performance data pipelines, transforming raw data into valuable insights, and ensuring data quality and reliability.

Key Responsibilities
- Design, develop, and implement data pipelines and ETL/ELT processes using Azure Databricks.
- Develop and optimize Spark applications using Scala or Python for data ingestion, transformation, and analysis.
- Leverage Delta Lake for data versioning, ACID transactions, and data sharing.
- Utilize Delta Live Tables for building robust and reliable data pipelines.
- Design and implement data models for data warehousing and data lakes.
- Optimize data structures and schemas for performance and query efficiency.
- Ensure data quality and integrity throughout the data lifecycle.
- Integrate Azure Databricks with other Azure services (e.g., Azure Data Factory, Azure Synapse Analytics, Azure Blob Storage).
- Leverage cloud-based data services to enhance data processing and analysis capabilities.

Performance Optimization & Troubleshooting
- Monitor and analyze data pipeline performance.
- Identify and troubleshoot performance bottlenecks.
- Optimize data processing jobs for speed and efficiency.
- Collaborate effectively with data engineers, data scientists, data analysts, and other stakeholders.
- Communicate technical information clearly and concisely.
- Participate in code reviews and contribute to the improvement of development processes.

Qualifications (Essential)
- 5+ years of experience in data engineering, with at least 2 years of hands-on experience with Azure Databricks.
- Strong proficiency in Python and SQL.
- Expertise in Apache Spark and its core concepts (RDDs, DataFrames, Datasets).
- In-depth knowledge of Delta Lake and its features (e.g., ACID transactions, time travel).
- Experience with data warehousing concepts and ETL/ELT processes.
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
- Bachelor's degree in Computer Science, Computer Engineering, or a related field.
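As an illustration of the ETL/ELT pipeline work this role describes, here is a minimal sketch in plain Python. The real role would use PySpark on Databricks; the record fields (`order_id`, `amount`) and the three stage functions are hypothetical, chosen only to show the extract → transform (with a quality check) → load shape.

```python
# Minimal ETL sketch: extract raw rows, transform (clean + validate), load.
# All field names are illustrative, not from any real schema.

def extract():
    # In practice this would read from Blob Storage / ADLS via Spark.
    return [
        {"order_id": "A1", "amount": "100.5"},
        {"order_id": "A2", "amount": "bad"},   # fails the quality check
        {"order_id": "A3", "amount": "42"},
    ]

def transform(rows):
    # Cast amounts to float, dropping rows that fail validation.
    clean = []
    for row in rows:
        try:
            clean.append({"order_id": row["order_id"],
                          "amount": float(row["amount"])})
        except ValueError:
            pass  # a production pipeline would quarantine these rows
    return clean

def load(rows):
    # Stand-in for writing to SQL Server or a data lake table.
    return {r["order_id"]: r["amount"] for r in rows}

table = load(transform(extract()))
print(table)  # {'A1': 100.5, 'A3': 42.0}
```

The same three-stage shape carries over to Spark: `extract` becomes a read from storage, `transform` a DataFrame expression with quality filters, and `load` a write to a managed table.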

Posted 1 month ago

Apply

7.0 - 8.0 years

17 - 22 Lacs

Mumbai

Work from Office

Naukri logo

About the Role
We are seeking a highly skilled and experienced Senior Data Architect to join our growing data engineering team. As a Senior Data Architect, you will play a critical role in designing, developing, and implementing robust and scalable data solutions to support our business needs. You will be responsible for defining data architectures, ensuring data quality and integrity, and driving data-driven decision making across the organization.

Key Responsibilities
- Design and implement data architectures for various data initiatives, including data warehouses, data lakes, and data marts.
- Define data models, data schemas, and data flows for complex data integration projects.
- Develop and maintain data dictionaries and metadata repositories.
- Ensure data quality and consistency across all data sources.
- Design and implement data warehousing solutions, including ETL/ELT processes, data transformations, and data aggregations.
- Support the development and implementation of business intelligence and reporting solutions.
- Optimize data warehouse performance and scalability.
- Define and implement data governance policies and procedures.
- Ensure data security and compliance with relevant regulations (e.g., GDPR, CCPA).
- Develop and implement data access controls and data masking strategies.
- Design and implement data solutions on cloud platforms (AWS, Azure, GCP), leveraging cloud-native data services.
- Implement data pipelines and data lakes on cloud platforms.
- Collaborate effectively with data engineers, data scientists, business analysts, and other stakeholders.
- Communicate complex technical information clearly and concisely to both technical and non-technical audiences.
- Present data architecture designs and solutions to stakeholders.

Qualifications (Essential)
- 7+ years of experience in data architecture, data modeling, and data warehousing.
- Strong understanding of data warehousing concepts, including dimensional modeling, ETL/ELT processes, and data quality.
- Experience with relational databases (e.g., SQL Server, Oracle, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra).
- Experience with data integration tools and technologies.
- Excellent analytical and problem-solving skills.
- Strong communication and interpersonal skills.
- Bachelor's degree in Computer Science, Computer Engineering, or a related field.
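Dimensional modeling, named in the qualifications above, can be illustrated with a toy star schema in plain Python: a fact table of events referencing a dimension table by surrogate key, with an analytic query as a join plus group-by. All table and field names here are hypothetical.

```python
# Dimension table: one row per product, keyed by a surrogate key.
dim_product = {
    1: {"name": "Widget", "category": "Hardware"},
    2: {"name": "Manual", "category": "Docs"},
}

# Fact table: one row per sale, referencing the dimension by key.
fact_sales = [
    {"product_key": 1, "revenue": 30.0},
    {"product_key": 2, "revenue": 5.0},
    {"product_key": 1, "revenue": 20.0},
]

# A typical warehouse query: revenue by category (join + group by).
revenue_by_category = {}
for fact in fact_sales:
    category = dim_product[fact["product_key"]]["category"]
    revenue_by_category[category] = (
        revenue_by_category.get(category, 0.0) + fact["revenue"]
    )

print(revenue_by_category)  # {'Hardware': 50.0, 'Docs': 5.0}
```

In SQL terms this is `SELECT d.category, SUM(f.revenue) FROM fact_sales f JOIN dim_product d ON f.product_key = d.key GROUP BY d.category`; keeping descriptive attributes in the dimension and measures in the fact table is what makes such queries cheap at scale.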

Posted 1 month ago

Apply

5.0 - 9.0 years

7 - 11 Lacs

Hyderabad

Work from Office

Naukri logo

We are Reckitt, home to the world's best loved and trusted hygiene, health, and nutrition brands. Our purpose defines why we exist: to protect, heal and nurture in the relentless pursuit of a cleaner, healthier world. We are a global team united by this purpose. Join us in our fight to make access to the highest quality hygiene, wellness, and nourishment a right and not a privilege.

Information Technology & Digital
In IT&D, you'll be a force for good, whether you're championing cyber security, defining how we harness the power of technology to improve our business, or working with data to guide the innovation of consumer-loved products. Working globally across functions, you'll own your projects and processes from start to finish, with the influence and visibility to achieve what needs to be done. And if you're willing to bring your ideas to the table, you'll get the support and investment to make them happen. Your potential will never be wasted. You'll get the space and support to take your development to the next level. Every day, there will be opportunities to learn from peers and leaders through working on exciting, varied projects with real impact. And because our work spans so many different businesses, from Research and Product Development to Sales, you'll keep learning exciting new approaches.

About the Role
The IT&D Solution Architect is accountable for proactively and holistically driving solution design activities within the product teams, ensuring alignment with the overall Enterprise Architecture and product stream strategy. The Solution Architect provides the necessary technical leadership, analysis and design tasks related to the development of a product or a set of products within a product group. S/he partners closely with DevOps/development teams to ensure that the value planned is delivered in the most optimal way according to the product strategy and the overall value-for-money objectives linked to the IT strategy. The Solution Architect is also accountable for supervising the design and the integration execution within the scope of their product(s), collaborating with the integration platform team, and owns the solution architecture specification(s).

Your Responsibilities
- Use problem-solving skills and technical expertise to design, architect, develop and document high-end technology solutions that solve the most complex and challenging problems across different projects on Azure, on-premises, or SaaS.
- Provide technical leadership throughout the design and deployment life cycle and focus on delivery with quality.
- Lead technical discussions and build consensus among all stakeholders, including vendors.
- Provide technical architecture consultancy and design for projects.
- Coordinate with the product owners to identify future needs and requirements.
- Understand security and compliance requirements, and deliver solutions as per the defined business standards, principles and patterns.
- Design end-to-end application (single- or multi-tier) architecture and data security for IaaS, PaaS and SaaS.
- Design and document applications hosted on-premises, on cloud providers, or on other platforms as required.
- Build, migrate and test Azure / on-premises environments and integrations of services with IaaS, PaaS and SaaS services with a DevOps mindset.
- Work with the leadership to develop a strategic plan to keep infrastructure services contemporary, cost competitive and meeting the needs of the business.

The Experience We're Looking For
- 12+ years of experience in enterprise applications, integration and solution design.
- Ability to translate future-state business capabilities and requirements into solution architecture requirements.
- Proven track record of delivering new products and services in a fast-paced, dynamic environment, especially in CPG or healthcare organizations.
- Deep technical experience in infrastructure design, including on-premises, private and public cloud, networking, virtualization, storage, and Integration Services (MuleSoft, API Management).
- Proficiency in Azure SQL Database, Azure Synapse Analytics, Azure Cosmos DB, and other Azure data services.
- Experience with Extract, Transform, Load (ETL) processes using tools like Azure Data Factory.
- Experience in designing and deploying single- or multi-tier applications in compliance-focused industries.
- Strong skills in designing data models, data warehouses, and data lakes.
- Understanding of data security, privacy, and compliance requirements, including GDPR and HIPAA.
- Experience in modernization of workloads in on-premises data centers and cloud platforms like Azure.
- Skills in optimizing database performance and managing data storage costs.
- Deep understanding of the latest LLM models, machine learning, deep learning, natural language processing (NLP), and computer vision, and the ability to advise on selection of the best models and tools for the problem area.
- Previous experience with or knowledge of:
  - the Azure/Microsoft technology stack at enterprise level
  - Azure/cloud network services
  - security and identity services such as encryption, Active Directory, RBAC, NSGs/ASGs, and firewall policies
  - Azure networking services (e.g., VNETs, Load Balancers, Front Door, ExpressRoute, Traffic Manager, Content Delivery Network), firewalls, and web app proxies
  - Bash and PowerShell scripting
  - Azure Monitor and Application Insights
  - Azure-based cloud automation and Infrastructure as Code (e.g., Terraform, Cloud Shell, PowerShell, ARM, webhooks and runbooks)
  - disaster recovery: backups, disaster recovery policies, and standards applicable to cloud/on-premises enterprise systems
- Ability to quickly comprehend the functions and capabilities of new technologies.
- Good understanding of strategic, new and emerging technology trends, and the practical application of existing, new and emerging technologies to new and evolving business and operating models.
- Ability to propose and estimate the financial impact of solution architecture alternatives.
- Ability to apply multiple technical solutions to business challenges.

Qualifications Required
- Bachelor's degree (B.E. / B.Tech)
- Azure Solutions Architect Expert (mandatory)
- Azure DevOps Engineer Expert (preferable)
- TOGAF architecture certification (good to have)
- Any other Microsoft certifications related to Security, Data, etc.

The Skills for Success
Product Development, System Development, Project Management, Programme Management, Design Thinking, Process Automation, IT Service Management, Innovation Processes, Innovation, User Experience Design, Change Analysis, Change Management, Digital Transformation, Value Analysis, Adoption, Technology Adoption Lifecycle, Stakeholder Relationship Management, Vendor Management, Outstanding Communication, Stakeholder Engagement, Digital Strategy, Product Solution Architecture, Cyber Security Strategy, Cyber Security, Data Privacy, Portfolio Management, Data Governance, Product Compliance, Media Analytics, Advertising, Consumer Engagement, Market Value, Market Chain, Data-Driven Practices, Advanced Analytics, Data Analytics, Governance.

What We Offer
With inclusion at the heart of everything we do, working alongside our four global Employee Resource Groups, we support our people at every step of their career journey, helping them to succeed in their own individual way. We invest in the wellbeing of our people through parental benefits, an Employee Assistance Program to promote mental health, and life insurance for all employees globally. We have a range of other benefits in line with the local market. Through our global share plans we offer the opportunity to save and share in Reckitt's potential future successes. For eligible roles, we also offer short-term incentives to recognise, appreciate and reward your work for delivering outstanding results. You will be rewarded in line with Reckitt's pay-for-performance philosophy.

Equality
We recognise that in real life, great people don't always 'tick all the boxes'. That's why we hire for potential as well as experience. Even if you don't meet every point on the job description, if this role and our company feel like a good fit for you, we still want to hear from you. All qualified applicants will receive consideration for employment without regard to age, disability or medical condition; colour, ethnicity, race, citizenship, and national origin; religion, faith; pregnancy, family status and caring responsibilities; sexual orientation; sex, gender identity, gender expression, and transgender identity; protected veteran status; size; or any other basis protected by applicable law.

Posted 1 month ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Nashik

Work from Office

Naukri logo

Dreaming big is in our DNA. It's who we are as a company. It's our culture. It's our heritage. And more than ever, it's our future. A future where we're always looking forward. Always serving up new ways to meet life's moments. A future where we keep dreaming bigger. We look for people with passion, talent, and curiosity, and provide them with the teammates, resources and opportunities to unleash their full potential. The power we create together when we combine your strengths with ours is unstoppable. Are you ready to join a team that dreams as big as you do? AB InBev GCC was incorporated in 2014 as a strategic partner for Anheuser-Busch InBev. The center leverages the power of data and analytics to drive growth for critical business functions such as operations, finance, people, and technology. The teams are transforming Operations through Tech and Analytics. Do you dream big? We need you.

Job Description
Job Title: Azure Data Engineer
Location: Bengaluru
Reporting to: Senior Manager, Data Engineering

Purpose of the Role
We are seeking a skilled and motivated Azure Data Engineer to join our dynamic team. The ideal candidate will have hands-on experience with Microsoft Azure cloud services and data engineering, and a strong background in designing and implementing scalable data solutions.

Key Tasks & Accountabilities
- Design, develop, and maintain scalable data pipelines and workflows using Azure Data Factory, Azure Databricks, and other relevant tools.
- Implement and optimize data storage solutions in Azure, including Azure PostgreSQL Database and Azure Blob Storage.
- Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and implement solutions that align with business objectives.
- Ensure data quality, integrity, and security in all data-related processes and implementations.
- Work with both structured and unstructured data and implement data transformation and cleansing processes.
- Optimize and fine-tune the performance of data solutions to meet both real-time and batch processing requirements.
- Troubleshoot and resolve issues related to data pipelines, ensuring minimal downtime and optimal performance.
- Stay current with industry trends and best practices, and proactively recommend improvements to existing data infrastructure.

Qualifications, Experience, Skills
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Data Engineer with a focus on Microsoft Azure technologies.
- Hands-on experience with Azure services such as Azure Data Factory, Azure Databricks, Azure SQL Database, Azure Data Lake Storage, and Azure Synapse Analytics.
- Strong proficiency in SQL and experience with data modeling and ETL processes.
- Familiarity with data integration and orchestration tools.
- Knowledge of data warehousing concepts and best practices.
- Experience with version control systems, preferably Git.
- Excellent problem-solving and communication skills.

Level of Educational Attainment Required: B.Tech
Previous Work Experience: 7+ years of experience

Technical Expertise
- Proven experience in Azure Databricks and ADLS architecture and implementation.
- Strong knowledge of medallion architecture and data lake design.
- Expertise in SQL, Python, and Spark for building and optimizing data pipelines.
- Familiarity with data integration tools and techniques, including Azure-native solutions.

And above all of this, an undying love for beer! We dream big to create a future with more cheers.

Posted 1 month ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Bengaluru

Work from Office

Naukri logo

AB InBev GCC was incorporated in 2014 as a strategic partner for Anheuser-Busch InBev. The center leverages the power of data and analytics to drive growth for critical business functions such as operations, finance, people, and technology. The teams are transforming Operations through Tech and Analytics. Do you dream big? We need you.

Job Description
Job Title: Senior Data Engineer
Location: Bengaluru
Reporting to: Senior Manager, Data Engineering

Purpose of the Role
We are seeking a skilled and motivated Azure Data Engineer to join our dynamic team. The ideal candidate will have hands-on experience with Microsoft Azure cloud services and data engineering, and a strong background in designing and implementing scalable data solutions.

Key Tasks & Accountabilities
- Design, develop, and maintain scalable data pipelines and workflows using Azure Data Factory, Azure Databricks, and other relevant tools.
- Implement and optimize data storage solutions in Azure, including Azure PostgreSQL Database and Azure Blob Storage.
- Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and implement solutions that align with business objectives.
- Ensure data quality, integrity, and security in all data-related processes and implementations.
- Work with both structured and unstructured data and implement data transformation and cleansing processes.
- Optimize and fine-tune the performance of data solutions to meet both real-time and batch processing requirements.
- Troubleshoot and resolve issues related to data pipelines, ensuring minimal downtime and optimal performance.
- Stay current with industry trends and best practices, and proactively recommend improvements to existing data infrastructure.

Qualifications, Experience, Skills
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience of at least 7+ years as a Data Engineer with a focus on Microsoft Azure technologies.
- Hands-on experience with Azure services such as Azure Data Factory, Azure Databricks, Azure SQL Database, Azure Data Lake Storage, and Azure Synapse Analytics.
- Strong proficiency in SQL and experience with data modeling and ETL processes.
- Familiarity with data integration and orchestration tools.
- Knowledge of data warehousing concepts and best practices.
- Experience with version control systems, preferably Git.
- Excellent problem-solving and communication skills.

Technical Expertise
- Proven experience in Azure Databricks and ADLS architecture and implementation.
- Strong knowledge of medallion architecture and data lake design.
- Expertise in SQL, Python, and Spark for building and optimizing data pipelines.
- Familiarity with data integration tools and techniques, including Azure-native solutions.

And above all of this, an undying love for beer! We dream big to create a future with more cheers.

Posted 1 month ago

Apply

8.0 - 14.0 years

10 - 16 Lacs

Bengaluru

Work from Office

Naukri logo

Dreaming big is in our DNA. It's who we are as a company. It's our culture. It's our heritage. And more than ever, it's our future. A future where we're always looking forward. Always serving up new ways to meet life's moments. A future where we keep dreaming bigger. We look for people with passion, talent, and curiosity, and provide them with the teammates, resources and opportunities to unleash their full potential. The power we create together when we combine your strengths with ours is unstoppable. Are you ready to join a team that dreams as big as you do? AB InBev GCC was incorporated in 2014 as a strategic partner for Anheuser-Busch InBev. The center leverages the power of data and analytics to drive growth for critical business functions such as operations, finance, people, and technology. The teams are transforming Operations through Tech and Analytics. Do you dream big? We need you.

Job Description
Job Title: Data Architect
Location: Bengaluru
Reporting to: Senior Manager, Data Engineering

Purpose of the Role
We are seeking a skilled and motivated Azure Data Architect to join our dynamic team. The ideal candidate will have hands-on experience with Microsoft Azure cloud services and data engineering, and a strong background in designing and implementing scalable data solutions.

Key Tasks & Accountabilities
- Architecting Optimal Solutions: Design and implement robust data architectures using Azure Databricks and ADLS, aligned with the medallion architecture (bronze, silver, and gold layers), in a Data Mesh setup. Ensure scalability, performance, and cost efficiency in data extraction, ingestion, and transformation processes.
- Governance and Quality Standards: Define and review governance frameworks and quality standards for data pipelines. Collaborate with data engineers to ensure compliance with organizational and regulatory standards.
- Code Review and Optimization: Conduct thorough code reviews to ensure optimal performance and alignment with stakeholder requirements. Provide guidance on best practices for coding and data pipeline development.
- Data Modelling and Complex Transformations: Define data modeling requirements to support complex transformations as required by stakeholders. Collaborate with business users to understand data transformation needs and ensure that the models meet those needs effectively.
- Data Governance Review: Review and enhance data governance standards to ensure consistency, security, and compliance. Implement policies to monitor and maintain the quality and accessibility of data.
- Integration Design: Design and define integration solutions between systems, ensuring seamless data flow and communication. Leverage Azure-native tools and services for system integrations where applicable.
- Cost Estimation and Management: Provide cost estimates for proposed architectures and solutions. Optimize resource usage to minimize costs while maintaining performance and reliability.
- Proposing Data Consumption Solutions: Design optimal solutions for data consumption, catering to analytical, reporting, and operational needs. Ensure user-friendly access to data through appropriate tools and interfaces.

Qualifications, Experience, Skills
Level of Educational Attainment Required: Bachelor of Technology
Previous Work Experience: Minimum 9 years of experience

Preferred Experience
- Familiarity with Unity Catalog for data governance and sharing.
- Hands-on experience with data mesh architectures and distributed data platforms.
- Knowledge of CDC tools such as Aecorsoft for real-time data extraction.

Technical Expertise
- Proven experience in Azure Databricks and ADLS architecture and implementation.
- Strong knowledge of medallion architecture and data lake design.
- Expertise in SQL, Python, and Spark for building and optimizing data pipelines.
- DevOps and CI/CD: familiarity with data integration tools and techniques, including Azure-native solutions.

Key Requirement
- SAP ECC and S/4HANA integration experience, and experience in data modelling of the SAP ECC Finance modules (e.g., Accounts Receivable, Accounts Payable, General Ledger, Inventory, Tax, etc.).

Governance and Modeling
- Experience in defining data governance frameworks and standards.
- Strong understanding of data modeling techniques and principles.

Analytical Skills
- Ability to analyze business requirements and translate them into scalable technical solutions.
- Proficiency in estimating costs and managing budgets for large-scale data projects.

Soft Skills
- Excellent communication skills to collaborate with cross-functional teams and stakeholders.
- Strong problem-solving skills and a proactive approach to technical challenges.

And above all of this, an undying love for beer! We dream big to create a future with more cheers.
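The medallion (bronze/silver/gold) layering this role centers on can be sketched in plain Python. In a real Databricks setup each layer is a Delta table in ADLS; here the records, field names, and the dedup/aggregation rules are purely illustrative stand-ins for the layer semantics.

```python
# Bronze: raw ingested records, kept as-is (duplicates and nulls included).
bronze = [
    {"sku": "S1", "qty": 2, "region": "EU"},
    {"sku": "S1", "qty": 2, "region": "EU"},   # duplicate ingest
    {"sku": "S2", "qty": None, "region": "US"},  # bad record
    {"sku": "S2", "qty": 5, "region": "US"},
]

# Silver: cleaned and deduplicated -- the conformed layer.
seen = set()
silver = []
for rec in bronze:
    key = (rec["sku"], rec["qty"], rec["region"])
    if rec["qty"] is not None and key not in seen:
        seen.add(key)
        silver.append(rec)

# Gold: business-level aggregate, e.g. units shipped per region.
gold = {}
for rec in silver:
    gold[rec["region"]] = gold.get(rec["region"], 0) + rec["qty"]

print(gold)  # {'EU': 2, 'US': 5}
```

The point of the layering is that bronze preserves an auditable raw copy, silver applies quality rules once for all consumers, and gold tables stay small and query-ready for reporting.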

Posted 1 month ago

Apply

8.0 - 14.0 years

10 - 16 Lacs

Nashik

Work from Office

Naukri logo

Dreaming big is in our DNA. It's who we are as a company. It's our culture. It's our heritage. And more than ever, it's our future. A future where we're always looking forward. Always serving up new ways to meet life's moments. A future where we keep dreaming bigger. We look for people with passion, talent, and curiosity, and provide them with the teammates, resources and opportunities to unleash their full potential. The power we create together when we combine your strengths with ours is unstoppable. Are you ready to join a team that dreams as big as you do? AB InBev GCC was incorporated in 2014 as a strategic partner for Anheuser-Busch InBev. The center leverages the power of data and analytics to drive growth for critical business functions such as operations, finance, people, and technology. The teams are transforming Operations through Tech and Analytics. Do you dream big? We need you.

Job Description
Job Title: Data Architect
Location: Bengaluru
Reporting to: Senior Manager, Data Engineering

Purpose of the Role
We are seeking a skilled and motivated Azure Data Architect to join our dynamic team. The ideal candidate will have hands-on experience with Microsoft Azure cloud services and data engineering, and a strong background in designing and implementing scalable data solutions.

Key Tasks & Accountabilities
- Architecting Optimal Solutions: Design and implement robust data architectures using Azure Databricks and ADLS, aligned with the medallion architecture (bronze, silver, and gold layers), in a Data Mesh setup. Ensure scalability, performance, and cost efficiency in data extraction, ingestion, and transformation processes.
- Governance and Quality Standards: Define and review governance frameworks and quality standards for data pipelines. Collaborate with data engineers to ensure compliance with organizational and regulatory standards.
- Code Review and Optimization: Conduct thorough code reviews to ensure optimal performance and alignment with stakeholder requirements. Provide guidance on best practices for coding and data pipeline development.
- Data Modelling and Complex Transformations: Define data modeling requirements to support complex transformations as required by stakeholders. Collaborate with business users to understand data transformation needs and ensure that the models meet those needs effectively.
- Data Governance Review: Review and enhance data governance standards to ensure consistency, security, and compliance. Implement policies to monitor and maintain the quality and accessibility of data.
- Integration Design: Design and define integration solutions between systems, ensuring seamless data flow and communication. Leverage Azure-native tools and services for system integrations where applicable.
- Cost Estimation and Management: Provide cost estimates for proposed architectures and solutions. Optimize resource usage to minimize costs while maintaining performance and reliability.
- Proposing Data Consumption Solutions: Design optimal solutions for data consumption, catering to analytical, reporting, and operational needs. Ensure user-friendly access to data through appropriate tools and interfaces.

Qualifications, Experience, Skills
Level of Educational Attainment Required: Bachelor of Technology
Previous Work Experience: Minimum 9 years of experience

Preferred Experience
- Familiarity with Unity Catalog for data governance and sharing.
- Hands-on experience with data mesh architectures and distributed data platforms.
- Knowledge of CDC tools such as Aecorsoft for real-time data extraction.

Technical Expertise
- Proven experience in Azure Databricks and ADLS architecture and implementation.
- Strong knowledge of medallion architecture and data lake design.
- Expertise in SQL, Python, and Spark for building and optimizing data pipelines.
- DevOps and CI/CD: familiarity with data integration tools and techniques, including Azure-native solutions.

Key Requirement
- SAP ECC and S/4HANA integration experience, and experience in data modelling of the SAP ECC Finance modules (e.g., Accounts Receivable, Accounts Payable, General Ledger, Inventory, Tax, etc.).

Governance and Modeling
- Experience in defining data governance frameworks and standards.
- Strong understanding of data modeling techniques and principles.

Analytical Skills
- Ability to analyze business requirements and translate them into scalable technical solutions.
- Proficiency in estimating costs and managing budgets for large-scale data projects.

Soft Skills
- Excellent communication skills to collaborate with cross-functional teams and stakeholders.
- Strong problem-solving skills and a proactive approach to technical challenges.

And above all of this, an undying love for beer! We dream big to create a future with more cheers.

Posted 1 month ago

Apply

1.0 - 5.0 years

3 - 7 Lacs

Gurugram

Work from Office

Naukri logo

Airbnb was born in 2007 when two hosts welcomed three guests to their San Francisco home, and has since grown to over 5 million hosts who have welcomed over 2 billion guest arrivals in almost every country across the globe. Every day, hosts offer unique stays and experiences that make it possible for guests to connect with communities in a more authentic way.

Description
This role should be based in Gurgaon, India. No relocation or visa support is provided.

The Community You Will Join
The Analytics Centre of Excellence (ACOE) at Airbnb, based in India, is a hub of knowledge and expertise that aims to provide data-driven decision-making, enabling Airbnb's business goals. The ACOE's vision is to build a world-class analytics organization that provides scalable analytics. We work with various business functions such as payments, trust, digital, customer support, hosting, sales, social, compliance, risk, platforms, and partnerships & economics. The ACOE's delivery framework is designed to provide relevant and contextual insights for data-driven decisions. This includes a one-stop solution for metrics, dashboards driving actionable insights, optimization of performance, experimentation, measuring pre/post feature impact, sizing the ROI of opportunities, prioritization of opportunities, anomaly-driven alerting mechanisms, root cause analysis of metric deviations, and exploratory hypothesis testing.

The Difference You Will Make
- You will be a part of the agent performance analytics team and will be responsible for evaluating the performance of customer service agents, identifying trends, and providing actionable insights to enhance agent productivity and service quality.
- Build business insights capabilities; be passionate about solving complex problems with data, adding a new perspective to existing solutions, and making business decisions based on careful and thoughtful analysis.
- Work independently with minimal supervision and act as a resource for colleagues with less experience.
- Work with technical and business teams to develop robust insights, create data narratives, and leverage Airbnb's rich data to define metrics.
- Have a strong sense of urgency for, and commitment to, Airbnb's mission of belonging.
- Develop a deep understanding of the principles of excellent customer service and how agent behavior impacts the customer experience. Align analytical efforts with overall business goals, understanding how agent performance contributes to these goals.

A Typical Day
- Analyze and report on agent performance metrics, including response times, resolution rates, and customer satisfaction scores.
- Develop and maintain dashboards and reports to monitor agent performance and identify areas for improvement.
- Conduct regular performance reviews and provide feedback to agents and management.
- Collaborate with training and development teams to create targeted training programs based on performance analysis.
- Monitor agent interactions and provide qualitative feedback to enhance service delivery.
- Identify trends in customer inquiries and feedback to inform service improvements.
- Assist in the development of performance-related incentive programs.
- Stay up-to-date with industry trends and best practices in customer service analytics.
- Ensure data quality and data integrity for analysis and reporting, working closely with global operations and India analytics functions.
- Build deep business context and become a trusted advisor to the business for tactical and strategic initiatives.
- Present and drive ideations, solutions, and progress updates to the business stakeholders.

Your Expertise
- 5+ years of industry experience and a degree (a Master's or PhD is a plus) in a quantitative field.
- Expert communication and collaboration skills, with the ability to work effectively with internal teams in a cross-cultural and cross-functional environment.
- Ability to conduct rigorous analysis and communicate conclusions to both technical and non-technical audiences.
- Proven track record of delivering valuable insights and influencing business impact through analytics.
- Beginner-level understanding of analytical frameworks such as cohort analysis, product funnel analysis, segmentation, factor analysis, sensitivity analysis, and statistical methodologies.
- Intermediate to advanced proficiency in SQL; Tableau/Superset/Power BI; and data warehouses/data lakes such as Presto, Hive, Teradata, Spark, etc.
- Advanced expertise in presentation tools like Keynote, Google Slides, and PowerPoint.
- Experience partnering with internal teams to drive action and providing expertise and direction on analytics, data science, experimental design, and measurement.
- Exceptional problem-solving abilities, self-motivation, and the ability to work autonomously, taking ownership of projects and driving them to completion.
- Strong organizational and time management skills, with the ability to manage multiple priorities and meet deadlines.
- Experience in the hospitality or travel industry.
- Knowledge of customer service software and CRM systems.

Hybrid Work Requirements & Expectations
To support productivity and maintain a professional hybrid work environment (two days working from the office), employees are expected to adhere to the following:
- Workspace: a dedicated, quiet, and private workspace free from interruptions and external noise.
- Internet Connectivity: during working hours, maintain a minimum and consistent internet speed of 10 Mbps on your official devices to ensure reliability for work-related tasks, including calls and virtual meetings.
- Professionalism: employees must remain fully engaged, respectful, and
maintain a professional presence during virtual meetings, with video participation required unless otherwise approved, Confidentiality & Security: Employees are responsible for protecting Airbnbs Intellectual Property and Confidential Information Work-related activities, including calls and meetings, must not be conducted in public places, while traveling, or in any setting that may compromise confidentiality or work quality, Our Commitment To Inclusion & Belonging Airbnb is committed to working with the broadest talent pool possible We believe diverse ideas foster innovation and engagement, and allow us to attract creatively-led people, and to develop the best products, services and solutions All qualified individuals are encouraged to apply,
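As an illustration of one of the analytical frameworks named in this posting, a cohort retention analysis can be sketched in plain Python. The event log, user IDs, and dates below are invented for illustration; in practice an analysis like this would usually run as SQL against a warehouse such as Presto or Hive.

```python
from collections import defaultdict
from datetime import date

# Hypothetical activity log: (user_id, activity_date).
# Cohort = calendar month of a user's first activity;
# retention = share of each cohort active again in each later month.
events = [
    ("u1", date(2024, 1, 5)), ("u2", date(2024, 1, 9)),
    ("u1", date(2024, 2, 3)), ("u3", date(2024, 2, 11)),
    ("u2", date(2024, 3, 1)), ("u3", date(2024, 3, 7)),
]

def month(d):
    return (d.year, d.month)

def cohort_retention(events):
    # First month each user was seen defines their cohort.
    first_seen = {}
    for user, d in sorted(events, key=lambda e: e[1]):
        first_seen.setdefault(user, month(d))
    # Which users were active in each month.
    active = defaultdict(set)
    for user, d in events:
        active[month(d)].add(user)
    # Retention rate per (cohort month, activity month) pair.
    retention = {}
    for m, users in sorted(active.items()):
        for cohort in set(first_seen.values()):
            cohort_users = {u for u, c in first_seen.items() if c == cohort}
            if cohort <= m and cohort_users:
                retention[(cohort, m)] = len(cohort_users & users) / len(cohort_users)
    return retention

r = cohort_retention(events)
# Jan cohort is {u1, u2}; only u1 returns in Feb, only u2 in Mar -> 0.5 each.
```

The same shape (a cohort month, an activity month, and a retained fraction) is what a cohort dashboard in Tableau or Superset would plot as a retention triangle.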

Posted 1 month ago


4.0 - 7.0 years

6 - 9 Lacs

Bengaluru

Work from Office


Reference 250002R6

Responsibilities
Primary skills: deep practical knowledge of C# and .NET Framework 4.5 and above; deep knowledge of Angular 8+ practices and commonly used modules based on extensive work experience; good practical knowledge of ASP.NET MVC; good practical knowledge of Web API; good knowledge of OAuth 2.0; good practical knowledge of Entity Framework; good hands-on experience with LINQ; good practical knowledge of SQL Server and query tuning.
Secondary skills: understanding of Elasticsearch; experience working with a data lake from the .NET Framework; basic knowledge of Azure DevOps; knowledge of ASP.NET Core; knowledge of Bootstrap; understanding of Agile processes and Scrum; knowledge of JavaScript and HTML5.

Why join us
We are committed to creating a diverse environment and are proud to be an equal opportunity employer. All qualified applicants receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status.

Business insight
At Societe Generale, we are convinced that people are drivers of change, and that the world of tomorrow will be shaped by all their initiatives, from the smallest to the most ambitious. Whether you're joining us for a period of months, years, or your entire career, together we can have a positive impact on the future. Creating, daring, innovating, and taking action are part of our DNA. If you too want to be directly involved, grow in a stimulating and caring environment, feel useful on a daily basis, and develop or strengthen your expertise, you will feel right at home with us!

Still hesitating? You should know that our employees can dedicate several days per year to solidarity actions during their working hours, including sponsoring people struggling with their orientation or professional integration, participating in the financial education of young apprentices, and sharing their skills with charities. There are many ways to get involved.

We are committed to supporting the acceleration of our Group's ESG strategy by implementing ESG principles in all our activities and policies. These are translated into our business activity (ESG assessment, reporting, project management, and IT activities), our work environment, and our responsible practices for environmental protection.

Diversity and Inclusion
We are an equal opportunities employer and we are proud to make diversity a strength for our company. Societe Generale is committed to recognizing and promoting all talents, regardless of their beliefs, age, disability, parental status, ethnic origin, nationality, gender identity, sexual orientation, membership of a political, religious, trade union or minority organisation, or any other characteristic that could be subject to discrimination.

Posted 1 month ago


8.0 - 12.0 years

9 - 14 Lacs

Pune

Work from Office


Design, develop, and maintain high-performance data pipelines using GCP services such as BigQuery, Dataflow, Pub/Sub, and Composer (Airflow). Design and develop Looker dashboards with appropriate security provisioning and drill-down capabilities. Ensure data security, lineage, quality, and compliance across GCP data ecosystems through IAM, audit logging, data encryption, and schema management. Monitor, troubleshoot, and optimize pipeline and warehouse performance using GCP-native tools such as Cloud Monitoring, Cloud Logging, and BigQuery Optimizer. Write SQL queries, dbt models, or Dataflow pipelines to transform raw data into analytics-ready datasets. Develop and optimize SQL queries and data transformation scripts for data warehousing and reporting purposes. Lead proof-of-concepts (POCs) and best-practice implementations for modern data architecture, including data lakes and cloud-native data warehouses. Ensure data quality, governance, and security best practices across all layers of the data stack. Write clean, maintainable, and efficient code following best practices.

Requirements

Data Engineering: 8–12 years of experience in data engineering, with at least 3–5 years of hands-on experience specifically in Google Cloud Platform (GCP) and BI tools like Looker. BigQuery (data modeling, optimization, security); advanced SQL proficiency with complex data transformation, windowing functions, and analytical querying. Ability to design and develop modular, maintainable SQL models using dbt best practices. Basic to intermediate knowledge of Python for scripting and automation. Exposure to ETL and batch scheduling/orchestration solutions. Strong understanding of data architecture patterns: data lakes, cloud-native data warehouses, event-driven architectures. Experience with version control systems like Git and branching strategies.

Looker: Hands-on experience in Looker with design, development, configuration/setup, dashboarding, and reporting techniques. Experience building and maintaining LookML models, Explores, PDTs, and semantic layers. Understanding of security provisioning and access controls, performance tuning of dashboards and reports over large datasets, and building drill-down capabilities. Proven ability to design scalable, user-friendly dashboards and self-service analytics environments. Expertise in optimizing Looker performance: materialized views, query tuning, aggregate tables. Strong command of row-level security, access filters, and permission sets in Looker to support enterprise-grade data governance.

General: Experience with Agile delivery methodologies (e.g. Scrum, Kanban). Demonstrable track record of dealing well with ambiguity, prioritizing needs, and delivering results in a dynamic environment. Conduct regular workshops, demos, and stakeholder reviews to showcase data solutions and capture feedback. Excellent communication and collaboration skills. Collaborate with development teams to streamline the software delivery process and improve system reliability. Mentor and upskill junior engineers and analysts on GCP tools, Looker modeling best practices, and advanced visualization techniques. Ability to translate business objectives into data solutions with a focus on delivering measurable business value. Flexible to work in shifts and provide on-call support, owning the smooth operation of applications and systems in a production environment.
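The SQL transformation work described in this posting leans heavily on window functions. A minimal sketch of the pattern, using Python's built-in sqlite3 module as a stand-in so it runs anywhere: the bookings table, columns, and values below are invented, but BigQuery accepts the same ROW_NUMBER() OVER (PARTITION BY ... ORDER BY ...) syntax.

```python
import sqlite3

# Rank bookings within each city by amount, then keep the top booking per
# city. The same window-function pattern is common in "analytics-ready"
# transformation layers (e.g. dbt models) on BigQuery; table and column
# names here are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE bookings (city TEXT, booking_id TEXT, amount REAL);
INSERT INTO bookings VALUES
  ('Pune', 'b1', 120.0), ('Pune', 'b2', 300.0),
  ('Mumbai', 'b3', 250.0), ('Mumbai', 'b4', 90.0);
""")
rows = conn.execute("""
SELECT city, booking_id, amount,
       ROW_NUMBER() OVER (PARTITION BY city ORDER BY amount DESC) AS rank_in_city
FROM bookings
ORDER BY city, rank_in_city
""").fetchall()
# Top booking per city = rows where rank_in_city == 1.
top = [r for r in rows if r[3] == 1]
```

Note that window functions require SQLite 3.25+, which ships with modern Python builds.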

Posted 1 month ago


4.0 - 8.0 years

10 - 20 Lacs

Gurugram

Remote


US shift, 5 working days, remote work (US airline group). Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. Strong focus on AWS and PySpark. Knowledge of AWS services, including but not limited to S3, Redshift, Athena, EMR, and Glue. Proficiency in PySpark and related Big Data technologies for ETL processing. Strong SQL skills for data manipulation and querying. Familiarity with data warehousing concepts and dimensional modeling. Experience with data governance, data quality, and data security practices. Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills to work effectively with cross-functional teams.
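The data quality practices this posting asks for usually come down to concrete checks enforced in the pipeline before data lands in the warehouse. A minimal plain-Python sketch of such checks follows; in a real AWS pipeline this logic would live in a PySpark or Glue job, and the record schema and field names here are invented.

```python
# Basic data-quality checks: duplicate primary keys and null values in
# required fields. Field names (flight_id, dep_delay_min) are hypothetical.
records = [
    {"flight_id": "AA100", "dep_delay_min": 5},
    {"flight_id": "AA101", "dep_delay_min": None},
    {"flight_id": "AA100", "dep_delay_min": 12},
]

def quality_report(records, key_field, required_fields):
    seen = set()
    duplicates = 0
    nulls = 0
    for rec in records:
        key = rec[key_field]
        if key in seen:
            duplicates += 1  # same key seen before -> duplicate row
        seen.add(key)
        nulls += sum(1 for f in required_fields if rec.get(f) is None)
    return {"rows": len(records), "duplicate_keys": duplicates, "null_values": nulls}

report = quality_report(records, "flight_id", ["flight_id", "dep_delay_min"])
# A pipeline would typically fail or quarantine the batch when these
# counters exceed agreed thresholds.
```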

Posted 1 month ago


3.0 - 7.0 years

5 - 10 Lacs

Hyderabad, Bengaluru, Delhi / NCR

Hybrid


Role & Responsibilities

Job Description: We are seeking a skilled and experienced Microsoft Fabric engineer to join our data engineering team. The ideal candidate will have a strong background in designing, developing, and maintaining data solutions using Microsoft Fabric, including experience across key workloads such as Data Engineering, Data Factory, Data Science, Real-Time Analytics, and Power BI. The role requires a deep understanding of Synapse Data Warehouse, OneLake, notebooks, Lakehouse architecture, and Power BI integration within the Microsoft ecosystem.

Key Responsibilities: Design and implement scalable, secure data solutions using Microsoft Fabric. Build and maintain data pipelines using Dataflows Gen2 and Data Factory. Work with Lakehouse architecture and manage datasets in OneLake. Develop notebooks (PySpark or T-SQL) for data transformation and processing. Collaborate with data analysts to create interactive dashboards and reports using Power BI (within Fabric). Leverage Synapse Data Warehouse and KQL databases for structured and real-time analytics. Monitor and optimize the performance of data pipelines and queries. Ensure adherence to data quality, security, and governance practices. Stay current with Microsoft Fabric updates and roadmap, recommending enhancements.

Required Skills: 3+ years of hands-on experience with Microsoft Fabric or similar tools in the Microsoft data stack. Strong proficiency with: Data Factory (Fabric); Synapse Data Warehouse / SQL analytics endpoints; Power BI integration and DAX; notebooks (PySpark, T-SQL); Lakehouse and OneLake. Understanding of data modeling, ETL/ELT processes, and real-time data streaming. Experience with KQL (Kusto Query Language) is a plus. Familiarity with Microsoft Purview, Azure Data Lake, or Azure Synapse Analytics is advantageous.

Qualifications: Microsoft Fabric, OneLake, Data Factory, Data Lake, Data Mesh
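The notebook transformation work this posting describes typically moves data from a raw (bronze) table to a cleansed (silver) one. A minimal sketch of such a step in plain Python, trimming strings, casting types, and deduplicating to the latest record per key; an actual Fabric notebook would express this in PySpark over OneLake/Lakehouse tables, and the order schema below is invented.

```python
# Hypothetical raw "bronze" rows: everything is a string, values are messy,
# and the same order appears more than once with different timestamps.
bronze = [
    {"order_id": "1", "customer": " Asha ", "amount": "100.50", "ts": 1},
    {"order_id": "1", "customer": "Asha",   "amount": "110.00", "ts": 2},
    {"order_id": "2", "customer": "Ravi",   "amount": "75.25",  "ts": 1},
]

def to_silver(rows):
    latest = {}
    for row in rows:
        # Cleanse: cast types and trim whitespace.
        clean = {
            "order_id": int(row["order_id"]),
            "customer": row["customer"].strip(),
            "amount": float(row["amount"]),
            "ts": row["ts"],
        }
        # Deduplicate: keep only the most recent record per order_id.
        key = clean["order_id"]
        if key not in latest or clean["ts"] > latest[key]["ts"]:
            latest[key] = clean
    return sorted(latest.values(), key=lambda r: r["order_id"])

silver = to_silver(bronze)
```

In PySpark the same step would usually be a `row_number()` window over `order_id` ordered by timestamp, keeping rank 1.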

Posted 1 month ago
