5.0 - 8.0 years
4 - 7 Lacs
Bengaluru
Work from Office
About The Role
Skill required: Delivery - Marketing Analytics and Reporting
Designation: I&F Decision Sci Practitioner Sr Analyst
Qualifications: Any Graduation
Years of Experience: 5 to 8 years

About Accenture
Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com.

What would you do?
Data & AI: analytical processes and technologies applied to marketing-related data to help businesses understand and deliver relevant experiences for their audiences, understand their competition, measure and optimize marketing campaigns, and optimize their return on investment.

What are we looking for?
Data analytics with a specialization in the marketing domain.
Domain-specific skills:
- Familiarity with ad tech and B2B sales
Technical skills:
- Proficiency in SQL and Python
- Experience in efficiently building, publishing, and maintaining robust data models and warehouses for self-service querying and for advanced data science and ML analytics
- Experience conducting ETL/ELT with very large and complicated datasets and handling DAG data dependencies
- Strong proficiency with SQL dialects on distributed or data-lake-style systems (Presto, BigQuery, Spark/Hive SQL, etc.), including nested data structure manipulation, windowing functions, query optimization, and data partitioning techniques; knowledge of Google BigQuery optimization is a plus
- Experience in schema design and data modeling strategies (e.g., dimensional modeling, data vault)
- Significant experience with dbt (or similar tools) and Spark-based (or similar) data pipelines
- General knowledge of Jinja templating in Python
- Hands-on experience with cloud provider integration and automation via CLIs and APIs
Soft skills:
- Ability to work well in a team
- Agility for quick learning
- Written and verbal communication

Roles and Responsibilities:
In this role you are required to analyze and solve increasingly complex problems. Your day-to-day interactions are with peers within Accenture. You are likely to have some interaction with clients and/or Accenture management. You will be given minimal instruction on daily work/tasks and a moderate level of instruction on new assignments. Decisions you make impact your own work and may impact the work of others. In this role you would be an individual contributor and/or oversee a small work effort and/or team. Please note that this role may require you to work in rotational shifts.

Qualification: Any Graduation
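To make the "windowing functions" and "Jinja templating in Python" items above concrete, here is a minimal sketch of rendering a parameterized, BigQuery-style window-function query with Jinja; the project, dataset, table, and column names are hypothetical:

```python
# Minimal sketch: render a windowing-function query from a Jinja template.
# The events table and its columns (user_id, event_ts) are hypothetical.
from jinja2 import Template

QUERY = Template("""
SELECT
    user_id,
    event_ts,
    ROW_NUMBER() OVER (
        PARTITION BY user_id
        ORDER BY event_ts DESC
    ) AS recency_rank
FROM `{{ project }}.{{ dataset }}.events`
WHERE event_ts >= TIMESTAMP('{{ start_date }}')
""")

if __name__ == "__main__":
    sql = QUERY.render(project="my-project", dataset="marketing",
                       start_date="2024-01-01")
    print(sql)  # feed the rendered SQL to the BigQuery client of your choice
```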
Posted 3 weeks ago
7.0 - 9.0 years
0 - 1 Lacs
Pune
Work from Office
Dear Candidate, we at Tata Technologies are looking for an experienced candidate for a Technical Lead-Data role at our Pune location. Please check the JD below; if it matches your profile, kindly share your resume at sayali.yadav@tatatechnologies.com.

Job Title: Technical Lead-Data
Total Exp: 7-9 Years
Location: Pune
Notice Period: Immediate-30 Days

Role & responsibilities
- Implement scalable, secure data architectures.
- Develop and maintain data models, data flow structures, and data warehouses.
- Translate business requirements into technical specifications for databases, data warehouses, and data streams.
- Ensure data accuracy, consistency, and security across the organization.
- Collaborate with IT teams, data scientists, and business stakeholders to optimize data utilization.
- Create and enforce data management policies and procedures.
- Monitor and improve system performance, conducting tests and troubleshooting as needed.
- Stay updated with the latest data technologies and best practices.

Desired Candidate Profile
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 7-9 years of experience in data lake related programs.
- Proficiency in SQL, Oracle, and data visualization tools.
- Experience with data warehousing solutions and ETL (Extract, Transform, Load) frameworks.
- Strong understanding of data security measures and compliance requirements.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills.

Preferred Skills:
- Experience with Snowflake, Databricks, or MS Fabric.
- Experience with cloud data platforms (e.g., AWS, Azure, Google Cloud).
- Knowledge of big data technologies (e.g., Hadoop, Spark).
Posted 3 weeks ago
1.0 - 3.0 years
3 - 6 Lacs
Mumbai, Mangaluru
Hybrid
6 months-3 years of IT experience. Knowledge of BigQuery, SQL, or similar tools. Aware of ETL and data warehouse concepts. Good oral and written communication skills. Great team player, able to work efficiently with minimal supervision. Should have good knowledge of Java or Python to conduct data cleansing.

Preferred:
- Good communication and problem-solving skills
- Experience with Spring Boot would be an added advantage
- Apache Beam development with Google Cloud Bigtable and Google BigQuery is desirable
- Experience with Google Cloud Platform (GCP)
- Skills in writing batch and stream processing jobs using the Apache Beam framework (Dataflow)
- Knowledge of microservices, Pub/Sub, Cloud Run, Cloud Functions

Roles and Responsibilities
- Develop high-performance, scalable solutions on GCP that extract, transform, and load big data.
- Design and build production-grade data solutions, from ingestion to consumption, using Java/Python.
- Design and optimize data models on GCP using data stores such as BigQuery.
- Optimize data pipelines for performance and cost on large-scale data lakes.
- Write complex, highly optimized queries across large data sets and create data processing layers.
- Interact closely with data engineers to identify the right tools to deliver product features by performing POCs.
- Collaborate as a team player with business, BAs, and other Data/ML engineers.
- Research new use cases for existing data.
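As a rough illustration of the "batch and stream processing jobs using the Apache Beam framework" item, a minimal batch pipeline might look like the sketch below; the input/output paths are hypothetical, and it runs locally on the DirectRunner (Dataflow would additionally need DataflowRunner pipeline options):

```python
# Minimal Apache Beam batch pipeline: read lines, count words, write results.
import apache_beam as beam

def run():
    with beam.Pipeline() as pipeline:  # DirectRunner by default
        (
            pipeline
            | "Read" >> beam.io.ReadFromText("input.txt")       # hypothetical input
            | "Split" >> beam.FlatMap(lambda line: line.split())
            | "PairWithOne" >> beam.Map(lambda word: (word, 1))
            | "Count" >> beam.CombinePerKey(sum)
            | "Format" >> beam.MapTuple(lambda word, n: f"{word}: {n}")
            | "Write" >> beam.io.WriteToText("counts")          # hypothetical output prefix
        )

if __name__ == "__main__":
    run()
```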
Posted 3 weeks ago
7.0 - 12.0 years
15 - 27 Lacs
Pune
Hybrid
Notice Period: Immediate joiners

Responsibilities
- Lead, develop, and support analytical pipelines to acquire, ingest, and process data from multiple sources
- Debug, profile, and optimize integrations and ETL/ELT processes
- Design and build data models that conform to our data architecture
- Collaborate with various teams to deliver effective, high-value reporting solutions by leveraging an established DataOps delivery methodology
- Continually recommend and implement process improvements and tools for data collection, analysis, and visualization
- Address production support issues promptly, keeping stakeholders informed of status and resolutions
- Partner closely with onshore and offshore technical resources
- Provide on-call support outside normal business hours as needed
- Provide status updates to stakeholders; identify obstacles and seek assistance with enough lead time to ensure on-time delivery
- Demonstrate technical ability, thoroughness, and accuracy in all assignments
- Document and communicate proper operations, standards, policies, and procedures
- Keep abreast of new tools and technologies related to our enterprise data architecture
- Foster a positive work environment by promoting teamwork and open communication

Skills/Qualifications
- Bachelor's degree in computer science, preferably with a focus on data engineering
- 6+ years of experience in data warehouse development, building and managing data pipelines in cloud computing environments
- Strong proficiency in SQL and Python
- Experience with Azure cloud services, including Azure Data Lake Storage, Data Factory, and Databricks
- Expertise in Snowflake or similar cloud warehousing technologies
- Experience with GitHub, including GitHub Actions
- Familiarity with data visualization tools such as Power BI or Spotfire
- Excellent written and verbal communication skills
- Strong team player with interpersonal skills to interact at all levels
- Ability to translate technical information for both technical and non-technical audiences
- Proactive mindset with a sense of urgency and initiative
- Adaptability to changing priorities and needs

If you are interested, share your updated resume by mail: recruit5@focusonit.com. Also, please spread this message across your network.
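For context on the Azure Data Lake Storage item above, a minimal ingestion step with the azure-storage-file-datalake SDK might look like this sketch; the account, container, and file paths are hypothetical:

```python
# Minimal sketch: land a local file in ADLS Gen2, the kind of step an
# ADF/Databricks pipeline would orchestrate. Names below are hypothetical.
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://mydatalake.dfs.core.windows.net",  # hypothetical account
    credential=DefaultAzureCredential(),
)
fs = service.get_file_system_client("raw")                  # hypothetical container
file_client = fs.get_file_client("sales/2024/orders.csv")   # hypothetical path

with open("orders.csv", "rb") as f:
    file_client.upload_data(f, overwrite=True)
```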
Posted 3 weeks ago
9.0 - 14.0 years
30 - 35 Lacs
Pune
Work from Office
Job Title: Production Specialist, AVP
Location: Pune, India

Role Description
Our organization within Deutsche Bank is AFC Production Services. We are responsible for providing technical L2 application support for business applications. The AFC (Anti-Financial Crime) line of business has a current portfolio of 25+ applications. The organization is in the process of transforming itself using Google Cloud and many new technology offerings. As an Assistant Vice President, your role will include hands-on production support, and you will be actively involved in technical issue resolution across multiple applications. You will also work as an application lead, responsible for technical and operational processes for all applications you support.

Deutsche Bank's Corporate Bank division is a leading provider of cash management, trade finance, and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion, and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.

What we'll offer you
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your key responsibilities
- Provide technical support by handling and consulting on BAU, incidents, emails, and alerts for the respective applications.
- Perform post-mortem and root cause analysis using ITIL standards of Incident Management, Service Request fulfillment, Change Management, Knowledge Management, and Problem Management.
- Manage the regional L2 team and vendor teams supporting the application; ensure the team is up to speed and picks up the support duties.
- Build up technical subject matter expertise on the applications being supported, including business flows, application architecture, and hardware configuration.
- Define and track KPIs, SLAs, and operational metrics to measure and improve application stability and performance.
- Conduct real-time monitoring to ensure application SLAs are achieved and maximum application availability (uptime), using an array of monitoring tools.
- Build and maintain effective and productive relationships with stakeholders in business, development, infrastructure, and third-party systems/data providers and vendors.
- Assist in the process of approving application code releases as well as tasks assigned to support.
- Keep key stakeholders informed using communication templates.
- Approach support with a proactive attitude, a desire to seek root cause through in-depth analysis, and a drive to reduce inefficiencies and manual effort.
- Mentor and guide junior team members, fostering technical upskilling and knowledge sharing.
- Provide strategic input into disaster recovery planning, failover strategies, and business continuity procedures.
- Collaborate and deliver on initiatives that drive stability in the environment.
- Review all open production items with the development team and push for updates and resolutions to outstanding tasks and recurring issues.
- Drive service resilience by implementing SRE (Site Reliability Engineering) principles, ensuring proactive monitoring, automation, and operational efficiency.
- Ensure regulatory and compliance adherence, managing audits, access reviews, and security controls in line with organizational policies.
The candidate will have to work in shifts as part of a rota covering APAC and EMEA hours, between 07:00 IST and 09:00 PM IST (2 shifts). In the event of major outages or issues we may ask for flexibility to help provide appropriate cover. Weekend on-call coverage needs to be provided on a rotational/need basis.

Your skills and experience
- 9-15 years of experience in providing hands-on IT application support.
- Experience in managing vendor teams providing 24x7 support.
- Preferred: team lead experience; experience in an investment bank or financial institution.
- Bachelor's degree from an accredited college or university with a concentration in Computer Science or an IT-related discipline (or equivalent work experience/diploma/certification).
- Preferred: ITIL v3 Foundation certification or higher.
- Knowledgeable in cloud products like Google Cloud Platform (GCP) and hybrid applications.
- Strong understanding of ITIL/SRE/DevOps best practices for supporting a production environment.
- Understanding of KPIs, SLOs, SLAs, and SLIs.
- Monitoring tools: knowledge of Elastic Search, Control-M, Grafana, Geneos, OpenShift, Prometheus, Google Cloud Monitoring, Airflow, Splunk.
- Working knowledge of creating dashboards and reports for senior management.
- Red Hat Enterprise Linux (RHEL): professional skill in searching logs, process commands, starting/stopping processes, and using OS commands to aid in resolving or investigating issues. Shell scripting knowledge is a plus.
- Understanding of database concepts and exposure to working with Oracle, MS SQL, BigQuery, and similar databases.
- Ability to work across countries, regions, and time zones with a broad range of cultures and technical capability.

Skills That Will Help You Excel
- Strong written and oral communication skills, including the ability to communicate technical information to a non-technical audience, and good analytical and problem-solving skills.
- Proven experience in leading L2 support teams, including managing vendor teams and offshore resources.
- Able to train, coach, and mentor, and know where each technique is best applied.
- Experience with GCP or another public cloud provider to build applications.
- Experience in an investment bank, financial institution, or large corporation using enterprise hardware and software.
- Knowledge of Actimize, Mantas, and case management software is good to have.
- Working knowledge of Big Data (Hadoop/Secure Data Lake) is a plus.
- Prior experience in automation projects is great to have; exposure to Python, shell, Ansible, or other scripting languages for automation and process improvement.
- Strong stakeholder management skills, ensuring seamless coordination between business, development, and infrastructure teams.
- Ability to manage high-pressure issues, coordinating across teams to drive swift resolution.
- Strong negotiation skills with interface teams to drive process improvements and efficiency gains.
How we'll support you
About us and our teams: please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative, and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair, and inclusive work environment.
Posted 3 weeks ago
5.0 - 10.0 years
22 - 27 Lacs
Navi Mumbai
Work from Office
As an Architect at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Architect, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- Strong understanding of data lake approaches, industry standards, and industry best practices
- Detailed understanding of the Hadoop framework and ecosystem, MapReduce, and data on containers (data in OpenShift)
- Applies individual experience/competency and the IBM architectural thinking model to analyzing client IT systems
- Experience with relational SQL, Big Data, etc.
- Experience with cloud-native platforms such as AWS, Azure, Google Cloud, IBM Cloud, or cloud-native data platforms like Snowflake

Preferred technical and professional experience
- Knowledge of MS Azure Cloud
- Experience in Unix shell scripting and Python
Posted 3 weeks ago
5.0 - 10.0 years
9 - 14 Lacs
Mysuru
Work from Office
As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- We are seeking a skilled Azure Data Engineer with 5+ years of experience, including 3+ years of hands-on experience with ADF/Databricks
- The ideal candidate has Databricks, Data Lake, and Python programming skills
- Experience deploying to Databricks
- Familiarity with Azure Data Factory

Preferred technical and professional experience
- Good communication skills
- 3+ years of experience with ADF/Databricks/Data Lake
- Ability to communicate results to technical and non-technical audiences
Posted 3 weeks ago
7.0 - 12.0 years
10 - 15 Lacs
Bengaluru
Hybrid
Hiring an AWS Data Engineer for a 6-month hybrid contractual role based in Bellandur, Bengaluru. The ideal candidate will have 7+ years of experience in data engineering, with strong expertise in AWS services (S3, EC2, RDS, Lambda, EKS), PostgreSQL, Redis, Apache Iceberg, and Graph/Vector Databases. Proficiency in Python or Golang is essential. Responsibilities include designing and optimizing data pipelines on AWS, managing structured and in-memory data, implementing advanced analytics with vector/graph databases, and collaborating with cross-functional teams. Prior experience with CI/CD and containerization (Docker/Kubernetes) is a plus.
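The PostgreSQL-plus-Redis combination listed above commonly appears as a cache-aside lookup; a minimal sketch, assuming hypothetical connection details and a hypothetical users table:

```python
# Cache-aside sketch: check Redis first, fall back to Postgres, then cache.
# DSN, host, and the users table are hypothetical placeholders.
import json

import psycopg2
import redis

cache = redis.Redis(host="localhost", port=6379, db=0)

def get_user(user_id: int) -> dict:
    key = f"user:{user_id}"
    cached = cache.get(key)
    if cached:
        return json.loads(cached)  # cache hit
    conn = psycopg2.connect("dbname=app user=app")  # hypothetical DSN
    with conn, conn.cursor() as cur:
        cur.execute("SELECT id, name FROM users WHERE id = %s", (user_id,))
        row = cur.fetchone()
    user = {"id": row[0], "name": row[1]}
    cache.set(key, json.dumps(user), ex=300)  # cache for 5 minutes
    return user
```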
Posted 3 weeks ago
8.0 - 13.0 years
20 - 30 Lacs
Chennai
Hybrid
Role: Power BI Architect
Experience: 10+ Years
Location: Chennai (candidates willing to relocate from the Tamil Nadu region are fine)
Immediate joiners only.

Job Description:
- Bachelor's degree in computer science, information systems, or a related field (or equivalent experience).
- At least 7+ years of proven experience in developing Power BI solutions, including data modeling and ETL processes.
- Has designed or architected solutions using Power BI connected to Snowflake or a data lake.
- Experience with performance tuning, data modeling, and DAX optimization in that context.
- Exposure to enterprise-level reporting, preferably with large datasets and cloud data platforms.
- Strong proficiency in DAX and Power Query.
- Experience with SQL and relational databases.
- Understanding of data warehousing and dimensional modeling concepts.
- Experience with data integration tools and techniques.
- Strong problem-solving and analytical skills.
- Excellent communication and collaboration skills.
- Ability to work independently and as part of a team.
- Experience with Azure services (e.g., Azure Data Factory, Azure SQL Database, Azure Databricks) is a plus.
- Experience with version control systems (e.g., Git) is a plus.
Posted 3 weeks ago
5.0 - 9.0 years
10 - 20 Lacs
Pune, Chennai, Bengaluru
Work from Office
Key Responsibilities
- Design, build, and maintain robust, scalable, and efficient ETL/ELT pipelines.
- Implement data ingestion processes using Fivetran and integrate various structured and unstructured data sources into GCP-based environments.
- Develop data models and transformation workflows using dbt and manage version-controlled pipelines.
- Build and manage data storage solutions using Snowflake, optimizing for cost, performance, and scalability.
- Orchestrate workflows and pipeline dependencies using Apache Airflow.
- Design and support data lake architecture for raw and curated data zones.
- Collaborate with data analysts, data scientists, and product teams to ensure availability and quality of data.
- Monitor data pipeline performance, ensure data integrity, and handle error recovery mechanisms.
- Follow best practices in CI/CD, testing, data governance, and security standards.

Required Skills
- 5-7 years of professional experience in data engineering roles.
- Hands-on experience with GCP services: BigQuery, Cloud Storage, Pub/Sub, Dataflow, Composer, etc.
- Proficient in writing modular SQL transformations and data modeling using dbt.
- Deep understanding of Snowflake warehousing: performance tuning, cost optimization, security.
- Experience with Airflow for pipeline orchestration and DAG management.
- Familiarity with designing and implementing data lake solutions.
- Proficient in Python and/or SQL.

Send profiles to payal.kumari@nam-it.com

Regards,
Payal Kumari
Senior Executive Staffing
NAM Info Pvt Ltd, 29/2B-01, 1st Floor, K.R. Road, Banashankari 2nd Stage, Bangalore - 560070.
Email: payal.kumari@nam-it.com
Website: www.nam-it.com
USA | CANADA | INDIA
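For the Airflow orchestration and dbt items in the posting above, a minimal DAG sketch might look like the following; the dbt project path and schedule are hypothetical:

```python
# Minimal Airflow DAG: run dbt models daily, then run dbt tests.
# The project directory and schedule below are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_dbt_refresh",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    run_models = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",  # hypothetical path
    )
    test_models = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics",
    )
    run_models >> test_models  # tests only run after a successful build
```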
Posted 3 weeks ago
10.0 - 15.0 years
1 - 2 Lacs
Hyderabad
Work from Office
Experience needed: 10-15 years
Type: Full-Time
Mode: WFO
Shift: General Shift IST
Location: Hyderabad
NP: Immediate joiner - 30 days

Job Summary:
We are seeking a highly experienced and results-driven Power BI Architect to lead the design, development, and implementation of enterprise-level BI solutions. The ideal candidate will have deep expertise in Power BI architecture, data modeling, visualization, DAX, and Power BI/Fabric administration, along with a solid foundation in Microsoft Azure and Microsoft Entra. You will work closely with data engineers, analysts, and stakeholders to build a scalable and secure data visualization ecosystem.

Key Responsibilities:
- Design end-to-end Power BI architecture, including data ingestion, modeling, visualization, and governance.
- Lead implementation of dimensional data models to support enterprise reporting and analytics needs.
- Develop and optimize Power BI reports and dashboards using DAX, M Language (Power Query), and advanced visualizations.
- Architect and manage the Power BI Service environment, including workspaces, datasets, dataflows, gateways, and security.
- Define and implement Power BI SDLC processes, including versioning, deployment pipelines, and documentation.
- Manage Power BI/Fabric administration tasks, including tenant settings, capacity management, and usage monitoring.
- Ensure best practices in report performance tuning, data refresh optimization, and data security.
- Collaborate with Azure teams to integrate Power BI solutions with Microsoft Azure services (Data Lake, Synapse, Data Factory, etc.).
- Implement Microsoft Entra (Azure AD) role-based access controls and security for BI content.
- Provide thought leadership and mentorship to BI developers and analysts.
- Stay current on Microsoft's data and analytics roadmap and assess its applicability to ongoing projects.

Required Skills & Qualifications:
- Strong experience with Power BI Desktop, Power BI Service, and Power BI Premium/Fabric.
- Expertise in DAX and Power Query (M Language).
- Proven experience with dimensional modeling and data warehousing concepts.
- Proficient in ETL processes and integrating data from multiple sources.
- Demonstrated success in leading enterprise BI implementations.
- Solid understanding of and experience with Power BI governance, security, and lifecycle management.
- Experience with the Microsoft Azure platform, especially Azure Data Services.
- Experience and knowledge of Microsoft Entra (Azure AD) for authentication and access management.
- Strong communication and stakeholder management skills.

Preferred Qualifications:
- Microsoft Certified: Power BI Data Analyst Associate or Azure Data Engineer Associate.
- Familiarity with DevOps and CI/CD pipelines for Power BI deployments.
- Experience working in Agile/Scrum environments.
Posted 3 weeks ago
4.0 - 8.0 years
12 - 22 Lacs
Pune
Work from Office
Key Responsibilities

Oversight & Optimisation of Data Lakehouse & Architecture, Data Engineering & Pipelines
- Understand lakehouse architectures that unify structured and semi-structured data at scale
- Strong experience implementing and monitoring job scheduling and orchestration using Airflow, Azure Data Factory, and CI/CD triggers, and with Azure Dataflows, Databricks, and Delta Lake for real-time/batch processing
- Manage schema evolution, data versioning (e.g., Delta Lake), and pipeline adaptability
- Tune pipeline performance for latency, resource usage, and throughput

Cloud Infrastructure & Automation
- Infrastructure automation using Terraform, Azure Bicep, and AWS CDK
- Setting up scalable cloud storage (Data Lake Gen2, S3, Blob, RDS, etc.)
- Administering RBAC, secure key vault access, and compliance-driven access controls
- Tuning infrastructure and services for cost efficiency and compute optimisation

Full-Stack Cloud Data Platform Design
- Designing end-to-end Azure/AWS data platforms, including ingestion, transformation, storage, and serving layers
- Interfacing with BI/AI teams to ensure data readiness, semantic modeling, and ML enablement
- Familiarity with metadata management, lineage tracking, and data catalog integration

Enterprise Readiness & Delivery
- Experience working with MNCs and large enterprises with strict processes, approvals, and data governance
- Capable of evaluating alternative tools/services across clouds for architecture flexibility and cost-performance balance
- Hands-on with CI/CD, monitoring, and security best practices in regulated environments (BFSI, Pharma, Manufacturing)
- Lead cost-performance optimisation across Azure and hybrid cloud environments
- Design modular, scalable infrastructure using Terraform/CDK/Bicep with a DevSecOps mindset
- Explore alternative cloud tools/services across compute, storage, identity, and monitoring to propose optimal solutions
- Drive RBAC, approval workflows, and governance controls in line with typical enterprise/MNC deployment security protocols
- Support BI/data teams with infrastructure tuning, pipeline stability, and client demo readiness
- Collaborate with client-side architects, procurement, and finance teams for approvals and architectural alignment

Ideal Profile
- 4-7 years of experience in cloud infrastructure and platform engineering
- Strong hold on Microsoft Azure; hands-on exposure to AWS/GCP/Snowflake is acceptable
- Skilled in IaC tools (Terraform, CDK), CI/CD, monitoring (Grafana, Prometheus), and cost optimisation tools
- Comfortable proposing innovative, multi-vendor architectures that balance cost, performance, and compliance
- Prior experience working with large global clients or in regulated environments (e.g., BFSI, Pharma, Manufacturing)

Preferred Certifications
- Microsoft Azure Administrator / Architect (Associate/Expert)
- AWS Solutions Architect / FinOps Certified
- Bonus: Snowflake, DevOps Professional, or Data Platform certifications
Posted 3 weeks ago
4.0 - 8.0 years
5 - 15 Lacs
Chennai, Delhi / NCR, Mumbai (All Areas)
Hybrid
Job Description (JD): Azure Databricks / ADF / Synapse, with strong emphasis on Python, SQL, Data Lake, and Data Warehouse.

Job Title: Data Engineer - Azure (Databricks / ADF / Synapse)
Experience: 4 to 7 Years
Location: Pan India
Employment Type: Full-Time
Notice Period: Immediate to 30 Days

Job Summary:
We are looking for a skilled and experienced Data Engineer with 4 to 8 years of experience in building scalable data solutions on the Microsoft Azure ecosystem. The ideal candidate must have strong hands-on experience with Azure Databricks, Azure Data Factory (ADF), or Azure Synapse Analytics, along with Python and SQL expertise. Familiarity with Data Lake and Data Warehouse concepts and end-to-end data pipelines is essential.

Key Responsibilities:
- Requirement gathering and analysis
- Experience with different databases like Synapse, SQL DB, Snowflake, etc.
- Design and implement data pipelines using Azure Data Factory, Databricks, and Synapse
- Create and manage Azure SQL Data Warehouses and Azure Cosmos DB databases
- Extract, transform, and load (ETL) data from various sources into Azure Data Lake Storage
- Implement data security and governance measures
- Monitor and optimize data pipelines for performance and efficiency
- Troubleshoot and resolve data engineering issues
- Provide optimized solutions for any problem related to data engineering
- Ability to work with a variety of sources: relational DBs, APIs, file systems, real-time streams, CDC, etc.
- Strong knowledge of Databricks and Delta tables

Required Skills:
- 4-8 years of experience in Data Engineering or related roles.
- Hands-on experience in Azure Databricks, ADF, or Synapse Analytics.
- Proficiency in Python for data processing and scripting.
- Strong command of SQL: writing complex queries, performance tuning, etc.
- Experience working with Azure Data Lake Storage and Data Warehouse concepts (e.g., dimensional modeling, star/snowflake schemas).
- Understanding of CI/CD practices in a data engineering context.
- Excellent problem-solving and communication skills.

Good to Have:
- Experience with Delta Lake, Power BI, or Azure DevOps.
- Knowledge of Spark, Scala, or other distributed processing frameworks.
- Exposure to BI tools like Power BI, Tableau, or Looker.
- Familiarity with data security and compliance in the cloud.
- Experience in leading a development team.
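A minimal PySpark/Delta sketch of the "Databricks, Delta tables" knowledge the posting asks for; the mount paths and column names are hypothetical, and on Databricks the Spark session already exists:

```python
# Minimal sketch: filter a raw dataset and write it as a partitioned Delta table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

raw = spark.read.json("/mnt/raw/orders/")            # hypothetical landing zone
curated = (
    raw.filter(F.col("status") == "COMPLETE")
       .withColumn("order_date", F.to_date("order_ts"))
)
(
    curated.write.format("delta")
           .mode("overwrite")
           .partitionBy("order_date")                # partition for pruning
           .save("/mnt/curated/orders")              # hypothetical Delta path
)
```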
Posted 3 weeks ago
6.0 - 11.0 years
25 - 35 Lacs
Bengaluru
Hybrid
We are hiring Azure Data Engineers for an active project (Bangalore location). Interested candidates can share the following details by mail, along with their updated resume:
- Total experience?
- Relevant experience in Azure data engineering?
- Current organization?
- Current location?
- Current fixed salary?
- Expected salary?
- Do you have any offers? If yes, mention the offer you have and the reason for looking for another opportunity.
- Open to relocating to Bangalore?
- Notice period? If serving / not working, mention your LWD.
- Do you have a PF account?
Posted 3 weeks ago
7.0 - 12.0 years
9 - 14 Lacs
Bengaluru
Work from Office
Location: Bangalore/Hyderabad/Pune
Experience level: 7+ Years

About the Role
We are seeking a highly skilled Snowflake Developer to join our team in Bangalore. The ideal candidate will have extensive experience in designing, implementing, and managing Snowflake-based data solutions. This role involves developing data architectures and ensuring the effective use of Snowflake to drive business insights and innovation.

Key Responsibilities:
- Design and implement scalable, efficient, and secure Snowflake solutions to meet business requirements.
- Develop data architecture frameworks, standards, and principles, including modeling, metadata, security, and reference data.
- Implement Snowflake-based data warehouses, data lakes, and data integration solutions.
- Manage data ingestion, transformation, and loading processes to ensure data quality and performance.
- Collaborate with business stakeholders and IT teams to develop data strategies and ensure alignment with business goals.
- Drive continuous improvement by leveraging the latest Snowflake features and industry trends.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, Data Science, or a related field.
- 8+ years of experience in data architecture, data engineering, or a related field.
- Extensive experience with Snowflake, including designing and implementing Snowflake-based solutions.
- Must have exposure to working with Airflow.
- Proven track record of contributing to data projects and working in complex environments.
- Familiarity with cloud platforms (e.g., AWS, GCP) and their data services.
- Snowflake certification (e.g., SnowPro Core, SnowPro Advanced) is a plus.
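For a sense of the day-to-day Snowflake development described above, a minimal sketch using the Snowflake Python connector; the credentials, warehouse, stage, and table names are hypothetical:

```python
# Minimal sketch: load staged files into a table, then sanity-check the count.
# All connection parameters and object names below are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    user="etl_user",
    password="...",
    account="myorg-myaccount",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    cur.execute(
        "COPY INTO orders FROM @raw_stage/orders/ FILE_FORMAT = (TYPE = CSV)"
    )
    cur.execute("SELECT COUNT(*) FROM orders")
    print(cur.fetchone()[0])  # rows now in the target table
finally:
    conn.close()
```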
Posted 3 weeks ago
4.0 - 6.0 years
6 - 8 Lacs
Bengaluru
Work from Office
We are seeking a CRMA Sr. Developer with a strong technical background and extensive understanding of and experience in database and analytical tools to join our team. This role offers the opportunity to work with cutting-edge analytics tools like Tableau CRM and Einstein Analytics and make an impact on our data-driven journey.

Key Responsibilities:
Data Analysis and Insights:
- Utilize Tableau CRM and Einstein Analytics to identify unexpected business outcomes and perform in-depth analysis.
- Experience with other business intelligence (BI), extract-transform-load (ETL), Power BI, and other business analytics reporting tools.
Technical Support:
- Provide expert support in Tableau CRM, Einstein AI, and analytics tools to ensure optimal utilization within the organization.
Data Design and Analysis Techniques:
- Employ data cleansing, statistical analysis, data mining, predictive analysis, and other data analysis techniques to extract valuable insights from extensive datasets.
- Design and configure highly scalable solutions for clients.
Research and Recommendations:
- Conduct research and offer recommendations on product components, services, protocols, and standards to enhance data analytics capabilities.
Data Transformation and Visualization:
- Collaborate in the design of data transformation processes, data flows, and datasets.
- Create dashboard wireframes and stories to communicate data insights effectively.
- Use methods like data cleansing, statistics, data mining, and predictive analysis to collect and extract insights from large-scale data sets.

Required Skills:
- Salesforce Expertise: A strong foundation in Salesforce development is essential. Proficiency in Salesforce platform features, data modeling, and configuration is a must.
- Platform Experience: A minimum of 4 years of hands-on experience working with Salesforce Einstein Analytics and Tableau CRM.
- Einstein Discovery: Profound understanding of Einstein Discovery from both a functional and technical perspective.
- Analytical Skills: Strong experience in data analysis with the ability to identify trends and create visualizations from extensive datasets.
- Business Alignment: Demonstrated ability to align data analytics with business objectives, envision solutions to address business problems, and derive value from data.
- Scalable Solutions: Experience in building and configuring highly scalable solutions for clients.
- Proof of Concepts: Capable of creating and delivering hands-on technology proofs of concept.
- Database Understanding: Strong conceptual understanding and hands-on working experience with relational databases, data warehouses, and data lakes; good skills in SQL, normalization, and data modeling.
- Communication Skills: Effective communication skills to collaborate with clients, team members, and stakeholders, including the ability to translate business requirements into technical solutions.
- Problem-Solving: Strong problem-solving abilities to address complex business challenges and optimize processes.

Qualifications:
- Bachelor's degree in computer science, Information Technology, or a related field.
- 4+ years of experience as a Salesforce Einstein Analytics Developer.
- Certification in Tableau CRM and Einstein Discovery Consultant is highly desired, as is Sales Cloud.
- Strong communication skills to articulate technical concepts to non-technical stakeholders.
Posted 3 weeks ago
10.0 - 14.0 years
12 - 16 Lacs
Bengaluru
Work from Office
We are seeking a CRMA Developer with a strong technical background and extensive understanding of and experience in database and analytical tools to join our team. This role offers the opportunity to work with cutting-edge analytics tools like Tableau CRM and Einstein Analytics and make an impact on our data-driven journey.

Key Responsibilities:
Data Analysis and Insights:
- Utilize Tableau CRM and Einstein Analytics to identify unexpected business outcomes and perform in-depth analysis.
- Experience with other business intelligence (BI), extract-transform-load (ETL), Power BI, and other business analytics reporting tools.
Technical Support:
- Provide expert support in Tableau CRM, Einstein AI, and analytics tools to ensure optimal utilization within the organization.
Data Design and Analysis Techniques:
- Employ data cleansing, statistical analysis, data mining, predictive analysis, and other data analysis techniques to extract valuable insights from extensive datasets.
- Design and configure highly scalable solutions for clients.
Research and Recommendations:
- Conduct research and offer recommendations on product components, services, protocols, and standards to enhance data analytics capabilities.
Data Transformation and Visualization:
- Collaborate in the design of data transformation processes, data flows, and datasets.
- Create dashboard wireframes and stories to communicate data insights effectively.
- Use methods like data cleansing, statistics, data mining, and predictive analysis to collect and extract insights from large-scale data sets.

Required Skills:
- Salesforce Expertise: A strong foundation in Salesforce development is essential. Proficiency in Salesforce platform features, data modeling, and configuration is a must.
- Platform Experience: A minimum of 5 years of hands-on experience working with Salesforce Einstein Analytics and Tableau CRM on at least 4 large projects.
- Einstein Discovery: Profound understanding of Einstein Discovery from both a functional and technical perspective.
- Analytical Skills: Strong experience in data analysis with the ability to identify trends and create visualizations from extensive datasets.
- Business Alignment: Demonstrated ability to align data analytics with business objectives, envision solutions to address business problems, and derive value from data.
- Scalable Solutions: Experience in building and configuring highly scalable solutions for clients.
- Proof of Concepts: Capable of creating and delivering hands-on technology proofs of concept.
- Database Understanding: Strong conceptual understanding and hands-on working experience with relational databases, data warehouses, and data lakes; good skills in SQL, normalization, and data modeling.
- Communication Skills: Effective communication skills to collaborate with clients, team members, and stakeholders, including the ability to translate business requirements into technical solutions.
- Problem-Solving: Strong problem-solving abilities to address complex business challenges and optimize processes.

Qualifications:
- Bachelor's degree in computer science, Information Technology, or a related field.
- 5 to 10+ years of experience as a Salesforce Einstein Analytics Developer, with at least 4 years of implementation experience on large-scale projects.
- Certification in Tableau CRM and Einstein Discovery Consultant is highly desired, as is Sales Cloud.
- Strong communication skills to articulate technical concepts to non-technical stakeholders.
Posted 3 weeks ago
15.0 - 20.0 years
20 - 30 Lacs
Noida, Gurugram
Hybrid
Design architectures using Microsoft SQL Server and MongoDB. Develop ETL processes and data lakes. Integrate reporting tools like Power BI, Qlik, and Crystal Reports into the data strategy. Implement AWS cloud services (PaaS, SaaS, IaaS), SQL and NoSQL databases, and data integration.
Posted 3 weeks ago
4.0 - 6.0 years
1 - 2 Lacs
Mumbai, New Delhi, Bengaluru
Work from Office
Responsibilities:
- Design and implement scalable data pipelines to ingest, process, and analyze large volumes of structured and unstructured data from various sources.
- Develop and optimize data storage solutions, including data warehouses, data lakes, and NoSQL databases, to support efficient data retrieval and analysis.
- Implement data processing frameworks and tools such as Apache Hadoop, Spark, Kafka, and Flink to enable real-time and batch data processing.
- Collaborate with data scientists and analysts to understand data requirements and develop solutions that enable advanced analytics, machine learning, and reporting.
- Ensure data quality, integrity, and security by implementing best practices for data governance, metadata management, and data lineage.
- Monitor and troubleshoot data pipelines and infrastructure to ensure reliability, performance, and scalability.
- Develop and maintain ETL (Extract, Transform, Load) processes to integrate data from various sources and transform it into usable formats.
- Stay current with emerging technologies and trends in big data and cloud computing, and evaluate their applicability to enhance our data engineering capabilities.
- Document data architectures, pipelines, and processes to ensure clear communication and knowledge sharing across the team.

Requirements:
- Strong programming skills in Java, Python, or Scala.
- Strong understanding of data modelling, data warehousing, and ETL processes.
- 4 to 6 years of relevant experience.
- Strong understanding of Big Data technologies and their architectures, including Hadoop, Spark, and NoSQL databases.

Locations: Mumbai, Delhi NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
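To illustrate the Kafka-plus-Spark real-time processing item above, a minimal Spark Structured Streaming sketch; the broker, topic, and sink paths are hypothetical, and running it requires the spark-sql-kafka connector package on the classpath:

```python
# Minimal sketch: stream Kafka events into a Parquet area of a data lake.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
         .option("subscribe", "clickstream")                # hypothetical topic
         .load()
         .select(F.col("value").cast("string").alias("payload"))
)

query = (
    events.writeStream.format("parquet")
          .option("path", "/data/lake/clickstream")         # hypothetical sink
          .option("checkpointLocation", "/data/checkpoints/clickstream")
          .start()
)
query.awaitTermination()
```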
Posted 3 weeks ago
4.0 - 9.0 years
9 - 14 Lacs
Hyderabad
Work from Office
As part of our strategic initiative to build a centralized capability around data and cloud engineering, we are establishing a dedicated Azure Cloud Data Engineering practice. This team will be at the forefront of designing, developing, and deploying scalable data solutions on the cloud, primarily using the Microsoft Azure platform. The practice will serve as a centralized team, driving innovation, standardization, and best practices across cloud-based data initiatives. New hires will play a pivotal role in shaping the future of our data landscape, collaborating with cross-functional teams, clients, and stakeholders to deliver impactful, end-to-end solutions.

Primary Responsibilities:
- Ingest data from multiple on-prem and cloud data sources using various tools and capabilities in Azure
- Design and develop Azure Databricks processes using PySpark/Spark-SQL
- Design and develop orchestration jobs using ADF and Databricks Workflows
- Analyze data engineering processes being developed, act as an SME to troubleshoot performance issues, and suggest solutions for improvement
- Develop and maintain CI/CD processes using Jenkins, GitHub, GitHub Actions, etc.
- Build a test framework for Databricks notebook jobs for automated testing before code deployment
- Design and build POCs to validate new ideas, tools, and architectures in Azure
- Continuously explore new Azure services and capabilities; assess their applicability to business needs
- Create detailed documentation for cloud processes, architecture, and implementation patterns
- Work with the data & analytics team to build and deploy efficient data engineering processes and jobs on the Azure cloud
- Prepare case studies and technical write-ups to showcase successful implementations and lessons learned
- Work closely with clients, business stakeholders, and internal teams to gather requirements and translate them into technical solutions using best practices and appropriate architecture
- Contribute to full-lifecycle project implementations, from design and development to deployment and monitoring
- Ensure solutions adhere to security, compliance, and governance standards
- Monitor and optimize data pipelines and cloud resources for cost and performance efficiency
- Identify solutions to non-standard requests and problems
- Support and maintain the self-service BI warehouse
- Mentor and support existing on-prem developers for the cloud environment
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary, or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications
- Undergraduate degree or equivalent experience
- 4+ years of overall experience in Data & Analytics engineering
- 4+ years of experience working with Azure, Databricks, ADF, and Data Lake
- Solid experience working with data platforms and products using PySpark and Spark-SQL
- Solid experience with CI/CD tools such as Jenkins, GitHub, GitHub Actions, Maven, etc.
- In-depth understanding of Azure architecture and the ability to come up with efficient designs and solutions
- Highly proficient in Python and SQL
- Proven excellent communication skills

Preferred Qualifications
- Snowflake, Airflow experience
- Power BI development experience
- Experience or knowledge of health care concepts: E&I, M&R, C&S LOBs, Claims, Members, Provider, Payers, Underwriting

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location, and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups, and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.
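One common shape for the "test framework for Databricks notebook jobs" responsibility above is to factor notebook logic into plain functions and test them on a local Spark session with pytest; a minimal sketch under that assumption, with hypothetical column names:

```python
# Minimal pytest-style sketch: the transformation lives in a plain function so
# it can be exercised locally before deployment. Column names are hypothetical.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

def dedupe_members(df):
    """Keep only the latest record per member_id -- the logic under test."""
    w = Window.partitionBy("member_id").orderBy(F.col("updated_at").desc())
    return df.withColumn("rn", F.row_number().over(w)).filter("rn = 1").drop("rn")

def test_dedupe_members():
    spark = SparkSession.builder.master("local[1]").getOrCreate()
    df = spark.createDataFrame(
        [(1, "2024-01-01"), (1, "2024-02-01"), (2, "2024-01-15")],
        ["member_id", "updated_at"],
    )
    out = dedupe_members(df)
    assert out.count() == 2  # one row per member
    assert out.filter("member_id = 1").first()["updated_at"] == "2024-02-01"
```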
Posted 3 weeks ago
7.0 - 12.0 years
12 - 22 Lacs
Mumbai
Work from Office
Job Name: (Digital Banking) Associate Data Analyst
Location: Mumbai
Grade: Senior Manager / AVP

Looking for a business analyst from an RBI-regulated sector (bank, lending NBFC) or a consulting firm working on banking data, with experience in business credit risk.

Predominant Skills:
- Data quality and remediation processes (databases, SQL, and Python)
- Data visualisation skills (dashboards, Tableau, Power BI)
- Informatica Data Quality
- Basic understanding of data lakes and cloud environments

Job Purpose
HDFC Bank has a huge volume of data, both structured and unstructured, and we are focused on creating assets out of data and deriving the best value from the data for the Bank. The Data Remediation and DaaS specialist will be responsible for improving customer data quality through various internal data remediation methodologies. This role will also focus on designing, implementing, and maintaining global and local data marts on the Bank's Data Lake to support business, marketing, analytics, regulatory, and other functional use cases. This role is crucial in ensuring high-quality customer data while enabling business functions with reliable and well-structured data marts. The ideal candidate will be someone with a passion for data quality, strong technical skills, and a strategic mindset to drive data-driven decision-making across the Bank.

Role & responsibilities
Customer Data Quality Management
- Analyze and assess data quality issues in customer records
- Implement data cleansing, standardization, and deduplication strategies
- Monitor and improve the accuracy, completeness, and consistency of customer data
Formulate Data Remediation Strategies
- Conduct root cause analysis to identify sources of poor data quality
- Coordinate with internal stakeholders to drive data improvement initiatives
Data Mart Development & Maintenance
- Engage with business, product, credit, risk, analytics, marketing, finance, BIU, and other stakeholders to discover data mart requirements along with the current challenges faced
- Provide inputs and recommendations on continuous improvement of policies, procedures, processes, standards, and controls pertaining to data marts
- Quantify the impact in business value terms (revenue/cost/loss) due to the launch of global and local data marts

Experience Required
- 5-7 years of total work experience in data quality / data product creation
- 5+ years of experience in banking and financial services
- Experience working in a large, multi-functional, matrix organization
- Strong technical and functional understanding of data remediation and data products, including staging, mapping, cleanse functions, match rules, validation, trust scores, remediation techniques, mart creation methodologies, and best practices
- Experience with industry-leading master data/metadata/data quality suites, such as Informatica Data Quality
- Exposure to working in a cloud environment will be an added advantage
Posted 3 weeks ago
3.0 - 5.0 years
5 - 9 Lacs
Pune
Work from Office
Qualification: Degree in Computer Science (or similar); alternatively, well-founded professional experience in the desired field.

Roles & Responsibilities:
As a Senior Data Engineer, you manage and develop the solutions in close alignment with various business and Spoke stakeholders. You are responsible for the implementation of the IT governance guidelines. You collaborate with the Spoke's Data Scientists, Data Analysts, and Business Analysts, when relevant.

Tasks
- Create and manage data pipeline architecture for data ingestion, pipeline setup, and data curation
- Experience working with and creating cloud data solutions
- Assemble large, complex data sets that meet functional and non-functional business requirements
- Implement the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using PySpark, SQL, and AWS big data technologies
- Build analytics tools that use the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics
- Manipulate data at scale: get data into a ready-to-use state in close alignment with various business and Spoke stakeholders

Must Have:
- Advanced knowledge: ETL, Data Lake, Data Warehouse, and RDS architectures
- Python, SQL (any other OOP language is also valuable)
- PySpark (preferably) or Spark knowledge
- Object-oriented programming, clean code, and good documentation skills
- AWS: S3, Athena, Lambda, Glue, IAM, SQS, EC2, QuickSight, etc.
- Git
- Data analysis & visualization

Optional:
- AWS CDK (Cloud Development Kit)
- CI/CD knowledge
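As a small illustration of the AWS (Athena) skills listed above, a minimal boto3 sketch that submits a query and polls its state; the region, database, table, and results bucket are hypothetical:

```python
# Minimal sketch: run an Athena query via boto3 and wait for it to finish.
# Region, database, table, and the S3 output location are hypothetical.
import time

import boto3

athena = boto3.client("athena", region_name="eu-central-1")

resp = athena.start_query_execution(
    QueryString="SELECT COUNT(*) FROM events",            # hypothetical table
    QueryExecutionContext={"Database": "curated"},        # hypothetical database
    ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},
)
query_id = resp["QueryExecutionId"]

while True:  # poll until the query reaches a terminal state
    status = athena.get_query_execution(QueryExecutionId=query_id)
    state = status["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

print(state)
```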
Posted 3 weeks ago
4.0 - 7.0 years
1 - 6 Lacs
Bengaluru
Hybrid
Job Title: Data Engineer & Gen AI Engineer
Project Duration: 6 months
Experience: 4-7 years
Location: Bangalore (Hybrid)
Budget: up to 19 LPA
Required Skills & Qualifications: Python, LangChain, Azure Databricks, data quality, PySpark, SQL, and Delta Lake
Posted 3 weeks ago
10.0 - 17.0 years
30 - 40 Lacs
Pune
Hybrid
The Opportunity
Join our team at Nutanix as a Staff Engineer in our Pune office, where you will play a crucial role in providing a solid Nutanix Cloud Manager (NCM) product to our customers. Your mission will be to design and build an observability platform that allows our customers to proactively monitor and manage their infrastructure and applications. By building a reliable NCM data platform and driving the adoption of best practices, you will contribute to the success of our products. This is a unique opportunity to work with cutting-edge technologies, lead and mentor engineers, and be part of a fast-paced environment where autonomy and ownership are valued. The work will range from analyzing application requirements, proposing and benchmarking databases, and designing data processing pipelines, to data analysis and recommending best practices based on workload.

About the Team
At Nutanix, you'll be joining the Insights team, a dynamic and innovative group dedicated to leveraging data analytics for impactful business decision-making. With team members located in both the US and India, we foster a collaborative environment that encourages sharing diverse perspectives and ideas. Our culture is rooted in creativity and continuous improvement, allowing us to drive meaningful change and deliver exceptional results. You will report to the Director of Engineering, who values open communication and mentorship, ensuring that each team member has the support and guidance necessary to excel in their role. Our work setup is hybrid, requiring team members to be in the office 2-3 days a week, which balances flexibility with the benefits of in-person collaboration.

Your Role
- Drive technical direction and architecture for the NCM Data Platform, working with other senior engineers.
- Design and develop the next generation of Data Platform features for on-prem and cloud.
- Drive data modeling discussions, working with senior engineers to ensure the best solution for requirements from various service teams.
- Collaborate with Product Management, QA, and documentation teams across multiple geographies to deliver high-quality products and services.
- Work across all components of a big data platform, such as API gateways, message buses, databases, and database abstraction layers.
- Mentor junior engineers and drive best practices for design/code reviews.
- Propose and drive adoption of best practices for operationalizing data platforms, including data catalogs, tracing capabilities, configuration-driven change, etc.

What You Will Bring
- 15+ years of experience with a Bachelor's or Master's degree in computer science or related streams.
- Experience architecting and building highly available and scalable data platforms.
- Experience with one or more of the following languages: Python, Golang, C++, Java.
- Experience working with database technologies and optimizing workloads.
- Experience working with big data technologies: message queues/API gateways, the Kubernetes ecosystem, and microservice patterns.
- An owner's mindset and prior experience leading teams in a fast-paced and demanding environment, with good knowledge of SDLC practices.
- Awareness of data marts, data lakes, and data warehouses is a plus.
- Experience working with one or more of the cloud platforms (AWS, Azure, GCP, etc.) is a plus.
Posted 3 weeks ago
8.0 - 13.0 years
8 - 17 Lacs
Chennai
Remote
MS Fabric (Data Lake, OneLake, Lakehouse, Warehouse, Real-Time Analytics) and integration with Power BI, Synapse, and Azure Data Factory. DevOps knowledge. Team-leading experience.
Posted 3 weeks ago