
1693 Data Engineering Jobs - Page 3

JobPe aggregates results for easy application access, but you apply directly on the original job portal.

3.0 - 5.0 years

27 - 32 Lacs

Bengaluru

Work from Office

Source: Naukri

Job Title: Data Engineer (DE) / SDE – Data
Location: Bangalore
Experience range: 3-15 years

What we offer
Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking with a technology-first approach in everything we do, aiming to enhance customer experience through superior banking services. We welcome the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team
DEX (Kotak's Data Exchange) is the central data organization for Kotak Bank and manages the bank's entire data experience. It comprises the Data Platform, Data Engineering, and Data Governance charters, and works closely with the Analytics organization. DEX is primarily working on a greenfield project to move the entire data platform from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technologists the opportunity to build things from scratch and create a best-in-class data lakehouse solution. The primary skills for this team are software development (preferably Python) for platform building on AWS; Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The organization is expected to grow to a 100+ member team, primarily based in Bangalore, comprising ~10 sub-teams independently driving their charters.

As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member of Kotak's digital transformation journey; learn and leverage technology to build complex data platform solutions (real-time, micro-batch, batch, and analytics) in a programmatic way; and look ahead to systems that can be operated by machines using AI technologies.

The data platform organization is divided into three key verticals:

Data Platform: Responsible for building the data platform, including optimized storage for the entire bank, a centralized data lake, managed compute and orchestration frameworks (including serverless data solutions), a central data warehouse for extremely high-concurrency use cases, connectors for different sources, a customer feature repository, cost-optimization solutions such as EMR optimizers, automation, and observability capabilities for Kotak's data platform. The team will also be a center of Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data-consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. It will build data models in a config-based, programmatic way and think big to create one of the most leveraged data models for financial organizations. It will also enable centralized reporting for Kotak Bank across multiple products and dimensions. The data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers, and all analytics use cases.

Data Governance: The central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship, and the data quality platform.

If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems spanning multiple source systems, this is the team for you. Your day-to-day role will include:
- Drive business decisions with technical input and lead the team.
- Design, implement, and support data infrastructure from scratch.
- Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extract, transform, and load data from various sources using SQL and AWS big data technologies.
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
- Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Build data platforms, data pipelines, and data management and governance tools.

BASIC QUALIFICATIONS (Data Engineer / SDE in Data)
- Bachelor's degree in Computer Science, Engineering, or a related field
- 3-5 years of experience in data engineering
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
- Experience with data pipeline tools such as Airflow and Spark
- Experience with data modeling and data quality best practices
- Excellent problem-solving and analytical skills
- Strong communication and teamwork skills
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
- Strong advanced SQL skills

PREFERRED QUALIFICATIONS
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
- Prior experience in the Indian banking segment and/or fintech
- Experience with non-relational databases and data stores
- Building and operating highly available, distributed data processing systems for large datasets
- Professional software engineering best practices across the full software development life cycle
- Designing, developing, and implementing different types of data warehousing layers
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
- Building scalable data infrastructure and understanding distributed systems concepts
- SQL, ETL, and data modelling
- Ensuring the accuracy and availability of data to customers
- Proficiency in at least one scripting or programming language for handling large-volume data processing
- Strong presentation and communication skills

For Managers:
- Customer centricity and obsession for the customer
- Ability to manage stakeholders (product owners, business stakeholders, cross-functional teams) and coach agile ways of working
- Ability to structure and organize teams and streamline communication
- Prior experience executing large-scale data engineering projects
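The "config-based, programmatic" data-model approach the listing describes can be illustrated with a minimal sketch. All dataset and field names below are hypothetical, not Kotak's actual schema, and in production this logic would run inside a Spark job on EMR; plain Python stands in here for illustration only.

```python
# Minimal sketch of a config-driven transformation step: a declarative
# config lists which fields to keep, rename, and derive, and a generic
# function applies those rules to each record.

# Hypothetical dataset config (names invented for illustration).
CONFIG = {
    "dataset": "branch_transactions",
    "keep": ["txn_id", "amount"],
    "rename": {"txn_ts": "transaction_time"},
    "derive": {"amount_in_lakhs": lambda r: r["amount"] / 100_000},
}

def transform(record: dict, config: dict) -> dict:
    """Apply keep/rename/derive rules from a declarative config."""
    out = {k: record[k] for k in config["keep"]}
    for src, dst in config["rename"].items():
        out[dst] = record[src]
    for name, fn in config["derive"].items():
        out[name] = fn(record)
    return out

row = {"txn_id": "T1", "amount": 250_000.0, "txn_ts": "2024-01-01T10:00:00"}
result = transform(row, CONFIG)
print(result["amount_in_lakhs"])  # 2.5
```

Because every dataset is described by data rather than bespoke code, onboarding a new source becomes a config change rather than a new pipeline.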

Posted 1 day ago

Apply

3.0 - 5.0 years

3 - 5 Lacs

Bengaluru

Work from Office

Source: Naukri

(This listing repeats the "What we offer", "About our team", vertical descriptions, day-to-day role, and Data Engineer / SDE basic qualifications of the Kotak Data Engineer listing above; that duplicated text is omitted. It adds the following manager track.)

BASIC QUALIFICATIONS (Data Engineering Manager / Software Development Manager)
- 10+ years of engineering experience, most of it in the data domain
- 5+ years of engineering team management experience
- 10+ years of experience planning, designing, developing, and delivering consumer software
- Experience partnering with product or program management teams
- 5+ years of experience managing data engineers, business intelligence engineers, and/or data scientists
- Experience designing or architecting (design patterns, reliability, and scaling) new and existing systems
- Experience managing multiple concurrent programs, projects, and development teams in an Agile environment
- Strong understanding of Data Platform, Data Engineering, and Data Governance
- Experience designing and developing large-scale, high-traffic applications

PREFERRED QUALIFICATIONS
- Same as the Data Engineer / SDE listing above (AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow; prior Indian banking and/or fintech experience; and the same distributed data processing, data warehousing, and communication skills).

Posted 1 day ago

Apply

3.0 - 5.0 years

30 - 35 Lacs

Bengaluru

Work from Office

Source: Naukri

(Role description identical to the Kotak Data Engineer / SDE and Engineering Manager listing above.)

Posted 1 day ago

Apply

3.0 - 5.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Source: Naukri

(Role description identical to the Kotak Data Engineer / SDE and Engineering Manager listing above.)

Posted 1 day ago

Apply

2.0 - 4.0 years

10 - 18 Lacs

Bengaluru

Work from Office

Source: Naukri

Role & responsibilities:
- Design and build data infrastructure: develop scalable data pipelines and data lake/warehouse solutions for real-time and batch data using cloud and open-source tools.
- Develop and automate data workflows: create Python-based ETL/ELT processes for data ingestion, validation, integration, and transformation across multiple sources.
- Ensure data quality and governance: implement monitoring systems, resolve data quality issues, and enforce data governance and security best practices.
- Collaborate and mentor: work with cross-functional teams to deliver data solutions, and mentor junior engineers as the team grows.
- Explore new tech: research and implement emerging tools and technologies to improve system performance and scalability.
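The validation responsibility above can be sketched minimally: in a Python ETL/ELT flow, records that fail a rule are routed to a rejects list instead of the load step. The field names and rules here are hypothetical illustration data, not this employer's actual schema.

```python
# Minimal sketch of the "validate" step in a Python ETL/ELT flow.
def validate(records, rules):
    """Split records into (valid, rejected) according to per-field rules.

    Returns the valid records and a list of (record, failed_rule_names)
    pairs for everything that should go to a quarantine/rejects sink.
    """
    valid, rejected = [], []
    for rec in records:
        errors = [name for name, rule in rules.items() if not rule(rec)]
        if errors:
            rejected.append((rec, errors))
        else:
            valid.append(rec)
    return valid, rejected

# Hypothetical rules: an id must be present, amount must be non-negative.
rules = {
    "has_id": lambda r: bool(r.get("id")),
    "amount_non_negative": lambda r: r.get("amount", 0) >= 0,
}

batch = [{"id": "a1", "amount": 10.0}, {"id": "", "amount": -5.0}]
good, bad = validate(batch, rules)
print(len(good), len(bad))  # 1 1
```

Keeping the rules as plain predicates makes them easy to unit-test and to report on (the failed rule names travel with each rejected record).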

Posted 1 day ago

Apply

4.0 - 8.0 years

12 - 22 Lacs

Nagpur, Chennai, Bengaluru

Work from Office

Source: Naukri

Build ETL pipelines using FME to ingest and transform data from Idox/CCF systems. Create custom transformers in FME. Manage and query spatial datasets using PostgreSQL/PostGIS. Handle spatial formats like GeoPackage, GML, GeoJSON, and Shapefiles.

Required candidate profile: Strong hands-on experience with FME workflows and spatial data transformation.

Tools by category:
- ETL: FME (Safe Software), Talend (optional), Python
- Spatial DB: PostGIS, Oracle Spatial
- GIS

Perks and benefits: As per company standards.
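As a small illustration of the spatial formats mentioned: FME or PostGIS would normally handle conversions like this, but the sketch below shows what the mapping looks like for the simplest case, a GeoJSON Point feature converted to WKT (the form PostGIS accepts via ST_GeomFromText). It uses only the standard library and handles Points only.

```python
import json

def geojson_point_to_wkt(feature_json: str) -> str:
    """Convert a GeoJSON Point feature to its WKT representation,
    e.g. for loading into a PostGIS geometry column."""
    feature = json.loads(feature_json)
    geom = feature["geometry"]
    if geom["type"] != "Point":
        raise ValueError("this sketch handles Point geometries only")
    x, y = geom["coordinates"]  # GeoJSON order is [longitude, latitude]
    return f"POINT({x} {y})"

src = ('{"type": "Feature", '
       '"geometry": {"type": "Point", "coordinates": [78.9, 20.5]}, '
       '"properties": {}}')
wkt = geojson_point_to_wkt(src)
print(wkt)  # POINT(78.9 20.5)
```

Note the axis order: GeoJSON mandates longitude first, and WKT points keep that same x-before-y order, which is a common source of bugs when mixing formats.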

Posted 1 day ago

Apply

4.0 - 9.0 years

17 - 22 Lacs

Gurugram

Work from Office

Source: Naukri

Job Title - S&C Global Network - AI - Healthcare Analytics - Consultant Management Level: 9-Team Lead/Consultant Location: Bangalore/Gurgaon Must-have skills: R,Phython,SQL,Spark,Tableau ,Power BI Good to have skills: Ability to leverage design thinking, business process optimization, and stakeholder management skills. Job Summary : This role involves driving strategic initiatives, managing business transformations, and leveraging industry expertise to create value-driven solutions. Roles & Responsibilities: Provide strategic advisory services, conduct market research, and develop data-driven recommendations to enhance business performance. WHATS IN IT FOR YOU An opportunity to work on high-visibility projects with top Pharma clients around the globe. Potential to Co-create with leaders in strategy, industry experts, enterprise function practitioners, and business intelligence professionals to shape and recommend innovative solutions that leverage emerging technologies. Ability to embed responsible business into everythingfrom how you service your clients to how you operate as a responsible professional. Personalized training modules to develop your strategy & consulting acumen to grow your skills, industry knowledge, and capabilities. Opportunity to thrive in a culture that is committed to accelerating equality for all. Engage in boundaryless collaboration across the entire organization. What you would do in this role Support delivery of small to medium-sized teams to deliver consulting projects for global clients. Responsibilities may include strategy, implementation, process design, and change management for specific modules. Work with the team or as an Individual contributor on the project assigned which includes a variety of skills to be utilized from Data Engineering to Data Science Provide Subject matter expertise in various sub-segments of the LS industry. 
Develop assets and methodologies, point-of-view, research, or white papers for use by the team and the larger community. Acquire new skills that have utility across industry groups. Support strategies and operating models focused on some business units and assess likely competitive responses. Also, assess implementation readiness and points of greatest impact. Co-lead proposals, and business development efforts and coordinate with other colleagues to create consensus-driven deliverables. Execute a transformational change plan aligned with the clients business strategy and context for change. Engage stakeholders in the change journey and build commitment to change. Make presentations wherever required to a known audience or client on functional aspects of his or her domain. Who are we looking for Bachelors or Masters degree in Statistics, Data Science, Applied Mathematics, Business Analytics, Computer Science, Information Systems, or other Quantitative field. Proven experience (4+ years) in working on Life Sciences/Pharma/Healthcare projects and delivering successful outcomes. Excellent understanding of Pharma data sets commercial, clinical, RWE (Real World Evidence) & EMR (Electronic medical records) Leverage ones hands on experience of working across one or more of these areas such as real-world evidence data, R&D clinical data, digital marketing data. Hands-on experience with handling Datasets like Komodo, RAVE, IQVIA, Truven, Optum etc. Hands-on experience in building and deployment of Statistical Models/Machine Learning including Segmentation & predictive modeling, hypothesis testing, multivariate statistical analysis, time series techniques, and optimization. Proficiency in Programming languages such as R, Python, SQL, Spark, etc. Ability to work with large data sets and present findings/insights to key stakeholders; Data management using databases like SQL. 
Experience with any of the cloud platforms like AWS, Azure, or Google Cloud for deploying and scaling language models. Experience with any of the data visualization tools like Tableau, Power BI, QlikView, or Spotfire is good to have. Excellent analytical and problem-solving skills, with a data-driven mindset. Proficient in Excel, MS Word, PowerPoint, etc. Ability to solve complex business problems and deliver client delight. Strong writing skills to build points of view on current industry trends. Good communication, interpersonal, and presentation skills. Professional & Technical Skills: - Relevant experience in the required domain. - Strong analytical, problem-solving, and communication skills. - Ability to work in a fast-paced, dynamic environment. Additional Information: - Opportunity to work on innovative projects. - Career growth and leadership exposure. About Our Company | Accenture Qualification Experience: 4-8 Years Educational Qualification: Bachelor's or Master's degree in Statistics, Data Science, Applied Mathematics, Business Analytics, Computer Science, Information Systems, or another quantitative field.

Posted 1 day ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Coimbatore

Work from Office


Project Role : Data Engineer Project Role Description : Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills : Google BigQuery Good to have skills : Microsoft SQL Server, Google Cloud Data Services. Minimum 3 year(s) of experience is required Educational Qualification : 15 years full time education Summary: As a Data Engineer, you will be responsible for designing, developing, and maintaining data solutions for data generation, collection, and processing. You will create data pipelines, ensure data quality, and implement ETL processes to migrate and deploy data across systems. Roles & Responsibilities: - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute to providing solutions to work-related problems. - Develop and maintain data pipelines. - Ensure data quality throughout the data lifecycle. - Implement ETL processes for data migration and deployment. - Collaborate with cross-functional teams to understand data requirements. - Optimize data storage and retrieval processes. Professional & Technical Skills: - Must To Have Skills: Proficiency in Google BigQuery. - Strong understanding of data engineering principles. - Experience with cloud-based data services. - Knowledge of SQL and database management systems. - Hands-on experience with data modeling and schema design. Additional Information: - The candidate should have a minimum of 3 years of experience in Google BigQuery. - This position is based at our Mumbai office. - A 15 years full time education is required. Qualification 15 years full time education
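The role above centers on data pipelines and data quality. As a rough illustration of the kind of post-load completeness check such a pipeline might run, here is a minimal pure-Python sketch; the field names and rules are hypothetical, not from any actual project:

```python
# Hypothetical post-load data-quality check of the kind a data engineer
# might run after an ETL batch; all field names are illustrative.
def validate_rows(rows, required_fields, non_null_fields):
    """Return (valid_rows, errors) after basic completeness checks."""
    valid, errors = [], []
    for i, row in enumerate(rows):
        missing = [f for f in required_fields if f not in row]
        nulls = [f for f in non_null_fields if row.get(f) is None]
        if missing or nulls:
            errors.append({"row": i, "missing": missing, "nulls": nulls})
        else:
            valid.append(row)
    return valid, errors

batch = [
    {"id": 1, "amount": 120.5, "region": "south"},
    {"id": 2, "amount": None, "region": "west"},   # null amount
    {"id": 3, "region": "north"},                  # missing amount
]
valid, errors = validate_rows(batch, ["id", "amount", "region"], ["amount"])
print(len(valid), len(errors))  # 1 2
```

In a real BigQuery pipeline the same checks would typically be pushed down as SQL assertions or run by a pipeline framework rather than row by row in Python.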

Posted 1 day ago

Apply

4.0 - 9.0 years

10 - 14 Lacs

Pune

Work from Office


Job Title - Sales Excellence - Client Success - Data Engineering Specialist - CF Management Level: ML9 Location: Open Must have skills: GCP, SQL, Data Engineering, Python Good to have skills: managing ETL pipelines. Job Summary : We are: Sales Excellence. Sales Excellence at Accenture empowers our people to compete, win and grow. We provide everything they need to grow their client portfolios, optimize their deals and enable their sales talent, all driven by sales intelligence. The team will be aligned to Client Success, a new function supporting Accenture's approach to putting client value and client experience at the heart of everything we do to foster client love. Our ambition is that every client loves working with Accenture and believes we're the ideal partner to help them create and realize their vision for the future, beyond their expectations. You are: A builder at heart, curious about new tools and their usefulness, eager to create prototypes, and adaptable to changing paths. You enjoy sharing your experiments with a small team and are responsive to the needs of your clients. The work: The Center of Excellence (COE) enables Sales Excellence to deliver best-in-class service offerings to Accenture leaders, practitioners, and sales teams. As a member of the COE Analytics Tools & Reporting team, you will help build and enhance the data foundation for reporting and analytics tools to provide insights on underlying trends and key drivers of the business. Roles & Responsibilities: Collaborate with the Client Success, Analytics COE, CIO Engineering/DevOps teams, and stakeholders to build and enhance the Client Success data lake. Write complex SQL scripts to transform data for the creation of dashboards or reports, and validate the accuracy and completeness of the data. Build automated solutions to support any business operation or data transfer. Document and build efficient data models for reporting and analytics use cases.
Assure the data lake's data accuracy, consistency, and timeliness while ensuring user acceptance and satisfaction. Work with the Client Success and Sales Excellence COE members, the CIO Engineering/DevOps team, and Analytics Leads to standardize data in the data lake. Professional & Technical Skills: Bachelor's degree or equivalent experience in Data Engineering, analytics, or a similar field. At least 4 years of professional experience in developing and managing ETL pipelines. A minimum of 2 years of GCP experience. Ability to write complex SQL and prepare data for dashboarding. Experience in managing and documenting data models. Understanding of data governance and policies. Proficiency in Python and SQL scripting languages. Ability to translate business requirements into technical specifications for the engineering team. Curiosity, creativity, a collaborative attitude, and attention to detail. Ability to explain technical information to technical as well as non-technical users. Ability to work remotely with minimal supervision in a global environment. Proficiency with Microsoft Office tools. Additional Information: Master's degree in analytics or a similar field. Data visualization or reporting using text data as well as sales, pricing, and finance data. Ability to prioritize workload and manage downstream stakeholders. About Our Company | Accenture Qualification Experience: Minimum 5+ year(s) of experience is required Educational Qualification: Bachelor's degree or equivalent experience in Data Engineering, analytics, or a similar field
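The role above emphasizes writing complex SQL to transform data for dashboards. As a small, self-contained illustration of that pattern, the sketch below uses Python's built-in sqlite3 as a stand-in for the actual data-lake engine (e.g. BigQuery on GCP); the table and column names are hypothetical:

```python
# Illustrative dashboard-ready SQL aggregation; sqlite3 stands in for the
# real engine, and all table/column names are made up for the example.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE engagements (client TEXT, quarter TEXT, revenue REAL);
INSERT INTO engagements VALUES
  ('acme', 'Q1', 100.0), ('acme', 'Q2', 150.0),
  ('globex', 'Q1', 80.0), ('globex', 'Q2', 60.0);
""")

# Aggregate raw records into a summary a dashboard could consume directly.
summary = conn.execute("""
SELECT client,
       SUM(revenue)            AS total_revenue,
       COUNT(DISTINCT quarter) AS active_quarters
FROM engagements
GROUP BY client
ORDER BY total_revenue DESC
""").fetchall()
print(summary)  # [('acme', 250.0, 2), ('globex', 140.0, 2)]
```

Validation of accuracy and completeness, as the listing describes, would typically add checks such as comparing row counts and totals against the source system after each load.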

Posted 1 day ago

Apply

1.0 - 3.0 years

15 - 19 Lacs

Bengaluru

Work from Office


Job Title: Industry & Function AI Data Engineer Analyst S&C Global Network Location: Primary - Bengaluru, Secondary - Gurugram Management Level: 11 - Analyst Must-Have Skills: Proficiency in Python, SQL, and data engineering tools; Experience with cloud platforms like AWS, Azure, or GCP; Knowledge of data warehousing concepts and ETL processes; Strong problem-solving and analytical skills. Good-to-Have Skills: Familiarity with big data technologies (e.g., Hadoop, Spark); Experience with data pipeline tools (e.g., Apache Airflow); Understanding of MLOps practices; Strong communication and collaboration skills; Snowflake, DBT, etc. A degree in a quantitative field like Computer Science, Information Systems, or Engineering; an MBA is a plus. Job Summary: As a Data Engineer Analyst, you will play a critical role in designing, implementing, and optimizing data infrastructure to power analytics, machine learning, and enterprise decision-making. You will collaborate closely with stakeholders, translate business needs into data-driven solutions, and ensure seamless integration into enterprise ecosystems. Roles & Responsibilities: Build and maintain scalable data pipelines and systems for data ingestion, transformation, and storage. Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver actionable solutions. Ensure data accuracy, consistency, and reliability through robust validation and governance practices. Stay updated with the latest tools and technologies in data engineering through professional development opportunities. Professional & Technical Skills: Excellent problem-solving abilities. Strong analytical thinking and attention to detail. Strong knowledge of traditional statistical methods and basic machine learning techniques. Ability to work independently and as part of a team. Effective time management and organizational skills. Advanced knowledge of data engineering techniques and tools. Experience with data wrangling and preprocessing.
Familiarity with version control systems like Git. Additional Information: An ideal candidate for the Data Engineer Analyst role at Accenture should have a strong technical background in data engineering, with proficiency in Python, SQL, and data engineering tools. They should be experienced with cloud platforms like AWS, Azure, or GCP, and have a solid understanding of data warehousing concepts and ETL processes. Strong problem-solving and analytical skills are essential for this role. About Our Company | Accenture Qualification Experience: Minimum 1-3 years in data engineering or related roles. Experience in Consumer Goods and Services (CG&S) is a plus. Educational Qualification: Bachelor's or Master's degree in a quantitative field
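The listing above describes pipelines for ingestion, transformation, and storage. As a minimal sketch of that three-stage pattern, assuming made-up data and stage names (an orchestrator such as Apache Airflow would schedule each stage as a task in practice):

```python
# Minimal ingest -> transform -> store pipeline sketch; all names and
# record shapes are illustrative, not from any real system.
def ingest():
    # Stand-in for reading from a source system or landing zone.
    return [{"id": 1, "value": " 42 "}, {"id": 2, "value": "7"}]

def transform(records):
    # Cleanse and type-cast raw string fields.
    return [{"id": r["id"], "value": int(r["value"].strip())} for r in records]

def store(records, sink):
    # Stand-in for writing to a warehouse table.
    sink.extend(records)
    return len(records)

warehouse = []
loaded = store(transform(ingest()), warehouse)
print(loaded, warehouse)  # 2 [{'id': 1, 'value': 42}, {'id': 2, 'value': 7}]
```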

Posted 1 day ago

Apply

10.0 - 12.0 years

37 - 40 Lacs

Pune

Work from Office


JR: R00208204 Experience: 10-12 Years Educational Qualification: Any Degree --------------------------------------------------------------------- Job Title - S&C - Data and AI - CFO&EV Quantexa Platform (Assoc Manager) Management Level: 8 - Associate Manager Location: Pune, PDC2C Must-have skills: Quantexa Platform Good to have skills: Experience in financial modeling, valuation techniques, and deal structuring. Job Summary : This role involves driving strategic initiatives, managing business transformations, and leveraging industry expertise to create value-driven solutions. Roles & Responsibilities: Provide strategic advisory services, conduct market research, and develop data-driven recommendations to enhance business performance. WHAT'S IN IT FOR YOU The Accenture CFO & EV team under the Data & AI team has a comprehensive suite of capabilities in Risk, Fraud, Financial Crime, and Finance. Within the risk realm, our focus revolves around model development, model validation, and auditing of models. Additionally, our work extends to ongoing performance evaluation, vigilant monitoring, meticulous governance, and thorough documentation of models. Get to work with top financial clients globally. Access resources enabling you to utilize cutting-edge technologies, fostering innovation with the world's most recognizable companies. Accenture will continually invest in your learning and growth and will support you in expanding your knowledge. You'll be part of a diverse and vibrant team, collaborating with talented individuals from various backgrounds and disciplines, continually pushing the boundaries of business capabilities and fostering an environment of innovation. What you would do in this role Engagement Execution Lead client engagements that may involve model development, validation, governance, strategy, transformation, implementation and end-to-end delivery of fraud analytics/management solutions for Accenture's clients.
Advise clients on a wide range of Fraud Management/Analytics initiatives. Projects may involve Fraud Management advisory work for CXOs, etc. to achieve a variety of business and operational outcomes. Develop and frame Proofs of Concept for key clients, where applicable. Practice Enablement Mentor, groom and counsel analysts and consultants. Support development of the Practice by driving innovations and initiatives. Develop thought capital and disseminate information around current and emerging trends in Fraud Analytics and Management. Support efforts of the sales team to identify and win potential opportunities by assisting with RFPs and RFIs. Assist in designing POVs and GTM collateral. Travel: Willingness to travel up to 40% of the time Professional Development Skills: Project dependent Professional & Technical Skills: - Relevant experience in the required domain. - Strong analytical, problem-solving, and communication skills. - Ability to work in a fast-paced, dynamic environment. Advanced skills in development and validation of fraud analytics models, strategies, and visualizations. Understanding of new/evolving methodologies/tools/technologies in the fraud management space. Expertise in one or more domains/industries, including regulations, frameworks etc. Experience in building models using AI/ML methodologies. Modeling: Experience in one or more analytical tools such as SAS, R, Python, SQL, etc. Knowledge of data processes, ETL and tools/vendor products such as VISA AA, FICO Falcon, EWS, RSA, IBM Trusteer, SAS AML, Quantexa, Ripjar, Actimize etc. Proven experience in one of data engineering, data governance, or data science roles. Experience in Generative AI or Central/Supervisory banking is a plus.
Strong conceptual knowledge and practical experience in the development, validation and deployment of ML/AI models. Hands-on programming experience with any of the analytics and visualization tools (Python, R, PySpark, SAS, SQL, Power BI/Tableau). Knowledge of big data, MLOps and cloud platforms (Azure/GCP/AWS). Strong written and oral communication skills. Project management skills and the ability to manage multiple tasks concurrently. Strong delivery experience of short- and long-term analytics projects. Additional Information: - Opportunity to work on innovative projects. - Career growth and leadership exposure. About Our Company | Accenture Qualification Experience: 10-12 Years Educational Qualification: Any Degree

Posted 1 day ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Pune

Work from Office


Project Role : Software Development Engineer Project Role Description : Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work. Must have skills : PySpark Good to have skills : NA. Minimum 7.5 year(s) of experience is required Educational Qualification : BE Key Responsibilities : Overall 8 years of experience working in Data Analytics projects. Work on client projects to deliver AWS, PySpark, and Databricks based data engineering and analytics solutions. Build and operate very large data warehouses or data lakes. ETL optimization, designing, coding, and tuning big data processes using Apache Spark. Build data pipeline applications to stream and process datasets at low latencies. Show efficiency in handling data: tracking data lineage, ensuring data quality, and improving discoverability of data. Technical Experience : Minimum of 2 years of experience in Databricks engineering solutions on any of the cloud platforms using PySpark. Minimum of 5 years of experience in ETL, Big Data/Hadoop and data warehouse architecture delivery. Minimum 3 years of experience in one or more programming languages: Python, Java, Scala. Experience using Airflow for data pipelines in at least 1 project. 2 years of experience developing CI/CD pipelines using Git, Jenkins, Docker, Kubernetes, shell scripting, and Terraform. Must be able to understand ETL technologies and translate them into cloud-native (AWS, Azure, Google Cloud) tools or PySpark.
Professional Attributes : 1. Should have been involved in data engineering projects from the requirements phase through delivery. 2. Good communication skills to interact with clients and understand requirements. 3. Should have the capability to work independently and guide the team. Educational Qualification : BE
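The listing above revolves around PySpark ETL: filtering bad records and aggregating large datasets. As a library-free sketch of that filter-and-aggregate pattern (a real pipeline would express the same steps as pyspark.sql DataFrame operations such as filter, groupBy, and sum; all field names here are illustrative):

```python
# Plain-Python stand-in for a PySpark filter/groupBy/sum transform;
# record shapes and field names are made up for the example.
from collections import defaultdict

raw = [
    {"user": "a", "event": "click", "ms": 120},
    {"user": "a", "event": "view",  "ms": 300},
    {"user": "b", "event": "click", "ms": 90},
    {"user": "b", "event": "click", "ms": None},  # bad record to drop
]

# Transform: drop records failing a basic quality rule (like df.filter).
clean = [r for r in raw if r["ms"] is not None]

# Aggregate: total time per user (like df.groupBy("user").sum("ms")).
totals = defaultdict(int)
for r in clean:
    totals[r["user"]] += r["ms"]

print(dict(totals))  # {'a': 420, 'b': 90}
```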

Posted 1 day ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Bengaluru

Work from Office


Project Role : Data Engineer Project Role Description : Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills : Informatica Product 360 (PIM) Good to have skills : NA. Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages. Your role will be pivotal in enhancing the overall data architecture and ensuring that data is accessible and reliable for decision-making purposes. Roles & Responsibilities: - Expected to be an SME. - Collaborate and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute on key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Mentor junior team members to enhance their skills and knowledge in data engineering practices. - Continuously evaluate and improve data processes to enhance efficiency and effectiveness. Professional & Technical Skills: - Must To Have Skills: Proficiency in Informatica Product 360 (PIM). - Strong understanding of data modeling and database design principles. - Experience with data integration and ETL tools. - Familiarity with cloud data platforms and services. - Knowledge of data governance and data quality best practices.
Additional Information:- The candidate should have minimum 5 years of experience in Informatica Product 360 (PIM).- This position is based at our Bengaluru office.- A 15 years full time education is required. Qualification 15 years full time education

Posted 1 day ago

Apply

15.0 - 20.0 years

9 - 13 Lacs

Bengaluru

Work from Office


Project Role : Data Platform Engineer Project Role Description : Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Must have skills : Databricks Unified Data Analytics Platform Good to have skills : NA. Minimum 2 year(s) of experience is required Educational Qualification : 15 years full time education Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the data architecture. You will be involved in problem-solving sessions, contributing your insights and expertise to develop effective solutions that align with the overall data strategy of the organization. Your role will require you to stay updated with the latest trends in data engineering and analytics, ensuring that the data platform remains robust and efficient. Roles & Responsibilities: - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute to providing solutions to work-related problems. - Engage in continuous learning to stay abreast of industry trends and technologies. - Collaborate with cross-functional teams to gather requirements and translate them into technical specifications. Professional & Technical Skills: - Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform. - Strong understanding of data integration techniques and tools. - Experience with cloud-based data storage solutions. - Familiarity with data modeling concepts and best practices. - Ability to troubleshoot and optimize data workflows.
Additional Information:- The candidate should have minimum 2 years of experience in Databricks Unified Data Analytics Platform.- This position is based at our Bengaluru office.- A 15 years full time education is required. Qualification 15 years full time education

Posted 1 day ago

Apply

5.0 - 8.0 years

4 - 7 Lacs

Bengaluru

Work from Office


Skill required: Delivery - Financial Planning and Analysis (FP&A) Designation: I&F Decision Sci Practitioner Sr Analyst Qualifications: Any Graduation Years of Experience: 5 to 8 years About Accenture Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com What would you do Data & AI. Financial planning and analysis (FP&A) refers to the processes designed to help organizations accurately plan, forecast, and budget to support the company's major business decisions and future financial health. These processes include planning, budgeting, forecasting, scenario modeling, and performance reporting. What are we looking for Financial Planning & Analysis Data Analysis & Interpretation Data Engineering Adaptable and flexible Ability to work well in a team Commitment to quality Agility for quick learning Written and verbal communication Roles and Responsibilities: In this role you are required to analyze and solve increasingly complex problems. Your day-to-day interactions are with peers within Accenture. You are likely to have some interaction with clients and/or Accenture management. You will be given minimal instruction on daily work/tasks and a moderate level of instruction on new assignments. Decisions that you make impact your own work and may impact the work of others. In this role you would be an individual contributor and/or oversee a small work effort and/or team. Qualification Any Graduation

Posted 1 day ago

Apply

3.0 - 5.0 years

3 - 6 Lacs

Bengaluru

Work from Office


Skill required: Delivery - Financial Planning and Analysis (FP&A) Designation: I&F Decision Sci Practitioner Analyst Qualifications: Any Graduation Years of Experience: 3 to 5 years About Accenture Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com What would you do Data & AI. Financial planning and analysis (FP&A) refers to the processes designed to help organizations accurately plan, forecast, and budget to support the company's major business decisions and future financial health. These processes include planning, budgeting, forecasting, scenario modeling, and performance reporting. What are we looking for Financial Planning & Analysis Data Analysis & Interpretation Data Engineering Adaptable and flexible Ability to work well in a team Commitment to quality Agility for quick learning Written and verbal communication Roles and Responsibilities: In this role you are required to analyze and solve lower-complexity problems. Your day-to-day interaction is with peers within Accenture before updating supervisors. In this role you may have limited exposure to clients and/or Accenture management. You will be given moderate-level instruction on daily work tasks and detailed instructions on new assignments. The decisions you make impact your own work and may impact the work of others. You will be an individual contributor as part of a team, with a focused scope of work. Qualification Any Graduation

Posted 1 day ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Bengaluru

Work from Office


Project Role : Data Engineer Project Role Description : Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills : Microsoft Azure Data Services, Microsoft Dynamics AX Technical Good to have skills : NA. Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary: We are looking for a Team Lead - Migration Engineer with deep expertise in Microsoft Dynamics 365 Finance & Operations (D365 F&O) to drive successful data migration, system upgrades, and platform transitions. The ideal candidate will be responsible for leading migration projects, ensuring data integrity, optimizing migration processes, and collaborating with teams to ensure seamless transitions. Key Responsibilities: Lead end-to-end migration projects for transitioning legacy systems to Microsoft Dynamics 365 F&O. Develop migration strategies, roadmaps, and execution plans for seamless data transfer. Design and implement ETL processes to ensure accurate and efficient data migration. Collaborate with technical teams to configure data mappings, transformations, and validation rules. Ensure compliance with Microsoft best practices for data migration and security. Conduct data integrity checks, validation, and reconciliation post-migration. Provide guidance and mentorship to a team of migration engineers, developers, and consultants. Troubleshoot migration-related issues and optimize performance for large-scale data transfers. Stay updated with the latest D365 F&O upgrades, tools, and methodologies related to data migration. Required Skills & Qualifications: Proven experience in Microsoft Dynamics 365 F&O migration projects. Strong knowledge of data architecture, ETL tools, and integration frameworks. Expertise in SQL, Azure Data Factory, and other data transformation tools. Hands-on experience with data cleansing, mapping, validation, and reconciliation. Ability to lead and manage teams in large-scale migration projects. Excellent analytical and problem-solving skills. Strong communication and stakeholder management abilities. Preferred Certifications: Microsoft Certified: Dynamics 365 Finance Functional Consultant Associate; Microsoft Certified: Dynamics 365 Data Migration & Integration Specialist; Microsoft Certified: Azure Data Engineer Associate Qualification 15 years full time education
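The listing above calls for post-migration data integrity checks and reconciliation. As a rough sketch of that idea, assuming made-up record shapes and a simple per-key comparison (real D365 F&O migrations would compare against the actual source and target systems):

```python
# Illustrative post-migration reconciliation: compare source and target
# record sets by key and flag discrepancies. All data is hypothetical.
def reconcile(source, target, key="id"):
    src = {r[key]: r for r in source}
    tgt = {r[key]: r for r in target}
    missing = sorted(set(src) - set(tgt))          # in source, not migrated
    extra = sorted(set(tgt) - set(src))            # in target, unexpected
    mismatched = sorted(                           # migrated but altered
        k for k in set(src) & set(tgt) if src[k] != tgt[k]
    )
    return {"missing": missing, "extra": extra, "mismatched": mismatched}

source = [{"id": 1, "amt": 10}, {"id": 2, "amt": 20}, {"id": 3, "amt": 30}]
target = [{"id": 1, "amt": 10}, {"id": 2, "amt": 25}]
print(reconcile(source, target))
# {'missing': [3], 'extra': [], 'mismatched': [2]}
```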

Posted 1 day ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Hyderabad

Work from Office


Project Role : Application Lead Project Role Description : Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills : Data Engineering Good to have skills : NA. Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team in implementing effective solutions. You will also engage in strategic planning sessions to align project goals with organizational objectives, ensuring that all stakeholders are informed and involved throughout the development process. Your role will be pivotal in driving innovation and efficiency within the application development lifecycle, fostering a collaborative environment that encourages creativity and problem-solving. Roles & Responsibilities: - Expected to be an SME. - Collaborate and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute on key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Facilitate knowledge sharing sessions to enhance team capabilities. - Monitor project progress and implement necessary adjustments to meet deadlines. Professional & Technical Skills: - Must To Have Skills: Proficiency in Microsoft Azure Databricks. - Experience with cloud computing platforms and services. - Strong understanding of application development methodologies. - Ability to design and implement scalable solutions. - Familiarity with data integration and ETL processes.
Additional Information:- The candidate should have minimum 5 years of experience in Microsoft Azure Databricks.- This position is based at our Hyderabad office.- A 15 years full time education is required. Qualification 15 years full time education

Posted 1 day ago

Apply

2.0 - 7.0 years

17 - 20 Lacs

Hyderabad

Work from Office


Job Title - Data Eng, Mgmt. & Governance - Analyst S&C Global Network Management Level: 11 Location: Hyderabad Must have skills: Proficiency and hands-on experience in data engineering technologies like Python, R, SQL, Spark, PySpark, Databricks, Hadoop, etc. Good to have skills: Exposure to Retail, Banking, and Healthcare projects; knowledge of Power BI & PowerApps is an added advantage. Job Summary : As a Data Operations Analyst, you would be responsible for ensuring our esteemed business is fully supported in using business-critical AI-enabled applications. This involves solving day-to-day application issues and business queries and addressing ad hoc data requests to ensure clients can extract maximum value from the AI applications. WHAT'S IN IT FOR YOU An opportunity to work on high-visibility projects with top clients around the globe. Potential to co-create with leaders in strategy, industry experts, enterprise function practitioners, and business intelligence professionals to shape and recommend innovative solutions that leverage emerging technologies. Ability to embed responsible business into everything, from how you serve your clients to how you operate as a responsible professional. Personalized training modules to develop your strategy & consulting acumen and grow your skills, industry knowledge, and capabilities. Opportunity to thrive in a culture that is committed to accelerating equality for all. Engage in boundaryless collaboration across the entire organization. Roles & Responsibilities: Monitor and maintain pre-processing pipelines, model execution batches, and validation of model outputs. In case of deviations or model degradation, take up detailed root cause analysis and implement permanent fixes. Debug issues related to data loads, batch pipelines, and application functionality, including special handling of data/batch streams.
As a Data Operations Analyst, you would be working on initial triaging of code-related defects/issues, providing root cause analysis, and implementing code fixes for permanent resolution of defects. Design, build, test and deploy small to medium-sized enhancements that deliver value to the business and enhance application availability and usability. Responsible for sanity testing of use cases as part of pre-deployment and post-production activities. Primarily responsible for application availability and stability by remediating application issues/bugs or other vulnerabilities. Data Operations Analysts evolve to become Subject Matter Experts as they mature in servicing the applications. Professional & Technical Skills: Proven experience (2+ years) in working as per the above job description is required. Experience/education in Statistics, Data Science, Applied Mathematics, Business Analytics, Computer Science, or Information Systems is preferable. Exposure to Retail, Banking, and Healthcare projects is an added advantage. Proficiency and hands-on experience in data engineering technologies like Python, R, SQL, Spark, PySpark, Databricks, Hadoop, etc. Ability to work with large data sets and present findings/insights to key stakeholders; data management using databases like SQL. Experience with any of the cloud platforms like AWS, Azure, or Google Cloud for deploying and scaling language models. Experience with any of the data visualization tools like Tableau, QlikView, and Spotfire is good to have. Knowledge of Power BI & PowerApps is an added advantage. Excellent analytical and problem-solving skills, with a data-driven mindset. Proficient in Excel, MS Word, PowerPoint, etc. Ability to solve complex business problems and deliver client delight. Strong writing skills to build points of view on current industry trends. Good client handling skills; able to demonstrate thought leadership & problem-solving skills.
Additional Information:
- The ideal candidate will possess a strong educational background in computer science or a related field.
- This position is based at our Hyderabad office.
About Our Company | Accenture
Qualification : Bachelor's or master's degree in any engineering stream, or MCA. Minimum 2 years of experience is required.

Posted 1 day ago

Apply

2.0 - 5.0 years

5 - 9 Lacs

Mumbai

Work from Office


Project Role : Database Administrator
Project Role Description : Administer, develop, test, or demonstrate databases. Perform many related database functions across one or more teams or clients, including designing, implementing, and maintaining new databases, backup/recovery, and configuration management. Install database management systems (DBMS) and provide input for modification of procedures and documentation used for problem resolution and day-to-day maintenance.
Must have skills : Cloud Data Architecture
Good to have skills : Data & AI Solution Architecture, Modern Data Integration
Minimum 10+ year(s) of experience is required
Educational Qualification : Should have completed graduation from a reputed college/university
Key Responsibilities :
- CPG/Retail industry background
- Strong understanding of the modern data engineering stack
- Hands-on experience with big data and data streaming
- Strong knowledge of architecture and design patterns
- Experience designing architectural roadmaps and rapid POC development
- Knowledge of SQL, NoSQL, and graph databases
- Knowledge of data modeling, SCD types, and data mining
- Experience with unified query engines, ETL tools, data virtualization, etc.
- Wide knowledge of modern data engineering design principles
Technical Experience :
- Primary skills : system design, modern data engineering, big data
- Experience developing applications with high-volume transactions
- Passion for data and analytics, and a philosophy of using technology to help solve business problems
- Strong verbal and written communication
- Ability to manage a team
- Willingness to travel to client sites in India based on need
Professional Attributes :
- Must have performed in client-facing roles
- Strong communication and leadership skills
- Team-handling skills
- Analytical and presentation skills
- Ability to work under pressure
Additional Info : Industry experience in CPG, Auto, Retail, or Manufacturing

Posted 1 day ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Bengaluru

Work from Office


Project Role : Application Developer
Project Role Description : Design, build and configure applications to meet business process and application requirements.
Must have skills : Data Engineering
Good to have skills : NA
Minimum 5 year(s) of experience is required
Educational Qualification : 15 years full time education
Must Have :
- Azure Data Factory
- SQL
- Azure Data Lake Storage
- Data engineering solutions
- Structured and unstructured data processing
- Data modelling, analysis, design, development, and documentation
- Cloud computing
- CI/CD
- DevOps
Good to Have (minimum of 3) :
- Azure DevOps
- PowerApps
- Azure Function App
- Databricks
- Power BI (advanced knowledge)
- Python
- Airflow DAGs
- Infra deployments (Azure, Bicep)
- Networking & security
- Scheduling for orchestration of workflows
- Agile frameworks (Scrum)
Qualification : 15 years full time education

Posted 1 day ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Bengaluru

Work from Office


Project Role : Application Developer
Project Role Description : Design, build and configure applications to meet business process and application requirements.
Must have skills : Data Engineering
Good to have skills : NA
Minimum 5 year(s) of experience is required
Educational Qualification : 15 years full time education
Summary : As a Senior Data Engineer, you bring over 10 years of experience applying advanced concepts and technologies in production environments. Your expertise and skills make you an ideal candidate to lead and deliver cutting-edge data solutions.
Expertise :
- Extensive hands-on experience with Azure Databricks and modern data architecture principles
- In-depth understanding of Lakehouse and Medallion architectures and their practical applications
- Advanced knowledge of Delta Lake, including data storage, schema evolution, and ACID transactions
- Comprehensive expertise in working with Parquet files, including handling challenges and designing effective solutions
- Working experience with Unity Catalog
- Knowledge of Azure cloud services
- Working experience with Azure DevOps
Skills :
- Proficiency in writing clear, maintainable, and modular code using Python and PySpark
- Advanced SQL expertise, including query optimization and performance tuning
Tools :
- Experience with Infrastructure as Code (IaC), preferably using Terraform
- Proficiency in CI/CD pipelines, with a strong preference for Azure DevOps
- Familiarity with Azure Data Factory for seamless data integration and orchestration
- Hands-on experience with Apache Airflow for workflow automation
- Automation skills using PowerShell
- Nice to have: basic knowledge of Lakehouse apps and frameworks like Angular.js, Node.js, or React.js
You also :
- Possess excellent communication skills, making technical concepts accessible to non-technical stakeholders
- Nice to have: API knowledge and data modelling knowledge
- Are open to discussing and tackling challenges with a collaborative mindset
- Enjoy teaching, sharing knowledge, and mentoring team members to foster growth
- Thrive in multidisciplinary Scrum teams, collaborating effectively to achieve shared goals
- Have a solid foundation in software development, enabling you to bridge gaps between development and data engineering
- Demonstrate a strong drive for continuous improvement and learning new technologies
- Take full ownership of the build, run, and change processes, ensuring solutions are reliable and scalable
- Embody a positive, proactive mindset, fostering teamwork and mutual support within your team
Qualification : 15 years full time education
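As an illustrative aside, not part of the posting itself, the Medallion (bronze/silver/gold) architecture this listing references can be sketched in plain Python; the table and field names below are invented, and a real implementation would use PySpark over Delta tables:

```python
# Illustrative bronze -> silver -> gold flow using plain Python lists of
# dicts in place of Spark DataFrames. All names and values are invented.

def to_silver(bronze_rows):
    """Cleanse raw 'bronze' records: drop rows missing the key, fix types."""
    silver = []
    for row in bronze_rows:
        if row.get("txn_id") is None:
            continue  # malformed record; a real pipeline would quarantine it
        silver.append({
            "txn_id": str(row["txn_id"]),
            "amount": float(row.get("amount", 0.0)),
            "city": (row.get("city") or "unknown").strip().lower(),
        })
    return silver

def to_gold(silver_rows):
    """Aggregate cleansed records into a reporting-ready 'gold' view."""
    totals = {}
    for row in silver_rows:
        totals[row["city"]] = totals.get(row["city"], 0.0) + row["amount"]
    return totals

bronze = [
    {"txn_id": 1, "amount": "120.5", "city": " Bengaluru "},
    {"txn_id": None, "amount": "9.0", "city": "Mumbai"},  # no key: dropped
    {"txn_id": 2, "amount": 30, "city": "bengaluru"},
]
print(to_gold(to_silver(bronze)))  # {'bengaluru': 150.5}
```

Each layer only ever reads from the layer below it, which is what makes the pattern easy to reprocess when upstream data is corrected.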

Posted 1 day ago

Apply

2.0 - 5.0 years

9 - 15 Lacs

Noida

Work from Office


Job Description: Data Engineer (Climate & Water Resources)
Position Overview
We are seeking a highly skilled Data Engineer with experience in climate-related and water resources data management. The ideal candidate will develop, optimize, and maintain data pipelines for hydrological, meteorological, and climate datasets. This role involves working with large geospatial and time-series data, supporting climate risk assessments, flood modeling, and water resource management projects.
Key Responsibilities
- Automate data collection from sources such as NOAA, ECMWF, IMD, NASA, and regional hydrological agencies.
- Process and clean large datasets, ensuring data integrity, accuracy, and accessibility.
- Handle large-scale geospatial and time-series datasets related to rainfall, river discharge, groundwater levels, and climate models.
- Work with climate projection datasets (e.g., CMIP6, ERA5) for impact assessments and forecasting.
- Develop efficient storage solutions using cloud platforms (AWS, GCP, or Azure).
- Support hydrological and hydraulic modeling teams by structuring input data for flood simulations, drought analysis, and climate impact assessments.
- Collaborate with data scientists to build machine learning models for climate risk assessment.
- Implement scalable data processing workflows using Python, SQL, and distributed computing tools.
- Optimize data workflows for efficient processing and visualization.
- Work closely with hydrologists, climate scientists, and engineers to understand data requirements.
- Document data pipelines, metadata, and workflows for reproducibility and transparency.
Required Qualifications & Skills
Education & Experience
- Bachelor's or master's degree in Computer Science, Data Science, Environmental Engineering, Hydrology, Climate Science, or a related field.
- 3-5 years of experience in data engineering, preferably in climate science, hydrology, or environmental domains.
Technical Skills
- Strong programming skills in Python, SQL, or R.
- Experience with big data processing tools (Apache Spark, Dask, or Hadoop).
- Familiarity with climate and hydrological modeling tools and datasets (HEC-RAS, HEC-HMS, SWAT, GFS).
- Experience with geospatial data tools (GDAL, PostGIS, GeoPandas).
- Knowledge of cloud platforms (AWS S3, GCP BigQuery, Azure Data Lake).
Preferred Skills
- Experience in climate risk modeling and statistical analysis of environmental data.
- Familiarity with APIs for climate data access (Copernicus, NASA Earthdata).
- Understanding of hydrological models and remote sensing datasets.
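Since the role centres on cleaning hydro-meteorological time series, here is a small illustrative sketch, not from the posting: it assumes -999 as the missing-value sentinel (a common convention in meteorological data) and invented rainfall readings; a production pipeline would use pandas or xarray:

```python
# Illustrative cleaning of a daily rainfall series before it feeds a
# hydrological model. Assumes -999 marks missing readings; values invented.
from statistics import mean

def clean_series(readings, sentinel=-999):
    """Drop sentinel and negative readings; return (clean, pct_missing)."""
    clean = [v for v in readings if v != sentinel and v >= 0]
    pct_missing = 100.0 * (len(readings) - len(clean)) / len(readings)
    return clean, pct_missing

daily_rain_mm = [0.0, 12.4, -999, 3.1, -999, 0.0, 25.2]
clean, pct = clean_series(daily_rain_mm)
print(round(mean(clean), 2), round(pct, 1))  # 8.14 28.6
```

Reporting the missing-data percentage alongside the cleaned series lets downstream flood or drought models decide whether a station's record is usable at all.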

Posted 1 day ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Bengaluru

Work from Office


Project Role : Data Engineer
Project Role Description : Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills : Microsoft Azure Data Services
Good to have skills : NA
Minimum 5 year(s) of experience is required
Educational Qualification : 15 years full time education
Summary : As a Data Engineer, you will design, develop, and maintain data solutions for data generation, collection, and processing; create data pipelines; ensure data quality; and implement ETL processes to migrate and deploy data across systems. You will be involved in the end-to-end data management process.
Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead data architecture design and implementation.
- Optimize data delivery and re-design infrastructure for greater scalability.
- Implement data security and privacy measures.
- Collaborate with data scientists and analysts to understand data needs.
Professional & Technical Skills:
- Must To Have Skills: Proficiency in Microsoft Azure Data Services.
- Strong understanding of cloud-based data solutions.
- Experience with data warehousing and data lakes.
- Knowledge of SQL and NoSQL databases.
- Hands-on experience with data integration tools.
- Good To Have Skills: Experience with Azure Machine Learning.
Additional Information:
- The candidate should have a minimum of 5 years of experience in Microsoft Azure Data Services.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.
Qualification : 15 years full time education
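The description emphasises ETL processes that migrate data across systems. Purely as an illustration, here is a minimal plain-Python sketch of an idempotent upsert, the pattern that makes such migrations safe to re-run; the record shape and key name are invented, and in Azure this would typically be a Data Factory copy activity or a Spark merge:

```python
# Illustrative idempotent upsert: merge incoming rows into a target keyed
# by "id", so re-running the same load never duplicates records.
# Record shape and key name are invented for this sketch.

def upsert(target, incoming, key="id"):
    """Merge incoming rows into target by `key`; incoming fields win."""
    index = {row[key]: row for row in target}
    for row in incoming:
        index[row[key]] = {**index.get(row[key], {}), **row}
    return list(index.values())

target = [{"id": 1, "name": "alpha"}, {"id": 2, "name": "beta"}]
incoming = [{"id": 2, "name": "beta-v2"}, {"id": 3, "name": "gamma"}]
merged = upsert(target, incoming)
print(len(merged))  # 3: id 2 updated in place, id 3 appended
```

Because applying the same batch twice yields the same result, a failed load can simply be retried from the start.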

Posted 1 day ago

Apply

8.0 - 13.0 years

6 - 10 Lacs

Bengaluru

Work from Office


Project Role : BI Engineer
Project Role Description : Develop, migrate, deploy, and maintain data-driven insights and knowledge engine interfaces that drive adoption and decision making. Integrate security and data privacy protection.
Must have skills : SAS Analytics
Good to have skills : NA
Minimum 5 year(s) of experience is required
Educational Qualification : 15 years full time education
Summary : As a SAS Base & Macros specialist, you will be responsible for building and designing a scalable and open Business Intelligence (BI) architecture to provide cross-enterprise visibility and agility for business innovation. You will create industry and function data models used to build reports and dashboards, ensuring seamless integration with Accenture's Data and AI framework to meet client needs.
Roles & Responsibilities:
1. Lead or drive the migration of legacy SAS data preparation jobs to a modern Python-based data engineering framework.
2. Bring deep experience in both SAS and Python, strong knowledge of data transformation workflows, and a solid understanding of database systems and ETL best practices.
3. Analyze existing SAS data preparation and data feed scripts and workflows to identify logic and dependencies.
4. Translate and re-engineer SAS jobs into scalable, efficient Python-based data pipelines.
5. Collaborate with data analysts, scientists, and engineers to validate and test converted workflows.
6. Optimize the performance of new Python workflows and ensure data quality and consistency.
7. Document migration processes, coding standards, and pipeline configurations.
8. Integrate new pipelines with Google Cloud Platform as required.
9. Provide guidance and support for testing, validation, and production deployment.
Professional & Technical Skills:
- Must To Have Skills: Proficiency in SAS Base & Macros.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity.
Additional Information:
- The candidate should have 8+ years of experience, with a minimum of 3 years in SAS or Python data engineering.
Qualification : 15 years full time education
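To illustrate the kind of SAS-to-Python translation this migration involves, here is a hedged sketch: the SAS data step is a generic invented example, not from any real job, and the plain-Python equivalent stands in for what would usually be a pandas pipeline:

```python
# Illustrative SAS-to-Python translation. The SAS data step below is a
# generic invented example:
#
#   data work.high_value;
#     set work.accounts;
#     where balance > 100000;
#     segment = "PREMIUM";
#   run;
#
# A plain-Python equivalent (pandas would be more typical in practice):

def migrate_high_value(accounts, threshold=100000):
    """Filter accounts above the balance threshold and tag a segment."""
    out = []
    for row in accounts:
        if row["balance"] > threshold:
            tagged = dict(row)  # copy, mirroring SAS's new output dataset
            tagged["segment"] = "PREMIUM"
            out.append(tagged)
    return out

accounts = [
    {"acct": "A1", "balance": 250000},
    {"acct": "A2", "balance": 40000},
]
print(migrate_high_value(accounts))
```

Capturing the `where` clause and derived column as explicit, testable Python logic is what makes the converted workflow straightforward to validate against the legacy SAS output.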

Posted 1 day ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

