
1885 Data Engineering Jobs - Page 49

JobPe aggregates listings for easy browsing, but applications are submitted directly on the original job portal.

4.0 - 8.0 years

9 - 11 Lacs

Hyderabad

Remote


Role: Data Engineer (ETL Processes, SSIS, AWS)
Duration: Full-time
Location: Remote
Working Hours: 4:30am to 10:30am IST shift
Note: We need an ETL engineer for MS SQL Server Integration Services, working the 4:30am to 10:30am IST shift.

Roles & Responsibilities:
- Design, develop, and maintain ETL processes using SQL Server Integration Services (SSIS).
- Create and optimize complex SQL queries, stored procedures, and data transformation logic on Oracle and SQL Server databases.
- Build scalable and reliable data pipelines using AWS services (e.g., S3, Glue, Lambda, RDS, Redshift).
- Develop and maintain Linux shell scripts to automate data workflows and perform system-level tasks.
- Schedule, monitor, and troubleshoot batch jobs using tools such as Control-M, AutoSys, or cron.
- Collaborate with stakeholders to understand data requirements and deliver high-quality integration solutions.
- Ensure data quality, consistency, and security across systems.
- Maintain detailed documentation of ETL processes, job flows, and technical specifications.
- Experience with job scheduling tools such as Control-M and/or AutoSys.
- Exposure to version control tools (e.g., Git) and CI/CD pipelines.
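The responsibilities above follow the classic extract-transform-load pattern. As a rough illustration (plain Python with an in-memory SQLite database, not SSIS itself), the sketch below extracts raw rows, validates and normalizes them, and loads the survivors into a target table; the table, column names, and validation rule are invented for the example.

```python
import sqlite3

# Hypothetical staging rows (order_id, amount_text, region), as they might
# arrive from a flat-file extract. All names here are illustrative.
RAW_ROWS = [
    ("1001", "250.00", "south"),
    ("1002", "99.50", "NORTH"),
    ("1003", "bad-value", "east"),   # will fail validation below
]

def transform(row):
    """Cast and normalize one raw row; return None if it fails validation."""
    order_id, amount_text, region = row
    try:
        amount = float(amount_text)
    except ValueError:
        return None                  # reject rows with non-numeric amounts
    return (int(order_id), round(amount, 2), region.strip().lower())

def run_pipeline(conn, raw_rows):
    """Extract from raw_rows, transform each row, and load the clean ones."""
    conn.execute("CREATE TABLE IF NOT EXISTS orders "
                 "(order_id INTEGER PRIMARY KEY, amount REAL, region TEXT)")
    clean = [t for t in (transform(r) for r in raw_rows) if t is not None]
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", clean)
    conn.commit()
    return len(clean)

conn = sqlite3.connect(":memory:")
print(run_pipeline(conn, RAW_ROWS))  # 2 rows pass validation
```

A production SSIS package would express the same steps as a data flow with source, derived-column, and destination components, plus error-row redirection instead of the simple `None` filter.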

Posted 3 weeks ago

Apply

4.0 - 6.0 years

9 - 11 Lacs

Hyderabad

Remote


Role: Data Engineer (Azure, Snowflake) - Mid-Level
Duration: 6+ Months
Location: Remote
Working Hours: 12:30pm - 9:30pm IST (3am - 12pm EST)

Job Summary: We are looking for a Data Engineer with solid hands-on experience in Azure-based data pipelines and Snowflake to help build and scale data ingestion, transformation, and integration processes in a cloud-native environment.

Key Responsibilities:
- Develop and maintain data pipelines using ADF, Snowflake, and Azure Storage
- Perform data integration from various sources, including APIs, flat files, and databases
- Write clean, optimized SQL and support data modeling efforts in Snowflake
- Monitor and troubleshoot pipeline issues and data quality concerns
- Contribute to documentation and promote best practices across the team

Qualifications:
- 3-5 years of experience in data engineering or a related role
- Strong hands-on knowledge of Snowflake, Azure Data Factory, SQL, and Azure Data Lake
- Proficiency in scripting (Python preferred) for data manipulation and automation
- Understanding of data warehousing concepts and ETL/ELT patterns
- Experience with Git, JIRA, and agile delivery environments is a plus
- Strong attention to detail and eagerness to learn in a collaborative team setting

Posted 3 weeks ago

Apply

14.0 - 20.0 years

35 - 55 Lacs

Hyderabad, Pune, Bengaluru

Hybrid


Job Summary: Design and implement ML solutions, architecting scalable and efficient systems.

Primary Skills:
- Strong in machine learning algorithms
- Data engineering and ETL/ELT
- Data cleaning, preprocessing, and EDA
- Feature engineering, data splitting, and encoding
- MLOps (model versioning, training, experimentation, deployment, and monitoring)
- Python, Pandas, TensorFlow, PyTorch, Scikit-learn, Keras, XGBoost, LightGBM, Matplotlib, R, Scala, Java, etc.
- Git, DVC, MLflow, Kubernetes, Kubeflow, Docker, containers, CI/CD deployments, Apache Airflow
- Databricks, Snowflake, Salesforce, SAP; AWS/Azure/GCP data cloud platforms
- AWS SageMaker, Google AI Platform, Azure Machine Learning
- Model design and optimization; LLMs (OpenAI, BERT, LLaMA, Gemini, etc.)
- RDBMS, NoSQL databases, vector DBs, RAG pipelines
- AI agent frameworks; AI agent authentication and deployment
- AI security and compliance; prompt engineering

Secondary Skills: Cloud computing, data engineering, DevOps

Responsibilities:
- Design and develop AI/ML models and algorithms
- Collaborate with data scientists and engineers
- Ensure scalability and performance of AI/ML systems

Requirements:
- 12-15 years of experience in AI/ML development
- Strong expertise in AI/ML frameworks and tools
- Excellent problem-solving and technical skills

Posted 3 weeks ago

Apply

5.0 - 10.0 years

12 - 18 Lacs

Hyderabad

Remote


Role: Senior Data Engineer (Azure/Snowflake)
Duration: 6+ Months
Location: Remote
Working Hours: 12:30pm - 9:30pm IST (3am - 12pm EST)

Job Summary: We are seeking a Senior Data Engineer with advanced hands-on experience in Snowflake and Azure to support the development and optimization of enterprise-grade data pipelines. This role is ideal for someone who enjoys deep technical work and solving complex data engineering challenges in a modern cloud environment.

Key Responsibilities:
- Build and enhance scalable data pipelines using Azure Data Factory, Snowflake, and Azure Data Lake
- Develop and maintain ELT processes to ingest and transform data from various structured and semi-structured sources
- Write optimized and reusable SQL for complex data transformations in Snowflake
- Collaborate closely with analytics teams to ensure clean, reliable data delivery
- Monitor and troubleshoot pipeline performance, data quality, and reliability
- Participate in code reviews and contribute to best practices around data engineering standards and governance

Qualifications:
- 5+ years of data engineering experience in enterprise environments
- Deep hands-on experience with Snowflake, Azure Data Factory, Azure Blob/Data Lake, and SQL
- Proficient in scripting for data workflows (Python or similar)
- Strong grasp of data warehousing concepts and ELT development best practices
- Experience with version control tools (e.g., Git) and CI/CD processes for data pipelines
- Detail-oriented with strong problem-solving skills and the ability to work independently

Posted 3 weeks ago

Apply

1.0 - 4.0 years

2 - 5 Lacs

Hyderabad

Work from Office


ABOUT THE ROLE
You will play a key role in a regulatory submission content automation initiative that will modernize and digitize the regulatory submission process, positioning Amgen as a leader in regulatory innovation. The initiative leverages state-of-the-art technologies, including Generative AI, Structured Content Management, and integrated data, to automate the creation, review, and approval of regulatory content.

The role is responsible for sourcing and analyzing data for this initiative and supporting the design, building, and maintenance of the data pipelines that drive business actions and automation. This involves working with Operations source systems, finding the right data sources, standardizing data sets, and supporting data governance to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.

Roles & Responsibilities:
- Ensure a reliable, secure, and compliant operating environment.
- Identify, extract, and integrate required business data from Operations systems residing in modern cloud-based architectures.
- Design, develop, test, and maintain scalable data pipelines, ensuring data quality via ETL/ELT processes.
- Schedule and manage workflows to ensure pipelines run on schedule and are monitored for failures.
- Implement data integration solutions and manage end-to-end pipeline projects, including scope, timelines, and risk.
- Reverse-engineer schemas and explore source system tables to map local representations of target business concepts.
- Navigate application UIs and backends to gain business domain knowledge and detect data inconsistencies.
- Break down information models into fine-grained, business-contextualized data components.
- Work closely with cross-functional teams, including product teams, data architects, and business SMEs, to understand requirements and design solutions.
- Collaborate with data scientists to develop pipelines that meet dynamic business needs across regions.
- Create and maintain data models, dictionaries, and documentation to ensure accuracy and consistency.
- Adhere to SOPs, GDEs, and best practices for coding, testing, and reusable component design.

Basic Qualifications and Experience:
- Master's degree and 1 to 3 years of Computer Science, IT, or related field experience; OR
- Bachelor's degree and 3 to 5 years of Computer Science, IT, or related field experience; OR
- Diploma and 7 to 9 years of Computer Science, IT, or related field experience

Functional Skills:

Must-Have Skills:
- Hands-on experience with data practices, technologies, and platforms such as Databricks, Python, Prophecy, GitLab, Lucidchart, etc.
- Proficiency in data analysis tools (e.g., SQL) and experience with data sourcing tools
- Excellent problem-solving skills and the ability to work with large, complex datasets
- Understanding of data governance frameworks, tools, and best practices
- Knowledge of and experience with data standards (FAIR) and data protection regulations and compliance requirements (e.g., GDPR, CCPA)

Good-to-Have Skills:
- Experience with ETL tools and various Python packages related to data processing and machine learning model development
- Strong understanding of data modeling, data warehousing, and data integration concepts
- Knowledge of Python/R, Databricks, and cloud data platforms

Professional Certifications:
- Certified Data Engineer / Data Analyst (preferably on Databricks)

Soft Skills:
- Excellent critical-thinking and problem-solving skills
- Strong communication and collaboration skills
- Demonstrated awareness of how to function in a team setting
- Demonstrated presentation skills

Posted 4 weeks ago

Apply

6.0 - 10.0 years

15 - 30 Lacs

Indore, Jaipur, Bengaluru

Work from Office


Experience in dashboard story development, dashboard creation, and data engineering pipelines. Manage and organize large volumes of application log data using Google BigQuery. Experience with log analytics, user engagement metrics, and product performance metrics.

Required Candidate Profile: Experience with tools like Tableau, Power BI, or ThoughtSpot AI. Understand log data generated by Python-based applications. Ensure data integrity, consistency, and accessibility for analytical purposes.
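The log-analytics work this listing describes (engagement metrics over application log data) can be illustrated with a toy example. The sketch below computes daily active users from tab-separated log lines in pure Python; the log format and field names are assumptions, and a real pipeline would query BigQuery rather than an in-memory list.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical application log lines: "timestamp<TAB>user_id<TAB>event".
LOG_LINES = [
    "2024-05-01T09:12:03\tu1\tlogin",
    "2024-05-01T09:45:10\tu2\tview_dashboard",
    "2024-05-01T10:01:55\tu1\tview_dashboard",
    "2024-05-02T08:30:00\tu1\tlogin",
]

def daily_active_users(lines):
    """Count distinct users per calendar day -- a basic engagement metric."""
    users_by_day = defaultdict(set)
    for line in lines:
        ts, user_id, _event = line.split("\t")
        day = datetime.fromisoformat(ts).date().isoformat()
        users_by_day[day].add(user_id)
    return {day: len(users) for day, users in sorted(users_by_day.items())}

print(daily_active_users(LOG_LINES))
# {'2024-05-01': 2, '2024-05-02': 1}
```

In BigQuery the same metric would be a `COUNT(DISTINCT user_id) ... GROUP BY DATE(timestamp)` aggregation; the Python version just makes the grouping logic explicit.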

Posted 4 weeks ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Chennai, Guindy

Work from Office


Python Automation Developer Using GenAI
Chennai - Guindy, India | Information Technology | 16778

Overview:
- Collaboration and Communication: Collaborate with cross-functional teams to understand business requirements and deliver effective solutions. Communicate complex solutions clearly to clients and stakeholders.
- Continuous Learning and Innovation: Stay updated on the latest developments in Generative AI and automation technologies. Integrate new innovations into client solutions and drive continuous improvement.

Responsibilities:
- Design and Development: Develop and maintain Python-based automation solutions incorporating Generative AI technologies. Create reusable, modular libraries and frameworks to streamline solution deployment. Produce detailed solution design documents and ensure adherence to technical standards.
- Implementation and Integration: Implement AI-powered automation solutions using Python and relevant frameworks. Integrate solutions within cloud environments (e.g., Azure, AWS) to ensure scalability and reliability. Utilize large language models (e.g., OpenAI, Google Bard) to build innovative AI solutions.
- Testing and Quality Assurance: Prepare unit test cases and end-to-end automation test plans. Troubleshoot, debug, and refine intelligent solutions to address real-world challenges.

Requirements:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 3+ years of experience in software development or data engineering.
- Proficiency in Python and experience with frameworks.
- Hands-on experience with cloud platforms (e.g., Azure, AWS) and DevOps tools (e.g., GitHub, Azure DevOps).
- Strong problem-solving skills and the ability to tailor AI frameworks for specific use cases.
- Excellent communication skills and a collaborative mindset.

Posted 4 weeks ago

Apply

3.0 - 6.0 years

6 - 10 Lacs

Bengaluru

Work from Office


Data Engineer - Senior Software Engineer
Bangalore, India | Information Technology | 16750

Overview: We are seeking a skilled and experienced Data Engineer to play a vital role in data ingestion/migration, creating data pipelines, creating data marts, and managing and monitoring data using a tech stack of Azure, SQL, Python, PySpark, Airflow, and Snowflake.

Responsibilities:
1. Data Ingestion/Migration: Collaborate with cross-functional teams to ingest/migrate data from various sources to the staging area. Develop and implement efficient data migration strategies, ensuring data integrity and security throughout the process.
2. Data Pipeline Development: Design, develop, and maintain robust data pipelines that extract, transform, and load (ETL) data from different sources into GCP. Implement data quality checks and ensure scalability, reliability, and performance of the pipelines.
3. Data Management: Build and maintain data models and schemas, ensuring optimal storage, organization, and accessibility of data. Collaborate with the requirements team to understand their data needs and provide solutions by creating data marts.
4. Performance Optimization: Identify and resolve performance bottlenecks within the data pipelines and data services. Optimize queries, job configurations, and data processing techniques to improve overall system efficiency.
5. Data Governance and Security: Implement data governance policies, access controls, and data security measures to ensure compliance with regulatory requirements and protect sensitive data. Monitor and troubleshoot data-related issues, ensuring high availability and reliability of data systems.
6. Documentation and Collaboration: Create comprehensive technical documentation, including data flow diagrams, system architecture, and standard operating procedures. Collaborate with cross-functional teams, analysts, and software engineers to understand their requirements and provide technical expertise.

Requirements / Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- Proven experience as a Data Engineer with a focus on delivering outcomes.
- Knowledge and hands-on experience with Azure, SQL, Python, PySpark, Airflow, Snowflake, and related tools.
- Proficiency in data processing and pipeline development. Solid understanding of data modeling, database design, and ETL principles.
- Experience with data migration projects, including data extraction, transformation, and loading.
- Familiarity with data governance, security, and compliance practices.
- Good communication and interpersonal skills, with the ability to articulate technical concepts to non-technical stakeholders.

Posted 4 weeks ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Chennai, Guindy

Work from Office


Data ELT Engineer
Chennai - Guindy, India | Information Technology | 17075

Overview: We are looking for a highly skilled Data ELT Engineer to architect and implement data solutions that support our enterprise analytics and real-time decision-making capabilities. This role combines data modeling expertise with hands-on experience building and managing ELT pipelines across diverse data sources. You will work with Snowflake, AWS Glue, and Apache Kafka to ingest, transform, and stream both batch and real-time data, ensuring high data quality and performance across systems. If you have a passion for data architecture and scalable engineering, we want to hear from you.

Responsibilities:
- Design, build, and maintain scalable ELT pipelines into Snowflake from diverse sources, including relational databases (SQL Server, MySQL, Oracle) and SaaS platforms.
- Utilize AWS Glue for data extraction and transformation, and Kafka for real-time streaming ingestion.
- Model data using dimensional and normalized techniques to support analytics and business intelligence workloads.
- Handle large-scale batch processing jobs and implement real-time streaming solutions.
- Ensure data quality, consistency, and governance across pipelines.
- Collaborate with data analysts, data scientists, and business stakeholders to align models with organizational needs.
- Monitor, troubleshoot, and optimize pipeline performance and reliability.

Requirements:
- 5+ years of experience in data engineering and data modeling.
- Strong proficiency with SQL and data modeling techniques (star and snowflake schemas).
- Hands-on experience with the Snowflake data platform.
- Proficiency with AWS Glue (ETL jobs, crawlers, workflows).
- Experience using Apache Kafka for streaming data integration.
- Experience with batch and streaming data processing.
- Familiarity with orchestration tools (e.g., Airflow, Step Functions) is a plus.
- Strong understanding of data governance and best practices in data architecture.
- Excellent problem-solving skills and communication abilities.
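The dimensional modeling named in the requirements above (star and snowflake schemas) boils down to splitting flat records into dimension and fact tables linked by surrogate keys. A minimal sketch in plain Python, with invented field names:

```python
# Flat denormalized sales records, as they might land from an ELT ingest.
FLAT = [
    {"sale_id": 1, "product": "widget", "category": "tools", "qty": 3},
    {"sale_id": 2, "product": "gadget", "category": "toys",  "qty": 1},
    {"sale_id": 3, "product": "widget", "category": "tools", "qty": 2},
]

def to_star_schema(rows):
    """Split flat rows into a product dimension and a sales fact table."""
    dim_product, fact_sales = {}, []
    for row in rows:
        key = (row["product"], row["category"])
        if key not in dim_product:
            dim_product[key] = len(dim_product) + 1   # assign surrogate key
        fact_sales.append({"sale_id": row["sale_id"],
                           "product_key": dim_product[key],
                           "qty": row["qty"]})
    dim = [{"product_key": k, "product": p, "category": c}
           for (p, c), k in dim_product.items()]
    return dim, fact_sales

dim, fact = to_star_schema(FLAT)
print(len(dim), len(fact))  # 2 3
```

A snowflake schema would go one step further and normalize `category` out of the product dimension into its own table; in Snowflake itself both shapes are plain tables joined on the surrogate key.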

Posted 4 weeks ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Chennai

Work from Office


Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-Have Skills: Snowflake Data Warehouse
Good-to-Have Skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand their data needs and provide effective solutions, ensuring that the data infrastructure is robust and scalable enough to meet the demands of the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge in data engineering.
- Continuously evaluate and improve data processes to enhance efficiency and effectiveness.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Snowflake Data Warehouse.
- Good-to-Have Skills: Experience with data modeling and database design.
- Strong understanding of ETL processes and data integration techniques.
- Familiarity with cloud platforms such as AWS or Azure.
- Experience in performance tuning and optimization of data queries.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Qualification: 15 years of full-time education

Posted 4 weeks ago

Apply

4.0 - 9.0 years

7 - 17 Lacs

Hyderabad

Work from Office


Your future duties and responsibilities:

Job Overview: CGI is looking for a talented and motivated Data Engineer with strong expertise in Python, Apache Spark, HDFS, and MongoDB to build and manage scalable, efficient, and reliable data pipelines and infrastructure. You'll play a key role in transforming raw data into actionable insights, working closely with data scientists, analysts, and business teams.

Key Responsibilities:
- Design, develop, and maintain scalable data pipelines using Python and Spark.
- Ingest, process, and transform large datasets from various sources into usable formats.
- Manage and optimize data storage using HDFS and MongoDB.
- Ensure high availability and performance of data infrastructure.
- Implement data quality checks, validations, and monitoring processes.
- Collaborate with cross-functional teams to understand data needs and deliver solutions.
- Write reusable and maintainable code with strong documentation practices.
- Optimize performance of data workflows and troubleshoot bottlenecks.
- Maintain data governance, privacy, and security best practices.

Required qualifications to be successful in this role:
- Minimum 6 years of experience as a Data Engineer or in a similar role.
- Strong proficiency in Python for data manipulation and pipeline development.
- Hands-on experience with Apache Spark for large-scale data processing.
- Experience with HDFS and distributed data storage systems.
- Proficiency in working with MongoDB, including data modeling, indexing, and querying.
- Strong understanding of data architecture, data modeling, and performance tuning.
- Familiarity with version control tools like Git.
- Experience with workflow orchestration tools (e.g., Airflow, Luigi) is a plus.
- Knowledge of cloud services (AWS, GCP, or Azure) is preferred.
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.

Preferred Skills:
- Experience with containerization (Docker, Kubernetes).
- Knowledge of real-time data streaming tools like Kafka.
- Familiarity with data visualization tools (e.g., Power BI, Tableau).
- Exposure to Agile/Scrum methodologies.

Skills: English, Oracle, Python, Java

Note:
1. This role will require 8 weeks of in-office work after joining, after which we will transition to a hybrid working model with 2 days per week in the office.
2. Mode of interview: face-to-face.
3. Time: the registration window is 9am to 12:30pm. Candidates who are shortlisted will be required to stay throughout the day for subsequent rounds of interviews.

Notice Period: 0-45 days
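The "data quality checks, validations, and monitoring" responsibility above can be sketched as a simple validation gate that partitions an incoming batch into passing and failing records before load. The rules and field names here are illustrative assumptions, not any specific framework:

```python
# Per-field validation rules; each maps a field name to a predicate.
RULES = {
    "user_id": lambda v: isinstance(v, str) and v != "",
    "age":     lambda v: isinstance(v, int) and 0 <= v < 130,
}

def validate_batch(records, rules):
    """Partition records into passing and failing, noting the first bad field."""
    passed, failed = [], []
    for rec in records:
        bad = next((f for f, ok in rules.items() if not ok(rec.get(f))), None)
        if bad is None:
            passed.append(rec)
        else:
            failed.append((rec, bad))   # keep the record with its failure reason
    return passed, failed

batch = [
    {"user_id": "u1", "age": 34},
    {"user_id": "",   "age": 20},    # fails the user_id rule
    {"user_id": "u3", "age": 999},   # fails the age rule
]
good, bad = validate_batch(batch, RULES)
print(len(good), [f for _, f in bad])  # 1 ['user_id', 'age']
```

In a Spark pipeline the same idea usually appears as filter expressions or a library such as a schema validator, with the failing partition routed to a quarantine table for monitoring.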

Posted 4 weeks ago

Apply

10.0 - 14.0 years

8 - 13 Lacs

Navi Mumbai

Work from Office


Skill required: Network Billing Operations - Problem Management
Designation: Network & Svcs Operation Assoc Manager
Qualifications: Any Graduation
Years of Experience: 10 to 14 years

About Accenture: Accenture is a global professional services company with leading capabilities in digital, cloud, and security. Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners, and communities. Visit us at www.accenture.com

What would you do: Help transform back office and network operations, reduce time to market, and grow revenue by improving customer experience and capex efficiency and reducing cost-to-serve. Good customer support experience preferred, with good networking knowledge. Manage problems caused by information technology infrastructure errors to minimize their adverse impact on business and to prevent their recurrence by seeking the root cause of those incidents and initiating actions to improve or correct the situation.

What are we looking for:
- 5 years of programming skills at an advanced level, with responsibility for maintenance of existing and creation of new queries via SQL scripts; Python and PySpark programming skills; experience with Databricks (Palantir is an advantage); direct, active participation in GenAI and Machine Learning projects.

Other skills:
- Desire to learn and understand data models and billing processes
- Critical thinking
- Experience with reporting and metrics; strong numerical skills
- Experience in expense, billing, or financial management
- Experience in process/system management
- Good organizational skills; self-disciplined, systematic approach with good interpersonal skills
- Flexible, analytical mind; problem solver
- Knowledge of telecom products and services

Roles and Responsibilities:
- In this role you are required to analyze and solve moderately complex problems.
- Typically creates new solutions, leveraging and, where needed, adapting existing methods and procedures.
- Requires understanding of the strategic direction set by senior management as it relates to team goals.
- Primary upward interaction is with the direct supervisor or team leads.
- Generally interacts with peers and/or management levels at a client and/or within Accenture.
- Should require minimal guidance when determining methods and procedures on new assignments.
- Decisions often impact the team in which they reside and occasionally impact other teams.
- Would manage medium-to-small-sized teams and/or work efforts (if in an individual contributor role) at a client or within Accenture.
- Please note that this role may require you to work in rotational shifts.

Qualification: Any Graduation

Posted 4 weeks ago

Apply

5.0 - 8.0 years

6 - 11 Lacs

Navi Mumbai

Work from Office


Skill required: Network Billing Operations - Problem Management
Designation: Network & Svcs Operation Senior Analyst
Qualifications: Any Graduation
Years of Experience: 5 to 8 years

About Accenture: Accenture is a global professional services company with leading capabilities in digital, cloud, and security. Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners, and communities. Visit us at www.accenture.com

What would you do: Help transform back office and network operations, reduce time to market, and grow revenue by improving customer experience and capex efficiency and reducing cost-to-serve. Good customer support experience preferred, with good networking knowledge. Manage problems caused by information technology infrastructure errors to minimize their adverse impact on business and to prevent their recurrence by seeking the root cause of those incidents and initiating actions to improve or correct the situation.

What are we looking for:
- 5 years of programming skills at an advanced level, with responsibility for maintenance of existing and creation of new queries via SQL scripts; Python and PySpark programming skills; experience with Databricks (Palantir is an advantage).

Other skills:
- Must be self-motivated and understand short turnaround expectations
- Desire to learn and understand data models and billing processes
- Critical thinking
- Experience with reporting and metrics; strong numerical skills
- Experience in expense, billing, or financial management
- Experience in process/system management
- Good organizational skills; self-disciplined, systematic approach with good interpersonal skills
- Flexible, analytical mind; problem solver
- Knowledge of telecom products and services

Roles and Responsibilities:
- In this role you are required to analyze and solve increasingly complex problems.
- Your day-to-day interactions are with peers within Accenture.
- You are likely to have some interaction with clients and/or Accenture management.
- You will be given minimal instruction on daily work/tasks and a moderate level of instruction on new assignments.
- Decisions that you make impact your own work and may impact the work of others.
- In this role you would be an individual contributor and/or oversee a small work effort and/or team.
- Please note that this role may require you to work in rotational shifts.

Qualification: Any Graduation

Posted 4 weeks ago

Apply

6.0 - 9.0 years

15 - 17 Lacs

Hyderabad, Bengaluru, Mumbai (All Areas)

Hybrid


Job Description (shared by client):
- 6+ years of experience in data engineering.
- Strong knowledge of SQL.
- Expertise in Snowflake, DBT, and Python.
- Minimum 3+ years of SnapLogic or Fivetran experience is an added advantage.
- Must automate manual work using SnapLogic.
- Good communication and interpersonal skills are a must, as the role collaborates with the data team and business analysts.

Primary (non-negotiable) skills: Snowflake, DBT, Python, and SQL.
Flexible locations: Yes

Posted 4 weeks ago

Apply

7.0 - 11.0 years

6 - 11 Lacs

Navi Mumbai

Work from Office


Skill required: Network Billing Operations - Problem Management
Designation: Network & Svcs Operation Specialist
Qualifications: Any Graduation
Years of Experience: 7 to 11 years

About Accenture: Accenture is a global professional services company with leading capabilities in digital, cloud, and security. Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners, and communities. Visit us at www.accenture.com

What would you do: Help transform back office and network operations, reduce time to market, and grow revenue by improving customer experience and capex efficiency and reducing cost-to-serve. Good customer support experience preferred, with good networking knowledge. Manage problems caused by information technology infrastructure errors to minimize their adverse impact on business and to prevent their recurrence by seeking the root cause of those incidents and initiating actions to improve or correct the situation.

What are we looking for:
- 5 years of programming skills at an advanced level, with responsibility for maintenance of existing and creation of new queries via SQL scripts; Python and PySpark programming skills; experience with Databricks (Palantir is an advantage); direct, active participation in GenAI and Machine Learning projects.

Other skills:
- Desire to learn and understand data models and billing processes
- Critical thinking
- Experience with reporting and metrics; strong numerical skills
- Experience in expense, billing, or financial management
- Experience in process/system management
- Good organizational skills; self-disciplined, systematic approach with good interpersonal skills
- Flexible, analytical mind; problem solver
- Knowledge of telecom products and services

Roles and Responsibilities:
- In this role you are required to analyze and solve moderately complex problems.
- May create new solutions, leveraging and, where needed, adapting existing methods and procedures.
- Requires understanding of the strategic direction set by senior management as it relates to team goals.
- Primary upward interaction is with the direct supervisor.
- May interact with peers and/or management levels at a client and/or within Accenture.
- Guidance would be provided when determining methods and procedures on new assignments.
- Decisions made by you will often impact the team in which you reside.
- Would manage small teams and/or work efforts (if in an individual contributor role) at a client or within Accenture.
- Please note that this role may require you to work in rotational shifts.

Qualification: Any Graduation

Posted 4 weeks ago

Apply

7.0 - 11.0 years

6 - 11 Lacs

Navi Mumbai

Work from Office


Skill required: Network Billing Operations - Problem Management Designation: Network & Svcs Operation Specialist Qualifications: Any Graduation Years of Experience: 7 to 11 years About Accenture Accenture is a global professional services company with leading capabilities in digital, cloud and security.Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song all powered by the worlds largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners and communities.Visit us at www.accenture.com What would you do A data analyst is responsible for collecting, storing, and organizing data related to how Wireless Telecommunication products and services are built and bill. They bring technical expertise to ensure the quality and accuracy of that data, they also need to have experience with Finance for Telecommunication Mobility services. Knowledge of AT&T Data Sources for Wireless services & knowledge of client tools is advantage. Developing and implementing Data Analysis to identity data anomalies and leading trends to identify potential billing issues. Able to handle multi-biller customer, discounts eligibility criteria that are ever changing, and they must adapt and reconfigure audits in very short time.Manage problems caused by information technology infrastructure errors to minimize their adverse impact on business and to prevent their recurrence by seeking the root cause of those incidents and initiating actions to improve or correct the situation. 
What are we looking for
- 5 years of advanced programming experience, covering maintenance of existing and creation of new queries via SQL scripts
- Python and PySpark programming skills; experience with Databricks (Palantir is an advantage)

Other skills:
- Must be self-motivated and understand short turnaround expectations
- Desire to learn and understand data models and billing processes
- Critical thinking
- Experience with reporting and metrics; strong numerical skills
- Experience in expense, billing, or financial management
- Experience in process/system management
- Good organizational skills; self-disciplined, systematic approach with good interpersonal skills
- Flexible, analytical mind, problem solver
- Knowledge of Telecom Products and Services

Qualification: Any Graduation

Posted 4 weeks ago

Apply

3.0 - 5.0 years

5 - 9 Lacs

Navi Mumbai

Work from Office

Skill required: Network Billing Operations - Problem Management
Designation: Network & Svcs Operation Analyst
Qualifications: Any Graduation
Years of Experience: 3 to 5 years

About Accenture
Accenture is a global professional services company with leading capabilities in digital, cloud and security. Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners and communities. Visit us at www.accenture.com

What would you do
Helps transform back-office and network operations, reduce time to market, and grow revenue by improving customer experience and capex efficiency and reducing cost-to-serve. Good customer support experience is preferred, along with good networking knowledge. Manage problems caused by information technology infrastructure errors to minimize their adverse impact on business and to prevent their recurrence by seeking the root cause of those incidents and initiating actions to improve or correct the situation.
What are we looking for
- 5 years of advanced programming experience, covering maintenance of existing and creation of new queries via SQL scripts
- Python and PySpark programming skills; experience with Databricks (Palantir is an advantage)

Other skills:
- Must be self-motivated and understand short turnaround expectations
- Desire to learn and understand data models and billing processes
- Critical thinking
- Experience with reporting and metrics; strong numerical skills
- Experience in expense, billing, or financial management
- Experience in process/system management
- Good organizational skills; self-disciplined, systematic approach with good interpersonal skills
- Flexible, analytical mind, problem solver
- Knowledge of Telecom Products and Services

Roles and Responsibilities:
- In this role you are required to analyze and solve lower-complexity problems
- Your day-to-day interaction is with peers within Accenture before updating supervisors
- You may have limited exposure to clients and/or Accenture management
- You will be given moderate-level instruction on daily work tasks and detailed instructions on new assignments
- The decisions you make impact your own work and may impact the work of others
- You will be an individual contributor as part of a team, with a focused scope of work
- Please note that this role may require you to work in rotational shifts

Qualification: Any Graduation

Posted 4 weeks ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Chennai

Work from Office

Project Role: Advanced Application Engineer
Project Role Description: Utilize modular architectures, next-generation integration techniques and a cloud-first, mobile-first mindset to provide vision to Application Development Teams. Work with an Agile mindset to create value across projects of multiple scopes and scale.
Must have skills: BlueYonder Enterprise Supply Planning
Good to have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
We are looking for an experienced Integration Architect to lead the design and execution of integration strategies for Blue Yonder (BY) implementations across cloud-native environments. The ideal candidate will possess strong expertise in integrating supply chain platforms with enterprise cloud systems, data lakes, and Snowflake, along with working knowledge of Generative AI (Gen AI) to enhance automation and intelligence in integration and data workflows.

Roles & Responsibilities:
- Architect and implement end-to-end integration solutions for Blue Yonder (WMS, TMS, ESP, etc.) with enterprise systems (ERP, CRM, legacy).
- Design integration flows using cloud-native middleware platforms (Azure Integration Services, AWS Glue, GCP Dataflow, etc.).
- Enable real-time and batch data ingestion into cloud-based data lakes (e.g., AWS S3, Azure Data Lake, Google Cloud Storage) and downstream to Snowflake.
- Develop scalable data pipelines to support analytics, reporting, and operational insights from Blue Yonder and other systems.
- Integrate Snowflake as an enterprise data platform for unified reporting and machine learning use cases.
Professional & Technical Skills:
- Leverage Generative AI (e.g., OpenAI, Azure OpenAI) for auto-generating integration mapping specs and documentation.
- Enhance data quality and reconciliation with intelligent agents.
- Develop copilots for integration teams to speed up development and troubleshooting.
- Ensure integration architecture adheres to security, performance, and compliance standards.
- Collaborate with enterprise architects, functional consultants, data engineers, and business stakeholders.
- Lead troubleshooting, performance tuning, and hypercare support post-deployment.

Additional Information:
- The candidate should have a minimum of 5 years of experience in BlueYonder Enterprise Supply Planning.
- This position is based at our Chennai office.
- A 15 years full time education is required.

Qualification: 15 years full time education

Posted 4 weeks ago

Apply

3.0 - 8.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Project Role: Cloud Services Engineer
Project Role Description: Act as liaison between the client and Accenture operations teams for support and escalations. Communicate service delivery health to all stakeholders and explain any performance issues or risks. Ensure Cloud orchestration and automation capability is operating based on target SLAs with minimal downtime. Hold performance meetings to share performance and consumption data and trends.
Must have skills: SUSE Linux Administration
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
As a Cloud Services Engineer, you will act as a liaison between the client and Accenture operations teams for support and escalations. You will communicate service delivery health to all stakeholders, explain any performance issues or risks, ensure Cloud orchestration and automation capability operates on target SLAs with minimal downtime, and hold performance meetings to share performance and consumption data and trends.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Proactively identify and address potential issues in Cloud services.
- Collaborate with cross-functional teams to optimize Cloud orchestration processes.
- Develop and implement strategies to enhance Cloud automation capabilities.
- Analyze performance data to identify trends and areas for improvement.
- Provide technical guidance and support to junior team members.
Professional & Technical Skills:
- Must To Have Skills: Proficiency in SUSE Linux Administration.
- Strong understanding of Cloud orchestration and automation.
- Experience in managing and troubleshooting Cloud services.
- Knowledge of scripting languages for automation tasks.
- Hands-on experience with monitoring and alerting tools.
- Good To Have Skills: Experience with DevOps practices.

Additional Information:
- The candidate should have a minimum of 3 years of experience in SUSE Linux Administration.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualification: 15 years full time education

Posted 4 weeks ago

Apply

7.0 - 12.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Python (Programming Language)
Good to have skills: Microsoft Power Business Intelligence (BI), Google BigQuery, Apache Airflow
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
As a Data Platform Engineer, you will assist with the data platform blueprint and design, collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. You will play a crucial role in shaping the data platform components.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead data platform blueprint and design
- Implement data platform components
- Ensure seamless integration between systems and data models

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Python (Programming Language)
- Good To Have Skills: Experience with Apache Airflow, Google BigQuery, Microsoft Power Business Intelligence (BI)
- Strong understanding of data engineering principles
- Experience in building scalable data platforms
- Proficient in data modeling and database design

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Python (Programming Language)
- This position is based at our Bengaluru office
- A 15 years full time education is required

Qualification: 15 years full time education
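Apache Airflow, listed above as a good-to-have skill, models a data pipeline as a directed acyclic graph (DAG) of tasks that run in dependency order. As a rough, library-free sketch of that core idea (the task names and dependencies below are invented for illustration, not from any posting):

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Toy pipeline: each task maps to the set of tasks it depends on.
# In Airflow this would be a DAG of operators; here we only order and "run" them.
deps = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

def run_pipeline(deps):
    # static_order() yields every task after all of its dependencies.
    executed = []
    for task in TopologicalSorter(deps).static_order():
        executed.append(task)  # a real pipeline would execute the task here
    return executed

print(run_pipeline(deps))  # each task appears only after its dependencies
```

Airflow itself adds scheduling, retries, and operators on top of this ordering idea; the sketch shows only the dependency-resolution core.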

Posted 4 weeks ago

Apply

3.0 - 7.0 years

20 - 27 Lacs

Gurugram

Work from Office

The ideal candidate is a hands-on technology developer with experience in developing scalable applications and platforms. They must be at ease working in an agile environment with little supervision, and should be self-motivated with a passion for problem solving and continuous learning.

Role and responsibilities
- Strong technical, analytical, and problem-solving skills
- Strong organizational skills, with the ability to work autonomously as well as in a team-based environment
- Data pipeline framework development

Technical skills requirements
The candidate must demonstrate proficiency in:
- CDH (on-premise) for data processing and extraction
- Ability to own and deliver on large, multi-faceted projects
- Fluency in complex SQL and experience with RDBMSs
- Project experience with CDH, Spark, PySpark, Scala, Python, NiFi, Hive, and NoSQL DBs
- Experience designing and building big data pipelines
- Experience working on large-scale, distributed systems
- Experience with Databricks would be an added advantage
- Strong hands-on experience with programming languages such as PySpark, Scala with Spark, and Python
- Exposure to various ETL and Business Intelligence tools
- Experience in shell scripting to automate pipeline execution
- Solid grounding in Agile methodologies
- Experience with git and other source control systems
- Strong communication and presentation skills

Nice-to-have skills
- Certification in Hadoop/Big Data (Hortonworks/Cloudera)
- Databricks Spark certification
- Unix or shell scripting
- Strong delivery background across the delivery of high-value, business-facing technical projects in major organizations
- Experience of managing client delivery teams, ideally coming from a Data Engineering / Data Science environment

Qualifications
B.Tech./M.Tech./MS or BCA/MCA degree from a reputed university
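The pipeline work these roles describe follows the classic extract-transform-load (ETL) pattern. A minimal, framework-free sketch using only the Python standard library (the schema and values are invented; in the roles above this would be done at scale with Spark/PySpark rather than sqlite3):

```python
import csv
import io
import sqlite3

# Extract: parse raw CSV (an in-memory string stands in for a source file).
raw = "id,amount\n1,10.5\n2,bad\n3,7.25\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: keep only rows with a valid numeric amount.
clean = []
for r in rows:
    try:
        clean.append((int(r["id"]), float(r["amount"])))
    except ValueError:
        continue  # a real pipeline would route bad records to a reject table

# Load: write the cleaned rows into a target table and verify.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE charges (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO charges VALUES (?, ?)", clean)
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM charges").fetchone()
print(total)  # (2, 17.75): the malformed row was filtered out
```

The same three stages map directly onto the Spark/NiFi/Hive tooling the posting lists; only the scale and the APIs change.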

Posted 4 weeks ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop innovative solutions and ensure seamless application functionality.

Key Responsibilities:
- Work on client projects to deliver AWS, PySpark, and Databricks based data engineering and analytics solutions.
- Build and operate very large data warehouses or data lakes.
- Optimize ETL and design, code, and tune big data processes using Apache Spark.
- Build data pipelines and applications to stream and process datasets at low latencies.
- Handle data efficiently: track data lineage, ensure data quality, and improve the discoverability of data.

Technical Experience:
- Minimum of 5 years of experience in Databricks engineering solutions on AWS Cloud platforms using PySpark, Databricks SQL, and data pipelines using Delta Lake.
- Minimum of 5 years of experience in ETL, Big Data/Hadoop, and data warehouse architecture and delivery.
- Minimum of 2 years of experience in real-time streaming using Kafka/Kinesis.
- Minimum of 4 years of experience in one or more programming languages: Python, Java, Scala.
- Experience using Airflow for the data pipelines in at least one project.
- 1 year of experience developing CI/CD pipelines using Git, Jenkins, Docker, Kubernetes, shell scripting, and Terraform.

Professional Attributes:
- Ready to work in B shift (12 PM to 10 PM).
- Client-facing skills: solid experience working in client-facing environments and building trusted relationships with client stakeholders.
- Good critical thinking and problem-solving abilities.
- Health care knowledge.
- Good communication skills.

Educational Qualification: Bachelor of Engineering / Bachelor of Technology

Additional Information:
- Data Engineering, PySpark, AWS, Python Programming Language, Apache Spark, Databricks, Hadoop; certifications in Databricks, Python, or AWS.
- The candidate should have a minimum of 5 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Hyderabad office.
- A 15 years full-time education is required.

Qualification: 15 years full time education

Posted 4 weeks ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: Microsoft Azure Databricks, PySpark
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. Your typical day will involve collaborating with teams to develop innovative solutions and streamline processes.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead the development and implementation of new applications
- Conduct code reviews and ensure coding standards are met
- Stay updated on industry trends and best practices

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform
- Good To Have Skills: Experience with PySpark
- Strong understanding of data engineering concepts
- Experience in building and optimizing data pipelines
- Knowledge of cloud platforms like Microsoft Azure
- Familiarity with data governance and security practices

Additional Information:
- The candidate should have a minimum of 5 years of experience in Databricks Unified Data Analytics Platform
- This position is based at our Bengaluru office
- A 15 years full-time education is required

Qualification: 15 years full time education

Posted 4 weeks ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: PySpark
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop innovative solutions and ensure seamless application functionality.

Key Responsibilities:
- Work on client projects to deliver AWS, PySpark, and Databricks based data engineering and analytics solutions.
- Build and operate very large data warehouses or data lakes.
- Optimize ETL and design, code, and tune big data processes using Apache Spark.
- Build data pipelines and applications to stream and process datasets at low latencies.
- Handle data efficiently: track data lineage, ensure data quality, and improve the discoverability of data.

Technical Experience:
- Minimum of 5 years of experience in Databricks engineering solutions on AWS Cloud platforms using PySpark, Databricks SQL, and data pipelines using Delta Lake.
- Minimum of 5 years of experience in ETL, Big Data/Hadoop, and data warehouse architecture and delivery.
- Minimum of 2 years of experience in real-time streaming using Kafka/Kinesis.
- Minimum of 4 years of experience in one or more programming languages: Python, Java, Scala.
- Experience using Airflow for the data pipelines in at least one project.
- 1 year of experience developing CI/CD pipelines using Git, Jenkins, Docker, Kubernetes, shell scripting, and Terraform.

Professional Attributes:
- Ready to work in B shift (12 PM to 10 PM).
- Client-facing skills: solid experience working in client-facing environments and building trusted relationships with client stakeholders.
- Good critical thinking and problem-solving abilities.
- Health care knowledge.
- Good communication skills.

Educational Qualification: Bachelor of Engineering / Bachelor of Technology

Additional Information:
- Data Engineering, PySpark, AWS, Python Programming Language, Apache Spark, Databricks, Hadoop; certifications in Databricks, Python, or AWS.
- The candidate should have a minimum of 5 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Hyderabad office.
- A 15 years full-time education is required.

Qualification: 15 years full time education

Posted 4 weeks ago

Apply

2.0 - 4.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Data Analytics
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Position Summary:
The Data Analyst will focus on collecting, cleaning, and analyzing data to support business decisions.

Key Responsibilities:
- Gather, process, and analyze data to identify trends and insights.
- Develop dashboards and reports to communicate findings.
- Ensure data accuracy and quality in all analyses.
- Prepare and clean datasets for analysis to ensure accuracy and usability.
- Generate reports and dashboards to communicate key performance metrics.
- Support data-driven decision-making by identifying actionable insights.
- Monitor data pipelines and troubleshoot issues to ensure smooth operation.
- Collaborate with stakeholders and cross-functional teams to understand and meet data needs.

Qualifications:
- Bachelor's degree in a relevant field (e.g., Data Science, Statistics, Computer Science).
- 2-4 years of experience in data analytics.
- Proficiency in tools like Power BI, Tableau, and SQL.
- Strong analytical and problem-solving skills.
- Effective communication and teamwork abilities.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Data Analytics.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.

Qualification: 15 years full time education
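The cleaning-and-summarising workflow this posting describes can be sketched with the Python standard library alone (the records below are made-up sample data; in practice the role calls for SQL, Power BI, or Tableau):

```python
from collections import Counter
from statistics import mean, median

# Sample billing records; None marks a missing value to be cleaned out.
records = [
    {"region": "North", "bill": 120.0},
    {"region": "South", "bill": None},
    {"region": "North", "bill": 80.0},
    {"region": "South", "bill": 100.0},
]

# Clean: drop records with missing bills before computing metrics.
clean = [r for r in records if r["bill"] is not None]
bills = [r["bill"] for r in clean]

# Summarise: the kind of key-performance figures a dashboard would show.
summary = {
    "count": len(bills),
    "mean": mean(bills),
    "median": median(bills),
    "by_region": dict(Counter(r["region"] for r in clean)),
}
print(summary)
```

The same steps (filter invalid rows, then aggregate) translate directly to a SQL `WHERE ... GROUP BY` or a BI-tool measure.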

Posted 4 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies