
157 ETL Development Jobs - Page 6

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

4.0 - 5.0 years

6 - 10 Lacs

Kochi, Bengaluru

Work from Office

Naukri logo

4+ years' experience. Work from Office (1st preference Kochi, 2nd preference Bangalore). Good experience in any ETL tool, good knowledge of Python, integration experience, a good attitude, and cross-skilling ability.

Posted 1 month ago

Apply

0.0 - 2.0 years

4 - 7 Lacs

Navi Mumbai

Work from Office


Title
Our corporate activities are growing rapidly, and we are currently seeking a full-time, office-based Data Engineer to join our Information Technology team. This position will work on a team to accomplish tasks and projects that are instrumental to the company's success. If you want an exciting career where you use your previous expertise and can develop and grow your career even further, then this is the opportunity for you.

Overview
Medpace is a full-service clinical contract research organization (CRO). We provide Phase I-IV clinical development services to the biotechnology, pharmaceutical and medical device industries. Our mission is to accelerate the global development of safe and effective medical therapeutics through a scientific and disciplined approach. We leverage local regulatory and therapeutic expertise across all major areas including oncology, cardiology, metabolic disease, endocrinology, central nervous system, anti-viral and anti-infective. Headquartered in Cincinnati, Ohio, Medpace employs more than 5,000 people across 40+ countries.

Responsibilities
- Utilize skills in development areas including data warehousing, business intelligence, and databases (Snowflake, ANSI SQL, SQL Server, T-SQL)
- Support programming/software development using Extract, Transform, and Load (ETL) and Extract, Load and Transform (ELT) tools (dbt, Azure Data Factory, SSIS)
- Design, develop, enhance and support business intelligence systems, primarily using Microsoft Power BI
- Collect, analyze and document user requirements
- Participate in the software validation process through development, review, and/or execution of test plans/cases/scripts
- Create software applications following the software development lifecycle, which includes requirements gathering, design, development, testing, release, and maintenance
- Communicate with team members regarding projects, development, tools, and procedures
- Provide end-user support including setup, installation, and maintenance for applications

Qualifications
- Bachelor's degree in Computer Science, Data Science, or a related field
- 5+ years of experience in Data Engineering
- Knowledge of developing dimensional data models and awareness of the advantages and limitations of star schema and snowflake schema designs
- Solid ETL development and reporting knowledge based on an intricate understanding of business processes and measures
- Knowledge of the Snowflake cloud data warehouse, Fivetran data integration and dbt transformations is preferred
- Knowledge of Python is preferred
- Knowledge of REST APIs
- Basic knowledge of SQL Server databases is required
- Knowledge of C# and Azure development is a bonus
- Excellent analytical, written and oral communication skills

People. Purpose. Passion. Make a Difference Tomorrow. Join Us Today.
The work we've done over the past 30+ years has positively impacted the lives of countless patients and families who face hundreds of diseases across all key therapeutic areas. The work we do today will improve the lives of people living with illness and disease in the future.

Medpace Perks
- Flexible work environment
- Competitive compensation and benefits package
- Competitive PTO packages
- Structured career paths with opportunities for professional growth
- Company-sponsored employee appreciation events
- Employee health and wellness initiatives

Awards
- Recognized by Forbes as one of America's Most Successful Midsize Companies in 2021, 2022, 2023 and 2024
- Continually recognized with CRO Leadership Awards from Life Science Leader magazine based on expertise, quality, capabilities, reliability, and compatibility

What to Expect Next
A Medpace team member will review your qualifications and, if interested, you will be contacted with details for next steps.

EO/AA Employer M/F/Disability/Vets
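The qualifications above single out dimensional modelling and the trade-offs between star and snowflake schema designs. As a minimal sketch (SQLite stands in for a warehouse engine such as Snowflake or SQL Server, and the clinical-flavoured table and column names are invented for illustration), a star schema joins one central fact table directly to denormalized dimensions, so a BI query needs only one join per dimension rather than a chain of joins:

```python
import sqlite3

# Star schema: a central fact table joined directly to denormalized dimensions.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_site  (site_id INTEGER PRIMARY KEY, site_name TEXT, country TEXT);
CREATE TABLE dim_study (study_id INTEGER PRIMARY KEY, phase TEXT, area TEXT);
CREATE TABLE fact_enrollment (
    site_id  INTEGER REFERENCES dim_site(site_id),
    study_id INTEGER REFERENCES dim_study(study_id),
    enrolled INTEGER
);
""")
conn.executemany("INSERT INTO dim_site VALUES (?,?,?)",
                 [(1, "Cincinnati", "US"), (2, "Mumbai", "IN")])
conn.executemany("INSERT INTO dim_study VALUES (?,?,?)",
                 [(10, "Phase III", "oncology"), (11, "Phase I", "cardiology")])
conn.executemany("INSERT INTO fact_enrollment VALUES (?,?,?)",
                 [(1, 10, 40), (2, 10, 25), (2, 11, 12)])

# A typical BI query: one join per dimension, no join chains
# (the main advantage of a star over a snowflake schema).
rows = conn.execute("""
    SELECT d.country, SUM(f.enrolled)
    FROM fact_enrollment f
    JOIN dim_site d ON d.site_id = f.site_id
    GROUP BY d.country
    ORDER BY d.country
""").fetchall()
print(rows)  # [('IN', 37), ('US', 40)]
```

A snowflake schema would further normalize the dimensions (e.g. `country` into its own table), saving storage at the cost of extra joins per query.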

Posted 1 month ago

Apply

0.0 - 1.0 years

3 - 6 Lacs

Navi Mumbai

Work from Office


Title
Our corporate activities are growing rapidly, and we are currently seeking a full-time, office-based Data Engineer to join our Information Technology team. This position will work on a team to accomplish tasks and projects that are instrumental to the company's success. If you want an exciting career where you use your previous expertise and can develop and grow your career even further, then this is the opportunity for you.

Overview
Medpace is a full-service clinical contract research organization (CRO). We provide Phase I-IV clinical development services to the biotechnology, pharmaceutical and medical device industries. Our mission is to accelerate the global development of safe and effective medical therapeutics through a scientific and disciplined approach. We leverage local regulatory and therapeutic expertise across all major areas including oncology, cardiology, metabolic disease, endocrinology, central nervous system, anti-viral and anti-infective. Headquartered in Cincinnati, Ohio, Medpace employs more than 5,000 people across 40+ countries.

Responsibilities
- Utilize skills in development areas including data warehousing, business intelligence, and databases (Snowflake, ANSI SQL, SQL Server, T-SQL)
- Support programming/software development using Extract, Transform, and Load (ETL) and Extract, Load and Transform (ELT) tools (dbt, Azure Data Factory, SSIS)
- Design, develop, enhance and support business intelligence systems, primarily using Microsoft Power BI
- Collect, analyze and document user requirements
- Participate in the software validation process through development, review, and/or execution of test plans/cases/scripts
- Create software applications following the software development lifecycle, which includes requirements gathering, design, development, testing, release, and maintenance
- Communicate with team members regarding projects, development, tools, and procedures
- Provide end-user support including setup, installation, and maintenance for applications

Qualifications
- Bachelor's degree in Computer Science, Data Science, or a related field
- 3+ years of experience in Data Engineering
- Knowledge of developing dimensional data models and awareness of the advantages and limitations of star schema and snowflake schema designs
- Solid ETL development and reporting knowledge based on an intricate understanding of business processes and measures
- Knowledge of the Snowflake cloud data warehouse, Fivetran data integration and dbt transformations is preferred
- Knowledge of Python is preferred
- Knowledge of REST APIs
- Basic knowledge of SQL Server databases is required
- Knowledge of C# and Azure development is a bonus
- Excellent analytical, written and oral communication skills

People. Purpose. Passion. Make a Difference Tomorrow. Join Us Today.
The work we've done over the past 30+ years has positively impacted the lives of countless patients and families who face hundreds of diseases across all key therapeutic areas. The work we do today will improve the lives of people living with illness and disease in the future.

Medpace Perks
- Flexible work environment
- Competitive compensation and benefits package
- Competitive PTO packages
- Structured career paths with opportunities for professional growth
- Company-sponsored employee appreciation events
- Employee health and wellness initiatives

Awards
- Recognized by Forbes as one of America's Most Successful Midsize Companies in 2021, 2022, 2023 and 2024
- Continually recognized with CRO Leadership Awards from Life Science Leader magazine based on expertise, quality, capabilities, reliability, and compatibility

What to Expect Next
A Medpace team member will review your qualifications and, if interested, you will be contacted with details for next steps.

EO/AA Employer M/F/Disability/Vets

Posted 1 month ago

Apply

10.0 - 20.0 years

10 - 19 Lacs

Karur

Remote


Greetings from MindPro Technologies Pvt Ltd (Www.mindprotech.com)

Job Description for Informatica Lead Position
Experience Required: 10+ Years
Mode: Remote

Key Responsibilities

Design & Development
- Lead the design, development, and implementation of ETL processes using Informatica products.
- Create, optimize, and maintain mappings, sessions, and workflows to ensure high performance and reliability.

API Data Integration
- Coordinate with development teams to design and manage data ingestion from APIs into the Informatica environment.
- Develop strategies for real-time or near-real-time data processing, ensuring secure and efficient data flow.
- Collaborate on API specifications, error handling, and data validation requirements.

Data Integration & Warehousing
- Integrate data from diverse sources (e.g., relational databases, flat files, cloud-based systems, APIs) into target data warehouses or data lakes.
- Ensure data quality by implementing best practices, validation checks, and error handling.

Project Leadership
- Provide technical oversight and guidance to the ETL development team.
- Establish and expand development standards, processes, and coding practices to maintain a consistent, high-quality codebase.
- Coordinate with product owners, project managers, and onshore teams to track progress and meet milestones.

Solution Architecture
- Work with business analysts and stakeholders to gather requirements and translate them into technical solutions.
- Propose improvements, optimizations, and best practices to enhance existing data integration solutions.

Performance Tuning & Troubleshooting
- Identify bottlenecks in mappings or workflows; recommend and implement performance tuning strategies.
- Troubleshoot ETL and other data ingestion issues, perform root cause analysis, and provide solutions in a timely manner.

Collaboration & Communication
- Collaborate with onshore business and technical teams to ensure smooth project execution and knowledge sharing.
- Communicate project status, potential risks, and technical details to stakeholders and leadership.
- Participate in regular meetings with onshore teams, aligning on priorities and resolving issues.

Documentation & Reporting
- Maintain comprehensive technical documentation including design specifications, test cases, and operational procedures.
- Generate reports or dashboards as needed to keep stakeholders informed of project progress and data pipeline performance.
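The API-ingestion duties above center on validation checks and error routing before data reaches the warehouse. A dependency-free conceptual sketch (Informatica-specific mechanics are omitted; the record shape, required fields, and business rule are invented for illustration) of splitting an API payload into loadable rows and a rejected-record queue:

```python
# Required fields and the negative-amount rule are illustrative assumptions.
REQUIRED_FIELDS = ("id", "amount")

def validate(record):
    """Return a list of validation errors; an empty list means the record is clean."""
    errors = [f"missing {f}" for f in REQUIRED_FIELDS if record.get(f) is None]
    if record.get("amount") is not None and record["amount"] < 0:
        errors.append("negative amount")
    return errors

def ingest(payload):
    """Split an API payload into loadable rows and an error queue for review."""
    loaded, rejected = [], []
    for rec in payload:
        errs = validate(rec)
        if errs:
            rejected.append({"record": rec, "errors": errs})
        else:
            loaded.append(rec)
    return loaded, rejected

loaded, rejected = ingest([
    {"id": 1, "amount": 10.0},
    {"id": 2, "amount": -5.0},     # fails the business rule
    {"id": None, "amount": 3.0},   # fails the completeness check
])
print(len(loaded), len(rejected))  # 1 2
```

In a real pipeline the rejected queue would feed an error-handling workflow (reprocessing, alerting, or reconciliation reports) rather than a Python list.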

Posted 1 month ago

Apply

5.0 - 8.0 years

10 - 16 Lacs

Noida

Work from Office


Design, build, and manage PostgreSQL databases. Perform tuning, develop ETL, automate processes, ensure security, and handle migrations, backups, and recovery. Strong SQL skills required; Oracle-to-PostgreSQL migration experience is a plus.
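Migration and ETL work of this kind usually hinges on idempotent loads, so that a rerun after a failure updates rows instead of duplicating them. A minimal sketch using SQLite (whose `INSERT ... ON CONFLICT ... DO UPDATE` upsert clause matches PostgreSQL's syntax; the table and column names are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, email TEXT)")

def upsert(rows):
    # Idempotent load: re-running the same batch updates rather than duplicates.
    # PostgreSQL accepts this same ON CONFLICT clause.
    conn.executemany("""
        INSERT INTO customers (id, email) VALUES (?, ?)
        ON CONFLICT(id) DO UPDATE SET email = excluded.email
    """, rows)

upsert([(1, "a@example.com"), (2, "b@example.com")])
upsert([(2, "b.new@example.com"), (3, "c@example.com")])  # rerun with overlap

rows = conn.execute("SELECT id, email FROM customers ORDER BY id").fetchall()
print(rows)
# [(1, 'a@example.com'), (2, 'b.new@example.com'), (3, 'c@example.com')]
```

Oracle expresses the same idea with `MERGE INTO`, which is typically what gets rewritten during an Oracle-to-PostgreSQL migration.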

Posted 1 month ago

Apply

8 - 10 years

12 - 17 Lacs

Pune

Work from Office


About The Role

Role Purpose
The purpose of the role is to create exceptional architectural solution designs and thought leadership, and to enable delivery teams to provide exceptional client engagement and satisfaction.

Do

1. Develop architectural solutions for new deals and major change requests in existing deals
- Create an enterprise-wide architecture that ensures systems are scalable, reliable, and manageable.
- Provide solutioning for RFPs received from clients and ensure overall design assurance.
- Develop a direction to manage the portfolio of to-be solutions, including systems, shared infrastructure services, and applications, in order to better match business outcome objectives.
- Analyse the technology environment, enterprise specifics, and client requirements to set a collaboration solution design framework/architecture.
- Provide technical leadership for the design, development and implementation of custom solutions through thoughtful use of modern technology.
- Define and understand current-state solutions, and identify improvements, options and tradeoffs to define target-state solutions.
- Clearly articulate, document and sell architectural targets, recommendations and reusable patterns, and accordingly propose investment roadmaps.
- Evaluate and recommend solutions to integrate with the overall technology ecosystem.
- Work closely with various IT groups to transition tasks, ensure performance and manage issues through to resolution.
- Perform detailed documentation (application view, multiple sections and views) of the architectural design and solution, mentioning all artefacts in detail.
- Validate the solution/prototype from a technology, cost-structure and customer-differentiation point of view.
- Identify problem areas, perform root cause analysis of architectural design and solutions, and provide relevant solutions.
- Collaborate with sales, program/project and consulting teams to reconcile solutions to architecture.
- Track industry and application trends and relate these to planning current and future IT needs.
- Provide technical and strategic input during the project planning phase in the form of technical architectural designs and recommendations.
- Collaborate with all relevant parties to review the objectives and constraints of solutions and determine conformance with the Enterprise Architecture.
- Identify implementation risks and potential impacts.

2. Enable delivery teams by providing optimal delivery solutions/frameworks
- Build and maintain relationships with executives, technical leaders, product owners, peer architects and other stakeholders to become a trusted advisor.
- Develop and establish relevant technical, business process and overall support metrics (KPI/SLA) to drive results.
- Manage multiple projects and accurately report the status of all major assignments while adhering to all project management standards.
- Identify technical, process and structural risks, and prepare a risk mitigation plan for all projects.
- Ensure quality assurance of all architecture and design decisions, and provide technical mitigation support to delivery teams.
- Recommend tools for reuse and automation for improved productivity and reduced cycle times.
- Lead the development and maintenance of the enterprise framework and related artefacts.
- Develop trust and build effective working relationships through respectful, collaborative engagement across individual product teams.
- Ensure architecture principles and standards are consistently applied to all projects.
- Ensure optimal client engagement: support the pre-sales team while presenting the entire solution design and its principles to the client; negotiate, manage and coordinate with client teams to ensure all requirements are met and the proposed solution creates an impact; demonstrate thought leadership with strong technical capability in front of the client to win confidence and act as a trusted advisor.

3. Competency building and branding
- Ensure completion of necessary trainings and certifications.
- Develop Proofs of Concept (POCs), case studies, demos etc. for new growth areas based on market and customer research.
- Develop and present a Wipro point of view on solution design and architecture by writing white papers, blogs etc.
- Attain market referenceability and recognition through top analyst rankings, client testimonials and partner credits.
- Be the voice of Wipro's thought leadership by speaking in forums (internal and external).
- Mentor developers, designers and junior architects on the project for their further career development and enhancement.
- Contribute to the architecture practice by conducting selection interviews etc.

4. Team management
- Resourcing: anticipate new talent requirements as per market/industry trends and client requirements; hire adequate and right resources for the team.
- Talent management: ensure adequate onboarding and training for team members to enhance capability and effectiveness; build an internal talent pool and ensure career progression within the organization; manage team attrition; drive diversity in leadership positions.
- Performance management: set goals for the team, conduct timely performance reviews and provide constructive feedback to direct reports; ensure that Performance Nxt is followed for the entire team.
- Employee satisfaction and engagement: lead and drive engagement initiatives for the team; track team satisfaction scores and identify initiatives to build engagement within the team.

Mandatory Skills: Tableau
Experience: 8-10 Years

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.

Posted 1 month ago

Apply

7 - 12 years

9 - 19 Lacs

Bengaluru

Remote


Payroll Company: Anlage Infotech
Client: NTT Data

Role & Responsibilities
Job Posting Title: Sr. ETL Developer
Experience: 7+ years of relevant experience in ETL
Location: Any NTT Data location (temporarily remote)
Shift Timing: 01:00 PM IST to 11:00 PM IST

Must-Have (Mandatory Skills)
- Minimum 7+ years of hands-on experience with ETL development
- Hands-on experience with ETL flows and jobs, specifically ADF pipelines and SSIS
- Hands-on experience with data warehouses and data marts
- Very strong with SQL queries on MS SQL Server

Required
- Strong hands-on experience with SQL and PL/SQL (procedures, functions)
- Expert-level knowledge of ETL flows and jobs (ADF pipeline experience is a must)
- Experience with MS SQL (must); Oracle DB, PostgreSQL, MySQL
- Good knowledge of data warehouses/data marts
- Good knowledge of data structures/models, integrity constraints, performance tuning, etc.
- Good knowledge of the insurance domain (preferred)

Note: Immediate joiners only (currently serving notice period, 15 days or less).
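Data-warehouse and data-mart loads of the kind described above commonly track attribute history with a Type 2 slowly changing dimension: a change expires the current row and appends a new version. An engine-agnostic sketch (in ADF or SSIS this logic would live in pipeline activities or SSIS components; the record layout and dates are invented):

```python
# Type 2 slowly changing dimension: close the current row, insert a new version.
def scd2_apply(dim, updates, load_date):
    """dim: list of dicts with business key 'id', tracked attribute 'city',
    and 'valid_from'/'valid_to'. A change in 'city' versions the row."""
    current = {r["id"]: r for r in dim if r["valid_to"] is None}
    for upd in updates:
        row = current.get(upd["id"])
        if row is None:                       # brand-new member
            dim.append({**upd, "valid_from": load_date, "valid_to": None})
        elif row["city"] != upd["city"]:      # tracked attribute changed
            row["valid_to"] = load_date       # expire the old version
            dim.append({**upd, "valid_from": load_date, "valid_to": None})
    return dim

dim = [{"id": 1, "city": "Pune", "valid_from": "2024-01-01", "valid_to": None}]
scd2_apply(dim, [{"id": 1, "city": "Mumbai"}, {"id": 2, "city": "Noida"}],
           "2024-06-01")
print(len(dim))  # 3: expired Pune row, new Mumbai row, new Noida row
```

The same pattern is expressed in T-SQL with a `MERGE` plus an insert of the new versions, which is the usual shape of an SSIS or ADF dimension-load step.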

Posted 1 month ago

Apply

7 - 12 years

20 - 35 Lacs

Bengaluru

Hybrid


A bachelor's degree in Computer Science or a related field. 5-7 years of experience working as a hands-on developer in Sybase, DB2, and ETL technologies. Worked extensively on data integration, designing and developing reusable interfaces. Advanced experience in Python, DB2, Sybase, shell scripting, Unix, Perl scripting, DB platforms, and database design and modeling. Expert-level understanding of data warehouses, core database concepts, and relational database design. Experience in writing stored procedures, optimization, and performance tuning. Strong technology acumen and a deep strategic mindset, with a proven track record of delivering results. Proven analytical skills and experience making decisions based on hard and soft data. A desire and openness to learning and continuous improvement, both of yourself and your team members. Hands-on experience in the development of APIs is a plus. Good to have: experience with Business Intelligence tools, Source-to-Pay applications such as SAP Ariba, and Accounts Payable systems.

Skills Required: Familiarity with Postgres and Python is a plus.

Posted 1 month ago

Apply

2 - 5 years

3 - 7 Lacs

Gurugram

Work from Office


Role: Data Engineer

Skills:
- Data Modeling: Design and implement efficient data models, ensuring data accuracy and optimal performance.
- ETL Development: Develop, maintain, and optimize ETL processes to extract, transform, and load data from various sources into our data warehouse.
- SQL Expertise: Write complex SQL queries to extract, manipulate, and analyze data as needed.
- Python Development: Develop and maintain Python scripts and applications to support data processing and automation.
- AWS Expertise: Leverage your deep knowledge of AWS services, such as S3, Redshift, Glue, EMR, and Athena, to build and maintain data pipelines and infrastructure.
- Infrastructure as Code (IaC): Experience with tools like Terraform or CloudFormation to automate the provisioning and management of AWS resources is a plus.
- Big Data Processing: Knowledge of PySpark for big data processing and analysis is desirable.
- Source Code Management: Utilize Git and GitHub for version control and collaboration on data engineering projects.
- Performance Optimization: Identify and implement optimizations for data processing pipelines to enhance efficiency and reduce costs.
- Data Quality: Implement data quality checks and validation procedures to maintain data integrity.
- Collaboration: Work closely with data scientists, analysts, and other teams to understand data requirements and deliver high-quality data solutions.
- Documentation: Maintain comprehensive documentation for all data engineering processes and projects.
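The data-quality checks mentioned above can be sketched engine-agnostically; in practice they would run inside a Glue job or as Athena/Redshift queries before loading. A minimal gate with two common checks (the column name and null-rate threshold are invented for illustration):

```python
# Minimal data-quality gate: null-rate and key-uniqueness checks before loading.
def quality_report(rows, key="order_id", max_null_rate=0.05):
    n = len(rows)
    null_rate = sum(1 for r in rows if r.get(key) is None) / n
    keys = [r[key] for r in rows if r.get(key) is not None]
    return {
        "null_rate_ok": null_rate <= max_null_rate,   # completeness check
        "unique_ok": len(keys) == len(set(keys)),     # primary-key uniqueness
    }

batch = [{"order_id": 1}, {"order_id": 2}, {"order_id": 2}, {"order_id": None}]
report = quality_report(batch)
print(report)  # both checks fail on this batch
```

A pipeline would typically fail fast or quarantine the batch when either flag is false, rather than silently loading bad rows into the warehouse.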

Posted 1 month ago

Apply

5 - 10 years

9 - 19 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid


Job Title: Senior Software Engineer - ETL Developer
Main location: Hyderabad / Bangalore / Chennai
Employment Type: Full Time
Experience: 5 to 10 yrs

Role & responsibilities: Sr ETL Developer

Position Description
Looking for a Senior ETL Developer who has:
- ETL Development & Implementation: Strong experience in designing, developing, and deploying ETL solutions using Informatica Cloud Services (ICS), Informatica PowerCenter, and other data integration tools.
- Data Integration & Optimization: Proficient in extracting, transforming, and loading (ETL) data from multiple sources, optimizing performance, and ensuring data quality.
- Stakeholder Collaboration: Skilled at working with cross-functional teams, including data engineers, analysts, and business stakeholders, to align data solutions with business needs.
- Scripting & Data Handling: Experience with SQL, PL/SQL, and scripting languages (e.g., Python, Shell) for data manipulation, transformation, and automation.
- Tool Proficiency: Familiarity with Informatica Cloud, version control systems (e.g., Git), JIRA, Confluence, and the Microsoft Office Suite.
- Agile Methodologies: Knowledge of Agile frameworks (Scrum, Kanban) with experience in managing backlogs, writing user stories, and participating in sprint planning.
- Testing & Validation: Involvement in ETL testing, data validation, unit testing, and integration testing to ensure accuracy, consistency, and completeness of data.
- Problem-Solving Skills: Strong analytical mindset to troubleshoot, debug, and optimize ETL workflows, data pipelines, and integration solutions effectively.
- Communication & Documentation: Excellent written and verbal communication skills to document ETL processes, create technical design documents, and present data integration strategies to stakeholders.

Together, as owners, let's turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect and belonging. Here, you'll reach your full potential because you are invited to be an owner from day 1 as we work together to bring our dream to life. That's why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company's strategy and direction. Your work creates value. You'll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You'll shape your career by joining a company built to grow and last. You'll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons. Come join our team, one of the largest IT and business consulting services firms in the world.

Posted 1 month ago

Apply

5 - 10 years

10 - 15 Lacs

Pune, Chennai, Bengaluru

Work from Office


At least 5 years of experience in Information Technology. Experience with SQL Server, PL/SQL, and data load/extract. Experience in Unix shell scripting, Python, and automation. Experience with Bitbucket, GitLab and GitHub. Experience with Jira and Chalk pages. Nice to have: automation (Shell, Python), Hadoop.

Posted 1 month ago

Apply

3 - 8 years

18 - 22 Lacs

Hyderabad, Bengaluru

Work from Office


ETL Development:
- Design, develop, and implement Ab Initio components and graphs for ETL processes.
- Develop complex data pipelines for large-scale data processing.
- Create and maintain data integration solutions.

Data Analysis and Requirements:
- Analyze data requirements and collaborate with stakeholders to understand business needs.
- Understand and translate business requirements into technical solutions.

Performance Tuning and Optimization:
- Optimize Ab Initio processes for performance and efficiency.
- Troubleshoot and debug issues related to application performance and deployment.

Code and Documentation:
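Ab Initio is a proprietary platform, so its graphs cannot be shown directly; as a conceptual analogue only, its Rollup component performs key-based aggregation, which behaves like a group-by. A dependency-free sketch (the field names and transaction data are invented):

```python
from collections import defaultdict

# Conceptual analogue of an Ab Initio Rollup component: aggregate records by key.
# This is illustrative Python, not Ab Initio DML.
def rollup(records, key, value):
    totals = defaultdict(int)
    for rec in records:
        totals[rec[key]] += rec[value]
    return dict(totals)

txns = [{"acct": "A", "amt": 100},
        {"acct": "B", "amt": 50},
        {"acct": "A", "amt": 25}]
print(rollup(txns, "acct", "amt"))  # {'A': 125, 'B': 50}
```

In an actual graph this step would sit downstream of Sort (Rollup on sorted input) or use in-memory rollup, with partitioning components providing the parallelism.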

Posted 1 month ago

Apply

8 - 12 years

25 - 30 Lacs

Hyderabad, Bengaluru

Work from Office


Key Responsibilities:

ETL Development:
- Design, develop, and implement Ab Initio components and graphs for ETL processes.
- Develop complex data pipelines for large-scale data processing.
- Create and maintain data integration solutions.

Data Analysis and Requirements:
- Analyze data requirements and collaborate with stakeholders to understand business needs.
- Understand and translate business requirements into technical solutions.

Performance Tuning and Optimization:
- Optimize Ab Initio processes for performance and efficiency.
- Troubleshoot and debug issues related to application performance and deployment.

Code and Documentation:

Posted 1 month ago

Apply

11 - 17 years

25 - 30 Lacs

Hyderabad

Work from Office


ABOUT AMGEN
Amgen harnesses the best of biology and technology to fight the world's toughest diseases, and make people's lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what's known today.

ABOUT THE ROLE

Role Description
We are seeking a highly skilled and experienced Principal AI Solution Architect to join our dynamic team. The candidate will lead AI solutioning and design across enterprise and cross-functional teams. They will primarily work with the MDM CoE to lead and drive AI solutions and optimizations, and also provide thought leadership. The role involves developing and implementing AI strategies, collaborating with cross-functional teams, and ensuring the scalability, reliability, and performance of AI solutions. To succeed in this role, the candidate must have strong AI/ML, Data Science and GenAI experience along with MDM knowledge: experience with technologies such as PySpark, PyTorch, TensorFlow, LLMs, Autogen, Hugging Face, vector databases, embeddings, RAG, etc., along with knowledge of MDM (Master Data Management).

Roles & Responsibilities
- Lead the design, solutioning and development of enterprise-level GenAI applications using LLM frameworks such as Langchain, Autogen, and Hugging Face.
- Architect intelligent pipelines using PySpark, TensorFlow, and PyTorch within Databricks and AWS environments.
- Implement embedding models and manage vector stores for retrieval-augmented generation (RAG) solutions.
- Integrate and leverage MDM platforms like Informatica and Reltio to supply high-quality structured data to ML systems.
- Utilize SQL and Python for data engineering, data wrangling, and pipeline automation.
- Build scalable APIs and services to serve GenAI models in production.
- Lead cross-functional collaboration with data scientists, engineers, and product teams to scope, design, and deploy AI-powered systems.
- Ensure model governance, version control, and auditability aligned with regulatory and compliance expectations.

Basic Qualifications and Experience
- Master's degree with 11-14 years of experience in Data Science, Artificial Intelligence, Computer Science, or related fields; OR
- Bachelor's degree with 15-16 years of experience in Data Science, Artificial Intelligence, Computer Science, or related fields; OR
- Diploma with 17-18 years of hands-on experience in Data Science, AI/ML technologies, or related technical domains

Functional Skills

Must-Have Skills:
- 14+ years of experience working in AI/ML or Data Science roles, including designing and implementing GenAI solutions
- Extensive hands-on experience with LLM frameworks and tools such as Langchain, Autogen, Hugging Face, OpenAI APIs, and embedding models
- Expertise in AI/ML solution architecture and design; knowledge of industry best practices
- Experience designing GenAI-based solutions on the Databricks platform
- Hands-on experience with Python, PySpark, PyTorch, LLMs, vector databases, embeddings, scikit-learn, Langchain, TensorFlow, APIs, Autogen, vector stores, MongoDB, Databricks, Django
- Strong knowledge of AWS and cloud-based AI infrastructure
- Excellent problem-solving skills
- Strong communication and leadership skills
- Ability to collaborate effectively with cross-functional teams and stakeholders
- Experience in managing and mentoring junior team members; must be able to provide them thought leadership

Good-to-Have Skills:
- Prior experience in data modeling, ETL development, and data profiling to support AI/ML workflows
- Working knowledge of Life Sciences or Pharma industry standards and regulatory considerations
- Proficiency in tools like JIRA and Confluence for Agile delivery and project collaboration
- Familiarity with MongoDB, vector stores, and modern architecture principles for scalable GenAI applications

Professional Certifications
- Any data analysis certification (SQL, Python, other databases or programming languages)
- Any cloud certification (AWS or Azure)
- Data Science and ML certifications

Soft Skills:
- Strong analytical abilities to assess and improve master data processes and solutions
- Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders
- Effective problem-solving skills to address data-related issues and implement scalable solutions
- Ability to work effectively with global, virtual teams

EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
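The RAG responsibilities in the role above reduce, at their core, to embedding-similarity retrieval: rank stored documents against a query vector and hand the top matches to the LLM as context. A dependency-free sketch with toy 3-dimensional vectors standing in for real embeddings (a production system would use an embedding model and a managed vector store; the document titles and vectors are invented):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy "vector store": document -> embedding (real systems use learned embeddings).
store = {
    "dosage guidance": (0.9, 0.1, 0.0),
    "site enrollment": (0.1, 0.9, 0.2),
    "adverse events":  (0.8, 0.2, 0.1),
}

def retrieve(query_vec, k=2):
    # Rank documents by similarity; the top-k become LLM context (the "R" in RAG).
    ranked = sorted(store, key=lambda d: cosine(store[d], query_vec), reverse=True)
    return ranked[:k]

print(retrieve((1.0, 0.0, 0.0)))  # ['dosage guidance', 'adverse events']
```

Frameworks such as Langchain wrap exactly this loop, swapping the dictionary for a vector database and the toy vectors for model-generated embeddings.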

Posted 1 month ago

Apply

5 - 10 years

5 - 9 Lacs

Chennai

Work from Office


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Ab Initio
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will collaborate with teams to ensure successful project delivery and application functionality.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for the immediate team and across multiple teams
- Lead and mentor junior professionals
- Conduct regular team meetings to discuss progress and challenges
- Stay updated on industry trends and best practices

Professional & Technical Skills:
- Must-have: Proficiency in Ab Initio
- Strong understanding of ETL processes
- Experience with data integration and data warehousing
- Knowledge of data quality and data governance principles
- Hands-on experience with Ab Initio GDE and EME tools

Additional Information:
- The candidate should have a minimum of 5 years of experience in Ab Initio
- This position is based at our Chennai office
- 15 years of full-time education is required

Posted 1 month ago

Apply

2 - 7 years

10 - 20 Lacs

Pune, Chennai, Bengaluru

Hybrid


Salary: 10-30 LPA. Experience: 2-7 years. Location: Gurgaon/Pune/Bangalore/Chennai. Notice period: Immediate to 30 days.

Key Responsibilities:
- 2+ years of strong hands-on experience in Ab Initio technology.
- Good knowledge of Ab Initio components such as Reformat, Join, Sort, Rollup, Normalize, Scan, Lookup, and MFS, of Ab Initio parallelism, and of products such as Metadata Hub, Conduct>It, Express>It, and Control Center; a clear understanding of concepts like metaprogramming, continuous flows, and PDL is good to have.
- Very good knowledge of data warehousing, SQL, and Unix shell scripting.
- Knowledge of the ETL side of cloud platforms like AWS or Azure, and of the Hadoop platform, is an added advantage.
- Experience working with banking domain data is an added advantage.
- Excellent technical knowledge in the design, development, and validation of complex ETL features using Ab Initio.
- Excellent knowledge of integration with upstream and downstream processes and systems.
- Ensure compliance with technical standards and processes.
- Ability to engage and collaborate with stakeholders to deliver assigned tasks with defined quality goals.
- Can work independently with minimum supervision and help the development team with technical issues.
- Good communication and analytical skills.
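The Ab Initio components named above belong to a proprietary GUI toolkit, but the operations they perform map onto familiar data-processing patterns. As an illustration only (not Ab Initio code), a Rollup, which groups records on a key and aggregates them, can be sketched in plain Python; the `account`/`amount` field names are hypothetical:

```python
from collections import defaultdict

def rollup(records, key_field, amount_field):
    """Group records by key_field and sum amount_field,
    mimicking what an Ab Initio Rollup component does."""
    totals = defaultdict(float)
    for rec in records:
        totals[rec[key_field]] += rec[amount_field]
    return dict(totals)

txns = [
    {"account": "A1", "amount": 100.0},
    {"account": "A2", "amount": 50.0},
    {"account": "A1", "amount": 25.0},
]
print(rollup(txns, "account", "amount"))  # {'A1': 125.0, 'A2': 50.0}
```

A Reformat corresponds to a per-record mapping function and a Join to matching two such record streams on a key; the graph wiring is what Ab Initio's GDE adds on top.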

Posted 1 month ago

Apply

12 - 16 years

35 - 40 Lacs

Bengaluru

Work from Office


As an AWS Data Engineer, you will play a crucial role in the design, development, and maintenance of our data infrastructure. Your work will empower data-driven decision-making and contribute to the success of our data-driven initiatives. You will design and maintain scalable data pipelines using AWS data analytical resources, enabling efficient data processing and analytics.

Key Responsibilities:
- Highly experienced in developing ETL pipelines using AWS Glue and EMR with PySpark/Scala.
- Utilize AWS services (S3, Glue, Lambda, EMR, Step Functions) for data solutions.
- Design scalable data models for analytics and reporting.
- Implement data validation, quality, and governance practices.
- Optimize Spark jobs for cost and performance efficiency.
- Automate ETL workflows with AWS Step Functions and Lambda.
- Collaborate with data scientists and analysts on data needs.
- Maintain documentation for data architecture and pipelines.
- Experience with open-source big data table formats such as Apache Iceberg, Delta Lake, or Apache Hudi.
- Experience provisioning AWS data analytical resources with Terraform is desirable.

Must-Have Skills: AWS (S3, Glue, Lambda, EMR), PySpark or Scala, SQL, ETL development.
Good-to-Have Skills: Snowflake, Cloudera Hadoop (HDFS, Hive, Impala), Iceberg.
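For context on the kind of transform logic such a Glue or EMR job runs: below is a minimal pure-Python sketch of an extract, validate, and transform step. This is not Glue or PySpark code; in a real job the extract would read from S3 (e.g. via a Glue DynamicFrame), and the `id`/`value` fields are hypothetical:

```python
def extract(rows):
    # Stand-in for reading source data; a real Glue job reads from S3.
    return list(rows)

def validate(rows):
    # Basic data-quality gate: drop rows missing required fields.
    return [r for r in rows if r.get("id") is not None and r.get("value") is not None]

def transform(rows):
    # Example business rule: normalise string values to floats.
    return [{"id": r["id"], "value": float(r["value"])} for r in rows]

raw = [{"id": 1, "value": "10"}, {"id": None, "value": "3"}, {"id": 2, "value": "7.5"}]
clean = transform(validate(extract(raw)))
print(clean)  # [{'id': 1, 'value': 10.0}, {'id': 2, 'value': 7.5}]
```

Keeping validation as its own stage, as the posting's "data validation, quality, and governance" bullet suggests, makes bad records observable instead of silently failing the job.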

Posted 1 month ago

Apply

12 - 16 years

35 - 40 Lacs

Chennai

Work from Office


As an AWS Data Engineer, you will play a crucial role in the design, development, and maintenance of our data infrastructure. Your work will empower data-driven decision-making and contribute to the success of our data-driven initiatives. You will design and maintain scalable data pipelines using AWS data analytical resources, enabling efficient data processing and analytics.

Key Responsibilities:
- Highly experienced in developing ETL pipelines using AWS Glue and EMR with PySpark/Scala.
- Utilize AWS services (S3, Glue, Lambda, EMR, Step Functions) for data solutions.
- Design scalable data models for analytics and reporting.
- Implement data validation, quality, and governance practices.
- Optimize Spark jobs for cost and performance efficiency.
- Automate ETL workflows with AWS Step Functions and Lambda.
- Collaborate with data scientists and analysts on data needs.
- Maintain documentation for data architecture and pipelines.
- Experience with open-source big data table formats such as Apache Iceberg, Delta Lake, or Apache Hudi.
- Experience provisioning AWS data analytical resources with Terraform is desirable.

Must-Have Skills: AWS (S3, Glue, Lambda, EMR), PySpark or Scala, SQL, ETL development.
Good-to-Have Skills: Snowflake, Cloudera Hadoop (HDFS, Hive, Impala), Iceberg.

Posted 1 month ago

Apply

12 - 16 years

35 - 40 Lacs

Mumbai

Work from Office


As an AWS Data Engineer, you will play a crucial role in the design, development, and maintenance of our data infrastructure. Your work will empower data-driven decision-making and contribute to the success of our data-driven initiatives. You will design and maintain scalable data pipelines using AWS data analytical resources, enabling efficient data processing and analytics.

Key Responsibilities:
- Highly experienced in developing ETL pipelines using AWS Glue and EMR with PySpark/Scala.
- Utilize AWS services (S3, Glue, Lambda, EMR, Step Functions) for data solutions.
- Design scalable data models for analytics and reporting.
- Implement data validation, quality, and governance practices.
- Optimize Spark jobs for cost and performance efficiency.
- Automate ETL workflows with AWS Step Functions and Lambda.
- Collaborate with data scientists and analysts on data needs.
- Maintain documentation for data architecture and pipelines.
- Experience with open-source big data table formats such as Apache Iceberg, Delta Lake, or Apache Hudi.
- Experience provisioning AWS data analytical resources with Terraform is desirable.

Must-Have Skills: AWS (S3, Glue, Lambda, EMR), PySpark or Scala, SQL, ETL development.
Good-to-Have Skills: Snowflake, Cloudera Hadoop (HDFS, Hive, Impala), Iceberg.

Posted 1 month ago

Apply

12 - 16 years

35 - 40 Lacs

Kolkata

Work from Office


As an AWS Data Engineer, you will play a crucial role in the design, development, and maintenance of our data infrastructure. Your work will empower data-driven decision-making and contribute to the success of our data-driven initiatives. You will design and maintain scalable data pipelines using AWS data analytical resources, enabling efficient data processing and analytics.

Key Responsibilities:
- Highly experienced in developing ETL pipelines using AWS Glue and EMR with PySpark/Scala.
- Utilize AWS services (S3, Glue, Lambda, EMR, Step Functions) for data solutions.
- Design scalable data models for analytics and reporting.
- Implement data validation, quality, and governance practices.
- Optimize Spark jobs for cost and performance efficiency.
- Automate ETL workflows with AWS Step Functions and Lambda.
- Collaborate with data scientists and analysts on data needs.
- Maintain documentation for data architecture and pipelines.
- Experience with open-source big data table formats such as Apache Iceberg, Delta Lake, or Apache Hudi.
- Experience provisioning AWS data analytical resources with Terraform is desirable.

Must-Have Skills: AWS (S3, Glue, Lambda, EMR), PySpark or Scala, SQL, ETL development.
Good-to-Have Skills: Snowflake, Cloudera Hadoop (HDFS, Hive, Impala), Iceberg.

Posted 1 month ago

Apply

5 - 7 years

0 - 0 Lacs

Thiruvananthapuram

Work from Office


Job Summary: We are seeking a highly skilled SAP BODS Data Engineer with strong expertise in ETL development and Enterprise Data Warehousing (EDW). The ideal candidate will have a deep understanding of SAP BusinessObjects Data Services (BODS) and will be responsible for designing, developing, and maintaining robust data integration solutions.

Key Responsibilities:
- Design, develop, and implement efficient ETL solutions using SAP BODS.
- Build and optimize SAP BODS jobs, including job design, data flows, scripting, and debugging.
- Develop and maintain scalable data extraction, transformation, and loading (ETL) processes from diverse data sources.
- Create and manage data integration workflows to ensure high performance and scalability.
- Collaborate closely with data architects, analysts, and business stakeholders to deliver accurate and timely data solutions.
- Ensure data quality and consistency across different systems and platforms.
- Troubleshoot and resolve data-related issues in a timely manner.
- Document all ETL processes and maintain technical documentation.

Required Skills & Qualifications:
- 3+ years of hands-on experience in ETL development using SAP BODS.
- Strong proficiency in SAP BODS job design, data flow creation, scripting, and debugging.
- Solid understanding of data integration, ETL concepts, and data warehousing principles.
- Proficiency in SQL for data querying, data manipulation, and performance tuning.
- Familiarity with data modeling concepts and major database systems (e.g., Oracle, SQL Server, SAP HANA).
- Excellent problem-solving skills and keen attention to detail.
- Strong communication and interpersonal skills to facilitate effective collaboration.
- Ability to work independently, prioritize tasks, and manage multiple tasks in a dynamic environment.

Required Skills: SAP, EDW, ETL
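The SQL proficiency the posting asks for often comes down to staging-to-target aggregation loads of the kind a BODS data flow performs. A small illustrative sketch using Python's built-in sqlite3 (not SAP BODS itself); the table and column names are hypothetical:

```python
import sqlite3

# Hypothetical staging-to-target load: aggregate raw revenue rows
# per customer into a reporting table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging (cust_id INTEGER, revenue REAL)")
conn.executemany(
    "INSERT INTO staging VALUES (?, ?)",
    [(1, 100.0), (1, 50.0), (2, 75.0)],
)
conn.execute(
    """CREATE TABLE target AS
       SELECT cust_id, SUM(revenue) AS total_revenue
       FROM staging
       GROUP BY cust_id"""
)
rows = conn.execute(
    "SELECT cust_id, total_revenue FROM target ORDER BY cust_id"
).fetchall()
print(rows)  # [(1, 150.0), (2, 75.0)]
```

In a real EDW the same aggregation would be pushed down to Oracle, SQL Server, or HANA, where indexing and partitioning drive the performance-tuning work the posting mentions.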

Posted 1 month ago

Apply

8 - 13 years

4 - 8 Lacs

Gurugram

Work from Office


Remote type: On-site. Locations: Gurugram, HR. Time type: Full time. Posted: 5 days ago. Job requisition ID: REQ401285

Senior ETL Developer

What this job involves: Are you comfortable working independently without close supervision? We offer an exciting role where you can enhance your skills and play a crucial part in delivering consistent, high-quality administrative and support tasks for the EPM team. The Senior ETL Developer/SSIS Administrator will lead the design of logical data models for JLL's EPM Landscape system. This role is responsible for implementing physical database structures and constructs, as well as developing operational data stores and data marts. The role entails developing and fine-tuning SQL procedures to enhance system performance. The individual will support functional tasks of medium-to-high technological complexity and build SSIS packages and transformations to meet business needs. This position contributes to maximizing the value of SSIS within the organization and collaborates with cross-functional teams to align data integration solutions with business objectives.

Responsibilities: The Senior ETL Developer will be responsible for:
- Gathering requirements and processing information to design data transformations that will effectively meet end-user needs.
- Designing, developing, and testing ETL processes for large-scale data extraction, transformation, and loading from source systems to the Data Warehouse and Data Marts.
- Creating SSIS packages to clean, prepare, and load data into the data warehouse and transfer data to EPM, ensuring data integrity and consistency throughout the ETL process.
- Monitoring and optimizing ETL performance and data quality.
- Creating routines for importing data using CSV files.
- Mapping disparate data sources (relational DBs, text files, Excel files) onto the target schema.
- Scheduling the packages to extract data at specific time intervals.
- Planning, coordinating, and supporting ETL processes, including architecting table structures, building ETL processes, documentation, and long-term preparedness.
- Extracting complex data from multiple data sources into usable and meaningful reports and analyses by implementing PL/SQL queries.
- Ensuring that the data architecture is scalable and maintainable.
- Troubleshooting data integration and data quality issues and bugs, analyzing reasons for failure, implementing optimal solutions, and revising procedures and documentation as needed.
- Utilizing hands-on SQL features: stored procedures, indexes, partitioning, bulk loads, DB configuration, security/roles, and maintenance.
- Developing queries and procedures, creating custom reports/views, and assisting in debugging.
- Designing SSIS packages and ensuring their stability, reliability, and performance.

Sounds like you? To apply, you need to have:
- 8+ years of experience in Microsoft SQL Server Management Studio administration and development
- A Bachelor's degree or equivalent
- Competency in Microsoft Office and Smart View
- Experience with Microsoft SQL databases and SSIS/SSAS development
- Experience working with Microsoft SSIS to create and deploy packages for ETL processes
- Experience in writing and troubleshooting SQL statements and creating stored procedures, views, and SQL functions
- Experience with data analytics and development
- Strong SQL coding experience, with performance optimization experience for data queries
- Experience creating and supporting SSAS cubes
- Knowledge of Microsoft PowerShell and batch scripting
- Power BI development experience (good to have)
- Strong critical and analytical thinking and problem-solving skills
- Ability to multi-task and thrive in a fast-paced, rapidly changing, and complex environment
- Good written and verbal communication skills
- Ability to learn new skills quickly to make a measurable difference
- Strong team player with proven success in contributing to a team-oriented environment
- Excellent communication (written and oral) and interpersonal skills
- Excellent troubleshooting and problem-resolution skills

What we can do for you: At JLL, we make sure that you become the best version of yourself by helping you realize your full potential in an entrepreneurial and inclusive work environment. We will empower your ambitions through our dedicated Total Rewards Program, competitive pay, and benefits package. Apply today!

Location: On-site, Gurugram, HR. Scheduled Weekly Hours: 40

Jones Lang LaSalle (JLL) is an Equal Opportunity Employer and is committed to working with and providing reasonable accommodations to individuals with disabilities. If you need a reasonable accommodation because of a disability for any part of the employment process, including the online application and/or overall selection process, you may contact us at . This email is only to request an accommodation. Please direct any other general recruiting inquiries to our page > I want to work for JLL.
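One responsibility listed above, importing CSV files and mapping source columns onto a target schema, can be sketched outside SSIS with Python's standard library. The file content, source column names, and target table are all hypothetical:

```python
import csv
import io
import sqlite3

# Stand-in for an SSIS flat-file source: source columns (EmpID, FullName)
# are mapped onto a differently named target schema (emp_id, name).
csv_text = "EmpID,FullName\n101,Alice\n102,Bob\n"

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (emp_id INTEGER, name TEXT)")

reader = csv.DictReader(io.StringIO(csv_text))
conn.executemany(
    "INSERT INTO employees VALUES (?, ?)",
    [(int(row["EmpID"]), row["FullName"]) for row in reader],
)

count = conn.execute("SELECT COUNT(*) FROM employees").fetchone()[0]
print(count)  # 2
```

An SSIS data flow expresses the same source-to-target column mapping graphically, with the scheduling handled by SQL Server Agent rather than in code.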

Posted 1 month ago

Apply

5 - 10 years

4 - 7 Lacs

Lucknow

Work from Office


Work alongside our multidisciplinary team of developers and designers to create the next generation of enterprise software. Support the entire application lifecycle (concept, design, develop, test, release, and support). Special interest in SQL/query and/or MDX/XMLA/OLAP data sources. Responsible for end-to-end product development of Java-based services. Work with other developers to implement best practices, introduce new tools, and improve processes. Stay up to date with new technology trends.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- 15+ years of software development in a professional environment
- Proficiency in Java
- Experience integrating services with relational databases and/or OLAP data sources
- Knowledge of and experience in relational databases, OLAP, and/or query planning
- Strong working knowledge of SQL and/or MDX/XMLA
- Knowledge of and experience creating applications on cloud platforms (Kubernetes, Red Hat OCP)
- Exposure to agile development and continuous integration/continuous delivery (CI/CD) environments with tools such as GitHub, JIRA, Jenkins, etc.
- Other tools: ssh clients, Docker
- Excellent interpersonal and communication skills, with the ability to effectively articulate technical challenges and devise solutions
- Ability to work independently in a large matrix environment

Preferred technical and professional experience:
- The position requires a back-end developer with strong Java skills
- Experience integrating Business Intelligence tools with relational data sources
- Experience integrating Business Intelligence tools with OLAP technologies such as SAP BW and SAP BW/4HANA
- Experience defining relational or OLAP test assets (test suites, automated tests) to ensure high code coverage and tight integration with Business Intelligence tools
- Full lifecycle of SAP BW and BW/4HANA assets: cube upgrades and server support (administering, maintaining, and upgrading using current SAP tooling)

Posted 1 month ago

Apply

12 - 15 years

15 - 17 Lacs

Bengaluru

Work from Office


About The Role

Overview: Technology for today and tomorrow. The Boeing India Engineering & Technology Center (BIETC) is a 5,500+ engineering workforce that contributes to global aerospace growth. Our engineers deliver cutting-edge R&D, innovation, and high-quality engineering work in global markets, and leverage new-age technologies such as AI/ML, IIoT, Cloud, Model-Based Engineering, and Additive Manufacturing, shaping the future of aerospace.

People-driven culture: At Boeing, we believe creativity and innovation thrive when every employee is trusted, empowered, and has the flexibility to choose, grow, learn, and explore. We offer variable arrangements depending upon business and customer needs, and professional pursuits that offer greater flexibility in the way our people work. We also believe that collaboration, frequent team engagements, and face-to-face meetings bring together different perspectives and thoughts, enabling every voice to be heard and every perspective to be respected. No matter where or how our teammates work, we are committed to positively shaping people's careers and being thoughtful about employee wellbeing.

The Boeing India Software Engineering team is currently looking for a Lead Software Engineer (Developer) to join their team in Bengaluru, KA. As an ETL Developer, you will be part of the Application Solutions team, which develops software applications and digital products that create direct value for its customers. We provide revamped work environments focused on delivering data-driven solutions at a rapidly increased pace over traditional development. Be a part of our passionate and motivated team, who are excited to use the latest software technologies for modern web and mobile application development. Through our products we deliver innovative solutions to our global customer base at an accelerated pace.

Position Responsibilities:
- Perform data mining and collection procedures.
- Ensure data quality and integrity; interpret and analyze data problems.
- Visualize data and create reports.
- Experiment with new models and techniques.
- Determine how data can be used to achieve customer/user goals.
- Design data modeling processes.
- Create algorithms and predictive models for analysis; enable development of prediction engines, pattern detection analysis, optimization algorithms, etc.
- Develop guidance for analytics-based wireframes.
- Organize and conduct data assessments.
- Discover insights from structured and unstructured data.
- Estimate user stories/features (story point estimation) and tasks in hours with the required level of accuracy, and commit to them as part of sprint planning.
- Contribute to backlog grooming meetings by promptly asking relevant questions to ensure requirements achieve the right level of DOR.
- Raise any impediments/risks (technical/operational/personal) and approach the Scrum Master/Technical Architect/PO accordingly to arrive at a solution.
- Update the status and the remaining effort for tasks on a daily basis.
- Ensure change requests are treated correctly and tracked in the system, impact analysis is done, and risks/timelines are appropriately communicated.
- Hands-on experience in understanding aerospace domain-specific data.
- Coordinate with data scientists on data preparation and exploration, and on making data ready.
- Have a clear understanding of defining data products and monetizing them.
- Have experience building self-service capabilities for users.
- Build quality checks across the data lineage, with responsibility for designing and implementing different data patterns.
- Influence stakeholders for funding and build the vision of the product in terms of usage, productivity, and scalability of the solutions.
- Build impactful, outcome-based solutions/products.

Basic Qualifications (Required Skills/Experience):
- Bachelor's or Master's degree
- 12-15 years of experience as a data engineer
- Expertise in SQL and Python; knowledge of Java, Oracle, R, data modeling, and Power BI
- Experience in understanding and interacting with multiple data formats
- Ability to rapidly learn and understand software from source code
- Expertise in understanding, analyzing, and optimizing large, complicated SQL statements
- Strong knowledge of and experience in SQL Server, database design, and ETL queries
- Ability to develop software models that simulate real-world problems, helping operational leaders understand which variables to focus on
- Proficiency in streamlining and optimizing databases for efficient and consistent data consumption
- Strong understanding of data warehouse, data lake, and data mesh concepts
- Familiarity with ETL tools and data ingestion patterns
- Hands-on experience building data pipelines using GCP
- Hands-on experience writing complex SQL (NoSQL is a big plus)
- Hands-on experience with data pipeline orchestration tools such as Airflow/GCP Composer
- Hands-on experience with data modeling
- Experience leading diverse teams
- Experience in performance tuning of large data warehouses/data lakes
- Exposure to prompt engineering, LLMs, and vector DBs
- Python, SQL, and PySpark; Spark ecosystem (Spark Core, Spark Streaming, Spark SQL)/Databricks
- Azure (ADF, ADB, Logic Apps, Azure SQL Database, Azure Key Vault, ADLS, Synapse)

Preferred Qualifications (Desired Skills/Experience):
- Pub/Sub, Terraform
- Deep learning (TensorFlow), machine learning, NLP
- Time series; BI/visualization tools: Power BI and Tableau; languages: R/Python

Typical Education & Experience: Education/experience typically acquired through advanced education (e.g., a Bachelor's degree) and typically 12 to 15 years of related work experience, or an equivalent combination of education and experience (e.g., a Master's degree plus 11 years of related work experience, etc.).

Relocation: This position offers relocation within India, based on candidate eligibility.
Export Control Requirements: This is not an Export Control position.
Education: Bachelor's Degree or Equivalent Required
Visa Sponsorship: Employer will not sponsor applicants for employment visa status.
Shift: Not a Shift Worker (India)

Posted 1 month ago

Apply

7 - 12 years

15 - 30 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office


Warm greetings from SP Staffing Services Pvt Ltd!

Experience: 4-13 yrs. Work location: Bangalore/Hyderabad/Chennai/Pune/Kolkata.

- Participates in ETL design of new or changing mappings and workflows with the team and prepares technical specifications.
- Creates ETL mappings, mapplets, workflows, and worklets using Informatica PowerCenter 10.x and prepares corresponding documentation.
- Designs and builds integrations supporting standard data warehousing objects (type-2 dimensions, aggregations, star schema, etc.).
- Performs source system analysis as required.
- Works with DBAs and data architects to plan and implement an appropriate data partitioning strategy in the Enterprise Data Warehouse.
- Implements versioning of the ETL repository and supporting code as necessary.
- Develops stored procedures, database triggers, and SQL queries where needed.
- Implements best practices and tunes SQL code for optimization.
- Loads data from Salesforce (via PowerExchange) to a relational database using Informatica.
- Works with XML, the XML parser, Java, and the HTTP transformation within Informatica.
- Works with Informatica Data Quality (Analyst and Developer).
- Primary skill is Informatica PowerCenter.

Interested candidates, kindly share your updated resume to ramya.r@spstaffing.in or contact 8667784354 (WhatsApp: 9597467601) to proceed further.
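The "type-2 dimensions" mentioned above are slowly changing dimensions that preserve history: when an attribute changes, the current row is closed out and a new current row is inserted. A minimal Python sketch of that logic (an Informatica mapping implements the same idea declaratively); the field names and dates are illustrative:

```python
def scd2_apply(dim, key, new_attrs, load_date):
    """Type-2 SCD update: close the current row for `key` if its
    attributes changed, then insert a new current row."""
    for row in dim:
        if row["key"] == key and row["current"]:
            if row["attrs"] == new_attrs:
                return dim  # no change: keep the existing current row
            row["current"] = False
            row["end_date"] = load_date
    dim.append({"key": key, "attrs": new_attrs,
                "start_date": load_date, "end_date": None, "current": True})
    return dim

dim = [{"key": "C1", "attrs": {"city": "Pune"},
        "start_date": "2024-01-01", "end_date": None, "current": True}]
scd2_apply(dim, "C1", {"city": "Chennai"}, "2024-06-01")
current = [r for r in dim if r["current"]]
print(len(dim), current[0]["attrs"]["city"])  # 2 Chennai
```

Both rows survive, so a fact recorded in March still joins to the Pune version of the customer, which is the point of type-2 handling.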

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies