
272 Datastage Jobs - Page 10


5.0 - 10.0 years

5 - 9 Lacs

Pune

Work from Office


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Ab Initio
Good-to-have skills: Data Warehouse ETL Testing
Minimum experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. Your typical day will involve collaborating with teams to ensure the successful development of applications.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for the immediate team and across multiple teams
- Lead and mentor junior professionals
- Conduct code reviews and ensure best practices are followed

Professional & Technical Skills:
- Must have: proficiency in Ab Initio
- Good to have: experience with Data Warehouse ETL Testing
- Strong understanding of data integration and ETL processes
- Hands-on experience in developing and implementing data pipelines
- Knowledge of data quality and data governance principles

Additional Information:
- The candidate should have a minimum of 5 years of experience in Ab Initio
- This position is based at our Pune office
- 15 years of full-time education is required

Posted 1 month ago


3.0 - 8.0 years

5 - 9 Lacs

Pune

Work from Office


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Ab Initio
Good-to-have skills: NA
Minimum experience: 3 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop innovative solutions and ensure seamless application functionality.

Roles & Responsibilities:
- Expected to perform independently and become an SME
- Active participation and contribution in team discussions
- Contribute to providing solutions to work-related problems
- Develop and implement efficient Ab Initio applications
- Collaborate with team members to troubleshoot and resolve application issues
- Conduct regular code reviews to ensure quality and efficiency
- Stay updated on industry trends and best practices in application development
- Provide mentorship and guidance to junior team members

Professional & Technical Skills:
- Must have: proficiency in Ab Initio
- Strong understanding of ETL processes and data integration
- Experience with data quality and data governance principles
- Knowledge of SQL and database management systems
- Good to have: experience with data modeling and data warehousing

Additional Information:
- The candidate should have a minimum of 3 years of experience in Ab Initio
- This position is based at our Pune office
- 15 years of full-time education is required

Posted 1 month ago


5.0 - 10.0 years

5 - 9 Lacs

Pune

Work from Office


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Ab Initio
Good-to-have skills: NA
Minimum experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop and enhance applications for various business needs.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for the immediate team and across multiple teams
- Lead the application development process
- Implement best practices for application design and development
- Ensure timely delivery of high-quality applications

Professional & Technical Skills:
- Must have: proficiency in Ab Initio, with a minimum of 5 years of experience
- Strong understanding of ETL processes
- Experience with data integration and data warehousing
- Hands-on experience designing and developing applications using Ab Initio
- Knowledge of data modeling and database concepts

Additional Information:
- This position is based at our Pune, Bengaluru and Chennai offices
- 15 years of full-time education is required

Posted 1 month ago


5.0 - 8.0 years

10 - 14 Lacs

Pune

Work from Office


Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: AWS Glue
Good-to-have skills: NA
Minimum experience: 7.5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team in implementing effective solutions. You will also engage in strategic planning sessions to align project goals with organizational objectives, ensuring that all stakeholders are informed and involved in the decision-making process. Your role will require a balance of technical expertise and leadership skills to drive project success and foster a collaborative team environment.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for the immediate team and across multiple teams
- Facilitate knowledge-sharing sessions to enhance team capabilities
- Monitor project progress and implement necessary adjustments to meet deadlines

Professional & Technical Skills:
- Must have: proficiency in AWS Glue
- Strong understanding of data integration and ETL processes
- Experience with cloud computing platforms and services
- Familiarity with data warehousing concepts and best practices
- Ability to troubleshoot and optimize data workflows

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in AWS Glue
- This position is based in Pune
- 15 years of full-time education is required

Posted 1 month ago


0.0 - 5.0 years

10 - 20 Lacs

Bengaluru

Work from Office


Over the past 20 years Amazon has earned the trust of over 300 million customers worldwide by providing unprecedented convenience, selection and value on Amazon.com. By deploying Amazon Pay's products and services, merchants make it easy for these millions of customers to safely purchase from their third-party sites using the information already stored in their Amazon account.

In this role, you will lead Data Engineering efforts to drive automation for the Amazon Pay organization. You will be part of the data engineering team that will envision, build and deliver high-performance, fault-tolerant data pipelines. As a Data Engineer, you will work with cross-functional partners from Science, Product, SDEs, Operations and leadership to translate raw data into actionable insights for stakeholders, empowering them to make data-driven decisions.

Responsibilities:
- Design, implement, and support a platform providing ad-hoc access to large data sets
- Interface with other technology teams to extract, transform, and load data from a wide variety of data sources
- Implement data structures using best practices in data modeling, ETL/ELT processes, and SQL, Redshift, and OLAP technologies
- Model data and metadata for ad-hoc and pre-built reporting
- Interface with business customers, gathering requirements and delivering complete reporting solutions
- Build robust and scalable data integration (ETL) pipelines using SQL, Python and Spark
- Build and deliver high-quality data sets to support business analysts, data scientists, and customer reporting needs
- Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers

Requirements:
- 1+ years of data engineering experience
- Experience with SQL
- Experience with data modeling, warehousing and building ETL pipelines
- Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala)
- Experience with one or more scripting languages (e.g., Python, KornShell)
- Experience with big data technologies such as Hadoop, Hive, Spark, EMR
- Experience with an ETL tool such as Informatica, ODI, SSIS, BODI, DataStage, etc.
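The extract-transform-load pattern this listing keeps referring to can be sketched in a few lines. This is a minimal illustration using only Python's standard-library sqlite3 module; the table names, columns, and cleaning rules are hypothetical, and a production pipeline would run against Redshift or Spark rather than an in-memory database.

```python
import sqlite3

def run_etl(conn):
    """Toy ETL: extract raw rows, normalize them, load a reporting table."""
    cur = conn.cursor()
    # Extract: a hypothetical raw payment-events table
    cur.execute("CREATE TABLE raw_payments (order_id TEXT, amount_cents INTEGER, currency TEXT)")
    cur.executemany("INSERT INTO raw_payments VALUES (?, ?, ?)",
                    [("o1", 1250, "USD"), ("o2", 400, "usd"), ("o3", None, "USD")])
    # Transform: normalize currency codes, drop rows with missing amounts
    cur.execute("SELECT order_id, amount_cents, UPPER(currency) FROM raw_payments "
                "WHERE amount_cents IS NOT NULL")
    cleaned = cur.fetchall()
    # Load: write the cleaned rows to a reporting table
    cur.execute("CREATE TABLE payments (order_id TEXT, amount_cents INTEGER, currency TEXT)")
    cur.executemany("INSERT INTO payments VALUES (?, ?, ?)", cleaned)
    conn.commit()
    return cleaned

conn = sqlite3.connect(":memory:")
rows = run_etl(conn)
print(len(rows))  # → 2 (the row with a NULL amount was dropped)
```

The same three stages scale up directly: swap the SELECT for source-system extracts, the transform for Spark jobs, and the load for COPY into a warehouse.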

Posted 1 month ago


4.0 - 7.0 years

10 - 14 Lacs

Hyderabad

Work from Office


Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities:
- Subject matter expert (SME) in one or more healthcare domains
- Analyzes and documents the client's business requirements and processes, and communicates these requirements by constructing conceptual data and process models, including data dictionaries and functional design documents
- Collaborates with data teams, departments, other IT groups, and technology vendors to define data needs and facilitate analytic solutions
- Provides input into developing and modifying data warehouse and analytic systems to meet client needs, and develops business specifications to support these modifications
- Communicates complex technical and functional concepts verbally and in writing
- Leads socialization and consensus-building efforts for innovative data and analytic solutions
- Identifies opportunities for reuse of data across the enterprise; profiles and validates data sources
- Creates test scenarios and develops test plans to be used in testing the data products, verifying that client requirements are incorporated into the system design
- Assists in analyzing testing results throughout the project
- Participates in architecture and technical reviews to verify that the 'intent of change' is carried out through the entire project
- Performs root cause analysis and application resolution
- Assigns work to and develops less experienced team members
- Ensures proper documentation and on-time delivery of all functional artifacts and deliverables
- Documents and communicates architectural vision, technical strategies, and trade-offs to gain broad buy-in
- Reduces inefficiencies through rationalization and standards adherence
- Responsible for identifying, updating, and curating the data standardization and data quality rules
- Responsible for leading and providing direction for data management, data profiling and source-to-target mapping
- Responsible for optimizing and troubleshooting data engineering processes
- Works independently, but partners effectively with a broader team to design and develop enterprise data solutions
- Creatively takes on new challenges and works outside their comfort zone
- Comfortable communicating alternative ideas with clinicians related to information options and solutions
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications:
- Bachelor's degree (preferably in information technology, engineering, math, computer science, analytics, or another related field)
- Revenue Cycle Management domain experience
- 7+ years in a healthcare data warehouse setting, with experience profiling and analyzing disparate healthcare datasets (financial, clinical quality, value-based care, population health, revenue cycle analytics, health system operations, etc.) and the ability to convert this data into insights
- 7+ years working with healthcare datasets and the ability to convert business requirements into functional designs that are scalable and maintainable
- 7+ years of experience with Oracle/SQL Server databases, including T-SQL, PL/SQL, indexing, partitioning, and performance tuning
- 7+ years of experience creating source-to-target mappings and ETL designs (using SSIS/Informatica/DataStage) for integration of new/modified data streams into the data warehouse/data marts
- 5+ years of experience designing and implementing data models to support analytics, with solid knowledge of dimensional modeling concepts
- Experience with Epic Clarity and/or Caboodle data models
- Healthcare domain knowledge

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.

Posted 1 month ago


3.0 - 8.0 years

8 - 13 Lacs

Noida

Work from Office


Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities:
- Cloud Migration Planning and Execution: Assist in developing and implementing strategies for migrating ETL processes to cloud platforms like Azure; participate in assessing the current infrastructure and creating a detailed migration roadmap
- ETL Development and Optimization: Design, develop, and optimize DataStage ETL jobs for cloud environments; ensure data integrity and performance during the migration process
- Unix Scripting and Automation: Utilize Unix shell scripting to automate data processing tasks and manage ETL workflows; implement and maintain scripts for data extraction, transformation, and loading
- Collaboration and Coordination: Work closely with cloud architects, senior data engineers, and other stakeholders to ensure seamless integration and migration; coordinate with IT security teams to ensure compliance with data privacy and security regulations
- Technical Support and Troubleshooting: Provide technical support during and after the migration to resolve any issues; conduct testing and validation to ensure the accuracy and performance of migrated data
- Documentation and Training: Maintain comprehensive documentation of the migration process, including data mappings, ETL workflows, and system configurations; assist in training team members and end-users on new cloud-based ETL processes and tools
- Performance Monitoring and Optimization: Monitor the performance of ETL processes in the cloud and make necessary adjustments to optimize efficiency; implement best practices for cloud resource management and cost optimization
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications:
- Engineering graduate or equivalent experience
- 3+ years of relevant DataStage development experience
- 2+ years of development/coding experience in Spark/Scala, Python or PySpark
- 1+ years of experience working on Microsoft Azure Databricks
- Relevant experience with databases such as Teradata and Snowflake
- Hands-on development experience in UNIX scripting
- Experience working on data warehousing projects
- Experience with Test-Driven Development and Agile methodologies
- Sound knowledge of SQL programming and SQL query skills
- Proven ability to apply knowledge of principles and techniques to solve technical problems and write code based on technical design
- Proficient in learning and adopting new technologies and using them to execute use cases for business problem solving
- Exposure to job schedulers like Airflow and the ability to create and modify DAGs
- Proven solid communication skills (written and verbal)
- Proven ability to understand an existing application codebase, perform impact analysis and update the code when required based on business logic or for optimization
- Proven exposure to DevOps methodology and creating CI/CD deployment pipelines
- Proven excellent analytical and communication skills (both verbal and written)

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.

Posted 1 month ago


3.0 - 7.0 years

11 - 15 Lacs

Hyderabad

Work from Office


Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities:
- Support the full data engineering lifecycle, including research, proofs of concept, design, development, testing, deployment, and maintenance of data management solutions
- Utilize knowledge of various data management technologies to drive data engineering projects
- Lead data acquisition efforts to gather data from various structured or semi-structured source systems of record to hydrate the client data warehouse and power analytics across numerous healthcare domains
- Leverage a combination of ETL/ELT methodologies to pull complex relational and dimensional data to support loading data marts and reporting aggregates
- Eliminate unwarranted complexity and unneeded interdependencies
- Detect data quality issues, identify root causes, implement fixes, and manage data audits to mitigate data challenges
- Implement, modify, and maintain data integration efforts that improve data efficiency, reliability, and value
- Leverage and facilitate the evolution of best practices for data acquisition, transformation, storage, and aggregation that solve current challenges and reduce the risk of future challenges
- Effectively create data transformations that address business requirements and other constraints
- Partner with the broader analytics organization to make recommendations for changes to data systems and the architecture of data platforms
- Support the implementation of a modern data framework that facilitates business intelligence reporting and advanced analytics
- Prepare high-level design documents and detailed technical design documents with best practices to enable efficient data ingestion, transformation and data movement
- Leverage DevOps tools to enable code versioning and code deployment
- Leverage data pipeline monitoring tools to detect data integrity issues before they result in user-visible outages or data quality issues
- Leverage processes and diagnostic tools to troubleshoot, maintain and optimize solutions and respond to customer and production issues
- Continuously support technical debt reduction, process transformation, and overall optimization
- Leverage and contribute to the evolution of standards for high-quality documentation of data definitions, transformations, and processes to ensure data transparency, governance, and security
- Ensure that all solutions meet the business needs and requirements for security, scalability, and reliability
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications:
- Bachelor's degree (preferably in information technology, engineering, math, computer science, analytics, or another related field)
- 3+ years of experience with Microsoft Azure Cloud, Azure Data Factory, Databricks, Spark, Scala/Python, and ADO
- 5+ years of combined experience in data engineering, ingestion, normalization, transformation, aggregation, structuring, and storage
- 5+ years of combined experience working with industry-standard relational, dimensional or non-relational data storage systems
- 5+ years of experience designing ETL/ELT solutions using tools like Informatica, DataStage, SSIS, PL/SQL, T-SQL, etc.
- 5+ years of experience managing data assets using SQL, Python, Scala, VB.NET or other similar querying/coding languages
- 3+ years of experience working with healthcare data or data to support healthcare organizations

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.
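"Detect data quality issues, identify root causes, and manage data audits" boils down to running declarative rules over records and reporting every violation. Here is a minimal, self-contained sketch of that pattern; the claim records and the two rules are hypothetical examples, not anything specific to Optum's systems.

```python
# Rule-based data-quality audit: each rule is a predicate over one record.
def audit(records, rules):
    """Return (row_index, rule_name) for every rule a record violates."""
    failures = []
    for i, rec in enumerate(records):
        for name, rule in rules.items():
            if not rule(rec):
                failures.append((i, name))
    return failures

# Hypothetical healthcare-claim records with two seeded defects.
claims = [
    {"claim_id": "c1", "amount": 120.0, "state": "NY"},
    {"claim_id": "c2", "amount": -5.0, "state": "CA"},   # negative amount
    {"claim_id": "c3", "amount": 80.0, "state": ""},     # missing state
]

rules = {
    "non_negative_amount": lambda r: r["amount"] >= 0,
    "state_present": lambda r: bool(r["state"]),
}

violations = audit(claims, rules)
print(violations)  # → [(1, 'non_negative_amount'), (2, 'state_present')]
```

In a real pipeline the same audit would run as a post-load step, with failures routed to a quarantine table or an alert rather than a list.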

Posted 1 month ago


5.0 - 8.0 years

7 - 11 Lacs

Gurugram

Work from Office


Role Description: As an Informatica PL/SQL Developer, you will be a key contributor to our client's data integration initiatives. You will be responsible for developing ETL processes, performing database performance tuning, and ensuring the quality and reliability of data solutions. Your experience with PostgreSQL, DBT, and cloud technologies will be highly valuable.

Responsibilities:
- Design, develop, and maintain ETL processes using Informatica and PL/SQL
- Implement ETL processes using DBT with Jinja and automated unit tests
- Develop and maintain data models and schemas
- Ensure adherence to best development practices
- Perform database performance tuning in PostgreSQL
- Optimize SQL queries and stored procedures
- Identify and resolve performance bottlenecks
- Integrate data from various sources, including Kafka/MQ and cloud platforms (Azure)
- Ensure data consistency and accuracy across integrated systems
- Work within an agile environment, participating in all agile ceremonies
- Contribute to sprint planning, daily stand-ups, and retrospectives
- Collaborate with cross-functional teams to deliver high-quality solutions
- Troubleshoot and resolve data integration and database issues
- Provide technical support to stakeholders
- Create and maintain technical documentation for ETL processes and database designs
- Clearly articulate complex technical issues to stakeholders

Qualifications:
- 5 to 8 years of experience as an Informatica PL/SQL Developer or in a similar role
- Hands-on experience with data models and database performance tuning in PostgreSQL
- Experience implementing ETL processes using DBT with Jinja and automated unit tests
- Strong proficiency in PL/SQL and Informatica
- Experience with Kafka/MQ and cloud platforms (Azure)
- Familiarity with ETL processes using DataStage is a plus
- Strong SQL skills

Posted 1 month ago


9.0 - 14.0 years

4 - 7 Lacs

Bengaluru

Work from Office


This role involves the development and application of engineering practice and knowledge in designing, managing and improving the processes for industrial operations, including procurement, supply chain, and facilities engineering and maintenance. Project and change management of industrial transformations are also included in this role.

Grade-specific expectations:
- Focus on Industrial Operations Engineering
- Fully competent in own area
- Acts as a key contributor in a more complex/critical environment
- Proactively acts to understand and anticipate client needs
- Manages costs and profitability for a work area
- Manages own agenda to meet agreed targets
- Develops plans for projects in own area
- Looks beyond the immediate problem to the wider implications
- Acts as a facilitator and coach, and moves teams forward

Posted 1 month ago


3.0 - 11.0 years

17 - 19 Lacs

Pune

Work from Office


Join us as an ETL Test Automation Engineer at Barclays, where you'll spearhead the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences. As part of a team of developers, you will deliver the technology stack, using strong analytical and problem-solving skills to understand the business requirements and deliver quality solutions.

To be successful as an ETL Test Automation Engineer you should have experience with:
- ETL tools: proficiency with enterprise ETL platforms, primarily Ab Initio, as well as Informatica PowerCenter, IBM DataStage, Talend, SSIS, or Matillion
- Scripting/programming: strong skills in Python, Java, or Scala for automation development
- SQL: advanced database querying and optimization techniques across multiple database platforms
- Data warehousing: understanding of dimensional modeling and data warehouse architectures
- Cloud platforms: experience with AWS (Glue, Redshift), Azure (Data Factory, Synapse), or GCP (Dataflow, BigQuery)
- Hadoop ecosystem: hands-on experience with HDFS, MapReduce, YARN, and Spark

Some other highly valued skills include:
- CI/CD pipelines: implementation of automated deployment for ETL processes
- Testing frameworks: data validation and automated testing methodologies, e.g. Pytest
- Workflow orchestration: tools like Airflow, Control-M, Oozie, or Tivoli for process scheduling
- Version control: Git-based workflows for ETL code management

You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based out of Pune.

Purpose of the role: To design, develop, and execute testing strategies to validate functionality, performance, and user experience, while collaborating with cross-functional teams to identify and resolve defects, and continuously improve testing processes and methodologies, to ensure software quality and reliability.

Accountabilities:
- Development and implementation of comprehensive test plans and strategies to validate software functionality and ensure compliance with established quality standards
- Creation and execution of automated test scripts, leveraging testing frameworks and tools to facilitate early detection of defects and quality issues
- Collaboration with cross-functional teams to analyse requirements, participate in design discussions, and contribute to the development of acceptance criteria, ensuring a thorough understanding of the software being tested
- Root cause analysis for identified defects, working closely with developers to provide detailed information and support defect resolution
- Collaboration with peers, participation in code reviews, and promotion of a culture of code quality and knowledge sharing
- Staying informed of industry technology trends and innovations, and actively contributing to the organisation's technology communities to foster a culture of technical excellence and growth

Analyst Expectations: To perform prescribed activities in a timely manner and to a high standard, consistently driving continuous improvement. The role requires in-depth technical knowledge and experience in the assigned area of expertise, and a thorough understanding of the underlying principles and concepts within that area. Analysts lead and supervise a team, guiding and supporting professional development, allocating work requirements and coordinating team resources. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L - Listen and be authentic, E - Energise and inspire, A - Align across the enterprise, D - Develop others. An individual contributor instead develops technical expertise in their work area, acting as an advisor where appropriate, with an impact on the work of related teams within the area. Partner with other functions and business areas. Take responsibility for the end results of a team's operational processing and activities. Escalate breaches of policies/procedures appropriately. Take responsibility for embedding new policies/procedures adopted due to risk mitigation. Advise and influence decision making within your own area of expertise. Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to. Deliver your work and areas of responsibility in line with relevant rules, regulations and codes of conduct. Maintain and continually build an understanding of how your own sub-function integrates with the function, alongside knowledge of the organisation's products, services and processes within the function. Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation's sub-function. Make evaluative judgements based on the analysis of factual information, paying attention to detail. Resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents. Guide and persuade team members and communicate complex or sensitive information. Act as a contact point for stakeholders outside of the immediate function, while building a network of contacts outside the team and external to the organisation.
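The "data validation and automated testing" the listing mentions typically means Pytest-style checks run after each ETL load. A minimal sketch, with hypothetical source/target rows and two common checks (row-count reconciliation and a mandatory-column rule):

```python
# Pytest-style ETL validation checks. In a real suite these test functions
# would live in a test module and be discovered and run by pytest.
def validate_row_counts(source_rows, target_rows):
    """Completeness check: every source row must arrive in the target."""
    return len(source_rows) == len(target_rows)

def validate_not_null(rows, column):
    """Quality check: a mandatory column must never be null."""
    return all(row.get(column) is not None for row in rows)

# Hypothetical extracted (source) and loaded (target) rows.
source = [{"id": 1, "name": "a"}, {"id": 2, "name": "b"}]
target = [{"id": 1, "name": "a"}, {"id": 2, "name": "b"}]

def test_counts_match():
    assert validate_row_counts(source, target)

def test_id_not_null():
    assert validate_not_null(target, "id")

# Invoked directly here so the sketch is runnable without pytest installed.
test_counts_match()
test_id_not_null()
```

Hooked into a CI/CD pipeline, checks like these gate each deployment of an ETL job, which is the automation loop described above.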

Posted 1 month ago

Apply

5.0 - 10.0 years

5 - 12 Lacs

Bengaluru

Work from Office

Naukri logo

Primary Must-Have Skills - Strong experience working with the ETL tool IBM InfoSphere DataStage to develop data pipelines and data warehousing. Strong hands-on experience with Databricks. Strong hands-on experience with SQL and relational databases. Proactive, with strong communication and interpersonal skills to effectively collaborate with team members and stakeholders. Strong understanding of data processing concepts (ETL). The candidate should be prepared to sometimes step outside the developer role to gather and create their own analysis and requirements. Secondary skills required - Experience writing stored procedures in T-SQL. Moderate experience with AWS Cloud services. Ability to write sufficient and comprehensive documentation about data processing flow.
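As a rough illustration of the extract-transform-load pattern this role centres on, the sketch below uses Python with an in-memory SQLite database standing in for DataStage and a warehouse target (the table, field names, and quality rule are invented for illustration, not taken from the posting):

```python
import sqlite3

def run_etl(source_rows):
    """Minimal ETL sketch: extract raw rows, apply transforms, load into a target table."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE customers (id INTEGER, name TEXT, country TEXT)")

    # Transform: normalise names/countries and filter out incomplete records
    cleaned = [
        (row["id"], row["name"].strip().title(), row["country"].upper())
        for row in source_rows
        if row.get("name") and row.get("country")
    ]

    # Load, then verify the row count as a basic reconciliation check
    conn.executemany("INSERT INTO customers VALUES (?, ?, ?)", cleaned)
    conn.commit()
    return conn.execute("SELECT COUNT(*) FROM customers").fetchone()[0]

rows = [
    {"id": 1, "name": " alice ", "country": "in"},
    {"id": 2, "name": "", "country": "us"},   # dropped by the quality filter
    {"id": 3, "name": "bob", "country": "uk"},
]
loaded = run_etl(rows)
```

A real DataStage or Databricks pipeline would replace the in-memory database with stage connectors, but the shape — extract, cleanse, load, reconcile counts — is the same.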

Posted 1 month ago

Apply

8.0 - 15.0 years

20 - 25 Lacs

Pune

Work from Office


Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Associate Director, Tech SME. In this role, you will: Design and engineer software with the customer/user experience as a key objective. Actively contribute to the Technology Engineering Practice by sharing subject matter expertise from your area of specialism, best practice and learnings. Drive adherence to all standards and policies within your area of Technology Engineering. Delivery and support of data-related infrastructure and architecture to optimise data storage and consumption across the bank, including addressing functional and non-functional requirements relevant to data in large applications. Engineer and implement security measures for the protection of internal systems and services. Establish an environment to minimize variation and ensure predictable, high-quality code and data. Assist in team development while holding teams accountable for their commitments, removing roadblocks to their work; leveraging organizational resources to improve capacity for project work; and mentoring and developing team members. Promote empowerment of the team, ensure that each team member is fully engaged in the project and making a meaningful contribution, and encourage a sustainable pace with high levels of quality for the team.
Managing stakeholder communications and helping to implement an effective system of project governance. Requirements To be successful in this role, you should meet the following requirements: Knowledge of DataStage, Oracle and Unix. Experience in Spring Boot APIs, Kubernetes, Postman, GCP. Experience designing and implementing DevOps Continuous Integration / Continuous Delivery (CI/CD) pipelines. Experience supporting middleware/database design, build and troubleshooting in development and production environments. Experience working with monitoring and alerting tools such as, but not limited to, AppDynamics and Splunk. Experience in agile and DevOps environments using team collaboration tools such as GitHub, Confluence and JIRA. Working with Ops and Dev Engineers to ensure operational issues are identified and addressed at all stages of a product or service release/change. Provide support in the identification and resolution of all incidents associated with the IT service. Ensure service resilience, service sustainability and recovery time objectives are met for all the software solutions delivered. Keep up to date and have expertise on current tools, technologies and areas like cyber security and regulations pertaining to aspects like data privacy, consent, data residency etc. that are applicable. Technical leadership of a large team of developers, helping develop team capabilities. Work with senior business stakeholders.

Posted 1 month ago

Apply

6.0 - 8.0 years

1 - 4 Lacs

Chennai

Work from Office


Job Title: Snowflake Developer Experience: 6-8 Years Location: Chennai - Hybrid : 3+ years of experience as a Snowflake Developer or Data Engineer. Strong knowledge of SQL, SnowSQL, and Snowflake schema design. Experience with ETL tools and data pipeline automation. Basic understanding of US healthcare data (claims, eligibility, providers, payers). Experience working with large-scale datasets and cloud platforms (AWS, Azure, GCP). Familiarity with data governance, security, and compliance (HIPAA, HITECH).
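The schema-design skill mentioned above usually means dimensional modelling; a toy star schema can be sketched with stdlib SQLite standing in for Snowflake (the claims/provider/payer tables and figures are hypothetical, loosely echoing the healthcare-data context of the posting):

```python
import sqlite3

# Star-schema sketch: one fact table keyed to two dimension tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_provider (provider_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_payer    (payer_id    INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_claims (
    claim_id    INTEGER PRIMARY KEY,
    provider_id INTEGER REFERENCES dim_provider(provider_id),
    payer_id    INTEGER REFERENCES dim_payer(payer_id),
    amount      REAL
);
INSERT INTO dim_provider VALUES (1, 'Clinic A'), (2, 'Clinic B');
INSERT INTO dim_payer    VALUES (10, 'Payer X');
INSERT INTO fact_claims  VALUES (100, 1, 10, 250.0), (101, 2, 10, 125.5);
""")

# Typical analytical query against the star schema: total claim amount per provider.
totals = dict(conn.execute("""
    SELECT p.name, SUM(f.amount)
    FROM fact_claims f JOIN dim_provider p USING (provider_id)
    GROUP BY p.name
""").fetchall())
```

In Snowflake the same layout would be expressed in SnowSQL DDL; the fact/dimension split is what keeps analytical queries simple and fast.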

Posted 1 month ago

Apply

3.0 - 7.0 years

8 - 13 Lacs

Pune

Work from Office


Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Consultant Specialist. In this role, you will: Be a senior full stack engineer with deep hands-on experience and knowledge in ETL (Extract, Transform, Load) tools e.g. IBM DataStage, SQL, shell scripting, Control-M job development, API development, design patterns, SDLC, IaC tools, testing and site reliability engineering. Define and implement best practices for software development, frameworks, and patterns, including coding standards, code reviews, and testing methodologies. Generalist with the breadth and depth of experience in CI/CD best practices and core experience in one of the following areas: Software Development (i.e. secure coding/SDLC/API development/clean code management), Testing (i.e. TDD/BDD/automated testing/contract testing/API testing), Site Reliability Engineering (i.e.
Release engineering/Observability/Risk management). See a problem or an opportunity with the ability to engineer a solution; be respected for what you deliver, not just what you say; think about the business impact of your work and take a holistic view of problem-solving. Have proven industry experience in developing code and defining processes and standards, with the ability to pick up new technologies and challenges and apply thinking to many problems across multiple technical domains. Contribute to architectural discussions by asking the right questions to ensure a solution matches the business needs. Identify opportunities for system optimization, performance tuning, and scalability enhancements. Implement solutions to improve system efficiency and reliability. Excellent verbal and written communication skills to articulate technical concepts to both technical and non-technical stakeholders. Requirements To be successful in this role, you must meet the following requirements: Is tech-forward in thinking, actively researching new ideas/processes, and is the driving force to adopt them. Ability to work across cultures and all locations in a complex, matrix organization, with proven experience delivering engineering solutions to a banking or financial services organization. Excellent leadership and team management skills, with the ability to motivate and inspire teams to achieve their goals, and the ability to analyze complex technical and business problems and develop effective solutions. Managing operational functions, directing process re-engineering and efficiency exercises. Strong ability to balance risks vs rewards, maximizing the cost-effectiveness and profitability for the business. Experience with Agile methodologies and software development processes. Good to have skills: Knowledge of the latest technologies and tools like Scala, Python, Dataflow, Databricks, Apache Spark SQL, Hadoop, REST APIs, databases, Kafka, and cloud technologies (e.g. GCP, AWS) will be an added advantage.

Posted 1 month ago

Apply

4.0 - 8.0 years

0 - 1 Lacs

Bengaluru

Work from Office


We're Hiring: Sr. Software Engineer – SnapLogic | Bangalore | 4–8 Years Experience Job Title: Sr. Software Engineer – SnapLogic Location: Bangalore Experience: 4–8 Years Client & Budget: Will be discussed during the call Notice Period: Immediate to 30 Days preferred Key Responsibilities -Design and develop SnapLogic pipelines for enterprise data integration -Migrate ETL jobs into SnapLogic and manage platform moderation on AWS -Work closely with cross-functional teams to gather integration requirements -Configure SnapLogic components (snaps, pipelines, transformations) for optimized performance -Ensure data quality and reliability through well-structured ETL processes -Keep up with new SnapLogic features and best practices to enhance platform usage -Collaborate with business stakeholders to deliver long-term, sustainable solutions Required Skills -SnapLogic: 2–4 years of hands-on experience in pipeline development & debugging -ETL Tools: Experience with tools like DataStage, Informatica -Cloud & Data Warehousing: AWS Cloud exposure and hands-on Snowflake experience -Databases: Strong in SQL, PL/SQL, and RDBMS concepts -ETL Best Practices: Data transformation, cleansing, and mapping Bonus: SnapLogic Developer Certification is a big plus! Why Join Us? -Work on cutting-edge integration projects with modern tech stacks -Be part of a collaborative and forward-thinking engineering team -Opportunity to work with enterprise clients and mission-critical data platforms Ready to Apply? Send your CV to [ YourEmail@example.com ] or DM me to learn more.

Posted 1 month ago

Apply

8.0 - 12.0 years

17 - 25 Lacs

Bengaluru

Work from Office


Dear Candidate, We have a job opening for a SnapLogic Developer with one of our clients. If you are interested in this position, please share your updated resume at this email id: shaswati.m@bct-consulting.com Job location: Bangalore Experience: 7-10 Years Job Description Must have hands-on experience (min 6-8 years) in SnapLogic pipeline development with good debugging skills. Experience migrating ETL jobs into SnapLogic, platform moderation, and cloud exposure on AWS. Good to have SnapLogic Developer certification and hands-on experience in Snowflake. Should be strong in SQL, PL/SQL and RDBMS. Should be strong in ETL tools like DataStage, Informatica etc. with data quality. Proficiency in configuring SnapLogic components, including snaps, pipelines, and transformations. Designing and developing data integration pipelines using the SnapLogic platform to connect various systems, applications, and data sources. Building and configuring SnapLogic components such as snaps, pipelines, and transformations to handle data transformation, cleansing, and mapping. Experience in designing, developing and deploying reliable solutions. Ability to work with business partners and provide long-lasting solutions. SnapLogic Integration - Pipeline Development. Staying updated with the latest SnapLogic features, enhancements, and best practices to leverage the platform effectively.

Posted 1 month ago

Apply

8.0 - 10.0 years

15 - 20 Lacs

Gurugram

Work from Office


Position Summary: We are looking for an experienced Microsoft 365 Specialist to join our dynamic team for streamlining enterprise project data. The ideal candidate will possess strong proficiency in Microsoft 365 applications and Generative AI tools, along with extensive knowledge of data governance principles. This role will focus on data aggregation, integration, and the development of a robust data architecture to ensure data integrity and accessibility across multiple digital projects in the organization. The candidate should be capable of acting as a developer to build a future-proof architecture that connects various data storage options in our Digital business groups. This would make our digital projects future-proof and AI-implementation ready with respect to data flow and data quality, and lead to overall operational excellence. A Snapshot of your Day How You'll Make an Impact (responsibilities of role): Utilize the full suite of Microsoft 365 applications to streamline data & workflows across different Digital Projects and segments. Customization of the same (as required) will be needed. Act as a developer to build a future-proof architecture that connects various data storage options, including applications, cloud services, drives, and SharePoint etc.
The designed architecture shall consolidate fragmented data from various sources to create a single, reliable source of truth for accurate reporting and analysis. Integrate and leverage Generative AI tools, such as Co-Pilot, to improve data analysis and reporting capabilities. Implement data governance policies, workflows and practices to ensure data quality, security, and compliance with relevant regulations. Experience in data integration and transformation techniques, including ETL (Extract, Transform, Load) processes, to ensure data consistency and accuracy. Collaborate with stakeholders to identify data needs and ensure accurate reporting and analysis. Ensure data integrity and accessibility across the organization, enabling informed decision-making. Communicate effectively with cross-functional teams and stakeholders to understand data requirements and deliver solutions that meet business needs. Provide training and support to team members on data governance policies, procedures and required operability of Microsoft 365 tools. Keep abreast of new features and capabilities in Microsoft 365 related to data governance. What You Bring Bachelor's/Master's degree in information technology, computer science, or a related field. 8 to 10 years of experience in developing architectures for data governance. Proven experience with Microsoft 365 applications and Generative AI tools, like Co-Pilot. Strong understanding of data governance principles, practices, and policies. Experienced in utilizing a variety of database management systems and data exchange formats to optimize data storage, retrieval, and interoperability. Knowledge of relevant industry regulations and standards. Proficiency in data architecture design. Excellent communication skills, with the ability to convey complex concepts to non-technical stakeholders. Strong problem-solving skills and the ability to work collaboratively in a dynamic team environment across the globe.
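Consolidating fragmented data into a single source of truth, as described above, reduces to merging records by a shared key; a minimal Python sketch (the source names and fields are invented, not from the posting):

```python
def consolidate(*sources):
    """Merge records from several sources by key; later sources override earlier ones."""
    truth = {}
    for source in sources:
        for record in source:
            # setdefault creates the record on first sight; update layers in new fields
            truth.setdefault(record["id"], {}).update(record)
    return truth

# Two hypothetical fragments of the same project data living in different stores
sharepoint = [{"id": "P1", "name": "Project Alpha", "owner": "Team A"}]
cloud_db = [{"id": "P1", "status": "active"}, {"id": "P2", "name": "Project Beta"}]
merged = consolidate(sharepoint, cloud_db)
```

A production architecture would add conflict rules and lineage tracking, but merge-by-key is the core of the single-source-of-truth step.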

Posted 1 month ago

Apply

5.0 - 8.0 years

7 - 11 Lacs

Gurugram

Work from Office


Role Description: As an Informatica PL/SQL Developer, you will be a key contributor to our client's data integration initiatives. You will be responsible for developing ETL processes, performing database performance tuning, and ensuring the quality and reliability of data solutions. Your experience with PostgreSQL, DBT, and cloud technologies will be highly valuable. Responsibilities : - Design, develop, and maintain ETL processes using Informatica and PL/SQL. - Implement ETL processes using DBT with Jinja and automated unit tests. - Develop and maintain data models and schemas. - Ensure adherence to best development practices. - Perform database performance tuning in PostgreSQL. - Optimize SQL queries and stored procedures. - Identify and resolve performance bottlenecks. - Integrate data from various sources, including Kafka/MQ and cloud platforms (Azure). - Ensure data consistency and accuracy across integrated systems. - Work within an agile environment, participating in all agile ceremonies. - Contribute to sprint planning, daily stand-ups, and retrospectives. - Collaborate with cross-functional teams to deliver high-quality solutions. - Troubleshoot and resolve data integration and database issues. - Provide technical support to stakeholders. - Create and maintain technical documentation for ETL processes and database designs. - Clearly articulate complex technical issues to stakeholders. Qualifications : Experience : - 5 to 8 years of experience as an Informatica PL/SQL Developer or similar role. - Hands-on experience with Data Models and DB Performance tuning in PostgreSQL. - Experience in implementing ETL processes using DBT with Jinja and automated Unit Tests. - Strong proficiency in PL/SQL and Informatica. - Experience with Kafka/MQ and cloud platforms (Azure). - Familiarity with ETL processes using DataStage is a plus. - Strong SQL skills.
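The automated unit tests this role calls for are easiest when each ETL transformation is a pure function that can run without a database connection; a minimal Python sketch (the function and field names are hypothetical, not part of the posting):

```python
def transform_order(raw):
    """Pure transformation step: validate, cast, and derive fields from a raw record."""
    if raw.get("order_id") is None or raw.get("amount") is None:
        raise ValueError("missing required field")
    amount = round(float(raw["amount"]), 2)
    return {
        "order_id": int(raw["order_id"]),
        "amount": amount,
        "currency": raw.get("currency", "EUR").upper(),  # assumed default
        "is_large": amount >= 1000,                      # illustrative threshold
    }

# Unit-style check that can run in CI with no database connection
result = transform_order({"order_id": "42", "amount": "999.99", "currency": "usd"})
```

Keeping the transform free of I/O is the same idea DBT unit tests apply to SQL models: feed known inputs, assert on known outputs.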

Posted 1 month ago

Apply

6.0 - 8.0 years

6 - 10 Lacs

Hyderabad

Work from Office


Diverse Lynx is looking for a DataStage Developer to join our dynamic team and embark on a rewarding career journey. Analyzing business requirements and translating them into technical specifications. Designing and implementing data integration solutions using DataStage. Extracting, transforming, and loading data from various sources into target systems. Developing and testing complex data integration workflows, including the use of parallel processing and data quality checks. Collaborating with database administrators, data architects, and stakeholders to ensure the accuracy and consistency of data. Monitoring performance and optimizing DataStage jobs to ensure they run efficiently and meet SLAs. Troubleshooting issues and resolving problems related to data integration. Knowledge of data warehousing, data integration, and data processing concepts. Strong problem-solving skills and the ability to think creatively and critically. Excellent communication and collaboration skills, with the ability to work effectively with technical and non-technical stakeholders.
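The data quality checks mentioned in this posting are often expressed as simple row-level rules evaluated before loading; a minimal Python sketch (the rule names and thresholds are illustrative only):

```python
def quality_report(rows):
    """Count rows failing basic quality rules before they reach the target system."""
    checks = {
        "null_id": lambda r: r.get("id") is None,
        "bad_age": lambda r: not (0 <= r.get("age", -1) <= 120),
        "empty_name": lambda r: not (r.get("name") or "").strip(),
    }
    report = {name: 0 for name in checks}
    for row in rows:
        for name, failed in checks.items():
            if failed(row):
                report[name] += 1
    return report

rows = [
    {"id": 1, "age": 34, "name": "Asha"},
    {"id": None, "age": 200, "name": " "},  # fails all three rules
    {"id": 3, "age": 28, "name": ""},       # fails the name rule only
]
report = quality_report(rows)
```

In DataStage the same rules would typically live in a Transformer stage with reject links, but the per-rule counting is the same idea.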

Posted 1 month ago

Apply

3.0 - 7.0 years

13 - 14 Lacs

Hyderabad

Work from Office


To influence key stakeholders to achieve the best-desired outcome. Responsible for translating detailed designs into robust, scalable and reusable solutions that deliver an exceptional user experience, and for communicating the design and key design decisions to related parties. Carrying out the detailed technical analysis of projects, changes and implementations to production. A desire to find ways to continually improve the service delivered to customers. You will be working in product development & production support as per the project requirements. You may need to work on UK shifts and occasionally on weekends as per project needs. As part of production support, you need to understand and solve issues, or at least be vigilant in escalating tickets within the timelines if unable to solve a ticket or if another team's help is required. Should be able to identify the bottlenecks of the process and automate them wherever feasible. Should be able to proactively identify and resolve issues wherever required. Being self-motivated, focused and able to work efficiently to deadlines is essential. Requirements To be successful in this role, you should meet the following requirements: Must have good knowledge of Oracle Hyperion Financial Management (HFM), Financial Data Management Enterprise Edition (FDMEE), and IT infrastructure architecture design and implementation. Must have good knowledge of and experience with PowerShell scripting, Oracle DB, SQL, Control-M & DataStage. Good knowledge of DevOps tooling like Ansible, G3, CI/CD pipelines. Knowledge of collaboration tools, preferably JIRA and Confluence. Good knowledge of HSBC internal systems like ServiceNow, ICE controls, DMOV, DUSE etc. Good understanding of Incident Management/Change Management/Problem Management. Both spoken and written communication skills, with experience of adapting your style and approach to the audience and message to be delivered. Good to have Python and Django skills.

Posted 1 month ago

Apply

4.0 - 5.0 years

6 - 10 Lacs

Kochi, Bengaluru

Work from Office


4+ yrs experience. Work from Office - 1st preference Kochi, 2nd preference Bangalore. Good experience in any ETL tool. Good knowledge of Python. Integration experience. Good attitude and cross-skilling ability.

Posted 1 month ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Hyderabad

Work from Office


This position is responsible for the design, implementation, and support of MetLife's enterprise data management and integration systems, the underlying infrastructure, and integrations with other enterprise systems and applications using AIX, Linux, or Microsoft technologies. Provide technical expertise in the planning, engineering, design, implementation and support of data management and integration system infrastructures and technologies, including the systems' operational procedures and processes. Partner with the Capacity Management, Production Management, Application Development teams and the business to ensure customer expectations are maintained and exceeded. Participate in the evaluation and recommendation of new products and technologies, and maintain knowledge of emerging technologies for application to the enterprise. Identify and resolve complex data management and integration system issues (Tier 1 support) utilizing product knowledge and structured troubleshooting tools and techniques. Support Disaster Recovery implementation and testing as required. Experience in designing and developing automation/scripting (shell, Perl, PowerShell, Python, Java). Begin tackling organizational impediments. Willing to work in rotational shifts. Good communication skills, with the ability to communicate clearly and effectively. Knowledge, Skills and Abilities Education: Bachelor's degree in computer science, Information Systems, or a related field. Experience: 3+ years of total experience and at least 2+ years of experience in Informatica applications implementation and support of data management and integration system infrastructures and technologies.
Informatica PowerCenter; Operating System knowledge (Linux/Windows/AIX); Azure DevOps pipeline knowledge; Enterprise scheduling knowledge (Maestro); Troubleshooting; Communications; CP4D; DataStage; Mainframe z/OS knowledge; Experience in creating and working on ServiceNow tasks/tickets. Other Requirements (licenses, certifications, specialized training - if required). Working Relationships: Internal Contacts (and purpose of relationship): MetLife internal partners. External Contacts (and purpose of relationship) - If Applicable: MetLife external partners.

Posted 1 month ago

Apply

2.0 - 3.0 years

3 - 8 Lacs

Pimpri-Chinchwad, Pune

Work from Office


Role & responsibilities Develop, implement & fine-tune deep learning AI models for a wide range of Computer Vision applications including object classification, object detection, segmentation, OCR & NLP. Perform data annotation using automation scripts & preprocessing to prepare high-quality datasets for training our in-house deep learning models. Training of AI models with model inference, and evaluate model accuracy & performance metrics using frameworks like PyTorch, TensorFlow. Collaborate with software engineers to deploy models in production systems, ensuring scalability, reliability and efficiency. Develop APIs using FastAPI or similar frameworks to enable seamless integration and interaction with deployed models. Experience in Deep Learning, Computer Vision, NLP with a focus on object classification, detection, segmentation, OCR & text processing. Python, C++. Understanding of deep learning frameworks such as TensorFlow, PyTorch or Keras. Experience with AI model training, hyperparameter tuning, and evaluation on large-scale datasets. Familiarity with data annotation & preprocessing techniques to prepare datasets for deep learning models. Version control systems like Git. Problem-solving skills, communication skills, collaboration skills, team player. AI model deployment & serving, including the use of APIs and frameworks like FastAPI.
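Evaluating model accuracy and performance metrics, as this role requires, comes down to a few counting formulas; a framework-free sketch for binary classification (the sample labels are made up for illustration):

```python
def classification_metrics(y_true, y_pred):
    """Compute accuracy, precision, and recall for binary labels without any ML framework."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    correct = sum(1 for t, p in zip(y_true, y_pred) if t == p)
    return {
        "accuracy": correct / len(y_true),
        "precision": tp / (tp + fp) if tp + fp else 0.0,
        "recall": tp / (tp + fn) if tp + fn else 0.0,
    }

metrics = classification_metrics([1, 0, 1, 1, 0], [1, 0, 0, 1, 1])
```

PyTorch and TensorFlow ship richer metric utilities, but the underlying definitions are exactly these ratios over true/false positives and negatives.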

Posted 1 month ago

Apply

3 - 5 years

3 - 7 Lacs

Chennai

Work from Office


Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients' most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com. Mandatory Skills: SQL Server. Experience: 3-5 Years. Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies