3.0 - 8.0 years
5 - 15 Lacs
Kolkata, Bengaluru, Delhi / NCR
Work from Office
Educational Requirements: Bachelor of Engineering
Service Line: Data & Analytics Unit

Responsibilities
A day in the life of an Infoscion:
• As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution and ensuring high customer satisfaction.
• You will understand requirements, create and review designs, validate the architecture and ensure high levels of service offerings to clients in the technology domain.
• You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, perform code reviews and unit test plan reviews.
• You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management and adherence to organizational guidelines and processes.
• You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
• Knowledge of more than one technology
• Basics of Architecture and Design fundamentals
• Knowledge of Testing tools
• Knowledge of agile methodologies
• Understanding of Project life cycle activities on development and maintenance projects
• Understanding of one or more Estimation methodologies, knowledge of Quality processes
• Basics of the business domain to understand the business requirements
• Analytical abilities, strong technical skills, good communication skills
• Good understanding of the technology and domain
• Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles and modelling methods
• Awareness of latest technologies and trends
• Excellent problem solving, analytical and debugging skills

Technical and Professional Requirements:
Primary skills: Technology->Data Management - Data Integration->DataStage
Preferred Skills: Technology->Data Management - Data Integration->DataStage
Posted 15 hours ago
15.0 - 20.0 years
10 - 14 Lacs
Bengaluru
Work from Office
About The Role
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: IBM InfoSphere DataStage
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education
Role: Technology Support
Job Title: Senior DataStage Consultant
Career Level: 08
Must have skills: Apache, Tomcat, IIS, IBM WebSphere Administration, IBM DataStage, Linux shell scripting
Good to have skills: Teradata, RDBMS and SQL experience in Oracle, DB2

Job Summary
The ETL Developer is responsible for L2-L3 production support, administration of the DataStage application, and designing, building, deploying and maintaining ETL DataStage interfaces using the IBM InfoSphere DataStage ETL development tool.

Key Responsibilities
1. Extend production support in an L2/L3 capacity; efficiently debug and troubleshoot production issues.
2. Evaluate existing data solutions, write scalable ETLs, develop documentation, and train/help team members.
3. Collaborate with business/development teams and infrastructure teams on L3 issues and follow tasks to completion.
4. Participate in and provide support for releases, risks, mitigation plans, and regular DR exercises for project roll-outs.
5. Drive automation and fixes to prevent issues from recurring.
6. Manage Service Level Agreements.
7. Bring continuous improvements to reduce time to resolve production incidents.
8. Perform root cause analysis; identify and implement corrective and preventive measures.
9. Document standards, processes and procedures relating to best practices, issues and resolutions.
10. Constantly upskill with tools & technologies to meet the organization's future needs.
11. Be available on call (on rotation) in a support role.
12. Effectively manage multiple, competing priorities.

Technical Responsibilities:
Excellent understanding of technical concepts. Strong understanding of OS-related dependencies. Strong exposure to shell scripting. Expertise in any cloud and middleware technologies would be a great value add.

Professional Attributes:
Good verbal and written communication skills to connect with customers at varying levels of the organization. Ability to operate independently and make decisions with little direct supervision. Candidate must be willing to cross-skill and upskill based on project and business requirements.

Education Qualification:
A higher-level qualification in a technical subject is desirable; IBM DataStage certification.

Additional Information:
A: Strong written & oral communication skills.
B: Should be open to working in shifts.

Qualification: 15 years full time education
Posted 1 day ago
5.0 - 8.0 years
1 - 2 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Role & responsibilities Datastage Developer Preferred candidate profile Exp..-5-8YRS Location: Hyderabad , Pune & Banglore Notice Period : 0-30Days Interested Candidates share your cv at Muktai.S@alphacom.in
Posted 2 days ago
4.0 - 9.0 years
8 - 18 Lacs
Pune
Work from Office
JD for DataStage
Must have Skills: IBM DataStage, SQL
Good to have: Unix, US Healthcare & Insurance
Detailed Job Description:
1. Relevant experience in designing and development of ETL artefacts using IBM DataStage.
2. Strong business analysis and requirement definition background in Data Integration (ETL).
3. Strong SQL/database concepts.
4. Excellent oral and written communication skills.
5. Experience in establishing criteria for data use/test cases.
Posted 2 days ago
4.0 - 9.0 years
8 - 18 Lacs
Chennai
Work from Office
JD for DataStage
Must have Skills: IBM DataStage, SQL
Good to have: Unix, US Healthcare & Insurance
Detailed Job Description:
1. Relevant experience in designing and development of ETL artefacts using IBM DataStage.
2. Strong business analysis and requirement definition background in Data Integration (ETL).
3. Strong SQL/database concepts.
4. Excellent oral and written communication skills.
5. Experience in establishing criteria for data use/test cases.
Posted 2 days ago
4.0 - 9.0 years
8 - 18 Lacs
Bengaluru
Work from Office
JD for DataStage
Must have Skills: IBM DataStage, SQL
Good to have: Unix, US Healthcare & Insurance
Detailed Job Description:
1. Relevant experience in designing and development of ETL artefacts using IBM DataStage.
2. Strong business analysis and requirement definition background in Data Integration (ETL).
3. Strong SQL/database concepts.
4. Excellent oral and written communication skills.
5. Experience in establishing criteria for data use/test cases.
Posted 2 days ago
6.0 - 10.0 years
12 - 20 Lacs
Nagpur
Work from Office
HCL is hiring for a Business Analyst with IBM DataStage and ETL role.
Location: Nagpur
Experience: 6 to 10 years
Notice Period: Immediate to 60 days
7+ years of experience in business analysis principles and methodologies, with IBM DataStage and ETL concepts. Proficiency in SQL and data modeling techniques. Excellent communication, interpersonal, and presentation skills. Ability to analyze complex data and translate it into actionable insights. Experience with data warehousing concepts and principles. Familiarity with project management methodologies. Knowledge of Agile development methodologies is often preferred.
Posted 2 days ago
7.0 - 9.0 years
20 - 30 Lacs
Chandigarh, Gurugram, Bengaluru
Work from Office
We are seeking an experienced and strategic Informatica Consultant with deep expertise in Informatica PowerCenter, IBM DataStage and the Informatica Intelligent Data Management Cloud (IDMC). The consultant will be responsible for assessing existing on-premises PowerCenter environments, designing robust migration strategies, and leading the hands-on implementation of data integration, quality, and governance solutions on the IDMC platform.

Design scalable, secure, and high-performance data management solutions on IDMC.
Create technical architecture documents, migration roadmaps, and project implementation plans.
Lead and execute the migration of PowerCenter assets (mappings, workflows, sessions) to IDMC's Cloud Data Integration (CDI).
Develop new cloud-native data integration pipelines and services as per client requirements.
Implement and configure various IDMC services, including Data Quality, Data Catalog, Master Data Management (MDM), and Data Governance, to meet business needs.
Champion best practices for cloud data management and guide clients on adopting optimal design patterns within IDMC.
Mentor junior developers and client teams to build their technical capabilities.
Act as the technical lead on projects, providing oversight for development, testing, and deployment activities.
Troubleshoot and resolve complex technical issues throughout the project lifecycle.
Communicate effectively with business stakeholders, IT leadership, and technical teams to ensure alignment and report on project progress, risks, and outcomes.
Hands-on experience designing and developing data pipelines, jobs and sequences in IBM DataStage.
Expert-level SQL skills and comprehensive experience with major relational databases (e.g., Oracle, SQL Server, Snowflake, Teradata).
Posted 2 days ago
4.0 - 7.0 years
0 - 2 Lacs
Gurugram
Work from Office
Teradata Developer: Elevate Your Impact Through Innovation and Learning

Evalueserve is a global leader in delivering innovative and sustainable solutions to a diverse range of clients, including over 30% of Fortune 500 companies. With a presence in more than 45 countries across five continents, we excel in leveraging state-of-the-art technology, artificial intelligence, and unparalleled subject matter expertise to elevate our clients' business impact and strategic decision-making. Our team of over 4,500 talented professionals operates in countries such as India, China, Chile, Romania, the US, and Canada. Our global network also extends to emerging markets like Colombia, the Middle East, and the rest of Asia-Pacific. Recognized by Great Place to Work in India, Chile, Romania, the US, and the UK in 2022, we offer a dynamic, growth-oriented, and meritocracy-based culture that prioritizes continuous learning, skill development and work-life balance.

About Data Analytics
Data Analytics is one of the highest-growth practices within Evalueserve, providing rewarding career opportunities. Established in 2014, the global DA team extends beyond 1000+ (and growing) data science professionals across data engineering, business intelligence, digital marketing, advanced analytics, technology, and product engineering. Our more tenured teammates, some of whom have been with Evalueserve since it started more than 20 years ago, have enjoyed leadership opportunities in different regions of the world across our seven business lines.

What you will be doing at Evalueserve
Design, develop, and optimize ETL workflows using IBM DataStage and Teradata.
Collaborate with cross-functional teams to gather requirements and deliver scalable data solutions.
Lead data integration efforts and ensure high data quality across systems.
Perform performance tuning and troubleshooting of Teradata queries and ETL jobs.
Guide and mentor junior developers, ensuring adherence to best practices.
Participate in solution architecture, code reviews, and stakeholder presentations.

What we're looking for
5-8 years of experience in data engineering.
Teradata SQL: advanced query writing, performance optimization.
IBM DataStage: ETL design, job orchestration, debugging.
Unix/Linux shell scripting for job automation and scheduling.
SQL & PL/SQL: strong command over relational database concepts.
Data Warehousing: dimensional modeling, star/snowflake schemas.
Version control tools: Git, Bitbucket.
Scheduling tools: Control-M, Autosys (preferred).
Big data platforms: exposure to Cloudera or any Hadoop system.
Data Governance & Quality: understanding of metadata management and data lineage.
Agile methodologies: experience working in Agile/Scrum environments.
Strong technical skills in IBM DataStage, Teradata, Unix shell scripting, SQL (Teradata/Hadoop) and Control-M scheduling tools.
Design and develop complex cloud-based data solutions for streaming and batch data processing.

Follow us on https://www.linkedin.com/company/evalueserve/
Learn more about our leaders' achievements: an AI-powered supply chain optimization solution built on Google Cloud, how Evalueserve is leveraging NVIDIA NIM to enhance our AI and digital transformation solutions and accelerate AI capabilities, and how Evalueserve has climbed 16 places on the 50 Best Firms for Data Scientists in 2024. Want to learn more about our culture and what it's like to work with us?
Write to us at: careers@evalueserve.com

Disclaimer: The following job description serves as an informative reference for the tasks you may be required to perform. However, it does not constitute an integral component of your employment agreement and is subject to periodic modifications to align with evolving circumstances.

Please Note: We appreciate the accuracy and authenticity of the information you provide, as it plays a key role in your candidacy. As part of the Background Verification Process, we verify your employment, education, and personal details. Please ensure all information is factual and submitted on time. For any assistance, your TA SPOC is available to support you.
Posted 3 days ago
5.0 - 8.0 years
12 - 19 Lacs
Hyderabad
Hybrid
Core Skills
Strong hands-on experience with IBM DataStage (ETL tool): parallel, server, and sequence jobs.
Proficiency in ETL design, development, and optimization.
Strong knowledge of SQL/PL-SQL for data extraction, transformation, and loading.
Experience with Unix/Linux shell scripting.
Knowledge of Data Warehousing concepts (star schema, snowflake schema, slowly changing dimensions, fact/dimension tables).

Additional Skills
Experience with job scheduling tools (Control-M, Autosys, or equivalent).
Exposure to big data platforms (Hadoop, Spark) is a plus.
Familiarity with cloud platforms (AWS, Azure, GCP) for ETL migration/implementation.
Strong debugging, performance tuning, and error handling skills.
Ability to work in agile methodology and collaborate with business/QA teams.
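As a rough illustration of the slowly-changing-dimension concept named in the data warehousing skills above, here is a minimal, hypothetical Python/pandas sketch of a Type 2 merge. In an actual DataStage job this logic would usually sit in a Change Capture or SCD stage (or a SQL MERGE); all table and column names below are invented for the example.

```python
# Illustrative sketch only: a minimal SCD Type 2 merge using pandas.
# Column names (cust_id, city, effective_from, effective_to, is_current) are hypothetical.
from datetime import date
import pandas as pd

def scd2_merge(dim: pd.DataFrame, src: pd.DataFrame, key: str, attrs: list[str]) -> pd.DataFrame:
    """Expire changed dimension rows and append new current versions."""
    today = date.today()
    current = dim[dim["is_current"]]
    merged = current.merge(src, on=key, suffixes=("_dim", "_src"))
    changed_keys = merged[
        (merged[[f"{a}_dim" for a in attrs]].values != merged[[f"{a}_src" for a in attrs]].values).any(axis=1)
    ][key]

    # Close out the old versions of records whose tracked attributes changed.
    dim.loc[dim[key].isin(changed_keys) & dim["is_current"], ["effective_to", "is_current"]] = [today, False]

    # Insert fresh current rows for changed and brand-new keys.
    new_keys = set(src[key]) - set(current[key])
    to_insert = src[src[key].isin(changed_keys) | src[key].isin(new_keys)].copy()
    to_insert["effective_from"], to_insert["effective_to"], to_insert["is_current"] = today, None, True
    return pd.concat([dim, to_insert], ignore_index=True)

if __name__ == "__main__":
    dim = pd.DataFrame({"cust_id": [1], "city": ["Pune"], "effective_from": [date(2020, 1, 1)],
                        "effective_to": [None], "is_current": [True]})
    src = pd.DataFrame({"cust_id": [1, 2], "city": ["Hyderabad", "Chennai"]})
    print(scd2_merge(dim, src, key="cust_id", attrs=["city"]))
```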
Posted 3 days ago
6.0 - 11.0 years
15 - 30 Lacs
Hyderabad, Pune
Hybrid
Experience: 5 to 12 years
Location: Pune & Hyderabad
Job description:
Data Warehousing ETL with experience in DataStage, SQL, UNIX.
Experience in creating and compiling jobs using DataStage Designer, viewing job status and job logs, scheduling jobs, and monitoring using DataStage Director.
Expertise in data warehousing/ETL programming and fulfillment of data warehouse project tasks such as data extraction, cleansing, aggregating, validations, transforming and loading.
Good knowledge of database commands (DDL and DML) and data warehousing concepts.
Interested candidates, share your CV at himani.girnar@alikethoughts.com with the details below:
Candidate's name-
Email and Alternate Email ID-
Contact and Alternate Contact no-
Total exp-
Relevant experience-
Current Org-
Notice period-
CCTC-
ECTC-
Current Location-
Preferred Location-
Pancard No-
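For context on the Director tasks this listing mentions (running jobs, checking job status and logs, scheduling), operations teams often script the same actions with the dsjob command-line client instead of the Director GUI. The sketch below is illustrative only: the project and job names are placeholders, and the exact dsjob options should be confirmed against the DataStage release actually in use.

```python
# Illustrative sketch only: driving DataStage jobs with the dsjob CLI from Python.
# PROJECT and JOB are hypothetical names; verify dsjob flags on your installation.
import subprocess

PROJECT = "DW_PROJECT"   # hypothetical project name
JOB = "LoadCustomerDim"  # hypothetical job name

def dsjob(*args: str) -> str:
    """Run one dsjob command and return its stdout, raising on a non-zero exit."""
    result = subprocess.run(["dsjob", *args], capture_output=True, text=True, check=True)
    return result.stdout

if __name__ == "__main__":
    # Kick off the job and wait for it to finish.
    print(dsjob("-run", "-wait", PROJECT, JOB))
    # Inspect the last run's status and a summary of its log.
    print(dsjob("-jobinfo", PROJECT, JOB))
    print(dsjob("-logsum", PROJECT, JOB))
```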
Posted 3 days ago
3.0 - 6.0 years
6 - 14 Lacs
Pune, Chennai, Bengaluru
Work from Office
A day in the life of an Infoscion:
• As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution and ensuring high customer satisfaction.
• You will understand requirements, create and review designs, validate the architecture and ensure high levels of service offerings to clients in the technology domain.
• You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, perform code reviews and unit test plan reviews.
• You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management and adherence to organizational guidelines and processes.
• You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
• Knowledge of more than one technology
• Basics of Architecture and Design fundamentals
• Knowledge of Testing tools
• Knowledge of agile methodologies
• Understanding of Project life cycle activities on development and maintenance projects
• Understanding of one or more Estimation methodologies, knowledge of Quality processes
• Basics of the business domain to understand the business requirements
• Analytical abilities, strong technical skills, good communication skills
• Good understanding of the technology and domain
• Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles and modelling methods
• Awareness of latest technologies and trends
• Excellent problem solving, analytical and debugging skills

Technical and Professional Requirements:
Primary skills: Technology->Data Management - Data Integration->DataStage
Preferred Skills: Technology->Data Management - Data Integration->DataStage
Posted 4 weeks ago
6.0 - 11.0 years
7 - 11 Lacs
Bengaluru
Work from Office
Capco, a Wipro company, is a global technology and management consulting firm. Capco was awarded Consultancy of the Year in the British Bank Awards and has been ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With our presence across 32 cities across the globe, we support 100+ clients across the banking, financial and energy sectors. We are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO
You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers and other key players in the industry, projects that will transform the financial services industry.

MAKE AN IMPACT
Innovative thinking, delivery excellence and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.

#BEYOURSELFATWORK
Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.

CAREER ADVANCEMENT
With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION
We believe that diversity of people and perspective gives us a competitive advantage.

Location: Bangalore

Skills and Qualifications:
At least 6+ years of relevant experience would generally be expected to find the skills required for this role.
6+ years of being a practitioner in data engineering or a related field.
Strong programming skills in Python, with experience in data manipulation and analysis libraries (e.g., Pandas, NumPy, Dask).
Proficiency in SQL and experience with relational databases (e.g., Sybase, DB2, Snowflake, PostgreSQL, SQL Server).
Experience with data warehousing concepts and technologies (e.g., dimensional modeling, star schema, data vault modeling, Kimball methodology, Inmon methodology, data lake design).
Familiarity with ETL/ELT processes and tools (e.g., Informatica PowerCenter, IBM DataStage, Ab Initio) and open-source frameworks for data transformation (e.g., Apache Spark, Apache Airflow).
Experience with message queues and streaming platforms (e.g., Kafka, RabbitMQ).
Experience with version control systems (e.g., Git).
Experience using Jupyter notebooks for data exploration, analysis, and visualization.
Excellent communication and collaboration skills.
Ability to work independently and as part of a geographically distributed team.

Nice to have:
Understanding of any cloud-based application development & DevOps.
Understanding of business intelligence tools: Tableau, Power BI.
Understanding of the trade lifecycle / financial markets.

If you are keen to join us, you will be part of an organization that values your contributions, recognizes your potential, and provides ample opportunities for growth. For more information, visit www.capco.com. Follow us on Twitter, Facebook, LinkedIn, and YouTube.
Posted 4 weeks ago
5.0 - 9.0 years
10 - 15 Lacs
Noida
Work from Office
ETL (IBM DataStage), Shell/Python scripting, database expertise, DevOps CI/CD experience.

Mandatory Competencies
Beh - Communication and collaboration
ETL - ETL - Data Stage
DevOps/Configuration Mgmt - DevOps/Configuration Mgmt - Basic Bash/Shell script writing
Data Science and Machine Learning - Data Science and Machine Learning - Python
Cloud - Azure - Azure Data Factory (ADF), Azure Databricks, Azure Data Lake Storage, Event Hubs, HDInsight
Development Tools and Management - Development Tools and Management - CI/CD
Database - Oracle - PL/SQL Packages
Database - Sql Server - SQL Packages
Posted 1 month ago
4.0 - 8.0 years
20 - 27 Lacs
Chennai
Hybrid
Key Responsibilities
Design, develop, and maintain ETL pipelines using IBM DataStage (CP4D) and AWS Glue/Lambda for ingestion from varied sources such as flat files, APIs, Oracle, DB2, etc.
Build and optimize data flows for loading curated datasets into Snowflake, leveraging best practices for schema design, partitioning, and transformation logic.
Participate in code reviews, performance tuning, and defect triage sessions.
Work closely with data governance teams to ensure lineage, privacy tagging, and quality controls are embedded within pipelines.
Contribute to CI/CD integration of ETL components using Git, Jenkins, and parameterized job configurations.
Troubleshoot and resolve issues in QA/UAT/Production environments as needed.
Adhere to agile delivery practices, sprint planning, and documentation requirements.

Required Skills and Experience
4+ years of experience in ETL development, with at least 1-2 years in IBM DataStage (preferably the CP4D version).
Hands-on experience with AWS Glue (PySpark or Spark) and AWS Lambda for event-based processing.
Experience working with Snowflake: loading strategies, streams and tasks, zero-copy cloning, and performance tuning.
Proficiency in SQL, Unix scripting, and basic Python for data handling or automation.
Familiarity with S3, version control systems (Git), and job orchestration tools.
Experience with data profiling, cleansing, and quality validation routines.
Understanding of data lake/data warehouse architectures and DevOps practices.
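Since this listing pairs DataStage CP4D and AWS Glue/Lambda with Snowflake loading, a common hand-off point is a COPY INTO from an external stage once curated files land in S3. Below is a minimal, hypothetical Python sketch of that step using the Snowflake Python connector; the account, stage, table and warehouse names are invented for illustration and credentials would normally come from a secrets manager rather than environment variables.

```python
# Illustrative sketch only: loading a curated Parquet file from an S3 stage into
# Snowflake with COPY INTO. Stage (@curated_stage), table (curated.orders),
# warehouse and database names are hypothetical.
import os
import snowflake.connector

COPY_SQL = """
COPY INTO curated.orders
FROM @curated_stage/orders/2024-06-01/
FILE_FORMAT = (TYPE = PARQUET)
MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
ON_ERROR = ABORT_STATEMENT
"""

def load_orders() -> None:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ETL_WH",      # hypothetical warehouse
        database="ANALYTICS",    # hypothetical database
        schema="CURATED",
    )
    try:
        with conn.cursor() as cur:
            cur.execute(COPY_SQL)
            # Each result row summarises one staged file: name, status, rows loaded, errors.
            for row in cur.fetchall():
                print(row)
    finally:
        conn.close()

if __name__ == "__main__":
    load_orders()
```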
Posted 1 month ago
15.0 - 20.0 years
10 - 14 Lacs
Bengaluru
Work from Office
About The Role
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: IBM InfoSphere DataStage
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education
Role: Technology Support
Job Title: Senior DataStage Consultant
Career Level: 08
Must have skills: Apache, Tomcat, IIS, IBM WebSphere Administration, IBM DataStage, Linux shell scripting
Good to have skills: Teradata, RDBMS and SQL experience in Oracle, DB2

Job Summary
The ETL Developer is responsible for L2-L3 production support, administration of the DataStage application, and designing, building, deploying and maintaining ETL DataStage interfaces using the IBM InfoSphere DataStage ETL development tool.

Key Responsibilities
1. Extend production support in an L2/L3 capacity; efficiently debug and troubleshoot production issues.
2. Evaluate existing data solutions, write scalable ETLs, develop documentation, and train/help team members.
3. Collaborate with business/development teams and infrastructure teams on L3 issues and follow tasks to completion.
4. Participate in and provide support for releases, risks, mitigation plans, and regular DR exercises for project roll-outs.
5. Drive automation and fixes to prevent issues from recurring.
6. Manage Service Level Agreements.
7. Bring continuous improvements to reduce time to resolve production incidents.
8. Perform root cause analysis; identify and implement corrective and preventive measures.
9. Document standards, processes and procedures relating to best practices, issues and resolutions.
10. Constantly upskill with tools & technologies to meet the organization's future needs.
11. Be available on call (on rotation) in a support role.
12. Effectively manage multiple, competing priorities.

Technical Responsibilities:
Excellent understanding of technical concepts. Strong understanding of OS-related dependencies. Strong exposure to shell scripting. Expertise in any cloud and middleware technologies would be a great value add.

Professional Attributes:
Good verbal and written communication skills to connect with customers at varying levels of the organization. Ability to operate independently and make decisions with little direct supervision. Candidate must be willing to cross-skill and upskill based on project and business requirements.

Education Qualification:
A higher-level qualification in a technical subject is desirable; IBM DataStage certification.

Additional Information:
A: Strong written & oral communication skills.
B: Should be open to working in shifts.

Qualification: 15 years full time education
Posted 1 month ago
7.0 - 11.0 years
8 - 17 Lacs
Chennai
Hybrid
Position Description: We're looking for a skilled and experienced DataStage Developer who isn't just great at building ETL pipelines but also has hands-on experience tackling the challenges of migrating these processes to a modern platform like CP4D. As a DataStage Developer on our team, you'll be involved in a mix of maintaining our current DataStage environment and spearheading the migration effort to CP4D. Here's a bit more detail:

Developing and Maintaining ETL: Design, develop, test, deploy, and support robust and efficient ETL jobs using IBM DataStage. These jobs will move critical data for our Collections operations from various source systems into our data warehouse and analytical platforms.

Focusing on Data for Collections: Work closely with our business partners in Collections to understand their data needs, reporting requirements, and process flows. This means dealing with data related to customer accounts, payments, delinquencies, recovery efforts, and compliance.

Leading the CP4D Migration: This is a key part of the role. You will actively participate in and potentially lead the technical aspects of migrating existing DataStage jobs and workflows to the IBM Cloud Pak for Data platform. This includes:
Analyzing existing DataStage jobs to understand their logic and dependencies.
Re-platforming or refactoring these jobs within the CP4D DataStage service.
Identifying and addressing technical challenges specific to the migration.
Ensuring data integrity and accuracy throughout the migration process.

Skills Required:
ETL experience required: 7+ years of hands-on IBM DataStage; direct CP4D migration experience.
5+ years of experience in SQL; strong skills in writing complex SQL queries.
Education required: Bachelor's Degree
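Because the migration work described above starts with analyzing existing DataStage jobs and their dependencies, a common first step is simply inventorying what has been exported. The following Python sketch is a hypothetical illustration, not an IBM-provided tool: it assumes the classic .dsx export format, in which each job section opens with a "BEGIN DSJOB" line followed by an Identifier line, and the export directory path is a placeholder.

```python
# Illustrative sketch only: building a quick inventory of DataStage jobs from a
# folder of exported .dsx files, as an early scoping step for a CP4D migration.
# Assumption: each BEGIN DSJOB section is immediately followed by an
# Identifier "<job name>" line; verify against your own exports.
import re
from pathlib import Path

JOB_HEADER = re.compile(r'BEGIN DSJOB\s*\n\s*Identifier\s+"([^"]+)"')

def inventory_jobs(export_dir: str) -> dict[str, list[str]]:
    """Map each .dsx file name to the job names found inside it."""
    jobs_by_file: dict[str, list[str]] = {}
    for dsx in sorted(Path(export_dir).glob("*.dsx")):
        text = dsx.read_text(errors="replace")
        jobs_by_file[dsx.name] = JOB_HEADER.findall(text)
    return jobs_by_file

if __name__ == "__main__":
    # "exports/datastage" is a placeholder for wherever the .dsx exports live.
    for filename, jobs in inventory_jobs("exports/datastage").items():
        print(f"{filename}: {len(jobs)} job(s)")
        for name in jobs:
            print(f"  - {name}")
```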
Posted 1 month ago
3.0 - 8.0 years
5 - 10 Lacs
Pune
Work from Office
Provide expertise in analysis, requirements gathering, design, coordination, customization, testing and support of reports in the client's environment.
Develop and maintain a strong working relationship with business and technical members of the team.
Relentless focus on quality and continuous improvement.
Perform root cause analysis of report issues.
Development/evolutionary maintenance of the environment, performance, capability and availability.
Assist in defining technical requirements and developing solutions.
Effective content and source-code management, troubleshooting and debugging.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
Cognos Developer & Admin required.
Education: The resource should hold a full-time MCA/M.Tech/B.Tech/B.E. degree and should preferably have relevant certifications.
Experience: The resource should have a minimum of 3 years of experience working in BI/DW projects in areas pertaining to reporting and visualization using Cognos, and should have worked on at least two projects involving developing reporting/visualization. A good understanding of UNIX is expected. Should be well conversant in English and have excellent writing, MIS, communication, time management and multi-tasking skills.

Preferred technical and professional experience
Experience with various cloud and integration platforms (e.g. AWS, Google, Azure).
Agile mindset: ability to process changes of priorities and requests, ownership, critical thinking.
Experience with an ETL/Data Integration tool (e.g. IBM InfoSphere DataStage, Azure Data Factory, Informatica PowerCenter).
Posted 1 month ago
3.0 - 8.0 years
0 - 1 Lacs
Hyderabad
Work from Office
Key Responsibilities
1. Incident Management
Monitor production systems for issues and respond promptly to incidents.
Log, categorize, and prioritize incidents for resolution.
Collaborate with development teams to address and resolve issues.
Communicate with stakeholders regarding incident status and resolution timelines.
2. Root Cause Analysis (RCA)
Conduct thorough investigations to identify the underlying causes of recurring issues.
Implement long-term solutions to prevent future occurrences.
Document findings and share insights with relevant teams.
3. System Monitoring & Performance Optimization
Utilize monitoring tools to track system health and performance.
Identify and address performance bottlenecks or capacity issues.
Ensure systems meet performance benchmarks and service level agreements (SLAs).
4. Release Management & Application Maintenance
Assist with the deployment of software updates, patches, and new releases.
Ensure smooth transitions from development to production environments.
Coordinate with cross-functional teams to minimize disruptions during releases.
5. User Support & Troubleshooting
Provide end-user support for technical issues.
Investigate user-reported problems and offer solutions or workarounds.
Maintain clear communication with users regarding issue status and resolution.
6. Documentation & Knowledge Sharing
Maintain detailed records of incidents, resolutions, and system configurations.
Create and update operational runbooks, FAQs, and knowledge base articles.
Share knowledge with team members to improve overall support capabilities.

Essential Tools & Technologies
Monitoring & Alerting: Nagios, Datadog, New Relic
Log Management & Analysis: Splunk, Elasticsearch, Graylog
Version Control: Git, SVN
Ticketing Systems: JIRA, ServiceNow
Automation & Scripting: Python, shell scripting
Database Management: SQL, Oracle, MySQL

Skills & Competencies
Technical Skills
Proficiency in system monitoring and troubleshooting.
Strong understanding of application performance metrics.
Experience with database management and query optimization.
Familiarity with cloud platforms and infrastructure.
Soft Skills
Analytical Thinking: Ability to diagnose complex issues and develop effective solutions.
Communication: Clear and concise communication with stakeholders at all levels.
Teamwork: Collaborative approach to problem-solving and knowledge sharing.
Adaptability: Flexibility to handle changing priorities and technologies.
Posted 1 month ago
5.0 - 10.0 years
15 - 25 Lacs
Hyderabad
Work from Office
DataStage Developer
Key Skills:
IBM DataStage
ETL Development
SQL, PL/SQL
Data Warehousing
Performance Tuning
Unix/Linux scripting (preferred)
Location: Hyderabad
Experience: 5 to 12 years
Posted 1 month ago
4.0 - 9.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Project Role: Software Development Engineer
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must have skills: IBM InfoSphere DataStage
Good to have skills: Snowflake Data Warehouse
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Software Development Engineer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code for multiple clients. Your typical day will involve collaborating with team members to ensure the successful execution of projects, performing maintenance and enhancements, and contributing to the development of innovative solutions that meet client needs. You will be responsible for delivering high-quality code while adhering to best practices and standards in software development.

Roles & Responsibilities:
- Expected to be an SME; collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Mentor junior professionals to foster their growth and development.

Professional & Technical Skills:
- Must Have Skills: Proficiency in IBM InfoSphere DataStage.
- Must Have Skills: Experience with Snowflake Data Warehouse.
- Strong understanding of ETL processes and data integration techniques.
- Minimum 4 years of experience with database management and SQL.
- Familiarity with data warehousing concepts and best practices.

Additional Information:
- The candidate should have a minimum of 7 years of experience in IBM InfoSphere DataStage & Snowflake.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualification: 15 years full time education
Posted 1 month ago
15.0 - 20.0 years
17 - 22 Lacs
Mumbai
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: IBM InfoSphere DataStage
Good to have skills: Informatica MDM
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are optimized for performance and usability. You will also engage in problem-solving activities, providing support and enhancements to existing applications while ensuring that all development aligns with best practices and organizational standards.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Continuously evaluate and improve application performance and user experience.

Professional & Technical Skills:
- Must Have Skills: Proficiency in IBM InfoSphere DataStage.
- Good To Have Skills: Experience with Informatica MDM.
- Strong understanding of data integration and ETL processes.
- Experience with database management and SQL.
- Familiarity with data warehousing concepts and practices.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in IBM InfoSphere DataStage.
- This position is based in Mumbai.
- A 15 years full time education is required.

Collaborate with key stakeholders to develop and enhance MDM solutions using IBM DataStage, InfoSphere, and equivalent MDM platforms. Perform ETL (Extract, Transform, Load) processes, ensuring data accuracy and consistency. Design and implement data models, workflows, and integration pipelines. Support data governance efforts by adhering to established policies and standards. Troubleshoot and resolve technical issues related to MDM systems. Contribute to documentation and knowledge sharing within the team. Translate pharmaceutical-specific master data requirements (customer, product, brand) into technical specifications. Design and oversee the delivery of functional and technical components for MDM solutions tailored to the industry. Collaborate with cross-functional teams to ensure alignment with business needs related to customer, product, and brand data.

Qualification: 15 years full time education
Posted 1 month ago
7.0 - 10.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Urgent requirement for DataStage Developer.
Experience: 7+ years
Location: Pan India
Role Overview
- Design and development of ETL and BI applications in DataStage
- Design/develop testing processes to ensure end-to-end performance, data integrity and usability
- Carry out performance testing, integration and system testing
- Good SQL knowledge is mandatory
- Basic Unix knowledge is required
Posted 1 month ago
4.0 - 6.0 years
6 - 8 Lacs
Chennai
Work from Office
Develop and manage data integration workflows using IBM InfoSphere DataStage. You will design, implement, and optimize ETL processes to ensure efficient data processing. Expertise in DataStage, ETL tools, and database management is required for this role.
Posted 1 month ago
4.0 - 6.0 years
6 - 8 Lacs
Hyderabad
Work from Office
Design and develop ETL processes using IBM DataStage. Focus on data integration, transformation, and loading, ensuring efficient data pipelines.
Posted 1 month ago