8.0 - 13.0 years
10 - 20 Lacs
Bengaluru
Work from Office
Role & responsibilities:
- Designing and developing Teradata databases
- Configuring and testing Teradata systems
- Troubleshooting Teradata systems
- Liaising with Teradata support staff and other technical teams
- Providing training and support to end users

Requirements & Skills:
- B.Tech/BE in Computer Science or related field
- 5+ years of experience in Teradata development
- Strong experience with SQL
- Good understanding of data warehousing concepts
- Experience in using Teradata utilities (see the query sketch below)
- Excellent problem-solving skills

Preferred candidate profile: immediate joiners only; open to the Bangalore location.
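As a flavor of the SQL and Teradata work above, here is a minimal sketch of querying Teradata from Python with the open-source teradatasql driver. The host, credentials, and table name are placeholders, not details from this posting.

```python
# Minimal sketch: run a warehousing sanity check against Teradata.
import teradatasql

with teradatasql.connect(host="tdhost.example.com",
                         user="etl_user", password="***") as con:
    with con.cursor() as cur:
        # Row counts per load date on a hypothetical staging table
        cur.execute("""
            SELECT load_dt, COUNT(*)
            FROM sales_stg
            GROUP BY load_dt
            ORDER BY load_dt
        """)
        for load_dt, cnt in cur.fetchall():
            print(load_dt, cnt)
```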
Posted 2 hours ago
5.0 - 10.0 years
30 - 35 Lacs
Bengaluru
Work from Office
Role & responsibilities:
- 5+ years of experience as a Data Engineer, focusing on ETL development.
- 3+ years of experience in SQL and writing complex queries for data retrieval and manipulation.
- 3+ years of experience with the Linux command line and bash scripting.
- Familiarity with data modelling in analytical databases.
- Strong understanding of backend data structures, with experience collaborating with data engineers (Teradata, Databricks, AWS S3 Parquet/CSV; a short S3/pandas sketch follows below).
- Experience with RESTful APIs and AWS services such as S3, Glue, and Lambda.
- Experience using Confluence for tracking documentation.
- Strong communication and collaboration skills, with the ability to interact effectively with stakeholders at all levels.
- Ability to work independently and manage multiple tasks and priorities in a dynamic environment.
- Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
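An illustrative sketch (not from the posting) of the S3 Parquet/CSV handling this role describes: pull a Parquet extract from S3 with pandas, assuming the pyarrow and s3fs packages are installed. The bucket, prefix, and column names are hypothetical.

```python
import pandas as pd

# Read a Parquet extract directly from S3 (s3fs handles the s3:// scheme)
df = pd.read_parquet("s3://analytics-bucket/exports/orders.parquet")

# Simple manipulation of the kind the role describes: filter and aggregate
summary = (df[df["status"] == "SHIPPED"]
           .groupby("region", as_index=False)["amount"].sum())

summary.to_csv("shipped_by_region.csv", index=False)
```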
Posted 2 hours ago
3.0 - 8.0 years
4 - 8 Lacs
Gurugram
Work from Office
About the Role: Grade Level (for internal use): 09

S&P Global Mobility. The Role: ETL Developer.

The Team: The ETL team forms an integral part of Global Data Operations (GDO) and caters to the North America & EMEA automotive business line. Core responsibilities include translating business requirements into technical design and ETL jobs, along with unit testing, integration testing, regression testing, deployments, and production operations. The team is an energetic and dynamic group of individuals, always looking to work through a challenge. Ownership, raising the bar, and innovation are what the team runs on!

The Impact: The ETL team, as part of GDO, caters to the automotive business line and helps stakeholders with optimal solutions for their data needs. The role requires close coordination with global teams such as other development teams, research analysts, quality assurance analysts, and architects. The role is vital for the automotive business as it involves providing highly efficient, highly accurate data solutions to various stakeholders, and it forms a bridge between business and technical stakeholders.

What's in it for you:
- Constant learning, working in a dynamic and challenging environment!
- Total Rewards: monetary, beneficial, and developmental rewards!
- Work-Life Balance: you can't do a good job if your job is all you do!
- Diversity & Inclusion: HeForShe!
- Internal Mobility: grow with us!

Responsibilities:
- Using prior experience with file loading, cleansing, and standardization, translate business requirements into ETL design and efficient ETL solutions using Informatica PowerCenter (mandatory) and Talend Enterprise (preferred); knowledge of TIBCO is also a preferred skill (a scripted-run sketch appears at the end of this posting).
- Understand relational database technologies, data warehousing concepts, and processes.
- Using prior experience with high-volume data processing, deal with complex technical issues.
- Work closely with all levels of management and employees across the automotive business line.
- Participate in cross-functional teams responsible for investigating issues, proposing solutions, and implementing corrective actions.
- Good communication skills are required for interfacing with various stakeholder groups; detail-oriented with analytical skills.

What We're Looking For: The ETL development team within the Mobility domain is looking for a Software Engineer to work on design, development, and operations efforts in the ETL (Informatica) domain. Primary skills and qualifications required:
- Experience with Informatica and/or Talend ETL tools.
- Bachelor's degree in Computer Science, with at least 3+ years of development and maintenance of ETL systems on Informatica PowerCenter and 1+ years of SQL experience.
- 3+ years of Informatica design and architecture experience, and 1+ years of optimization and performance tuning of ETL code on Informatica.
- 1+ years of Python development experience, plus SQL and XML experience.
- Working knowledge (or greater) of cloud-based technologies, development, and operations is a plus.

About S&P Global Mobility: At S&P Global Mobility, we provide invaluable insights derived from unmatched automotive data, enabling our customers to anticipate change and make decisions with conviction. Our expertise helps them to optimize their businesses, reach the right consumers, and shape the future of mobility. We open the door to automotive innovation, revealing the buying patterns of today and helping customers plan for the emerging technologies of tomorrow.
For more information, visit www.spglobal.com/mobility.

What's In It For You

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People. Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global.
- Health & Wellness: health care coverage designed for the mind and body.
- Continuous Learning: access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: it's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: from retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country, visit https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, pre-employment training or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here.

Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment.
If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law Poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf
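Returning to the technical side of this posting: PowerCenter development is done in Informatica's GUI tools, but operations work often scripts workflow runs through the pmcmd command-line utility that ships with Informatica. A hedged sketch follows; the service, domain, folder, and workflow names are assumptions, not details from the posting.

```python
import subprocess

# Kick off a PowerCenter workflow via pmcmd and wait for it to finish.
cmd = [
    "pmcmd", "startworkflow",
    "-sv", "INT_SVC",             # integration service (assumed name)
    "-d", "DOMAIN_DEV",           # Informatica domain (assumed name)
    "-u", "etl_user", "-p", "***",
    "-f", "AUTOMOTIVE",           # repository folder (assumed)
    "-wait",                       # block until the workflow completes
    "wf_load_vehicle_sales",      # hypothetical workflow name
]
result = subprocess.run(cmd, capture_output=True, text=True)
print(result.returncode, result.stdout)
```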
Posted 2 hours ago
4.0 - 6.0 years
6 - 9 Lacs
Pune, Chennai, Bengaluru
Work from Office
Data Engineer (Number of Open Positions: 4)

Key Responsibilities:
- Develop and maintain ETL pipelines for multiple source systems, for example SAP, Korber WMS, OpSuite ePOS, and internal BI systems.
- Design and implement SSIS-based data workflows including staging, data quality rules, and CDC logic (a minimal CDC setup sketch follows this list).
- Collaborate with the DBA on schema design, indexing, and performance tuning for SQL Server 2022.
- Build reusable components and scripts for data loading, transformation, and validation.
- Support development of CDC (Change Data Capture) solutions for near real-time updates.
- Perform unit testing, documentation, and version control of data solutions using GitLab/Jenkins CI/CD.
- Ensure data security, masking, and encryption in accordance with project policies (TLS 1.3).
- Work closely with backend developers and analysts to align data models with reporting needs.
- Troubleshoot and resolve data-related issues during development and post-deployment.

Required Skills & Experience:
- 4-6 years of experience in data engineering or ETL development.
- Strong hands-on expertise in SSIS (SQL Server Integration Services), SQL Server 2019/2022 (T-SQL, stored procedures, indexing, CDC), and ETL development for ERP/warehouse systems (SAP preferred).
- Experience working with source systems like SAP, WMS, POS, or retail systems is highly desirable.
- Proficiency in data quality frameworks, staging strategies, and workflow orchestration.
- Familiarity with CI/CD for data workflows (GitLab, Jenkins, etc.).
- Good understanding of data warehousing concepts and performance optimization.
- Strong communication and documentation skills.
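As a minimal sketch of the CDC building block this role mentions: SQL Server Change Data Capture is enabled per database and per table with the standard sys.sp_cdc_enable_* procedures, here driven from Python via pyodbc. The connection string and table names are assumptions.

```python
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=sqlsrv;"
    "DATABASE=WMS;UID=etl_user;PWD=***;TrustServerCertificate=yes",
    autocommit=True,
)
cur = conn.cursor()

# Enable CDC once per database, then per source table; downstream
# SSIS/ETL jobs read the generated change tables for near real-time loads.
cur.execute("EXEC sys.sp_cdc_enable_db")
cur.execute("""
    EXEC sys.sp_cdc_enable_table
         @source_schema = N'dbo',
         @source_name   = N'StockMovements',   -- hypothetical WMS table
         @role_name     = NULL
""")
```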
Posted 2 hours ago
4.0 - 7.0 years
3 - 7 Lacs
Noida
Work from Office
R1 is a leading provider of technology-driven solutions that help hospitals and health systems manage their financial systems and improve the patient experience. We are the one company that combines the deep expertise of a global workforce of revenue cycle professionals with the industry's most advanced technology platform, encompassing sophisticated analytics, AI, intelligent automation, and workflow orchestration. R1 is a place where we think boldly to create opportunities for everyone to innovate and grow. A place where we partner with purpose through transparency and inclusion. We are a global community of engineers, front-line associates, healthcare operators, and RCM experts that work together to go beyond for all those we serve. Because we know that all this adds up to something more, a place where we're all together better. R1 India is proud to be recognized amongst the Top 25 Best Companies to Work For 2024 by the Great Place to Work Institute. This is our second consecutive recognition on this prestigious Best Workplaces list, building on the Top 50 recognition we achieved in 2023. Our focus on employee wellbeing, inclusion, and diversity is demonstrated through prestigious recognitions, with R1 India ranked amongst Best in Healthcare and Top 100 Best Companies for Women by Avtar & Seramount, and amongst the Top 10 Best Workplaces in Health & Wellness. We are committed to transforming the healthcare industry with our innovative revenue cycle management services. Our goal is to make healthcare work better for all by enabling efficiency for healthcare systems, hospitals, and physician practices. With over 30,000 employees globally, we are about 16,000+ strong in India, with a presence in Delhi NCR, Hyderabad, Bangalore, and Chennai. Our inclusive culture ensures that every employee feels valued, respected, and appreciated with a robust set of employee benefits and engagement activities.

Description: We are seeking a Software Engineer with 4-7 years of experience to join our ETL Development team. This role will report to the Manager of Data Engineering and be involved in the planning, design, and implementation of our centralized data warehouse solution for ETL, reporting, and analytics across all applications within the company.

Qualifications:
- Deep knowledge and experience working with SSIS, T-SQL, Azure Databricks, Azure Data Lake, and Azure Data Factory.
- Experienced in writing SQL objects (stored procedures, UDFs, views).
- Experienced in data modeling.
- Experience working with MS-SQL and non-relational data formats such as Apache Parquet.
- Experience in Scala, Spark SQL, and Airflow is preferred.
- Experience acquiring and preparing data from primary and secondary disparate data sources.
- Experience working on large-scale data product implementations.
- Experience working with agile methodology preferred.
- Healthcare industry experience preferred.

Responsibilities:
- Collaborate with and across Agile teams to design, develop, test, implement, and support technical solutions.
- Work with teams with deep experience in ETL processes and data science domains to understand how to centralize their data (a PySpark sketch of this kind of lake-side work follows below).
- Share your passion for experimenting with and learning new technologies.
- Perform thorough data analysis, uncover opportunities, and address business problems.
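A hedged PySpark sketch of the Databricks/Data Lake work listed above: read Parquet from the lake, apply a Spark SQL transformation, and write the result back. The paths and columns are placeholders, not details from the posting.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("claims_etl").getOrCreate()

# Read raw Parquet from a (hypothetical) Azure Data Lake path
claims = spark.read.parquet(
    "abfss://lake@account.dfs.core.windows.net/raw/claims")
claims.createOrReplaceTempView("claims")

# Curate with Spark SQL: monthly billed totals per provider
curated = spark.sql("""
    SELECT provider_id, claim_month, SUM(billed_amount) AS total_billed
    FROM claims
    GROUP BY provider_id, claim_month
""")
curated.write.mode("overwrite").parquet(
    "abfss://lake@account.dfs.core.windows.net/curated/claims_monthly")
```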
Working in an evolving healthcare setting, we use our shared expertise to deliver innovative solutions. Our fast-growing team has opportunities to learn and grow through rewarding interactions, collaboration, and the freedom to explore professional interests. Our associates are given valuable opportunities to contribute, to innovate, and to create meaningful work that makes an impact in the communities we serve around the world. We also offer a culture of excellence that drives customer success and improves patient care. We believe in giving back to the community and offer a competitive benefits package. To learn more, visit r1rcm.com.
Posted 2 hours ago
5.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Educational Qualification: Bachelor of Engineering | Service Line: Data & Analytics Unit

Responsibilities: A day in the life of an Infoscion: as part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution, and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture, and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management, and adherence to organizational guidelines and processes. You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Knowledge of more than one technology
- Basics of architecture and design fundamentals
- Knowledge of testing tools
- Knowledge of agile methodologies
- Understanding of project life cycle activities on development and maintenance projects
- Understanding of one or more estimation methodologies; knowledge of quality processes
- Basics of the business domain, to understand the business requirements
- Analytical abilities, strong technical skills, good communication skills
- Good understanding of the technology and domain
- Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles, and modelling methods
- Awareness of the latest technologies and trends
- Excellent problem solving, analytical, and debugging skills

Technical and Professional Requirements: Primary skills: Technology - Data Management - Data Integration - DataStage. Preferred skills: Technology - Data Management - Data Integration - DataStage.
Posted 3 hours ago
8.0 - 13.0 years
8 - 13 Lacs
Hyderabad
Work from Office
In this role you will be part of a team working to develop solutions enabling the business to leverage data as an asset at the bank. As a Lead ETL Developer, you will lead teams to develop, maintain, and enhance code, ensuring all IT SDLC processes are documented and practiced, while working closely with multiple technology teams across the enterprise. The Lead ETL Developer should have extensive knowledge of data warehousing and cloud technologies. If you consider data a strategic asset, evangelize the value of good data and insights, and have a passion for learning and continuous improvement, this role is for you.

Key Responsibilities:
- Translate requirements and data mapping documents into a technical design.
- Develop, enhance, and maintain code following best practices and standards.
- Create and execute unit test plans.
- Support regression and system testing efforts.
- Debug and problem-solve issues found during testing and/or production.
- Communicate status, issues, and blockers with the project team.
- Support continuous improvement by identifying and solving opportunities.

Basic Qualifications:
- Bachelor's degree or military experience in a related field (preferably computer science).
- At least 5 years of experience in ETL development within a data warehouse.
- Deep understanding of enterprise data warehousing best practices and standards.
- Strong experience in software engineering, comprising designing, developing, and operating robust and highly scalable cloud infrastructure services.
- Strong experience with Python/PySpark, DataStage ETL, and SQL development.
- Proven experience in cloud infrastructure projects with hands-on migration expertise on public clouds such as AWS and Azure, preferably Snowflake (a minimal Snowflake load sketch follows this posting's skill list).
- Knowledge of cybersecurity organization practices, operations, risk management processes, principles, architectural requirements, engineering, and threats and vulnerabilities, including incident response methodologies.
- Understanding of Authentication & Authorization Services and Identity & Access Management.
- Strong communication and interpersonal skills.
- Strong organization skills and the ability to work independently as well as with a team.

Preferred Qualifications:
- AWS Certified Solutions Architect Associate, AWS Certified DevOps Engineer Professional, and/or AWS Certified Solutions Architect Professional.
- Experience defining future-state roadmaps for data warehouse applications.
- Experience leading teams of developers within a project.
- Experience in the financial services (banking) industry.

Mandatory Skills: ETL/data warehouse concepts, AWS, Glue, SQL, Python, Snowflake, CI/CD tools (Jenkins, GitHub). Secondary Skills: Zena, PySpark, Infogix.
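For the Snowflake portion of the stack above, a sketch only: bulk-loading staged files with the snowflake-connector-python package. The account, warehouse, stage, and table names are assumptions, not details from the posting.

```python
import snowflake.connector

con = snowflake.connector.connect(
    account="myorg-myaccount", user="etl_user", password="***",
    warehouse="ETL_WH", database="EDW", schema="STAGING",
)
cur = con.cursor()

# COPY INTO is the standard bulk-load path from a stage into a table
cur.execute("""
    COPY INTO STAGING.CUSTOMERS
    FROM @ETL_STAGE/customers/          -- hypothetical stage path
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
""")
print(cur.fetchall())   # per-file load results
```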
Posted 3 hours ago
10.0 - 15.0 years
12 - 17 Lacs
Hyderabad
Work from Office
- Design, develop, and deploy ETL processes using SSIS/Azure Data Factory (6+ years of experience in ETL development using SSIS, or 2+ years of experience in Azure Data Factory).
- Monitor and troubleshoot ETL jobs and data flows.
- Implement data quality checks and ensure data integrity.
- Maintain documentation of ETL processes and data flow diagrams.
- Design, develop, and maintain interactive Power BI reports and dashboards to visualize key performance indicators (KPIs) and business metrics.
- Translate complex business requirements into technical specifications for data extraction, transformation, and reporting.
- Collaborate with cross-functional teams to understand their data and reporting needs.
- Write complex SQL queries for data extraction, manipulation, and analysis from various relational databases (see the sketch below).
- Candidates need to have good insurance domain knowledge.
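An illustrative example of the "complex SQL for KPIs" requirement: a windowed month-over-month premium metric pulled into pandas for a Power BI dataset. The DSN, table, and column names are hypothetical insurance-domain placeholders.

```python
import pandas as pd
import pyodbc

conn = pyodbc.connect("DSN=EDW;UID=report_user;PWD=***")

# LAG over the monthly aggregate gives the month-over-month delta
kpi = pd.read_sql("""
    SELECT policy_month,
           SUM(premium) AS premium,
           SUM(premium)
             - LAG(SUM(premium)) OVER (ORDER BY policy_month) AS mom_change
    FROM dbo.PolicyPremiums
    GROUP BY policy_month
""", conn)
print(kpi.tail())
```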
Posted 3 hours ago
8.0 - 13.0 years
3 - 7 Lacs
Hyderabad
Work from Office
Data Analyst | Work Mode: Hybrid | Work Location: Chennai / Hyderabad / Bangalore / Pune | Work Timing: 2 PM to 11 PM

Primary Skill: Data Analyst
- Minimum 6 years of experience as a Data Analyst, with at least 3+ years of experience in data migration initiatives.
- Experience migrating COTS/legacy systems, including large volumes of data, without compromising accuracy or completeness (a reconciliation sketch follows below).
- Technical expertise in data models, database design and development, data mining, and segmentation techniques.
- Experience with ETL development both on premises and in the cloud.
- Strong functional understanding of RDBMS; conceptual knowledge of DWH/BI.
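A minimal reconciliation sketch for the migration accuracy/completeness checks described above: compare per-table row counts between a legacy source and the target with SQLAlchemy. The connection URLs and table list are assumptions.

```python
from sqlalchemy import create_engine, text

legacy = create_engine("oracle+oracledb://user:***@legacy-db/ORCL")
target = create_engine("postgresql+psycopg2://user:***@new-db/edw")

# Hypothetical set of migrated tables to reconcile
for table in ["customers", "accounts", "transactions"]:
    q = text(f"SELECT COUNT(*) FROM {table}")
    with legacy.connect() as src, target.connect() as tgt:
        src_n = src.execute(q).scalar()
        tgt_n = tgt.execute(q).scalar()
    status = "OK" if src_n == tgt_n else "MISMATCH"
    print(f"{table}: source={src_n} target={tgt_n} {status}")
```

Real migrations usually extend this with checksums or column-level hashes, since matching counts alone do not prove content parity.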
Posted 3 hours ago
5.0 - 10.0 years
4 - 8 Lacs
Hyderabad
Work from Office
In this role you will be part of a team working to develop solutions enabling the business to leverage data as an asset at the bank. The Senior ETL Developer should have extensive knowledge of data warehousing and cloud technologies. If you consider data a strategic asset, evangelize the value of good data and insights, and have a passion for learning and continuous improvement, this role is for you.

Responsibilities:
- Translate requirements and data mapping documents into a technical design.
- Develop, enhance, and maintain code following best practices and standards.
- Create and execute unit test plans.
- Support regression and system testing efforts.
- Debug and problem-solve issues found during testing and/or production.
- Communicate status, issues, and blockers with the project team.
- Support continuous improvement by identifying and solving opportunities.

Basic Qualifications:
- Bachelor's degree or military experience in a related field (preferably computer science).
- At least 5 years of experience in ETL development within a data warehouse.
- Deep understanding of enterprise data warehousing best practices and standards.
- Strong experience in software engineering, comprising designing, developing, and operating robust and highly scalable cloud infrastructure services.
- Strong experience with Python/PySpark, DataStage ETL, and SQL development.
- Proven experience in cloud infrastructure projects with hands-on migration expertise on public clouds such as AWS and Azure, preferably Snowflake.
- Knowledge of cybersecurity organization practices, operations, risk management processes, principles, architectural requirements, engineering, and threats and vulnerabilities, including incident response methodologies.
- Understanding of Authentication & Authorization Services and Identity & Access Management.
- Strong communication and interpersonal skills.
- Strong organization skills and the ability to work independently as well as with a team.

Preferred Qualifications:
- AWS Certified Solutions Architect Associate, AWS Certified DevOps Engineer Professional, and/or AWS Certified Solutions Architect Professional.
- Experience defining future-state roadmaps for data warehouse applications.
- Experience leading teams of developers within a project.
- Experience in the financial services (banking) industry.

Mandatory Skills: ETL/data warehouse concepts, Snowflake, AWS, Glue (an orchestration sketch follows below), CI/CD tools (Jenkins, GitHub), Python, DataStage. Secondary Skills: Zena, PySpark, Infogix.
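A hedged boto3 sketch of triggering and polling an AWS Glue job, one of the mandatory skills listed. The job name and argument are placeholders.

```python
import time
import boto3

glue = boto3.client("glue", region_name="us-east-1")

# Start a (hypothetical) Glue ETL job with a date argument
run = glue.start_job_run(
    JobName="edw_daily_load",
    Arguments={"--load_date": "2024-01-31"},
)
run_id = run["JobRunId"]

# Poll until the run reaches a terminal state
while True:
    state = glue.get_job_run(JobName="edw_daily_load", RunId=run_id)
    status = state["JobRun"]["JobRunState"]
    print(status)
    if status in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        break
    time.sleep(30)
```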
Posted 3 hours ago
5.0 - 10.0 years
7 - 12 Lacs
Pune
Work from Office
Process Manager - AWS Data Engineer
Mumbai/Pune | Full-time (FT) | Technology Services | Shift Timings: EMEA (1pm-9pm) | Management Level: PM | Travel: NA

The ideal candidate must possess in-depth functional knowledge of the process area and apply it to operational scenarios to provide effective solutions. The role requires identifying discrepancies and proposing optimal solutions using a logical, systematic, and sequential methodology. It is vital to be open-minded towards inputs and views from team members and to effectively lead, control, and motivate groups towards company objectives. Additionally, the candidate must be self-directed, proactive, and seize every opportunity to meet internal and external customer needs and achieve customer satisfaction by effectively auditing processes, implementing best practices and process improvements, and utilizing the frameworks and tools available. Goals and thoughts must be clearly and concisely articulated and conveyed, verbally and in writing, to clients, colleagues, subordinates, and supervisors.

Process Manager roles and responsibilities:
- Understand client requirements and provide effective and efficient solutions in AWS using Snowflake.
- Assemble large, complex sets of data that meet non-functional and functional business requirements.
- Use Snowflake/Redshift architecture and design to create data pipelines and consolidate data in the data lake and data warehouse.
- Demonstrate strength and experience in data modeling, ETL development, and data warehousing concepts.
- Understand data pipelines and modern ways of automating them using cloud-based tooling (an Airflow sketch follows at the end of this posting).
- Test and clearly document implementations, so others can easily understand the requirements, implementation, and test conditions.
- Perform data quality testing and assurance as part of designing, building, and implementing scalable data solutions in SQL.

Technical and Functional Skills:
- AWS Services: strong experience with AWS data services such as S3, EC2, Redshift, Glue, Athena, EMR, etc.
- Programming Languages: proficiency in languages commonly used in data engineering, such as Python, SQL, Scala, or Java.
- Data Warehousing: experience in designing, implementing, and optimizing data warehouse solutions on Snowflake/Amazon Redshift.
- ETL Tools: familiarity with ETL tools and frameworks (e.g., Apache Airflow, AWS Glue) for building and managing data pipelines.
- Database Management: knowledge of database management systems (e.g., PostgreSQL, MySQL, Amazon Redshift) and data lake concepts.
- Big Data Technologies: understanding of big data technologies such as Hadoop, Spark, Kafka, etc., and their integration with AWS.
- Version Control: proficiency in version control tools like Git for managing code and infrastructure as code (e.g., CloudFormation, Terraform).
- Problem-solving Skills: ability to analyze complex technical problems and propose effective solutions.
- Communication Skills: strong verbal and written communication skills for documenting processes and collaborating with team members and stakeholders.
- Education and Experience: typically, a bachelor's degree in Computer Science, Engineering, or a related field is required, along with 5+ years of experience in data engineering and AWS cloud environments.

About eClerx: eClerx is a global leader in productized services, bringing together people, technology, and domain expertise to amplify business results. Our mission is to set the benchmark for client service and success in our industry.
Our vision is to be the innovation partner of choice for technology, data analytics, and process management services. Since our inception in 2000, we've partnered with top companies across various industries, including financial services, telecommunications, retail, and high-tech. Our innovative solutions and domain expertise help businesses optimize operations, improve efficiency, and drive growth. With over 18,000 employees worldwide, eClerx is dedicated to delivering excellence through smart automation and data-driven insights. At eClerx, we believe in nurturing talent and providing hands-on experience.

About eClerx Technology: eClerx's Technology Group collaboratively delivers Analytics, RPA, AI, and Machine Learning digital technologies that enable our consultants to help businesses thrive in a connected world. Our consultants and specialists partner with our global clients and colleagues to build and implement digital solutions through a broad spectrum of activities. To know more about us, visit https://eclerx.com

eClerx is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability or protected veteran status, or any other legally protected basis, per applicable law.
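The Airflow sketch promised in the responsibilities above: a daily DAG that loads a staged S3 partition into the warehouse, assuming Airflow 2.x (the `schedule` argument). The DAG id, bucket, and load callable are illustrative assumptions, not details from the posting.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def load_to_warehouse(**context):
    # Placeholder for a COPY into Redshift/Snowflake from s3://etl-bucket/...
    print("loading partition", context["ds"])

with DAG(
    dag_id="daily_sales_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    load = PythonOperator(
        task_id="load_to_warehouse",
        python_callable=load_to_warehouse,
    )
```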
Posted 5 hours ago
5.0 - 10.0 years
8 - 12 Lacs
Hyderabad
Hybrid
- 5+ years of strong experience in Informatica Cloud.
- Experience in Informatica Cloud Designer and the Informatica Cloud Portal.
- Experience with transformations, mapping configuration tasks, task flows, and parameterized templates.
- Experience in Informatica PowerCenter and Designer.
- Good knowledge of Oracle, SQL, and PL/SQL.
- Experience scheduling Informatica Cloud ETL mappings.
- Experience integrating Informatica Cloud with sources such as SFDC and SAP.
- Experience in Business Objects and other business intelligence platforms is an advantage.
- Should be good at understanding functional and business requirements.
Posted 6 hours ago
7.0 - 10.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Urgent requirement for a DataStage Developer. Experience: 7+ years. Location: Pan India.

Role Overview:
- Design and development of ETL and BI applications in DataStage.
- Design/develop testing processes to ensure end-to-end performance, data integrity, and usability.
- Carry out performance testing, integration, and system testing (the sketch below shows one way such runs are scripted).
- Good SQL knowledge is mandatory.
- Basic Unix knowledge is required.
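A hedged sketch combining the Unix-scripting and DataStage skills asked for above: running a job through the dsjob command-line tool that ships with DataStage. The project and job names are placeholders.

```python
import subprocess

# Run a DataStage job and report its status via the dsjob CLI
cmd = ["dsjob", "-run", "-jobstatus", "DW_PROJECT", "j_load_customer_dim"]
proc = subprocess.run(cmd, capture_output=True, text=True)

# With -jobstatus, dsjob's exit code reflects the job status; codes of
# 1 (finished OK) and 2 (finished with warnings) are conventionally
# treated as success in wrapper scripts.
print(proc.returncode, proc.stdout)
```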
Posted 7 hours ago
6.0 - 9.0 years
7 - 11 Lacs
Hyderabad
Work from Office
Job Details: Skill: Sr. Developer - ODI | Experience: 6-9 years | Location: Pan India | Notice Period: immediate joiners | Employment type: C2H

Job Description (Sr. Developer - ODI):
- Candidates should have technical knowledge and experience with Oracle Data Integrator and Oracle Middleware technologies.
- Should have good expertise in Oracle Data Integrator (ODI) and data warehousing, with 6 to 9 years of relevant experience.
- Experience designing, implementing, and maintaining ODI load plans and processes.
- Experience in ETL development, PL/SQL, and support.
- Coordinate with the Team Lead to ensure implementation as per stated requirements.
- Good to have knowledge of Oracle MFT, Oracle Database, Oracle SOA Suite, BPEL, XML, WSDL/XSD, and Adapters.
- Support and manage already developed ODI applications, perform testing on DEV/UAT/SIT/PROD environments, and work on assigned tickets.
- Ready to work in shifts and provide on-call support, depending on project and production releases.
- Should be able to work as an independent team member, capable of applying judgment to plan and execute tasks.
- Timely completion of quality deliverables, good communication skills, professional conduct, and the ability to work with business and cross-functional teams.
- Ensure correctness and completeness of data loading (full load and incremental load).
- Optimize interface execution times by implementing best practices.
Posted 7 hours ago
5.0 - 6.0 years
7 - 8 Lacs
Kolkata
Work from Office
Use Talend Open Studio to design, implement, and manage data integration solutions. Develop ETL processes to ensure data is accurately extracted, transformed, and loaded into various systems for analysis.
Posted 19 hours ago
8.0 - 11.0 years
20 - 30 Lacs
Gurugram, Bengaluru
Hybrid
Key responsibilities:
- An expert in solution design with the ability to see the big picture across the portfolio, providing guidance and governance for the analysis, solution design, development, and implementation of projects.
- A strategic thinker who will be responsible for the technical strategy within the portfolio, ensuring it aligns with the overall architecture roadmap and business strategy.
- An effective communicator who will utilize their technical/business knowledge to lead technical discussions with project teams, business sponsors, and external technical partners in a manner that ensures understanding of risks, options, and overall solutions.
- An individual with the ability to effectively consult on and support technical discussions with Account Delivery and Business Sponsors, ensuring alignment to both technology and business visions.
- Collaborate with designers, business system analysts, application analysts, and testing specialists to deliver high-quality solutions.
- Able to prepare high-level and detailed designs based on technical/business requirements and defined architectures, and to maintain documentation.
- Has been instrumental in platform migration and technical migration work in the past and understands the intricacies involved.
- Analyze, define, and document requirements for data, workflow, logical processes, interface design, internal and external checks, controls, and outputs.
- Ensure information security standards and requirements are incorporated into all solutions.
- Stay current with trends in emerging technologies and how they could apply to Sun Life.

Key experience:
- A Bachelor's or Master's degree in Computer Science or a related field.
- 8-11 years of progressive information technology experience with the full application development life cycle.
- Domain knowledge of insurance and retail wealth management.
- Experience in Informatica PowerCenter / IDMC development, including applying various Informatica transformations and different types of sources.
- Ability to write complex T-SQL, stored procedures, and views; experience with SQL Server 2014 and above.
- Exposure to DevOps and API architecture.
- Experience leading small teams (5-8 developers).
- Good knowledge and experience of Java 1.8 or above.
- Experience with PostgreSQL and NoSQL databases such as MongoDB.
- Good knowledge of coding best practices; able to review peers' code.
- Produce clean, efficient code based on specifications, and troubleshoot, debug, and upgrade existing software.
Posted 19 hours ago
4.0 - 6.0 years
6 - 8 Lacs
Mumbai
Work from Office
Develops ETL solutions using Informatica PowerCenter.
Posted 19 hours ago
4.0 - 6.0 years
6 - 8 Lacs
Chennai
Work from Office
Develop and manage data integration workflows using IBM InfoSphere DataStage. You will design, implement, and optimize ETL processes to ensure efficient data processing. Expertise in DataStage, ETL tools, and database management is required for this role.
Posted 19 hours ago
4.0 - 6.0 years
6 - 8 Lacs
Hyderabad
Work from Office
Design and implement data integration and management solutions using Informatica Big Data Management (BDM). Ensure efficient handling of large data sets, optimizing performance and ensuring seamless data flow across systems.
Posted 19 hours ago
4.0 - 6.0 years
6 - 8 Lacs
Hyderabad
Work from Office
Develop and manage ETL processes using Informatica, ensuring smooth data extraction, transformation, and loading across multiple systems. Optimize data workflows to ensure high-quality data management.
Posted 19 hours ago
4.0 - 8.0 years
6 - 10 Lacs
Mumbai
Work from Office
Design and optimize ETL workflows using Talend. Ensure data integrity and process automation.
Posted 19 hours ago
4.0 - 6.0 years
6 - 8 Lacs
Hyderabad
Work from Office
Design and develop ETL processes using IBM DataStage. Focus on data integration, transformation, and loading, ensuring efficient data pipelines.
Posted 20 hours ago
4.0 - 6.0 years
6 - 8 Lacs
Chennai
Work from Office
Designs and develops ETL pipelines using IBM InfoSphere DataStage. Handles data integration and transformation tasks.
Posted 20 hours ago
4.0 - 6.0 years
6 - 8 Lacs
Mumbai
Work from Office
Design and implement data integration solutions using IBM InfoSphere DataStage. Develop ETL jobs, write PL/SQL scripts, and use Unix Shell Scripting for text processing to manage large datasets efficiently.
Posted 20 hours ago
3.0 - 5.0 years
15 - 18 Lacs
Noida
Work from Office
Responsibilities:
- Design, develop, and maintain ETL pipelines: create, optimize, and manage Extract, Transform, Load (ETL) processes using Python scripts and Pentaho Data Integration (Kettle) to move and transform data from various sources into target systems (e.g., data warehouses, data lakes).
- Data quality assurance: implement rigorous data validation, cleansing, and reconciliation procedures to ensure the accuracy, completeness, and consistency of data.
- Data sourcing and integration: work with diverse data sources, including relational databases (SQL Server, MySQL, PostgreSQL), flat files (CSV, Excel), APIs, and cloud platforms.
- Performance optimization: identify and implement improvements for existing ETL processes to enhance data load times, efficiency, and scalability.
- Troubleshooting and support: diagnose and resolve data-related issues, ensuring data integrity and timely availability for reporting and analysis.
- Documentation: create and maintain comprehensive documentation for all ETL processes, data flows, and data dictionaries.
- Collaboration: work closely with data engineers, data scientists, business analysts, and other stakeholders to understand data requirements and deliver robust data solutions.
- Ad-hoc analysis: perform ad-hoc data analysis and provide insights to support business decisions as needed.

About the Role: We are looking for a skilled and passionate Data Engineer with 3 to 4 years of experience in building robust ETL pipelines using both visual ETL tools (preferably Kettle/Pentaho) and Python-based frameworks. You will be responsible for designing, developing, and maintaining high-quality data workflows that support our data platforms and reporting environments.

Key Responsibilities:
- Design, develop, and maintain ETL pipelines using Kettle (Pentaho) or similar tools.
- Build data ingestion workflows using Python (pandas, SQLAlchemy, psycopg2); a minimal sketch follows after this posting.
- Extract data from relational and non-relational sources (APIs, CSV, databases).
- Perform complex transformations and ensure high data quality.
- Load processed data into target systems such as PostgreSQL, Snowflake, or Redshift.
- Implement monitoring, error handling, and logging for all ETL jobs.
- Maintain job orchestration via shell scripts, cron, or workflow tools (e.g., Airflow).
- Work with stakeholders to understand data needs and deliver accurate, timely data.
- Maintain documentation for pipelines, data dictionaries, and metadata.

Requirements:
- 3 to 4 years of experience in data engineering or ETL development.
- Hands-on experience with Kettle (Pentaho Data Integration) or similar ETL tools.
- Strong proficiency in Python (including pandas, requests, datetime, etc.).
- Strong SQL knowledge and experience with relational databases (PostgreSQL, SQL Server, etc.).
- Experience with source control (Git), scripting (Shell/Bash), and config-driven ETL pipelines.
- Good understanding of data warehousing concepts, performance optimization, and incremental loads.
- Familiarity with REST APIs, JSON, XML, and flat file processing.

Good to Have:
- Experience with job scheduling tools (e.g., Airflow, Jenkins).
- Familiarity with cloud platforms (AWS, Azure, or GCP).
- Knowledge of data lakes, big data, or real-time streaming tools is a plus.
- Experience working in Agile/Scrum environments.

Soft Skills:
- Strong analytical and problem-solving skills.
- Self-motivated; able to work independently and in a team.
- Good communication skills with technical and non-technical stakeholders.

Industry: Software Development | Employment Type: Full-time
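The sketch referenced above: a minimal, config-style incremental load in the Python stack this posting names (pandas + SQLAlchemy + PostgreSQL). The connection URL, tables, watermark column, and transformation are assumptions for illustration only.

```python
import pandas as pd
from sqlalchemy import create_engine, text

engine = create_engine("postgresql+psycopg2://etl:***@dwh/analytics")

with engine.begin() as conn:
    # Extract: find the high-water mark, then pull only newer source rows
    last = conn.execute(text(
        "SELECT COALESCE(MAX(updated_at), '1970-01-01') FROM stg_orders"
    )).scalar()

    df = pd.read_sql(
        text("SELECT * FROM src_orders WHERE updated_at > :w"),
        conn, params={"w": last},
    )

    # Transform: a sample cleansing step
    df["amount"] = df["amount"].round(2)

    # Load: append the new increment into staging
    df.to_sql("stg_orders", conn, if_exists="append", index=False)
```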
Posted 23 hours ago