
930 ETL Tool Jobs - Page 21

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Oracle Business Intelligence (BI) Publisher
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: Bachelor's degree in computer science or information science from a reputed institute

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the applications function seamlessly within the existing infrastructure. You will also engage in testing and troubleshooting to enhance application performance and user experience, while continuously seeking opportunities for improvement and innovation in application design and functionality.

Roles & Responsibilities:
- 5-6+ years of experience in BIP, OTBI, and Oracle SQL technologies
- Familiarity with PL/SQL
- Working knowledge of OBIEE; familiarity with ETL tools (ODI / Informatica) is an added advantage
- Familiarity with Oracle BICS / OAC / DW is an added advantage
- Functional exposure to Oracle E-Business Suite / Fusion Cloud modules is an added advantage
- FRS (Financial Reporting Studio) is an added advantage

Professional & Technical Skills:
- Contribute as an individual contributor and show exemplary commitment to the project
- Good spoken and written communication is a must
- Positive attitude and ability to deal with conflicts is a must
- Oracle certifications pertaining to Oracle Fusion Cloud

Additional Information:
- The candidate should have a minimum of 3 years of experience in Oracle Business Intelligence (BI) Publisher.
- This position is based at our Bengaluru office.
- A Bachelor's degree in computer science or information science from a reputed institute is required.

Posted 1 month ago

Apply

10.0 - 15.0 years

14 - 18 Lacs

Bengaluru

Work from Office

- Collaborate with business stakeholders, data architects, and IT teams to gather and understand data migration requirements.
- Analyze legacy banking and insurance systems (e.g., core banking, policy admin, claims, CRM) to identify data structures and dependencies.
- Work with large-scale datasets and understand big data architectures (e.g., Hadoop, Spark, Hive) to support scalable data migration and transformation.
- Perform data profiling, cleansing, and transformation using SQL and ETL tools, with the ability to understand and write complex SQL queries and interpret the logic implemented in ETL workflows (see the sketch below).
- Develop and maintain data mapping documents and transformation logic specific to financial and insurance data (e.g., customer KYC, transactions, policies, claims).
- Validate migrated data against business rules, regulatory standards, and reconciliation reports.
- Support UAT by preparing test cases and validating migrated data with business users.
- Ensure data privacy and security compliance throughout the migration process.
- Document issues, risks, and resolutions related to data quality and migration.
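Much of the profiling and reconciliation work described above reduces to comparing row counts and key aggregates between the legacy source and the migrated target. A minimal Python/SQL sketch; the table and column names (legacy_policies, target_policies, premium_amount) are hypothetical, and sqlite3 stands in for the actual database driver:

```python
# Minimal reconciliation sketch: compare row counts and a key aggregate
# between a legacy table and its migrated counterpart.
# Table/column names are hypothetical placeholders.
import sqlite3  # stand-in for the bank's actual RDBMS driver

RECON_SQL = """
SELECT
    (SELECT COUNT(*) FROM legacy_policies)                       AS src_rows,
    (SELECT COUNT(*) FROM target_policies)                       AS tgt_rows,
    (SELECT ROUND(SUM(premium_amount), 2) FROM legacy_policies)  AS src_premium,
    (SELECT ROUND(SUM(premium_amount), 2) FROM target_policies)  AS tgt_premium
"""

def reconcile(conn: sqlite3.Connection) -> None:
    src_rows, tgt_rows, src_prem, tgt_prem = conn.execute(RECON_SQL).fetchone()
    assert src_rows == tgt_rows, f"Row count mismatch: {src_rows} vs {tgt_rows}"
    assert src_prem == tgt_prem, f"Premium total mismatch: {src_prem} vs {tgt_prem}"
    print(f"Reconciled {src_rows} rows; premium totals match.")
```

In practice the same pattern is repeated per entity (customers, transactions, claims) and the results feed the reconciliation reports mentioned above.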

Posted 1 month ago

Apply

2.0 - 5.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Educational Qualification: Bachelor of Engineering
Service Line: Data & Analytics Unit

Responsibilities:
A day in the life of an Infoscion - As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management and adherence to the organizational guidelines and processes. You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Knowledge of more than one technology
- Basics of architecture and design fundamentals
- Knowledge of testing tools
- Knowledge of agile methodologies
- Understanding of project life cycle activities on development and maintenance projects
- Understanding of one or more estimation methodologies; knowledge of quality processes
- Basics of the business domain to understand the business requirements
- Analytical abilities, strong technical skills, good communication skills
- Good understanding of the technology and domain
- Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles and modelling methods
- Awareness of latest technologies and trends
- Excellent problem solving, analytical and debugging skills

Technical and Professional:
Technology - Data Management - MDM - Informatica MDM

Preferred Skills:
Technology - Data Management - MDM - Informatica MDM

Posted 1 month ago

Apply

3.0 - 5.0 years

4 - 8 Lacs

Pune

Work from Office

Educational Qualification: Bachelor of Engineering, BCA, BSc, MTech, MCA, MSc
Service Line: Infosys Quality Engineering

Responsibilities:
A day in the life of an Infoscion - As part of the Infosys delivery team, your primary role would be to ensure effective design, development, validation and support activities, to assure that our clients are satisfied with the high levels of service in the technology domain. You will gather the requirements and specifications to understand the client's requirements in a detailed manner and translate them into system requirements. You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers. You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Knowledge of design principles and fundamentals of architecture
- Understanding of performance engineering
- Knowledge of quality processes and estimation techniques
- Basic understanding of the project domain
- Ability to translate functional/nonfunctional requirements into system requirements
- Ability to design and code complex programs
- Ability to write test cases and scenarios based on the specifications
- Good understanding of SDLC and agile methodologies
- Awareness of latest technologies and trends
- Logical thinking and problem-solving skills along with an ability to collaborate

Technical and Professional:
Primary skills: Technology - ETL & Data Quality - ETL - Others

Preferred Skills:
Technology - ETL & Data Quality - ETL - Others

Posted 1 month ago

Apply

3.0 - 5.0 years

4 - 8 Lacs

Chennai

Work from Office

Educational Qualification: Bachelor of Engineering, BSc, BCA, MTech, MSc, MCA
Service Line: Infosys Quality Engineering

Responsibilities:
A day in the life of an Infoscion - As part of the Infosys delivery team, your primary role would be to ensure effective design, development, validation and support activities, to assure that our clients are satisfied with the high levels of service in the technology domain. You will gather the requirements and specifications to understand the client's requirements in a detailed manner and translate them into system requirements. You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers. You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Knowledge of design principles and fundamentals of architecture
- Understanding of performance engineering
- Knowledge of quality processes and estimation techniques
- Basic understanding of the project domain
- Ability to translate functional/nonfunctional requirements into system requirements
- Ability to design and code complex programs
- Ability to write test cases and scenarios based on the specifications
- Good understanding of SDLC and agile methodologies
- Awareness of latest technologies and trends
- Logical thinking and problem-solving skills along with an ability to collaborate

Technical and Professional:
Primary skills: Technology - ETL & Data Quality - ETL - Others

Preferred Skills:
Technology - ETL & Data Quality - ETL - Others

Posted 1 month ago

Apply

4.0 - 9.0 years

8 - 13 Lacs

Hyderabad

Work from Office

We are looking for a PySpark solutions developer and data engineer who can design and build solutions for one of our Fortune 500 client programs, which aims at building data standardization and curation capabilities on a Hadoop cluster. This high-visibility, fast-paced key initiative will integrate data across internal and external sources, provide analytical insights, and integrate with the customer's critical systems.

Key Responsibilities:
- Design, build and unit test applications on the Spark framework in Python.
- Build PySpark-based applications for both batch and streaming requirements, which requires in-depth knowledge of most Hadoop and NoSQL databases as well.
- Develop and execute data pipeline testing processes and validate business rules and policies.
- Optimize performance of the built Spark applications in Hadoop using configurations around Spark Context, Spark-SQL, DataFrames, and Pair RDDs.
- Optimize performance for data access requirements by choosing the appropriate native Hadoop file formats (Avro, Parquet, ORC, etc.) and compression codecs (see the sketch below).
- Build integrated solutions leveraging Unix shell scripting, RDBMS, Hive, the HDFS file system, HDFS file types, and HDFS compression codecs.
- Build data tokenization libraries and integrate with Hive & Spark for column-level obfuscation.
- Process large amounts of structured and unstructured data, including integrating data from multiple sources.
- Create and maintain an integration and regression testing framework on Jenkins integrated with Bitbucket and/or Git repositories.
- Participate in the agile development process, and document and communicate issues and bugs relative to data standards in scrum meetings.
- Work collaboratively with onsite and offshore teams.
- Develop and review technical documentation for delivered artifacts.
- Solve complex data-driven scenarios and triage defects and production issues.
- Learn-unlearn-relearn concepts with an open and analytical mindset.
- Participate in code releases and production deployments.
- Challenge and inspire team members to achieve business results in a fast-paced and quickly changing environment.

Preferred Qualifications:
- BE/B.Tech/B.Sc. in Computer Science/Statistics from an accredited college or university.
- Minimum 3 years of extensive experience in design, build and deployment of PySpark-based applications.
- Expertise in handling complex, large-scale Big Data environments (preferably 20TB+).
- Minimum 3 years of experience with the following: Hive, YARN, HDFS.
- Hands-on experience writing complex SQL queries and exporting/importing large amounts of data using utilities.
- Ability to build abstracted, modularized, reusable code components.
- Prior experience with ETL tools (preferably Informatica PowerCenter) is advantageous.
- Able to quickly adapt and learn; able to jump into an ambiguous situation and take the lead on resolution.
- Able to communicate and coordinate across various teams; comfortable tackling new challenges and new ways of working.
- Ready to move away from traditional methods and adapt to agile ones; comfortable challenging your peers and leadership team.
- Excellent communication skills and good customer centricity; strong target and solution orientation.
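As an illustration of the file-format and compression choices listed above, here is a minimal PySpark sketch; the paths, partition column, and shuffle setting are hypothetical placeholders to adapt to the actual cluster:

```python
# Sketch of the columnar-format/compression optimization mentioned above.
# Paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("format-optimization-sketch")
    .config("spark.sql.shuffle.partitions", "200")  # tune for cluster size
    .getOrCreate()
)

raw = spark.read.option("header", True).csv("hdfs:///data/raw/transactions")

# Columnar format plus a splittable compression codec suits analytical scans.
(
    raw.repartition("txn_date")                  # hypothetical partition column
       .write.mode("overwrite")
       .option("compression", "snappy")
       .partitionBy("txn_date")
       .parquet("hdfs:///data/curated/transactions")
)
```

The same pattern applies with `.orc(...)` or Avro output; the codec choice trades CPU cost against scan throughput.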

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office

Educational Qualification: Bachelor of Engineering, Bachelor of Technology
Service Line: Enterprise Package Application Services

Responsibilities:
A day in the life of an Infoscion - As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions and facilitate deployment resulting in client delight. You will develop a proposal by owning parts of the proposal document and by giving inputs in solution design based on areas of expertise. You will plan the activities of configuration, configure the product as per the design, conduct conference room pilots and assist in resolving any queries related to requirements and solution design. You will conduct solution/product demonstrations and POC/Proof of Technology workshops, and prepare effort estimates which suit the customer's budgetary requirements and are in line with the organization's financial guidelines. You will actively lead small projects and contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Ability to develop value-creating strategies and models that enable clients to innovate, drive growth and increase their business profitability
- Good knowledge of software configuration management systems
- Awareness of latest technologies and industry trends
- Logical thinking and problem-solving skills along with an ability to collaborate
- Understanding of the financial processes for various types of projects and the various pricing models available
- Ability to assess current processes, identify improvement areas and suggest technology solutions
- Knowledge of one or two industry domains
- Client interfacing skills
- Project and team management

Location of posting: Infosys Ltd. is committed to ensuring you have the best experience throughout your journey with us. We currently have open positions in a number of locations across India - Bangalore, Pune, Hyderabad, Chennai, Chandigarh, Mysore, Kolkata, Trivandrum, Indore, Nagpur, Mangalore, Noida, Bhubaneswar, Coimbatore, Mumbai, Jaipur, Hubli, Vizag. While we work in accordance with business requirements, we shall strive to offer you the location of your choice, where possible.

Technical and Professional:
Must have participated in at least one S/4HANA data migration project: conducting workshops to define functional data migration requirements with the business, followed by target template definition, source-to-target mapping, conversion rules, and data validation/testing, along with identification of the functional configuration required. Must know the integration/overlap/dependencies among cross-functional modules like OTC, P2P, PM, PP, FI.
- P2P: Experience in Material and Vendor Management in ECC & SAP S/4, knowledge of P2P processes including STO, Return, Subcontracting, Third Party, S/4 configuration and testing; OR
- OTC: Knowledge of the OTC (Order to Cash) cycle; OR
- PM/QM/PP: Knowledge of business processes for (preventive, corrective, refurbishment, calibration) maintenance, plus data migration experience from ECC to S/4; OR
- FI: Knowledge of FICO processes and awareness of master and transaction data related to Chart of Accounts (COA) & GL, Accounts Payable, Accounts Receivable, Bank Master, GL Accounts and balances.

Preferred Skills:
ETL - Data Integration - SAP BusinessObjects Data Services (SAP BODS); Data Migration - SAP Specific Data Migration

Posted 1 month ago

Apply

8.0 - 13.0 years

5 - 10 Lacs

Hyderabad

Work from Office

Sr. Developer with special emphasis and 8 to 10 years of experience in PySpark and Python along with ETL tools (Talend / Ab Initio / Informatica / similar). Good exposure to ETL tools is needed to understand existing flows, rewrite them in Python and PySpark, and execute the test plans.
- 8-10 years of experience in designing and developing PySpark applications and ETL jobs using ETL tools.
- 5+ years of sound knowledge of PySpark for implementing ETL logic.
- Strong understanding of frontend technologies such as HTML, CSS, React & JavaScript.
- Proficiency in data modeling and design, including PL/SQL development.
- Creating test plans to understand current ETL flows and rewriting them in PySpark (see the sketch below).
- Providing ongoing support and maintenance for ETL applications, including troubleshooting and resolving issues.
- Expertise in practices like Agile, peer reviews, and continuous integration.
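Rewriting an ETL-tool job in PySpark, as the role describes, usually means re-expressing filter, lookup/join, and aggregator stages as DataFrame transformations. A minimal sketch under hypothetical paths and schema:

```python
# Hypothetical re-implementation of a simple ETL mapping
# (filter + join + aggregate) as PySpark DataFrame logic.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-rewrite-sketch").getOrCreate()

orders = spark.read.parquet("s3://bucket/staging/orders")        # hypothetical path
customers = spark.read.parquet("s3://bucket/staging/customers")  # hypothetical path

daily_revenue = (
    orders.filter(F.col("status") == "COMPLETE")               # filter stage
          .join(customers, "customer_id", "inner")             # lookup/join stage
          .groupBy("order_date", "region")                     # aggregator stage
          .agg(F.sum("amount").alias("revenue"),
               F.countDistinct("customer_id").alias("buyers"))
)

daily_revenue.write.mode("overwrite").parquet("s3://bucket/marts/daily_revenue")
```

A test plan for such a rewrite would typically compare the output of this DataFrame against the legacy tool's target table row-for-row.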

Posted 1 month ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Hyderabad

Work from Office

Educational Qualification: Bachelor of Technology, Bachelor of Engineering
Service Line: Enterprise Package Application Services

Responsibilities:
A day in the life of an Infoscion - As part of the Infosys consulting team, your primary role would be to actively aid the consulting team in different phases of the project, including problem definition, effort estimation, diagnosis, solution generation, design and deployment. You will explore alternatives to the recommended solutions based on research that includes literature surveys, information available in public domains, vendor evaluation information, etc., and build POCs. You will create requirement specifications from the business needs, and define the to-be processes and detailed functional designs based on requirements. You will support configuring solution requirements on the products; understand any issues, diagnose their root cause, seek clarifications, and then identify and shortlist solution alternatives. You will also contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Ability to work with clients to identify business challenges and contribute to client deliverables by refining, analyzing, and structuring relevant data
- Awareness of latest technologies and trends
- Logical thinking and problem-solving skills along with an ability to collaborate
- Ability to assess current processes, identify improvement areas and suggest technology solutions
- Knowledge of one or two industry domains

Location of posting: Infosys Ltd. is committed to ensuring you have the best experience throughout your journey with us. We currently have open positions in a number of locations across India - Bangalore, Pune, Hyderabad, Chennai, Chandigarh, Mysore, Kolkata, Trivandrum, Indore, Nagpur, Mangalore, Noida, Bhubaneswar, Coimbatore, Mumbai, Jaipur, Hubli, Vizag. While we work in accordance with business requirements, we shall strive to offer you the location of your choice, where possible.

Technical and Professional:
Must have participated in at least one S/4HANA data migration project: conducting workshops to define functional data migration requirements with the business, followed by target template definition, source-to-target mapping, conversion rules, and data validation/testing, along with identification of the functional configuration required. Must know the integration/overlap/dependencies among cross-functional modules like OTC, P2P, PM, PP, FI.
- P2P: Experience in Material and Vendor Management in ECC & SAP S/4, knowledge of P2P processes including STO, Return, Subcontracting, Third Party, S/4 configuration and testing; OR
- OTC: Knowledge of the OTC (Order to Cash) cycle; OR
- PM/QM/PP: Knowledge of business processes for (preventive, corrective, refurbishment, calibration) maintenance, plus data migration experience from ECC to S/4; OR
- FI: Knowledge of FICO processes and awareness of master and transaction data related to Chart of Accounts (COA) & GL, Accounts Payable, Accounts Receivable, Bank Master, GL Accounts and balances.

Preferred Skills:
ETL - Data Integration - SAP BusinessObjects Data Services (SAP BODS); Data Migration - SAP Specific Data Migration

Posted 1 month ago

Apply

3.0 - 6.0 years

5 - 8 Lacs

Mumbai

Work from Office

Job Title: EBA Stress Test Net Interest Income - Regulatory Reporting
Corporate Title: Vice President
Location: Mumbai, India

Role Description:
This role is responsible for supporting the end-to-end execution and regulatory reporting of the European Banking Authority (EBA) stress testing exercises for net interest income (NII), ensuring compliance with EBA guidelines and alignment with internal risk and finance frameworks. The candidate will play a key role in data preparation, validation, template population, and submission of regulatory deliverables, while also contributing to process improvements and regulatory change initiatives.

What we'll offer you:
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your key responsibilities:
- Assist the lead in designing the approach, developing the process, and executing EBA stress testing of net interest income for the DB group.
- Interpret and implement EBA guidelines and methodologies, ensuring alignment with internal risk frameworks.
- Collaborate with Finance, Risk, and Business teams to gather, validate, and analyze NII data inputs.
- Understand and assist in building models and assumptions for projecting NII under baseline and adverse scenarios (a toy sketch follows this listing).
- Actively participate in data governance frameworks and quality assurance best practices.
- Prepare and present detailed reports and insights to senior management and regulatory bodies.
- Drive continuous improvement in stress testing processes, automation, and governance.
- Actively participate in and help lead the change agenda, including UAT.
- Monitor regulatory developments and assess their impact on stress testing requirements.

Your skills and experience:
- Basic understanding of IFRS 9, particularly the classification and measurement of financial instruments and hedge accounting.
- Proficient in handling big data, with experience managing large and complex datasets across multiple systems (e.g., finance, risk, treasury).
- Advanced knowledge of core banking products, including loans, deposits, repos, and reverse repos.
- Strong understanding of interest rate derivatives, such as swaps and caps/floors, and their impact on financial performance.
- Proficiency in programming and ETL tools, including Python, Alteryx, and Tableau, is preferred.
- Prior exposure to EBA stress testing, including familiarity with EBA templates, scenario assumptions, and regulatory expectations, is an advantage.
- General understanding of macroeconomic indicators such as GDP, inflation, unemployment, and interest rates.
- Competent in statistical analysis techniques, including the normal distribution, standard deviation, correlation, and regression.
- Solid mathematical foundation, particularly in linear algebra and financial mathematics.
- 10+ years of experience in regulatory reporting or data analytics; candidates from BITS/IIT with additional professional qualifications such as CFA or FRM, or with a regulatory reporting consulting background from the Big 4, are preferred.
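Purely as an illustration of the arithmetic behind projecting NII under rate scenarios (not EBA methodology), here is a toy Python sketch with made-up balances, rates, and a parallel rate shock:

```python
# Toy sketch: one-year net interest income (NII) under rate scenarios.
# Balances, rates, and scenario shocks are illustrative placeholders only.
portfolio = [
    # (product, balance, rate, is_asset)
    ("loans",    1_000.0, 0.045, True),
    ("deposits",   800.0, 0.015, False),
]

def project_nii(shock_bps: float) -> float:
    """NII with a parallel rate shock applied to all products."""
    shock = shock_bps / 10_000
    nii = 0.0
    for _, balance, rate, is_asset in portfolio:
        income = balance * (rate + shock)
        nii += income if is_asset else -income  # liabilities cost interest
    return nii

for name, bps in [("baseline", 0), ("adverse +200bps", 200)]:
    print(f"{name}: NII = {project_nii(bps):.2f}")
```

Real exercises layer repricing gaps, behavioral assumptions, and EBA template constraints on top of this basic balance-times-rate logic.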

Posted 1 month ago

Apply

5.0 - 10.0 years

5 - 10 Lacs

Hyderabad

Work from Office

Responsibilities:
- Design, develop, and maintain ETL processes using Ab Initio and other ETL tools.
- Manage and optimize data pipelines on AWS.
- Write and maintain complex PL/SQL queries for data extraction, transformation, and loading.
- Provide Level 3 support for ETL processes, troubleshooting and resolving issues promptly.
- Collaborate with data architects, analysts, and other stakeholders to understand data requirements and deliver effective solutions.
- Ensure data quality and integrity through rigorous testing and validation (see the sketch below).
- Stay updated with the latest industry trends and technologies in ETL and cloud computing.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Certification in Ab Initio.
- Proven experience with AWS and cloud-based data solutions.
- Strong proficiency in PL/SQL and other ETL tools.
- Experience in providing Level 3 support for ETL processes.
- Excellent problem-solving skills and attention to detail.
- Strong communication and teamwork abilities.

Preferred Qualifications:
- Experience with other ETL tools such as Informatica, Talend, or DataStage.
- Knowledge of data warehousing concepts and best practices.
- Familiarity with scripting languages (e.g., Python, shell scripting).
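The testing-and-validation bullet above typically becomes a set of automated data-quality checks run after each load. A minimal Python sketch with hypothetical table names; sqlite3 stands in for the actual database driver:

```python
# Data-quality check sketch: null-rate and duplicate-key validation on a
# loaded target table. Table/column names are hypothetical placeholders.
import sqlite3  # stand-in for the actual warehouse driver

CHECKS = {
    "null_customer_id":
        "SELECT COUNT(*) FROM stg_accounts WHERE customer_id IS NULL",
    "duplicate_keys":
        """SELECT COUNT(*) FROM (
               SELECT account_id FROM stg_accounts
               GROUP BY account_id HAVING COUNT(*) > 1
           )""",
}

def run_checks(conn: sqlite3.Connection) -> bool:
    ok = True
    for name, sql in CHECKS.items():
        (violations,) = conn.execute(sql).fetchone()
        if violations:
            print(f"FAIL {name}: {violations} offending rows")
            ok = False
        else:
            print(f"PASS {name}")
    return ok
```

Wiring such checks into the pipeline scheduler turns Level 3 firefighting into early, automated detection.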

Posted 1 month ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Pune

Work from Office

Responsibilities:
- Design, develop, and maintain ETL processes using Ab Initio and other ETL tools.
- Manage and optimize data pipelines on AWS.
- Write and maintain complex PL/SQL queries for data extraction, transformation, and loading.
- Provide Level 3 support for ETL processes, troubleshooting and resolving issues promptly.
- Collaborate with data architects, analysts, and other stakeholders to understand data requirements and deliver effective solutions.
- Ensure data quality and integrity through rigorous testing and validation.
- Stay updated with the latest industry trends and technologies in ETL and cloud computing.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Certification in Ab Initio.
- Proven experience with AWS and cloud-based data solutions.
- Strong proficiency in PL/SQL and other ETL tools.
- Experience in providing Level 3 support for ETL processes.
- Excellent problem-solving skills and attention to detail.
- Strong communication and teamwork abilities.

Preferred Qualifications:
- Experience with other ETL tools such as Informatica, Talend, or DataStage.
- Knowledge of data warehousing concepts and best practices.
- Familiarity with scripting languages (e.g., Python, shell scripting).

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Hyderabad

Work from Office

Gen AI Integration Developer
- Extensive implementation experience in the data analytics space, or a senior developer role in one of the modern technology stacks.
- Excellent programming skills and proficiency in at least one of the major programming/scripting languages used in Gen AI orchestration, such as Python, PySpark or Java.
- Ability to build API-based scalable solutions and debug and troubleshoot software or design issues.
- Hands-on exposure to integrating at least one of the popular LLMs (OpenAI GPT, PaLM 2, Dolly, Claude 2, Cohere, etc.) using API endpoints (see the sketch below).
- Thorough understanding of prompt engineering; implementation exposure to LLM agent frameworks like LangChain and vector databases such as Pinecone, Chroma or FAISS.
- Ability to quickly conduct experiments and analyze the features and capabilities of newer versions of LLM models as they come onto the market.
- Basic data engineering skills to load structured and unstructured data from source systems to target data stores.
- Work closely with Gen AI leads and other team members to address requirements from the product backlog.
- Build and maintain data pipelines and infrastructure to support AI solutions.
- Desirable: hands-on exposure to using cloud (Azure/GCP/AWS) services for storage, serverless logic, search, transcription and chat; extensive experience with data engineering and ETL tools is a big plus.
- Master's/Bachelor's degree in Computer Science, Statistics or Mathematics.
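As a rough illustration of integrating an LLM via an API endpoint, here is a minimal Python sketch against an OpenAI-style chat-completions schema; the URL, model id, and environment variable are placeholders to adapt to whichever provider the project uses:

```python
# Minimal sketch of calling an LLM over an HTTP API endpoint.
# Endpoint URL, model id, and key variable are placeholders.
import os
import requests

API_URL = "https://api.openai.com/v1/chat/completions"  # assumed endpoint

def ask(prompt: str) -> str:
    resp = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
        json={
            "model": "gpt-4o-mini",          # placeholder model id
            "messages": [{"role": "user", "content": prompt}],
            "temperature": 0.2,              # low temperature for stable output
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

print(ask("Summarize yesterday's pipeline failures in two sentences."))
```

Orchestration frameworks like LangChain wrap this same request/response cycle behind agent and chain abstractions.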

Posted 1 month ago

Apply

8.0 - 13.0 years

10 - 15 Lacs

Bengaluru

Work from Office

- Extensive implementation experience in the data analytics space, or a senior developer role in one of the modern technology stacks.
- Excellent programming skills and proficiency in at least one of the major programming/scripting languages used in Gen AI orchestration, such as Python, PySpark or Java.
- Ability to build API-based scalable solutions and debug and troubleshoot software or design issues.
- Hands-on exposure to integrating at least one of the popular LLMs (OpenAI GPT, PaLM 2, Dolly, Claude 2, Cohere, etc.) using API endpoints.
- Thorough understanding of prompt engineering; implementation exposure to LLM agent frameworks like LangChain and vector databases such as Pinecone, Chroma or FAISS.
- Ability to quickly conduct experiments and analyze the features and capabilities of newer versions of LLM models as they come onto the market.
- Basic data engineering skills to load structured and unstructured data from source systems to target data stores.
- Work closely with Gen AI leads and other team members to address requirements from the product backlog.
- Build and maintain data pipelines and infrastructure to support AI solutions.
- Desirable: hands-on exposure to using cloud (Azure/GCP/AWS) services for storage, serverless logic, search, transcription and chat; extensive experience with data engineering and ETL tools is a big plus.
- Master's/Bachelor's degree in Computer Science, Statistics or Mathematics.

Posted 1 month ago

Apply

8.0 - 13.0 years

10 - 15 Lacs

Hyderabad

Work from Office

- Extensive implementation experience in the data analytics space, or a senior developer role in one of the modern technology stacks.
- Excellent programming skills and proficiency in at least one of the major programming/scripting languages used in Gen AI orchestration, such as Python, PySpark or Java.
- Ability to build API-based scalable solutions and debug and troubleshoot software or design issues.
- Hands-on exposure to integrating at least one of the popular LLMs (OpenAI GPT, PaLM 2, Dolly, Claude 2, Cohere, etc.) using API endpoints.
- Thorough understanding of prompt engineering; implementation exposure to LLM agent frameworks like LangChain and vector databases such as Pinecone, Chroma or FAISS.
- Ability to quickly conduct experiments and analyze the features and capabilities of newer versions of LLM models as they come onto the market.
- Basic data engineering skills to load structured and unstructured data from source systems to target data stores.
- Work closely with Gen AI leads and other team members to address requirements from the product backlog.
- Build and maintain data pipelines and infrastructure to support AI solutions.
- Desirable: hands-on exposure to using cloud (Azure/GCP/AWS) services for storage, serverless logic, search, transcription and chat; extensive experience with data engineering and ETL tools is a big plus.
- Master's/Bachelor's degree in Computer Science, Statistics or Mathematics.

Posted 1 month ago

Apply

8.0 - 13.0 years

5 - 10 Lacs

Pune

Work from Office

Data Engineer

Position Summary:
The Data Engineer is responsible for building and maintaining data pipelines, ensuring the smooth operation of data systems, and optimizing workflows to meet business requirements. This role will support data integration and processing for various applications.

Minimum Qualifications:
6 years overall IT experience, with a minimum of 4 years of work experience in the tech skills below.

Tech Skills:
- Proficient in Python scripting and PySpark for data processing tasks.
- Strong SQL capabilities, with hands-on experience managing big data using ETL tools like Informatica.
- Experience with the AWS cloud platform and its data services, including S3, Redshift, Lambda, EMR, Airflow, Postgres, SNS and EventBridge (a minimal Airflow sketch follows below).
- Skilled in Bash shell scripting.
- Understanding of data lakehouse architecture, particularly the Iceberg format, is a plus.

Preferred:
- Experience with Kafka and MuleSoft API.
- Understanding of healthcare data systems is a plus.
- Experience in Agile methodologies.
- Strong analytical and problem-solving skills.
- Effective communication and teamwork abilities.

Responsibilities:
- Develop and maintain data pipelines and ETL processes to manage large-scale datasets.
- Collaborate on the design and testing of data architectures to align with business needs.
- Implement and optimize data models for efficient querying and reporting.
- Assist in the development and maintenance of data quality checks and monitoring processes.
- Support the creation of data solutions that enable analytical capabilities.
- Contribute to aligning data architecture with overall organizational solutions.
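For the Airflow item above, here is a minimal sketch of a daily three-task DAG (the task names and callables are hypothetical placeholders; real tasks would call out to Spark, S3, or Redshift):

```python
# Sketch of a daily extract-transform-load DAG (Airflow 2.x style).
# Task logic and names are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**_):
    print("pull raw files from the source system (e.g., S3)")

def transform(**_):
    print("clean and standardize with PySpark or pandas")

def load(**_):
    print("load curated data into Redshift/Postgres")

with DAG(
    dag_id="daily_etl_sketch",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load  # linear dependency chain
```

Event-driven variants replace the `@daily` schedule with SNS/EventBridge triggers feeding the same DAG structure.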

Posted 1 month ago

Apply

12.0 - 17.0 years

14 - 19 Lacs

Hyderabad

Work from Office

Data Tester

Highlights:
- 5+ years of experience in data testing.
- ETL Testing: Validating the extraction, transformation, and loading (ETL) of data from various sources.
- Data Validation: Ensuring the accuracy, completeness, and integrity of data in databases and data warehouses.
- SQL Proficiency: Writing and executing SQL queries to fetch and analyze data.
- Data Modeling: Understanding data models, data mappings, and architectural documentation.
- Test Case Design: Creating test cases and test data, and executing test plans.
- Troubleshooting: Identifying and resolving data-related issues.
- Dashboard Testing: Validating dashboards for accuracy, functionality, and user experience.
- Collaboration: Working with developers and other stakeholders to ensure data quality and functionality.

Primary Responsibilities - Dashboard Testing Components:
- Functional Testing: Simulating user interactions and clicks to ensure dashboards are functioning correctly.
- Performance Testing: Evaluating dashboard responsiveness and load times.
- Data Quality Testing: Verifying that the data displayed on dashboards is accurate, complete, and consistent (see the sketch below).
- Usability Testing: Assessing the ease of use and navigation of dashboards.
- Data Visualization Testing: Ensuring charts, graphs, and other visualizations are accurate and present data effectively.
- Security Testing: Verifying that dashboards are secure and protect sensitive data.

Tools and Technologies:
- SQL: Used for querying and validating data; hands-on Snowflake experience.
- ETL Tools: Tools like Talend, Informatica, or Azure Data Factory used for data extraction, transformation, and loading.
- Data Visualization Tools: Tableau, Power BI, or other BI tools used for creating and testing dashboards.
- Testing Frameworks: Frameworks like Selenium or JUnit used for automating testing tasks.
- Cloud Platforms: AWS platforms used for data storage and processing.
- Healthcare domain knowledge is a plus.

Secondary Skills:
- Automation frameworks; life science domain experience.
- UI testing, API testing, any other ETL tools.
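Automated data-quality testing of a dashboard often reduces to asserting that the headline metric matches its warehouse source. A pytest sketch, with sqlite3 standing in for the actual Snowflake connection and hypothetical table/column names:

```python
# pytest sketch: verify a dashboard's headline metric against the warehouse.
# Table and column names are hypothetical placeholders.
import sqlite3

import pytest

@pytest.fixture
def conn():
    c = sqlite3.connect("warehouse.db")  # stand-in for a Snowflake connection
    yield c
    c.close()

def test_dashboard_total_matches_source(conn):
    (source_total,) = conn.execute(
        "SELECT SUM(claim_amount) FROM fact_claims"
    ).fetchone()
    (dashboard_total,) = conn.execute(
        "SELECT total_claims FROM dashboard_kpis WHERE kpi_date = DATE('now')"
    ).fetchone()
    # Tolerate only floating-point noise between the two figures.
    assert dashboard_total == pytest.approx(source_total, rel=1e-9)
```

The same pattern scales out to one test per KPI, run on a schedule after each dashboard refresh.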

Posted 1 month ago

Apply

7.0 - 12.0 years

8 - 12 Lacs

Hyderabad

Work from Office

Role: ETL QA
Experience: 6-9 years
Location: Chennai / Hyderabad

Job Description - ETL Tester:
- Hands-on experience in ETL testing.
- Working knowledge of SSIS ETL tools at a QA level.
- Develop comprehensive test plans, test cases, and test scripts specifically for SSIS (SQL Server Integration Services) ETL processes.
- Define testing strategies and methodologies for data integration and data migration projects using SSIS.
- Identify, document, and track defects and inconsistencies in ETL processes.
- Develop and maintain automated test scripts using SQL and other relevant tools to test SSIS packages.
- Implement test automation frameworks to improve testing efficiency and coverage for SSIS processes.
- Proficiency in SQL for querying and validating data.
- Extensive experience with SSIS for ETL processes.
- Familiarity with data warehousing concepts and data modelling.

Posted 1 month ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must have skills: Oracle Business Intelligence (BI) Publisher
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As an Application Designer, you will assist in defining requirements and designing applications to meet business process and application requirements. A typical day involves collaborating with cross-functional teams to gather insights, analyzing user needs, and translating them into functional specifications. You will engage in discussions to refine application designs and ensure alignment with business objectives, while also participating in testing and validation processes to guarantee the quality and effectiveness of the applications developed.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with stakeholders to gather and analyze requirements for application design.
- Participate in the testing and validation of applications to ensure they meet specified requirements.

Professional & Technical Skills:
- Must-have skills: Proficiency in Oracle Business Intelligence (BI) Publisher.
- Strong understanding of data visualization techniques and reporting tools.
- Experience in application design and development methodologies.
- Ability to translate business requirements into technical specifications.
- Familiarity with database management and data integration processes.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Oracle Business Intelligence (BI) Publisher.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Posted 1 month ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Hyderabad

Work from Office

- 4+ years of experience in Tableau.
- Extensive experience in data integration in the banking domain.
- Extensive experience with ETL tools and Oracle DB.
- Very strong communication and interpersonal skills.
- Collaboration, teamwork, time management, data analysis, creativity, research and problem-solving skills.
- Experience in Agile methodology.

Posted 1 month ago

Apply

6.0 - 11.0 years

8 - 13 Lacs

Hyderabad

Work from Office

Role: ETL Tester
Work Mode: Hybrid
Work timings: 2pm to 11pm
Location: Chennai & Hyderabad
Primary Skills: ETL

ETL Tester:
- Hands-on experience in ETL testing.
- Working knowledge of SSIS ETL tools at a QA level.
- Develop comprehensive test plans, test cases, and test scripts specifically for SSIS (SQL Server Integration Services) ETL processes.
- Define testing strategies and methodologies for data integration and data migration projects using SSIS.
- Identify, document, and track defects and inconsistencies in ETL processes.
- Develop and maintain automated test scripts using SQL and other relevant tools to test SSIS packages.
- Implement test automation frameworks to improve testing efficiency and coverage for SSIS processes.
- Proficiency in SQL for querying and validating data.
- Extensive experience with SSIS for ETL processes.
- Familiarity with data warehousing concepts and data modelling.

Posted 1 month ago

Apply

3.0 - 6.0 years

7 - 11 Lacs

Chennai

Work from Office

Urgent Opening for Technical Support - IT Product - Bangalore/Hyderabad
Posted On: 19th Jan 2017 10:59 AM
Location: Bangalore & Hyderabad
Role / Position: Technical Support (L1/L2), multiple positions
Experience (required): 3-6 yrs

Description:
Our client is a leading IT company.
Designation: Technical Support L2
Location: Bangalore/Hyderabad

What You'll Do:
You will be the first point of contact for our clients in resolving technical inquiries.

Responsibilities:
- Resolve software and application issues for customers.
- Own end-to-end resolution of customer issues and escalate if necessary.
- Work with other technical and engineering teams.
- Manage inquiries in a timely and efficient manner.

What We're Looking For:
- 3-5 years of expertise in technical support.
- Hands-on experience with SQL/Oracle/ETL tools.
- Active listening skills.
- Strong debugging skills.

Send resumes to girish.expertiz@gmail.com

Posted 1 month ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Data Modeler
Project Role Description: Work with key business representatives, data owners, end users, application designers and data architects to model current and new data.
Must have skills: Snowflake Data Warehouse
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Data Modeler, you will engage with key business representatives, data owners, end users, application designers, and data architects to model both current and new data. Your typical day will involve collaborating with various stakeholders to understand their data needs, analyzing existing data structures, and designing effective data models that support business objectives. You will also participate in discussions to ensure that the data models align with the overall data strategy and architecture, facilitating seamless data integration and accessibility across the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate training sessions for junior team members to enhance their understanding of data modeling.
- Continuously evaluate and improve data modeling processes to ensure efficiency and effectiveness.

Professional & Technical Skills:
- Must-have skills: Proficiency in Snowflake Data Warehouse.
- Strong understanding of data modeling concepts and best practices.
- Experience with ETL processes and data integration techniques.
- Familiarity with data governance and data quality frameworks.
- Ability to communicate complex data concepts to non-technical stakeholders.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based in Pune.
- 15 years of full-time education is required.

Posted 1 month ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Pune

Work from Office

Project Role: Data Modeler
Project Role Description: Work with key business representatives, data owners, end users, application designers and data architects to model current and new data.
Must have skills: Snowflake Data Warehouse
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Data Modeler, you will engage with key business representatives, data owners, end users, application designers, and data architects to model both current and new data. Your typical day will involve collaborating with various stakeholders to understand their data needs, analyzing existing data structures, and designing effective data models that support business objectives. You will also participate in discussions to ensure that the data models align with the overall data strategy and architecture, facilitating seamless data integration and accessibility across the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate training sessions for junior team members to enhance their understanding of data modeling.
- Continuously evaluate and improve data modeling processes to ensure efficiency and effectiveness.

Professional & Technical Skills:
- Must-have skills: Proficiency in Snowflake Data Warehouse.
- Strong understanding of data modeling concepts and best practices.
- Experience with ETL processes and data integration techniques.
- Familiarity with data governance and data quality frameworks.
- Ability to communicate complex data concepts to non-technical stakeholders.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based in Pune.
- 15 years of full-time education is required.

Posted 1 month ago

Apply

7.0 - 12.0 years

9 - 14 Lacs

Mumbai

Work from Office

The leader must demonstrate an ability to anticipate, understand, and act on evolving customer needs, both stated and unstated. Through this, the candidate must create a customer-centric organization and use innovative thinking frameworks to foster value-added relations. With the right balance of bold initiatives, continuous improvement, and governance, the leader must adhere to the delivery standards set by the client and eClerx by leveraging the knowledge of market drivers and competition to effectively anticipate trends and opportunities. Besides, the leader must demonstrate a capacity to transform, align, and energize organization resources, and take appropriate risks to lead the organization in a new direction. As a leader, the candidate must build engaged and high-impact direct, virtual, and cross-functional teams, and take the lead towards raising the performance bar, building capability, and bringing out the best in their teams. By collaborating and forging partnerships both within and outside the functional area, the leader must work towards a shared vision and achieve positive business outcomes.

Associate Program Manager - Roles and responsibilities:
- Understand clients' requirements and provide effective and efficient solutions in Snowflake.
- Understand data transformation and translation requirements and which tools to leverage to get the job done.
- Carry out Proof of Concepts (POCs) in areas that need R&D on cloud technologies.
- Understand data pipelines and modern ways of automating them using cloud-based tooling.
- Test and clearly document implementations, so others can easily understand the requirements, implementation, and test conditions.
- Perform data quality testing and assurance as part of designing, building, and implementing scalable data solutions in SQL.

Technical and Functional Skills:
- Master's/Bachelor's degree in Engineering, Analytics, or a related field.
- 7+ years of total experience, with ~4+ years of relevant hands-on experience with Snowflake utilities: SnowSQL, SnowPipe, Time Travel, replication, and zero-copy cloning (see the sketch below).
- Strong working knowledge of Python; hands-on experience developing Python and Unix scripts for data manipulation.
- In-depth understanding of data warehouses and ETL tools.
- Experience with Snowflake APIs is mandatory.
- Strong knowledge of scheduling and monitoring using Airflow DAGs.
- Strong experience writing SQL queries, joins, stored procedures, and user-defined functions.
- Sound knowledge of data architecture and design.
- Snowflake SnowPro Core certification.
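Zero-copy cloning, one of the Snowflake utilities named above, is often used to stand up a disposable QA copy of production objects. A minimal sketch using the snowflake-connector-python package; the account, credentials, and object names are hypothetical placeholders:

```python
# Sketch: use Snowflake zero-copy cloning to create a test copy of a
# schema before validating a pipeline change. Connection parameters and
# object names are hypothetical placeholders.
import snowflake.connector  # pip install snowflake-connector-python

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="ETL_WH", database="ANALYTICS",
)

with conn.cursor() as cur:
    # Zero-copy clone: metadata-only, so it is near-instant and consumes
    # no extra storage until the cloned data diverges.
    cur.execute("CREATE OR REPLACE SCHEMA ANALYTICS.QA CLONE ANALYTICS.PROD")
    cur.execute("SELECT COUNT(*) FROM ANALYTICS.QA.FACT_ORDERS")
    print("Cloned row count:", cur.fetchone()[0])

conn.close()
```

Because the clone is independent, destructive test runs against ANALYTICS.QA leave production untouched, and Time Travel still covers both objects.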

Posted 1 month ago

Apply