
2459 Data Integration Jobs - Page 21

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

4.0 - 6.0 years

6 - 10 Lacs

Gurugram

Work from Office

As a Senior Data Reporting Services Specialist at Incedo, you will be responsible for creating reports and dashboards for clients. You will work with clients to understand their reporting needs and design reports and dashboards that meet them. You will be skilled in data visualization tools such as Tableau or Power BI and experienced in reporting tasks such as data analysis, dashboard design, and report publishing.

Roles & Responsibilities:
- Design and develop reports and dashboards that help businesses make data-driven decisions.
- Develop data models and perform data analysis to identify trends and insights.
- Work with stakeholders to understand their reporting needs and develop solutions that meet them.
- Proficiency in data visualization tools such as Tableau, Power BI, and QlikView.

Technical Skills Requirements:
- Strong knowledge of SQL and data querying, plus visualization tools such as Tableau, Power BI, or QlikView
- Experience designing and developing data reports and dashboards
- Familiarity with data integration and ETL tools such as Talend or Informatica
- Understanding of data governance and data quality concepts
- Excellent communication skills, with the ability to present complex technical information to non-technical stakeholders clearly and concisely
- Understanding of, and alignment with, the company's long-term vision
- Provide leadership, guidance, and support to team members, ensuring the successful completion of tasks and promoting a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team

Qualifications:
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred

Posted 3 weeks ago

Apply

8.0 - 10.0 years

22 - 27 Lacs

Chennai

Work from Office

Required Skill Set
- 8+ years of experience, with exposure to and experience in PIM solution definition and architecture
- Good understanding of MDM architectures and business processes, specifically for solutioning PIM with Inriver
- Minimum two end-to-end Inriver PIM implementations, having architected at least one; familiarity with the tool's architecture and components (data management, data model and extensions, digital asset management, workflows, supplier portal, data quality, integration aspects)
- Hands-on experience with the Informatica PIM tool, including data modeling; designing and implementing components for imports, exports, data migration, and associated data cleansing/transformations; data validation rules; catalog management; and techniques for managing digital assets and unstructured content in PIM
- Experience designing and implementing automated workflows in the tool for data management processes
- Knowledge and experience integrating PIM with a Data Quality tool (IDQ) to implement product-data DQ processes
- Experience integrating PIM with ETL (data integration) and EAI tools for batch and real-time integrations
- Experience with or understanding of the data services suite of products (data quality, data integration, etc.) is an added advantage
- Understanding of or experience with integrating PIM with external content management tools is useful
- Excellent communication skills, with the ability to interface with customers and drive PIM requirement workshops to elicit and document functional and non-functional requirements

Roles & Responsibilities
- Lead customer discussions and workshops for requirement elicitation, converting business requirements into PIM functional requirements
- Architect, design, and implement Inriver solutions per client requirements, configuring, extending, and customizing the various components of the PIM platform
- Provide technical leadership to other implementation team members (integration, data quality, BPM) throughout implementations
- Build best practices, reusable components, and accelerators for PIM implementations
- Mentor junior team members on PIM solution design and implementation
- Support the practice by leading PIM solution definitions for different customers

Do
1. Develop architectural solutions for new deals and major change requests in existing deals
- Create an enterprise-wide architecture that ensures systems are scalable, reliable, and manageable
- Provide solutioning for RFPs received from clients and ensure overall design assurance
- Develop a direction for managing the portfolio of to-be solutions, including systems, shared infrastructure services, and applications, to better match business outcome objectives
- Analyse the technology environment, enterprise specifics, and client requirements to set a collaboration solution design framework/architecture
- Provide technical leadership for the design, development, and implementation of custom solutions through thoughtful use of modern technology
- Define and understand current-state solutions, and identify improvements, options, and trade-offs to define target-state solutions
- Clearly articulate, document, and sell architectural targets, recommendations, and reusable patterns, and propose investment roadmaps accordingly
- Evaluate and recommend solutions that integrate with the overall technology ecosystem
- Work closely with various IT groups to transition tasks, ensure performance, and manage issues through to resolution
- Produce detailed documentation (application view, multiple sections and views) of the architectural design and solution, covering all artefacts in detail
- Validate the solution/prototype from the standpoint of technology, cost structure, and customer differentiation
- Identify problem areas, perform root-cause analysis of architectural designs and solutions, and provide relevant fixes
- Collaborate with sales, program/project, and consulting teams to reconcile solutions to the architecture
- Track industry and application trends and relate them to planning current and future IT needs
- Provide technical and strategic input during project planning in the form of technical architectural designs and recommendations
- Collaborate with all relevant parties to review the objectives and constraints of solutions and determine conformance with the enterprise architecture
- Identify implementation risks and potential impacts

2. Enable delivery teams by providing optimal delivery solutions/frameworks
- Build and maintain relationships with executives, technical leaders, product owners, peer architects, and other stakeholders to become a trusted advisor
- Develop and establish relevant technical, business-process, and overall support metrics (KPI/SLA) to drive results
- Manage multiple projects and accurately report the status of all major assignments while adhering to all project management standards
- Identify technical, process, and structural risks and prepare a risk mitigation plan for all projects
- Ensure quality assurance of all architecture and design decisions, and provide technical mitigation support to delivery teams
- Recommend tools for reuse and automation to improve productivity and reduce cycle times
- Lead the development and maintenance of the enterprise framework and related artefacts
- Develop trust and build effective working relationships through respectful, collaborative engagement across individual product teams
- Ensure architecture principles and standards are consistently applied to all projects
- Ensure optimal client engagement: support the pre-sales team in presenting the solution design and its principles to the client; negotiate, manage, and coordinate with client teams to ensure all requirements are met and the proposed solution creates an impact; demonstrate thought leadership and strong technical capability to win the client's confidence and act as a trusted advisor

3. Competency building and branding
- Ensure completion of necessary trainings and certifications
- Develop proofs of concept (POCs), case studies, demos, etc. for new growth areas based on market and customer research
- Develop and present Wipro's point of view on solution design and architecture by writing white papers, blogs, etc.
- Attain market referenceability and recognition through top analyst rankings, client testimonials, and partner credits
- Be the voice of Wipro's thought leadership by speaking in internal and external forums
- Mentor developers, designers, and junior architects on the project for their further career development and enhancement
- Contribute to the architecture practice by conducting selection interviews, etc.

4. Team management
- Resourcing: anticipate new talent requirements per market/industry trends and client requirements; hire adequate and suitable resources for the team
- Talent management: ensure adequate onboarding and training for team members to enhance capability and effectiveness; build an internal talent pool and ensure career progression within the organization; manage team attrition; drive diversity in leadership positions
- Performance management: set goals for the team, conduct timely performance reviews, and provide constructive feedback to direct reports; ensure Performance Nxt is followed for the entire team
- Employee satisfaction and engagement: lead and drive engagement initiatives for the team; track team satisfaction scores and identify initiatives to build engagement

Mandatory Skills: Informatica MDM. Experience: 8-10 years.

Posted 3 weeks ago

Apply

1.0 - 5.0 years

3 - 7 Lacs

Bengaluru

Work from Office

ETRM Data Engineer

Key Responsibilities
- Design, develop, and maintain scalable data pipelines and ETRM systems.
- Work on data integration projects within the Energy Trading and Risk Management (ETRM) domain.
- Collaborate with cross-functional teams to integrate data from ETRM trading systems such as Allegro, RightAngle, and Endur.
- Optimize and manage data storage solutions in Data Lake and Snowflake.
- Develop and maintain ETL processes using Azure Data Factory and Databricks.
- Write efficient and maintainable code in Python for data processing and analysis.
- Ensure data quality, accuracy, integrity, and availability across data sources, platforms, and trading systems.
- Collaborate with traders, analysts, and IT teams to understand data requirements and deliver robust solutions.
- Optimize and enhance the data architecture for performance and scalability.

Mandatory Skills
- Python/PySpark
- FastAPI, Pydantic, SQLAlchemy (see the sketch below)
- Snowflake or SQL
- Data Lake
- Azure Data Factory (ADF)
- CI/CD, Azure fundamentals, Git
- Integration of data solutions with ETRM trading systems (Allegro, RightAngle, Endur)

Good to Have
- Databricks
- Streamlit
- Kafka
- Power BI
- Kubernetes
- FastStream
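As a rough illustration of the FastAPI/Pydantic layer named in the mandatory skills, the sketch below validates an incoming trade record before it would be handed to a downstream pipeline. The route, model fields, and units are assumptions invented for the example, not details from the posting.

```python
# Hypothetical sketch: a typed trade-ingestion endpoint using FastAPI and Pydantic.
# Field names and the /trades route are illustrative, not from the posting.
from datetime import datetime
from enum import Enum

from fastapi import FastAPI
from pydantic import BaseModel, Field

app = FastAPI(title="ETRM ingestion sketch")

class Commodity(str, Enum):
    POWER = "power"
    GAS = "gas"
    CRUDE = "crude"

class Trade(BaseModel):
    trade_id: str = Field(..., min_length=1)
    commodity: Commodity
    volume_mwh: float = Field(..., gt=0)  # assumed unit for the example
    price: float
    counterparty: str
    executed_at: datetime

@app.post("/trades")
def ingest_trade(trade: Trade) -> dict:
    # Pydantic has already validated types and ranges by this point;
    # a real pipeline would persist to Snowflake or a data lake here.
    return {"status": "accepted", "trade_id": trade.trade_id}
```

Run locally with `uvicorn trades:app --reload` (assuming the file is saved as trades.py); FastAPI rejects malformed payloads with a 422 response before any pipeline code runs.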

Posted 3 weeks ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Bengaluru

Work from Office

Role Purpose
The purpose of this role is to interpret data and turn it into information (reports, dashboards, interactive visualizations, etc.) that can offer ways to improve a business, thus informing business decisions.

Do
1. Manage the technical scope of the project in line with the requirements at all stages
a. Gather information from various sources (data warehouses, databases, data integration and modelling) and interpret patterns and trends
b. Develop record management processes and policies
c. Build and maintain relationships at all levels within the client base and understand their requirements
d. Provide sales data, proposals, data insights, and account reviews to the client base
e. Identify areas to increase efficiency and automation of processes
f. Set up and maintain automated data processes
g. Identify, evaluate, and implement external services and tools to support data validation and cleansing
h. Produce and track key performance indicators
2. Analyze data sets and provide adequate information
a. Liaise with internal and external clients to fully understand the data content
b. Design and carry out surveys, and analyze survey data per customer requirements
c. Analyze and interpret complex data sets relating to the customer's business, and prepare reports for internal and external audiences using business analytics reporting tools
d. Create data dashboards, graphs, and visualizations to showcase business performance, and provide sector and competitor benchmarking
e. Mine and analyze large datasets, draw valid inferences, and present them to management using a reporting tool
f. Develop predictive models and share insights with clients per their requirements

Mandatory Skills: Database Architecting. Experience: 5-8 years.

Posted 3 weeks ago

Apply

8.0 - 10.0 years

22 - 27 Lacs

Bengaluru

Work from Office

Role Purpose
The purpose of the role is to define and develop the enterprise data structure, including data warehouse, master data, integration, and transaction processing, while maintaining and strengthening modelling standards and business information.

Do
1. Define and develop a data architecture that aids the organization and clients in new and existing deals
a. Partner with business leadership (adopting the rationalization of the data value chain) to provide strategic, information-based recommendations that maximize the value of data and information assets, protect the organization from disruption, and embrace innovation
b. Assess the benefits and risks of data by using tools such as business capability models to create a data-centric view that quickly visualizes what data matters most to the organization, based on the defined business strategy
c. Create data strategies and roadmaps for the reference data architecture as required by clients
d. Engage all stakeholders to implement data governance models, and ensure the implementation follows each change request
e. Ensure that data storage and database technologies are supported by the enterprise's data management and infrastructure
f. Develop, communicate, support, and monitor compliance with data modelling standards
g. Oversee and monitor all frameworks for managing data across the organization
h. Provide insights on database storage and platforms for ease of use and minimal manual work
i. Collaborate with vendors to ensure integrity, objectives, and system configuration
j. Collaborate with functional and technical teams and clients to understand the implications of data architecture and maximize the value of information across the organization
k. Present the data repository, objects, and source systems, along with data scenarios, for front-end and back-end usage
l. Define high-level data migration plans to transition data from source to target systems/applications, addressing the gaps between current and future state, typically in sync with IT budgeting or other capital planning processes
m. Maintain knowledge of all data service provider platforms and ensure an end-to-end view
n. Oversee all data standards, references, and papers for proper governance
o. Promote, guard, and guide the organization towards common semantics and the proper use of metadata
p. Collect, aggregate, match, consolidate, quality-assure, persist, and distribute such data throughout the organization to ensure a common understanding, consistency, accuracy, and control
q. Provide solutions for RFPs received from clients and ensure overall implementation assurance:
i. Develop a direction to manage the portfolio of all databases, including systems and shared infrastructure services, to better match business outcome objectives
ii. Analyse the technology environment, enterprise specifics, and client requirements to set a collaboration solution for big/small data
iii. Provide technical leadership for the implementation of custom solutions through thoughtful use of modern technology
iv. Define and understand current issues and problems and identify improvements
v. Evaluate and recommend solutions that integrate with the overall technology ecosystem, keeping consistency throughout
vi. Understand the root-cause problems in integrating business and product units
vii. Validate the solution/prototype from the standpoint of technology, cost structure, and customer differentiation
viii. Collaborate with sales and delivery leadership teams to identify future needs and requirements
ix. Track industry and application trends and relate them to planning current and future IT needs

2. Build the enterprise technology environment for data architecture management
a. Develop, maintain, and implement standard patterns for data layers, data stores, data hubs and lakes, and data management processes
b. Evaluate all implemented systems to determine their cost-effectiveness
c. Collect structural and non-structural data from different places and integrate it into one database form
d. Work through every stage of data processing: analysing, creating physical data model designs, solutions, and reports
e. Build enterprise conceptual and logical data models for analytics, operational, and data mart structures in accordance with industry best practices
f. Implement security best practices across all databases based on accessibility and technology
g. Demonstrate a strong understanding of activities within the primary discipline, such as Master Data Management (MDM), Metadata Management, and Data Governance (DG)
h. Demonstrate strong experience in conceptual, logical, and physical database architectures, design patterns, best practices, and programming techniques around relational data modelling and data integration

3. Enable delivery teams by providing optimal delivery solutions/frameworks
a. Build and maintain relationships with delivery and practice leadership teams and other key stakeholders to become a trusted advisor
b. Define database physical structure, functional capabilities, security, and back-up and recovery specifications
c. Develop and establish relevant technical, business-process, and overall support metrics (KPI/SLA) to drive results
d. Monitor system capabilities and performance through tests and configuration
e. Integrate new solutions and troubleshoot previously occurring errors
f. Manage multiple projects and accurately report the status of all major assignments while adhering to all project management standards
g. Identify technical, process, and structural risks and prepare a risk mitigation plan for all projects
h. Ensure quality assurance of all architecture and design decisions, and provide technical mitigation support to delivery teams
i. Recommend tools for reuse and automation to improve productivity and reduce cycle times
j. Help the support and integration teams achieve better efficiency and client experience, including through AI methods
k. Develop trust and build effective working relationships through respectful, collaborative engagement across individual product teams
l. Ensure architecture principles and standards are consistently applied to all projects
m. Ensure optimal client engagement:
i. Support the pre-sales team in presenting the solution design and its principles to the client
ii. Negotiate, manage, and coordinate with client teams to ensure all requirements are met
iii. Demonstrate thought leadership and strong technical capability to win the client's confidence and act as a trusted advisor

Mandatory Skills: Data Governance. Experience: 8-10 years.

Posted 3 weeks ago

Apply

2.0 - 4.0 years

2 - 6 Lacs

Gurugram

Work from Office

About the Opportunity
Job Type: Application, 29 July 2025. Title: Analyst Programmer. Department: WPFH. Location: Gurgaon. Level: 2.

Intro
We're proud to have been helping our clients build better financial futures for over 50 years. How have we achieved this? By working together, and supporting each other, all over the world. So, join our [insert name of team/ business area] team and feel like you're part of something bigger.

About your team
The successful candidate will join the Data team and will need data integration and distribution experience to work within the Distribution Data and Reporting team and with its consumers. The team develops new, and supports existing, middle-tier integration services and business services, and is committed to driving forward the development of leading-edge solutions.

About your role
This role liaises with technical leads, business analysts, and various product teams to design, develop, and troubleshoot ETL jobs for various operational data stores. It involves understanding the technical design, development, and implementation of ETL and EAI architecture using Informatica/ETL tools. The successful candidate will demonstrate an innovative and enthusiastic approach to technology and problem solving, display good interpersonal skills, show confidence and the ability to interact professionally with people at all levels, and exhibit a high level of ownership within a demanding working environment.

Key Responsibilities
- Work with technical leads, business analysts, and other subject matter experts
- Understand the data model/design and develop the ETL jobs
- Sound technical knowledge of Informatica, taking ownership of allocated development activities and working independently
- Working knowledge of Oracle databases, taking ownership of the underlying SQL for the ETL jobs (under guidance of the technical leads)
- Provide development estimates
- Implement standards, procedures, and best practices for data maintenance, reconciliation, and exception management
- Interact with cross-functional teams to coordinate dependencies and deliverables

Essential Skills

Technical
- Deep knowledge and experience of the Informatica PowerCenter tool set (minimum 3 years)
- Experience in Snowflake
- Experience with source control tools
- Experience with job scheduling tools such as Control-M
- Experience in UNIX scripting
- Strong SQL or PL/SQL experience (minimum 2 years)
- Experience with data warehouse, data mart, and ODS concepts
- Knowledge of data normalisation/OLAP and Oracle performance optimisation techniques
- 3+ years' experience of either Oracle or SQL Server and its utilities, coupled with experience of UNIX/Windows

Functional
- 3+ years' experience working within financial organisations, with broad-based business process, application, and technology architecture experience
- Experience with data distribution and access concepts, with the ability to apply them in realising a proper physical model from a conceptual one
- Business-facing, with the ability to work alongside data stewards in systems and the business
- Strong interpersonal, communication, and client-facing skills
- Ability to work closely with cross-functional teams

About you
- B.E./B.Tech/MBA/M.C.A or any other bachelor's degree
- At least 3+ years of experience in data integration and distribution
- Experience in building web services and APIs
- Knowledge of Agile software development life-cycle methodologies

Posted 3 weeks ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Pune

Work from Office

Snowflake Data Engineer

Overall experience: 5+ years in Snowflake and Python, including 5+ years in data preparation and BI projects, understanding business requirements in a BI context and understanding the data model to transform raw data into meaningful data using Snowflake and Python.

- Design and create data models that define the structure and relationships of the various data elements within the organization. This includes conceptual, logical, and physical data models, which help ensure data accuracy, consistency, and integrity.
- Design data integration solutions that allow different systems and applications to share and exchange data seamlessly. This may involve selecting appropriate integration technologies, developing ETL (Extract, Transform, Load) processes, and ensuring data quality during integration.
- Create and maintain an optimal data pipeline architecture.
- Good knowledge of cloud platforms such as AWS/Azure/GCP.
- Good hands-on knowledge of Snowflake is a must.
- Experience with various data ingestion methods (Snowpipe and others), time travel, data sharing, and other Snowflake capabilities (see the sketch below).
- Good knowledge of Python/PySpark, including advanced features of Python.
- Support business development efforts (proposals and client presentations).
- Ability to thrive in a fast-paced, dynamic, client-facing role where delivering solid work products that exceed high expectations is the measure of success.
- Excellent leadership and interpersonal skills; eager to contribute to a team-oriented environment.
- Strong prioritization and multitasking skills, with a track record of meeting deadlines.
- Creative and analytical in a problem-solving environment.
- Effective verbal and written communication skills.
- Adaptable to new environments, people, technologies, and processes; able to manage ambiguity and solve undefined problems.
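As a hedged illustration of the Snowflake-plus-Python skills listed above, the sketch below bulk-loads a staged file and runs a time-travel query with the official snowflake-connector-python package. The account credentials, stage, and table names are placeholders invented for the example.

```python
# Hypothetical sketch: loading a staged file with Snowflake's Python connector.
# Connection parameters and object names are placeholders, not from the posting.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # placeholder
    user="my_user",            # placeholder
    password="***",            # use a secrets manager in practice
    warehouse="ANALYTICS_WH",
    database="RAW",
    schema="SALES",
)
try:
    cur = conn.cursor()
    # COPY INTO is Snowflake's bulk-load statement; @sales_stage is an assumed stage.
    cur.execute(
        "COPY INTO raw_orders FROM @sales_stage/orders/ "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )
    # Time travel: query the table as it was an hour ago.
    cur.execute("SELECT COUNT(*) FROM raw_orders AT(OFFSET => -3600)")
    print(cur.fetchone()[0])
finally:
    conn.close()
```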

Posted 3 weeks ago

Apply

2.0 - 5.0 years

9 - 11 Lacs

Ahmedabad

Work from Office

We are hiring for one of our clients, an IT company based in Ahmedabad, Gujarat.

Job Title: SAS Developer
Experience: 2 to 5 years
Location: PAN India (willing to relocate)

Required Candidate Profile
- SAS Base, Macros, SQL, DI, VA, Viya
- RDBMS: Oracle, SQL Server, Teradata
- Data validation, analytics, reporting
- Banking data models and regulatory reporting
- SAS Enterprise Guide, AML/KYC

Posted 3 weeks ago

Apply

2.0 - 7.0 years

4 - 8 Lacs

Chennai

Work from Office

Project Role: Software Development Engineer
Project Role Description: Analyze, design, code, and test multiple components of application code across one or more clients. Perform maintenance, enhancements, and/or development work.
Must-have skills: PySpark
Good-to-have skills: NA
Minimum experience: 2 years
Educational Qualification: BE

Summary: As a Software Development Engineer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code for multiple clients. Your day will involve collaborating with team members to ensure the successful implementation of software solutions, while also performing maintenance and enhancements to existing applications. You will be responsible for delivering high-quality code and contributing to the overall success of the projects you are involved in, ensuring that client requirements are met effectively and efficiently.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation in and contribution to team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
- Conduct code reviews to ensure adherence to best practices and coding standards.

Professional & Technical Skills:
- Must have: proficiency in PySpark (see the sketch below).
- Strong understanding of data processing frameworks and distributed computing.
- Experience with data integration and ETL processes.
- Familiarity with cloud platforms and services related to data processing.
- Knowledge of version control systems such as Git.

Additional Information:
- The candidate should have a minimum of 2 years of experience in PySpark.
- This position is based at our Chennai office.
- A BE is required.
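For context on the PySpark proficiency this role asks for, here is a minimal batch ETL sketch; the file paths and column names are assumptions for illustration, not project specifics.

```python
# Hypothetical sketch: a small PySpark ETL step of the kind this role describes.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Extract: read raw CSV (path is a placeholder).
orders = spark.read.csv("/data/raw/orders.csv", header=True, inferSchema=True)

# Transform: drop malformed rows, derive a revenue column, aggregate per day.
daily = (
    orders.dropna(subset=["order_id", "qty", "unit_price"])
    .withColumn("revenue", F.col("qty") * F.col("unit_price"))
    .groupBy(F.to_date("order_ts").alias("order_date"))
    .agg(F.sum("revenue").alias("daily_revenue"))
)

# Load: write Parquet for downstream consumers.
daily.write.mode("overwrite").parquet("/data/curated/daily_revenue")

spark.stop()
```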

Posted 3 weeks ago

Apply

5.0 - 8.0 years

3 - 7 Lacs

Jaipur

Work from Office

- 2-3 years or more of experience successfully implementing Anaplan solutions, including as an architect on at least 2 Anaplan implementation projects.
- 5+ years of total experience in related technologies.
- Domain experience in Telecom/Contract Management preferred.
- Anaplan L1, L2, L3, and Solution Architect certifications (mandatory).
- Understand clients' business planning and performance management processes and related business requirements.
- Provide meaningful observations on performance improvements, formula rewriting, and troubleshooting, analyzing problems and their impacts within and across models.
- Hands-on in Anaplan New UX, ALM, Anaplan Connect, and APIs.
- Guide and mentor other team members throughout the implementation process.
- Serve as the architectural SME for large-scale connected planning solutions.
- Provide candid, meaningful feedback and timely progress updates to the Project Manager/Business Partner and team.
- Develop Anaplan model documentation.
- Participate in and/or lead data integration and migration solutions.
- Participate in and/or lead UAT testing and deployment.

Mandatory Skills: Anaplan. Experience: 5-8 years.

Posted 3 weeks ago

Apply

10.0 - 15.0 years

35 - 40 Lacs

Hyderabad

Work from Office

Role Purpose
The purpose of the role is to define and develop the enterprise data structure, including data warehouse, master data, integration, and transaction processing, while maintaining and strengthening modelling standards and business information.

Do
1. Define and develop a data architecture that aids the organization and clients in new and existing deals
a. Partner with business leadership (adopting the rationalization of the data value chain) to provide strategic, information-based recommendations that maximize the value of data and information assets, protect the organization from disruption, and embrace innovation
b. Assess the benefits and risks of data by using tools such as business capability models to create a data-centric view that quickly visualizes what data matters most to the organization, based on the defined business strategy
c. Create data strategies and roadmaps for the reference data architecture as required by clients
d. Engage all stakeholders to implement data governance models, and ensure the implementation follows each change request
e. Ensure that data storage and database technologies are supported by the enterprise's data management and infrastructure
f. Develop, communicate, support, and monitor compliance with data modelling standards
g. Oversee and monitor all frameworks for managing data across the organization
h. Provide insights on database storage and platforms for ease of use and minimal manual work
i. Collaborate with vendors to ensure integrity, objectives, and system configuration
j. Collaborate with functional and technical teams and clients to understand the implications of data architecture and maximize the value of information across the organization
k. Present the data repository, objects, and source systems, along with data scenarios, for front-end and back-end usage
l. Define high-level data migration plans to transition data from source to target systems/applications, addressing the gaps between current and future state, typically in sync with IT budgeting or other capital planning processes
m. Maintain knowledge of all data service provider platforms and ensure an end-to-end view
n. Oversee all data standards, references, and papers for proper governance
o. Promote, guard, and guide the organization towards common semantics and the proper use of metadata
p. Collect, aggregate, match, consolidate, quality-assure, persist, and distribute such data throughout the organization to ensure a common understanding, consistency, accuracy, and control
q. Provide solutions for RFPs received from clients and ensure overall implementation assurance:
i. Develop a direction to manage the portfolio of all databases, including systems and shared infrastructure services, to better match business outcome objectives
ii. Analyse the technology environment, enterprise specifics, and client requirements to set a collaboration solution for big/small data
iii. Provide technical leadership for the implementation of custom solutions through thoughtful use of modern technology
iv. Define and understand current issues and problems and identify improvements
v. Evaluate and recommend solutions that integrate with the overall technology ecosystem, keeping consistency throughout
vi. Understand the root-cause problems in integrating business and product units
vii. Validate the solution/prototype from the standpoint of technology, cost structure, and customer differentiation
viii. Collaborate with sales and delivery leadership teams to identify future needs and requirements
ix. Track industry and application trends and relate them to planning current and future IT needs

2. Build the enterprise technology environment for data architecture management
a. Develop, maintain, and implement standard patterns for data layers, data stores, data hubs and lakes, and data management processes
b. Evaluate all implemented systems to determine their cost-effectiveness
c. Collect structural and non-structural data from different places and integrate it into one database form
d. Work through every stage of data processing: analysing, creating physical data model designs, solutions, and reports
e. Build enterprise conceptual and logical data models for analytics, operational, and data mart structures in accordance with industry best practices
f. Implement security best practices across all databases based on accessibility and technology
g. Demonstrate a strong understanding of activities within the primary discipline, such as Master Data Management (MDM), Metadata Management, and Data Governance (DG)
h. Demonstrate strong experience in conceptual, logical, and physical database architectures, design patterns, best practices, and programming techniques around relational data modelling and data integration

3. Enable delivery teams by providing optimal delivery solutions/frameworks
a. Build and maintain relationships with delivery and practice leadership teams and other key stakeholders to become a trusted advisor
b. Define database physical structure, functional capabilities, security, and back-up and recovery specifications
c. Develop and establish relevant technical, business-process, and overall support metrics (KPI/SLA) to drive results
d. Monitor system capabilities and performance through tests and configuration
e. Integrate new solutions and troubleshoot previously occurring errors
f. Manage multiple projects and accurately report the status of all major assignments while adhering to all project management standards
g. Identify technical, process, and structural risks and prepare a risk mitigation plan for all projects
h. Ensure quality assurance of all architecture and design decisions, and provide technical mitigation support to delivery teams
i. Recommend tools for reuse and automation to improve productivity and reduce cycle times
j. Help the support and integration teams achieve better efficiency and client experience, including through AI methods
k. Develop trust and build effective working relationships through respectful, collaborative engagement across individual product teams
l. Ensure architecture principles and standards are consistently applied to all projects
m. Ensure optimal client engagement:
i. Support the pre-sales team in presenting the solution design and its principles to the client
ii. Negotiate, manage, and coordinate with client teams to ensure all requirements are met
iii. Demonstrate thought leadership and strong technical capability to win the client's confidence and act as a trusted advisor

Mandatory Skills: AI Application Integration. Experience: 10 years.

Posted 3 weeks ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Role Purpose
The purpose of this role is to interpret data and turn it into information (reports, dashboards, interactive visualizations, etc.) that can offer ways to improve a business, thus informing business decisions.

Do
1. Manage the technical scope of the project in line with the requirements at all stages
a. Gather information from various sources (data warehouses, databases, data integration and modelling) and interpret patterns and trends
b. Develop record management processes and policies
c. Build and maintain relationships at all levels within the client base and understand their requirements
d. Provide sales data, proposals, data insights, and account reviews to the client base
e. Identify areas to increase efficiency and automation of processes
f. Set up and maintain automated data processes
g. Identify, evaluate, and implement external services and tools to support data validation and cleansing
h. Produce and track key performance indicators
2. Analyze data sets and provide adequate information
a. Liaise with internal and external clients to fully understand the data content
b. Design and carry out surveys, and analyze survey data per customer requirements
c. Analyze and interpret complex data sets relating to the customer's business, and prepare reports for internal and external audiences using business analytics reporting tools
d. Create data dashboards, graphs, and visualizations to showcase business performance, and provide sector and competitor benchmarking
e. Mine and analyze large datasets, draw valid inferences, and present them to management using a reporting tool
f. Develop predictive models and share insights with clients per their requirements

Posted 3 weeks ago

Apply

10.0 - 12.0 years

25 - 35 Lacs

Faridabad

Work from Office

We're looking for a seasoned SAP Datasphere specialist to design and implement enterprise-level data models and integration pipelines. The role demands strong ETL craftsmanship using SAP-native tools, with foundational knowledge of BW systems leveraged during transitions and migrations.

Roles and Responsibilities

Data Pipeline Development
- Act as an architect for complex ETL workflows leveraging SAP Datasphere's graphical and scripting tools.
- Experience with various sources and targets for data integration, such as S/4HANA, ECC, Oracle, and other third-party sources.
- Experience using BW Bridge and standard BW DataSources with Datasphere.
- Ensure data replication, federation, and virtualization use cases are optimally addressed.

Modeling & Governance
- Design and maintain business-oriented semantic layers within Datasphere, creating abstracted, reusable data models and views tailored for analytics consumption.
- Apply rigorous data governance, lineage tracking, and quality frameworks.

Performance & Operations
- Design highly optimized, performant data models that hold up under heavy data volume.
- Continuously track and enhance the performance of data pipelines and models, ensuring efficient processing and robust scalability.
- Manage workspace structures, access controls, and overall system hygiene.

Team Collaboration & Mentorship
- Collaborate with IT, analytics, and business teams to operationalize data requirements.
- Coach junior engineers and drive standardized practices across the team.

Must-Have Qualifications
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- Minimum 8 years in SAP data warehousing, including exposure to BW/BW4HANA.
- At least 2 years of hands-on experience in SAP Datasphere for ETL, modeling, and integration.
- Expertise in SQL and scripting (Python).
- Solid understanding of data governance, lineage, security, and metadata standards.
- Awareness of ongoing, rapid changes in the SAP landscape, such as the introduction of BDC and Databricks to SAP Data and Analytics.

Nice-to-Have
- Certifications in SAP Datasphere, BW/4HANA, or data engineering.
- Knowledge of data virtualization, federation architectures, and hybrid cloud deployments.
- Experience with Agile or DevOps practices and CI/CD pipelines.

Posted 3 weeks ago

Apply

5.0 - 8.0 years

3 - 7 Lacs

Mumbai

Work from Office

Role Purpose
The purpose of this role is to interpret data and turn it into information (reports, dashboards, interactive visualizations, etc.) that can offer ways to improve a business, thus informing business decisions.

Do
1. Manage the technical scope of the project in line with the requirements at all stages
a. Gather information from various sources (data warehouses, databases, data integration and modelling) and interpret patterns and trends
b. Develop record management processes and policies
c. Build and maintain relationships at all levels within the client base and understand their requirements
d. Provide sales data, proposals, data insights, and account reviews to the client base
e. Identify areas to increase efficiency and automation of processes
f. Set up and maintain automated data processes
g. Identify, evaluate, and implement external services and tools to support data validation and cleansing
h. Produce and track key performance indicators
2. Analyze data sets and provide adequate information
a. Liaise with internal and external clients to fully understand the data content
b. Design and carry out surveys, and analyze survey data per customer requirements
c. Analyze and interpret complex data sets relating to the customer's business, and prepare reports for internal and external audiences using business analytics reporting tools
d. Create data dashboards, graphs, and visualizations to showcase business performance, and provide sector and competitor benchmarking
e. Mine and analyze large datasets, draw valid inferences, and present them to management using a reporting tool
f. Develop predictive models and share insights with clients per their requirements

Posted 3 weeks ago

Apply

3.0 - 5.0 years

5 - 9 Lacs

Ahmedabad

Work from Office

Role Purpose
The purpose of this role is to interpret data and turn it into information (reports, dashboards, interactive visualizations, etc.) that can offer ways to improve a business, thus informing business decisions.

Do
1. Manage the technical scope of the project in line with the requirements at all stages
a. Gather information from various sources (data warehouses, databases, data integration and modelling) and interpret patterns and trends
b. Develop record management processes and policies
c. Build and maintain relationships at all levels within the client base and understand their requirements
d. Provide sales data, proposals, data insights, and account reviews to the client base
e. Identify areas to increase efficiency and automation of processes
f. Set up and maintain automated data processes
g. Identify, evaluate, and implement external services and tools to support data validation and cleansing
h. Produce and track key performance indicators
2. Analyze data sets and provide adequate information
a. Liaise with internal and external clients to fully understand the data content
b. Design and carry out surveys, and analyze survey data per customer requirements
c. Analyze and interpret complex data sets relating to the customer's business, and prepare reports for internal and external audiences using business analytics reporting tools
d. Create data dashboards, graphs, and visualizations to showcase business performance, and provide sector and competitor benchmarking
e. Mine and analyze large datasets, draw valid inferences, and present them to management using a reporting tool
f. Develop predictive models and share insights with clients per their requirements

Posted 3 weeks ago

Apply

5.0 - 10.0 years

10 - 16 Lacs

Hyderabad

Remote

Job Description
As an ETL Developer on Guidewire's Data and Analytics team, you will participate and collaborate with our customers and SI partners who are adopting the Guidewire Data Platform as the centerpiece of their data foundation. You will facilitate, and be an active developer when necessary, to operationalize the agreed-upon ETL architecture goals of our customers, adhering to Guidewire best practices and standards. You will work with our customers, partners, and other Guidewire team members to deliver successful data transformation initiatives, utilizing best practices for the design, development, and delivery of customer projects. You will share knowledge with the wider Guidewire Data and Analytics team to enable predictable project outcomes and emerge as a leader in our thriving data practice. One of our principles is to have fun while we deliver, so this role will need to keep the delivery process fun and engaging for the team, in collaboration with the broader organization. Given the dynamic nature of the work in the Data and Analytics team, we are looking for decisive, highly skilled technical problem solvers who are self-motivated and take proactive action for the benefit of our customers, ensuring they succeed in their journey to Guidewire Cloud Platform. You will collaborate closely with teams located around the world and adhere to our core values: Integrity, Collegiality, and Rationality.

Key Responsibilities:
- Build out technical processes from specifications provided in high-level design and data specification documents.
- Integrate test and validation processes and methods into every step of the development process.
- Work with lead architects and provide input into defining user stories, scope, acceptance criteria, and estimates.
- Bring a systematic problem-solving approach, coupled with a sense of ownership and drive.
- Work independently in a fast-paced Agile environment.
- Actively contribute to the knowledge base from every project you are assigned to.

Qualifications:
- Bachelor's or Master's degree in Computer Science, or an equivalent level of demonstrable professional competency, and 3-5+ years in a technical capacity building out complex ETL data integration frameworks.
- 3+ years of experience with data processing and ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) concepts.
- Experience with ADF or AWS Glue, Spark/Scala, GDP, CDC, and ETL data integration.
- Experience working with relational and/or NoSQL databases.
- Experience working with different cloud platforms (such as AWS, Azure, Snowflake, Google Cloud, etc.).
- Ability to work independently and within a team.

Nice to have:
- Insurance industry experience
- Experience with ADF or AWS Glue, Azure Data Factory, and Spark/Scala
- Experience with the Guidewire Data Platform

Posted 3 weeks ago

Apply

12.0 - 18.0 years

60 - 70 Lacs

Bengaluru

Hybrid

About TresVista
TresVista is a global enterprise offering a diversified portfolio of services that enables its clients to achieve resource optimization by leveraging an offshore capacity model. TresVista's services include investment diligence, industry research, valuation, fund administration, accounting, and data analytics. TresVista has more than 1,800 employees across offices in North America, Europe, and Asia, providing high-caliber support and operating leverage to over 1,000 clients across geographies and asset classes, including asset managers, advisors, corporates, and entrepreneurs.

Overview
This role will strategize, implement, and monitor data analysis technology to improve productivity and operational efficiency and to deliver insights. You will be expected to drive adoption of Business Intelligence tools within the organization and to drive the roadmap and strategic business objectives for the Business Intelligence division. The role also oversees organization-wide automation and AI initiatives: balancing the risks presented by generative AI and predictive AI, helping define our technical architecture, ensuring effective execution through prioritization and governance, and building an industry-leading team through recruitment, training, and development.

Roles and Responsibilities
Building the data lake:
a. Architecting the enterprise data lake
b. Evaluating and finalizing the tech stack for the data lake
c. Leading initiatives to deliver the data lake
d. Designing and implementing a data governance framework
Developing data visualization:
a. Understanding the decisioning situation and mapping it to analytics needs
b. Defining, designing, and conceptualizing data visualization projects
c. Ensuring the overall user experience is considered when designing and deploying innovative solutions and services
Leveraging advanced analytics and AI:
a. Identifying opportunities to leverage predictive technology to automate decisions
b. Driving initiatives around generative AI applications, exploring new use cases such as content automation, conversational interfaces, and dynamic product features
c. Designing, developing, and deploying AI, including NLP models for tasks such as text classification, sentiment analysis, machine translation, and question answering
Project implementation and management:
a. Presenting projects and securing approvals
b. Launching and executing projects, and handling project management for the Business Intelligence vertical
c. Recording product defects, analyzing requirements, and working on impact analysis of planned features
d. Gathering product data, including user behavior, web analytics, product usage, and KPIs/KRAs for features, and synthesizing insights

Prerequisites
- Strong understanding of data science concepts, machine learning, and advanced analytics
- Experience managing technical teams to deliver data solutions, including acquisition, curation, catalogue registration, data product life cycle, and data producer and consumer journeys and usage patterns
- Experienced with planning, estimating, organizing, and delivering multiple projects
- Ability to articulate solutions and process designs clearly
- Strong knowledge of data integration (ETL), data quality (DQ), data governance, iPaaS (APIs, microservices, application integration), and Big Data
- Good web development background (SQL, RDBMS, Java, HTML5, .NET) is a plus
- Good understanding of machine learning tools and their usage, such as Python/R

Experience
The candidate should have at least 12-18 years of hands-on experience in data science, analytics, and machine learning, with at least 5+ years in a leadership role; excellent leadership, communication, and collaboration skills, with a proven ability to align AI initiatives with business goals and deliver measurable outcomes; and strategic thinking, problem-solving, and strong teamwork.

Education
A bachelor's/master's degree from a Tier I institute, or a Ph.D. in data science, computer science, statistics, mathematics, machine learning, or a related quantitative field, is preferred.

Compensation
The compensation structure will be as per industry standards.

Posted 3 weeks ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Chennai

Work from Office

Job Summary
Synechron is seeking an experienced Data Processing Engineer to lead the development of large-scale data processing solutions using Java, Apache Flink/Storm/Beam, and Google Cloud Platform (GCP). In this role, you will collaborate across teams to design, develop, and optimize data-intensive applications that support strategic business objectives. Your expertise will help evolve our data architecture, improve processing efficiency, and ensure the delivery of reliable, scalable solutions in an Agile environment.

Software Requirements
Required:
- Java (version 8 or higher)
- Apache Flink, Storm, or Beam for streaming data processing (see the sketch below)
- Google Cloud Platform (GCP) services, especially BigQuery and related data tools
- Experience with databases such as BigQuery, Oracle, or equivalent
- Familiarity with version control tools such as Git
Preferred:
- Cloud deployment experience, with GCP in particular
- Additional familiarity with containerization (Docker/Kubernetes)
- Knowledge of CI/CD pipelines and DevOps practices

Overall Responsibilities
- Collaborate closely with cross-functional teams to understand data and system requirements, then design scalable solutions aligned with business needs.
- Develop detailed technical specifications, implementation plans, and documentation for new features and enhancements.
- Implement, test, and deploy data processing applications using Java and Apache Flink/Storm/Beam within GCP environments.
- Conduct code reviews to ensure quality, security, and maintainability, supporting team members' growth and best practices.
- Troubleshoot technical issues, resolve bottlenecks, and optimize application performance and resource utilization.
- Stay current with advancements in data processing, cloud technology, and Java development to continuously improve solutions.
- Support testing teams in verifying data workflows and validation processes, ensuring reliability and accuracy.
- Participate in Agile ceremonies, including sprint planning, stand-ups, and retrospectives, to ensure continuous delivery and process improvement.

Technical Skills (by Category)
Programming Languages:
- Required: Java (8+)
- Preferred: Python, Scala, or Node.js for scripting or auxiliary processing
Databases/Data Management:
- Experience with BigQuery, Oracle, or similar relational data stores
Cloud Technologies:
- GCP (BigQuery, Cloud Storage, Dataflow, etc.) with hands-on experience in cloud data solutions
Frameworks and Libraries:
- Apache Flink, Storm, or Beam for stream processing
- Java SDKs, APIs, and data integration libraries
Development Tools and Methodologies:
- Git, Jenkins, JIRA, and Agile/Scrum practices
- Familiarity with containerization (Docker, Kubernetes) is a plus
Security and Compliance:
- Understanding of data security principles in cloud environments

Experience Requirements
- 4+ years of experience in software development, with a focus on data processing and Java-based backend development
- Proven experience working with Apache Flink, Storm, or Beam in production environments
- Strong background in managing large data workflows and pipeline optimization
- Experience with GCP data services and cloud-native development
- Demonstrated success on Agile projects, including collaboration with cross-functional teams
- Previous leadership or mentorship experience is a plus

Day-to-Day Activities
- Design, develop, and deploy scalable data processing applications in Java using Flink/Storm/Beam on GCP
- Collaborate with data engineers, analysts, and architects to translate business needs into technical solutions
- Conduct code reviews, optimize data pipelines, and troubleshoot system issues swiftly
- Document technical specifications, data schemas, and process workflows
- Participate actively in Agile ceremonies, provide updates on task progress, and suggest process improvements
- Support continuous integration and deployment of data applications
- Mentor junior team members, sharing best practices and technical insights

Qualifications
- Bachelor's or Master's degree in Computer Science, Information Technology, or equivalent
- Relevant certifications in cloud technologies or data processing (preferred)
- Evidence of continuous professional development and staying current with industry trends

Professional Competencies
- Strong analytical and problem-solving skills focused on data processing challenges
- Leadership abilities to guide, mentor, and develop team members
- Excellent communication skills for technical documentation and stakeholder engagement
- Adaptability to rapidly changing technologies and project priorities
- Capacity to prioritize tasks and manage time efficiently under tight deadlines
- An innovative mindset to leverage new tools and techniques for performance improvements
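The posting centres on Java, but as a language-neutral illustration of the Flink/Storm/Beam processing model it names, here is a minimal Apache Beam pipeline in Python (Beam's portable SDK). The sample records are invented, and the same pipeline could target Dataflow on GCP with a runner change.

```python
# Hypothetical sketch: a per-key aggregation of the kind these stream frameworks
# perform. Runs locally with Beam's default DirectRunner; element values are
# illustrative, not from the posting.
import apache_beam as beam

with beam.Pipeline() as pipeline:  # pipeline executes when the block exits
    (
        pipeline
        | "Read events" >> beam.Create([
            {"user": "a", "amount": 10},
            {"user": "b", "amount": 4},
            {"user": "a", "amount": 7},
        ])
        | "Key by user" >> beam.Map(lambda e: (e["user"], e["amount"]))
        | "Sum per user" >> beam.CombinePerKey(sum)
        | "Print" >> beam.Map(print)
    )
```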

Posted 3 weeks ago

Apply

10.0 - 15.0 years

18 - 22 Lacs

Gurugram

Remote

Work Mode: Remote
Contract: 6 months

Position Summary:
We are seeking an experienced SAP Data Architect with a strong background in enterprise data management, SAP S/4HANA architecture, and data integration. This role demands deep expertise in designing and governing data structures, ensuring data consistency, and leading data transformation initiatives across large-scale SAP implementations.

Key Responsibilities:
- Lead the data architecture design across multiple SAP modules and legacy systems.
- Define data governance strategies, master data management (MDM), and metadata standards.
- Architect data migration strategies for SAP S/4HANA projects, including ETL, data validation, and reconciliation.
- Develop data models (conceptual, logical, and physical) aligned with business and technical requirements.
- Collaborate with functional and technical teams to ensure integration across SAP and non-SAP platforms.
- Establish data quality frameworks and monitoring practices.
- Conduct impact assessments and ensure scalability of the data architecture.
- Support reporting and analytics requirements through structured data delivery frameworks (e.g., SAP BW, SAP HANA).

Required Qualifications:
- 15+ years of experience in enterprise data architecture, with 8+ years in SAP landscapes.
- Proven experience with SAP S/4HANA data models, SAP Datasphere, SAP HANA Cloud, and SAC, and integrating these with AWS Data Lake (S3).
- Strong knowledge of data migration tools (SAP Data Services, LTMC/LSMW, BODS, etc.).
- Expertise in data governance, master data strategy, and data lifecycle management.
- Experience with cloud data platforms (Azure, AWS, or GCP) is a plus.
- Strong analytical and communication skills to work across business and IT stakeholders.

Preferred Certifications:
- SAP Certified Technology Associate (SAP S/4HANA / Datasphere)
- TOGAF or other enterprise architecture certifications
- ITIL Foundation (for process alignment)

Posted 3 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Hybrid

A bachelor's degree in Computer Science or a related field.
5-7 years of experience working as a hands-on developer in Sybase, DB2, and ETL technologies.
Worked extensively on data integration, designing and developing reusable interfaces.
Advanced experience in Python, DB2, Sybase, shell scripting, Unix, Perl scripting, DB platforms, and database design and modeling.
Expert-level understanding of data warehouses, core database concepts, and relational database design.
Experience in writing stored procedures, optimization, and performance tuning.
Strong technology acumen and a deep strategic mindset.
Proven track record of delivering results.
Proven analytical skills and experience making decisions based on hard and soft data.
A desire and openness to learning and continuous improvement, both of yourself and your team members.
Hands-on experience in the development of APIs is a plus.
Good to have: experience with Business Intelligence tools, Source-to-Pay applications such as SAP Ariba, and Accounts Payable systems.
Familiarity with Postgres and Python is a plus.
Spend Management Technology (SMT) is seeking a Backend/Database Developer who will play an integral role in designing, implementing, and supporting data integration with new systems, the data warehouse, and data extraction solutions across SMT functional areas.
Skills Required:
Sybase
DB2
ETL technologies
Python
Unix, shell scripting, Perl
BI tools / SAP Ariba
Postgres
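A minimal sketch, assuming an ODBC setup, of the reusable-interface work described above: Python with pyodbc staging rows from an interface table and invoking a stored procedure. The DSN, table, and procedure names are hypothetical; Sybase ASE and DB2 both ship ODBC drivers, so one DB-API client can target either backend.

```python
import pyodbc  # generic ODBC client; works against Sybase ASE or DB2 drivers

# Hypothetical DSN; real credentials would come from a vault or environment config.
conn = pyodbc.connect("DSN=SPEND_DW;UID=etl_user;PWD=***")
cur = conn.cursor()

# Read staged rows from a source interface table for a given load date.
cur.execute(
    "SELECT invoice_id, vendor_id, amount FROM stg_ap_invoices WHERE load_dt = ?",
    "2024-01-31",
)
rows = cur.fetchall()
print(f"staged {len(rows)} invoices")

# ODBC call syntax for a stored procedure that applies the shared
# transformation logic (procedure name is illustrative).
cur.execute("{CALL usp_load_ap_invoices (?)}", "2024-01-31")
conn.commit()
conn.close()
```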

Posted 3 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

As a Senior Cloud Support Engineer at Snowflake, you will have the opportunity to work with a dynamic and expanding Support team. Your role will involve leveraging your technical expertise across various operating systems, database technologies, big data, data integration, connectors, and networking to address a wide range of issues related to data. Snowflake Support is dedicated to providing high-quality solutions to facilitate data-driven business insights and outcomes. As part of the team, you will collaborate with customers to understand their needs, offer technical guidance, and champion their feedback for product enhancements.

Key to Snowflake's approach are its core values of customer-centricity, integrity, initiative, and accountability. These values underpin the team's commitment to delivering exceptional Support and fostering meaningful customer relationships.

In this role, you will play a crucial part in driving customer satisfaction by sharing your expertise on the Snowflake Data Warehouse. You will serve as the primary point of contact for customers, offering guidance on product usage and advocating for their feedback to drive product enhancements. Moreover, you will contribute to team knowledge and participate in strategic initiatives to enhance organizational processes.

Furthermore, you will have the opportunity to work closely with Snowflake Priority Support customers, gaining insights into their use cases and ensuring the optimal performance of their Snowflake implementation. Your responsibilities will include providing top-notch service, enabling customers to maximize the benefits of the Snowflake platform.

To be successful in this role, you should ideally have experience in a 24x7 technical support environment, managing case escalations, incident resolution, and database release management. Additionally, you should be comfortable working in partnership with engineering teams to address customer requests and contribute to Support initiatives.

You will be expected to drive technical solutions, adhere to SLAs, demonstrate problem-solving skills, and utilize various tools to investigate issues. Your responsibilities will also include documenting solutions, reporting bugs and feature requests, and collaborating with engineering teams to prioritize and resolve issues.

The ideal candidate will hold a Bachelor's or Master's degree in Computer Science or a related discipline, possess at least 5 years of experience in a technical support role, and have a solid understanding of major RDBMS systems. Proficiency in SQL, query optimization, performance tuning, and system metrics interpretation is essential for this role. Knowledge of distributed computing principles, scripting experience, database migration expertise, and proficiency in cloud cost management tools are considered advantageous.

Candidates should be willing to participate in pager duty rotations, work night shifts, and adapt to schedule changes as needed to support business requirements.

Snowflake is a rapidly growing company, and as part of the team, you will have the opportunity to contribute to our success and shape the future of data analytics. If you are passionate about technology, customer success, and innovation, we invite you to join us on this exciting journey.

For detailed information regarding salary and benefits for positions in the United States, please refer to the job posting on the Snowflake Careers Site at careers.snowflake.com.
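A hedged illustration of routine case triage in this kind of role: using the snowflake-connector-python library to pull recent failed queries from the ACCOUNT_USAGE.QUERY_HISTORY view. The account, user, password, and warehouse values are placeholders, not real credentials.

```python
import snowflake.connector

# Placeholder connection details; in practice these come from secure config.
conn = snowflake.connector.connect(
    account="xy12345",       # hypothetical account locator
    user="support_user",
    password="***",
    warehouse="SUPPORT_WH",
)
cur = conn.cursor()

# Recent failed queries are a common starting point when investigating a case.
cur.execute("""
    SELECT query_id, total_elapsed_time, bytes_scanned
    FROM snowflake.account_usage.query_history
    WHERE execution_status = 'FAIL'
    ORDER BY start_time DESC
    LIMIT 10
""")
for query_id, elapsed_ms, bytes_scanned in cur.fetchall():
    print(query_id, elapsed_ms, bytes_scanned)

cur.close()
conn.close()
```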

Posted 3 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

pune, maharashtra

On-site

Myers-Holum is expanding the NSAW Practice and is actively seeking experienced Enterprise Architects with strong end-to-end data warehousing and business intelligence experience to play a pivotal role leading client engagements on this team. As an Enterprise Architect specializing in Data Integration and Business Intelligence, you will be responsible for leading the strategic design, architecture, and implementation of enterprise data solutions to ensure alignment with clients' long-term business goals.

You will have the opportunity to develop and promote architectural visions for data integration, Business Intelligence (BI), and analytics solutions across various business functions and applications. Leveraging cutting-edge technologies such as the Oracle NetSuite Analytics Warehouse (NSAW) platform, NetSuite ERP, Suite Commerce Advanced (SCA), and other cloud-based and on-premise tools, you will design and build scalable, high-performance data warehouses and BI solutions for clients.

In this role, you will lead cross-functional teams in developing data governance frameworks, data models, and integration architectures to facilitate seamless data flow across disparate systems. By translating high-level business requirements into technical specifications, you will ensure that data architecture decisions align with broader organizational IT strategies and compliance standards. Additionally, you will architect end-to-end data pipelines, integration frameworks, and governance models to enable the seamless flow of structured and unstructured data from multiple sources.

Your responsibilities will also include providing thought leadership in evaluating emerging technologies, tools, and best practices for data management, integration, and business intelligence. You will oversee the deployment and adoption of key enterprise data initiatives, engage with C-suite executives and senior stakeholders to communicate architectural solutions, and lead and mentor technical teams to foster a culture of continuous learning and innovation in data management, BI, and integration.

Furthermore, as part of the MHI team, you will have the opportunity to contribute to the development of internal frameworks, methodologies, and standards for data architecture, integration, and BI. By staying up to date with industry trends and emerging technologies, you will continuously evolve the enterprise data architecture to meet the evolving needs of the organization and its clients.

To qualify for this role, you should possess 10+ years of relevant professional experience in data management, business intelligence, and integration architecture, along with 6+ years of experience in designing and implementing enterprise data architectures. You should have expertise in cloud-based data architectures, proficiency in data integration tools, experience with relational databases, and a strong understanding of BI platforms. Additionally, you should have hands-on experience with data governance, security, and compliance frameworks, as well as exceptional communication and stakeholder management skills.

Joining Myers-Holum as an Enterprise Architect offers you the opportunity to collaborate with curious and thought-provoking minds, shape your future, and positively influence change for clients. You will be part of a dynamic team that values continuous learning, growth, and innovation, while providing stability and growth opportunities within a supportive and forward-thinking organization.

If you are ready to embark on a rewarding career journey with Myers-Holum and contribute to the evolution of enterprise data architecture, we invite you to explore the possibilities and discover your true potential with us.

Posted 3 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

delhi

On-site

We are seeking a highly skilled and experienced Talend Developer with 3-5 years of expertise in ETL development, data integration, and data pipeline optimization. The ideal candidate will possess a strong background in Talend, SQL, and cloud-based data solutions.

Key requirements for this role include extensive experience in Talend ETL development, encompassing the design, construction, and maintenance of data integration workflows. Proficiency in SQL, data modelling, and working with databases such as MySQL, PostgreSQL, Oracle, or SQL Server is essential, as is experience integrating data from diverse sources such as APIs, flat files, cloud storage, and databases. Hands-on experience with cloud platforms such as AWS, Azure, or Google Cloud for data pipeline deployment and optimization is required. Knowledge of big data technologies such as Hadoop, Spark, and Kafka is beneficial, with a specific focus on ETL performance tuning. Familiarity with job scheduling tools like Apache Airflow, Talend Administration Center (TAC), and Control-M is also desirable; a sketch of the Airflow pattern follows below.

The successful candidate should possess strong analytical skills to troubleshoot ETL failures, enhance performance, and refine data workflows. Effective communication abilities are crucial for articulating technical solutions to business stakeholders and collaborating with cross-functional teams.

Qualifications for this position include 3-5 years of hands-on experience in Talend development, ETL workflows, and data integration, along with a solid understanding of data warehousing concepts, ETL best practices, and data governance principles.
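One common pattern for the Airflow scheduling mentioned above, sketched under assumptions: Talend Studio can export a job as a standalone shell launcher, which an Airflow BashOperator then invokes on a daily schedule. The DAG id and script path here are hypothetical.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical DAG wrapping an exported Talend job; the launcher script
# path follows Talend's "<job>_run.sh" export convention.
with DAG(
    dag_id="talend_customer_load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    run_talend_job = BashOperator(
        task_id="run_talend_job",
        bash_command="/opt/talend/jobs/customer_load/customer_load_run.sh",
    )
```

Airflow then provides the retry, alerting, and dependency handling that TAC or Control-M would otherwise supply.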

Posted 3 weeks ago

Apply

1.0 - 4.0 years

2 - 4 Lacs

Hyderabad

Remote

Job description
We have a vacancy with the below details.
Role : Analyst, Data Sourcing Metadata - Cloud
Designation : Analyst
Experience : 1-4 years
Notice Period : Immediate to 60 days (currently serving)
Work Mode : WFH (Remote)
Working Days : 5 days
Mandatory Skills : Data Management, SQL, Cloud tools (AWS/Azure/GCP), ETL tools (Ab Initio, Collibra, Informatica), Data Catalog, Data Lineage, Data Integration, Data Dictionary, Maintenance, RCA, Issue Analysis
Required Skills/Knowledge:
Bachelor's degree, preferably in Engineering or Computer Science, with more than 1 year of hands-on data management experience, or, in lieu of a degree, more than 3 years of experience.
Minimum of 1 year of experience in data management, focusing on metadata management, data governance, or data lineage, with exposure to cloud environments (AWS, Azure, or Google Cloud) and on-premise infrastructure.
Basic understanding of metadata management concepts; familiarity with data cataloging tools (e.g., AWS Glue Data Catalog, Ab Initio, Collibra); basic proficiency in data lineage tracking tools (e.g., Apache Atlas, Ab Initio, Collibra); and understanding of data integration technologies (e.g., ETL, APIs, data pipelines).
Good communication and collaboration skills, strong analytical thinking and problem-solving abilities, ability to work independently and manage multiple tasks, and attention to detail.
Desired Characteristics:
AWS certifications such as AWS Cloud Practitioner or AWS Certified Data Analytics - Specialty.
Familiarity with hybrid cloud environments (combination of cloud and on-prem).
Skilled in Ab Initio Metadata Hub development and support, including importers, extractors, Metadata Hub database extensions, technical lineage, QueryIT, Ab Initio graph development, Ab Initio Control Center, and Express IT.
Experience with harvesting technical lineage and producing lineage diagrams.
Familiarity with Unix, Linux, and Stonebranch, and with database platforms such as Oracle and Hive.
Basic knowledge of SQL and data query languages for managing and retrieving metadata.
Understanding of data governance frameworks (e.g., EDMC DCAM, GDPR compliance).
Familiarity with Collibra.
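A small hedged example of the metadata maintenance this role describes: querying the AWS Glue Data Catalog with boto3 to pull a table's column definitions, for instance when refreshing a data dictionary. The region, database, and table names are hypothetical.

```python
import boto3

# Glue Data Catalog client; region is a placeholder.
glue = boto3.client("glue", region_name="us-east-1")

# Hypothetical database/table; this mirrors the lookup used when maintaining
# a data dictionary or verifying lineage inputs against the catalog.
response = glue.get_table(DatabaseName="finance_dw", Name="ap_invoices")
for col in response["Table"]["StorageDescriptor"]["Columns"]:
    print(col["Name"], col["Type"], col.get("Comment", ""))
```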

Posted 3 weeks ago

Apply

3.0 - 6.0 years

3 - 6 Lacs

Vapi

Work from Office

The Business Analyst/Senior Business Analyst (BA/SBA) for Master Data Management (MDM) in the Shared Service Center (SSC) will be responsible for managing and ensuring the accuracy and consistency of the organization's master data. This role will involve working closely with various departments to collect, analyze, and make changes to data as necessary. The BA/SBA will also be responsible for creating and implementing data standards and policies.

Posted 3 weeks ago

Apply