
2478 Data Integration Jobs - Page 47

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

0.0 - 2.0 years

1 - 3 Lacs

Ahmedabad

Work from Office

MIS Design & System Management
- Maintain and enhance spreadsheets and digital MIS tools aligned with project indicators and outcomes
- Collaborate with program teams to ensure system design aligns with log frames and donor requirements
- Create dashboards and trackers using Excel, Google Sheets, or Google Looker Studio

Data Collection & Entry
- Coordinate and monitor data collection processes using digital platforms
- Validate and clean data sets to ensure consistency and reliability
- Support the digitization of data formats and the improvement of collection tools

Reporting & Documentation
- Generate periodic (weekly/monthly/quarterly) reports for internal teams and external partners
- Summarize data through charts, tables, and presentations for program reviews and strategic decisions
- Contribute to documentation, including donor reports, case studies, and visual reports

Data Quality & Monitoring Support
- Conduct data audits and validations, and troubleshoot discrepancies
- Use MIS tools to track project KPIs, outputs, and outcomes
- Support baseline, midline, and endline surveys with structured MIS inputs

Training & Capacity Building
- Train staff and partners on MIS tools, data formats, and standard operating procedures
- Provide troubleshooting support and create and maintain user guides and manuals

Coordination & Collaboration
- Work closely with cross-functional teams to ensure accurate and timely data submissions
- Support dashboard development for project performance reviews
- Collaborate with M&E and IT teams to improve MIS effectiveness and data integration

Mandatory Qualification and Experience
- Bachelor's degree in Computer Science, Information Technology, Statistics, Data Science, or a related field
- 1-3 years of experience in MIS, data management, or M&E roles, preferably in the development/CSR sector

Technical Skills
- Proficient in advanced Excel (pivot tables, formulas, data validation, dashboards)
- Familiarity with Google Looker Studio, Google Sheets, and basic data visualization
- Hands-on experience with mobile data collection platforms such as Kobo Toolbox, ODK, or Google Forms
- Understanding of MIS design principles aligned with M&E frameworks

Soft Skills
- Strong analytical skills with attention to detail
- Excellent communication and presentation abilities
- Ability to multitask, prioritize responsibilities, and meet deadlines
- Team-oriented with a proactive, problem-solving mindset

Why Join Us
- Work with passionate teams driving change at scale
- Enhance your skills in data systems and social impact measurement
- Be part of a dynamic work environment that values innovation, ownership, and collaboration

How to Apply
Email your CV and a brief cover letter to career@csrbox.org
Subject line: Application for Sr. Associate - MIS Coordinator
Please include:
- Current location
- Years of relevant experience
- Current and expected CTC
- Notice period
- A brief (150-200 word) summary of your experience in CSR-health partnerships or donor-led projects
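The data validation and cleaning duties described above can be illustrated with a minimal sketch in plain Python. The field names ("district", "beneficiaries") are hypothetical examples, not taken from the posting, and a real MIS would apply many more rules:

```python
# Minimal sketch of record-level validation for an MIS data set.
# Field names ("district", "beneficiaries") are hypothetical examples.

def validate_record(record):
    """Return a list of issues found in one survey record."""
    issues = []
    if not record.get("district", "").strip():
        issues.append("missing district")
    try:
        n = int(record.get("beneficiaries", ""))
        if n < 0:
            issues.append("negative beneficiary count")
    except ValueError:
        issues.append("non-numeric beneficiary count")
    return issues

def clean_records(records):
    """Split records into clean rows and rows flagged for follow-up."""
    clean, flagged = [], []
    for r in records:
        problems = validate_record(r)
        (flagged if problems else clean).append((r, problems))
    return clean, flagged

rows = [
    {"district": "Ahmedabad", "beneficiaries": "42"},
    {"district": "", "beneficiaries": "x"},
]
clean, flagged = clean_records(rows)
print(len(clean), len(flagged))  # 1 1
```

In practice the same checks would be expressed as data-validation rules in Excel or Google Sheets, but the logic is the same: flag incomplete or inconsistent rows for follow-up rather than silently dropping them.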

Posted 1 month ago

Apply

3.0 - 8.0 years

10 - 14 Lacs

Bengaluru

Work from Office

About Boomi and What Makes Us Special
Are you ready to work at a fast-growing company where you can make a difference? Boomi aims to make the world a better place by connecting everyone to everything, anywhere. Our award-winning, intelligent integration and automation platform helps organizations power the future of business. At Boomi, you'll work with world-class people and industry-leading technology. We hire trailblazers with an entrepreneurial spirit who can solve challenging problems, make a real impact, and want to be part of building something big. If this sounds like a good fit for you, check out boomi.com or visit our Boomi Careers page to learn more.

Essential Requirements
- 1+ years' experience in the software engineering industry, with experience supporting large-scale software systems in production
- Working experience with AI technologies
- Strong understanding and working experience with GCP/Azure/AWS
- Experience with Ansible/Terraform and Python
- Operations and incident management

Desirable Requirements
- Experience developing Terraform and automation for Infrastructure as Code using Terraform and CloudFormation templates
- Basic understanding of application integration and/or data integration (ETL)

Be Bold. Be You. Be Boomi.
We take pride in our culture and core values and are committed to being a place where everyone can be their true, authentic self. Our team members are our most valuable resources, and we look for and encourage diversity in backgrounds, thoughts, life experiences, knowledge, and capabilities. All employment decisions are based on business needs, job requirements, and individual qualifications. Boomi strives to create an inclusive and accessible environment for candidates and employees. If you need accommodation during the application or interview process, please submit a request to talent@boomi.com. This inbox is strictly for accommodations; please do not send resumes or general inquiries.

Posted 1 month ago

Apply

2.0 - 6.0 years

8 - 12 Lacs

Pune

Work from Office

Job Summary
- Proficiency with major search engines and platforms such as Coveo, Elasticsearch, Solr, MongoDB Atlas, or similar technologies
- Experience with Natural Language Processing (NLP) and machine learning techniques for search relevance and personalization
- Ability to design and implement ranking algorithms and relevance tuning
- Experience with A/B testing and other methods for optimizing search results
- Experience analyzing search logs and metrics to understand user behavior and improve search performance
- Deep understanding of indexing, data storage, and retrieval mechanisms, including retrieval-augmented generation (RAG)
- Experience with data integration, ETL processes, and data normalization
- Knowledge of scaling search solutions to handle large volumes of data and high query loads
- Strong knowledge of programming languages such as C#/.NET, Python, or JavaScript for developing and customizing search functionalities
- Experience integrating search solutions with various APIs and third-party systems
- Understanding of how search interfaces impact user experience and ways to improve search usability and efficiency
- Experience with enterprise-level systems and an understanding of how search integrates with broader IT infrastructure and business processes
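The ranking and relevance-tuning skills listed above can be sketched with a toy field-boosted scorer in plain Python. The boost values and documents are invented for the example; a production system would use the ranking facilities of an engine such as Elasticsearch or Solr:

```python
# Toy ranking: score documents by query-term matches, weighting title
# matches above body matches. Boost values are illustrative only.

def score(doc, query_terms, title_boost=3.0, body_boost=1.0):
    """Weighted count of query-term occurrences in title and body."""
    title_words = doc["title"].lower().split()
    body_words = doc["body"].lower().split()
    s = 0.0
    for term in query_terms:
        s += title_boost * title_words.count(term)
        s += body_boost * body_words.count(term)
    return s

def rank(docs, query):
    """Return docs sorted by descending relevance score."""
    terms = query.lower().split()
    return sorted(docs, key=lambda d: score(d, terms), reverse=True)

docs = [
    {"title": "Data integration basics", "body": "ETL and pipelines"},
    {"title": "Cooking basics", "body": "data free zone"},
]
print(rank(docs, "data integration")[0]["title"])  # Data integration basics
```

Relevance tuning in this simplified picture is the act of adjusting `title_boost` and `body_boost` (and validating the change with A/B tests on real search logs), which mirrors the field-boosting knobs real engines expose.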

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 11 Lacs

Pune

Work from Office

About the Role
We're looking for a Data Engineer to help build reliable and scalable data pipelines that power reports, dashboards, and business decisions at Hevo. You'll work closely with engineering, product, and business teams to make sure data is accurate, available, and easy to use.

Key Responsibilities
- Independently design and implement scalable ELT workflows using tools like Hevo, dbt, Airflow, and Fivetran.
- Ensure the availability, accuracy, and timeliness of datasets powering analytics, dashboards, and operations.
- Collaborate with Platform and Engineering teams to address issues related to ingestion, schema design, and transformation logic.
- Escalate blockers and upstream issues proactively to minimize delays for stakeholders.
- Maintain strong documentation and ensure discoverability of all models, tables, and dashboards.
- Own end-to-end pipeline quality, minimizing escalations or errors in models and dashboards.
- Implement data observability practices such as freshness checks, lineage tracking, and incident alerts.
- Regularly audit and improve accuracy across business domains.
- Identify gaps in instrumentation, schema evolution, and transformation logic.
- Ensure high availability and data freshness through monitoring, alerting, and incident resolution processes.
- Set up internal SLAs, runbooks, and knowledge bases (data catalog, transformation logic, FAQs).
- Improve onboarding material and templates for future engineers and analysts.

Required Skills & Experience
- 3-5 years of experience in Data Engineering, Analytics Engineering, or related roles.
- Proficient in SQL and Python for data manipulation, automation, and pipeline creation.
- Strong understanding of ELT pipelines, schema management, and data transformation concepts.
- Experience with the modern data stack: dbt, Airflow, Hevo, Fivetran, Snowflake, Redshift, or BigQuery.
- Solid grasp of data warehousing concepts: OLAP/OLTP, star/snowflake schemas, relational & columnar databases.
- Understanding of REST APIs, webhooks, and event-based data ingestion.
- Strong debugging skills and the ability to troubleshoot issues across systems.

Preferred Background
- Experience in high-growth industries such as eCommerce, FinTech, or hyper-commerce environments.
- Experience working with or contributing to a data platform (ELT/ETL tools, observability, lineage, etc.).

Core Competencies
- Excellent communication and problem-solving skills
- Attention to detail and a self-starter mindset
- High ownership and urgency in execution
- Collaborative and coachable team player
- Strong prioritization and resilience under pressure
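The data-observability practices the role mentions (freshness checks with incident alerts) can be sketched in a few lines of plain Python. The table names and SLA thresholds here are hypothetical; real deployments would read load timestamps from warehouse metadata and route breaches to an alerting system:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical freshness SLAs per dataset, in hours.
SLAS = {"orders": 1, "invoices": 24}

def stale_tables(last_loaded, now, slas=SLAS):
    """Return tables whose last successful load breaches its SLA."""
    breaches = []
    for table, hours in slas.items():
        loaded_at = last_loaded.get(table)
        if loaded_at is None or now - loaded_at > timedelta(hours=hours):
            breaches.append(table)
    return breaches

now = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
loads = {
    "orders": now - timedelta(minutes=30),   # fresh: within 1-hour SLA
    "invoices": now - timedelta(hours=36),   # stale: exceeds 24-hour SLA
}
print(stale_tables(loads, now))  # ['invoices']
```

A scheduler (e.g. an Airflow DAG) would run such a check periodically and open an incident for each breached table, which is the "freshness check plus incident alert" loop described above.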

Posted 1 month ago

Apply

11.0 - 16.0 years

40 - 45 Lacs

Pune

Work from Office

Role Description
This role is for a Senior Business Functional Analyst in Group Architecture. The role will be instrumental in establishing and maintaining bank-wide data policies, principles, standards, and tool governance. The Senior Business Functional Analyst acts as a link between the business divisions and the data solution providers to align the target data architecture with enterprise data architecture principles and to apply agreed best practices and patterns. Group Architecture partners with each division of the bank to ensure that architecture is defined, delivered, and managed in alignment with the bank's strategy and in accordance with the organization's architectural standards.

Your Key Responsibilities
- Data Architecture: Work closely with stakeholders to understand their data needs, break business requirements into implementable building blocks, and design the solution's target architecture.
- AI/ML: Identify and support the creation of AI use cases focused on delivering the data architecture strategy and data governance tooling. Identify AI/ML use cases and architect pipelines that integrate data flows, data lineage, and data quality. Embed AI-powered data quality, detection, and metadata enrichment to accelerate data discoverability. Assist in defining and driving the data architecture standards and requirements for AI that need to be enabled and used.
- GCP Data Architecture & Migration: Strong working experience with GCP data architecture is a must (BigQuery, Dataplex, Cloud SQL, Dataflow, Apigee, Pub/Sub, ...), along with an appropriate GCP architecture-level certification. Experience handling hybrid architectures and patterns addressing non-functional requirements such as data residency, compliance (e.g. GDPR), and security and access control. Experience developing reusable components and reference architectures using Infrastructure as Code (IaC) platforms such as Terraform.
- Data Mesh: Proficiency in Data Mesh design strategies that embrace the decentralized nature of data ownership, with good domain knowledge to ensure that the data products developed are aligned with business goals and provide real value.
- Data Management Tooling: Assess various tools and solutions comprising data governance capabilities such as data catalog, data modelling and design, metadata management, data quality and lineage, and fine-grained data access management. Assist in developing the medium- to long-term target state of the technologies within the data governance domain.
- Collaboration: Collaborate with stakeholders, including business leaders, project managers, and development teams, to gather requirements and translate them into technical solutions.

Your Skills and Experience
- Demonstrable experience in designing and deploying AI tooling architectures and use cases
- Extensive experience in data architecture within Financial Services
- Strong technical knowledge of data integration patterns, batch and stream processing, data lake/data lakehouse/data warehouse/data mart architectures, caching patterns, and policy-based fine-grained data access
- Proven experience with data management principles, data governance, data quality, data lineage, and data integration, with a focus on Data Mesh
- Knowledge of data modelling concepts such as dimensional modelling and 3NF, and experience in systematic, structured review of data models to enforce conformance to standards
- High-level understanding of data management solutions, e.g. Collibra, Informatica Data Governance
- Proficiency in data modeling and experience with different data modelling tools
- Very good understanding of streaming and non-streaming ETL and ELT approaches for data ingestion
- Strong analytical and problem-solving skills, with the ability to identify complex business requirements and translate them into technical solutions

Posted 1 month ago

Apply

1.0 - 3.0 years

10 - 12 Lacs

Bengaluru

Remote

About the Role
We are looking for a talented Anaplan Modeler to join our team on a contract basis. You will be responsible for designing, developing, and maintaining Anaplan models to support business planning, forecasting, and other analytical processes. This role involves translating business requirements into efficient and scalable Anaplan solutions, collaborating with stakeholders, and ensuring the models meet performance and accuracy standards.

Roles and Responsibilities
- Modelling experience in Anaplan projects, including implementations, upgrades, rollouts, and/or support, is a must.
- Comfortable creating models, modules, lists, line items, subsets, and line-item subsets, and using calculation functions and dashboards according to best practices.
- Exposure to or experience with Anaplan Optimizer, integration methods, and ALM within Anaplan.
- Ability to hold direct discussions with clients to understand their needs, then design, develop, maintain, and elaborate planning models.
- Anaplan Certified Model Builder certification is a plus.
- Assist in conducting, documenting, and signing off business requirements with clients.
- Assign user stories and assist in sprint planning.
- Hands-on modelling experience in Anaplan implementations focused on, but not limited to, financial forecasting, supply chain planning, and HR/sales/incentive compensation management or similar use cases.
- Strong background and experience in consulting roles focused on sales performance planning, supply chain, or financial planning.
- Familiarity with Scrum/Agile.
- Hands-on in MS Excel, using advanced formulae to develop mock-ups for clients.
- Ability to communicate effectively with the client team and in client-facing roles.

Qualifications
Any Bachelor's degree in Finance, Accounting, Business, Computer Science, or a related field, or an MBA in Finance.

How DataGrokr will support your growth
You will be actively encouraged to attain certifications, lead technical workshops, and conduct meetups to grow your technology acumen and personal brand. You will work in an open culture that promotes commitment over compliance, individual responsibility over rules, and bringing out the best in everyone.

Posted 1 month ago

Apply

8.0 - 13.0 years

20 - 25 Lacs

Pune

Remote

Design databases and data warehouses, build Power BI solutions, support enterprise business intelligence, be a strong team player and contributor, and drive continuous improvement. Experience in SQL, Oracle, SSIS/SSRS, Azure, ADF, CI/CD, Power BI, DAX, and Microsoft Fabric is required. Candidate profile: source system data structures, data extraction, data transformation, warehousing, database administration, query development, and Power BI development. Work from home.

Posted 1 month ago

Apply

4.0 - 8.0 years

20 - 30 Lacs

Noida

Work from Office

Role & Responsibilities
- Collaborate with customers' business and IT teams to understand integration requirements in the B2B/Cloud/API/Data/ETL/EAI integration space and implement solutions using the Adeptia platform.
- Design, develop, and configure complex integration solutions, ensuring scalability, performance, and maintainability.
- Take ownership of assigned modules and lead the implementation lifecycle from requirement gathering to production deployment.
- Troubleshoot issues during implementation and deployment, ensuring smooth system performance.
- Guide team members in addressing complex integration challenges and promote best practices, including performance practices.
- Collaborate with offshore/onshore and internal teams to ensure timely execution and coordination of project deliverables.
- Write efficient, well-documented, and maintainable code, adhering to established coding standards.
- Review code and designs of team members, providing constructive feedback to improve quality.
- Participate in Agile processes, including sprint planning, daily standups, and retrospectives, ensuring effective task management and delivery.
- Stay updated with emerging technologies to continuously enhance technical expertise and team skills.

Preferred Candidate Profile
- 5-7 years of hands-on experience designing and implementing integration solutions across B2B, ETL, EAI, Cloud, API, and data integration environments using leading platforms such as Adeptia, Talend, MuleSoft, or equivalent enterprise-grade tools.
- Proficiency in designing and implementing integration solutions, including integration processes, data pipelines, and data mappings, to facilitate the movement of data between applications and platforms.
- Proficiency in applying data transformation and data cleansing as needed to ensure data quality and consistency across different data sources and destinations.
- Good experience performing thorough testing and validation of data integration processes to ensure accuracy, reliability, and data integrity.
- Proficiency working with SOA, RESTful APIs, and SOAP web services with all security policies.
- Good understanding and implementation experience of various security concepts, best practices, and security standards and protocols, such as OAuth, SSL/TLS, SSO, SAML, and IdP (Identity Provider).
- Strong understanding of XML, XSD, XSLT, and JSON.
- Good understanding of RDBMS/NoSQL technologies (MSSQL, Oracle, MySQL).
- Proficiency with transport protocols (HTTPS, SFTP, JDBC) and experience with messaging systems such as Kafka, ASB (Azure Service Bus), or RabbitMQ.
- Hands-on experience in Core Java and exposure to commonly used Java frameworks.
- 5-7 years of experience working in a services delivery organization, reporting directly to the client.
- Strong communication skills and excellent interpersonal skills for building and maintaining positive relationships.
- Exceptional collaboration skills, with the ability to work effectively with customers and internal teams.
- Experienced in gathering business requirements, translating them into actionable technical plans, and aligning teams for successful execution.
- Strong analytical, troubleshooting, and problem-solving skills.
- Proven ability to lead and mentor junior team members.
- Self-motivated with a strong commitment to delivering high-quality results under tight deadlines.
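Independent of any particular platform, the data-mapping and cleansing work described above often reduces to declarative field mappings with a cleansing step per field. A minimal plain-Python sketch (all field names are invented; Adeptia and similar tools express the same idea through their own mapping designers):

```python
# Minimal field-mapping sketch: map a source record to a target schema,
# applying a cleansing function per field. All names are illustrative.

MAPPING = {
    # target field: (source field, cleansing function)
    "customer_name": ("custName", str.strip),
    "email": ("emailAddr", lambda v: v.strip().lower()),
    "order_total": ("amount", lambda v: round(float(v), 2)),
}

def transform(source, mapping=MAPPING):
    """Apply the declarative mapping to one source record."""
    target = {}
    for target_field, (source_field, clean) in mapping.items():
        target[target_field] = clean(source[source_field])
    return target

record = {"custName": "  Acme Corp ", "emailAddr": "Ops@Acme.COM ", "amount": "199.99"}
print(transform(record))
# {'customer_name': 'Acme Corp', 'email': 'ops@acme.com', 'order_total': 199.99}
```

Keeping the mapping as data rather than code is what makes such pipelines testable: the validation work the posting mentions is largely asserting that `transform` produces the expected target record for representative source records.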

Posted 1 month ago

Apply

6.0 - 10.0 years

27 - 30 Lacs

Bengaluru

Work from Office

We are looking for an experienced Azure Data Factory (ADF) Developer to design, develop, and optimize data integration and ETL pipelines on Azure. The ideal candidate will have strong expertise in ADF, Azure Synapse, Azure Databricks, and other Azure data services. They should be skilled in ETL processes, data warehousing, and cloud-based data solutions while ensuring performance, security, and scalability.

Key Responsibilities:
- Design and develop ETL pipelines using Azure Data Factory (ADF) to ingest, transform, and process data.
- Integrate ADF with other Azure services such as Azure Synapse Analytics, Azure Data Lake, Azure Databricks, and SQL Database.
- Develop data transformations using Mapping Data Flows, SQL, and Python/PySpark.
- Optimize ADF performance, data flow, and cost efficiency for scalable data solutions.
- Automate data pipelines, scheduling, and orchestration using triggers and event-driven workflows.
- Troubleshoot ADF pipeline failures and performance bottlenecks, and debug issues.
- Work with Azure Monitor, Log Analytics, and Application Insights for data pipeline monitoring.
- Ensure data security, governance, and compliance with Azure security best practices.
- Collaborate with data engineers, cloud architects, and business stakeholders to define data strategies.
- Implement CI/CD for data pipelines using Azure DevOps, Git, and Infrastructure as Code (Terraform, ARM templates, or Bicep).

Posted 1 month ago

Apply

5.0 - 8.0 years

6 - 9 Lacs

Noida

Work from Office

Role & Responsibilities
- Design and build advanced applications using Canvas Apps, Model-Driven Apps, and Power Pages.
- Architect and implement robust business processes using Power Automate (cloud flows, instant flows, scheduled flows, desktop RPA) with advanced error handling, condition logic, and external system integrations.
- Write and optimize complex Power Fx formulas for dynamic behavior, conditional logic, calculations, and UI interactivity within Canvas Apps.
- Develop custom PowerApps Component Framework (PCF) components using TypeScript, JavaScript, and HTML/CSS.
- Integrate Power Platform solutions with Dataverse, SQL Server, SharePoint, and other external systems via REST APIs and custom connectors.
- Implement CI/CD pipelines and solution ALM using Azure DevOps.
- Leverage Azure Functions, Logic Apps, and API Management for scalable backend integration.
- Follow best practices in governance, security roles, solution management, and lifecycle management.
- Collaborate with business analysts, stakeholders, and other technical teams to translate business requirements into scalable solutions.
- Provide technical documentation, support, and training to internal users and client teams.

Preferred Candidate Profile
- 5+ years of hands-on experience with Microsoft Power Platform, especially in building complex, enterprise-grade apps and flows.
- Expert-level experience with Power Automate, including complex multi-step workflows, dynamic approvals and role-based logic, integration with legacy systems and third-party APIs, and desktop automation using Power Automate Desktop (RPA).
- Deep understanding of and fluency in Power Fx for Canvas Apps.

Posted 1 month ago

Apply

5.0 - 7.0 years

8 - 10 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

Job Overview
We are seeking a highly skilled Technical Data Analyst for a remote contract position (6 to 12 months) to help build a single source of truth for our high-volume direct-to-consumer accounting and financial data warehouse. You will work closely with Finance & Accounting teams and play a pivotal role in dashboard creation, data transformation, and migration from Snowflake to Databricks.

Key Responsibilities
1. Data Analysis & Reporting
- Develop month-end accounting and tax dashboards using SQL in Snowflake (Snowsight)
- Migrate and transition reports/dashboards to Databricks
- Gather, analyze, and transform business requirements from finance/accounting stakeholders into data products
2. Data Transformation & Aggregation
- Build transformation pipelines in Databricks to support balance sheet look-forward views
- Maintain data accuracy and consistency throughout the Snowflake-to-Databricks migration
- Partner with Data Engineering to optimize pipeline performance
3. ERP & Data Integration
- Support integration of financial data with NetSuite ERP
- Validate transformed data to ensure correct ingestion and mapping into ERP systems
4. Ingestion & Data Ops
- Work with Fivetran for ingestion and resolve any pipeline or data accuracy issues
- Monitor data workflows and collaborate with engineering teams on troubleshooting

Required Skills & Qualifications
- 5+ years of experience as a Data Analyst (preferably in the Finance/Accounting domain)
- Strong in SQL, with proven experience in Snowflake and Databricks
- Experience building financial dashboards (month-end close, tax reporting, balance sheets)
- Understanding of financial/accounting data: GL, journal entries, balance sheets, income statements
- Familiarity with Fivetran or similar data ingestion tools
- Experience with data transformation in a cloud environment
- Strong communication and stakeholder management skills
Nice to have: experience working with NetSuite ERP

Apply Now
Please share your updated resume with the following details: full name, total experience, relevant experience in SQL/Snowflake/Databricks, experience in the Finance or Accounting domain, current location, availability (notice period), and current and expected rate.
Location: Remote (Delhi NCR, Bangalore, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad)
Contract Duration: 6 months to 1 year
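As a domain illustration of the month-end dashboard work above: a close typically aggregates journal-entry lines into per-account balances. A tiny plain-Python sketch with made-up figures (the real work would be SQL in Snowflake or Databricks, but the aggregation logic is the same):

```python
# Aggregate journal-entry lines into per-account net balances
# (debits minus credits). Accounts and amounts are invented.

def trial_balance(journal_lines):
    """Sum debits minus credits per account."""
    balances = {}
    for line in journal_lines:
        net = line["debit"] - line["credit"]
        balances[line["account"]] = balances.get(line["account"], 0.0) + net
    return balances

lines = [
    {"account": "Cash", "debit": 1000.0, "credit": 0.0},
    {"account": "Revenue", "debit": 0.0, "credit": 1000.0},
    {"account": "Cash", "debit": 0.0, "credit": 250.0},
    {"account": "Expense", "debit": 250.0, "credit": 0.0},
]
balances = trial_balance(lines)
print(balances)                 # {'Cash': 750.0, 'Revenue': -1000.0, 'Expense': 250.0}
print(sum(balances.values()))   # 0.0
```

The zero total is the double-entry invariant (every debit has a matching credit); validating that it holds after each transformation step is one of the simplest data-accuracy checks to carry through a warehouse migration.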

Posted 1 month ago

Apply

6.0 - 11.0 years

8 - 17 Lacs

Bengaluru

Work from Office

We are looking for a skilled SAP BODS professional with 6 to 11 years of experience. The ideal candidate will have expertise in SAP BODS and S/4HANA. This position is based in Bangalore, Hyderabad, and Chennai.

Roles and Responsibility
- Design, develop, and implement SAP BODS solutions to meet business requirements.
- Collaborate with cross-functional teams to identify and prioritize project requirements.
- Develop and maintain technical documentation for SAP BODS projects.
- Troubleshoot and resolve complex technical issues related to SAP BODS.
- Provide training and support to end-users on SAP BODS applications.
- Ensure data quality and integrity by implementing data validation and testing procedures.

Job Requirements
- Strong knowledge of SAP BODS and S/4HANA technologies.
- Experience with data migration and integration projects.
- Excellent problem-solving skills and attention to detail.
- Ability to work independently and as part of a team.
- Strong communication and interpersonal skills.
- Familiarity with agile development methodologies and version control systems.
Notice period: immediate.

Posted 1 month ago

Apply

6.0 - 8.0 years

8 - 10 Lacs

Chandigarh, Dadra & Nagar Haveli

Work from Office

We are looking for a highly skilled and experienced professional to join our team at IDESLABS PRIVATE LIMITED. The ideal candidate will have 6-8 years of experience.

Roles and Responsibility
- Collaborate with cross-functional teams to design and implement Anaplan models.
- Develop and maintain complex financial models using Anaplan's modeling capabilities.
- Analyze business requirements and provide solutions using Anaplan's data integration features.
- Create reports and dashboards to visualize key performance indicators.
- Work closely with stakeholders to understand business needs and provide training on model usage.
- Troubleshoot issues related to Anaplan models and provide technical support.

Job Requirements
- Strong understanding of Anaplan concepts, including blocks, rules, and workflows.
- Experience working with large datasets and integrating them into Anaplan models.
- Excellent analytical and problem-solving skills with attention to detail.
- Ability to work collaboratively in a team environment and communicate effectively with stakeholders.
- Strong knowledge of financial concepts and principles, including accounting and budgeting.
- Familiarity with other financial planning tools and software is an added advantage.

Location: Chandigarh, Dadra & Nagar Haveli, Daman, Diu, Goa, Hyderabad, Jammu, Lakshadweep, New Delhi, Puducherry, Sikkim

Posted 1 month ago

Apply

6.0 - 10.0 years

8 - 12 Lacs

Bengaluru

Work from Office

We are looking for a skilled Alteryx professional with 6 to 11 years of experience. The ideal candidate will have expertise in SQL and be proficient with Alteryx tools. This position is based in Chennai and Bangalore; immediate joiners or candidates with a notice period of up to 15 days are preferred.

Roles and Responsibility
- Design, develop, and implement data integration solutions using Alteryx.
- Collaborate with cross-functional teams to identify business requirements and develop technical solutions.
- Develop and maintain complex SQL queries for data extraction and manipulation.
- Troubleshoot and resolve issues related to data flow and integration.
- Ensure data quality and integrity by implementing data validation and testing procedures.
- Provide technical support and training to end-users on Alteryx tools and techniques.

Job Requirements
- Strong understanding of SQL concepts and the ability to write complex queries.
- Proficiency in Alteryx tools and technologies.
- Experience working with large datasets and developing scalable data integration solutions.
- Excellent problem-solving skills and attention to detail.
- Ability to work collaboratively in a team environment and communicate effectively with stakeholders.
- Familiarity with data warehousing concepts and ETL processes.
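The "complex SQL queries for data extraction" skill this listing asks for can be shown with a small, self-contained example using SQLite from Python's standard library. The table, columns, and rows are invented for illustration; an Alteryx workflow would run comparable queries against a production database:

```python
import sqlite3

# Build an in-memory table and run an aggregation query of the kind
# used in data-extraction workflows. Schema and rows are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("South", 120.0), ("North", 80.0), ("South", 30.0)],
)

# Group, filter on the aggregate, and order the result.
rows = conn.execute(
    """
    SELECT region, SUM(amount) AS total
    FROM sales
    GROUP BY region
    HAVING SUM(amount) > 100
    ORDER BY total DESC
    """
).fetchall()
print(rows)  # [('South', 150.0)]
conn.close()
```

The `HAVING` clause (filtering on an aggregate rather than a row value) is the sort of detail that distinguishes "complex" extraction queries from simple `SELECT ... WHERE` lookups.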

Posted 1 month ago

Apply

5.0 - 10.0 years

9 - 13 Lacs

Mumbai

Work from Office

Job Summary
- Excellent authoring skills and the ability to independently build resources
- Ability to solve complex business problems and deliver client delight
- Strong analytical and writing skills to build viewpoints on industry trends
- Excellent communication, interpersonal, and presentation skills
- Cross-cultural competence with an ability to thrive in a dynamic environment

Roles & Responsibilities
As part of our Supply Chain and Operations practice, you will help organizations reimagine and transform their supply chains for tomorrow, with a positive impact on the business, society, and the planet. Together, let's innovate, build competitive advantage, and improve business and societal outcomes in an ever-changing, ever-challenging world. Help us make supply chains work better, faster, and be more resilient, with the following initiatives:
- Support clients and teams in the design, development, and implementation of new and improved business processes, enabling technology in supply chain related projects.
- Participate in supply chain planning process and requirement discussions with the client to configure the data structure or data model accordingly.
- Work with the client on the design, development, and testing of supply chain implementation projects.
- Design apt solutions by considering the built-in as well as configurable capabilities of Kinaxis RapidResponse.
- Work with the client team to understand the system landscape.
- Conduct workshops with single points of contact for each legacy system being integrated with Kinaxis RapidResponse.
- Provide data specification documents based on the Kinaxis RapidResponse configuration.
- Create namespaces or tables based on the client's current data flow.
- Create transformation workbooks, design test scripts for configuration testing, and train the integration team on the client business solution.
- Ensure RapidResponse is integrated with the client's systems.

Qualification, Professional & Technical Skills
- MBA from a Tier-1 B-school
- 5+ years of experience working as an Integration Architect on Kinaxis RapidResponse
- End-to-end implementation experience as an Integration Architect
- Experience with the Data Integration server or the Talend tool
- Experience across industries such as Life Sciences, Auto, and Consumer Packaged Goods preferred
- Knowledge of the different scheduled tasks and scripts required to set up a consistent flow of data
- Good understanding of Extraction, Transformation, Load (ETL) concepts to proactively troubleshoot integration issues
- Experience managing the implementation of Kinaxis RapidResponse Administrator and coordinating key stakeholders within the same project
- Must be a certified RapidResponse Administrator, Level 2

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 10 Lacs

Gurugram

Work from Office

Job Summary : Excellent authoring skills and ability to independently build resources Ability to solve complex business problems and deliver client delight Strong analytical and writing skills to build viewpoints on industry trends Excellent communication, interpersonal and presentation skills Cross-cultural competence with an ability to thrive in a dynamic environment Roles & Responsibilities: As a part of our Supply chain and operations practice, you will help organizations reimagine and transform their supply chains for tomorrowwith a positive impact on the business, society and the planet. Together, lets innovate, build competitive advantage, improve business, and societal outcomes, in an ever-changing, ever-challenging world. Help us make supply chains work better, faster, and be more resilient, with the following initiatives: Support clients and teams in the design, development and implementation of new and improved business processes, enabling technology in Supply Chain related projects. Involve in supply chain planning process and requirement discussions with the client to configure the data structure or data model accordingly. Work with the client in the design, development and testing of the supply chain implementation projects. Design apt solutions by considering the inbuilt as well as configurable capabilities of Kinaxis RapidResponse. Work with the client team to understand the system landscape. Perform workshops with single point of contacts of each legacy system which is getting integrated with Kinaxis RapidResponse. Provide data specification documents based on Kinaxis Rapid response configuration. Create Namespace or Tables based on clients current data flow. Create transformation workbooks and design test scripts for configuration testing, and train integration team on client business solution. Ensure RapidResponse gets integrated with clients systems. 
Qualification Professional & Technical Skills: MBA from a Tier-1 B-school. 5+ years of experience working as an Integration Architect on Kinaxis RapidResponse. End-to-end implementation experience as an Integration Architect. Experience with the Data Integration server or the Talend tool. Experience across industries such as Life Sciences, Automotive, and Consumer Packaged Goods preferred. Knowledge of the scheduled tasks and scripts required to maintain a consistent flow of data. Good understanding of Extract, Transform, Load (ETL) concepts to proactively troubleshoot integration issues. Experience managing Kinaxis RapidResponse implementations as an Administrator and coordinating key stakeholders within the same project. Must be a certified RapidResponse Administrator, Level 2.
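The qualification above stresses Extract, Transform, Load fundamentals. A minimal sketch of that pattern in plain Python, using an in-memory SQLite database (all table and field names are hypothetical, not taken from any Kinaxis or Talend API):

```python
import sqlite3

def extract(conn):
    # Extract: pull raw order rows from a (hypothetical) source table.
    return conn.execute("SELECT order_id, qty, unit_price FROM raw_orders").fetchall()

def transform(rows):
    # Transform: derive line totals and drop rows with non-positive quantities.
    return [(oid, qty, qty * price) for oid, qty, price in rows if qty > 0]

def load(conn, rows):
    # Load: write cleaned rows to the target table.
    conn.executemany("INSERT INTO order_totals VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (order_id INT, qty INT, unit_price REAL)")
conn.execute("CREATE TABLE order_totals (order_id INT, qty INT, total REAL)")
conn.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)",
                 [(1, 2, 10.0), (2, 0, 5.0), (3, 1, 7.5)])
load(conn, transform(extract(conn)))
print(conn.execute("SELECT COUNT(*) FROM order_totals").fetchone()[0])  # order 2 filtered out
```

Production integration tools add scheduling, error handling, and restartability around this same three-stage core.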

Posted 1 month ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Project Role : Software Development Engineer Project Role Description : Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work. Must have skills : Databricks Unified Data Analytics Platform Good to have skills : NA Minimum 3 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As a Software Development Engineer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code for multiple clients. Your day will involve collaborating with team members to ensure the successful implementation of software solutions, performing maintenance and enhancements, and contributing to the overall development process. You will be responsible for delivering high-quality code while adhering to best practices and project timelines, ensuring that the applications meet the needs of the clients effectively. Roles & Responsibilities:- Expected to perform independently and become an SME.- Required active participation/contribution in team discussions.- Contribute to providing solutions to work-related problems.- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.- Conduct code reviews to ensure adherence to coding standards and best practices. Professional & Technical Skills: - Must Have Skills: Proficiency in Databricks Unified Data Analytics Platform.- Strong understanding of data processing and analytics workflows.- Experience with cloud-based data solutions and architectures.- Familiarity with programming languages such as Python or Scala.- Knowledge of data integration techniques and ETL processes. Additional Information:- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.- This position is based in Hyderabad.- A 15 years full time education is required.
Qualification 15 years full time education
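The role above calls for data processing and analytics workflows on Databricks. Spark itself needs a cluster, but the shape of a typical job (filter invalid records, then aggregate by key, like a Spark filter().groupBy() chain) can be sketched in plain Python; the field names here are hypothetical:

```python
from collections import defaultdict

events = [
    {"region": "south", "amount": 120.0, "valid": True},
    {"region": "north", "amount": 80.0,  "valid": True},
    {"region": "south", "amount": 40.0,  "valid": False},  # dropped by the filter
    {"region": "south", "amount": 60.0,  "valid": True},
]

# Filter invalid records, then sum amounts per region.
totals = defaultdict(float)
for e in events:
    if e["valid"]:
        totals[e["region"]] += e["amount"]

print(dict(totals))  # {'south': 180.0, 'north': 80.0}
```

On Databricks the same logic would be expressed declaratively over a distributed DataFrame rather than a Python loop.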

Posted 1 month ago

Apply

10.0 - 15.0 years

12 - 17 Lacs

Bengaluru

Work from Office

Project Role : Enterprise Technology Architect Project Role Description : Architect complex end-to-end IT solutions across the enterprise. Apply the latest technology and industry expertise to create better products and experiences. Must have skills : PLM for Capital Asset Lifecycle Management Good to have skills : NA Minimum 12 year(s) of experience is required Educational Qualification : 15 years full time education Summary : We are seeking a visionary and hands-on Senior Digital Thread Architect to lead the design, integration, and implementation of a scalable, cross-domain digital thread ecosystem across product lifecycle stages. The ideal candidate will bridge engineering, manufacturing, supply chain, and operations using advanced platforms such as PLM, MES, IoT, simulation, and digital twins, with a strong emphasis on AI-driven automation and real-time data integration. Roles & Responsibilities: 1. Digital Thread Architecture & Integration: Design and govern a modular digital thread architecture that connects design, simulation, manufacturing, operations, and service. 2. Define data flows, semantic models, APIs, and platform interoperability across systems (PLM, ERP, MES, SCADA, Digital Twin). 3. Lead deployment and integration of tools such as NVIDIA, Siemens Teamcenter, PTC Windchill, Dassault, Ansys, and industrial IoT platforms. 4. Platform & Twin Enablement: Architect scalable digital twin frameworks for assets, processes, and systems. 5. Drive integration of real-time data from sensors, PLCs, and edge devices into digital twin environments. 6. Collaborate on simulation and AI/ML model orchestration. 7. Develop custom interfaces, adapters, and microservices to enable data flow between systems (e.g., Siemens, Aveva, SAP, Jira). 8. Build and manage data pipelines and middleware to facilitate interoperability between systems. 9. Define and enforce standards for traceability, version control, and configuration management across engineering and manufacturing domains.
10. Model Governance & Lifecycle: Establish traceability and versioning strategies for data, models, and simulations across the product lifecycle. 11. Implement lifecycle governance for MBSE, CAD/CAE models, BOMs, and software-defined products. 12. Ensure security, quality, and compliance within the digital thread ecosystem. 13. Collaboration & Strategy: Engage with engineering, IT, manufacturing, and business teams to align initiatives and roadmaps. 14. Provide architectural input to Industry 4.0 strategy, enterprise digital platforms, and infrastructure plans. 15. Evaluate and recommend new tools, standards (e.g., ISO 10303, SysML, USD), and emerging technologies. Professional & Technical Skills: 1. 10+ years in digital transformation, PLM, or manufacturing systems. 2. Deep knowledge of the following: PLM systems (Teamcenter, Windchill, Aras); MBSE / SysML / model-based engineering workflows; real-time data integration / IIoT platforms; simulation tools / digital twin frameworks. 3. Proven experience in enterprise architecture or platform design roles. 4. Hands-on experience with APIs, data modeling, integration frameworks, and cloud-native environments. 5. Background in industries such as Aerospace, Industrial, Consumer Products, Life Sciences, or Utilities. Additional Information: 1. The candidate should have a minimum of 12 years of experience in PLM/ALM digital thread architecture. 2. A Bachelor's or Master's in Engineering, Computer Science, Systems Engineering, or a related field is required. 3. This position is based at the Bengaluru location. Qualification 15 years full time education
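The responsibilities above include building custom adapters so data can flow between systems. One common shape is a per-source adapter that maps each system's native records onto one canonical schema; the field names below are purely hypothetical, not from any vendor API:

```python
# Canonical part record shared across a (hypothetical) digital thread.
CANONICAL_FIELDS = ("part_id", "revision", "source_system")

def adapt_plm_record(rec):
    # A hypothetical PLM export uses different key names; map them across.
    return {"part_id": rec["ItemNumber"],
            "revision": rec["Rev"],
            "source_system": "plm"}

def adapt_erp_record(rec):
    # The (hypothetical) ERP extract names the same fields differently.
    return {"part_id": rec["material_no"],
            "revision": rec["version"],
            "source_system": "erp"}

plm = adapt_plm_record({"ItemNumber": "P-100", "Rev": "B"})
erp = adapt_erp_record({"material_no": "P-100", "version": "B"})

# Both records now share one schema, so downstream traceability
# checks can compare them field by field.
assert set(plm) == set(CANONICAL_FIELDS)
print(plm["part_id"] == erp["part_id"])  # True
```

Keeping the canonical schema stable and versioned is what lets new source systems join the thread without touching existing consumers.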

Posted 1 month ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Navi Mumbai

Work from Office

Project Role : Security Delivery Practitioner Project Role Description : Assist in defining requirements, designing and building security components, and testing efforts. Must have skills : Informatica PowerCenter Good to have skills : Python (Programming Language) Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As a Security Delivery Practitioner, you will assist in defining requirements, designing and building security components, and testing efforts. Your typical day will involve collaborating with various teams to ensure that security measures are effectively integrated into the project lifecycle. You will engage in discussions to understand security needs, contribute to the design of security frameworks, and participate in testing to validate the effectiveness of security solutions. Your role will be pivotal in ensuring that security considerations are embedded in all aspects of project delivery, fostering a culture of security awareness and compliance within the organization. Roles & Responsibilities:- Expected to be an SME.- Collaborate with and manage the team to perform.- Responsible for team decisions.- Engage with multiple teams and contribute to key decisions.- Provide solutions to problems for their immediate team and across multiple teams.- Facilitate training sessions to enhance team knowledge of security practices.- Monitor and evaluate the effectiveness of security measures implemented across projects. Professional & Technical Skills: - Must Have Skills: Proficiency in Informatica PowerCenter.- Good to Have Skills: Experience with Python (Programming Language).- Strong understanding of data integration and ETL processes.- Experience with data quality and governance frameworks.- Familiarity with security compliance standards and best practices.
Additional Information:- The candidate should have a minimum of 5 years of experience in Informatica PowerCenter.- This position is based in Navi Mumbai.- A 15 years full time education is required. Qualification 15 years full time education
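The skills list above pairs Informatica with data-quality and governance frameworks. The kinds of rules such frameworks apply (not-null, uniqueness, range checks) can be sketched in plain Python; the rule names and fields here are hypothetical:

```python
def run_quality_checks(rows):
    """Return a list of (row_index, rule) pairs for every violation found."""
    issues, seen_ids = [], set()
    for i, row in enumerate(rows):
        # Not-null and uniqueness checks on the key column.
        if row.get("customer_id") is None:
            issues.append((i, "not_null:customer_id"))
        elif row["customer_id"] in seen_ids:
            issues.append((i, "unique:customer_id"))
        else:
            seen_ids.add(row["customer_id"])
        # Range check on a numeric attribute.
        if not (0 <= row.get("age", -1) <= 120):
            issues.append((i, "range:age"))
    return issues

rows = [
    {"customer_id": 1, "age": 34},
    {"customer_id": None, "age": 40},   # null key
    {"customer_id": 1, "age": 200},     # duplicate key and out-of-range age
]
print(run_quality_checks(rows))
```

A governance framework adds the same ideas at scale: rule catalogs, thresholds, and reporting on top of checks like these.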

Posted 1 month ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Coimbatore

Work from Office

Project Role : Software Development Engineer Project Role Description : Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work. Must have skills : Databricks Unified Data Analytics Platform Good to have skills : NA Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As a Software Development Engineer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code for multiple clients. Your day will involve collaborating with team members to perform maintenance and enhancements, ensuring that the applications meet the evolving needs of users while adhering to best practices in software development. You will also be responsible for troubleshooting issues and implementing solutions that enhance the overall functionality and performance of the applications. Roles & Responsibilities:- Expected to be an SME.- Collaborate with and manage the team to perform.- Responsible for team decisions.- Engage with multiple teams and contribute to key decisions.- Provide solutions to problems for their immediate team and across multiple teams.- Mentor junior team members to foster their professional growth and development.- Continuously evaluate and improve development processes to enhance team efficiency. Professional & Technical Skills: - Must Have Skills: Proficiency in Databricks Unified Data Analytics Platform.- Strong understanding of data integration and ETL processes.- Experience with cloud computing platforms and services.- Familiarity with programming languages such as Python or Scala.- Knowledge of data warehousing concepts and practices.
Additional Information:- The candidate should have a minimum of 5 years of experience in Databricks Unified Data Analytics Platform.- This position is based at our Coimbatore office.- A 15 years full time education is required. Qualification 15 years full time education

Posted 1 month ago

Apply

4.0 - 9.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Project Role : Software Development Engineer Project Role Description : Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work. Must have skills : IBM InfoSphere DataStage Good to have skills : Snowflake Data Warehouse Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As a Software Development Engineer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code for multiple clients. Your typical day will involve collaborating with team members to ensure the successful execution of projects, performing maintenance and enhancements, and contributing to the development of innovative solutions that meet client needs. You will be responsible for delivering high-quality code while adhering to best practices and standards in software development. Roles & Responsibilities:- Expected to be an SME; collaborate with and manage the team to perform.- Responsible for team decisions.- Engage with multiple teams and contribute to key decisions.- Provide solutions to problems for their immediate team and across multiple teams.- Facilitate knowledge sharing sessions to enhance team capabilities.- Mentor junior professionals to foster their growth and development. Professional & Technical Skills: - Must Have Skills: Proficiency in IBM InfoSphere DataStage.- Must Have Skills: Experience with Snowflake Data Warehouse.- Strong understanding of ETL processes and data integration techniques.- Minimum 4 years of experience with database management and SQL.- Familiarity with data warehousing concepts and best practices. Additional Information:- The candidate should have a minimum of 7 years of experience in IBM InfoSphere DataStage and Snowflake.- This position is based at our Bengaluru office.- A 15 years full time education is required.
Qualification 15 years full time education
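A core ETL pattern in both DataStage jobs and Snowflake (via its MERGE statement) is the key-based upsert: update rows that already exist, insert the rest. SQLite's ON CONFLICT clause gives a minimal, runnable stand-in; the table and column names below are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_customer (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO dim_customer VALUES (?, ?)",
                 [(1, "Asha"), (2, "Ravi")])

# Incoming batch: id 2 changed, id 3 is new.
batch = [(2, "Ravi K"), (3, "Meena")]
conn.executemany(
    "INSERT INTO dim_customer VALUES (?, ?) "
    "ON CONFLICT(id) DO UPDATE SET name = excluded.name",  # upsert keyed on id
    batch)

print(conn.execute("SELECT name FROM dim_customer ORDER BY id").fetchall())
```

Snowflake's MERGE expresses the same matched/not-matched branching declaratively, with the warehouse handling parallelism.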

Posted 1 month ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Project Role : Software Development Engineer Project Role Description : Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work. Must have skills : Microsoft SQL Server Integration Services (SSIS) Good to have skills : NA Minimum 3 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As a Software Development Engineer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code for multiple clients. Your day will involve collaborating with team members to ensure the successful implementation of software solutions, performing maintenance and enhancements, and contributing to the overall development process. You will be responsible for delivering high-quality code while adhering to best practices and project timelines, ensuring that the applications meet the needs of the clients effectively. Roles & Responsibilities:- Expected to perform independently and become an SME.- Required active participation/contribution in team discussions.- Contribute to providing solutions to work-related problems.- Assist in the documentation of software specifications and design.- Collaborate with cross-functional teams to gather requirements and provide feedback. Professional & Technical Skills: - 4-5 years of experience with data warehousing concepts.- Proficiency in any data modelling tool.- Familiarity with ETL processes and tools.- Strong MS SQL Server knowledge.- Strong communication skills and ITSM concepts.- WhereScape RED data warehouse automation tool knowledge (good to have).- Must Have Skills: Proficiency in Microsoft SQL Server Integration Services (SSIS).- Strong understanding of database management and data integration techniques.- Familiarity with data modeling and database design principles.- Ability to troubleshoot and optimize SQL queries for performance.
Additional Information:- The candidate should have a minimum of 3 years of experience in Microsoft SQL Server Integration Services (SSIS).- This position is based at our Bengaluru office.- A 15 years full time education is required. Qualification 15 years full time education
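ETL packages like the SSIS ones above frequently implement incremental loads driven by a high-watermark column: pick up only rows modified since the last run, then persist the new watermark. The control-flow logic is simple enough to sketch outside SSIS (all names hypothetical):

```python
def incremental_load(source_rows, last_watermark):
    """Select rows modified after the stored watermark and return
    the new watermark to persist for the next run."""
    new_rows = [r for r in source_rows if r["modified"] > last_watermark]
    new_wm = max((r["modified"] for r in new_rows), default=last_watermark)
    return new_rows, new_wm

source = [
    {"id": 1, "modified": 10},
    {"id": 2, "modified": 25},
    {"id": 3, "modified": 30},
]
rows, wm = incremental_load(source, last_watermark=20)
print(len(rows), wm)  # 2 30
```

In SSIS the watermark typically lives in a control table and is read/written by Execute SQL tasks around the data flow.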

Posted 1 month ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Project Role : Software Development Engineer Project Role Description : Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work. Must have skills : SAP BusinessObjects Data Services Good to have skills : NA Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As a Software Development Engineer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code across multiple clients. Your typical day will involve collaborating with team members to perform maintenance and enhancements, ensuring that the applications meet the required standards and functionality. You will also be responsible for developing new features and addressing any issues that arise, contributing to the overall success of the projects you are involved in. Roles & Responsibilities:- Expected to be an SME.- Collaborate with and manage the team to perform.- Responsible for team decisions.- Engage with multiple teams and contribute to key decisions.- Provide solutions to problems for their immediate team and across multiple teams.- Mentor junior team members to enhance their skills and knowledge.- Continuously evaluate and improve development processes to increase efficiency. Professional & Technical Skills: - Must Have Skills: Proficiency in SAP BusinessObjects Data Services.- Strong understanding of data integration and ETL processes.- Experience with data quality and data profiling techniques.- Familiarity with database management systems and SQL.- Ability to troubleshoot and resolve technical issues effectively. Additional Information:- The candidate should have a minimum of 5 years of experience in SAP BusinessObjects Data Services.- This position is based at our Bengaluru office.- A 15 years full time education is required. Qualification 15 years full time education
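The skills list above mentions data profiling alongside SAP BusinessObjects Data Services. The basic per-column statistics a profiler collects (null rate, distinct count, min/max) can be sketched in plain Python; the column data here is hypothetical:

```python
def profile_column(values):
    """Summarize one column: null rate, distinct count, min/max of non-null values."""
    non_null = [v for v in values if v is not None]
    return {
        "null_rate": round(1 - len(non_null) / len(values), 2),
        "distinct": len(set(non_null)),
        "min": min(non_null) if non_null else None,
        "max": max(non_null) if non_null else None,
    }

ages = [34, None, 51, 34, 29]
print(profile_column(ages))  # {'null_rate': 0.2, 'distinct': 3, 'min': 29, 'max': 51}
```

Profiling results like these typically feed the validation rules applied later in the ETL flow, e.g. flagging columns whose null rate drifts between loads.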

Posted 1 month ago

Apply

10.0 - 15.0 years

7 - 11 Lacs

Noida

Work from Office

R1 RCM India is proud to be recognized amongst India's Top 50 Best Companies to Work For™ 2023 by the Great Place To Work Institute. We are committed to transforming the healthcare industry with our innovative revenue cycle management services. Our goal is to make healthcare simpler and enable efficiency for healthcare systems, hospitals, and physician practices. With over 30,000 employees globally, we are about 14,000 strong in India with offices in Delhi NCR, Hyderabad, Bangalore, and Chennai. Our inclusive culture ensures that every employee feels valued, respected, and appreciated with a robust set of employee benefits and engagement activities. Description : We are seeking a highly skilled and motivated Data Cloud Architect to join our Product and Technology team. As a Data Cloud Architect, you will play a key role in designing and implementing our cloud-based data architecture, ensuring scalability, reliability, and optimal performance for our data-intensive applications. Your expertise in cloud technologies, data architecture, and data engineering will drive the success of our data initiatives. Responsibilities: Collaborate with cross-functional teams, including data engineers, data leads, product owners and stakeholders, to understand business requirements and data needs. Design and implement end-to-end data solutions on cloud platforms, ensuring high availability, scalability, and security. Architect delta lakes, data lakes, data warehouses, and streaming data solutions in the cloud. Evaluate and select appropriate cloud services and technologies to support data storage, processing, and analytics. Develop and maintain cloud-based data architecture patterns and best practices. Design and optimize data pipelines, ETL processes, and data integration workflows. Implement data security and privacy measures in compliance with industry standards. Collaborate with DevOps teams to deploy and manage data-related infrastructure on the cloud.
Stay up-to-date with emerging cloud technologies and trends to ensure the organization remains at the forefront of data capabilities. Provide technical leadership and mentorship to data engineering teams. Qualifications: Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience). 10 years of experience as a Data Architect, Cloud Architect, or in a similar role. Expertise in cloud platforms such as Azure. Strong understanding of data architecture concepts and best practices. Proficiency in data modeling, ETL processes, and data integration techniques. Experience with big data technologies and frameworks (e.g., Hadoop, Spark). Knowledge of containerization technologies (e.g., Docker, Kubernetes). Familiarity with data warehousing solutions (e.g., Redshift, Snowflake). Strong knowledge of security practices for data in the cloud. Excellent problem-solving and troubleshooting skills. Effective communication and collaboration skills. Ability to lead and mentor technical teams. Additional Preferred Qualifications: Bachelor's or Master's degree in Data Science, Computer Science, or a related field. Relevant cloud certifications (e.g., Azure Solutions Architect) and data-related certifications. Experience with real-time data streaming technologies (e.g., Apache Kafka). Knowledge of machine learning and AI concepts in relation to cloud-based data solutions. Working in an evolving healthcare setting, we use our shared expertise to deliver innovative solutions. Our fast-growing team has opportunities to learn and grow through rewarding interactions, collaboration and the freedom to explore professional interests. Our associates are given valuable opportunities to contribute, to innovate and create meaningful work that makes an impact in the communities we serve around the world. We also offer a culture of excellence that drives customer success and improves patient care.
We believe in giving back to the community and offer a competitive benefits package. To learn more, visit r1rcm.com.
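The architect role above covers real-time streaming (e.g., Apache Kafka) alongside batch pipelines. The tumbling-window aggregation at the heart of many streaming jobs can be sketched without a broker; the event tuples below are hypothetical (timestamp in seconds, payload):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs):
    """Count events per fixed, non-overlapping time window.
    Each event's window is identified by the window's start timestamp."""
    counts = defaultdict(int)
    for ts, _payload in events:
        window_start = (ts // window_secs) * window_secs
        counts[window_start] += 1
    return dict(counts)

events = [(0, "a"), (3, "b"), (5, "c"), (9, "d"), (10, "e")]
print(tumbling_window_counts(events, window_secs=5))  # {0: 2, 5: 2, 10: 1}
```

Stream processors (Kafka Streams, Spark Structured Streaming) add the hard parts on top of this bucketing: late-arriving data, watermarking, and fault-tolerant state.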

Posted 1 month ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Navi Mumbai

Work from Office

Project Role : Software Development Engineer Project Role Description : Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work. Must have skills : Databricks Unified Data Analytics Platform Good to have skills : NA Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As a Software Development Engineer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code across multiple clients. Your day will involve collaborating with team members to perform maintenance and enhancements, ensuring that the applications meet the evolving needs of users and stakeholders. You will also be responsible for developing new features and functionalities, contributing to the overall success of the projects you are involved in, and ensuring high-quality deliverables through rigorous testing and validation processes. Roles & Responsibilities:- Expected to be an SME.- Collaborate with and manage the team to perform.- Responsible for team decisions.- Engage with multiple teams and contribute to key decisions.- Provide solutions to problems for their immediate team and across multiple teams.- Mentor junior team members to enhance their skills and knowledge.- Continuously evaluate and improve development processes to increase efficiency. Professional & Technical Skills: - Must Have Skills: Proficiency in Databricks Unified Data Analytics Platform.- Strong understanding of data integration and ETL processes.- Experience with cloud computing platforms and services.- Familiarity with programming languages such as Python or Scala.- Ability to work with large datasets and perform data analysis.
Additional Information:- The candidate should have a minimum of 5 years of experience in Databricks Unified Data Analytics Platform.- This position is based in Navi Mumbai.- A 15 years full time education is required. Qualification 15 years full time education

Posted 1 month ago

Apply