
2451 Data Integration Jobs - Page 20

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

7.0 - 11.0 years

0 Lacs

pune, maharashtra

On-site

Spaulding Ridge is an advisory and IT implementation firm dedicated to helping global organizations achieve financial clarity in their daily sales and operational decisions. We believe in the personal aspect of business and prioritize building strong relationships with clients, partners, our team, and the global community. Our employees are committed to assisting clients in transforming their businesses from strategy development through to implementation and business transformation.

We are currently looking for a skilled and experienced Salesforce Integration Architect to join our Data Solutions team. This role is ideal for an architect passionate about designing and implementing robust data integration and migration strategies within complex enterprise environments, with a specific focus on Salesforce and related data platforms.

As a Salesforce Integration Architect at Spaulding Ridge, your responsibilities will include:

- Leading Salesforce Data Integration & Migration: Architecting, designing, and overseeing the implementation of complex data integration and migration solutions involving Salesforce.
- Leveraging Integration Platforms: Driving solution design and development using leading integration platforms such as MuleSoft, Boomi, Celigo, and Workato to ensure optimal data connectivity and performance.
- Expertise in Salesforce Data Movement: Demonstrating strong experience with Salesforce APIs (e.g., Bulk API, SOAP, REST) and implementing best practices for high-volume data movement to and from Salesforce.
- Specialization in DBAmp & Database Integration: Utilizing and recommending DBAmp or other data migration tools for efficient, large-scale data replication and integration between Salesforce and relational databases (SQL/Oracle DB).
- Proof of Concept Development: Designing and building compelling Proof of Concepts (POCs) specifically for data integration with various platforms such as Salesforce, Anaplan, Workday, NetSuite, SQL/Oracle DB, and Snowflake.
- Technical Leadership & Mentorship: Providing technical leadership and guidance, managing, mentoring, and developing junior integration resources on design patterns, data governance, and integration technologies.
- Pre-Sales and Estimation: Actively participating in pre-sales activities, solutioning, estimation, and scoping, including leading oral sessions and developing POCs for potential clients focused on data integration challenges.
- Problem Identification & Solutioning: Identifying critical data integration issues and structural problems, generating actionable recommendations using standard methodologies.
- Advisory Role: Serving as a trusted advisor to clients on intelligent data solutions and integration strategies.

Qualifications:

- 10+ years of professional consulting or relevant industry experience with a focus on data integration, migration, and enterprise system implementations.
- 7+ years of hands-on experience developing data integration solutions within MuleSoft Anypoint Platform or Boomi Platform.
- Strong expertise in working with Salesforce APIs for high-volume data movement.
- Proven experience with DBAmp for Salesforce data replication and integration with external databases.
- Demonstrated experience in Application Integration Architecture Solution Design and Development.
- Ability to lead a team on data design, integration patterns, and data technologies.
- Understanding of Quote-to-Cash/Order-to-Cash processes from a data integration perspective.
- Certifications in MuleSoft, Boomi, and other integration tools are highly desirable.
- Knowledge of data integration with NetSuite processes, Anaplan, or OneStream is a plus.
- Experience with CI/CD pipelines for integration deployments.

At Spaulding Ridge, we are committed to creating an inclusive workplace that values diversity and fosters a culture of trust and belonging. We believe in offering Equal Employment Opportunity and providing reasonable accommodation to applicants with disabilities.
If you require accommodation during the application process, please reach out to our VP of Human Resources, Cara Halladay, at challaday@spauldingridge.com.
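For context on the high-volume Salesforce data movement this role centers on: Bulk API loads are typically chunked client-side into CSV batches before being uploaded to an ingest job. A minimal, stdlib-only sketch of that batching step, assuming an illustrative 10,000-record batch cap and hypothetical field names (actual Bulk API 2.0 limits and the job lifecycle are defined in Salesforce's documentation):

```python
import csv
import io

# Batch size and field names are illustrative assumptions,
# not Salesforce-mandated values.
BATCH_SIZE = 10_000

def chunk_records(records, batch_size=BATCH_SIZE):
    """Yield successive fixed-size batches from a list of record dicts."""
    for start in range(0, len(records), batch_size):
        yield records[start:start + batch_size]

def to_csv_payload(batch, fieldnames):
    """Serialize one batch to the CSV body a Bulk API ingest upload expects."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(batch)
    return buf.getvalue()

records = [{"FirstName": f"User{i}", "LastName": "Test"} for i in range(25_000)]
batches = list(chunk_records(records))          # 3 batches: 10k, 10k, 5k
payload = to_csv_payload(batches[0], ["FirstName", "LastName"])
```

Each `payload` would then be uploaded to a separate ingest job; the chunking keeps any single upload within the platform's per-request limits.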

Posted 3 weeks ago

Apply

1.0 - 5.0 years

0 - 0 Lacs

chennai, tamil nadu

On-site

You will be responsible for Backend Software Engineering and play a key role in developing server-side applications. Your primary tasks will include writing clean and efficient code in languages such as Python, Java, Ruby, or Node.js to implement various backend functionalities. As a Backend Software Engineer, you will also be involved in designing and managing databases, including data modeling, indexing, and optimization. Additionally, you will create and maintain APIs (Application Programming Interfaces) to facilitate communication between frontend and backend systems. Your role will also involve integrating external systems and services with the backend application, implementing security measures and authentication mechanisms to safeguard sensitive data, and ensuring scalability of backend systems to handle a large number of concurrent users efficiently. Furthermore, you will be responsible for conducting comprehensive testing and debugging of backend code to identify and resolve any issues. You will also be tasked with optimizing backend code and database queries to enhance application performance and user experience.
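The database responsibilities above (modeling, indexing, query optimization) can be sketched with a minimal stdlib example; the `users` schema and the city filter are illustrative assumptions, not part of the listing:

```python
import sqlite3

# In-memory database with an assumed `users` schema; table and column
# names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT, city TEXT)")
conn.executemany(
    "INSERT INTO users (email, city) VALUES (?, ?)",
    [(f"user{i}@example.com", "Pune" if i % 2 else "Chennai") for i in range(1000)],
)
# An index turns the city filter below from a full table scan
# into an index lookup.
conn.execute("CREATE INDEX idx_users_city ON users (city)")
conn.commit()

# Parameterized queries keep the backend safe from SQL injection.
count = conn.execute("SELECT COUNT(*) FROM users WHERE city = ?", ("Pune",)).fetchone()
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM users WHERE city = ?", ("Pune",)
).fetchall()
```

`EXPLAIN QUERY PLAN` is how you verify the optimizer actually picked the index; the same habit applies to the query-optimization work the listing describes, whatever the database engine.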

Posted 3 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

pune, maharashtra

On-site

As a skilled MDM Solutions Developer using Profisee, you will play a crucial role in the design, development, implementation, and maintenance of our MDM solutions. Your expertise in data governance, data quality, and data integration will be instrumental in ensuring the accuracy, consistency, and completeness of our master data. This position demands strong technical skills, exceptional communication abilities, and effective collaboration with cross-functional teams. In your role, you will lead the design and development of MDM solutions using Profisee. This includes creating data models, workflows, business rules, and user interfaces. You will be responsible for translating business requirements into technical specifications and configuring the Profisee platform to cater to specific business needs. Developing and implementing data quality rules and integration processes between Profisee and other enterprise systems will also be part of your responsibilities. Throughout the MDM implementation lifecycle, you will be involved in requirements gathering, design, development, testing, deployment, and support. This entails executing test plans, troubleshooting and resolving issues, as well as deploying and configuring Profisee environments. Additionally, you will contribute to data governance efforts by enforcing policies, defining data ownership, and ensuring compliance with data privacy regulations. Monitoring the performance and stability of the MDM environment, providing ongoing support and maintenance, and proactively addressing data quality and performance issues are key aspects of this role. Collaboration with business users, IT staff, and stakeholders to understand data requirements, effective communication with technical and non-technical audiences, and mentoring junior team members on MDM best practices and Profisee platform usage will also be part of your responsibilities.
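Data quality rules in Profisee are configured inside the platform itself, but the underlying concept is simple: a rule is a predicate on a field, and a record fails validation if any rule fails. A generic sketch, using hypothetical master-data fields and rules that are illustrative, not Profisee configuration:

```python
import re

# Hypothetical customer-master fields; each rule returns True when the
# value passes. These rules are examples only.
RULES = {
    "email": lambda v: bool(v) and re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v) is not None,
    "country": lambda v: v in {"IN", "US", "GB"},   # assumed reference list
    "name": lambda v: bool(v and v.strip()),        # non-blank after trimming
}

def validate(record):
    """Return the field names that fail their data quality rule."""
    return [field for field, rule in RULES.items() if not rule(record.get(field))]

good = {"name": "Asha Rao", "email": "asha@example.com", "country": "IN"}
bad = {"name": " ", "email": "not-an-email", "country": "XX"}
```

In an MDM platform the same predicates would be attached to attributes so that failing records are flagged for steward review rather than silently merged into the golden record.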

Posted 3 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

karnataka

On-site

As a Data Architect at Colt, you will play a crucial role in shaping the data landscape, ensuring alignment with business objectives, and driving innovation through effective data management practices. Your responsibilities will include developing and articulating a comprehensive data strategy that aligns with the organization's vision, mission, and long-term goals. You will collaborate with senior leadership, including the Chief Data Officer (CDO), to define data-related priorities and roadmaps. Your role will involve leading a team of skilled data architects, collaborating with cross-functional stakeholders, and defining the strategic direction for data initiatives. You will own and drive the future state data architecture, ensuring scalability, flexibility, and adherence to industry best practices. Additionally, you will establish and maintain data architecture standards, guidelines, and principles across the organization. In terms of data modeling and design, you will be responsible for ensuring high-quality and consistent data modeling (conceptual, logical, physical). You will lead the development and maintenance of logical data models (LDMs) and associated physical models, collaborating with development teams to ensure adherence to architectural and modeling standards. Stakeholder engagement is a key aspect of your role, as you will partner with the Data Management team to drive the group's data strategy, collaborate with business units to extract greater value from data assets, and engage with key stakeholders to identify technical opportunities for enhancing data product delivery. You will also provide consultative advice to business leaders and organizational stakeholders, making actionable recommendations to guide investment decisions. As a team leader, you will build and lead a federated team of Data Architects within the function and across the organization. 
You will guide and mentor team members, fostering a culture of excellence and continuous learning. Quality assurance will also be a focus, ensuring the quality of data designs proposed by the team and upholding data management principles and best practices. To excel in this role, you should have a Master's or Bachelor's degree in a related field of study, along with ten or more years of experience in data architecture. Certifications such as TOGAF, Certified Architect (CA), Zachman, and SAFe Agile are required. Additionally, you should possess a range of skills, including knowledge of business ecosystems, familiarity with information management practices, proficiency with data warehousing solutions, expertise in data modeling tools and techniques, and experience with cloud platforms and data integration tools. Colt offers a dynamic work environment where you can grow and develop your skills. In addition to competitive salaries and incentive plans, Colt provides a range of benefits and local rewards packages to promote work-life balance. Benefits include flexible working arrangements, access to an online learning platform, business mentoring, and more. Join Colt and be part of a growing business that values its people.

Posted 3 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

At Guidewire, we take pride in supporting our customers' mission to safeguard the world's most valuable investments. Insurance plays a crucial role in protecting our homes, businesses, and other assets, providing aid in times of need caused by natural disasters or accidents. Our goal is to provide a platform that enables Property and Casualty (P&C) insurers to offer the necessary products and services for individuals to recover from life's most challenging events.

We are seeking a product management professional to join our Analytics and Data Services (ADS) team at Guidewire. The ADS team is dedicated to defining and designing new capabilities for the insurance market through our cutting-edge software solutions. In this role, you will collaborate with a diverse team of 50 engineers, data scientists, and risk modelers to create a dynamic cyber insurance data and analytics product suite. This suite will leverage data and machine learning to address various use cases, including cyber risk underwriting, pricing, enterprise risk management, and cyber threat assessment, which is identified as the #1 risk to US national security.

Reporting to the Cyence Product Management team, you will play a vital role in driving innovation within an entrepreneurial culture. You will thrive in an environment where our core values of Integrity, Rationality, and Collegiality are ingrained in our daily operations. As a potential candidate, you should have a background in software, data, and analytics, along with experience working in a fast-paced environment involving multiple teams across different locations. You must be a problem-solver who is enthusiastic about developing top-notch products and overcoming market challenges. Your attention to detail, ability to motivate others, and collaboration skills will be key to supporting various teams, including Platform, UX, modeling, ML, Data Science, Quality Assurance, and GTM.

Your responsibilities will include:

- Vision: Envisioning innovative solutions, promoting our cyber vision, and driving breakthroughs to simplify complexity.
- Technical Mastery: Collaborating with software and data teams, implementing best practices in product management, and owning the end-to-end requirements documentation process.
- Product Leadership: Cultivating a culture of curiosity and craftsmanship, inspiring and developing R&D teams, and establishing product goals that drive motivation.
- Execution: Achieving business outcomes through forward-thinking products, contributing to the creation and communication of roadmaps, and building trust through transparency and consistent delivery.

Qualifications we are looking for:

- Minimum 3 years of experience as a product manager, demonstrating a track record of delivering complex team projects on-time and with high quality.
- 3+ years of experience in technical data management, integration architecture of cloud solutions, and security.
- Strong desire to address complex insurance challenges using a B2B SaaS product model.
- Excellent attention to detail and communication skills.
- Proactive, focused, and quick to take ownership of tasks.
- Familiarity with tools such as Aha, Atlassian suite, databases, prototyping tools, and technical knowledge of big data and cloud technologies is preferred.
- Conceptual understanding of microservices, distributed systems, AWS, and the Big Data ecosystem.
- Comfortable with data ingestion, cataloging, integration, and enrichment concepts.
- Previous experience with B2B SaaS companies and software engineering is advantageous.
- Bachelor's or Master's degree in engineering, analytics, mathematics, or software development.
- Ability to overlap at least 2 hours with US time zones 3-4 days a week.

About Guidewire: Guidewire is the trusted platform for P&C insurers to engage, innovate, and grow efficiently. Our platform integrates digital, core, analytics, and AI services, delivered as a cloud service. With over 540 insurers in 40 countries relying on Guidewire, we support new ventures as well as the largest and most complex organizations worldwide. As a partner to our customers, we continuously evolve to ensure their success. With a remarkable implementation track record of over 1600 successful projects, supported by the industry's largest R&D team and partner ecosystem, we are dedicated to accelerating integration, localization, and innovation through our Marketplace. For more information, please visit www.guidewire.com and follow us on Twitter: @Guidewire_PandC.

Posted 3 weeks ago

Apply

3.0 - 8.0 years

0 Lacs

pune, maharashtra

On-site

As a Profisee MDM Consultant at Quantum Integrators, you will be responsible for designing, developing, implementing, and maintaining MDM solutions using Profisee. Your expertise in data governance, data quality, and data integration will be crucial in ensuring the accuracy, consistency, and completeness of our master data. This role requires strong technical skills, excellent communication abilities, and effective collaboration with cross-functional teams. Your responsibilities will include leading the design and development of MDM solutions using Profisee, translating business requirements into technical specifications, and configuring the Profisee platform to meet specific business needs. You will develop and implement data quality rules and processes, design data integration processes between Profisee and other enterprise systems, and participate in the full MDM implementation lifecycle. Additionally, you will contribute to the development and enforcement of data governance policies, work with data stewards to define data ownership and accountability, and ensure compliance with data privacy regulations and security policies. Monitoring the performance and stability of the MDM environment, providing ongoing support and maintenance for the MDM solution, and proactively addressing potential issues related to data quality and MDM performance will also be part of your role. Effective collaboration with business users, IT staff, and stakeholders, along with clear communication with technical and non-technical audiences, will be essential. You will participate in project meetings, provide regular status updates, and mentor and train junior team members on MDM best practices and the Profisee platform. If you have 3-8 years of experience, strong technical skills, and a passion for creating and sustaining competitive advantage through MDM solutions, we invite you to join our team at Quantum Integrators.

Posted 3 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

chennai, tamil nadu

On-site

You will be responsible for one of the following roles:

For Data Governance Lead:

- You should have a minimum of 8-10 years of experience in the Informatica Platform, with expertise in Informatica, AXON, and EDC.
- The job requires working from the office all 5 days in Chennai.
- Immediate to 15 days notice period is required.
- Your responsibilities will include sizing hardware, installing and configuring Informatica products on-premises and cloud, administering and configuring Informatica products, and handling EDC setup.
- You must have hands-on experience in EDC development activities like data discovery, data domain creation, data profiling, data lineage, and data curation.
- Experience in architecting data governance solutions using the Informatica tool AXON is essential.
- Integration with other tools and Informatica tools like EDC and IDQ is necessary.
- Ability to understand business context and translate it into AXON templates and facets is required.

For MDM Lead:

- We are seeking a resource with 8+ years of experience, including 5+ years of relevant experience in MDM.
- You will be responsible for developing and configuring Informatica MDM on-premises solutions, including Hub Console, IDD, Match & Merge, and Hierarchy Manager.
- The job requires working 5 days from the office in Chennai.
- Immediate to 20 days notice period is expected.
- Experience with SQL, PL/SQL, and relational databases such as Oracle and Teradata is required.
- Understanding of data modelling, data integration, and ETL processes is necessary.

Both roles are full-time and permanent positions with benefits including health insurance, provident fund, and a yearly bonus. The work schedule includes day shift, evening shift, fixed shift, Monday to Friday, and morning shift. To apply, you should have over 8 years of relevant experience and be able to join within 0-15 days as the work location is in person at our Chennai office.
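Match & Merge in Informatica MDM is configured in the Hub rather than coded by hand, but the underlying idea, fuzzy matching followed by a survivorship rule, can be sketched generically. The similarity threshold and survivorship policy below are illustrative assumptions, not Informatica defaults:

```python
from difflib import SequenceMatcher

def similarity(a, b):
    """Crude fuzzy match score on normalized strings (0.0 to 1.0)."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def merge(primary, duplicate):
    """Survivorship sketch: keep the primary's values, fill gaps from the duplicate."""
    return {k: primary.get(k) or duplicate.get(k)
            for k in set(primary) | set(duplicate)}

r1 = {"name": "Acme Corp", "phone": None, "city": "Chennai"}
r2 = {"name": "ACME Corp.", "phone": "+91-44-000", "city": None}

# 0.85 is an assumed threshold; real match rules weigh multiple columns.
is_match = similarity(r1["name"], r2["name"]) > 0.85
golden = merge(r1, r2) if is_match else r1
```

A production match rule would combine several attributes (name, address, tax id) with per-column weights and send borderline scores to a steward queue instead of auto-merging.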

Posted 3 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

hyderabad, telangana

On-site

As an IT Project Manager/Architect for Data Platform & Monitoring within Global Operations and Supply Chain IT, your primary responsibility is to lead the architecture, technical implementation, and overall management of the data platform and monitoring program. Your role is critical in the planning and execution of a strategic program that includes developing a centralized data platform to consolidate manufacturing systems data across all sites and implementing robust observability and monitoring capabilities for global manufacturing systems and applications. Success in this role demands strong coordination and communication skills to work seamlessly across cross-functional teams, ensuring alignment with organizational objectives, timelines, and delivery standards. You will be leading a team of 10-15 Global Operations Supply Chain team members in the core manufacturing and supply chain digital platform domain. Your responsibilities will include developing a comprehensive project plan, defining project scope, goals, and objectives, identifying potential risks, leading a diverse cross-functional project team, establishing a collaborative environment, and working closely with business stakeholders to gather and document functional and technical requirements for the IT systems implementation solution. You will also lead the implementation of manufacturing IT systems, provide updates to the leadership team, and coordinate cross-functional teams and stakeholders to gather business and technical requirements, translating them into a clear, actionable 3-year data platform roadmap. Minimum qualifications for this role include a Bachelor's degree (required), with an advanced degree preferred, along with a minimum of 10 years of relevant experience in IT project or program management roles and 4+ years of team management experience of 10+ team members. Prior experience in regulated or validated industries is a strong plus. 
Strong documentation, organizational, and communication skills are essential, along with familiarity with project management tools and the ability to understand the customer's business problem and design effective solutions. Proven ability to deliver quality results within defined timelines, understanding of application lifecycle processes and system integration concepts, and the ability to thrive in a fast-paced, team-oriented environment are also required.

Skills needed for this role include:

- A strong background in IT project management, especially in manufacturing or supply chain domains
- Experience in leading multi-function cross-team collaboration between IT and Business
- Managing program timelines, risks, status, and escalations
- Understanding and working within processes and tools
- Solid knowledge of SDLC and Agile/Waterfall/Hybrid project management principles
- Experience with project management tools like DevOps
- Strong knowledge of MS PowerPoint, MS Excel, MS Projects
- Experience managing Project Costing, Budget Forecasting, and Resource Management
- Working knowledge of manufacturing IT systems like ERP, MES, etc.

Posted 3 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

karnataka

On-site

As a Data Product Analyst at Wells Fargo, you will be responsible for participating in low to moderate complexity data product initiatives. Your role will involve identifying opportunities for data roadmap improvements within your scope of responsibilities to drive data enablement and capabilities across platforms and utilities. You will review and analyze basic business, operational, or technical assignments that require research and evaluation to drive data enablement strategies. Additionally, you will present recommendations for resolving data product situations, collaborate with stakeholders to understand business requirements, and manage datasets focusing on consumer needs and data governance standards. Moreover, you will participate in the creation and maintenance of data product roadmaps, gather data requirements, and communicate data problems and initiatives effectively to all audiences. Required qualifications include 2+ years of data product or data management experience, or equivalent demonstrated expertise in maintaining and improving data quality across the organization. Your responsibilities will also involve participating in analysis to identify and remediate data quality issues, adhering to data governance standards, and designing data governance and data quality policies. Furthermore, you will support regulatory analysis and reporting requirements, work with business and technology partners to document metadata about systems, and assess the current state of data quality. Desired qualifications for this role include experience in large enterprise data initiatives, managing data entry processes, resolving data quality issues, banking business or technology experience, and familiarity with BI tools and cloud concepts. In addition, knowledge of T-SQL, database, data warehousing, ETL concepts, BI solutions, Agile principles, and various technical skills are preferred for this position. 
As a Data Product Analyst, you are expected to assist in implementing data processes, monitor data flows, ensure consistent data definition across systems, collaborate with data engineers, and resolve data quality issues. The posting end date for this job is 17 Jul 2025, with the possibility of early closure due to the volume of applicants. Wells Fargo values equal opportunity and encourages applications from all qualified candidates. The company maintains a drug-free workplace and requires candidates to represent their own experiences during the recruiting and hiring process. If you require a medical accommodation during the application or interview process, you can visit Disability Inclusion at Wells Fargo for assistance. Third-party recordings are prohibited unless authorized by Wells Fargo, and candidates should adhere to the company's recruitment and hiring requirements.

Posted 3 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

ahmedabad, gujarat

On-site

A highly skilled and experienced Adobe RTCDP and Adobe Target Expert is sought to join the team. You should deeply understand RTCDP principles and technologies, with a strong focus on practical implementation and a proven ability to deliver successful outcomes. Your role will involve designing, configuring, and managing the RTCDP solution and implementing personalization activities on the web experience. This includes ensuring seamless data integration, profile unification, audience segmentation, and activation for personalized customer experiences. Your expertise will be crucial in leveraging the platform to drive value for the business through data-driven insights and optimized customer journeys.

What You'll Be Doing:

- Serve as the subject matter expert for Adobe RTCDP, supporting internal teams and stakeholders.
- Design and implement RTCDP solutions, including data schema creation, identity resolution, audience segmentation, and activation.
- Ingest and transform data from various sources into the RTCDP, ensuring quality and compliance.
- Configure and manage unified customer profiles by stitching data across multiple channels and touchpoints.
- Build and activate audience segments to downstream systems for personalized experiences.
- Ensure adherence to data governance, privacy regulations, and security standards.
- Use RTCDP APIs and SDKs for integrating data and enabling real-time updates.
- Diagnose and resolve issues related to data ingestion, identity resolution, and system performance.
- Collaborate with cross-functional teams including marketing, analytics, development, and IT.
- Share best practices and train internal teams on optimal RTCDP usage.
- Deploy personalized customer journeys and experience activities using Adobe Target.

What We'd Love To See:

- Bachelor's degree in Computer Science, Data Engineering, Marketing Technology, or a related field.
- At least 3 years of hands-on experience with RTCDP implementations.
- Demonstrated success in executing at least two end-to-end RTCDP projects.
- Deep understanding of data modeling, ingestion, and transformation in a CDP environment.
- Proficiency in identity resolution and audience segmentation.
- Experience with SQL, Python, or JavaScript for data manipulation and integration.
- Working knowledge of APIs, SDKs, and real-time data streaming tools.
- Familiarity with data governance frameworks and compliance regulations like GDPR and CCPA.
- Strong communication, interpersonal, and analytical skills.
- Ability to manage multiple projects independently.

It'd Be Great If You Had:

- Experience with Adobe Experience Cloud, especially Adobe Experience Platform (AEP) and Adobe Target.
- Background in consulting or agency settings with client-facing roles.
- Familiarity with marketing automation, analytics, or personalization platforms.
- Broader experience with Adobe cloud technologies.
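The identity resolution this listing asks for, stitching profile fragments that share an identifier (email, ECID, CRM id) into one unified profile, is essentially connected components over an identity graph. A minimal union-find sketch of the concept, with made-up identifiers; this is not Adobe's identity graph implementation:

```python
# Events that share any identifier collapse into one profile.
parent = {}

def find(x):
    """Return the root identifier for x, with path halving."""
    parent.setdefault(x, x)
    while parent[x] != x:
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x

def union(a, b):
    parent[find(a)] = find(b)

events = [
    {"ecid": "ecid-1", "email": "a@example.com"},
    {"email": "a@example.com", "crm_id": "crm-42"},
    {"ecid": "ecid-9"},  # anonymous visitor, no shared identifier
]
for event in events:
    ids = list(event.values())
    find(ids[0])                 # register even singleton identifiers
    for other in ids[1:]:
        union(ids[0], other)

profiles = {find(i) for i in parent}   # one root per unified profile
```

Here the first two events share an email, so their three identifiers resolve to a single profile, while the anonymous visitor remains separate until a later event links it.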

Posted 3 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

chennai, tamil nadu

On-site

ZS is a place where passion changes lives. As a management consulting and technology firm focused on improving life and how we live it, our most valuable asset is our people. You will work side-by-side with a powerful collective of thinkers and experts shaping life-changing solutions for patients, caregivers, and consumers worldwide. ZSers drive impact by bringing a client-first mentality to each engagement, partnering collaboratively to develop custom solutions and technology products that create value and deliver company results across critical areas of their business. Bring your curiosity for learning, bold ideas, courage, and passion to drive life-changing impact to ZS.

At ZS, we honor the visible and invisible elements of our identities, personal experiences, and belief systems that shape us as individuals and make us unique. Your personal interests, identities, and desire to learn are integral to your success here. Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about.

**What you'll do:**
- Design, build, and manage data workflows and pipelines in Dataiku DSS.
- Integrate data from multiple sources including AWS, databases, APIs, and flat files.
- Collaborate with data engineering and business teams to translate requirements into scalable data solutions.
- Implement data validation, error handling, and monitoring processes within Dataiku.
- Support model deployment, scheduling, and performance optimization within the Dataiku environment.
- Maintain documentation and version control for Dataiku projects and pipelines.

**What you'll bring:**
- 3+ years of experience in data engineering or analytics development roles.
- Hands-on experience with Dataiku DSS, SQL, Python, and data wrangling.
- Familiarity with AWS services, APIs, and data integration tools.
- Understanding of data quality, pipeline performance optimization, and analytics workflows.

**Additional Skills:**
- Strong communication skills, both verbal and written, with the ability to structure thoughts logically during discussions and presentations.
- Capability to simplify complex concepts into easily understandable frameworks and presentations.
- Proficiency in working within a virtual global team environment, contributing to the timely delivery of multiple projects.
- Travel to other offices as required to collaborate with clients and internal project teams.

**Perks & Benefits:** ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth and professional development. Our robust skills development programs, multiple career progression options, internal mobility paths, and collaborative culture empower you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections.

**Travel:** Travel is a requirement at ZS for client-facing ZSers; business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed, providing opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures.

**Considering applying?** At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law.

**To Complete Your Application:** Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered. NO AGENCY CALLS, PLEASE.

Find Out More At: www.zs.com
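One of the responsibilities above is implementing data validation and error handling within pipelines. A minimal, stdlib-only sketch of that kind of row-level check (the field names `id` and `amount` are invented for illustration, not taken from any Dataiku project) might look like:

```python
# Hypothetical sketch of a validation step in a data pipeline: split incoming
# rows into valid and rejected, recording a reason for each reject so the
# error-handling branch of the workflow can report or quarantine them.

def validate_rows(rows, required=("id", "amount")):
    """Return (valid, rejects); each reject carries a '_reason' field."""
    valid, rejects = [], []
    for row in rows:
        missing = [f for f in required if row.get(f) in (None, "")]
        if missing:
            rejects.append({**row, "_reason": f"missing: {', '.join(missing)}"})
            continue
        try:
            row["amount"] = float(row["amount"])  # coerce before downstream use
        except (TypeError, ValueError):
            rejects.append({**row, "_reason": "amount not numeric"})
            continue
        valid.append(row)
    return valid, rejects

if __name__ == "__main__":
    sample = [
        {"id": "1", "amount": "10.5"},
        {"id": "", "amount": "3"},
        {"id": "2", "amount": "abc"},
    ]
    ok, bad = validate_rows(sample)
    print(len(ok), len(bad))  # 1 2
```

In a real Dataiku recipe the valid and rejected rows would typically be written to two separate output datasets so monitoring can track the reject rate over time.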

Posted 3 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

siliguri, west bengal

On-site

As a CRM Executive/CRM Administrator at an Educational Institute in Siliguri (North Bengal), you will be responsible for administering and maintaining the institution's CRM system, which may include platforms such as Salesforce and Zoho, alongside communication tools such as Google Meet and Zoom. Your key responsibilities will involve designing and implementing automation workflows, lead scoring, and student lifecycle journeys. You will ensure seamless CRM integration with other platforms such as websites, ERPs, email marketing tools, and student portals. Creating and managing dashboards and custom reports for leadership and admissions teams will be part of your routine tasks. Monitoring lead pipelines, tracking prospect activity, and providing actionable insights will also be crucial. Regular data audits, clean-ups, and backups will be necessary to maintain data integrity. Managing user access, roles, and training across departments will be essential for effective CRM utilization. You will also collaborate with IT to implement technical improvements, APIs, and third-party plug-ins, and troubleshoot system issues, bugs, and support tickets in coordination with vendors or CRM providers, which will call on your attention to detail and problem-solving skills. To excel in this role, you should possess a Bachelor's degree in Computer Science, Information Technology, or a related technical field. A minimum of 3-4 years of hands-on experience with CRM platforms, preferably in the education sector, is required. Proficiency in CRM customization, workflow automation, and reporting is essential. You should have a working knowledge of APIs, data integration, and cloud platforms. A strong command of Excel, SQL queries, or BI tools like Power BI and Tableau is necessary. Familiarity with HTML, CSS, or JavaScript for email templates or CRM front-end tweaks will be an added advantage.
Your ability to translate functional requirements into technical solutions, together with strong communication skills to bridge technical and non-technical stakeholders, will be critical to success in this role. Preferred skills include CRM certifications such as Salesforce Administrator or Zoho CRM Certified Professional, experience with CRM migration or CRM-ERP integration projects, knowledge of education technology platforms like Moodle, Blackboard, or Canvas, and an understanding of data protection laws such as GDPR and FERPA. In summary, as a CRM Executive/CRM Administrator, you will play a pivotal role in optimizing the institution's CRM system to enhance operational efficiency and student engagement. Your technical expertise, problem-solving abilities, and communication skills will be key assets in this dynamic and rewarding position.

Posted 3 weeks ago

Apply

3.0 - 5.0 years

5 - 9 Lacs

Gurugram

Work from Office

As a Software Engineer - Data Reporting Services at Incedo, you will be responsible for creating reports and dashboards for clients. You will work with clients to understand their reporting needs and design reports and dashboards that meet those needs. You will be skilled in data visualization tools such as Tableau or Power BI and have experience with reporting tasks such as data analysis, dashboard design, and report publishing.

Roles & Responsibilities:
- Design and develop reports and dashboards to help businesses make data-driven decisions.
- Develop data models and perform data analysis to identify trends and insights.
- Work with stakeholders to understand their reporting needs and develop solutions that meet those needs.
- Proficiency in data visualization tools like Tableau, Power BI, and QlikView.

Technical Skills Requirements:
- Strong knowledge of SQL and data querying tools such as Tableau, Power BI, or QlikView
- Experience in designing and developing data reports and dashboards
- Familiarity with data integration and ETL tools such as Talend or Informatica
- Understanding of data governance and data quality concepts
- Must have excellent communication skills and be able to communicate complex technical information to non-technical stakeholders in a clear and concise manner.
- Must understand the company's long-term vision and align with it.

Qualifications:
- 3-5 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university. A computer science background is preferred.
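The SQL aggregation skills listed above are the core of most dashboard tiles. A small illustrative sketch, using Python's built-in `sqlite3` with an invented `sales` table (the schema is not from any real client system):

```python
# Illustrative only: the kind of GROUP BY query behind a revenue-by-region
# dashboard widget, run against an in-memory SQLite database.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("North", 120.0), ("North", 80.0), ("South", 50.0)])

def revenue_by_region(conn):
    """Return (region, total) rows sorted by revenue, as a report query would."""
    cur = conn.execute(
        "SELECT region, SUM(amount) AS total FROM sales "
        "GROUP BY region ORDER BY total DESC")
    return cur.fetchall()

print(revenue_by_region(conn))  # [('North', 200.0), ('South', 50.0)]
```

In Tableau or Power BI the same aggregation is usually expressed visually, but understanding the underlying SQL makes it far easier to debug a tile that shows the wrong numbers.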

Posted 3 weeks ago

Apply

4.0 - 6.0 years

7 - 12 Lacs

Hyderabad

Work from Office

As a Senior Software Engineer - ETL - Python at Incedo, you will be responsible for designing and developing ETL workflows to extract, transform, and load data from various sources to target systems. You will work with data analysts and architects to understand business requirements and translate them into technical solutions. You will be skilled in ETL tools such as Informatica or Talend and have experience in programming languages such as SQL or Python. You will be responsible for writing efficient and reliable code that is easy to maintain and troubleshoot.

Roles & Responsibilities:
- Develop, maintain, and enhance software applications for Extract, Transform, and Load (ETL) processes
- Design and implement ETL solutions that are scalable, reliable, and maintainable
- Develop and maintain ETL code, scripts, and jobs, ensuring they are efficient, accurate, and meet business requirements
- Troubleshoot and debug ETL code, identifying and resolving issues in a timely manner
- Collaborate with cross-functional teams, including data analysts, business analysts, and project managers, to understand requirements and deliver solutions that meet business needs
- Design and implement data integration processes between various systems and data sources
- Optimize ETL processes to improve performance, scalability, and reliability
- Create and maintain technical documentation, including design documents, coding standards, and best practices.

Technical Skills Requirements:
- Proficiency in programming languages such as Python for writing ETL scripts.
- Knowledge of data transformation techniques such as filtering, aggregation, and joining.
- Familiarity with ETL frameworks such as Apache NiFi, Talend, or Informatica.
- Understanding of data profiling, data quality, and data validation techniques.
- Must have excellent communication skills and be able to communicate complex technical information to non-technical stakeholders in a clear and concise manner.
- Must understand the company's long-term vision and align with it.
- Provide leadership, guidance, and support to team members, ensuring the successful completion of tasks and promoting a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team.

Qualifications:
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university. A computer science background is preferred.
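The three transformation techniques named above (filtering, joining, and aggregating) can be sketched in a few lines of plain Python. The order and customer records here are invented sample data, not a real schema:

```python
# Minimal transform step: filter bad rows, join against a lookup, aggregate.
from collections import defaultdict

orders = [
    {"order_id": 1, "customer_id": "c1", "amount": 100.0},
    {"order_id": 2, "customer_id": "c2", "amount": -5.0},   # bad record
    {"order_id": 3, "customer_id": "c1", "amount": 40.0},
]
customers = {"c1": "Asha", "c2": "Ravi"}

def transform(orders, customers):
    # Filter: drop non-positive amounts.
    clean = [o for o in orders if o["amount"] > 0]
    # Join: enrich each order with the customer name from the lookup table.
    joined = [{**o, "customer": customers.get(o["customer_id"], "unknown")}
              for o in clean]
    # Aggregate: total amount per customer.
    totals = defaultdict(float)
    for o in joined:
        totals[o["customer"]] += o["amount"]
    return dict(totals)

print(transform(orders, customers))  # {'Asha': 140.0}
```

Production ETL tools like Informatica or NiFi express the same three operations as configured stages rather than code, but the mental model is identical.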

Posted 3 weeks ago

Apply

4.0 - 6.0 years

6 - 10 Lacs

Chennai

Work from Office

As a Senior Software Engineer - ETL - Python at Incedo, you will be responsible for designing and developing ETL workflows to extract, transform, and load data from various sources to target systems. You will work with data analysts and architects to understand business requirements and translate them into technical solutions. You will be skilled in ETL tools such as Informatica or Talend and have experience in programming languages such as SQL or Python. You will be responsible for writing efficient and reliable code that is easy to maintain and troubleshoot.

Roles & Responsibilities:
- Develop, maintain, and enhance software applications for Extract, Transform, and Load (ETL) processes
- Design and implement ETL solutions that are scalable, reliable, and maintainable
- Develop and maintain ETL code, scripts, and jobs, ensuring they are efficient, accurate, and meet business requirements
- Troubleshoot and debug ETL code, identifying and resolving issues in a timely manner
- Collaborate with cross-functional teams, including data analysts, business analysts, and project managers, to understand requirements and deliver solutions that meet business needs
- Design and implement data integration processes between various systems and data sources
- Optimize ETL processes to improve performance, scalability, and reliability
- Create and maintain technical documentation, including design documents, coding standards, and best practices.

Technical Skills Requirements:
- Proficiency in programming languages such as Python for writing ETL scripts.
- Knowledge of data transformation techniques such as filtering, aggregation, and joining.
- Familiarity with ETL frameworks such as Apache NiFi, Talend, or Informatica.
- Understanding of data profiling, data quality, and data validation techniques.
- Must have excellent communication skills and be able to communicate complex technical information to non-technical stakeholders in a clear and concise manner.
- Must understand the company's long-term vision and align with it.
- Provide leadership, guidance, and support to team members, ensuring the successful completion of tasks and promoting a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team.

Qualifications:
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university. A computer science background is preferred.
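Data profiling, also listed above, is usually the first step before writing any transformation: it tells you how dirty a source actually is. A stdlib-only sketch over an invented batch of records (column and city names are illustrative):

```python
# Hedged sketch of basic data profiling: per-column null counts and distinct
# value counts over a batch of dict records.

def profile(records):
    """Return {column: {"nulls": n, "distinct": m}} for a list of dicts."""
    columns = {k for r in records for k in r}
    report = {}
    for col in sorted(columns):
        values = [r.get(col) for r in records]
        non_null = [v for v in values if v is not None]
        report[col] = {"nulls": len(values) - len(non_null),
                       "distinct": len(set(non_null))}
    return report

batch = [
    {"id": 1, "city": "Chennai"},
    {"id": 2, "city": None},
    {"id": 3, "city": "Chennai"},
]
print(profile(batch))
```

A profile like this immediately flags columns that need null handling or deduplication before the load step runs.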

Posted 3 weeks ago

Apply

4.0 - 6.0 years

6 - 10 Lacs

Gurugram

Work from Office

As a Senior Big Data Platform Engineer at Incedo, you will be responsible for designing and implementing big data platforms to support large-scale data integration projects. You will work with data architects and data engineers to define the platform architecture and build the necessary infrastructure. You will be skilled in big data technologies such as Hadoop, Spark, and Kafka and have experience in cloud computing platforms such as AWS or Azure. You will be responsible for ensuring the performance, scalability, and security of the big data platform and troubleshooting any issues that arise.

Roles & Responsibilities:
- Designing, developing and maintaining large-scale big data platforms using technologies like Hadoop, Spark and Kafka
- Creating and managing data warehouses, data lakes and data marts
- Implementing and optimizing ETL processes and data pipelines
- Developing and maintaining security and access controls
- Troubleshooting and resolving big data platform issues
- Collaborating with other teams to ensure the consistency and integrity of data

Technical Skills Requirements:
- Experience with big data processing technologies such as Apache Hadoop, Apache Spark, or Apache Kafka.
- Understanding of distributed computing concepts such as MapReduce, Spark RDDs, or Apache Flink data streams.
- Familiarity with big data storage solutions such as HDFS, Amazon S3, or Azure Data Lake Storage.
- Knowledge of big data processing frameworks such as Apache Hive, Apache Pig, or Apache Impala.
- Must have excellent communication skills and be able to communicate complex technical information to non-technical stakeholders in a clear and concise manner.
- Must understand the company's long-term vision and align with it.
- Provide leadership, guidance, and support to team members, ensuring the successful completion of tasks and promoting a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team.

Qualifications:
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university. A computer science background is preferred.
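The MapReduce concept named in the requirements above can be illustrated in miniature without a cluster: map emits key-value pairs, shuffle groups them by key, and reduce folds each group. This is a toy single-process sketch of the model, not how Hadoop or Spark is actually invoked:

```python
# Toy MapReduce word count in plain Python: map -> shuffle -> reduce.
from collections import defaultdict

def map_phase(lines):
    # Map: emit (word, 1) for every token.
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def shuffle(pairs):
    # Shuffle: group all values by key (what the framework does between phases).
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: fold each group's values into a single result per key.
    return {key: sum(values) for key, values in groups.items()}

lines = ["big data big pipelines", "data lakes"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts)  # {'big': 2, 'data': 2, 'pipelines': 1, 'lakes': 1}
```

On a real platform the shuffle is the expensive, network-bound step, which is why minimizing shuffled data is central to Spark performance tuning.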

Posted 3 weeks ago

Apply

3.0 - 5.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Role Purpose: The purpose of this role is to interpret data and turn it into information (reports, dashboards, interactive visualizations, etc.) which can offer ways to improve a business, thus affecting business decisions.

Do:
1. Manage the technical scope of the project in line with the requirements at all stages
   a. Gather information from various sources (data warehouses, databases, data integration and modelling) and interpret patterns and trends
   b. Develop record management processes and policies
   c. Build and maintain relationships at all levels within the client base and understand their requirements
   d. Provide sales data, proposals, data insights and account reviews to the client base
   e. Identify areas to increase efficiency and automation of processes
   f. Set up and maintain automated data processes
   g. Identify, evaluate and implement external services and tools to support data validation and cleansing
   h. Produce and track key performance indicators
2. Analyze the data sets and provide adequate information
   a. Liaise with internal and external clients to fully understand data content
   b. Design and carry out surveys and analyze survey data as per the customer requirement
   c. Analyze and interpret complex data sets relating to the customer's business and prepare reports for internal and external audiences using business analytics reporting tools
   d. Create data dashboards, graphs and visualizations to showcase business performance and provide sector and competitor benchmarking
   e. Mine and analyze large datasets, draw valid inferences and present them successfully to management using a reporting tool
   f. Develop predictive models and share insights with the clients as per their requirement

Mandatory Skills: Network Operations - Utilities. Experience: 3-5 Years.
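Producing and tracking key performance indicators, as described above, often comes down to a small computation applied consistently across reporting periods. An illustrative sketch with invented monthly lead/win counts (not from any real account):

```python
# Hypothetical KPI: lead-to-win conversion rate per month, guarded against
# periods with no activity so the dashboard never divides by zero.

def conversion_rate(leads, wins):
    """Share of leads converted in a period, rounded for reporting."""
    return round(wins / leads, 3) if leads else 0.0

monthly = {"Jan": (200, 30), "Feb": (180, 27), "Mar": (0, 0)}
kpis = {month: conversion_rate(leads, wins)
        for month, (leads, wins) in monthly.items()}
print(kpis)  # {'Jan': 0.15, 'Feb': 0.15, 'Mar': 0.0}
```

The same number would then feed a dashboard tile or a trend line; the important part is that the definition (numerator, denominator, rounding) is fixed in one place.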

Posted 3 weeks ago

Apply

4.0 - 6.0 years

6 - 10 Lacs

Gurugram

Work from Office

As a Senior Data Reporting Services Specialist at Incedo, you will be responsible for creating reports and dashboards for clients. You will work with clients to understand their reporting needs and design reports and dashboards that meet those needs. You will be skilled in data visualization tools such as Tableau or Power BI and have experience with reporting tasks such as data analysis, dashboard design, and report publishing.

Roles & Responsibilities:
- Design and develop reports and dashboards to help businesses make data-driven decisions.
- Develop data models and perform data analysis to identify trends and insights.
- Work with stakeholders to understand their reporting needs and develop solutions that meet those needs.
- Proficiency in data visualization tools like Tableau, Power BI, and QlikView.

Technical Skills Requirements:
- Strong knowledge of SQL and data querying tools such as Tableau, Power BI, or QlikView
- Experience in designing and developing data reports and dashboards
- Familiarity with data integration and ETL tools such as Talend or Informatica
- Understanding of data governance and data quality concepts
- Must have excellent communication skills and be able to communicate complex technical information to non-technical stakeholders in a clear and concise manner.
- Must understand the company's long-term vision and align with it.
- Provide leadership, guidance, and support to team members, ensuring the successful completion of tasks and promoting a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team.

Qualifications:
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university. A computer science background is preferred.

Posted 3 weeks ago

Apply

8.0 - 10.0 years

22 - 27 Lacs

Chennai

Work from Office

Required Skill Set:
- 8+ years of experience, with exposure to and experience in PIM solution definition and architecting
- Good understanding of MDM architectures and business processes, specifically for solutioning PIM implementations leveraging Inriver
- A minimum of 2 end-to-end implementation experiences with Inriver PIM, having architected at least one such solution; familiarity with the tool's architecture and components (data management, data model and extensions, digital asset management, workflows, supplier portal, data quality, integration aspects)
- Hands-on experience with the Informatica PIM tool, including data modeling; designing and implementing components for imports, exports, data migration and associated data cleansing/transformations; data validation rules; catalog management; and techniques for managing digital assets and unstructured content in PIM
- Should have designed and implemented automated workflows in the tool for data management processes
- Knowledge of and experience in integrating PIM with a Data Quality tool (IDQ) for implementing product-data DQ processes
- Experience in integrating PIM with ETL (data integration) and EAI tools for batch and real-time integrations
- Experience with or an understanding of data services suites of products (data quality, data integration, etc.) will be an added advantage
- Understanding of or experience with integrating PIM with external content management tools will be useful
- Excellent communication skills, with the ability to interface with customers and drive PIM requirement workshops to elicit and document functional and non-functional requirements

Roles & Responsibilities:
- Lead customer discussions and workshops for requirement elicitation, converting business requirements into PIM functional requirements
- Architect, design and implement Inriver solutions as per client requirements - configuring, extending and customizing various components of the PIM platform
- Provide technical leadership and lead other implementation team members (integration, data quality, BPM) throughout implementations
- Build best practices, reusable components and accelerators for PIM implementations
- Mentor junior team members on PIM solution design and implementations
- Support the practice by leading PIM solution definitions for different customers

Do:
1. Develop architectural solutions for new deals/major change requests in existing deals
- Create an enterprise-wide architecture that ensures systems are scalable, reliable, and manageable.
- Provide solutioning for RFPs received from clients and ensure overall design assurance
- Develop a direction to manage the portfolio of to-be solutions, including systems, shared infrastructure services and applications, in order to better match business outcome objectives
- Analyse the technology environment, enterprise specifics and client requirements to set a collaboration solution design framework/architecture
- Provide technical leadership to the design, development and implementation of custom solutions through thoughtful use of modern technology
- Define and understand current-state solutions and identify improvements, options and tradeoffs to define target-state solutions
- Clearly articulate, document and sell architectural targets, recommendations and reusable patterns, and accordingly propose investment roadmaps
- Evaluate and recommend solutions to integrate with the overall technology ecosystem
- Work closely with various IT groups to transition tasks, ensure performance and manage issues through to resolution
- Perform detailed documentation (app view, multiple sections and views) of the architectural design and solution, covering all artefacts in detail
- Validate the solution/prototype from a technology, cost-structure and customer-differentiation point of view
- Identify problem areas, perform root cause analysis of architectural design and solutions, and provide relevant solutions to the problem
- Collaborate with sales, program/project and consulting teams to reconcile solutions to architecture
- Track industry and application trends and relate these to planning current and future IT needs
- Provide technical and strategic input during the project planning phase in the form of technical architectural designs and recommendations
- Collaborate with all relevant parties to review the objectives and constraints of solutions and determine conformance with the Enterprise Architecture
- Identify implementation risks and potential impacts
2. Enable delivery teams by providing optimal delivery solutions/frameworks
- Build and maintain relationships with executives, technical leaders, product owners, peer architects and other stakeholders to become a trusted advisor
- Develop and establish relevant technical, business process and overall support metrics (KPI/SLA) to drive results
- Manage multiple projects and accurately report the status of all major assignments while adhering to all project management standards
- Identify technical, process and structural risks and prepare a risk mitigation plan for all projects
- Ensure quality assurance of all architecture or design decisions and provide technical mitigation support to the delivery teams
- Recommend tools for reuse and automation for improved productivity and reduced cycle times
- Lead the development and maintenance of the enterprise framework and related artefacts
- Develop trust and build effective working relationships through respectful, collaborative engagement across individual product teams
- Ensure architecture principles and standards are consistently applied to all projects
- Ensure optimal client engagement: support the pre-sales team while presenting the entire solution design and its principles to the client; negotiate, manage and coordinate with client teams to ensure all requirements are met and create an impact with the proposed solution; demonstrate thought leadership with strong technical capability in front of the client to win confidence and act as a trusted advisor
3. Competency building and branding
- Ensure completion of necessary trainings and certifications
- Develop Proof of Concepts (POCs), case studies, demos etc. for new growth areas based on market and customer research
- Develop and present Wipro's point of view on solution design and architecture by writing white papers, blogs etc.
- Attain market referenceability and recognition through highest analyst rankings, client testimonials and partner credits
- Be the voice of Wipro's thought leadership by speaking in forums (internal and external)
- Mentor developers, designers and junior architects in the project for their further career development and enhancement
- Contribute to the architecture practice by conducting selection interviews etc.
4. Team management
- Resourcing: anticipate new talent requirements as per market/industry trends or client requirements; hire adequate and right resources for the team
- Talent management: ensure adequate onboarding and training for team members to enhance capability and effectiveness; build an internal talent pool and ensure their career progression within the organization; manage team attrition; drive diversity in leadership positions
- Performance management: set goals for the team, conduct timely performance reviews and provide constructive feedback to direct reports; ensure that Performance Nxt is followed for the entire team
- Employee satisfaction and engagement: lead and drive engagement initiatives for the team; track team satisfaction scores and identify initiatives to build engagement within the team

Mandatory Skills: Informatica MDM. Experience: 8-10 Years.

Posted 3 weeks ago

Apply

1.0 - 5.0 years

3 - 7 Lacs

Bengaluru

Work from Office

ETRM Data Engineer:

Key Responsibilities:
- Design, develop, and maintain scalable data pipelines and ETRM systems.
- Work on data integration projects within the Energy Trading and Risk Management (ETRM) domain.
- Collaborate with cross-functional teams to integrate data from ETRM trading systems such as Allegro, RightAngle, and Endur.
- Optimize and manage data storage solutions in Data Lake and Snowflake.
- Develop and maintain ETL processes using Azure Data Factory and Databricks.
- Write efficient and maintainable code in Python for data processing and analysis.
- Ensure data quality, accuracy, integrity, and availability across various data sources, platforms, and trading systems.
- Collaborate with traders, analysts, and IT teams to understand data requirements and deliver robust solutions.
- Optimize and enhance data architecture for performance and scalability.

Mandatory Skills:
- Python/PySpark
- FastAPI
- Pydantic
- SQLAlchemy
- Snowflake or SQL
- Data Lake
- Azure Data Factory (ADF)
- CI/CD, Azure fundamentals, Git
- Integration of data solutions with ETRM trading systems (Allegro, RightAngle, Endur)

Good to have:
- Databricks
- Streamlit
- Kafka
- Power BI
- Kubernetes
- FastStream
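In the stack above, Pydantic typically enforces schema validation on trade records as they enter the pipeline. To keep this sketch self-contained it uses only stdlib dataclasses to show the same idea; the `TradeRecord` fields are hypothetical and not taken from Allegro, RightAngle, or Endur schemas:

```python
# Stdlib stand-in for the Pydantic-style validation an ETRM pipeline would do:
# reject malformed trades at the boundary, derive fields on construction.
from dataclasses import dataclass

VALID_COMMODITIES = {"power", "gas", "crude"}

@dataclass
class TradeRecord:
    trade_id: str
    commodity: str
    volume: float
    price: float

    def __post_init__(self):
        # Validate on construction, roughly as a Pydantic model would.
        if self.commodity not in VALID_COMMODITIES:
            raise ValueError(f"unknown commodity: {self.commodity}")
        if self.volume <= 0:
            raise ValueError("volume must be positive")
        # Derived field, computed once the inputs are known to be sane.
        self.notional = self.volume * self.price

trade = TradeRecord("T-001", "gas", volume=500.0, price=3.2)
print(trade.notional)  # 1600.0
```

With Pydantic the checks would live in field validators and invalid payloads arriving over FastAPI would be rejected with a 422 automatically; the principle of validating at the system boundary is the same.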

Posted 3 weeks ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Bengaluru

Work from Office

Role Purpose: The purpose of this role is to interpret data and turn it into information (reports, dashboards, interactive visualizations, etc.) which can offer ways to improve a business, thus affecting business decisions.

Do:
1. Manage the technical scope of the project in line with the requirements at all stages
   a. Gather information from various sources (data warehouses, databases, data integration and modelling) and interpret patterns and trends
   b. Develop record management processes and policies
   c. Build and maintain relationships at all levels within the client base and understand their requirements
   d. Provide sales data, proposals, data insights and account reviews to the client base
   e. Identify areas to increase efficiency and automation of processes
   f. Set up and maintain automated data processes
   g. Identify, evaluate and implement external services and tools to support data validation and cleansing
   h. Produce and track key performance indicators
2. Analyze the data sets and provide adequate information
   a. Liaise with internal and external clients to fully understand data content
   b. Design and carry out surveys and analyze survey data as per the customer requirement
   c. Analyze and interpret complex data sets relating to the customer's business and prepare reports for internal and external audiences using business analytics reporting tools
   d. Create data dashboards, graphs and visualizations to showcase business performance and provide sector and competitor benchmarking
   e. Mine and analyze large datasets, draw valid inferences and present them successfully to management using a reporting tool
   f. Develop predictive models and share insights with the clients as per their requirement

Mandatory Skills: Database Architecting. Experience: 5-8 Years.

Posted 3 weeks ago

Apply

8.0 - 10.0 years

22 - 27 Lacs

Bengaluru

Work from Office

Role Purpose The purpose of the role is to define and develop Enterprise Data Structure along with Data Warehouse, Master Data, Integration and transaction processing with maintaining and strengthening the modelling standards and business information. Do 1. Define and Develop Data Architecture that aids organization and clients in new/ existing deals a. Partnering with business leadership (adopting the rationalization of the data value chain) to provide strategic, information-based recommendations to maximize the value of data and information assets, and protect the organization from disruptions while also embracing innovation b. Assess the benefits and risks of data by using tools such as business capability models to create an data-centric view to quickly visualize what data matters most to the organization, based on the defined business strategy c. Create data strategy and road maps for the Reference Data Architecture as required by the clients d. Engage all the stakeholders to implement data governance models and ensure that the implementation is done based on every change request e. Ensure that the data storage and database technologies are supported by the data management and infrastructure of the enterprise f. Develop, communicate, support and monitor compliance with Data Modelling standards g. Oversee and monitor all frameworks to manage data across organization h. Provide insights for database storage and platform for ease of use and least manual work i. Collaborate with vendors to ensure integrity, objectives and system configuration j. Collaborate with functional & technical teams and clients to understand the implications of data architecture and maximize the value of information across the organization k. Presenting data repository, objects, source systems along with data scenarios for the front end and back end usage l. 
Define high-level data migration plans to transition data from source to target systems/applications, addressing the gaps between the current and future state, typically in sync with IT budgeting or other capital planning processes
m. Maintain knowledge of all data service provider platforms and ensure an end-to-end view
n. Oversee all data standards, reference data and papers for proper governance
o. Promote, guard and guide the organization towards common semantics and the proper use of metadata
p. Collect, aggregate, match, consolidate, quality-assure, persist and distribute such data throughout the organization to ensure a common understanding, consistency, accuracy and control
q. Provide solutions for RFPs received from clients and ensure overall implementation assurance
i. Develop a direction to manage the portfolio of all databases, including systems and shared infrastructure services, to better match business outcome objectives
ii. Analyse the technology environment, enterprise specifics and client requirements to set a collaboration solution for big/small data
iii. Provide technical leadership to the implementation of custom solutions through thoughtful use of modern technology
iv. Define and understand current issues and problems and identify improvements
v. Evaluate and recommend solutions that integrate with the overall technology ecosystem, keeping consistency throughout
vi. Understand the root-cause problems in integrating business and product units
vii. Validate the solution/prototype from a technology, cost-structure and customer-differentiation point of view
viii. Collaborate with sales and delivery leadership teams to identify future needs and requirements
ix. Track industry and application trends and relate these to planning current and future IT needs
2. Building an enterprise technology environment for data architecture management
a. Develop, maintain and implement standard patterns for data layers, data stores, data hubs & lakes and data management processes
b. Evaluate all implemented systems to determine their viability in terms of cost effectiveness
c. Collect structured and unstructured data from different places and integrate it into one database form
d. Work through every stage of data processing: analysis, creation, physical data model design, solutions and reports
e. Build the enterprise conceptual and logical data models for analytics, operational and data mart structures in accordance with industry best practices
f. Implement the best security practices across all databases, based on accessibility and technology
g. Demonstrate a strong understanding of activities within the primary discipline, such as Master Data Management (MDM), Metadata Management and Data Governance (DG)
h. Demonstrate strong experience in conceptual, logical and physical database architectures, design patterns, best practices and programming techniques around relational data modelling and data integration
3. Enable delivery teams by providing optimal delivery solutions/frameworks
a. Build and maintain relationships with delivery and practice leadership teams and other key stakeholders to become a trusted advisor
b. Define database physical structure, functional capabilities, security, back-up and recovery specifications
c. Develop and establish relevant technical, business-process and overall support metrics (KPI/SLA) to drive results
d. Monitor system capabilities and performance by performing tests and configurations
e. Integrate new solutions and troubleshoot previously occurring errors
f. Manage multiple projects and accurately report the status of all major assignments while adhering to all project management standards
g. Identify technical, process and structural risks and prepare a risk mitigation plan for all projects
h. Ensure quality assurance of all architecture and design decisions and provide technical mitigation support to the delivery teams
i. Recommend tools for reuse and automation for improved productivity and reduced cycle times
j. Help the support and integration teams achieve better efficiency and client experience, including ease of use through AI methods
k. Develop trust and build effective working relationships through respectful, collaborative engagement across individual product teams
l. Ensure architecture principles and standards are consistently applied to all projects
m. Ensure optimal client engagement
i. Support the pre-sales team while presenting the entire solution design and its principles to the client
ii. Negotiate, manage and coordinate with client teams to ensure all requirements are met
iii. Demonstrate thought leadership with strong technical capability in front of the client to win confidence and act as a trusted advisor

Mandatory Skills: Data Governance. Experience: 8-10 Years.
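Several of the governance duties above (common semantics, metadata, quality assurance before data is persisted and distributed) amount to enforcing declarative data standards. A minimal, hypothetical Python sketch — the rule names and record fields are invented for illustration and come from no specific toolset:

```python
# Hypothetical data-governance check: validate records against declared
# standards before they are persisted and distributed downstream.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Standard:
    name: str
    check: Callable[[dict], bool]   # returns True when the record conforms

# Illustrative standards only; a real catalogue would be far richer.
STANDARDS = [
    Standard("customer_id present", lambda r: bool(r.get("customer_id"))),
    Standard("country is ISO-2",    lambda r: len(r.get("country", "")) == 2),
]

def audit(records):
    """Return each record paired with the names of the standards it violates."""
    return [
        (r, [s.name for s in STANDARDS if not s.check(r)])
        for r in records
    ]

violations = audit([
    {"customer_id": "C1", "country": "IN"},
    {"customer_id": "",   "country": "India"},
])
```

Expressing standards as named, testable predicates is one way an architect can keep governance rules reviewable and consistently applied across projects.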

Posted 3 weeks ago

Apply

2.0 - 4.0 years

2 - 6 Lacs

Gurugram

Work from Office

About the Opportunity
Job Type: Application (29 July 2025)
Title: Analyst Programmer
Department: WPFH
Location: Gurgaon
Level: 2

Intro: We're proud to have been helping our clients build better financial futures for over 50 years. How have we achieved this? By working together - and supporting each other - all over the world. So, join our [insert name of team/ business area] team and feel like you're part of something bigger.

About your team
The successful candidate would join the Data team, bringing data integration and distribution experience to the Distribution Data and Reporting team and its consumers. The team is responsible for developing new, and supporting existing, middle-tier integration services and business services, and is committed to driving forward the development of leading-edge solutions.

About your role
This role is responsible for liaising with technical leads, business analysts and various product teams to design, develop and troubleshoot the ETL jobs for various operational data stores. The role will involve understanding the technical design, development and implementation of ETL and EAI architecture using Informatica/ETL tools. The successful candidate will demonstrate an innovative and enthusiastic approach to technology and problem solving, display good interpersonal skills, and show confidence and the ability to interact professionally with people at all levels, exhibiting a high level of ownership within a demanding working environment.

Key Responsibilities
- Work with technical leads, business analysts and other subject matter experts
- Understand the data model/design and develop the ETL jobs
- Apply sound technical knowledge of Informatica to take ownership of allocated development activities and work independently
- Use working knowledge of Oracle databases to take ownership of the underlying SQL for the ETL jobs (under guidance of the technical leads)
- Provide development estimates
- Implement standards, procedures and best practices for data maintenance, reconciliation and exception management
- Interact with cross-functional teams to coordinate dependencies and deliverables

Essential Skills
Technical:
- Deep knowledge and experience of the Informatica PowerCenter tool set (minimum 3 years)
- Experience in Snowflake
- Experience of source control tools
- Experience of job scheduling tools such as Control-M
- Experience in UNIX scripting
- Strong SQL or PL/SQL experience (minimum 2 years)
- Experience in Data Warehouse, Data Mart and ODS concepts
- Knowledge of data normalisation/OLAP and Oracle performance optimisation techniques
- 3+ years' experience of either Oracle or SQL Server and its utilities, coupled with experience of UNIX/Windows

Functional:
- 3+ years' experience of working within financial organisations, with broad-based business process, application and technology architecture experience
- Experience with data distribution and access concepts, with the ability to use these concepts to realise a proper physical model from a conceptual one
- Business-facing, with the ability to work alongside data stewards in systems and the business
- Strong interpersonal, communication and client-facing skills
- Ability to work closely with cross-functional teams

About you
- B.E./B.Tech/MBA/M.C.A/any other bachelor's degree
- At least 3+ years of experience in data integration and distribution
- Experience in building web services and APIs
- Knowledge of Agile software development life-cycle methodologies
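The reconciliation and exception-management responsibility above can be sketched in a tool-agnostic way. In practice this logic would live in an Informatica mapping or a post-load SQL check; the Python below is a hedged illustration with invented key values:

```python
# Hypothetical post-load reconciliation: compare source and target keys
# and route mismatches to an exception report instead of failing silently.
def reconcile(source_keys, target_keys):
    source, target = set(source_keys), set(target_keys)
    return {
        "missing_in_target": sorted(source - target),    # rows the load dropped
        "unexpected_in_target": sorted(target - source), # rows with no source
    }

result = reconcile(
    source_keys=["A1", "A2", "A3"],
    target_keys=["A1", "A3", "A9"],
)
```

Set difference on business keys is the usual first-pass check; a fuller reconciliation would also compare row-level hashes or amounts for keys present on both sides.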

Posted 3 weeks ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Pune

Work from Office

Snowflake Data Engineer

Overall experience: 5+ years in Snowflake and Python, and 5+ years in data preparation and BI projects — understanding business requirements in a BI context and the data model in order to transform raw data into meaningful data using Snowflake and Python.

- Design and create data models that define the structure and relationships of the various data elements within the organization. This includes conceptual, logical and physical data models, which help ensure data accuracy, consistency and integrity.
- Design data integration solutions that allow different systems and applications to share and exchange data seamlessly. This may involve selecting appropriate integration technologies, developing ETL (Extract, Transform, Load) processes, and ensuring data quality during the integration process.
- Create and maintain optimal data pipeline architecture.
- Good knowledge of cloud platforms such as AWS/Azure/GCP.
- Good hands-on knowledge of Snowflake is a must.
- Experience with various data ingestion methods (Snowpipe and others), time travel, data sharing and other Snowflake capabilities.
- Good knowledge of Python/PySpark, including advanced features of Python.
- Support business development efforts (proposals and client presentations).
- Ability to thrive in a fast-paced, dynamic, client-facing role where delivering solid work products that exceed high expectations is the measure of success.
- Excellent leadership and interpersonal skills; eager to contribute to a team-oriented environment.
- Strong prioritization and multi-tasking skills with a track record of meeting deadlines.
- Ability to be creative and analytical in a problem-solving environment.
- Effective verbal and written communication skills.
- Adaptable to new environments, people, technologies and processes; able to manage ambiguity and solve undefined problems.
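The raw-to-meaningful transformation this role describes can be sketched in plain Python. This is only an illustration — a real pipeline would typically push such logic into Snowflake SQL or PySpark, and the field names here are invented:

```python
# Hypothetical pipeline step: normalise raw sales events into the shape
# a BI model expects (typed amounts, a derived attribute, rejected bad rows).
def transform(raw_rows):
    clean, rejected = [], []
    for row in raw_rows:
        try:
            amount = float(row["amount"])          # raw feeds often arrive as text
        except (KeyError, ValueError):
            rejected.append(row)                   # exception management: quarantine
            continue
        clean.append({
            "order_id": row["order_id"],
            "amount": round(amount, 2),
            "is_high_value": amount >= 1000.0,     # derived BI attribute
        })
    return clean, rejected

clean, rejected = transform([
    {"order_id": "O1", "amount": "1250.50"},
    {"order_id": "O2", "amount": "oops"},
])
```

Keeping a separate rejected stream (rather than dropping bad rows) preserves data quality visibility during integration, which matches the ETL responsibilities listed above.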

Posted 3 weeks ago

Apply

2.0 - 5.0 years

9 - 11 Lacs

Ahmedabad

Work from Office

We are hiring for one of our clients, an IT company based in Ahmedabad, Gujarat.
Job Title: SAS Developer
Experience: 2 to 5 years
Location: PAN India (willing to relocate)
Required Candidate Profile
- SAS Base, Macros, SQL, DI, VA, Viya
- RDBMS: Oracle, SQL Server, Teradata
- Data validation, analytics, reporting
- Banking data models & regulatory reporting
- SAS Enterprise Guide, AML/KYC

Posted 3 weeks ago

Apply