4.0 - 8.0 years
0 Lacs
indore, madhya pradesh
On-site
We are looking for a Senior Associate - Salesforce Developer who will be responsible for designing, developing, and implementing customized Salesforce solutions. You should have strong expertise in Apex, Lightning Web Components (LWC), and Salesforce integrations, and adhere to best practices for scalability and performance. This role requires problem-solving skills, a proactive approach, and collaboration with cross-functional teams, including Salesforce Administrators, Business Analysts, and Architects. Your primary responsibilities will span Salesforce development and customization, integration and API management, data management and security, testing, deployment and DevOps, and collaboration and documentation.
- Salesforce development and customization: develop, test, and deploy Apex classes, triggers, batch jobs, and schedulers; design and implement Lightning Web Components (LWC) and Aura Components; create and maintain custom objects, fields, workflows, process automation, and validation rules.
- Integration and API management: develop REST and SOAP API integrations with external systems; work with Platform Events, Change Data Capture, and Asynchronous Apex for efficient data processing; collaborate on third-party integrations using tools such as MuleSoft, Boomi, or ETL solutions.
- Data management and security: ensure proper data governance, security, and compliance through sharing rules, profiles, and permission sets; perform data migration and transformation using Data Loader, Workbench, or ETL tools; optimize SOQL/SOSL queries to stay within governor limits.
- Testing, deployment, and DevOps: write and maintain Apex test classes to meet the required 75%+ code coverage; conduct code reviews and enforce Salesforce best practices.
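Since the responsibilities above touch on Asynchronous Apex, efficient data processing, and governor limits, here is a minimal, hedged sketch of the chunking idea an integration client might apply before bulk DML; the batch size of 200 mirrors Apex trigger batching, the function name is invented, and the sketch is in Python rather than Apex so it is self-contained:

```python
from typing import Iterable, Iterator, List

def chunk_records(records: Iterable[dict], batch_size: int = 200) -> Iterator[List[dict]]:
    """Yield fixed-size batches so bulk DML calls stay within platform limits."""
    batch: List[dict] = []
    for rec in records:
        batch.append(rec)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:
        yield batch  # final partial batch

# 450 contacts split into batches of 200, 200, and 50.
contacts = [{"LastName": f"Contact{i}"} for i in range(450)]
batch_sizes = [len(b) for b in chunk_records(contacts)]
print(batch_sizes)  # → [200, 200, 50]
```

The same pattern applies whether the sink is a Bulk API job or a batch Apex invocation.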
- Collaboration and documentation: work closely with Business Analysts and Architects to understand business requirements; document technical solutions, design patterns, and best practices; mentor junior developers and contribute to knowledge-sharing sessions.
Primary skills: 4+ years of hands-on experience as a Salesforce Developer; strong expertise in Apex, Lightning Web Components (LWC), and SOQL/SOSL; experience with Flows, Process Builder, and other declarative automation tools; hands-on experience with Salesforce security; experience with Salesforce API integrations; strong debugging skills; and knowledge of Salesforce deployment tools.
Secondary skills (good to have): experience with Sales Cloud, Service Cloud, or Experience Cloud; knowledge of DevOps tools; familiarity with integration platforms; understanding of Agile/Scrum methodologies; and strong problem-solving skills.
Preferred certifications: Salesforce Platform Developer I (PDI), Salesforce Platform Developer II (PDII), Salesforce JavaScript Developer I, and Salesforce App Builder.
You should also have proven experience in business development, sales, or a related field; strong leadership and team management skills; excellent communication and presentation skills; the ability to develop and implement strategic plans; an analytical mindset; a proactive, results-oriented approach; familiarity with emerging technologies and industry trends; experience building and managing effective sales teams; the ability to adapt to a fast-paced, dynamic work environment; strong negotiation and interpersonal skills; and knowledge of CRM software and sales management tools.
Posted 1 week ago
2.0 - 8.0 years
0 Lacs
haryana
On-site
You are as unique as your background, experience, and point of view. Here, you'll be encouraged, empowered, and challenged to be your best self. You'll work with dynamic colleagues - experts in their fields - who are eager to share their knowledge with you. Your leaders will inspire and help you reach your potential and soar to new heights. Every day, you'll have new and exciting opportunities to make life brighter for our Clients - who are at the heart of everything we do. Discover how you can make a difference in the lives of individuals, families, and communities around the world.
Role Summary: Within Data and Analytics Services, the Lead Analytics Consultant is responsible for developing innovative visual analytics solutions and enabling faster, better decision making for Sun Life. Our growing mandate to deliver Data Analytics, Artificial Intelligence, and Data Solutions requires an experienced data visualization practitioner to accelerate the development of our strategic analytics projects in support of our business stakeholders.
Main Accountabilities:
- Work directly with data engineers, business data analysts, and business operations teams throughout the entire engagement life cycle, including technical analysis, design, development, implementation, and consulting efforts.
- Provide plans and estimates for reporting initiatives.
- Maintain standards for analysis and reporting at all levels, including best-practice data visualization techniques.
- Facilitate group meetings of various sizes for solution design, decision making, problem solving, and solution implementation.
- Turn complex data into easy-to-understand visual insights aligned to business objectives, and develop a user experience that is simple, yet flexible.
- Incorporate data governance and access controls in QuickSight/Tableau.
- Enable visual storytelling through QuickSight/Tableau and visual analytics tools.
Education and Experience:
- Minimum graduate degree in Mathematics, Computer Science, Engineering, or equivalent.
- Expertise in visual analytics and design principles.
- Total experience: 8+ years, including 5+ years in Tableau, Power BI, QlikView, or other visualization development, designing dashboards and decision enablement tools.
- SQL and PL/SQL programming.
- A rich portfolio of design use cases demonstrating excellent user experience.
- Experience working in an agile development environment, including rapid prototyping during sprints.
- 2+ years of working experience in QuickSight/Tableau.
Competencies:
- Experience with PostgreSQL.
- Rich experience in end-to-end testing of dashboards with respect to data and features, including creating test cases that cover all aspects.
- Visual design expertise with a solid understanding of best practices around dashboards and visual perception.
- End-to-end development experience and the ability to create complex calculations (parameters, expressions, actions, and filters) and implement advanced dashboard practices in QuickSight/Tableau.
- Leveraging QuickSight/Tableau for data organization and scheduling.
- Ability to understand data modeling based on user specs, and to design custom landing pages in QuickSight/Tableau for an enhanced user experience.
- Ability to work independently and manage engagements, or parts of large engagements, directly with business partners.
- Solid understanding of how to consolidate and transform data into meaningful, actionable information, and the ability to draw out business insights by synthesizing information from multiple sources.
- Strong oral and written communication skills, including presentation and storytelling with data.
- Demonstrated ability to transform user stories into smart, elegant analytical apps for decision making.
- Strong problem-solving and troubleshooting skills with the ability to exercise mature judgment.
- A willingness to listen, challenge, communicate, and respect team members' ideas and opinions openly; comfortable with ambiguity and effective in delivering iterative solutions.
Additional Assets: any visualization tool certification; experience with AWS data sources and other AWS services; experience with Agile or design-driven development of analytics applications; exposure to Data Analytics projects including predictive modeling, data mining, and statistical analysis.
Job Category: Advanced Analytics
Posting End Date: 16/10/2024
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
noida, uttar pradesh
On-site
As a highly motivated and experienced Data Engineer, you will be responsible for designing, developing, and implementing solutions that enable seamless data integration across multiple cloud platforms. Your expertise in data lake architecture, Iceberg tables, and cloud compute engines like Snowflake, BigQuery, and Athena will ensure efficient and reliable data access for various downstream applications. Your key responsibilities will include collaborating with stakeholders to understand data needs and define schemas, and designing and implementing data pipelines for ingesting, transforming, and storing data. You will also develop data transformation logic to make Iceberg tables compatible with the data access requirements of Snowflake, BigQuery, and Athena, and design and implement solutions for seamless data transfer and synchronization across different cloud platforms. Ensuring data consistency and quality across the data lake and target cloud environments will be crucial in your role. Additionally, you will analyze data patterns and identify performance bottlenecks in data pipelines, implement data optimization techniques to improve query performance and reduce data storage costs, and monitor data lake health to proactively address potential issues. Collaboration and communication with architects, leads, and other stakeholders to ensure data quality meets specific requirements will also be an essential part of your role. To be successful in this position, you should have at least 4 years of experience as a Data Engineer, strong hands-on experience with data lake architectures and technologies, proficiency in SQL and scripting languages, and experience with data governance and security best practices. Excellent problem-solving and analytical skills, strong communication and collaboration skills, and familiarity with cloud-native data tools and services are also required.
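The data consistency and quality responsibilities described above often come down to lightweight row-level validation before load. A minimal sketch under assumed inputs; the schema fields and rejection rules are invented for illustration:

```python
from typing import Dict, List, Tuple

REQUIRED = {"order_id", "amount", "currency"}  # hypothetical schema

def validate_rows(rows: List[Dict]) -> Tuple[List[Dict], List[Dict]]:
    """Split rows into (good, rejected): reject missing fields or negative amounts."""
    good, rejected = [], []
    for row in rows:
        if not REQUIRED.issubset(row) or row.get("amount", 0) < 0:
            rejected.append(row)
        else:
            good.append(row)
    return good, rejected

rows = [
    {"order_id": 1, "amount": 10.0, "currency": "USD"},
    {"order_id": 2, "amount": -5.0, "currency": "USD"},  # negative amount → rejected
    {"order_id": 3, "currency": "EUR"},                  # missing amount → rejected
]
good, rejected = validate_rows(rows)
print(len(good), len(rejected))  # → 1 2
```

In a real pipeline the rejected rows would be routed to a quarantine table rather than dropped.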
Additionally, certifications in relevant cloud technologies will be beneficial. In return, GlobalLogic offers exciting projects in industries like high-tech, communication, media, healthcare, retail, and telecom. You will have the opportunity to collaborate with a diverse team of highly talented individuals in an open, laid-back environment. Work-life balance is prioritized with flexible work schedules, opportunities to work from home, and paid time off and holidays. Professional development opportunities include communication skills training, stress management programs, professional certifications, and technical and soft-skill training. GlobalLogic provides competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS (National Pension Scheme), extended maternity leave, annual performance bonuses, and referral bonuses. Fun perks such as sports events, cultural activities, food at subsidized rates, corporate parties, dedicated GL Zones, rooftop decks, and discounts at popular stores and restaurants are also part of the vibrant office culture at GlobalLogic. About GlobalLogic: GlobalLogic is a leader in digital engineering, helping brands design and build innovative products, platforms, and digital experiences for the modern world. By integrating experience design, complex engineering, and data expertise, GlobalLogic helps clients accelerate their transition into tomorrow's digital businesses. Operating under Hitachi, Ltd., GlobalLogic contributes to driving innovation through data and technology for a sustainable society with a higher quality of life.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
karnataka
On-site
We are looking for an experienced professional who possesses the required mathematical and statistical expertise, along with the natural curiosity and creative mind needed to uncover hidden opportunities in data and realize its full potential. You will be responsible for developing modern data warehouse solutions using Databricks and the AWS/Azure stack. This includes providing forward-thinking solutions in the data engineering and analytics space, collaborating with DW/BI leads on ETL pipeline development requirements, triaging issues in existing pipelines, working with the business to understand reporting needs, and developing data models to fulfill those needs. You will also assist team members in resolving technical challenges, drive technical discussions with client architects, and orchestrate data pipelines via Airflow. As for qualifications, you should have a Bachelor's and/or Master's degree in computer science or equivalent experience, with at least 3 years of experience in Data & Analytics. Communication and presentation skills are essential. You must have a minimum of two years' experience in Databricks implementations and large-scale data warehouse end-to-end implementations, and being a Databricks-certified architect is a must. Proficiency in SQL and experience with scripting languages like Python, Spark, and PySpark for data manipulation and automation are required. Additionally, you should have a solid understanding of cloud platforms (AWS, Azure, GCP) and their integration with Databricks. Familiarity with data governance and data management practices, plus exposure to tools like data sharing, Unity Catalog, dbt, replication tools, and performance tuning, will be advantageous. Tredence is a company that focuses on delivering powerful insights into profitable actions by combining strengths in business analytics, data science, and software engineering. Headquartered in the San Francisco Bay Area, Tredence serves clients in the US, Canada, Europe, and Southeast Asia.
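The ETL pipeline and SQL work described above frequently reduces to transformations such as deduplicating and aggregating staged data. A minimal sketch using SQLite purely so it is runnable; on this posting's stack the same SQL pattern would run as Databricks/Spark SQL, and the table and column names are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE staging_sales (order_id INTEGER, region TEXT, amount REAL);
    INSERT INTO staging_sales VALUES
        (1, 'APAC', 100.0),
        (1, 'APAC', 100.0),  -- duplicate load of the same order
        (2, 'EMEA', 250.0);
""")
# Deduplicate the staged rows, then aggregate per region.
rows = conn.execute("""
    WITH deduped AS (SELECT DISTINCT order_id, region, amount FROM staging_sales)
    SELECT region, SUM(amount) FROM deduped GROUP BY region ORDER BY region
""").fetchall()
print(rows)  # → [('APAC', 100.0), ('EMEA', 250.0)]
```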
We are an equal opportunity employer that values diversity and is dedicated to creating an inclusive environment for all employees. For more information, please visit our website: https://www.tredence.com/
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
Department: Development
Location: Pune, India
Description: Our bright team fast-track their careers with international exposure and ways of working based on agile development best practices from globally renowned technology consultancies.
Key Responsibilities: Data Architect
- Create data models that specify how data is formatted, stored, and retrieved inside an organisation, including conceptual, logical, and physical models.
- Create and optimise databases, including the selection of appropriate database management systems (DBMS) and the standardisation and indexing of data.
- Create and maintain data integration processes, ETL (Extract, Transform, Load) workflows, and data pipelines to seamlessly transport data between systems.
- Collaborate with business analysts, data scientists, and other stakeholders to understand data requirements and align the architecture with business objectives.
- Stay current with industry trends, best practices, and advancements in data management through continuous learning and professional development.
- Establish processes for monitoring and improving the quality of data within the organisation; implement data quality tools and practices to detect and resolve data issues.
Requirements and Skills: Data Architect
- Prior experience in designing data warehouses, data modelling, database design, and data administration is required.
- Database expertise: knowledge of data warehousing concepts and proficiency in various database systems (e.g., SQL).
- Knowledge of data modelling tools such as Visual Paradigm is required.
- Knowledge of ETL methods and technologies (for example, Azure ADF, Events).
- Expertise writing complex stored procedures.
- Good understanding of data modelling concepts such as star schema and snowflake schema.
- Strong problem-solving and analytical skills are required to build effective data solutions.
- Excellent communication skills are required to work with cross-functional teams and convert business objectives into technical solutions.
- Knowledge of data governance: understanding data governance principles, data security, and regulatory compliance.
- Knowledge of programming languages such as .NET can be advantageous.
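The star schema concept called out in the requirements centres a fact table of measures on foreign keys into dimension tables. An illustrative sketch with invented table and column names, using SQLite so it is self-contained:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, full_date TEXT);
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
    -- Fact table: one row per sale, measures plus keys into the dimensions.
    CREATE TABLE fact_sales (
        date_key    INTEGER REFERENCES dim_date(date_key),
        product_key INTEGER REFERENCES dim_product(product_key),
        quantity    INTEGER,
        amount      REAL
    );
""")
tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")]
print(tables)  # → ['dim_date', 'dim_product', 'fact_sales']
```

A snowflake schema would further normalise the dimension tables (e.g., splitting product category out of dim_product).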
Posted 1 week ago
1.0 - 5.0 years
0 Lacs
vadodara, gujarat
On-site
Job Title: Data Architect
Experience: 3 to 4 years
Location: Vadodara, Gujarat
Contact: 9845135287
Job Summary: We are seeking a highly skilled and experienced Data Architect to join our team. As a Data Architect, you will play a crucial role in assessing the current state of our data landscape and working closely with the Head of Data to develop a comprehensive data strategy that aligns with our organisational goals. Your primary responsibility will be to understand and map our current data environments, and then help develop a detailed roadmap that will deliver a data estate enabling our business to meet its core objectives.
Main Duties & Responsibilities: The role's core duties include, but are not limited to:
- Assess the current state of our data infrastructure, including data sources, storage systems, and data processing pipelines.
- Collaborate with the Data Ops Director to define and refine the data strategy, taking into account business requirements, scalability, and performance.
- Design and develop a cloud-based data architecture, leveraging Azure technologies such as Azure Data Lake Storage, Azure Synapse Analytics, and Azure Data Factory.
- Define data integration and ingestion strategies to ensure smooth, efficient data flow from various sources into the data lake and warehouse.
- Develop data modelling and schema designs to support efficient data storage, retrieval, and analysis.
- Implement data governance processes and policies to ensure data quality, security, and compliance.
- Collaborate with cross-functional teams, including data engineers, data scientists, and business stakeholders, to understand data requirements and provide architectural guidance.
- Conduct performance tuning and optimization of the data infrastructure to meet business and analytical needs.
- Stay updated with the latest trends and advancements in data management, cloud technologies, and industry best practices.
- Provide technical leadership and mentorship to junior team members.
Key Skills
- Proven work experience as a Data Architect or in a similar role, with a focus on designing and implementing cloud-based data solutions using Azure technology.
- Strong knowledge of data architecture principles, data modelling techniques, and database design concepts.
- Experience with cloud platforms, particularly Azure, and a solid understanding of their data-related services and tools.
- Proficiency in SQL and one or more programming languages commonly used for data processing and analysis (e.g., Python, R, Scala).
- Familiarity with data integration techniques, ETL/ELT processes, and data pipeline frameworks.
- Knowledge of data governance, data security, and compliance practices.
- Strong analytical and problem-solving skills, with the ability to translate business requirements into scalable and efficient data solutions.
- Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams and stakeholders.
- Ability to adapt to a fast-paced and dynamic work environment and manage multiple priorities simultaneously.
Working Relationships: Liaison with stakeholders at all levels of the organisation.
Communication: Communicate with leadership and colleagues in relation to all business activities; highly articulate and able to explain complex concepts in bite-size chunks; strong ability to provide clear written reporting and analysis.
Personal Qualities
- Ability to work to deadlines, with good time management skills.
- Commercially mindful and able to deliver solutions that maximise value.
- Strong analytical skills; accurate, with excellent attention to detail.
- Personal strength and resilience; adaptable and embraces change.
- Reliable, conscientious, and hardworking; approachable and professional.
- Willingness to learn, while recognising the limits of one's ability and when to seek advice.
Knowledge / Key Skills
Essential:
- Experience of Azure development and design principles.
- Enterprise-level data warehousing design and implementation; architecture principles.
- Proficiency in SQL development.
- Familiarity with data integration techniques, ETL/ELT processes, and data pipeline frameworks.
- Knowledge of data governance, data security, and compliance practices.
- Strong experience mapping an existing data landscape and developing a roadmap to deliver business requirements.
- Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams and stakeholders.
- Ability to adapt to a fast-paced and dynamic work environment and manage multiple priorities simultaneously.
Desirable:
- Knowledge of Enterprise Architecture frameworks (e.g., TOGAF).
- Programming languages such as R, Python, Scala, etc.
Job Type: Full-time
Experience: total work: 1 year (Preferred)
Work Location: In person
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
karnataka
On-site
You are a Data Solution Lead with over 10 years of experience in Data Governance, Data Modeling, Data Architecture, and Data Lineage, particularly within the BFSI sector. Your primary responsibilities include collaborating with business stakeholders to gather and analyze data requirements; designing enterprise data models and ensuring seamless data integration; and implementing data governance policies, metadata management, and data lineage tracking. Additionally, you will be responsible for developing data catalogs, business glossaries, and dictionaries, as well as improving data quality, compliance, and automation in data processes. To excel in this role, you must possess expertise in Data Governance, Data Modeling, and Data Architecture, along with strong SQL and data migration experience. Knowledge of the BFSI domain is preferred. Excellent stakeholder management and communication skills are crucial for effective collaboration, and the ability to automate data processes will contribute to enhancing efficiency within the organization.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
You should have experience in understanding and translating data, analytics requirements, and functional needs into technical requirements while collaborating with global customers. Your responsibilities and required skills include:
- Designing cloud-native data architectures to support scalable, real-time, and batch processing.
- Building and maintaining data pipelines for large-scale data management, in alignment with the data strategy and processing standards.
- Defining strategies for data modeling, data integration, and metadata management.
- Strong experience in database, data warehouse, and data lake design and architecture.
- Proficiency in leveraging cloud platforms such as AWS, Azure, or GCP for data storage, compute, and analytics services.
- Experience in database programming using various SQL flavors.
- Implementing data governance frameworks encompassing data quality, lineage, and cataloging.
- Collaborating with cross-functional teams, including business analysts, data engineers, and DevOps teams.
- Familiarity with the Big Data ecosystem, whether on-premises (Hortonworks/MapR) or in the cloud.
- Ability to evaluate emerging cloud technologies and suggest enhancements to the data architecture.
- Proficiency in an orchestration tool such as Airflow or Oozie for scheduling pipelines.
- Hands-on experience with tools such as Spark Streaming, Kafka, Databricks, and Snowflake.
- Comfort working in an Agile/Scrum development process and optimizing data systems for cost efficiency, performance, and scalability.
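Orchestration tools such as Airflow or Oozie, mentioned above, schedule pipelines over fixed intervals, with each run conceptually covering one partition window. A hypothetical stdlib-only sketch of computing daily backfill windows; the function name and the daily granularity are assumptions:

```python
from datetime import date, timedelta
from typing import Iterator, Tuple

def daily_windows(start: date, end: date) -> Iterator[Tuple[date, date]]:
    """Yield (window_start, window_end) pairs, one per day in [start, end)."""
    day = start
    while day < end:
        yield day, day + timedelta(days=1)
        day += timedelta(days=1)

# Three daily runs covering Jan 1 through Jan 3 inclusive.
windows = list(daily_windows(date(2024, 1, 1), date(2024, 1, 4)))
print(len(windows))  # → 3
```

An orchestrator would pass each window's bounds as parameters to the pipeline run, making every run idempotent over its own slice of data.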
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
pune, maharashtra
On-site
Job Description
- Direct IT infrastructure development; implement advanced security protocols and innovative solutions to drive business growth.
- Manage financial planning and budgeting; align financial strategies with organizational goals to optimize resources.
- Foster cross-departmental collaboration to improve organizational efficiency and resilience.
- Supervise hardware, networking, and data security, including Windows and Linux server administration.
- Handle IT asset inventory, vendor relationships, and system updates to ensure operational continuity.
- Configure and support network security across multiple locations.
- Administer Windows Server and network security protocols, troubleshooting hardware and software issues.
Key Responsibilities
- Manage information technology and computer systems.
- Plan, organize, control, and evaluate IT and electronic data operations.
- Design, develop, implement, and coordinate systems, policies, and procedures.
- Ensure the security of data, network access, and backup systems.
- Act in alignment with user needs and system functionality to contribute to organizational policy.
- Identify problematic areas and implement strategic solutions in time.
- Audit systems and assess their outcomes.
- Preserve assets, information security, and control structures.
- Handle the annual budget and ensure cost effectiveness.
Requirements
- Proven working experience of 2+ years.
- Excellent knowledge of technical management, information analysis, and computer hardware/software systems.
- Expertise in data centre management and data governance.
- Hands-on experience with computer networks, network administration, and network installation.
Posted 1 week ago
7.0 - 11.0 years
0 Lacs
karnataka
On-site
The Data Governance Specialist - Finance & Treasury will play a pivotal role in defining and implementing a data governance strategy for Finance & Treasury. You will need to partner closely with the Finance, Fin Ops, and Treasury Management teams, Business Leaders (Data Providers), CDO, GLRR, Risk, CFOs, COOs, and CIOs to assist in various activities. These include, but are not limited to: assisting in the delivery of the S166 Liquidity remediation activity across Finance & Treasury; implementing BAU Liquidity processes to comply with the Data Quality Management Standard (DQMS); collaborating with the S166 Liquidity Programme teams; supporting the Head of Data Governance & Compliance; conducting data analysis on data lineage flows; escalating breaches of the Data Quality Management Framework (DQMF); working with Technology to establish a standardised toolset; and supporting the Head of DG in ensuring DG BU metrics via MDM. Ensuring full compliance with DQMS, and promptly escalating any elevated data risk issues for timely resolution, will be key. Important business processes include working closely with upstream business functions to track and monitor remediation activity, overseeing the remediation activity supporting the S166 Liquidity work, and ensuring the integrity and quality of DQMF artefacts. You will also be responsible for developing training and awareness programs for data governance across all Finance and Treasury teams, fostering a culture of data management; identifying, assessing, monitoring, controlling, and mitigating risks relevant to F&T data governance; and ensuring accurate, quality updates are presented in the DQ governance forums.
The role will require you to work with a wide range of key stakeholders, such as the Group CFO, Head of Finance, Group Treasurer, the Finance, Fin Ops, and Treasury Management teams, the Head of Data Management, Finance & Treasury, the Head of Data Governance, Finance & Treasury, the BCBS239 programme team, Business COOs and Business Leaders, CFOs, CIOs, CDO, Risk, Audit and Compliance, external consultants/agents, and regulators. Qualifications required for this role include an MBA (Finance) or a Master's in Finance, Accountancy, Economics, or an affiliated subject; DCAM Professional certification; a minimum of 7 to 10 years of experience in data governance and data management; good knowledge of Finance domains and BU metrics; the ability to analyze data to drive greater insight for the business; and proficiency in working with MS Excel and SQL. Experience in visualization tools such as Tableau, Power BI, or Qlik would be a plus. If you are interested in this opportunity, please visit our website via the Apply button below for further information and to apply.
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
hyderabad, telangana
On-site
The Corporate Third Party Oversight (CTPO) group at JPMorgan Chase is responsible for developing, deploying, overseeing, and reporting on a program that ensures effective use of third parties and affiliates to achieve strategic goals. The program focuses on ensuring third parties meet high standards in areas such as client service, quality, control, regulatory compliance, business resiliency, and information protection. As a Vice President, Third Party Reporting within CTPO, you will lead and support activities related to data feeds, regulatory & audit submissions, and data governance. Collaboration with CTPO Reporting Utility members in Europe and UK, as well as with the broader CTPO team globally, will be a key aspect of your role. You will enhance the framework for regulatory reporting across JPMC legal entities and ensure timely escalation and resolution of risks, assumptions, and dependencies. Responsibilities: - Lead a lean team in automating data feeds, managing submissions as projects, and supporting data governance in alignment with organizational priorities. - Spearhead efforts in hiring, training, and integrating new personnel into the CTPO framework to support other entities as the scope expands. - Collaborate with CTPO Reporting Utility members in Europe and UK, wider CTPO team members globally, and Outsourcing governance teams at entity or location levels during the design of submission templates, data governance, and submission processes. - Develop a thorough understanding of internal processes for data collection and maintenance that contribute to Regulatory reports, guiding the team in report design and submissions accordingly. - Review business requirements as part of automating data feeds, oversee testing of changes, and manage stakeholder communications for these modifications. 
- Partner with stakeholders to enhance the framework for regulatory reporting across JPMC legal entities by identifying and reducing manual touchpoints within the submission process.
- Ensure prompt escalation, communication, and resolution of risks, assumptions, and dependencies.
Required qualifications, capabilities, and skills:
- Strong people management, leadership, and communication skills (verbal and written).
- Proficiency in business analysis, including interpreting reporting requirements, reviewing technical specifications, and leading UAT testing.
- Strong data literacy and Excel skills for handling large datasets.
- Ability to enhance efficiency and continuously improve processes.
- Aptitude for forming collaborative relationships with business partners.
- Skills in risk management, compliance, oversight, and control.
- Ability to work under tight timelines and evolving external requirements.
Preferred qualifications, capabilities, and skills:
- Experience in investment banking regulatory, compliance, or operational teams.
- Project management experience.
- Prior experience in regulatory reporting or designing reporting solutions.
- Familiarity with BI tools like Alteryx and/or Tableau.
Posted 1 week ago
0.0 - 4.0 years
0 Lacs
hyderabad, telangana
On-site
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose - the relentless pursuit of a world that works better for people - we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.
We are inviting applications for the role of Management Trainee, Master Data Management.
Responsibilities
- Execute changes to master data (e.g., customer, vendor, materials) as approved by the business.
- Coordinate material and finished-product master data setup, validation, and periodic maintenance.
- Perform system monitoring and user management activities.
- Participate in projects and initiatives across multiple functional areas and regions.
- Work with business units and process specialists to resolve master data issues.
- Ensure data quality, compliance, and consistency of master data across business systems.
- Support the business with required procedures; submit incidents and change requests when needed.
- Assist business process specialists in defining standard operating procedures, process flows, and related documentation.
- Review cleansed records against quality control parameters.
- Classify materials/services per UNSPSC and standard taxonomies (Noun/Modifier).
- Ensure data conforms to all cleansing standards and governance policies.
- Technically evaluate materials and fill in the correct specification values against the attributes.
- Review technical data sheets, drawings, item images, and other technical documents to extract relevant material specifications.
- Resolve duplicates based on matching of duplicate line items.
- Supervise the vendor outreach program, ensuring clear and crisp communication with external and internal parties.
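Resolving duplicate line items, as described in the responsibilities above, is often a fuzzy string-matching pass over normalized descriptions. A hedged sketch using Python's standard-library difflib; the 0.9 threshold and the sample item descriptions are illustrative assumptions, not part of any specific MDM tool:

```python
from difflib import SequenceMatcher

def is_duplicate(a: str, b: str, threshold: float = 0.9) -> bool:
    """Treat two normalized item descriptions as duplicates above a similarity threshold."""
    a, b = a.lower().strip(), b.lower().strip()
    return SequenceMatcher(None, a, b).ratio() >= threshold

items = ["Bolt, Hex, M8 x 40mm, Steel", "bolt, hex, m8 x 40mm, steel ", "Washer, Flat, M8"]
# Compare every pair of items and collect the index pairs flagged as duplicates.
dupes = [(i, j) for i in range(len(items)) for j in range(i + 1, len(items))
         if is_duplicate(items[i], items[j])]
print(dupes)  # → [(0, 1)]
```

Production MDM tools typically combine such similarity scores with exact matches on structured attributes (manufacturer part number, UNSPSC code) before merging records.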
Qualifications we seek in you! Minimum Qualifications: MBA in Supply Chain Management / Operations Management, or M.Tech / B.Tech in Mechanical / Industrial / Electronics Analytically minded and methodical problem solver Good command of database tools like SQL/Oracle and an understanding of database architecture Able to prioritize work efficiently and advise partners on progress in a timely manner Exposure to more than two industries (e.g. Automobile / Electronics) in service/spare-parts supply chain and master data management Preferred Qualifications: Knowledge of SQL, Access, Excel, and Excel macros Knowledge of ERP/MRP/DRP functionality Knowledge of master data management tools Job: Management Trainee Primary Location: India-Hyderabad Schedule: Full-time Education Level: Bachelor's / Graduation / Equivalent Job Posting: Apr 1, 2025, 1:43:55 PM Unposting Date: Ongoing Master Skills List: Operations Job Category: Full Time
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
pune, maharashtra
On-site
Vertiv is a $5.7B global organization with nearly 27,000 employees, specializing in designing, building, and servicing critical infrastructure for data centers, communication networks, and commercial/industrial facilities. We offer power, thermal, and infrastructure management solutions to support the growing mobile and cloud computing markets. As a Data Migration Item MDM, your primary responsibility will be to manage the extract, transform, and load (ETL) processes for item-related data from/to Oracle PD/PIM and Oracle EBS. You will be part of the Data Migration team involved in ETL activities within Oracle Fusion Cloud PD and PIM, updating item attributes and BOM, as well as loading new item, document, attachment, and BOM information. Additionally, you will handle Oracle EBS-related migration of all master and transactional data, ensuring that item attributes and BOM information are kept up to date. Key Responsibilities: - Execute data migration tasks following a defined strategy and utilizing specific tools - Identify and escalate risks and issues promptly for resolution - Ensure data quality throughout the migration process and validate data fitness for purpose - Utilize Fusion Data Migration tools such as FBDI, HDL, ADFDI, and Fusion Web Services - Collaborate with team members to cleanse data effectively - Conduct data migration audit, reconciliation, and exception reporting - Work with subject matter experts and project team to define data migration requirements - Understand data usage implications across multiple functional work streams - Support data integrity and governance initiatives, including source data analysis and mapping - Manage master and transactional data including creation, updates, and deletion Requirements: - Bachelor's Degree in Information Technology, Process Management, or related field - Minimum of 4 years of experience in item/product data migration (ETL) - 2+ years of experience in Oracle Fusion and Oracle EBS data migration roles - Strong
business knowledge and understanding of technology trends - Excellent communication skills, both written and verbal - Ability to work independently, show initiative, and accept new challenges - Sound judgment and decision-making skills based on analysis, experience, and wisdom - Adaptable to changes in priorities and strategic direction - Professionalism in performance and demeanor - Effective teamwork and leadership to achieve goals consistently. Join our team to play a critical role in managing data migration processes and ensuring data integrity across critical systems.
Posted 1 week ago
10.0 - 17.0 years
0 Lacs
hyderabad, telangana
On-site
We have an exciting opportunity for an ETL Data Architect position with an AI-ML driven SaaS Solution Product Company in Hyderabad. As an ETL Data Architect, you will play a crucial role in designing and implementing a robust Data Access Layer that provides consistent data access over the underlying heterogeneous storage layer. You will also be responsible for developing and enforcing data governance policies to ensure data security, quality, and compliance across all systems. In this role, you will lead the architecture and design of data solutions that leverage the latest tech stack and AWS cloud services. Collaboration with product managers, tech leads, and cross-functional teams will be essential to align data strategy with business objectives. Additionally, you will oversee data performance optimization, scalability, and reliability of data systems while guiding and mentoring team members on data architecture, design, and problem-solving. The ideal candidate should have at least 10 years of experience in data-related roles, with a minimum of 5 years in a senior leadership position overseeing data architecture and infrastructure. A deep background in designing and implementing enterprise-level data infrastructure, preferably in a SaaS environment, is required. Extensive knowledge of data architecture principles, data governance frameworks, security protocols, and performance optimization techniques is essential. Hands-on experience with AWS services such as RDS, Redshift, S3, Glue, and DocumentDB, as well as other platforms like MongoDB, Snowflake, etc., is highly desirable. Familiarity with big data technologies (e.g., Hadoop, Spark) and modern data warehousing solutions is a plus. Proficiency in at least one programming language (e.g., Node.js, Java, Golang, Python) is a must. Excellent communication skills are crucial in this role, with the ability to translate complex technical concepts to non-technical stakeholders.
Proven leadership experience, including team management and cross-functional collaboration, is also required. A Bachelor's degree in Computer Science, Information Systems, or a related field is necessary, with a Master's degree preferred. Preferred qualifications include experience with Generative AI and Large Language Models (LLMs) and their applications in data solutions, as well as familiarity with financial back-office operations and the FinTech domain. Stay updated on emerging trends in data technology, particularly in AI/ML applications for finance. Industry: IT Services and IT Consulting
Posted 1 week ago
3.0 - 6.0 years
18 - 30 Lacs
Mumbai
Work from Office
Hello Connections, Greetings from Teamware Solutions!! We are #Hiring for a Top Investment Bank. Position: Data Analyst Location: Mumbai Experience Range: 3 to 6 Years Notice Period: Immediate to 30 Days Must-have skills: data analysis, data catalog, data analytics, and data governance, along with Collibra. What You'll Do - As part of the Data & Analytics Group (DAG) and reporting locally to the Head of the India DAG, the individual is responsible for the following: 1. Review, analyze, and resolve data quality issues across the IM Data Architecture 2. Coordinate with data owners and other teams to identify the root cause of data quality issues and implement solutions. 3. Coordinate the onboarding of data from various internal/external sources into the central repository. 4. Work closely with Data Owners/Owner delegates on data analysis and the development of data quality (DQ) rules. Work with IT on enhancing DQ controls. 5. Perform end-to-end analysis of business processes, data flows, and data usage to improve business productivity through re-engineering and data governance. 6. Manage the change control process and participate in user acceptance testing (UAT) activities. What We're Looking For: 1. Minimum 3-6 years' experience in data analysis, data catalog tools, and Collibra. 2. Experience in data analysis and profiling using SQL is a must 3. Knowledge of coding; Python is a plus 4. Experience working with cataloging tools like Collibra 5. Experience working with BI reporting tools like Tableau or Power BI is preferred. Preferred Qualifications: 1. Bachelor's Degree required; any other relevant academic course a plus. 2. Fluent in English Apply now: francy.s@twsol.com
Posted 1 week ago
9.0 - 14.0 years
35 - 40 Lacs
Chennai
Hybrid
The Infrastructure Data & Analytics team unifies FinOps, Data Science and Business Intelligence to enable Technology cost transparency, infrastructure performance optimization and commercial efficiency for the enterprise through consistent, high-quality data and predictive analytics. This team within Global Infrastructure aims to establish and reinforce a culture of effective metrics, data-driven business processes, architecture simplification, and cost awareness. Metric-driven cost optimization, workload-specific forecasting and robust contract management are among the tools and practices required to drive accountability for delivering business solutions that derive maximum value. The result will provide a solid foundation for decision-making around cost, quality and speed. We are seeking a strong, data-driven Senior Technical Program Manager who knows that delivering on that promise takes foresight, planning and agility. The Sr. Technical Program Manager will be a key member of the team, and will leverage their technical knowledge and project management skills to drive delivery of our data architecture target state implementation, data model migration, and data automation workstreams that underpin our Infrastructure Data Visualization Portal and other capabilities. They will translate business decisions into data analytics and visualization requirements, prioritize the team's sprint backlog, and support engagement with data providers to ensure data is accessed and ingested consistently and correctly. This individual will be responsible for ensuring excellent and timely execution following agile practices and implementing appropriate agile ceremonies to manage risks and dependencies. This individual will require a unique blend of strong data analytics and leadership skills to manage and prioritize the data requirements across our suite of data and analytics tools and dashboards.
They will bring passion for data-driven decisions, user experience, and execution to the role. Key responsibilities include: Steer execution of data architecture and data model migrations to meet the needs of FinOps, Data Science and Business Intelligence teams, as well as other key partners Lead technical program conversations on architectural approach, system design, and data management and compliance Actively manage backlog for data migration, automation, and ingestion workstreams Develop and maintain data source and feature request ticketing process in Jira Partner across ID&A teams to ensure data requirements are met and timeline risks are managed and mitigated Establish appropriate agile processes to track and manage dependencies across disciplines in staying on track to meet short-term and long-term implementation roadmaps Collaborate with product teams to refine, prioritize, and deliver data and feature requirements through technical acumen, customer-first perspective, and enterprise mindset Support development of appropriate reporting processes to measure OKRs and performance metrics for delivery of our data lake architecture Create an environment of continuous improvement by steering and delivering reflective conversation and regular retrospectives, project standups, workshops, communications, and shared processes to ensure transparency of development process and project performance Facilitate stakeholder engagement, decision-making, and building trust across data providers and critical stakeholders Work with IT Asset Management, Enterprise Architecture, and Business & Vendor Management teams to define enterprise-scalable solutions that meet the needs of multiple stakeholders Partner with data engineering teams to develop, test and deliver the defined capabilities and rapidly iterate new solutions Facilitate and prepare content for leadership updates on delivery status and key decisions needed to support project delivery and de-risk implementation
obstacles Partner in PI planning meetings and other Agile ceremonies for the team: pressure testing plans for feasibility and capacity Monitor and ensure compliance with SDLC standards Ensure and instill documentation best practices to ensure designs meet requirements and processes are repeatable Leverage the evolving technical landscape as needed, including AI, Big Data, Machine Learning and other technologies to deliver meaningful business insights Establish ongoing metrics and units of measurement to clearly define success and failure points and to guide feature/capability prioritization based on business priorities Draft impactful and comprehensive communications, presentations, and talking points for key business reviews, executive presentations, and discussions; escalate and facilitate resolution of risks, issues, and changes tied to product development Act as point of contact for internal inquiries and key partnerships across Technology and business teams Minimum Requirements: 8+ years of experience delivering data lake or backend data platform capabilities and features built using modern technology and data architecture techniques Proven track record for managing large, complex features or products with multiple partners Technical understanding of event-driven architectures, API-first design, cloud-native technologies, and front-end integration patterns in order to discuss technical challenges about system design and solutioning Ability to create clarity and execute plans in ambiguity, and to inspire change without direct authority Self-starter who is able to provide thought leadership and prioritization with limited guidance and in a complex environment Experience in data analytics, data architecture, or data visualization Outstanding influential and collaboration skills; ability to drive consensus and tangible outcomes, demonstrated by breaking down silos and fostering cross-team communication Experience facilitating Agile, Scrum or other rapid 
application development teams to deliver technology solutions on time, on budget, and to spec Capable of leading technology and culture change with excellent strategic and technical thought leadership, and strong program management skills High attention to organization and detail in a deadline-driven work environment Proven ability to solve problems and resolve issues with appropriate communications and escalation criteria Outstanding oral and written communication skills with strong personal presence; active listening skills, summarization skills, and lateral thinking to uncover and react to emerging opportunities Deep understanding of the full lifecycle of product development, from concept to delivery, including Test Driven Development (TDD) Understanding of complex software delivery including build, test, deployment, and operations; conversant in AI, Data Science, and Business Intelligence concepts and technology stack Experience working with technology business management, technology infrastructure or enterprise architecture teams a plus Experience with design and coding across one or more platforms and languages a plus Bachelor's degree in computer science, data engineering, data analytics, or other technical discipline, or equivalent work experience preferred
Posted 1 week ago
0.0 - 4.0 years
3 - 6 Lacs
Bengaluru
Work from Office
Who We Are Saveo is a managed marketplace bringing the primary & secondary medicine markets together, offering faster delivery, cheaper procurement & better technology to pharmacies. It runs with the aim that no prescription shall bounce in India. A pharmacy is in itself a living organism with requirements on different fronts: acquisition of new customers, retaining customers, managing inventory, managing suppliers, finances, etc. The Indian pharmaceutical industry is highly fragmented, with 6.5 lakh retailers and 65,000 distributors, unlike the US where just 5 major distributors cover 93% of the market share. We aim to streamline this supply chain by being a single distribution point and empower these 6.5 lakh micro-entrepreneurs with technology and sustainability. What We're Looking For: Bachelor's degree in Data Science, Analytics, Healthcare Management, or a related field, or a fresher from a Tier 1 college (recent passouts) Proficiency in Excel, SQL, and data visualization tools (e.g., Tableau, Power BI) Strong analytical and problem-solving skills Knowledge of healthcare workflows and a basic understanding of medical terminologies Excellent communication and collaboration abilities Responsibilities: Collect, clean, and analyze healthcare operations data (e.g., patient records and workflow metrics) Generate reports and dashboards to track key performance indicators (KPIs) Identify trends and inefficiencies in operational processes Assist in forecasting and planning for resource allocation Ensure compliance with data privacy regulations (e.g., HIPAA) Collaborate with teams to implement data-driven solutions
Posted 1 week ago
3.0 - 6.0 years
1 - 5 Lacs
Bengaluru
Work from Office
Base Location: Bengaluru Minimum Qualification: Graduate Preferred Experience: 3 to 6 years of relevant experience Key Result Areas - Data stewardship: Review sales and inventory data from multiple applications, investigating any errors or exceptions and making corrections as necessary, so that sales and inventory logs are clean and closed on a timely basis and the data can be passed on to other systems such as Merchandising, Inventory, and Financials. Ensure period-end reporting is completed in a timely manner. Identify reasons for overs/shorts and correct them; identify any recurring patterns of errors or overs/shorts and raise concerns to the manager. Troubleshoot & Support: Troubleshoot and resolve functional users' queries. Work closely with users to educate end consumers on proper data handling and processing across various applications. Identify automation opportunities for data governance. Enable and facilitate daily report generation to aid business decision-making; ensure data accuracy. Monitor all reports across all applications and ensure query resolution within the SLA. People Management: Develop strong working relationships with key IT and Business functions. Communicate with other departments/formats efficiently and effectively. Manage and mentor the support team
Posted 1 week ago
3.0 - 12.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Job Title: SSIS Developer Location: Hyderabad, Bangalore, Chennai, Pune Experience: 7-12 Years Key Responsibilities: Design, develop, and deploy robust ETL processes using SSIS. Develop OLAP cubes and tabular models using SSAS for business intelligence reporting. Build and optimize data models to support analytical and reporting needs. Design and implement data warehouse architectures and schemas (Star, Snowflake). Perform data integration, cleansing, and transformation from multiple sources. Ensure data quality and consistency across all platforms. Work closely with business analysts and stakeholders to understand data requirements. Monitor and improve ETL performance and resolve any issues in data pipelines. Document technical processes and data flows. Required Skills & Qualifications: 3+ years of hands-on experience with SSIS and SSAS. Strong understanding of data modeling concepts (conceptual, logical, and physical models). Proven experience in designing and implementing data warehouses. Proficient in T-SQL and SQL Server. Experience with performance tuning of ETL processes. Familiarity with BI tools like Power BI, Tableau (preferred but not mandatory). Strong problem-solving and analytical skills. Excellent communication and teamwork abilities. Preferred Qualifications: Experience with Azure Data Factory, Synapse Analytics, or other cloud-based data services. Knowledge of data governance and data quality best practices. Bachelor's degree in Computer Science, Information Systems, or a related field.
Posted 1 week ago
4.0 - 8.0 years
5 - 9 Lacs
Pune
Work from Office
We are Allvue Systems, the leading provider of software solutions for the Private Capital and Credit markets. Whether a client wants an end-to-end technology suite, or independently focused modules, Allvue helps eliminate the boundaries between systems, information, and people. We're looking for ambitious, smart, and creative individuals to join our team and help our clients achieve their goals. Working at Allvue Systems means working with pioneers in the fintech industry. Our efforts are powered by innovative thinking and a desire to build adaptable financial software solutions that help our clients achieve even more. With our common goals of growth and innovation, whether you're collaborating on a cutting-edge project or connecting over shared interests at an office happy hour, the passion is contagious. We want all of our team members to be open, accessible, curious and always learning. As a team, we take initiative, own outcomes, and have passion for what we do. With these pillars at the center of what we do, we strive for continuous improvement, excellent partnership and exceptional results. Come be a part of the team that's revolutionizing the alternative investment industry. Define your own future with Allvue Systems! Design, implement, and maintain data pipelines that handle both batch and real-time data ingestion. Integrate various data sources (databases, APIs, third-party data) into Snowflake and other data systems. Work closely with data scientists and analysts to ensure data availability, quality, and performance. Troubleshoot and resolve issues related to data pipeline performance, scalability, and integrity. Optimize data processes for speed, scalability, and cost efficiency. Ensure data governance and security best practices are implemented. Possesses 5 to 8 years of total experience, including 4+ years of expertise in data engineering or related roles. Strong experience with Snowflake, Kafka, and Debezium.
Proficiency in SQL, Python, and ETL frameworks. Experience with data warehousing, data modeling, and pipeline optimization. Strong problem-solving skills and attention to detail. Experience in the financial services or fintech industry is highly desirable.
Posted 1 week ago
1.0 - 4.0 years
2 - 6 Lacs
Bengaluru
Work from Office
Job Summary: We are seeking a highly skilled and experienced AWS Engineer to join our technology team in Bangalore. The ideal candidate will have a solid background in AWS services, data engineering, and ML model operationalization. You will be responsible for designing, building, and managing scalable, secure, and cost-optimized cloud-based data pipelines and infrastructure solutions. Roles and Responsibilities: Design, develop, and maintain scalable data pipelines using AWS services such as Glue, Lambda, S3, Redshift, and EMR Collaborate with Data Scientists and ML Engineers to operationalize machine learning models using Amazon SageMaker Implement data transformation and feature engineering workflows Optimize ETL/ELT processes while ensuring data quality, consistency, and governance Manage both structured and unstructured data using Athena, DynamoDB, and RDS Build and manage CI/CD pipelines for data and ML workflows using AWS CodePipeline, CodeBuild, and Step Functions Monitor and troubleshoot AWS data infrastructure for performance, reliability, and cost-efficiency Ensure data security and regulatory compliance across all AWS operations Required Skills & Experience: 4+ years of hands-on experience with AWS Cloud Services Proficiency with core AWS data services: Glue, Lambda, S3, Redshift, EMR Experience operationalizing ML models using SageMaker Strong understanding of ETL/ELT workflows and performance tuning Familiarity with Athena, DynamoDB, and RDS Experience with CI/CD pipelines using CodePipeline, CodeBuild, Step Functions Knowledge of data governance, security, and compliance standards Excellent problem-solving, communication, and team collaboration skills
Posted 1 week ago
3.0 - 5.0 years
14 - 18 Lacs
Bengaluru
Work from Office
Junior Global Process Solution Key User - P2P Location: Bangalore, IN, 562122 Position Type: Professional Job Description The Junior Process & Solution Key User - Vendor Master Data supports process/solution development, improvements and implementation of standard processes/solutions on a local site and organizational unit, including any adaptations or variants. The Process & Solution Key User will support the end users and the full utilization of processes and solutions. Ensure accurate and consistent Vendor Master Data practices, supporting data governance initiatives and data quality. Key responsibilities and competencies: Bring business knowledge and needs/requirements from all users to the Business Process Developer/Solution Leader in process/solution development and improvement activities Develop and maintain Vendor master data management processes and standards, ensuring the accuracy, completeness and consistency of master data across various systems and platforms Conduct data quality assessments and implement corrective actions as needed Analyse business issues and business requests from a process and solution perspective before initiating a formal change request Collect, analyse, propose and help prioritize change requests from the users represented towards the Business Process Developer or the Solution Leader Participate in acceptance tests (process and solution) Approve/Reject user acceptance tests (i.e. new solution releases) Identify root causes to define process and solution improvement areas and propose solutions or escalate Review and accept process/solution development and improvement proposals Be the single point of contact for end users (i.e. how-to questions regarding the process/solution(s) incl.
access requests) Address IT end user questions and act as single point of escalation to the ITS support Accept escalation of process / IT solution maintenance and support issues and development and improvement proposals on behalf of the end users Register a change request (CR) with the IT team Communicate and anchor process/solution improvement proposals Support implementation of standard process(es) and solution(s) Support the definition of process measurement(s) Identify training needs, plan and secure training in cooperation with the Business Process Developer and/or Solution Leader Ensure Internal Control compliance and External Audit requirements Perform process training and give support to end users Perform SAP trainings for end users Represent the users in user groups/reference groups or similar forums Pre-requisites: Minimum 4 years of professional experience gained in the accounting area (Vendor Master Data experience strongly preferred) Possess strong organizational and time management skills Effective communication skills, both written and verbal Should be open to any shifts Must be well organized and a self-starter Must be able to follow standard filing procedures Detail oriented, professional attitude, reliable System knowledge: 1. Various SAP ECC or S/4 2. Microsoft Office proficiency We value your data privacy and therefore do not accept applications via mail. Who we are and what we believe in We are committed to shaping the future landscape of efficient, safe, and sustainable transport solutions. Fulfilling our mission creates countless career opportunities for talents across the group's leading brands and entities. Applying to this job offers you the opportunity to join Volvo Group. Every day, you will be working with some of the sharpest and most creative brains in our field to be able to leave our society in better shape for the next generation. We are passionate about what we do, and we thrive on teamwork.
We are almost 100,000 people united around the world by a culture of care, inclusiveness, and empowerment. Group Finance contributes to realizing the vision of the Volvo Group by developing and providing a wide range of expert services from financial planning to accounting, business controlling, M&As, financial reporting and investor relations. With Volvo Group Finance you will be part of a global and diverse team of highly skilled professionals who work with passion, trust each other and embrace change to stay ahead. We make our customers win. Job Category: Finance Organization: Group Finance Travel Required: No Travel Required Requisition ID: 22874 View All Jobs Do we share the same aspirations? Every day, Volvo Group products and services ensure that people have food on the table, children arrive safely at school and roads and buildings can be constructed. Looking ahead, we are committed to driving the transition to sustainable and safe transport, mobility and infrastructure solutions toward a net-zero society. Joining Volvo Group, you will work with some of the world's most iconic brands and be part of a global and leading industrial company that is harnessing automated driving, electromobility and connectivity. Our people are passionate about what they do, they aim for high performance and thrive on teamwork and learning. Everyday life at Volvo is defined by a climate of support, care and mutual respect. If you aspire to grow and make an impact, join us on our journey to create a better and more resilient society for the coming generations.
Posted 1 week ago
1.0 - 3.0 years
13 - 17 Lacs
Bengaluru
Work from Office
Junior Global Process Solution Key User Location: Bangalore, IN, 562122 Position Type: Professional Job Description The Junior Process & Solution Key User - Vendor Master Data supports process/solution development, improvements and implementation of standard processes/solutions on a local site and organizational unit, including any adaptations or variants. The Process & Solution Key User will support the end users and the full utilization of processes and solutions. Ensure accurate and consistent Vendor Master Data practices, supporting data governance initiatives and data quality. Key responsibilities and competencies: Bring business knowledge and needs/requirements from all users to the Business Process Developer/Solution Leader in process/solution development and improvement activities Develop and maintain Vendor master data management processes and standards, ensuring the accuracy, completeness and consistency of master data across various systems and platforms Conduct data quality assessments and implement corrective actions as needed Analyse business issues and business requests from a process and solution perspective before initiating a formal change request Collect, analyse, propose and help prioritize change requests from the users represented towards the Business Process Developer or the Solution Leader Participate in acceptance tests (process and solution) Approve/Reject user acceptance tests (i.e. new solution releases) Identify root causes to define process and solution improvement areas and propose solutions or escalate Review and accept process/solution development and improvement proposals Be the single point of contact for end users (i.e. how-to questions regarding the process/solution(s) incl.
access requests) Address IT end user questions and act as single point of escalation to the ITS support Accept escalation of process / IT solution maintenance and support issues and development and improvement proposals on behalf of the end users Register a change request (CR) with the IT team Communicate and anchor process/solution improvement proposals Support implementation of standard process(es) and solution(s) Support the definition of process measurement(s) Identify training needs, plan and secure training in cooperation with the Business Process Developer and/or Solution Leader Ensure Internal Control compliance and External Audit requirements Perform process training and give support to end users Perform SAP trainings for end users Represent the users in user groups/reference groups or similar forums Pre-requisites: Minimum 4 years of professional experience gained in the accounting area (Vendor Master Data experience strongly preferred) Possess strong organizational and time management skills Effective communication skills, both written and verbal Should be open to any shifts Must be well organized and a self-starter Must be able to follow standard filing procedures Detail oriented, professional attitude, reliable System knowledge: 1. Various SAP ECC or S/4 2. Microsoft Office proficiency We value your data privacy and therefore do not accept applications via mail. Who we are and what we believe in We are committed to shaping the future landscape of efficient, safe, and sustainable transport solutions. Fulfilling our mission creates countless career opportunities for talents across the group's leading brands and entities. Applying to this job offers you the opportunity to join Volvo Group. Every day, you will be working with some of the sharpest and most creative brains in our field to be able to leave our society in better shape for the next generation. We are passionate about what we do, and we thrive on teamwork.
We are almost 100,000 people united around the world by a culture of care, inclusiveness, and empowerment. Group Finance contributes to realizing the vision of the Volvo Group by developing and providing a wide range of expert services from financial planning to accounting, business controlling, M&As, financial reporting and investor relations. With Volvo Group Finance you will be part of a global and diverse team of highly skilled professionals who work with passion, trust each other and embrace change to stay ahead. We make our customers win. Job Category: Finance Organization: Group Finance Travel Required: No Travel Required Requisition ID: 22942 View All Jobs Do we share the same aspirations? Every day, Volvo Group products and services ensure that people have food on the table, children arrive safely at school and roads and buildings can be constructed. Looking ahead, we are committed to driving the transition to sustainable and safe transport, mobility and infrastructure solutions toward a net-zero society. Joining Volvo Group, you will work with some of the world s most iconic brands and be part of a global and leading industrial company that is harnessing automated driving, electromobility and connectivity. Our people are passionate about what they do, they aim for high performance and thrive on teamwork and learning. Everyday life at Volvo is defined by a climate of support, care and mutual respect. If you aspire to grow and make an impact, join us on our journey to create a better and more resilient society for the coming generations.
Posted 1 week ago
3.0 - 7.0 years
14 - 18 Lacs
Bengaluru
Work from Office
We are looking for an experienced Data Product Owner to lead the development of reusable, scalable data solutions for AI and Generative AI (GenAI) applications. In this role, you will work closely with data engineering, analytics, and AI teams to define, prioritize, and deliver high-impact data products that empower innovation and efficiency across AI initiatives.

Key Responsibilities:
Product Vision and Roadmap: Define and communicate a clear vision for reusable data assets that support AI and GenAI, aligned with business goals and AI strategy.
Stakeholder Engagement: Collaborate with data scientists, engineers, and business leaders to gather requirements, prioritize features, and manage expectations.
Solution Design: Drive the creation of modular, scalable data solutions that enable efficient model training, validation, and deployment across AI and GenAI use cases.
Data Quality and Governance: Ensure data products adhere to data quality, security, and compliance standards, enabling trustworthy and accurate AI outcomes.
Performance Monitoring: Track product performance and usage, using feedback to enhance features and prioritize future iterations.

Experience: 5+ years as a Product Owner/Manager in data or AI, with a focus on building scalable data solutions.
Technical Knowledge: Strong understanding of data infrastructure, ETL/ELT processes, data governance, cloud data platforms, data science use cases, and model lifecycle management.
AI/GenAI Familiarity: Knowledge of AI/GenAI data requirements, workflows, and reusable data design.
Data Quality & Governance: Experience ensuring data compliance, security, and quality standards.
Soft Skills: Excellent communication, prioritization, and stakeholder management skills to align cross-functional teams and drive product vision.

Requirements:
Proven experience as a Product Owner or Product Manager in data or AI.
Strong understanding of data infrastructure, including ETL/ELT, cloud data storage, and model lifecycle management.
Familiarity with AI/GenAI applications and their data needs.
Excellent communication and prioritization skills to manage cross-functional teams.
Posted 1 week ago
1.0 - 4.0 years
2 - 5 Lacs
Gurugram
Work from Office
Location: Bangalore/Hyderabad/Pune
Experience level: 8+ Years

About the Role
We are looking for a technical, hands-on Lead Data Engineer to help drive the modernization of our data transformation workflows. We currently rely on legacy SQL scripts orchestrated via Airflow, and we are transitioning to a modular, scalable, CI/CD-driven DBT-based data platform. The ideal candidate has deep experience with DBT and modern data stack design, and has previously led similar migrations, improving code quality, lineage visibility, performance, and engineering best practices.

Key Responsibilities
Lead the migration of legacy SQL-based ETL logic to DBT-based transformations
Design and implement a scalable, modular DBT architecture (models, macros, packages)
Audit and refactor legacy SQL for clarity, efficiency, and modularity
Improve CI/CD pipelines for DBT: automated testing, deployment, and code quality enforcement
Collaborate with data analysts, platform engineers, and business stakeholders to understand current gaps and define future data pipelines
Own Airflow orchestration redesign where needed (e.g., DBT Cloud/API hooks or airflow-dbt integration)
Define and enforce coding standards, review processes, and documentation practices
Coach junior data engineers on DBT and SQL best practices
Provide lineage and impact analysis improvements using DBT's built-in tools and metadata

Must-Have Qualifications
8+ years of experience in data engineering
Proven success in migrating legacy SQL to DBT, with visible results
Deep understanding of DBT best practices, including model layering, Jinja templating, testing, and packages
Proficiency in SQL performance tuning, modular SQL design, and query optimization
Experience with Airflow (Composer, MWAA), including DAG refactoring and task orchestration
Hands-on experience with modern data stacks (e.g., Snowflake, BigQuery)
Familiarity with data testing and CI/CD for analytics workflows
Strong communication and leadership skills; comfortable working cross-functionally

Nice-to-Have
Experience with DBT Cloud or DBT Core integrations with Airflow
Familiarity with data governance and lineage tools (e.g., dbt docs, Alation)
Exposure to Python (for custom Airflow operators/macros or utilities)
Previous experience mentoring teams through modern data stack transitions
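To illustrate the kind of migration this role describes, here is a minimal sketch (with hypothetical table, model, and column names) of a legacy SQL script rewritten as layered dbt models, where `source()` and `ref()` give dbt the dependency lineage that a hard-coded, Airflow-orchestrated script cannot express:

```sql
-- models/staging/stg_orders.sql (hypothetical staging model)
-- source() replaces a hard-coded raw schema reference
select
    order_id,
    customer_id,
    cast(order_date as date) as order_date,
    amount
from {{ source('raw', 'orders') }}

-- models/marts/fct_daily_revenue.sql (hypothetical mart model)
-- ref() declares the dependency on the staging layer
select
    order_date,
    sum(amount) as daily_revenue
from {{ ref('stg_orders') }}
group by order_date
```

With models structured this way, `dbt run` builds them in dependency order, and generic tests such as `unique` and `not_null` can be declared against `stg_orders` columns in a `schema.yml` file, replacing ad-hoc validation queries.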
Posted 1 week ago