0.0 - 4.0 years
0 Lacs
haryana
On-site
As the Financial Services (FSO) division of Ernst & Young, you will have the unique opportunity to be part of a professional services organization dedicated exclusively to the financial services marketplace. Joining our multi-disciplinary teams from around the world, you will play a crucial role in delivering a global perspective. Aligned with key industry groups such as asset management, banking and capital markets, insurance, and private equity, we offer integrated advisory, assurance, tax, and transaction services. Through diverse experiences, world-class learning opportunities, and individually tailored coaching, you will undergo continuous professional development. Our focus is on developing exceptional leaders who collaborate effectively to fulfill our commitments to all stakeholders, thereby contributing significantly to building a better working world for our people, clients, and communities. Excited to be a part of this journey? This is just the beginning, as the exceptional EY experience will stay with you for a lifetime.

As a future FSO Technology Consultant at EY, you will be part of a team that helps clients navigate complex industry challenges and leverage technology to enhance business operations. Your role will involve addressing business and strategic challenges such as business and solution architecture, digital transformation, project management, and design of digital operating models. Additionally, you will work on technical matters including data science, advanced analytics, IoT, data governance, blockchain, artificial intelligence, and robotic process automation. Joining our team means working on critical projects within the financial services landscape, with opportunities to transition between teams as both you and our dynamic business continue to grow and evolve. Your contributions will be instrumental in propelling EY to new heights.

We are currently seeking individuals for the following positions:
- Cybersecurity
- Digital
- Platform
- Data & Analytics

To qualify for a role in our team, you must have:
- A Bachelor's or Master's Degree in (Business) Engineering, Computer Science, Information Systems Management, Mathematics, (applied) Economics, or a related field, with an interest in cutting-edge technologies.
- Strong analytical skills.
- Knowledge of project management methodologies, including agile, traditional, and hybrid approaches.
- Proficiency in English at an advanced level.
- Experience in team leadership.
- Exceptional oral and written communication abilities.

If you believe you meet the above criteria, we encourage you to apply at your earliest convenience. The exceptional EY experience awaits you, ready for you to shape and build upon.
Posted 1 week ago
13.0 - 17.0 years
0 Lacs
punjab
On-site
The Deputy General Manager - Finance Master Data Lead at Bunge, reporting to the Global EDM Head, plays a crucial role in leading and supporting enterprise master data programs that drive Bunge's strategic initiatives, encompassing Digital programs, Data Integration, and S/4HANA implementation for the Finance data domain, including Cost Center, GL, Profit Center, and Company Code.

In this role, you will be accountable for overseeing the development of business solutions, ensuring data quality, and successfully implementing the designed solutions across all geographic regions and Bunge's businesses. Your responsibilities will include collaborating with multiple stakeholders from Business, IT, and other areas globally to define and achieve mutually agreed outcomes within the master data domains.

As a techno-functional expert in Master Data Management for the Finance data type, you will work closely with:
- Business Subject Matter Experts, to understand detailed requirements and provide effective solutions.
- Business Data Owners, to ensure alignment with domain priorities.
- Business functional area leaders, to ensure scope and deliverables meet business needs.
- IT teams, to gather requirements effectively.
- Technical Solution Architects, to ensure alignment with the solution direction.
- Bunge Business Services leaders, to ensure globally standardized solutions.
- Delivery Partner teams, to ensure high-quality delivery that meets business expectations.

Key functions of this role include spearheading end-to-end business requirements, engaging with stakeholders to gather requirements, defining project scope and deliverables, leading business UAT, managing large, complex global projects, and ensuring successful implementation of master data solutions without disrupting business processes. Additionally, you will build relationships with internal and external service providers, guide project teams, maintain an in-depth understanding of master data processes, lead Continuous Improvement initiatives, and contribute to strategic directions governing master data.

To be successful in this role, you must have a minimum of 13-15 years of professional data management experience, including at least 8-10 years of experience in providing business solutions and working with SAP HANA and MDG/MDM. An educational background in CA, M.Com, ICWA, B.Tech, or MBA Finance is preferred. You should also possess hands-on knowledge of technologies such as SAP MDG, S/4HANA, Data Lake, Data Model, and MDM, as well as expertise in Master Data Management for the Finance data type and experience with Data Dictionaries and Metadata management. Strong leadership, service delivery, and project management skills are essential, along with the ability to work effectively in virtual teams across different cultures and time zones.

Bunge, a world leader in sourcing, processing, and supplying oilseed and grain products, aims to create sustainable products and opportunities for farmers and consumers worldwide. With a global network and a dedicated workforce, Bunge is committed to feeding and fueling a growing world while upholding its values of innovation and sustainability.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
You will be the visionary Group Data Product Manager (GPM) for AI/ML & Metadata Management, responsible for leading the development of advanced AI/ML-powered metadata solutions. Your primary focus will be on establishing a cohesive and intuitive Data Platform tailored to a variety of user roles, including data engineers, producers, and consumers. Your role involves integrating various tools to create a unified platform that will significantly improve data discoverability, governance, and operational efficiency at scale.
Posted 1 week ago
10.0 - 18.0 years
2 - 3 Lacs
Hyderabad
Work from Office
Experience needed: 12-18 years
Type: Full-Time
Mode: WFO
Shift: General Shift IST
Location: Hyderabad
NP: Immediate Joinee - 30 days

Job Summary: We are looking for an experienced and visionary Data Architect - Azure Data & Analytics to lead the design and delivery of scalable, secure, and modern data platform solutions leveraging Microsoft Azure and Microsoft Fabric. This role requires deep technical expertise in the Azure Data & Analytics ecosystem, strong experience in designing cloud-native architectures, and a strategic mindset to modernize enterprise data platforms.

Key Responsibilities:
- Architect and design Modern Data Platform solutions on Microsoft Azure, including ingestion, transformation, storage, and visualization layers.
- Lead implementation and integration of Microsoft Fabric, including OneLake, Direct Lake mode, and Fabric workloads (Data Engineering, Data Factory, Real-Time Analytics, Power BI).
- Define enterprise-level data architecture, including data lakehouse patterns, delta lakes, data marts, and semantic models.
- Collaborate with business stakeholders, data engineers, and BI teams to translate business needs into scalable cloud data solutions.
- Design solutions using Azure-native services such as Azure Data Factory, Azure Synapse Analytics, Azure Data Lake Storage (Gen2), Azure SQL, and Azure Event Hubs.
- Establish best practices for data security, governance, DevOps, CI/CD pipelines, and cost optimization.
- Guide implementation teams on architectural decisions and technical best practices across the data lifecycle.
- Develop reference architectures and reusable frameworks to accelerate data platform implementations.
- Stay updated on Microsoft's data platform roadmap and proactively identify opportunities to enhance data strategy.
- Assist in developing RFPs, architecture assessments, and solution proposals.

Required Skills & Qualifications:
- Proven 12-18 years of experience, including designing and implementing cloud-based modern data platforms on Microsoft Azure.
- Deep knowledge and understanding of Microsoft Fabric architecture, including Data Factory, Data Engineering, Synapse Real-Time Analytics, and Power BI integration.
- Expertise in Azure Data Services: Azure Synapse, Data Factory, Azure SQL, ADLS Gen2, Azure Functions, Azure Purview, Event Hubs, etc.
- Experience with data warehousing, lakehouse architectures, ETL/ELT, and data modeling.
- Experience in data governance, security, role-based access (Microsoft Entra/Azure AD), and compliance frameworks.
- Strong leadership and communication skills to influence both technical and non-technical stakeholders.
- Familiarity with DevOps and infrastructure-as-code (e.g., ARM templates, Bicep, Terraform) is a plus.

Preferred Qualifications:
- Microsoft Certified: Azure Solutions Architect Expert, Azure Data Engineer Associate, or Microsoft Fabric Certification.
- Experience with real-time data streaming, IoT, or machine learning pipelines in Azure.
- Familiarity with multi-cloud data strategies or hybrid deployments is an advantage.
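To make the lakehouse pattern above concrete, here is a minimal, illustrative PySpark sketch of a bronze-to-silver Delta Lake transform. It assumes a Spark runtime with Delta support (such as Databricks or Fabric); the storage paths and column names are hypothetical, not taken from this posting.

```python
# Illustrative only: a minimal bronze-to-silver transform in PySpark.
# Lake paths and columns below are placeholders, not a real environment.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("bronze-to-silver").getOrCreate()

# Ingest raw landing data (bronze layer) from ADLS Gen2.
bronze = spark.read.json("abfss://lake@account.dfs.core.windows.net/bronze/orders/")

# Standardize types, drop duplicates, and reject null keys (silver layer).
silver = (
    bronze
    .withColumn("order_date", F.to_date("order_date"))
    .dropDuplicates(["order_id"])
    .filter(F.col("order_id").isNotNull())
)

silver.write.format("delta").mode("overwrite").save(
    "abfss://lake@account.dfs.core.windows.net/silver/orders/"
)
```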
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
As a candidate for this position, you should have a strong understanding of Service-Oriented Architecture (SOA) and microservices, particularly in the context of a cloud data platform. Your knowledge of full-stack development, encompassing front-end and back-end technologies such as Angular, React, and Node.js, will be crucial for collaborating on data access and visualization layers. In addition, experience with relational databases (e.g., PostgreSQL, MySQL), NoSQL databases, and columnar databases like BigQuery is required for effective database management. You will be responsible for understanding data governance frameworks and implementing Role-Based Access Control (RBAC), encryption, and data masking in cloud environments to ensure data security. Familiarity with CI/CD pipelines, Infrastructure as Code (IaC) tools like Terraform, and automation frameworks will be essential for streamlining processes. Strong analytical skills are necessary for troubleshooting complex data platform and microservices issues. Your qualifications should include a B.Tech or M.Tech degree.

Key job responsibilities:
- Design and build scalable data pipelines and microservices on Google Cloud Platform (GCP) to support real-time and batch processing.
- Design and implement SOA and microservices-based architectures, ensuring modular, flexible, and maintainable data solutions.
- Apply your full-stack expertise to the seamless integration of front-end and back-end components, enabling robust data access and UI-driven data exploration.
- Lead data ingestion and integration efforts from various sources into the data platform, standardizing and optimizing data for analytics.
- Leverage GCP services such as BigQuery, Dataflow, Pub/Sub, and Cloud Functions to build and manage data platforms aligned with business needs.
- Implement and manage data governance, access controls, and security best practices using GCP's native security features.
- Continuously monitor and improve the performance, scalability, and efficiency of data pipelines and storage solutions.
- Collaborate with data architects, software engineers, and cross-functional teams to define best practices, design patterns, and frameworks for cloud data engineering.
- Automate data platform processes to enhance reliability, reduce manual intervention, and improve operational efficiency.
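As an illustration of the GCP services named above, the following sketch materializes a curated BigQuery table and publishes a completion event to Pub/Sub. It is not this team's actual pipeline; the project, dataset, and topic names are placeholders.

```python
# Sketch, not a production pipeline: batch-transform in BigQuery, then
# notify downstream consumers via Pub/Sub. Names are hypothetical.
from google.cloud import bigquery, pubsub_v1

bq = bigquery.Client(project="my-project")

# Run a transformation that materializes a curated table.
job = bq.query(
    """
    CREATE OR REPLACE TABLE analytics.daily_orders AS
    SELECT order_id, customer_id, DATE(created_at) AS order_date
    FROM raw.orders
    WHERE order_id IS NOT NULL
    """
)
job.result()  # Block until the query job finishes.

# Signal that fresh data is available.
publisher = pubsub_v1.PublisherClient()
topic = publisher.topic_path("my-project", "daily-orders-ready")
publisher.publish(topic, data=b"analytics.daily_orders refreshed").result()
```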
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
haryana
On-site
You will play a crucial role in enhancing the Analytics capabilities for our businesses. Your responsibilities will include engaging with key stakeholders to understand Fidelity's sales, marketing, client services, and propositions context. You will collaborate with internal teams, such as the data support and technology teams, to develop new tools, capabilities, and solutions, and work closely with IS Operations to expedite the development and sharing of customized data sets.

Maximizing the adoption of cloud-based data management services will be a significant part of your role. This involves setting up sandbox analytics environments using platforms like Snowflake, AWS, Adobe, and Salesforce. You will also support data visualization and data science applications to enhance business operations.

In terms of stakeholder management, you will work with key stakeholders to understand business problems and translate them into suitable analytics solutions, facilitating smooth execution, delivery, and implementation through effective engagement. Your role will also involve sharing knowledge and best practices with the team, including coaching on deep learning and machine learning methodologies. Taking independent ownership of projects and initiatives within the team is crucial, demonstrating leadership and accountability. Furthermore, you will develop and evaluate tools, methodologies, and infrastructure to address long-term business challenges, which may involve enhancing modelling software, methodologies, data requirements, and optimization environments to elevate the team's capabilities.

To excel in this role, you should have 5 to 8 years of overall experience in Analytics, with at least 4 years of experience in SQL, Python, open-source machine learning libraries, and deep learning. Experience working in an AWS environment, preferably with Snowflake, is preferred. Proficiency in analytics applications such as Python, SAS, and SQL, and in interpreting statistical results, is necessary. Knowledge of Spark, Hadoop, and big data platforms will be advantageous.
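For a flavor of the open-source machine learning tooling this role mentions, here is a minimal scikit-learn sketch. The dataset is synthetic and the model choice is arbitrary; it illustrates the train/evaluate pattern, nothing specific to Fidelity.

```python
# Purely illustrative: train and score a classifier on synthetic data.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1_000, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

model = GradientBoostingClassifier().fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```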
Posted 1 week ago
7.0 - 11.0 years
0 Lacs
hyderabad, telangana
On-site
As a Principal Data Engineer at Skillsoft, you will play a crucial role in advancing our enterprise data infrastructure by designing and implementing the logic and structure for how data is set up, cleansed, and stored for organizational use. You will be responsible for developing a Knowledge Management strategy to support Skillsoft's analytical objectives across various business areas.

Your role will involve building robust systems and reusable code modules to solve problems, working with the latest open-source tools and platforms to build data products, and collaborating with Product Owners and cross-functional teams in an agile environment. Additionally, you will champion the standardization of processes for data elements used in analysis, establish forward-looking data and technology objectives, manage a small team through project deliveries, and design rich data visualizations and interactive tools to communicate complex ideas to stakeholders. You will also evangelize the Enterprise Data Strategy & Execution Team mission, identify opportunities to influence decision-making with supporting data and analysis, and seek additional data resources that align with strategic objectives.

To qualify for this role, you should have a degree in Data Engineering, Information Technology, CIS, CS, or a related field, along with 7+ years of experience in Data Engineering/Data Management. You should have expertise in building cloud data applications, cloud computing, data engineering/analysis programming languages, and SQL Server. Proficiency in data architecture and data modeling, and experience with technology stacks for Metadata Management, Data Governance, and Data Quality, are essential. Experience working cross-functionally across an enterprise organization and in an Agile environment is preferred. Your strong business acumen, analytical skills, technical abilities, and problem-solving skills will be critical in this role. Experience with app and web analytics data, CRM, and ERP systems data is a plus.

Join us at Skillsoft and be part of our mission to democratize learning and help individuals unleash their edge. If this opportunity intrigues you, we encourage you to apply. Thank you for considering this role.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
hyderabad, telangana
On-site
As one of the world's leading asset managers, Invesco is dedicated to helping investors worldwide achieve their financial objectives. By delivering the combined power of our distinctive investment management capabilities, we provide a wide range of investment strategies and vehicles to our clients around the world. If you're looking for challenging work, smart colleagues, and a global employer with a social conscience, come explore your potential at Invesco. Make a difference every day!

---

The Department: Invesco understands that data, and the products created from data, are the lifeblood of our business. The Distribution Data Office (DDO) is a linchpin of an ecosystem of data-first functions that work together seamlessly to enable Invesco to achieve the true value of data and data products. DDO empowers Invesco to leverage data as a strategic asset by making quality, trusted data and content available to the right people, at the right time, in the right format, in the most efficient way possible to enable both large transformations and day-to-day operations.

---

Your Role: The Data Product Owner sits within the Distribution Data Office. They maintain good relationships with internal partners and collaborate with their peers in business units within the Distribution domain. They develop a strong understanding of distribution data, including how it is produced and consumed across domains and how to apply strategy and practices to drive Distribution's data capabilities forward. They have a good understanding of the strategy, capabilities, and stated data operating model and its associated roles and responsibilities, and work with Data Governance to ensure this operating model is established and optimized across all roles related to Distribution data. All work is done in partnership with their counterparts in Investments and Corporate Shared Services to ensure consistency of data services and deliverables, as well as in partnership with their Technology counterparts.

---

The Data Product Owner executes on the defined data strategy and roadmaps to improve the quality, availability, and usage of their data. They are accountable for the development and execution of detailed deliverables for the areas directly related to distribution data and participate in large transformational projects as well as internal initiatives and BAU/continuous enhancement. They have a good understanding of their business constituents' needs and use cases as they relate to distribution data, and they work with other functions to understand how this data ties to other data throughout the enterprise to ultimately create business-facing data products. They translate these use cases and requirements into actionable backlogs of work for themselves and the delivery teams, performing complex hands-on-keys work and collaborating with the broader delivery teams, both business analysts and technical engineering squads.
---

This individual:
- Must be comfortable working in an agile development environment
- Must be able to handle multiple requests from varied stakeholders in a way that maintains clear priority, and to adjust when needed
- Must be an effective translator of business needs into technical requirements
- Must have an inquisitive and innovative mindset with a demonstrated ability to recognize opportunities to create distinctive value
- Must demonstrate an ability to build relationships, collaborate, mentor, motivate, and influence internal and external teams
- Must be able to work independently when needed, take initiative, and complete projects with great attention to detail and on time

---

You will be responsible for:
- Executing and driving distribution data deliverables within the defined overall data strategy and roadmap (functional analyses, requirements gathering and translation to specific data deliverables, ensuring adherence to architectural and technical standards, etc.) as related to their projects
- Partnering with business and technology peers to ensure priorities are agreed and deliverables executed with diligence and in accordance with business needs
- Articulating dependencies across projects requiring distribution data as an input, escalating risks to leadership as required
- Capturing business use cases and requirements related to distribution data, translating them to data requirements, and partnering with technology in functional and technical design discussions to ensure the end data product meets business needs
- Decomposing requirements into executable Epics and high-level User Stories for themselves and any impacted Squads
- Working with leadership to ensure that data-related business requirements align with broader business goals
- Performing business analysis and detailed requirements-capturing activities within projects and delivery squads as required
- Understanding the impacts of necessary technology and architecture priorities, such as the Cloud-first strategy and tech debt remediation, and ensuring they are reflected in solution recommendations
- Providing data-related input to Business Change Management activities as part of their projects to drive business user engagement and adoption and mitigate impact
- Acting as a change agent and driving adoption of data capabilities within their projects and with direct business partners
- Interfacing with and articulating the value of data deliverables to direct business partners and individual users
- Playing a consultative role to identify opportunities for the people, processes, and tools within the delivery team to improve efficiency and effectiveness
- Coaching individual members of the team as needed to optimize the efficiency of the delivery team
- Identifying opportunities for continuous improvement of data management processes to reduce complexity, improve data quality, and increase efficiency throughout the data delivery lifecycle

---

The experience you bring:
- Minimum of 2 years of Product Ownership with asset management data, or related experience
- Knowledge of data capabilities, practices, and frameworks, including concepts related to master data management, data governance, business intelligence, and analytics, and their practical applications to deliver data products
- Knowledge of common data platforms and how data technologies function, with a lens towards practical application to business needs
- History of working on large, complex projects, preferably within an Agile framework
- Intellectual curiosity to gain a deep understanding of commercial business drivers and client needs in the investment management industry, and how efficient use of data and content can facilitate meeting those needs
- Excellent interpersonal skills and a demonstrated ability to work effectively with project colleagues, peers within the Distribution Data Office, and peers across the enterprise
- Exceptional intellectual horsepower and passion for excellence
- Comfortable dealing with ambiguity
- Solid business acumen, including the ability to think strategically, exhibit sound business judgment, and demonstrate a strong drive for results

---

Nice to haves:
- Background in Financial Services or Asset Management is a plus
- Background in Sales and Marketing data is a plus
- Experience working with multiple delivery squads is a plus
- Hands-on data engineering or data delivery experience is a plus

---

What's in it for you? Our people are at the very core of our success, and we strive to provide employees with a competitive total rewards package which includes:
- 401(k) matching
- Flex time off
- Health and wellness benefits
- Work flexibility programs
- Parental leave benefits

---

The above information has been designed to indicate the general nature and level of work performed by employees within this role. It is not designed to contain or be interpreted as a comprehensive inventory of all duties, responsibilities, and qualifications required of employees assigned to this job. The job holder may be required to perform other duties as deemed appropriate by their manager from time to time.

---

Full Time / Part Time: Full time

---

Worker Type: Employee

---

Job Exempt (Yes / No): Yes

---

Workplace Model: At Invesco, our workplace model supports our culture and meets the needs of our clients while providing the flexibility our employees value. As a full-time employee, compliance with the workplace policy means working with your direct manager to create a schedule where you will work in your designated office at least three days a week, with two days working outside an Invesco office.

---

Why Invesco: At Invesco, we act with integrity and do meaningful work to create an impact for our stakeholders. We believe our culture is stronger when we all feel we belong, and we respect each other's identities, lives, health, and well-being. We come together to create better solutions for our clients, our business, and each other by building on different voices and perspectives. We nurture and encourage each other to ensure our meaningful growth, both personally and professionally.

---

We believe in a diverse, inclusive, and supportive workplace where everyone feels equally valued, and this starts at the top with our senior leaders having diversity and inclusion goals. Our global focus on diversity and inclusion has grown exponentially, and we encourage connection and community through our many employee-led Business Resource Groups (BRGs).

---

What's in it for you? As an organization, we support personal needs and diverse backgrounds, and provide internal networks as well as opportunities to get involved in the community and in the world.

---

Our benefit policy includes but is not limited to:
- Competitive Compensation
- Flexible, Hybrid Work
- 30 days Annual Leave + Public Holidays
- Life Insurance
- Retirement Planning
- Group Personal Accident Insurance
- Medical Insurance for Employee and Family
- Annual Health Check-up
- 26 weeks Maternity Leave
- Paternal Leave
- Adoption Leave
- Near-site Childcare Facility
- Employee Assistance Program
- Study Support
- Employee Stock Purchase Plan
- ESG Commitments and Goals
- Business Resource Groups
- Career Development Programs
- Mentoring Programs
- Invesco Cares
- Dress for your Day

---

At Invesco, we offer development opportunities that help you thrive as a lifelong learner in a constantly evolving business environment and ensure your constant growth. Our AI-enabled learning platform delivers curated content based on your role and interests. We ensure our managers and leaders also have many opportunities to advance their skills and competencies, which become pivotal in their continuous pursuit of performance excellence.

---

To know more about us:
About Invesco: https://www.invesco.com/corporate/en/home.html
About our Culture: https://www.invesco.com/corporate/en/about-us/our-culture.html
About our D&I policy: https://www.invesco.com/corporate/en/our-commitments/diversity-and-inclusion.html
About our CR program: https://www.invesco.com/corporate/en/our-commitments/corporate-responsibility.html

---

Apply for the role @ Invesco Careers: https://careers.invesco.com/india/
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
noida, uttar pradesh
On-site
FCM is one of the world's largest travel management companies and a trusted partner for national and multinational organizations. With a 24/7 reach in 97 countries, FCM's flexible technology anticipates and resolves client needs, backed by experts offering in-depth local knowledge and a commitment to duty of care. As part of the ASX-listed Flight Centre Travel Group, FCM delivers the best market-wide rates, unique added-value benefits, and exclusive solutions. A leader in the travel tech space, FCM offers proprietary client solutions and provides specialist services through FCM Consulting and FCM Meetings & Events.

We are seeking a skilled Azure Data Engineer to join our dynamic team. The ideal candidate will have extensive experience in data engineering, working with Azure cloud services, and designing and implementing scalable data solutions. You will play a crucial role in developing, optimizing, and maintaining data pipelines and architectures, ensuring data quality and availability across various platforms.

Key Responsibilities:
- Design, develop, and maintain data pipelines and ETL processes using Azure Data Factory, Azure Databricks, and Azure Synapse Analytics.
- Build and optimize data storage solutions using Azure Data Lake, Azure SQL Database, and Azure Cosmos DB.
- Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver solutions.
- Implement data quality checks, data governance, and security best practices across data platforms.
- Monitor, troubleshoot, and optimize data workflows for performance and scalability.
- Develop and maintain data models, data cataloging, and metadata management.
- Automate data integration and transformation processes using Azure DevOps and CI/CD pipelines.
- Stay up-to-date with emerging Azure technologies and data engineering trends.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 6+ years of experience in data engineering with a focus on Azure cloud services.
- Proficiency in Azure Data Factory, Azure Databricks, Azure Synapse Analytics, and Azure SQL Database.
- Strong experience with SQL, Python, or other scripting languages.
- Familiarity with data modeling, ETL design, and big data tools such as Hadoop or Spark.
- Experience with data warehousing concepts, data lakes, and data pipelines.
- Understanding of data governance, data quality, and security best practices.
- Excellent problem-solving skills and ability to work in a fast-paced, collaborative environment.

Preferred Skills:
- Azure certification (e.g., Microsoft Certified: Azure Data Engineer Associate) is a plus.
- Experience with Azure Logic Apps, Azure Functions, and API Management.
- Knowledge of Power BI, Tableau, or other data visualization tools.
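As a small illustration of the data quality checks mentioned above, here is a hedged PySpark sketch of a validation gate that could run inside a Databricks job. The table and column names are invented for the example.

```python
# Minimal data-quality gate: fail fast on null keys or duplicates.
# Table and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically in Databricks

df = spark.read.table("silver.customers")

total = df.count()
null_keys = df.filter(F.col("customer_id").isNull()).count()
dupes = total - df.dropDuplicates(["customer_id"]).count()

# Stop the pipeline rather than let bad data flow downstream.
if null_keys or dupes:
    raise ValueError(
        f"Quality check failed: {null_keys} null keys, {dupes} duplicates"
    )
```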
Posted 1 week ago
7.0 - 12.0 years
0 Lacs
maharashtra
On-site
As a Lead Data Engineer, you will leverage your 7 to 12+ years of hands-on experience in SQL database design, data architecture, ETL, Data Warehousing, Data Mart, Data Lake, Big Data, Cloud (AWS), and Data Governance domains. Your expertise in a modern programming language such as Scala, Python, or Java, with a preference for Spark/PySpark, will be crucial in this role.

The role requires experience with configuration management and version control tools like Git, along with familiarity working within a CI/CD framework. Experience in building frameworks is a significant advantage. A minimum of 8 years of recent hands-on SQL programming experience in a Big Data environment is necessary, with a preference for experience in Hadoop/Hive. Proficiency in PostgreSQL, RDBMS, NoSQL, and columnar databases will be beneficial.

Your hands-on experience with AWS Cloud data engineering components, including API Gateway, Glue, IoT Core, EKS, ECS, S3, RDS, Redshift, and EMR, will play a vital role in developing and maintaining ETL applications and data pipelines using big data technologies. Experience with Apache Kafka, Spark, and Airflow is a must-have for this position.

If you are excited about this opportunity and possess the required skills and experience, please share your CV with us at omkar@hrworksindia.com. We look forward to potentially welcoming you to our team.

Regards,
Omkar
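Since Airflow is called out as a must-have, here is a hedged sketch of the kind of DAG such pipelines are built from. The DAG id and task bodies are placeholders, not an actual pipeline from this employer.

```python
# Illustrative Airflow 2.x DAG wiring an extract -> transform -> load chain.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull a batch from the source, e.g. a Kafka topic or S3 prefix")

def transform():
    print("run a Spark/PySpark job over the extracted batch")

def load():
    print("write curated output to the warehouse, e.g. Redshift")

with DAG(
    dag_id="example_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3  # run the stages in order
```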
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
maharashtra
On-site
As a Data Engineer at Blis, you will be part of a globally recognized and award-winning team that specializes in big data analytics and advertising. We collaborate with iconic brands like McDonald's, Samsung, and Mercedes-Benz, providing precise audience insights to help them target their ideal customers effectively. Upholding ethical data practices and privacy rights is at the core of our operations, and we are committed to ensuring outstanding performance and reliability in all our systems.

Working at Blis means being part of an international company with a diverse culture, spanning four continents and comprising over 300 team members. Headquartered in the UK, we are financially successful and poised for continued growth, offering you an exciting opportunity to contribute to our journey.

Your primary responsibility as a Data Engineer will be to design and implement high-performance data pipelines on Google Cloud Platform (GCP) that handle massive amounts of data efficiently. With a focus on scalability and automation, you will play a crucial role in building secure pipelines that can process over 350GB of data per hour and respond to 400,000 decision requests each second. Your expertise will be instrumental in driving improvements in data architecture, optimizing resource utilization, and delivering fast, accurate insights to stakeholders.

Collaboration is key at Blis, and you will work closely with product and engineering teams to ensure that our data infrastructure evolves to support new initiatives seamlessly. Additionally, you will mentor and support team members, fostering a collaborative environment that encourages knowledge sharing, innovation, and professional growth.

To excel in this role, you should have at least 5 years of hands-on experience with large-scale data systems, with a strong focus on designing and maintaining efficient data pipelines. Proficiency in Apache Druid and Imply platforms, along with expertise in cloud-based services like GCP, is essential. You should also have a solid understanding of Python for building and optimizing data flows, as well as experience with data governance and quality assurance practices. Furthermore, familiarity with event-driven architectures, tools like Apache Airflow, and distributed processing frameworks such as Spark will be beneficial. Your ability to apply complex algorithms and statistical techniques to large datasets, along with experience working with relational databases and non-interactive reporting solutions, will be a valuable asset in this role.

Joining the Blis team means engaging in high-impact work in a data-intensive environment, collaborating with brilliant engineers, and being part of an innovative culture that prioritizes client obsession and agility. With a global reach and a commitment to diversity and inclusion, Blis offers a dynamic work environment where your contributions can make a tangible difference in the world of advertising technology.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
About the Company: Birlasoft is a powerhouse where domain expertise, enterprise solutions, and digital technologies converge to redefine business processes. The company takes pride in its consultative and design-thinking approach, driving societal progress by enabling customers to run businesses with unmatched efficiency and innovation. As part of the CK Birla Group, a multibillion-dollar enterprise, Birlasoft boasts a 12,500+ professional team committed to upholding the Group's 162-year legacy. The core values of the company prioritize Diversity, Equity, and Inclusion (DEI) initiatives, along with Corporate Sustainable Responsibility (CSR) activities, demonstrating a dedication to building inclusive and sustainable communities. Join Birlasoft in shaping a future where technology seamlessly aligns with purpose.

About the Role: Birlasoft is seeking a seasoned Data Functional Analyst with a strong background in Projects to join the team.

Responsibilities:
- Ensure data quality during migration from Oracle EBS to Oracle Fusion within the Projects domain, maintaining data integrity and accuracy.
- Collaborate with business stakeholders to gather data quality requirements and align them with organizational goals.
- Implement data quality improvement strategies, providing support and guidance to the technical team.
- Monitor quality improvement progress, address issues promptly, and facilitate stakeholder sessions for a smooth transition to Oracle Fusion.

Qualifications: Any Graduation

Required Skills:
- In-depth knowledge of data models, methodologies, and best practices, especially related to Projects domain data such as project definitions, tasks, budgets, resources, time and expenses, billing, project controls, and performance reporting.
- Good understanding of data quality standards and the ability to implement and enforce them.
- Strong interpersonal skills and relevant Oracle certifications in the Projects domain.
- Prior experience in maintaining high-level data quality during a data migration project through data cleansing, standardization, new data construction, validation, and de-duplication.
- Strong understanding of data systems, databases, and data governance.

Preferred Skills:
- Knowledge of industry best practices in data management.
- Experience with data modeling and database design.
- Ability to communicate complex data concepts effectively.

Equal Opportunity Statement: Birlasoft is committed to diversity and inclusivity in all aspects of the company.
Posted 1 week ago
3.0 - 12.0 years
0 Lacs
karnataka
On-site
As a Data Governance Consultant at KPMG in Bangalore, you will play a key role in developing and implementing data governance strategies and frameworks. Your responsibilities will include leading data quality management initiatives, managing metadata, collaborating with stakeholders on change management processes, and providing guidance on best practices to internal teams.

To be successful in this role, you should have a minimum of 3 years of experience in a data governance role, with total experience ranging from 3 to 12 years. Proficiency in data governance concepts, frameworks, and data quality management principles is essential. Experience with metadata management tools and change management processes will be beneficial. Excellent communication and stakeholder management skills, and the ability to work effectively in a cross-functional team environment, are also required.

KPMG offers a competitive salary package, health insurance coverage, opportunities for professional development and growth, and a dynamic and collaborative work environment. If you have a background in data governance practices and tools and are passionate about driving data quality and implementing data governance frameworks, we invite you to join our team in Bangalore.

We are an Equal Opportunity Employer and encourage candidates with a full-time education background in B.E/B.Tech/BCA/MCA/BBA/MBA to apply.
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
karnataka
On-site
As a Product Owner for the GCP Data Migration Project at Clairvoyant, you will play a crucial role in leading the initiative and ensuring successful delivery of data migration solutions on Google Cloud Platform. With your deep understanding of cloud platforms, data migration processes, and Agile methodologies, you will collaborate with cross-functional teams to define the product vision, gather requirements, and prioritize backlogs to align with business objectives and user needs.

Your key responsibilities will include defining and communicating the product vision and strategy, leading requirement-gathering sessions with stakeholders, collaborating with business leaders and technical teams to gather and prioritize requirements, creating user stories and acceptance criteria, participating in sprint planning, establishing key performance indicators, identifying and mitigating risks, and fostering a culture of continuous improvement through feedback collection and iteration on product features and processes.

To be successful in this role, you should have 10-12 years of experience in product management or product ownership, particularly in data migration or cloud projects. You must possess a strong understanding of Google Cloud Platform (GCP) services such as BigQuery, Cloud Storage, and Data Transfer Services, as well as experience with data migration strategies and tools, including ETL processes and data integration methodologies. Proficiency in Agile methodologies, excellent analytical and problem-solving skills, strong communication skills, and a Bachelor's degree in Computer Science, Information Technology, Business, or a related field are essential qualifications.

Additionally, experience with data governance and compliance in cloud environments, familiarity with project management and collaboration tools like JIRA and Confluence, an understanding of data architecture and database management, and Google Cloud certifications such as Professional Cloud Architect and Professional Data Engineer are considered good-to-have qualifications.

At Clairvoyant, we provide opportunities for engineers to develop and grow, work with a team of hardworking and dedicated peers, and offer growth and mentorship opportunities. We value diversity and encourage individuals with varying skills and qualities to apply, as there might be a suitable role for you in the future. Join us in driving innovation and growth in the technology consulting and services industry!
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
pune, maharashtra
On-site
As an innovative, analytical, and growth-minded Lead Product Manager at CDK Global, you will take ownership of the Enterprise Data Warehouse and Governance initiatives. Your main responsibility will be to define and execute the strategy for data platforms, ensuring accuracy, accessibility, and scalability. Collaborating with engineering, business, and analytics teams, you will deliver innovative SaaS-based data solutions for seamless integration, governance, and insights for enterprise clients. Additionally, you will play a crucial role in delivering an OEM Analytics solution that enables better business decisions for OEMs and Dealers based on actionable insights and predictive analytics. Your role will involve owning customer-facing OEM Analytics and identifying market opportunities and customer pain points to grow the business.

Your key responsibilities will include defining product strategy and building roadmaps for enterprise data warehouse and governance platforms, prioritizing requirements, driving execution and delivery, building new features for seamless integrations with the CDK Data Platform, overseeing the product lifecycle, and ensuring compliance with regulatory standards. You will collaborate with various teams at CDK to ensure successful go-to-market plans, conduct customer research, implement operating models, and mitigate risks associated with data governance.

To be successful in this role, you should have a Bachelor's degree in Business, Computer Science, Engineering, or equivalent industry experience, along with 6 to 8+ years of product management experience in Enterprise SaaS and 5+ years of experience with data governance and scaling data platforms. You should possess a strong understanding of data warehousing, ETL processes, API integration, compliance frameworks, and data governance principles. Experience with agile development methodologies, working in a product-led environment, and collaborating with globally distributed teams is essential.

Qualifications also include a proven track record of delivering automated and scalable enterprise data platforms, the ability to influence and drive strategy across multiple stakeholders, critical thinking skills, excellent communication abilities, and a data-driven mindset. Financial acumen, willingness to travel, and technical knowledge of SQL, Python, or cloud platforms are preferred qualifications.

At CDK, we value inclusion and diversity to inspire meaningful connections among our people, customers, and communities. If you are authorized to work in the US and are looking to join a dynamic environment where your skills and expertise can make a real impact, we encourage you to apply.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
The Chief Data & Analytics Office (CDAO) at JPMorgan Chase is responsible for accelerating the firm's data and analytics journey. This includes ensuring the quality, integrity, and security of the company's data, as well as leveraging this data to generate insights and drive decision-making. The CDAO is also responsible for developing and implementing solutions that support the firm's commercial goals by harnessing artificial intelligence and machine learning technologies to develop new products, improve productivity, and enhance risk management effectively and responsibly.

Within CDAO, the Firmwide Chief Data Office (CDO) is responsible for maximizing the value and impact of data globally, in a highly governed way. It consists of several teams focused on accelerating JPMorgan Chase's data, analytics, and AI journey, including data strategy, data impact optimization, privacy, data governance, transformation, and talent.

As a Senior Associate at JPMorgan Chase within the Chief Data & Analytics team, you will be responsible for working with stakeholders to define governance and tooling requirements and building out the BCBS Data Governance framework. In addition, you will be responsible for delivering tasks in detailed project plans for the BCBS deliverables owned by the Firmwide CDO. Lastly, you will play a role in developing and syndicating the content used for the BCBS governance meetings.

**Job Responsibilities:**
- Deliver on the BCBS book of work owned by the Firmwide CDO
- Support the definition, prioritization, and resolution of governance and requirements decisions needed by the BCBS program
- Collect, synthesize, analyze, and present project data and findings
- Conduct analyses to identify issues and formulate recommendations
- Develop regular, compelling communications on project status
- Research data governance requirements and potential solutions
- Collaborate effectively across organizations, functions, and geographies

**Required qualifications, capabilities, and skills:**
- Formal training or certification in Data Governance concepts and 3+ years of applied experience
- Diverse problem-solving experience
- Excellent communication skills (oral and written) and the ability to work effectively in cross-functional teams
- Excellent project management and organizational skills, with the ability to manage multiple deliverables simultaneously
- Strong interpersonal leadership and influencing skills
- Proficiency in MS Excel and PowerPoint

**Preferred qualifications, capabilities, and skills:**
- Familiarity with data management and governance, big data platforms, or data architecture
- BS/BA degree or equivalent experience; Bachelor's degree in Business, Finance, Economics, or another related area
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
maharashtra
On-site
The Data Governance Business Analyst role involves assisting in identifying data quality issues, measurement and reporting, ensuring data policy adoption and compliance, and handling regulatory and audit responses and action tracking. You must be dynamic, flexible, and adaptable to quickly changing needs, able to handle ambiguity and complexity, and capable of managing multiple responsibilities. Effective communication and presentation skills are necessary to guide, influence, and convince others.

You will collaborate with multiple teams to implement data issue resolution solutions and execute controls for the Data Risk and Control framework. Responsibilities also include overseeing data-related issues, conducting root cause analysis workshops, tracking ownership and target dates, and reporting metrics. Additionally, you will support business lines and global functions in requirements gathering, solution roll-out, and building controls around key risk indicators, and manage specific data issue resolution projects or the implementation of new Key Risk Indicators.

The ideal candidate should have at least 10 years of relevant experience in Data Governance, Data Management, Process Engineering, or a related area. Proficiency in handling complexity, ambiguity, and a fast-changing work environment is crucial. Advanced knowledge of project management methodologies and tools is required, along with strong leadership, interpersonal, and relationship-building skills. Consulting/audit experience with Big-4 firms and familiarity with Risk and Finance functions and multiple asset classes are advantageous.

Education: Bachelor's/University degree; Master's degree preferred

Citi is an equal opportunity and affirmative action employer.
Posted 1 week ago
10.0 - 15.0 years
15 - 20 Lacs
Pune
Work from Office
Experience: 10-12 years
Availability: Immediate - 15 days

Role & responsibilities:
- Own and manage Master Data Management (MDM) activities for SAP projects.
- De-duplicate master data.
- Lead data migration and cutovers in SAP S/4HANA projects (Greenfield, Migration, or Rollouts).
- Establish and implement MDM best practices and data management capabilities.
- Define data management principles, policies, and lifecycle strategies.
- Monitor data quality with consistent metrics and reporting.
- Work with MDM stakeholders to drive data governance and compliance.
- Track and manage MDM objects, ensuring timely delivery.
- Conduct training sessions for teams on ECC & S/4HANA MDM.
- Participate in daily stand-ups, issue tracking, and dashboard updates.
- Identify risks and process improvements for MDM.

Required Skills & Qualifications:
- Minimum 10-12 years of experience in SAP MDM.
- Strong knowledge of ECC, SAP S/4HANA, data migration, and rollouts.
- Experience in data governance, lifecycle management, and compliance.
- Familiarity with JIRA KANBAN boards, ticketing tools, and dashboards.
- Strong problem-solving and communication skills.
- Ability to work with the team, especially ABAP, middleware, and functional consultants.
- Knowledge of Excel is a must.
- ABAP knowledge is preferable.
- SAP training or certifications are an asset.
- Team player with strong communication skills and a collaborative spirit.
- Able to coach, support, train, and develop junior consultants.
- Customer-oriented, result-driven, and focused on delivering quality.
Posted 1 week ago
8.0 - 13.0 years
15 - 20 Lacs
Pune
Hybrid
EY is hiring for a leading client for the Data Governance Senior Analyst role, based in Pune.

Role & responsibilities:
- Coordinate with Data Stewards/Data Owners to enable identification of critical data elements for SAP master data (Supplier, Finance, and Bank master data).
- Develop and maintain a business-facing data glossary and data catalog for SAP master data (Supplier, Customer, and Finance: GL, Cost Center, Profit Center, etc.), capturing data definitions, lineage, and usage for the relevant SAP master data.
- Develop and implement data governance policies, standards, and processes to ensure data quality, data management, and compliance for the relevant SAP master data (Finance, Supplier, and Customer master data).
- Develop both end-state and interim-state architecture for master data, ensuring alignment with business requirements and industry best practices.
- Define and implement data models that align with business needs, and gather requirements for master data structures.
- Design scalable and maintainable data models, ensuring data creation through a single source of truth.
- Conduct data quality assessments and implement corrective actions to address data quality issues.
- Collaborate with cross-functional teams to ensure data governance practices are integrated into all relevant SAP business processes.
- Manage data cataloging and lineage to provide visibility into data assets, their origins, and transformations in the SAP environment.
- Facilitate governance forums, data domain councils, and change advisory boards to review data issues, standards, and continuous improvements.
- Collaborate with the Data Governance Manager to advance the data governance agenda.
- Prepare data documentation, including data models, process flows, governance policies, and stewardship responsibilities.
- Collaborate with IT, data management, and business units to implement data governance best practices and migrate from ECC to S/4 MDG.
- Monitor data governance activities, measure progress, and report on key metrics to senior management.
- Conduct training sessions and create awareness programs to promote data governance within the organization.
- Demonstrate deep understanding of SAP (and other ERP systems such as JD Edwards) master data structures such as Vendor, Customer, Cost Center, Profit Center, and GL Accounts.

Summary:
- SAP Master Data (Vendor, Customer, GL, Cost Center, etc.)
- Data Governance Implementation (Transactional & Master Data)
- Data Modeling & Architecture (S/4HANA, ECC)
- Data Cataloging, Lineage, and Quality Assessment
- Governance Forums & Change Advisory Boards
- Experience in S/4HANA Greenfield implementations
- Migration Experience (ECC to S/4 MDG)

Preferred candidate profile:
- 8-14 years in data governance and SAP master data
- Strong understanding of upstream/downstream data impacts
- Expert in data visualization
Posted 1 week ago
4.0 - 7.0 years
15 - 27 Lacs
Gurugram
Remote
Job Title: Data Steward
Location: Remote
Job Type: Full-time
Years of Experience: 6-8 years

About Straive: Straive is a market-leading Content and Data Technology company providing data services, subject matter expertise, and technology solutions across multiple domains. Data Analytics & AI Solutions, Data AI-Powered Operations, and Education & Learning form the core pillars of the company's long-term vision. The company is a specialized solutions provider to business information providers in finance, insurance, legal, real estate, life sciences, and logistics. Straive continues to be the leading content services provider to research and education publishers.

Data Analytics & AI Services: Our Data Solutions business has become critical to our clients' success. We use technology and AI with human experts in the loop to create data assets that our clients use to power their data products and their end customers' workflows. As our clients expect us to become their future-fit Analytics and AI partner, they look to us for help in building enterprise data analytics and AI capabilities. With a client base spanning 30 countries worldwide, Straive's multi-geographical resource pool is strategically located across India, the Philippines, the USA, Nicaragua, Vietnam, and the United Kingdom, with company headquarters in Singapore.

Website: https://www.straive.com/

Job Summary: We are seeking a Lead Data Steward to support enterprise data initiatives by ensuring the accuracy, completeness, and quality of data pipelines across data platforms. The ideal candidate will bring strong expertise in data quality practices and data management principles to help establish trusted data foundations that drive business intelligence, analytics, and operational reporting. This role will be a critical part of our data team, supporting data governance efforts and enhancing our data architecture.

Key Responsibilities:
- Define, document, and maintain clear and consistent business definitions, data standards, and business rules for critical data elements within assigned data domains.
- Ensure accurate and up-to-date business metadata (e.g., data definitions, ownership, lineage, quality rules) is captured and maintained in the enterprise data catalog.
- Design and maintain efficient ETL/ELT pipelines to ingest, transform, and deliver high-quality data across systems.
- Deploy the master data governance framework and use the supporting data management tools.
- Define data quality metrics and monitor data quality performance against established targets.
- Conduct regular data governance reviews and provide recommendations for process improvements.
- Ensure data accuracy, consistency, and integrity; implement and monitor data quality rules, validation checks, and exception handling (a minimal sketch follows this posting).
- Collaborate with teams to align data delivery with business and reporting requirements.
- Document data processes, standards, and lineage to support governance and compliance.
- Transform raw, complex data into actionable insights through effective visualization techniques.

Qualifications:
- 4+ years of experience in Data Governance and Data Governance operations.
- Experience with data management tools (e.g., data catalogs, MDM systems, data quality platforms).
- Experience with data profiling and data quality assessment techniques.
- Solid understanding of data quality principles, data governance concepts, and master data management.
- Proficiency in SQL and scripting for data transformation and troubleshooting.
- Experience with data governance tools such as Collibra.
- Proactive problem-solver with a strong sense of ownership and accountability.
- Strong communication skills.
- Bachelor's degree in Information Systems, Computer Science, or a related technical field.
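The data-quality responsibilities above typically reduce to rule-based validation checks scored against a target. A minimal illustrative sketch in Python/pandas follows; the rules, column names, and the 95% threshold are hypothetical assumptions, not details from the posting:

```python
# Minimal data-quality rule sketch (illustrative only): columns, rules,
# and thresholds below are hypothetical, not from the posting.
import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> dict:
    """Apply simple validation rules and return a score per rule (0..1)."""
    checks = {
        # Completeness: the key identifier must never be null.
        "customer_id_not_null": df["customer_id"].notna().mean(),
        # Uniqueness: no duplicate identifiers.
        "customer_id_unique": 1 - df["customer_id"].duplicated().mean(),
        # Validity: country codes must come from a reference list.
        "country_code_valid": df["country_code"].isin(["IN", "US", "GB"]).mean(),
    }
    return {rule: round(score, 4) for rule, score in checks.items()}

if __name__ == "__main__":
    sample = pd.DataFrame({
        "customer_id": [1, 2, 2, None],
        "country_code": ["IN", "US", "XX", "GB"],
    })
    report = run_quality_checks(sample)
    # Exception handling: flag any rule scoring below a (hypothetical) 95% target.
    failures = {rule: score for rule, score in report.items() if score < 0.95}
    print(report)
    print("Failed rules:", failures)
```

In practice a steward would publish these rule definitions and scores to the data catalog rather than printing them, so lineage and quality metrics stay visible in one place.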
Posted 1 week ago
3.0 - 8.0 years
5 - 9 Lacs
Chennai
Work from Office
As a Data Engineer, you will play a key role in designing, developing, and maintaining our data infrastructure and pipelines. You will collaborate closely with the rest of our Data and Analytics Engineering team and with engineering and operations teams to ensure the smooth flow and availability of high-quality data for analysis and reporting purposes. Your expertise will be essential in optimizing data workflows, ensuring data integrity, and scaling our data infrastructure to support our company's growth. This is an exceptional opportunity for someone who relishes the chance to engage with cutting-edge technology, influence the development of a world-class data ecosystem, and work in a fast-paced environment on a small, high-impact team.

Our core data stack makes heavy use of Snowflake and dbt Core, orchestrated in Prefect and Argo in our broader AWS-based ecosystem. Most of our wide range of data sources are loaded with Fivetran or Segment, but we use custom Python when it's the right tool for the job (a minimal orchestration sketch follows at the end of this posting).

What you'll do
- Design, develop, and maintain scalable and efficient data pipelines in an AWS environment, centered on our Snowflake instance and using Fivetran, Prefect, Argo, and dbt.
- Collaborate with business analysts, analytics engineers, and software engineers to understand data requirements and deliver reliable solutions.
- Design, build, and maintain tooling that enables users and services to interact with our data platform, including CI/CD pipelines for our data lakehouse, unit/integration/validation testing frameworks for our data pipelines, and command-line tools for ad-hoc data evaluation.
- Identify and implement best practices for data ingestion, transformation, and storage to ensure data integrity and accuracy.
- Optimize and tune data pipelines for improved performance, scalability, and reliability.
- Monitor data pipelines and proactively address any issues or bottlenecks to ensure uninterrupted data flow.
- Develop and maintain documentation for data pipelines, ensuring knowledge sharing and smooth onboarding of new team members.
- Implement data governance and security measures to ensure compliance with industry standards and regulations.
- Keep up to date with emerging technologies and trends in data engineering and recommend their adoption as appropriate.

What will help you succeed
Must-haves
- 3+ years as a Data Engineer, data-adjacent Software Engineer, or a did-everything member of a small data team, with a focus on building and maintaining data pipelines.
- Strong Python skills, especially in the context of data orchestration.
- Strong understanding of database management and design, including experience with Snowflake or an equivalent platform.
- Proficiency in SQL.
- Familiarity with data integration patterns, ETL/ELT processes, and data warehousing concepts.
- Experience with Argo, Prefect, Airflow, or similar data orchestration tools.
- Excellent problem-solving and analytical skills with strong attention to detail.
- Ability to bring a customer-oriented and empathetic approach to understanding how data is used to drive the business.
- Strong communication skills.
Nice-to-haves
- Undergraduate and/or graduate degree in math, statistics, engineering, computer science, or a related technical field.
- Experience with our stack: AWS, Snowflake, Fivetran, Argo, Prefect, dbt, and GitHub Actions, along with some ancillary tools.
- Experience with DevOps practices, especially CI/CD.
- Previous experience managing enterprise-level data pipelines and working with large datasets.
- Experience in the energy sector.

Benefits:
- Competitive compensation based on market standards.
- We work on a hybrid model with a remote-first policy.
- Apart from the fixed base salary, candidates are eligible for the following benefits:
- Flexible leave policy.
- Office in the heart of the city in case you need to step in for any purpose.
- Medical insurance (1+5 family members), with comprehensive coverage including an accident policy and life insurance.
- Annual performance cycle.
- Quarterly team engagement activities and rewards & recognitions.
- L&D programs to foster professional growth.
- A supportive engineering culture that values diversity, empathy, teamwork, trust, and efficiency.
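As a concrete illustration of the Prefect-plus-dbt orchestration described above, here is a minimal flow sketch. It is written under assumed names: the extract function, table name, and dbt project directory are hypothetical, and the Snowflake write is a stand-in print rather than a real connector call:

```python
# Minimal Prefect 2.x flow sketch: extract -> load -> dbt transform.
# All names (extract_orders, RAW_ORDERS, the ./analytics dbt project)
# are hypothetical placeholders.
import subprocess
import pandas as pd
from prefect import flow, task

@task(retries=2, retry_delay_seconds=60)
def extract_orders() -> pd.DataFrame:
    # Stand-in for a real source pull (API, S3 file, etc.).
    return pd.DataFrame({"order_id": [1, 2], "amount": [10.0, 20.0]})

@task
def load_to_warehouse(df: pd.DataFrame) -> int:
    # Stand-in for a Snowflake write (e.g., via snowflake-connector-python).
    print(f"Loading {len(df)} rows into RAW_ORDERS")
    return len(df)

@task
def run_dbt_models() -> None:
    # Shell out to dbt Core; assumes a dbt project lives in ./analytics.
    subprocess.run(["dbt", "run", "--project-dir", "analytics"], check=True)

@flow(name="orders-elt")
def orders_elt() -> None:
    df = extract_orders()
    load_to_warehouse(df)
    run_dbt_models()

if __name__ == "__main__":
    orders_elt()
```

The task-level retries illustrate the "proactively address issues" responsibility: transient source failures are retried before the flow is marked failed, and Prefect records each run for monitoring.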
Posted 1 week ago
3.0 - 5.0 years
5 - 7 Lacs
Mumbai
Work from Office
Context
KPMG entities in India are professional services firm(s). These Indian member firms are affiliated with KPMG International Limited. We strive to provide rapid, performance-based, industry-focused and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment. We are creating a strategic solution architecture horizontal team to own, translate and drive this vision into various verticals, business or technology capability block owners, and strategic projects.

Job Description
Role Objective: The Senior ETL Developer will design, develop, and optimize Talend data pipelines, ensuring the seamless integration of data from multiple sources to provide actionable insights for informed decision-making across the organization. The role requires a sound understanding of databases for storing structured and unstructured data with optimized modelling techniques, and good exposure to the data catalog and data quality modules of a leading product (preferably Talend).

Location: Mumbai
Years of Experience: 3-5 years

Roles & Responsibilities:
- Business Understanding: Collaborate with business analysts and stakeholders to understand business needs and translate them into ETL solutions.
- Arch/Design Documentation: Develop comprehensive architecture and design documentation for the data landscape.
- Dev Testing & Solution: Implement and oversee development testing to ensure the reliability and performance of the solution. Provide solutions to identified issues and continuously improve application performance.
- Coding Standards, Compliance & Infosecurity: Adhere to coding standards and ensure compliance with information security protocols and best practices.
- Non-functional Requirements: Address non-functional requirements such as performance, scalability, security, and maintainability in the design and development of the Talend-based ETL solution.

Technical Skills:
- Core tool exposure: Talend Data Integrator, Talend Data Catalog, Talend Data Quality, relational databases (PostgreSQL, SQL Server, etc.)
- Core concepts: ETL, data load strategy, data modelling, data governance and management, query optimization and performance enhancement (an incremental-load sketch follows this posting)
- Cloud exposure: Experience working with one of the cloud service providers (AWS, Azure, GCP, OCI, etc.)
- SQL skills: Extensive knowledge and hands-on experience with SQL, query tuning, optimization, and best practices

Soft Skills:
- Very good communication and presentation skills
- Able to articulate ideas and convince key stakeholders
- Able to guide and upskill team members

Good to Have:
- Programming languages: Knowledge and hands-on experience with languages like Python and R
- Relevant certifications related to the role
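Talend implements load strategies visually, but the underlying pattern is tool-agnostic. As a language-neutral illustration of the "data load strategy" concept above, here is a watermark-based incremental load sketched in Python with SQLAlchemy; the connection strings, tables, and columns are hypothetical, and the upsert assumes a unique key on order_id:

```python
# Minimal watermark-based incremental load sketch (illustrative only).
# Connection strings, table names, and columns are hypothetical.
from sqlalchemy import create_engine, text

source = create_engine("postgresql://user:pass@source-host/sales")
target = create_engine("postgresql://user:pass@target-host/warehouse")

with source.connect() as src, target.connect() as tgt:
    # 1. Read the high-water mark recorded by the previous run.
    watermark = tgt.execute(
        text("SELECT COALESCE(MAX(updated_at), '1970-01-01') FROM stg_orders")
    ).scalar_one()

    # 2. Pull only rows changed since that mark, keeping each load small.
    rows = src.execute(
        text("SELECT order_id, amount, updated_at FROM orders "
             "WHERE updated_at > :wm ORDER BY updated_at"),
        {"wm": watermark},
    ).fetchall()

    # 3. Upsert into staging; the new MAX(updated_at) becomes the next
    #    run's watermark automatically. Assumes order_id is unique.
    for order_id, amount, updated_at in rows:
        tgt.execute(
            text("INSERT INTO stg_orders (order_id, amount, updated_at) "
                 "VALUES (:id, :amt, :ts) "
                 "ON CONFLICT (order_id) DO UPDATE "
                 "SET amount = EXCLUDED.amount, updated_at = EXCLUDED.updated_at"),
            {"id": order_id, "amt": amount, "ts": updated_at},
        )
    tgt.commit()
```

The same pattern, built as a Talend job, would use a context variable for the watermark and a tMap/tDBOutput upsert; either way, incremental loads are what keep query and pipeline performance manageable as volumes grow.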
Posted 1 week ago
6.0 - 8.0 years
8 - 10 Lacs
Bengaluru
Work from Office
KPMG entities in India offer services to national and international clients in India across sectors. We strive to provide rapid, performance-based, industry-focused and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment.

The person will work on a variety of projects in a highly collaborative, fast-paced environment and will be responsible for software development activities of KPMG, India. As part of the development team, he/she will work on the full life cycle of the process, develop code, and perform unit testing. He/she will work closely with the Technical Architect, Business Analyst, user interaction designers, and other software engineers to develop new product offerings and improve existing ones. Additionally, the person will ensure that all development practices comply with KPMG's best-practice policies and procedures. This role requires quick ramp-up on new technologies whenever required.

Education: Bachelor's or master's degree in Computer Science, Information Technology, or a related field.

Role: Power BI Developer
Location: Chennai
Experience: 6 to 8 years

Responsibilities:
- Data Visualization: Design, develop, and maintain interactive data visualizations and reports using Power BI.
- Data Modeling: Create and optimize data models to support business requirements.
- Data Integration: Integrate Power BI reports into other applications for enhanced business capabilities.
- Collaboration: Work with business stakeholders to understand their data visualization and business intelligence needs.
- Performance Optimization: Monitor and optimize the performance of Power BI reports and dashboards.
- Security: Implement row-level security on data and ensure compliance with data governance policies.
- Advanced Calculations: Use DAX (Data Analysis Expressions) to perform advanced calculations on data sets (a pandas analogue of one such calculation follows this posting).
- Documentation: Document processes and methodologies used in developing Power BI solutions.

Requirements:
- Experience: Proven experience in data analysis, data visualization, and business intelligence.
- Technical Skills: Proficiency in Power BI, DAX, and data modeling.
- Analytical Skills: Strong analytical and problem-solving skills.
- Communication: Excellent communication and teamwork skills.
- Certifications: Relevant certifications such as Microsoft Certified: Data Analyst Associate are a plus.
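DAX itself runs inside the Power BI engine, so a faithful snippet would not execute outside it. As a language-neutral illustration of the kind of "advanced calculation" a DAX measure performs (e.g., one built with CALCULATE and SAMEPERIODLASTYEAR), here is a year-over-year change computed in Python/pandas over hypothetical data:

```python
# Year-over-year change: the pandas analogue of a typical DAX time-
# intelligence measure. All data below is hypothetical.
import pandas as pd

sales = pd.DataFrame({
    "year": [2022, 2022, 2023, 2023],
    "region": ["North", "South", "North", "South"],
    "amount": [100.0, 80.0, 130.0, 72.0],
})

# Aggregate to one row per region and year (the grain a measure sees).
yearly = sales.groupby(["region", "year"], as_index=False)["amount"].sum()
# Shift within each region to line up the prior year's total.
yearly["prev_amount"] = yearly.groupby("region")["amount"].shift(1)
yearly["yoy_pct"] = (yearly["amount"] / yearly["prev_amount"] - 1) * 100

print(yearly)
# North: +30.0% in 2023; South: -10.0% in 2023 (first-year rows show NaN).
```

In Power BI the equivalent measure evaluates per filter context rather than per precomputed row, which is what makes DAX proficiency a distinct skill from SQL or pandas.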
Posted 1 week ago
3.0 - 7.0 years
5 - 8 Lacs
Bengaluru
Work from Office
JD for SAP BW on HANA

Key Responsibilities:
- Design and implement data models using SAP BW on HANA / BW/4HANA
- Develop and maintain CompositeProviders, ADSOs, Open ODS Views, and InfoObjects
- Create ETL data flows using SAP BW ETL tools and integrate data from SAP (ECC/S/4HANA) and external systems
- Optimize performance of queries and data models leveraging HANA views and native HANA capabilities (a connectivity sketch follows this posting)
- Work with BEx Queries and integrate with SAP Analytics Cloud (SAC) or other BI tools
- Implement and support real-time data replication using SLT or ODP frameworks
- Support data governance, data quality, and metadata management initiatives
- Participate in end-to-end project lifecycles: requirement gathering, design, development, testing, deployment, and support
- Collaborate with functional teams and business users to translate business needs into technical solutions
- Document technical designs, system configurations, and support procedures
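BW modeling itself happens in SAP tooling, but downstream consumers often read the resulting HANA views programmatically. As a small sketch, here is a query against a calculation view using SAP's hdbcli Python client; the host, credentials, schema, and view name are hypothetical placeholders:

```python
# Minimal sketch: querying a HANA calculation view with SAP's hdbcli
# (DB-API 2.0) client. Host, credentials, and view name are hypothetical.
from hdbcli import dbapi

conn = dbapi.connect(
    address="hana.example.com",  # hypothetical host
    port=30015,
    user="REPORT_USER",
    password="***",
)
try:
    cursor = conn.cursor()
    # Calculation views are conventionally exposed under the _SYS_BIC schema.
    cursor.execute(
        'SELECT "REGION", SUM("AMOUNT") AS "TOTAL" '
        'FROM "_SYS_BIC"."sales.models/CV_SALES" '
        'GROUP BY "REGION"'
    )
    for region, total in cursor.fetchall():
        print(region, total)
finally:
    conn.close()
```

Pushing the aggregation into the view like this, rather than fetching raw rows, is the same "let HANA do the work" principle behind the query-optimization responsibility above.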
Posted 1 week ago
2.0 - 5.0 years
3 - 6 Lacs
Hyderabad
Work from Office
Detailed job description - Skill Set:
- Technically strong and hands-on
- Self-driven
- Good client communication skills
- Able to work independently and a good team player
- Flexible to work PST hours (overlap for some hours)
- Past development experience for a Cisco client is preferred