15.0 - 20.0 years
17 - 22 Lacs
Pune
Work from Office
Project Role: Data Platform Architect
Project Role Description: Architects the data platform blueprint and implements the design, encompassing the relevant data platform components. Collaborates with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Snowflake Data Warehouse, AWS, Python
Good-to-have skills: NA
Minimum experience: 7.5 years
Educational qualification: 15 years of full-time education
Summary: As a Data Platform Architect, you will be responsible for architecting the data platform blueprint and implementing the design, which includes the various components of the data platform. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also addressing any challenges that arise during implementation. You will engage with stakeholders to gather requirements and provide insights that shape the overall architecture of the data platform, ensuring it meets the needs of the organization effectively.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with architectural standards.
Professional & Technical Skills:
- Must-have: Proficiency in Snowflake Data Warehouse (see the sketch after this listing).
- Strong understanding of data modeling and ETL processes.
- Experience with cloud-based data solutions and architectures.
- Familiarity with data governance and compliance standards.
- Ability to design and implement scalable data solutions.
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Snowflake Data Warehouse.
- This position is based in Pune.
- 15 years of full-time education is required.
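Since this posting pairs Snowflake with AWS and Python as must-haves, a small connectivity-and-query script is a fair illustration of the baseline skill set it describes. A minimal sketch using the snowflake-connector-python package; the account, warehouse, database, and table names are placeholders, not details from the posting.

```python
# Minimal Snowflake connectivity check with snowflake-connector-python.
# All connection values and the orders table are placeholder assumptions.
import os
import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],   # e.g. "xy12345.ap-south-1"
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",                  # hypothetical warehouse
    database="SALES_DB",                       # hypothetical database
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    # Row count per region: a simple sanity check on a hypothetical table.
    cur.execute(
        "SELECT region, COUNT(*) AS n FROM orders GROUP BY region ORDER BY n DESC"
    )
    for region, n in cur.fetchall():
        print(f"{region}: {n} rows")
finally:
    conn.close()
```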
Posted 1 day ago
15.0 - 20.0 years
17 - 22 Lacs
Navi Mumbai
Work from Office
Project Role: Data Platform Architect
Project Role Description: Architects the data platform blueprint and implements the design, encompassing the relevant data platform components. Collaborates with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Microsoft Azure Data Services
Good-to-have skills: NA
Minimum experience: 5 years
Educational qualification: 15 years of full-time education
Summary: As a Data Platform Architect, you will be responsible for architecting the data platform blueprint and implementing the design, which includes the various data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure seamless integration between systems and data models, while also addressing any challenges that arise during implementation. You will engage with stakeholders to align the data architecture with business objectives, ensuring that the data platform meets the needs of the organization effectively.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with architectural standards.
Professional & Technical Skills:
- Must-have: Proficiency in Microsoft Azure Data Services.
- Good to have: Experience with data governance frameworks.
- Strong understanding of data modeling techniques.
- Familiarity with cloud-based data storage solutions.
- Experience in implementing data integration strategies.
Additional Information:
- The candidate should have a minimum of 5 years of experience in Microsoft Azure Data Services.
- This position is based at our Mumbai office.
- 15 years of full-time education is required.
Posted 1 day ago
3.0 - 8.0 years
5 - 10 Lacs
Pune
Work from Office
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum experience: 3 years
Educational qualification: 15 years of full-time education
Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the data architecture. You will be responsible for analyzing requirements and translating them into effective data solutions, ensuring that the data platform meets the needs of the various stakeholders. Additionally, you will participate in team meetings to share insights and contribute to the overall strategy of the data platform.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Engage in continuous learning to stay updated with the latest trends and technologies in data platforms.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
Professional & Technical Skills:
- Must-have: Proficiency in the Databricks Unified Data Analytics Platform.
- Strong understanding of data integration techniques and best practices (see the sketch after this listing).
- Experience with data modeling and database design.
- Familiarity with cloud-based data solutions and architectures.
- Knowledge of data governance and data quality principles.
Additional Information:
- The candidate should have a minimum of 3 years of experience in the Databricks Unified Data Analytics Platform.
- This position is based at our Pune office.
- 15 years of full-time education is required.
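On Databricks, the data-integration work this listing names is often expressed as a Delta Lake upsert. A minimal sketch, assuming a Databricks runtime where `spark` is predefined and both tables already exist; the table and column names are illustrative assumptions, not details from the posting.

```python
# Hedged sketch: merge (upsert) staged customer changes into a Delta table,
# a common Databricks data-integration pattern. Names are assumptions.
from delta.tables import DeltaTable

# On Databricks, `spark` (a SparkSession) is provided by the runtime.
updates = spark.table("staging.customer_updates")   # hypothetical staging table

target = DeltaTable.forName(spark, "silver.customers")
(target.alias("t")
    .merge(updates.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedUpdateAll()      # refresh attributes of existing customers
    .whenNotMatchedInsertAll()   # insert genuinely new customers
    .execute())
```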
Posted 1 day ago
15.0 - 20.0 years
17 - 22 Lacs
Mumbai
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: IBM InfoSphere DataStage
Good-to-have skills: Informatica MDM
Minimum experience: 7.5 years
Educational qualification: 15 years of full-time education
Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are optimized for performance and usability. You will also engage in problem-solving activities, providing support and enhancements to existing applications while ensuring that all development aligns with best practices and organizational standards.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Continuously evaluate and improve application performance and user experience.
- Collaborate with key stakeholders to develop and enhance MDM solutions using IBM DataStage, InfoSphere, and equivalent MDM platforms.
- Perform ETL (extract, transform, load) processes, ensuring data accuracy and consistency.
- Design and implement data models, workflows, and integration pipelines.
- Support data governance efforts by adhering to established policies and standards.
- Troubleshoot and resolve technical issues related to MDM systems.
- Contribute to documentation and knowledge sharing within the team.
- Translate pharmaceutical-specific master data requirements (customer, product, brand) into technical specifications.
- Design and oversee the delivery of functional and technical components for MDM solutions tailored to the industry.
- Collaborate with cross-functional teams to ensure alignment with business needs related to customer, product, and brand data.
Professional & Technical Skills:
- Must-have: Proficiency in IBM InfoSphere DataStage.
- Good to have: Experience with Informatica MDM.
- Strong understanding of data integration and ETL processes.
- Experience with database management and SQL.
- Familiarity with data warehousing concepts and practices.
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in IBM InfoSphere DataStage.
- This position is based in Mumbai.
- 15 years of full-time education is required.
Posted 1 day ago
8.0 - 10.0 years
20 - 35 Lacs
Greater Noida
Work from Office
We are seeking a seasoned Informatica CDGC expert to work with the Informatica team and lead the implementation and optimization of Informatica Cloud Data Governance and Catalog solutions. The ideal candidate will establish best practices, drive data governance initiatives, and mentor a team of data professionals to ensure a scalable and efficient governance framework aligned with business objectives.
Roles and Responsibilities:
- Lead the end-to-end implementation of Informatica Cloud Data Governance and Catalog (CDGC) solutions, ensuring timely and high-quality delivery.
- Design, configure, and deploy data governance frameworks using Informatica CDGC, aligned with organizational standards and compliance requirements.
- Develop and implement best practices for metadata management, data lineage, data quality, and stewardship within the Informatica CDGC environment.
- Collaborate with cross-functional teams, including data architects, engineers, analysts, and business stakeholders, to drive data governance adoption.
- Provide expert guidance on data governance policies, workflows, and tool utilization to maximize the value of Informatica CDGC.
- Mentor and coach team members on the technical and governance aspects of Informatica CDGC, fostering skill development and knowledge sharing.
- Troubleshoot and resolve complex technical issues related to Informatica CDGC deployment and integrations.
- Stay current with Informatica CDGC product updates, industry trends, and data governance best practices to continuously enhance governance capabilities.
- Create and maintain documentation, including architecture diagrams, configuration guides, and training materials.
- Support audit and compliance activities related to data governance and metadata management.
Requirements:
- Proven experience working with Informatica Data Governance and Catalog tools, preferably Cloud Data Governance and Catalog (CDGC).
- Strong understanding of data governance concepts, metadata management, data lineage, and data quality principles.
- Hands-on experience implementing and configuring Informatica CDGC solutions in enterprise environments.
- Proficiency with ETL/ELT processes, metadata integration, and data cataloging.
- Solid knowledge of data management frameworks and regulatory compliance (e.g., GDPR, CCPA).
- Excellent problem-solving and analytical skills, with the ability to mentor and lead a team.
- Strong communication skills, with experience working across technical and business stakeholders.
- Ability to create and deliver training sessions, workshops, and detailed technical documentation.
Posted 1 day ago
6.0 - 11.0 years
8 - 12 Lacs
Noida
Work from Office
Data Analyst III
Who We Are
Brightly, a Siemens company, is the global leader in intelligent asset management solutions. Brightly enables organizations to transform the performance of their assets with a sophisticated cloud-based platform that leverages several years of data to deliver predictive insights that help users through the key phases of the entire asset lifecycle. More than 12,000 clients of every size worldwide depend on Brightly's complete suite of intuitive software, including CMMS, EAM, Strategic Asset Management, Sustainability, and Community Engagement. Paired with award-winning training, support, and consulting services, Brightly helps light the way to a bright future with smarter assets and sustainable communities.
About the Job
The Business Intelligence (BI) Analyst and Report Development professional at Brightly is a lead specialist in our Analytics and BI Services team, responsible for building, testing, and maintaining product-embedded reports, charts, and dashboards in Power BI and/or QLIK. This position will also partner with and guide other product report writers and end users in the development of their own reports. By providing best-in-class enterprise reporting, the report writer directly contributes to Brightly's objective to differentiate with data.
What You'll Be Doing
- Address the reporting needs of applications by modernizing and building new embedded reports using Power BI or, in some cases, QLIK Cloud.
- Develop appropriate semantic models and business views, generate calculated fields based on application-specific business logic, and implement row-level security (RLS) in application reports and dashboards.
- Support the end-user community in the use of business intelligence tools and the creation of ad hoc reports.
- Maintain ongoing technical documentation for Brightly BI Services sustainability and scale, including data sources, logic, processes, and limitations.
- Work closely with multiple stakeholders, such as the Product Management, Analytics, Design, and Data Cloud teams.
- Follow and influence reporting and data quality change control processes for proper configuration and application change management that will impact reports.
What You Need
- A Bachelor's degree in Business, Programming, Business Intelligence, Computer Science, or a related field.
- A minimum of 6 years of experience developing reports in Power BI (some may be with similar tools) and familiarity with embedding reports in applications.
- Proficiency in SQL, with experience in querying and joining tabular data structures, database management, and creating new variables required for reports.
- Expertise in building intuitive, interactive dashboards and pixel-perfect reports/Power BI paginated reporting.
- Advanced knowledge of Power BI Desktop reporting, including all sub-components such as Power Query, semantic data modeling, DAX, and visualizations.
- Strong experience and knowledge of Power BI Services (e.g., gateways, B2B applications, workspaces).
- Willingness to learn general international data security issues and follow data governance.
- Ability to communicate and collaborate in a remote team setting, including reading, writing, and speaking English.
- Ability to manage multiple priorities and adjust quickly to changing requirements and priorities.
- Performs other related duties as assigned.
The Brightly Culture
Service. Ingenuity. Integrity. Together. These values are core to who we are and help us make the best decisions, manage change, and provide the foundations for our future. These guiding principles help us innovate, flourish, and make a real impact in the businesses and communities we help to thrive. We are committed to the great experiences that nurture our employees and the people we serve while protecting the environments in which we live. Together we are Brightly.
Posted 1 day ago
8.0 - 13.0 years
16 - 20 Lacs
Noida
Work from Office
Who We Are
Build a brighter future while learning and growing with a Siemens company at the intersection of technology, community, and sustainability. Our global team of innovators is always looking to create meaningful solutions to some of the toughest challenges facing our world. Find out how far your passion can take you.
About the Job
At Brightly, our dedication to innovation drives our product management team to create products that address our customers' evolving needs. As a Senior Product Manager for Data, you will work in a collaborative, energetic, dynamic, and creative environment to drive the product strategy of market-leading data products for our emerging and existing vertical-focused products and services. Reporting to the Director of Product Management, you will play a crucial role in helping shape a forward-thinking Data & AI strategy aligned with market needs.
What You'll Be Doing
Your key responsibility is to develop and execute a comprehensive product strategy aligned with market demands and business goals through:
- Drive monetization: Build new high-value offers on our Snowflake Data Cloud. Build data-as-a-product to drive revenue. Enable the reporting, analytics, and AI roadmap.
- Data-driven decision making: Utilize analytics and data to drive product decisions, measure success, and iterate on features.
- Market analysis: Stay up to date with industry trends, competitor products, and emerging technologies to ensure our data products remain competitive.
- Stakeholder management: Collaborate with stakeholders across the organization to align on product goals, priorities, financials, and timelines. Includes exposure to working with Legal to ensure compliance and data governance, data integrity, and data retention.
- Customer focus: Deep empathy for users and a passion for creating delightful product experiences.
- User research and insights: Conduct user research and gather insights to inform product decisions and ensure the data products meet user needs and expectations.
- Resource management: Identify value-driven opportunities (including establishing TAM, SAM, and SOM), understand and share the financial outlook, and align resources effectively to drive success for your domain.
What You'll Need
- Education: Bachelor's degree plus an advanced degree in a technical field, or an MBA from a top business school (IIM, XLRI, ISB), or equivalent experience.
- Experience: 8+ years overall, with at least 2 years in a business-facing role, preferably in SaaS/PaaS.
- Adaptability: Comfortable navigating and prioritizing in situations of ambiguity, especially in the early stages of discovery and product development. A motivated self-starter with the ability to learn and adapt.
- Communication skills: Strong communication and social skills; ability to work across teams with geographically remote team members, including the ability to frame complex concepts for a non-technical audience.
- Influence: Demonstrated ability to influence without authority and communicate to multi-level audiences, including growing and mentoring more junior product peers.
Who We Are
Brightly, the global leader in intelligent asset management solutions, enables organizations to transform the performance of their assets. Brightly's sophisticated cloud-based platform leverages more than 20 years of data to deliver predictive insights that help users through the key phases of the entire asset lifecycle. More than 12,000 clients of every size worldwide depend on Brightly's complete suite of intuitive software, including CMMS, EAM, Strategic Asset Management, IoT Remote Monitoring, Sustainability, and Community Engagement. Paired with award-winning training, support, and consulting services, Brightly helps light the way to a bright future with smarter assets and sustainable communities.
The Brightly Culture
Service. Ingenuity. Integrity. Together. These values are core to who we are and help us make the best decisions, manage change, and provide the foundations for our future. These guiding principles help us innovate, flourish, and make a real impact in the businesses and communities we help to thrive. We are committed to the great experiences that nurture our employees and the people we serve while protecting the environments in which we live. Together we are Brightly.
Posted 1 day ago
5.0 - 10.0 years
7 - 12 Lacs
Hyderabad
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: BigID Data Intelligence Platform
Good-to-have skills: NA
Minimum experience: 5 years
Educational qualification: 15 years of full-time education
Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements in Hyderabad. You will play a crucial role in the development and implementation of innovative solutions.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead the application development process.
- Conduct code reviews and ensure coding standards are met.
- Implement best practices for application development.
Professional & Technical Skills:
- Must-have: Proficiency in the BigID Data Intelligence Platform.
- Strong understanding of data privacy and compliance regulations.
- Experience in data governance and data protection.
- Knowledge of data classification and discovery techniques.
- Hands-on experience in implementing data security solutions.
Additional Information:
- The candidate should have a minimum of 5 years of experience in the BigID Data Intelligence Platform.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
Posted 1 day ago
15.0 - 20.0 years
17 - 22 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Data Modeling Techniques and Methodologies
Good-to-have skills: NA
Minimum experience: 7.5 years
Educational qualification: 15 years of full-time education
Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, contribute to key decisions, and manage the development process to deliver high-quality applications that enhance operational efficiency and user experience.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.
Professional & Technical Skills:
- Must-have: Proficiency in data modeling techniques and methodologies.
- Strong understanding of database design principles and data architecture.
- Experience with data integration and ETL processes.
- Familiarity with data governance and data quality frameworks.
- Ability to analyze and optimize data models for performance.
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in data modeling techniques and methodologies.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Posted 1 day ago
2.0 - 5.0 years
7 - 11 Lacs
Chennai
Work from Office
Educational qualification: Bachelor of Engineering, BCA, BSc, BTech, MTech, MSc, MCA
Service line: Enterprise Package Application Services
Responsibilities
You will be part of an innovative team that drives our Workato initiatives, diving into business processes to determine root causes, quantify potential, and establish and drive improvement initiatives that make businesses more efficient. You will set up and maintain the data models that form the basis of the analyses and work closely with the business analysts to generate the customized set of analytics that serves as a single source of truth for business performance measurement as well as data-driven decision making. You are responsible for setting the data dictionary and maintaining data governance on the created structure. You identify the best possible strategy for data collection, ensure data quality, and work with the stakeholders responsible for data input to ensure we can correctly measure and track all necessary information. Collaborate with source-system experts to ensure the source systems are set up correctly to gather all relevant information and support the most effective data structures. Create and maintain comprehensive documentation for data models, processes, and systems to facilitate knowledge sharing.
Technical and Professional Requirements:
- You have a proven track record in using Workato and other B2B/EDI tools.
- You are a team player and can communicate data-structure concepts and ideas to both technical and non-technical stakeholders.
- You have strong analytical skills and an affinity for business concepts.
- Workato certification will be an advantage.
- Workato project experience will be a big plus.
Preferred Skills: Technology-BPMI - B2B-Others; Technology-EDI-EDI Tools
Posted 1 day ago
3.0 - 5.0 years
5 - 9 Lacs
Navi Mumbai
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Collibra Data Quality & Observability
Good-to-have skills: Collibra Data Governance
Minimum experience: 7.5 years
Educational qualification: 15 years of full-time education
Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are functioning optimally. You will also engage in problem-solving activities, providing support and enhancements to existing applications while maintaining a focus on quality and efficiency.
Key Responsibilities:
- Configure and implement Collibra Data Quality (CDQ) rules, workflows, dashboards, and data quality scoring metrics.
- Collaborate with data stewards, data owners, and business analysts to define data quality KPIs and thresholds.
- Develop data profiling and rule-based monitoring using CDQ's native rule engine or integrations (e.g., with Informatica, Talend, or BigQuery).
- Build and maintain data quality dashboards and issue management workflows within Collibra.
- Integrate CDQ with Collibra Data Intelligence Cloud for end-to-end governance visibility.
- Drive root cause analysis and remediation plans for data quality issues.
- Support metadata and lineage enrichment to improve data traceability.
- Document standards, rule logic, and DQ policies in the Collibra Catalog.
- Conduct user training and promote data quality best practices across teams.
Required Skills and Experience:
- 3+ years of experience in data quality, metadata management, or data governance.
- Hands-on experience with the Collibra Data Quality & Observability (CDQ) platform.
- Knowledge of Collibra Data Intelligence Cloud, including Catalog, Glossary, and Workflow Designer.
- Proficiency in SQL and understanding of data profiling techniques (see the sketch after this listing).
- Experience integrating CDQ with enterprise data sources (Snowflake, BigQuery, Databricks, etc.).
- Familiarity with data governance frameworks and data quality dimensions (accuracy, completeness, consistency, etc.).
- Excellent analytical, problem-solving, and communication skills.
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Collibra Data Quality & Observability.
- This position is based in Mumbai.
- 15 years of full-time education is required.
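Collibra CDQ rules are configured in the platform itself, but the rule logic this listing describes amounts to SQL predicates scored against thresholds. A minimal sketch of that logic in plain Python, using sqlite3 as a stand-in for any source system; it does not call any Collibra API, and the table, rules, and 90% threshold are illustrative assumptions.

```python
# Illustrative data-quality rule check: completeness and validity scoring.
# This mimics the logic a CDQ rule encodes; it is not a Collibra API call.
import sqlite3  # stand-in for any DB-API source (Snowflake, BigQuery, ...)

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER, email TEXT, country TEXT);
    INSERT INTO customers VALUES
        (1, 'a@x.com', 'IN'), (2, NULL, 'IN'), (3, 'bad-email', NULL);
""")

RULES = {
    # rule name -> SQL predicate counting PASSING rows (hypothetical rules)
    "email_not_null":   "email IS NOT NULL",
    "email_has_at":     "email LIKE '%@%'",
    "country_not_null": "country IS NOT NULL",
}
total = conn.execute("SELECT COUNT(*) FROM customers").fetchone()[0]
for name, predicate in RULES.items():
    passing = conn.execute(
        f"SELECT COUNT(*) FROM customers WHERE {predicate}"
    ).fetchone()[0]
    score = passing / total
    status = "PASS" if score >= 0.9 else "FAIL"  # 90% threshold is assumed
    print(f"{name}: {score:.0%} {status}")
```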
Posted 1 day ago
2.0 - 5.0 years
5 - 8 Lacs
Hyderabad
Work from Office
Educational qualification: Bachelor of Engineering, BCA, BSc, BTech, MTech, MCA, MSc
Service line: Enterprise Package Application Services
Responsibilities
You will be part of an innovative team that drives our Fabric OMS initiatives, diving into business processes to determine root causes, quantify potential, and establish and drive improvement initiatives that make businesses more efficient. You will set up and maintain the data models that form the basis of the analyses and work closely with the business analysts to generate the customized set of analytics that serves as a single source of truth for business performance measurement as well as data-driven decision making. You are responsible for setting the data dictionary and maintaining data governance on the created structure. You identify the best possible strategy for data collection, ensure data quality, and work with the stakeholders responsible for data input to ensure we can correctly measure and track all necessary information. Collaborate with source-system experts to ensure the source systems are set up correctly to gather all relevant information and support the most effective data structures. Create and maintain comprehensive documentation for data models, processes, and systems to facilitate knowledge sharing.
Technical and Professional Requirements:
- You have a proven track record in using Fabric OMS and other B2B/B2C tools.
- Understanding of configuring and customizing Fabric OMS and its suite of applications.
- Strong knowledge of order management systems such as Fabric OMS; e-commerce knowledge is a strong plus.
- Experience in inventory management and order orchestration.
- Experience with Java, SQL, Unix, and microservices.
Preferred Skills: Domain-Digital Commerce-Digital Commerce Platforms-B2C; Domain-Supply Chain Management-Inventory Management; Technology-Supply Chain Management-Supply Chain Management - ALL; Technology-BPMI - B2B-Others; Domain-Automotive-Automotive Sales & After Sales Service-Order Management System
Posted 1 day ago
4.0 - 9.0 years
6 - 10 Lacs
Hyderabad
Work from Office
Responsibilities:
Data Quality Implementation & Monitoring (Acceldata & DemandTools):
- Design, develop, and implement data quality rules and checks using Acceldata to monitor data accuracy, completeness, consistency, and timeliness.
- Configure and utilize Acceldata to profile data, identify anomalies, and establish data quality thresholds.
- Investigate and resolve data quality issues identified by Acceldata, working with the relevant teams on remediation.
- Leverage DemandTools within our Salesforce environment to identify, merge, and prevent duplicate records across Leads, Contacts, and Accounts (see the sketch after this listing).
- Implement data standardization and cleansing processes within Salesforce using DemandTools.
- Develop and maintain data quality dashboards and reports using Acceldata to provide visibility into data health.
Data Onboarding & Integration Quality:
- Collaborate with engineers and platform teams to understand data sources and pipelines built using Fivetran or a comparable ingestion tool.
- Validate data transformations within Fivetran to maintain data integrity and quality.
- Develop and execute test plans and test cases to validate the successful and accurate onboarding of data into our Snowflake environment.
Metadata Management & Data Governance:
- Work with the Atlan platform to understand and contribute to the establishment of a comprehensive data catalog.
- Assist in defining and implementing data governance policies and standards within Atlan.
- Validate the accuracy and completeness of metadata within Atlan to ensure data discoverability and understanding.
- Collaborate on data lineage tracking and impact analysis using Atlan.
Collaboration & Communication:
- Work closely with data engineers, the platform team, data analysts, business stakeholders, and Salesforce administrators.
- Clearly communicate data quality findings, risks, and remediation steps.
- Participate in data governance meetings and contribute to the development of data quality best practices.
- Document data quality rules, processes, and monitoring procedures.
Required Skills & Experience:
- Proven experience (e.g., 3+ years) as a Data Quality Engineer or in a similar role.
- Hands-on experience with Fivetran or a similar data ingestion application, including its data transformation capabilities.
- Familiarity with Atlan or other modern data catalog and metadata management tools.
- Strong practical experience with Acceldata or similar data quality monitoring and observability platforms.
- Familiarity with DemandTools for data quality management within Salesforce.
- Solid understanding of data quality principles, methodologies, and best practices.
- Strong SQL skills for data querying and analysis.
- Experience with data profiling and data analysis techniques.
- Excellent analytical, problem-solving, and troubleshooting skills.
- Strong communication and collaboration skills.
- Ability to work independently and manage tasks effectively in a remote environment.
Preferred Skills & Experience:
- Experience with other data quality tools or frameworks.
- Knowledge of data warehousing concepts and technologies (e.g., Snowflake, BigQuery).
- Experience with scripting languages like Python for data manipulation and automation.
- Familiarity with the Salesforce data model and administration.
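The duplicate-record work flagged above boils down to normalize-then-match logic, whatever tool runs it. A hedged pandas sketch of that idea on Salesforce-style lead records; the sample rows and column names are illustrative assumptions, and this is plain pandas, not the DemandTools API.

```python
# Hedged sketch: rule-based duplicate detection on Salesforce-style leads.
# Sample records and column names are assumptions for illustration only.
import pandas as pd

leads = pd.DataFrame({
    "Id":      ["L-001", "L-002", "L-003"],
    "Email":   ["Ana@Example.com ", "ana@example.com", "raj@corp.in"],
    "Company": ["Acme  Pvt Ltd", "acme pvt ltd", "Corp"],
})

# Normalize the fields that form the match key before comparing.
leads["email_norm"] = leads["Email"].str.strip().str.lower()
leads["company_norm"] = (
    leads["Company"].str.strip().str.lower()
    .str.replace(r"\s+", " ", regex=True)
)

# Flag every record sharing a normalized email + company with another row.
dupes = leads[leads.duplicated(["email_norm", "company_norm"], keep=False)]
print(f"{len(dupes)} of {len(leads)} records look like duplicates")
print(dupes[["Id", "Email", "Company"]])
```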
Posted 1 day ago
12.0 - 20.0 years
15 - 20 Lacs
Bengaluru
Work from Office
Educational qualification: Bachelor of Engineering
Service line: Enterprise Package Application Services
Responsibilities
- Lead multiple master data engagements in the US and EU across industry verticals such as manufacturing, pharma, retail, energy, and utilities.
- Handle diverse teams based onsite and offshore for the delivery of master data projects.
- Participate in proposal conversions and ensure 100% billability in projects.
- Provide thought leadership on master data best practices, business value articulation, tool capabilities and suggestions, and governance principles, concepts, and techniques.
- Mentor Infosys teams and improve consulting and delivery capability.
- Carry out assessments of clients' master data landscape, data maturity, governance and quality, and business pain points; identify capability gaps; conduct design workshops.
- Define master data strategy and roadmap initiatives.
- Design master data solution architecture, internal and external application integration, and security, performance, and scalability strategies to support the various business capabilities.
Additional Responsibilities:
The candidate should have experience engaging with large organizations, providing expert advice, consulting on various master data governance topics, and defining strategy and roadmap. Able to influence decisions, lead multiple engagements, and participate in proposal making and delivery. Must have good client engagement and coordination skills. Possess good knowledge of SAP master data and data quality tools. In-depth knowledge of techno-functional SAP MDG is a must.
Location of posting: Infosys Ltd. is committed to ensuring you have the best experience throughout your journey with us. We currently have open positions in a number of locations across India: Bangalore, Pune, Hyderabad, Chennai, Chandigarh, Trivandrum, Indore, Nagpur, Mangalore, Noida, Bhubaneswar, Coimbatore, Mumbai, Jaipur, Vizag, Kolkata, Mysore, and Hubli. While we work in accordance with business requirements, we shall strive to offer you the location of your choice, where possible.
Technical and Professional Requirements:
- 14+ years of experience leading master data projects, expert data governance consulting, strategy and roadmap definition, architecting MDM solutions, leading and influencing decisions, and implementing MDG solutions.
- Expert knowledge of SAP MDG solution options, including on-premise and cloud.
- Innovative mindset for AI/ML capabilities and use cases in data governance.
- Master data domain knowledge covering Material, Customer, Vendor, Finance, etc.
- MDM integration knowledge with ERP and other internal/external applications.
Preferred Skills: SAP MDG; Technology-SAP Functional-SAP MDG
Posted 1 day ago
6.0 - 11.0 years
4 - 8 Lacs
Hyderabad
Work from Office
P2 C1 STS
Primary skill: Informatica IDQ 9 or higher
Secondary skill: PL/SQL
Job description:
- IDQ/Informatica developer experience working on transformations, mapplets, mappings, and scorecards.
- Informatica Axon tool experience.
- Experience translating business rules into IDQ rules; IDQ application deployment; schedulers.
- Knowledge of IDQ repository objects.
- IDQ performance management and access management.
- Understanding of data governance concepts.
- Experience with IDQ and Axon integration.
- Informatica PowerCenter experience (version 9 or higher).
- Experience with Unix scripting.
- IBM Information Analyzer is good to have.
Posted 1 day ago
9.0 - 14.0 years
2 - 6 Lacs
Hyderabad
Work from Office
- Broad expertise in financial crime monitoring, AML, KYC, sanctions screening, payment screening, fraud, etc.
- Proven risk and regulatory experience in financial services gained through management consulting, banking, or other relevant industry practitioner or regulatory roles.
- End-to-end project implementation and cross-functional stakeholder management experience, including agile project delivery.
- Seasoned business analyst with experience in requirements analysis, requirements management, and documentation; exposure to tools like JIRA and Confluence; hands-on with SQL queries.
- Bachelor's degree from a reputable institute; Master's degree preferably in a quantitative field (business, data science, statistics, computer science, etc.).
- Comfortable with ideation, solution design, and development of thought leadership materials and documents to support practice development efforts.
- Exposure to leading vendor products like Actimize and Fircosoft is a plus.
- Experience in data science, analytics, AI/ML, Gen AI, data management, data architectures, data governance, platforms, and applications is a plus.
- Exposure to consultative sales, business development, pre-sales, RFP and proposal development, and client management is a plus.
Posted 1 day ago
12.0 - 17.0 years
12 - 17 Lacs
Hyderabad
Work from Office
- Design and develop conceptual, logical, and physical data models for enterprise and application-level databases.
- Translate business requirements into well-structured data models that support analytics, reporting, and operational systems.
- Define and maintain data standards, naming conventions, and metadata for consistency across systems.
- Collaborate with data architects, engineers, and analysts to implement models into databases and data warehouses/lakes.
- Analyze existing data systems and provide recommendations for optimization, refactoring, and improvements.
- Create entity-relationship diagrams (ERDs) and data flow diagrams to document data structures and relationships.
- Support data governance initiatives, including data lineage, quality, and cataloging.
- Review and validate data models with business and technical stakeholders.
- Provide guidance on normalization, denormalization, and performance tuning of database designs (see the sketch after this listing).
- Ensure models comply with organizational data policies, security, and regulatory requirements.
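To make the normalization versus denormalization guidance concrete, here is a minimal sketch contrasting a normalized customer/orders pair with a denormalized reporting table, issued through sqlite3 as a stand-in for any target database; all table and column names are illustrative assumptions.

```python
# Hedged sketch: normalized (3NF-style) tables vs. a denormalized report
# table. sqlite3 stands in for any target database; names are assumptions.
import sqlite3

ddl = """
-- Normalized: each fact stored once; the FK enforces the relationship.
CREATE TABLE customer (
    customer_id INTEGER PRIMARY KEY,
    name        TEXT NOT NULL,
    country     TEXT NOT NULL
);
CREATE TABLE orders (
    order_id    INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customer(customer_id),
    order_date  TEXT NOT NULL,
    amount      NUMERIC NOT NULL
);
-- Denormalized: customer attributes repeated per order row, trading
-- storage and update cost for fewer joins in read-heavy reporting.
CREATE TABLE orders_report AS
SELECT o.order_id, o.order_date, o.amount, c.name, c.country
FROM orders o JOIN customer c USING (customer_id);
"""
conn = sqlite3.connect(":memory:")
conn.executescript(ddl)
print([r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table'")])
```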
Posted 1 day ago
9.0 - 14.0 years
5 - 8 Lacs
Bengaluru
Work from Office
Kafka Data Engineer
We need a Data Engineer to build and manage data pipelines that support batch and streaming data solutions. The role requires expertise in creating seamless data flows across platforms such as a Data Lake/Lakehouse on Cloudera, Azure Databricks, and Kafka, for both batch and stream pipelines.
Responsibilities
- Strong experience developing, testing, and maintaining data pipelines (batch and stream) using Cloudera, Spark, Kafka, and Azure services such as ADF, Cosmos DB, Databricks, and NoSQL/Mongo DB (see the sketch after this listing).
- Strong programming skills in Spark with Python or Scala, and SQL.
- Optimize data pipelines to improve speed, performance, and reliability, ensuring that data is available to data consumers as required.
- Create ETL pipelines for downstream consumers by transforming data per business logic.
- Work closely with Data Architects and Data Analysts to align data solutions with business needs and ensure the accuracy and accessibility of data.
- Implement data validation checks and error-handling processes to maintain high data quality and consistency across data pipelines.
- Strong analytical and problem-solving skills, with a focus on optimizing data flows and addressing impacts in the data pipeline.
Qualifications
- 8+ years of IT experience, with at least 5+ years in data engineering and cloud-based data platforms.
- Strong experience with Cloudera (or any data lake), Confluent/Apache Kafka, and Azure Data Services (ADF, Databricks, Cosmos DB).
- Deep knowledge of NoSQL databases (Cosmos DB, MongoDB) and data modeling for performance and scalability.
- Proven expertise in designing and implementing batch and streaming data pipelines using Databricks, Spark, or Kafka.
- Experience in creating scalable, reliable, and high-performance data solutions with robust data governance policies.
- Strong collaboration skills to work with stakeholders, mentor junior data engineers, and translate business needs into actionable solutions.
- Bachelor's or master's degree in computer science, IT, or a related field.
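As a concrete instance of the stream-pipeline work above, here is a minimal PySpark Structured Streaming sketch that reads a Kafka topic and lands it as Parquet. The broker address, topic, schema, and paths are assumptions, not details from the posting; a production Cloudera or Databricks pipeline would add schema-registry integration and the validation checks the listing mentions. Swapping `readStream` for `read` gives the equivalent batch job.

```python
# Hedged sketch: Kafka -> Parquet ingestion with Structured Streaming.
# Broker, topic, schema, and paths below are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

event_schema = StructType([           # hypothetical event payload
    StructField("event_id", StringType()),
    StructField("user_id", StringType()),
    StructField("event_time", TimestampType()),
])

raw = (spark.readStream.format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
       .option("subscribe", "user-events")                # placeholder topic
       .option("startingOffsets", "latest")
       .load())

# Kafka delivers bytes; cast the value and parse it against the schema.
events = (raw.select(from_json(col("value").cast("string"),
                               event_schema).alias("e"))
          .select("e.*"))

query = (events.writeStream.format("parquet")
         .option("path", "/data/lake/user_events")         # placeholder path
         .option("checkpointLocation", "/data/chk/user_events")
         .outputMode("append")
         .start())
query.awaitTermination()
```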
Posted 1 day ago
5.0 - 7.0 years
22 - 25 Lacs
Bengaluru
Work from Office
Role Overview:
We are looking for a skilled Data Visualization Software Developer with 6-8 years of experience in developing interactive dashboards and data-driven solutions using Looker and LookerML. The ideal candidate will have expertise in Google Cloud Platform (GCP) and BigQuery and a strong understanding of data visualization best practices. Experience in the media domain (OTT, DTH, Web) will be a plus.
Key Responsibilities:
- Design, develop, and optimize interactive dashboards using Looker and LookerML.
- Work with BigQuery to create efficient data models and queries for visualization.
- Develop LookML models, explores, and derived tables to support business intelligence needs.
- Optimize dashboard performance by implementing best practices in data aggregation and visualization.
- Collaborate with data engineers, analysts, and business teams to understand requirements and translate them into actionable insights.
- Implement security and governance policies within Looker to ensure data integrity and controlled access.
- Leverage Google Cloud Platform (GCP) services to build scalable and reliable data solutions.
- Maintain documentation and provide training to stakeholders on using Looker dashboards effectively.
- Troubleshoot and resolve issues related to dashboard performance, data accuracy, and visualization constraints.
- Maintain and optimize existing Looker dashboards and reports to ensure continuity and alignment with business KPIs.
- Understand, audit, and enhance existing LookerML models to ensure data integrity and performance.
- Build new dashboards and data visualizations based on business requirements and stakeholder input.
- Collaborate with data engineers to define and validate the data pipelines required for dashboard development, and ensure the timely availability of clean, structured data.
- Document existing and new Looker assets and processes to support knowledge transfer, scalability, and maintenance.
- Support the transition/handover process by acquiring detailed knowledge of legacy implementations and ensuring a smooth takeover.
Required Skills & Experience:
- 6-8 years of experience in data visualization and business intelligence using Looker and LookerML.
- Strong proficiency in writing and optimizing SQL queries, especially for BigQuery (see the sketch after this listing).
- Experience with Google Cloud Platform (GCP), particularly BigQuery and related data services.
- Solid understanding of data modeling, ETL processes, and database structures.
- Familiarity with data governance, security, and access controls in Looker.
- Strong analytical skills and the ability to translate business requirements into technical solutions.
- Excellent communication and collaboration skills.
- Expertise in Looker and LookerML, including Explore creation, Views, and derived tables.
- Strong SQL skills for data exploration, transformation, and validation.
- Experience in BI solution lifecycle management (build, test, deploy, maintain).
- Excellent documentation and stakeholder communication skills for handovers and ongoing alignment.
- Strong data visualization and storytelling abilities, focusing on user-centric design and clarity.
Preferred Qualifications:
- Experience working in the media industry (OTT, DTH, Web) and handling large-scale media datasets.
- Knowledge of other BI tools like Tableau, Power BI, or Data Studio is a plus.
- Experience with Python or scripting languages for automation and data processing.
- Understanding of machine learning or predictive analytics is an advantage.
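Before the SQL behind a LookML derived table is committed, it can be validated straight against BigQuery, in line with the SQL-optimization bullet above. A minimal sketch with the google-cloud-bigquery client; the project, dataset, and table names are assumptions, and the dry run checks syntax and bytes scanned without incurring query cost.

```python
# Hedged sketch: validate a derived-table query against BigQuery before
# wiring it into LookML. Project, dataset, and table names are assumptions.
from google.cloud import bigquery

client = bigquery.Client(project="media-analytics-dev")  # placeholder project

derived_table_sql = """
    SELECT user_id,
           DATE(event_time) AS watch_date,
           SUM(watch_seconds) AS total_watch_seconds
    FROM `media-analytics-dev.ott.playback_events`        -- hypothetical table
    WHERE event_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
    GROUP BY user_id, watch_date
"""

# Dry run: catches SQL errors and reports bytes scanned at no query cost.
job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
dry = client.query(derived_table_sql, job_config=job_config)
print(f"Query would scan {dry.total_bytes_processed / 1e9:.2f} GB")

# Then fetch a small sample to eyeball the shape of the derived table.
for row in client.query(derived_table_sql + " LIMIT 5").result():
    print(dict(row))
```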
Posted 1 day ago
15.0 - 20.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: SAP HANA DB Administration
Good-to-have skills: NA
Minimum experience: 5 years
Educational qualification: 15 years of full-time education
Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions, and ensuring that applications function seamlessly within the existing infrastructure. You will engage in problem-solving activities, contribute to key decisions, and manage the development process to deliver high-quality applications that align with business objectives.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.
Professional & Technical Skills:
- Must-have: Proficiency in SAP HANA DB Administration.
- Strong understanding of database management and optimization techniques.
- Experience with application development frameworks and methodologies.
- Familiarity with performance tuning and troubleshooting in SAP HANA.
- Ability to implement security measures and data governance practices.
Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP HANA DB Administration.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Posted 1 day ago
8.0 - 13.0 years
5 - 9 Lacs
Pune
Work from Office
Responsibilities / Qualifications:
- The candidate must have 5-6 years of IT working experience; at least 3 years of experience in an AWS Cloud environment is preferred.
- Ability to understand the existing system architecture and work toward the target architecture.
- Experience with data profiling activities: discover data quality challenges and document them.
- Experience with the development and implementation of a large-scale data lake and data analytics platform on the AWS Cloud.
- Develop and unit test data pipeline architecture for data ingestion processes using AWS native services (see the sketch after this listing).
- Experience developing on AWS Cloud with AWS data stores such as Redshift, RDS, S3, Glue Data Catalog, Lake Formation, Apache Airflow, and Lambda.
- Experience with the development of a data governance framework, including the management of data, the operating model, and data policies and standards.
- Experience with the orchestration of workflows in an enterprise environment.
- Working experience with Agile methodology.
- Experience working with source code management tools such as AWS CodeCommit or GitHub.
- Experience working with Jenkins or any CI/CD pipeline using AWS services.
- Experience working in an onshore/offshore model and collaborating on deliverables.
- Good communication skills to interact with the onshore team.
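One common shape for the AWS-native ingestion pipelines this listing names is an S3-triggered Lambda that starts a Glue job. A minimal sketch with boto3; the Glue job name and argument keys are assumptions, not details from the posting.

```python
# Hedged sketch: Lambda handler that starts a Glue ingestion job when a new
# object lands in S3. The job name and argument keys are assumptions.
import boto3

glue = boto3.client("glue")

def handler(event, context):
    # S3 put-event records carry the bucket and key of the new object.
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]

    run = glue.start_job_run(
        JobName="ingest-raw-to-lake",          # hypothetical Glue job
        Arguments={
            "--source_path": f"s3://{bucket}/{key}",
            "--target_database": "lake_raw",   # hypothetical catalog database
        },
    )
    return {"JobRunId": run["JobRunId"]}
```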
Posted 1 day ago
9.0 - 14.0 years
12 - 16 Lacs
Pune
Work from Office
Skills required: strong SQL (minimum 6-7 years of experience), data warehousing, ETL.
The Data and Client Platform Tech project provides all data-related services to internal and external clients of the SST business. The Ingestion team is responsible for getting data and ingesting it into the data lake. This is a global team with development teams in Shanghai, Pune, Dublin, and Tampa. The Ingestion team uses big data technologies such as Impala, Hive, Spark, and HDFS, along with cloud technologies such as Snowflake for cloud data storage.
Responsibilities:
- You will gain an understanding of the complex domain model and define the logical and physical data model for the Securities Services business.
- You will constantly improve the ingestion, storage, and performance processes by analyzing them and automating them wherever possible.
- You will be responsible for defining standards and best practices for the team in the areas of code standards, unit testing, continuous integration, and release management.
- You will be responsible for improving the performance of queries over lake tables and views (see the sketch after this listing).
- You will work with a wide variety of stakeholders (source systems, business sponsors, product owners, scrum masters, enterprise architects) and must possess excellent communication skills to articulate challenging technical details to various classes of people.
- You will work in Agile Scrum and complete all assigned tasks and JIRAs per sprint timelines and standards.
Qualifications:
- 5-8 years of relevant experience in data development, ETL, data ingestion, and performance optimization.
- Strong SQL skills are essential; experience writing complex queries spanning multiple tables is required.
- Knowledge of big data technologies (Impala, Hive, Spark) is nice to have.
- Working knowledge of performance tuning of database queries: understanding the inner workings of the query optimizer, query plans, indexes, partitions, etc.
- Experience in systems analysis and programming of software applications in SQL and other big data query languages.
- Working knowledge of data modeling and dimensional modeling tools and techniques.
- Knowledge of working with high-volume data ingestion and high-volume historic data processing is required.
- Exposure to a scripting language such as shell scripting or Python is required.
- Working knowledge of consulting and project management techniques and methods.
- Knowledge of working in Agile Scrum teams and processes.
- Experience in data quality, data governance, DataOps, and the latest data management techniques is a plus.
Education: Bachelor's or University degree, or equivalent experience.
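Two of the performance habits this listing asks about, partition pruning and window-based deduplication, are easy to show in Spark SQL. A minimal sketch; the `lake.trades` table, its partition column, and the date literal are assumptions for illustration.

```python
# Hedged sketch: partition pruning and window-based dedup in Spark SQL.
# The lake.trades table and its columns are illustrative assumptions.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("tuning-demo").getOrCreate()

# Partition pruning: filtering on the partition column (business_date here)
# lets the engine skip whole partitions instead of scanning the full table.
pruned = spark.sql("""
    SELECT trade_id, account_id, amount
    FROM lake.trades                        -- hypothetical partitioned table
    WHERE business_date = DATE'2024-06-28'  -- partition-column filter
""")

# Keep only the latest record per trade_id with ROW_NUMBER, a pattern that
# usually beats a MAX() self-join on high-volume historic data.
latest = spark.sql("""
    SELECT * FROM (
        SELECT t.*,
               ROW_NUMBER() OVER (PARTITION BY trade_id
                                  ORDER BY load_ts DESC) AS rn
        FROM lake.trades t
    ) WHERE rn = 1
""")
latest.explain()  # inspect the physical plan before shipping the query
```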
Posted 1 day ago
3.0 - 5.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Educational qualification: Bachelor of Engineering, Bachelor of Technology (Integrated), BCA, BSc, MTech, MSc, Master of Business Management
Service line: Enterprise Package Application Services
Responsibilities
You will be part of an innovative team that drives our Celonis initiatives, diving into business processes to determine root causes, quantify potential, and establish and drive improvement initiatives that make businesses more efficient. You will set up and maintain the data models that form the basis of the analyses and work closely with the business analysts to generate the customized set of analytics that serves as a single source of truth for business performance measurement as well as data-driven decision making. You are responsible for setting the data dictionary and maintaining data governance on the created structure (see the sketch after this listing). You identify the best possible strategy for data collection, ensure data quality, and work with the stakeholders responsible for data input to ensure we can correctly measure and track all necessary information. Collaborate with source-system experts to ensure the source systems are set up correctly to gather all relevant information and support the most effective data structures. Create and maintain comprehensive documentation for data models, processes, and systems to facilitate knowledge sharing.
Technical and Professional Requirements:
- You have 2+ years of relevant work experience in process and data modeling.
- You have worked with data from ERP systems like SAP.
- You have a proven track record in using SQL and Python.
- You are a team player and can communicate data-structure concepts and ideas to both technical and non-technical stakeholders.
- You have strong analytical skills and an affinity for business concepts.
- Celonis Data Engineer/Implementation Professional certification will be an advantage.
- Celonis project experience will be a big plus.
Preferred Skills: Foundational-Business Process Management-Business Process Model and Notation (BPMN) ver 2.0-Celonis
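The data dictionary duty flagged above can be automated as a small conformance check. A hedged pandas sketch; the dictionary entries and the sample event frame are assumptions, and a Celonis deployment would run the equivalent inside its data pipeline rather than in a standalone script.

```python
# Hedged sketch: enforce a simple data dictionary on an extracted table.
# Dictionary entries and the sample frame are illustrative assumptions.
import pandas as pd

DATA_DICTIONARY = {
    # column -> (expected dtype, nullable)
    "case_id":   ("int64", False),
    "activity":  ("object", False),
    "timestamp": ("datetime64[ns]", False),
    "user_type": ("object", True),
}

events = pd.DataFrame({
    "case_id": [1001, 1002],
    "activity": ["Create Order", "Approve Order"],
    "timestamp": pd.to_datetime(["2024-06-01", "2024-06-02"]),
    "user_type": ["human", None],
})

violations = []
for col, (dtype, nullable) in DATA_DICTIONARY.items():
    if col not in events.columns:
        violations.append(f"missing column: {col}")
        continue
    if str(events[col].dtype) != dtype:
        violations.append(f"{col}: expected {dtype}, got {events[col].dtype}")
    if not nullable and events[col].isna().any():
        violations.append(f"{col}: nulls present but column is NOT NULL")

print("OK" if not violations else "\n".join(violations))
```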
Posted 1 day ago
2.0 - 7.0 years
13 - 18 Lacs
Bengaluru
Work from Office
Educational qualification: Bachelor of Engineering, Bachelor of Technology (Integrated), Bachelor of Technology, Bachelor of Business Administration, Master of Business Administration, Master of Science (Technology), Master of Technology, Master of Technology (Integrated)
Service line: Enterprise Package Application Services
Responsibilities
A day in the life of an Infoscion: As part of the Infosys consulting team, your primary role would be to actively aid the consulting team in different phases of the project, including problem definition, effort estimation, diagnosis, solution generation, design, and deployment. You will explore alternatives to the recommended solutions based on research that includes literature surveys, information available in public domains, vendor evaluation information, etc., and build POCs. You will create requirement specifications from the business needs, define the to-be processes, and produce detailed functional designs based on requirements. You will support configuring solution requirements on the products; understand any issues, diagnose their root cause, seek clarifications, and then identify and shortlist solution alternatives. You will also contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Additional Responsibilities:
- Ability to work with clients to identify business challenges and contribute to client deliverables by refining, analyzing, and structuring relevant data.
- Awareness of the latest technologies and trends.
- Logical thinking and problem-solving skills, along with an ability to collaborate.
- Ability to assess current processes, identify improvement areas, and suggest technology solutions.
- Knowledge of one or two industry domains.
Technical and Professional Requirements:
- At least 2 years of configuration and development experience in the implementation of OFSAA solutions (such as ERM, EPM, etc.).
- Expertise in implementing OFSAA technical areas covering OFSAAI and its frameworks: Data Integrator, Metadata Management, and Data Modelling.
- Perform and understand data mapping from source systems to OFSAA staging; execute OFSAA batches and analyze result area tables and derived entities.
- Perform data analysis using OFSAA metadata (i.e., technical metadata, rule metadata, and business metadata), identify any data mapping gaps, and report them to stakeholders.
- Participate in requirements workshops, help in the implementation of the designed solution, perform testing (UT, SIT), and coordinate user acceptance testing.
- Knowledge of and experience with the full SDLC lifecycle.
- Experience with Lean/Agile development methodologies.
Preferred Skills: Technology-Oracle Industry Solutions-Oracle Financial Services Analytical Applications (OFSAA)
Posted 1 day ago
10.0 - 15.0 years
12 - 16 Lacs
Pune
Work from Office
To be successful in this role, you should meet the following requirements (must have):
- Payments and banking experience is a must.
- Experience in implementing and monitoring data governance using standard methodology throughout the data life cycle, within a large organisation.
- Up-to-date knowledge of data governance theory, standard methodology, and the practical considerations.
- Knowledge of data governance industry standards and tools.
- 10+ years of overall experience in data governance, encompassing data quality management, master data management, data privacy and compliance, data cataloguing and metadata management, data security, maturity, and lineage.
- Prior experience in implementing an end-to-end data governance framework.
- Experience in automating data cataloguing, ensuring accurate, consistent metadata and making data easily discoverable and usable (see the sketch after this listing).
- Domain experience across the payments and banking lifecycle.
- An analytical mind and an inclination for problem-solving, with attention to detail.
- Ability to effectively navigate and deliver transformation programmes in large global financial organisations, amidst the challenges posed by bureaucracy, globally distributed teams, and local data regulations.
- Strong communication skills, coupled with the ability to present complex information and data.
- A first-class degree in Engineering or a relevant field, with two or more of the following subjects as a major: Mathematics, Computer Science, Statistics, Economics.
The successful candidate will also meet the following requirements (good to have):
- Database types: relational, NoSQL, document DB. Databases: Oracle, PostgreSQL, BigQuery, Bigtable, MongoDB, Neo4j.
- Experience in conceptual, logical, and physical data modeling.
- Experience in Agile methodology and leading agile delivery, aligned with organisational needs.
- An effective leader as well as a team player, with a strong commitment to quality and efficiency.
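The automated-cataloguing requirement flagged above usually means harvesting column metadata from the warehouse's system views into catalog entries. A minimal sketch using sqlite3, with PRAGMA table_info standing in for information_schema.columns; the payments table and all names are illustrative assumptions.

```python
# Hedged sketch: harvest column metadata into catalog entries, the core of
# automated data cataloguing. sqlite3 stands in for any warehouse.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE payments (            -- hypothetical payments table
        payment_id INTEGER PRIMARY KEY,
        payer_iban TEXT NOT NULL,
        amount     NUMERIC NOT NULL,
        booked_at  TEXT
    );
""")

catalog = []
for (table,) in conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table'"):
    # PRAGMA table_info plays the role of information_schema.columns here.
    for cid, name, dtype, notnull, default, pk in conn.execute(
            f"PRAGMA table_info({table})"):
        catalog.append({
            "table": table,
            "column": name,
            "type": dtype,
            "nullable": not notnull,
            "primary_key": bool(pk),
        })

for entry in catalog:
    print(entry)
```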
Posted 1 day ago