
4342 Data Quality Jobs - Page 39

JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

3.0 - 7.0 years

10 - 14 Lacs

Hyderabad

Work from Office

BW on HANA / BW/4HANA implementation in key SAP modules: requirements analysis, conception, and implementation/development of solutions as per requirements. Create HLDs and convert them into LLDs. Data extraction from SAP and non-SAP systems, data modelling, and report delivery. Work closely with project leaders, business teams, and SAP functional counterparts to architect, design, and develop SAP BW/4HANA solutions. Understand the integration and consumption of BW data models/reports with other tools.

Responsibilities: Hands-on experience in SAP BW/4HANA or SAP BW on HANA, with a strong understanding of objects such as Composite Providers, Open ODS views, advanced DSOs, and Transformations; exposing BW models as HANA views; mixed scenarios; and performance optimization concepts such as data tiering optimization. Experience in integrating BW with various SAP and non-SAP backend systems/sources of data, and good knowledge of the different data acquisition techniques in BW/4HANA. Knowledge of the available SAP BW/4HANA tools and their usage, such as the BW/4HANA Web Cockpit.

Mandatory skill sets: Full life cycle implementation experience in SAP BW/4HANA or SAP BW on HANA. Hands-on experience in data extraction using standard or generic data sources. Good knowledge of data source enhancement. Strong experience in writing ABAP/AMDP code for exits and transformations. Strong understanding of CKF, RKF, formulas, selections, variables, and other components used for reporting.

Preferred skill sets: Understanding of LSA/LSA++ architecture and its development standards. Good understanding of BW/4HANA application and database security concepts. Functional knowledge of modules such as SD, MM, and FI.

Education qualification: B.Tech / M.Tech (Computer Science, Mathematics & Scientific Computing, etc.). Degrees/Field of Study required: Bachelor of Technology, Master of Engineering. Required skills: SAP Business Warehouse.

Posted 2 weeks ago

Apply

0.0 - 3.0 years

20 - 25 Lacs

Hyderabad

Work from Office

YOUR IMPACT: Are you passionate about developing mission-critical, high-quality software solutions, using cutting-edge technology, in a dynamic environment? We are Compliance Engineering, a global team of more than 300 engineers and scientists who work on the most complex, mission-critical problems. We build and operate a suite of platforms and applications that prevent, detect, and mitigate regulatory and reputational risk across the firm; have access to the latest technology and to massive amounts of structured and unstructured data; and leverage modern frameworks to build responsive and intuitive UX/UI and Big Data applications. The firm is making a significant investment to uplift and rebuild the Compliance application portfolio in 2025. To achieve that, we are hiring experienced software development engineers.

HOW YOU WILL FULFILL YOUR POTENTIAL: As a member of our team, you will partner globally with sponsors, users, and engineering colleagues across multiple divisions to create end-to-end solutions; learn from experts; leverage various technologies including Java, Spring Boot, Hibernate, BPMN workflows, rules engines, JavaScript, TypeScript, React-Redux, REST APIs, GraphQL, Elasticsearch, Kafka, Kubernetes, and Machine Learning; innovate and incubate new ideas; work on a broad range of problems, including negotiating data contracts, capturing data quality metrics, processing large-scale data, and building surveillance detection models; and be involved in the full life cycle of defining, designing, implementing, testing, deploying, and maintaining software systems across our products.

QUALIFICATIONS: A successful candidate will possess the following attributes: a Bachelor's or Master's degree in Computer Science, Computer Engineering, or a similar field of study; expertise in Java, as well as proficiency with databases and data manipulation; experience in end-to-end solutions, automated testing, and SDLC concepts; and the ability (and tenacity) to clearly express ideas and arguments in meetings and on paper. Experience in some of the following is desired and can set you apart from other candidates: knowledge of the financial industry and compliance or risk functions, and the ability to influence stakeholders.

Posted 2 weeks ago

Apply

4.0 - 6.0 years

17 - 22 Lacs

Hyderabad

Work from Office

Overview: Act as an expert on consumer insights, with a focus on social listening, to answer business questions in a compelling and engaging way. This expertise includes understanding the PepsiCo trends framework, leveraging the available technology stack, and providing insights based on business partner requests by connecting the relevant data sources. Available tools to analyze consumer trends from market manifestations based on Big Data: Trendscope, to identify and analyze Food and Beverage trends; an AI tool to produce inspiring springboards about territories and platforms based on digital conversations; Sprinklr for social listening; and ADA for innovation and creative evaluation.

Responsibilities: Execution of research projects with quality and depth of deliverables, with low or no support from external vendors. Ensuring the story is told in a compelling way, putting together all the BIG data (what is happening) and THICK data (human motivations and drivers) tools at our disposal. The analyst will be responsible for producing the complete analysis and a one-page summary for all projects conducted, and will present his/her work to the local PepsiCo business teams who requested it. Key tasks: end-to-end delivery, from alignment on the brief, through proposal coordination and execution, to delivery of results. Lead social listening projects from the brief to the delivery of outputs; translate market and business challenges into a social listening brief; ensure the highest level of data quality and validation.

Qualifications: Social listening expertise with a heavy focus on insights rather than reporting. 4-6 years of experience, preferably at an FMCG company/client, making an impact in market research/insights/analytics, marketing, competitive intelligence, or another similar function, with a demonstrated ability to execute projects in a complex environment with multiple constituencies. Very comfortable running in-depth consumer research analyses, with the ability to turn findings into compelling and insightful stories and present them to business teams. Understanding of the brand and innovation strategy process and the critical role of insights at each stage. Experience working on trends and foresight projects, e.g. pre- and post-COVID impact, consumer trend changes, etc. Experience in projects involving flavor innovation, trending ingredients, health benefits, and consumer behavior. Demonstrated written communication skills, especially in PowerPoint and email, and strong verbal and written command of English. Project management: highly analytical, motivated, and decisive, with excellent project management skills. Organized: capable of juggling multiple projects, priorities, and stakeholders, ensuring delivery while proactively managing trade-offs. Demonstrated ability to manage projects and overcome challenges; ability to influence local insights partners in their ways of working; ability to run consumer research analyses alone by leveraging the various available data sources.

Posted 2 weeks ago

Apply

10.0 - 15.0 years

27 - 32 Lacs

Hyderabad

Work from Office

Overview: The DQ Expert will act as an individual contributor enforcing a strict Enterprise Data Management strategy, through globally defined data standards and governance, in order to successfully deliver business transformation within SAP S/4HANA projects. The Data Quality Expert will be responsible for delivering the internal data quality application to support data readiness and conversion activities for project and new-market deployment, assuring that Global Template data standards are followed. This role involves active engagement in requirements gathering, testing, data cleansing, issue analysis and resolution, data conversion, and mock/cutover conversion activities. The position holder must work directly with multiple project function specialists (e.g. OTC, P2P, MTD) as part of the extended Data Conversion team on a day-to-day basis, as well as engaging the market business process/data owners.

Responsibilities: Partner with the Data Validation Team to ensure the quality of migrated data. Ensure global lift-and-shift opportunities are deployed across the sectors. Manage questions and clear the path for developers to complete the build/test of data validation. Work with Global on design updates and new patterns. Manage the overall tracker for conversion stats and provide direction and guidance to the conversion team.

Qualifications: A minimum of a Bachelor's degree is required; Computer Science or Information Systems is preferred. Minimum 10 years in IT on ERP transformation programs in the Data Management area. Experience in at least 3 end-to-end implementations of SAP ERP/ECC with responsibility for data quality and master data standards. Experience working with and manipulating big data sets (or systems built on significant datasets). Knowledge of SAP master data models. Data readiness, conversion, migration, and cutover experience from a functional standpoint. Understanding of data quality/data cleansing practices. Demonstrated documentation acumen and presentation of data standards materials for reporting to projects and stakeholder alignment.

Posted 2 weeks ago

Apply

6.0 - 11.0 years

25 - 27 Lacs

Hyderabad

Work from Office

Overview: We are seeking a highly skilled and experienced Azure Data Engineer to join our dynamic team. In this critical role, you will be responsible for designing, developing, and maintaining robust and scalable data solutions on the Microsoft Azure platform. You will work closely with data scientists, analysts, and business stakeholders to translate business requirements into effective data pipelines and data models.

Responsibilities: Design, develop, and implement data pipelines and ETL/ELT processes using Azure Data Factory, Azure Databricks, and other relevant Azure services. Develop and maintain data lakes and data warehouses on Azure, including Azure Data Lake Storage Gen2 and Azure Synapse Analytics. Build and optimize data models for data warehousing, data marts, and data lakes. Develop and implement data quality checks and data governance processes. Troubleshoot and resolve data-related issues. Collaborate with data scientists and analysts to support data exploration and analysis. Stay current with the latest advancements in cloud computing and data engineering technologies. Participate in all phases of the software development lifecycle, from requirements gathering to deployment and maintenance.

Qualifications: 6+ years of experience in data engineering, with at least 3 years working with Azure data services. Strong proficiency in SQL, Python, and other relevant programming languages. Experience with data warehousing and data lake architectures. Experience with ETL/ELT tools and technologies, such as Azure Data Factory, Azure Databricks, and Apache Spark. Experience with data modeling and data warehousing concepts. Experience with data quality and data governance best practices. Strong analytical and problem-solving skills. Excellent communication and collaboration skills. Experience with Agile development methodologies. Bachelor's degree in Computer Science, Engineering, or a related field (Master's degree preferred). Relevant Azure certifications (e.g., Azure Data Engineer Associate) are a plus.
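
For context on what such a data quality check looks like in practice, here is a minimal sketch, assuming a PySpark session on Azure Databricks; the Delta path, columns, and rules are illustrative assumptions, not part of the listing:

```python
# Minimal data quality gate for a curated Delta table (illustrative names).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()

df = spark.read.format("delta").load("/mnt/datalake/curated/orders")

# Rule 1: the business key must never be null.
null_keys = df.filter(F.col("order_id").isNull()).count()

# Rule 2: no duplicate business keys.
dupes = df.groupBy("order_id").count().filter(F.col("count") > 1).count()

# Rule 3: amounts must be non-negative.
bad_amounts = df.filter(F.col("amount") < 0).count()

failures = {"null_keys": null_keys, "duplicate_keys": dupes,
            "negative_amounts": bad_amounts}
if any(v > 0 for v in failures.values()):
    # Failing loudly lets the orchestrator (e.g. ADF) stop downstream loads.
    raise ValueError(f"Data quality checks failed: {failures}")
```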

Posted 2 weeks ago

Apply

12.0 - 17.0 years

25 - 30 Lacs

Hyderabad

Work from Office

Overview: We are seeking an experienced and strategic leader to join our Business Intelligence & Reporting organization as Deputy Director, BI Governance. This role will lead the design, implementation, and ongoing management of BI governance frameworks across sectors and capability centers. The ideal candidate will bring deep expertise in BI governance, data stewardship, demand management, and stakeholder engagement to ensure a standardized, scalable, and value-driven BI ecosystem across the enterprise.

Key responsibilities: Governance leadership: define and implement the enterprise BI governance strategy, policies, and operating model; drive consistent governance processes across sectors and global capability centers; set standards for the BI solution lifecycle, metadata management, report rationalization, and data access controls. Stakeholder management: serve as a trusted partner to sector business leaders, IT, data stewards, and COEs to ensure alignment with business priorities; lead governance councils, working groups, and decision forums to drive adoption and compliance. Policy and compliance: establish and enforce policies related to report publishing rights, tool usage, naming conventions, and version control; implement approval and exception processes for BI development outside the COE. Demand and intake governance: lead the governance of BI demand intake and prioritization processes; ensure transparency and traceability of BI requests and outcomes across business units. Metrics and continuous improvement: define KPIs and dashboards to monitor BI governance maturity and compliance; identify areas for process optimization and lead continuous improvement efforts.

Qualifications: Experience: 12+ years in Business Intelligence, Data Governance, or related roles, with at least 4+ years in a leadership capacity. Domain expertise: strong understanding of BI platforms (Power BI, Tableau, etc.), data management practices, and governance frameworks. Strategic mindset: proven ability to drive change, influence at senior levels, and align governance initiatives with enterprise goals. Operational excellence: experience managing cross-functional governance processes and balancing centralized control with local flexibility. Education: Bachelor's degree required; MBA or Master's in Data/Analytics preferred.

Posted 2 weeks ago

Apply

10.0 - 15.0 years

13 - 17 Lacs

Hyderabad

Work from Office

Overview: Data Engineering Associate Manager (L09).

Responsibilities: Enhance and maintain data pipelines on EDF. Perform requirement analysis and data analysis. Work on application migration from Teradata to EDF. Lead a team of data engineers and testers.

Qualifications: Data engineer with 10+ years of experience.

Posted 2 weeks ago

Apply

7.0 - 12.0 years

8 - 12 Lacs

Hyderabad

Work from Office

Overview: PepsiCo is on a significant initiative of digitalization and standardization of the FP&A solution across all its markets, in alignment with the Planning 2025 vision to make the finance organization more capable, more agile, and more efficient. The Mosaic Program is a key enabler of that vision: it is PepsiCo's FP&A solution.

Responsibilities: The NA Mosaic Sustain Developer is responsible for sustaining a high-quality solution for the Mosaic North America program, specific to the management of financial planning. The role will work directly on the design, development, and maintenance of the solution and will work closely with the various detailed design and development teams. This role requires a strong background in financial planning and its sub-streams (Topline, COGS, Opex), data quality/data flow, and development.

Qualifications: University education (BE/BTech/B.Sc) or equivalent work experience. Minimum of 7+ years of information technology or business experience. Strong understanding of the financial planning process, revenue management principles, and sales finance forecasting. 5+ years of experience in IBM TM1 Planning Analytics development and 3+ years of experience in IBM TM1 Planning Analytics support.

Mandatory tech skills: Knowledge of the IBM Planning Analytics (TM1) solution. Ability to understand and debug complex TM1 code (processes and rules). Ability to write complex TM1 code (processes and rules). Sound understanding and implementation of TM1 parallel processing. Experience in building PAW-based reports. Functional knowledge of FP&A (Financial Planning and Analysis).

Soft skills: Data flow and integration as a critical component. Self-motivation and the ability to stay focused. Ability to drive complex business discussions to design the best solution. Knowledge of FMCG and FP&A-related data objects. Ability to search for new solutions to meet challenges together with the team. Good communication skills. Ability to leverage relationships to understand, document, and communicate processes and change implications. Ability to handle complexity and to execute with excellence under pressure. Conceptual selling, deployment planning and execution, relationship management and service, technology innovation, and process design and architecture.

Posted 2 weeks ago

Apply

0.0 years

10 - 14 Lacs

Hyderabad

Work from Office

Overview: The Data Analyst will play a critical role in the success of Mosaic (the global planning tool). Mosaic is transforming the way FP&A teams work across PepsiCo markets and the level of financial information available to senior leadership teams. The Data Analyst will be responsible for ongoing live-market support, focusing mostly on the resolution of data issues related to the staging/ETL area of the data, providing guidance on data sources and connectivity, system issues, and data transformation logic, root cause analysis, and coordination on solution deployment. Additionally, he/she will be key in understanding and closing data quality gaps in the current system and assisting local teams by supporting their data preparation to be Mosaic-ready. The role requires working closely with IT/BRM, Sector FP&A, the Cockpit, and other function teams (Net Revenue Management, Global Procurement, Coman, Supply Chain, etc.).

Responsibilities: Live market support: conduct thorough data validation to ensure data pipelines meet business requirements; gain knowledge of how data is processed and transformed from different sources and prepared for the Mosaic product; assist in ad-hoc analytics, troubleshoot tools, and provide direct support to end users; develop a deep understanding of the data quality and cleansing requirements for data to be ready for consumption in BI and SPOT; bridge the gap and coordinate with the tech and FP&A teams to ensure data quality. Support a sustainable data solution: collaborate with business users, data engineers, product owners, and BI developers to design and implement end-to-end data solutions; oversee data processes with detailed DQR notifications, proactively monitoring ETL pipelines to address any issues.

Qualifications: MBA, CA, CMA, or any degree in Finance.
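
As an illustration of the staging-area data validation this role describes, here is a minimal sketch in pandas; the file name, columns, and checks are illustrative assumptions:

```python
# Basic completeness / validity / uniqueness checks on a staged extract.
import pandas as pd

staged = pd.read_csv("staging/topline_extract.csv")

issues = []

# Completeness: mandatory fields must be populated.
for col in ["market", "period", "amount"]:
    nulls = staged[col].isna().sum()
    if nulls:
        issues.append(f"{col}: {nulls} missing values")

# Validity: periods must parse as dates.
bad_periods = pd.to_datetime(staged["period"], errors="coerce").isna().sum()
if bad_periods:
    issues.append(f"period: {bad_periods} unparseable values")

# Uniqueness: one row per market/period combination.
dupes = staged.duplicated(subset=["market", "period"]).sum()
if dupes:
    issues.append(f"{dupes} duplicate market/period rows")

print("Validation passed" if not issues else "\n".join(issues))
```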

Posted 2 weeks ago

Apply

4.0 - 6.0 years

8 - 12 Lacs

Hyderabad

Work from Office

Overview: FOBO businesses in Europe, AMESA, and APAC have migrated their planning capability from XLS to MOSAIC, an integrated and digital planning tool, in a step forward towards reaching the Financial Planning 2025 Vision. However, the underlying FOBO operating model limits our ability to capture benefits, given the high attrition and lack of process standardization. To become more capable, agile, and efficient, a fundamental change in the way we do FOBO financial planning is required, which will be addressed by establishing the FOBO Planning Central (FPC). The FPC evolves the GBS approach, pivoting from a geography focus to a process focus, and allows BUs to concentrate their attention on the Bottlers. Planning services will be provided by a single team, based in HBS and led by a single leader, to serve FOBO globally. The central planning team will be organized around key processes under three roles to drive efficiency and standardization: Navigators, the single point of contact for the BU, responsible for overall planning and analysis activities; Integrators, who work with Navigators to support business closing activities, reporting, and planning; and the Ecosystem Admin, who owns TM1 data quality and overall system administration. This new operating model will provide a better and faster response to BUs. In addition, it will reduce overall people cost, as some positions will be eliminated due to process standardization and simplification, while other positions will migrate from BUs (RetainCo) to the FPC (at HBS).

Responsibilities: Ensure excellent TM1 data quality and timely overall system administration for the EUROPE/AMESA/APAC FOBO businesses, which includes the following activities. TM1 admin: TM1 scenario management (e.g. create/officialise scenarios, copy actuals into the forecast scenario, etc.); execute TM1 cube flows and export data to SPOT-Cockpit on a daily basis; perform systems reconciliation to ensure 100% financial data alignment between ERP, HFM, TM1, and Cockpit. Master data: perform daily data quality checks/corrections/reconciliations (before and during closing and planning cycles); work closely with Navigators to keep mappings/allocations in TM1 updated (aligning any changes with business FP&A leads); maintain master data (e.g. profit centres, creation of new NPDs, etc.).

Qualifications: 4-6 years of experience in a finance position (experience in a FOBO business a plus). BA required (Business/Finance or IT). TM1 experience is a must. Comfortable dealing with big/complex data. Detail-oriented, with strong analytical skills (quick understanding of E2E process/data flow analysis). Tech-savvy and passionate about systems and digital tools. Excellent communication and interpersonal skills and stakeholder management. 100% fluent in English.
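
As an illustration of the systems-reconciliation activity above, here is a minimal sketch in pandas, assuming each system (ERP, HFM, TM1) can be exported at a common grain; the file names, keys, and tolerance are illustrative assumptions:

```python
# Compare totals per profit-centre/account/period across system extracts.
import pandas as pd

SOURCES = {"ERP": "erp_extract.csv", "HFM": "hfm_extract.csv",
           "TM1": "tm1_extract.csv"}
KEYS = ["profit_centre", "account", "period"]

totals = [
    pd.read_csv(path).groupby(KEYS)["amount"].sum().rename(name)
    for name, path in SOURCES.items()
]
recon = pd.concat(totals, axis=1).fillna(0.0)

# Flag any key where the systems disagree beyond a rounding tolerance.
mismatches = recon[(recon.max(axis=1) - recon.min(axis=1)) > 0.01]
print(f"{len(mismatches)} keys out of alignment")
print(mismatches.head())
```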

Posted 2 weeks ago

Apply

6.0 - 9.0 years

8 - 11 Lacs

Bengaluru

Work from Office

Optimize existing ETL processes, ensuring scalability, performance, and reliability. Identify data transformation opportunities and implement solutions to improve data quality, governance, and operational efficiency. Troubleshoot and resolve ETL failures, performance issues, and integration challenges. Identify performance optimization areas by analysing ETL and other connected services. Work closely with data architects, engineers, and business stakeholders to understand requirements and deliver solutions. Ensure data integrity, security, and compliance with organizational and industry standards. Document ETL workflows, configurations, and best practices.

Posted 2 weeks ago

Apply

8.0 - 12.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Job Title: Oracle & MongoDB DBA. Experience: 8-12 years. Location: Bangalore.

Technical skills: Hands-on expertise in data migration between databases, on-prem to AWS cloud RDS. Experience in the use of AWS Database Migration Service (DMS). Experience in the export/import of huge database schemas, full load plus CDC. Knowledge of and experience with Unix commands and writing task-automation shell scripts. Knowledge of and experience with different backup/restore methods and backup tools. Hands-on experience with DBA skills: database monitoring, performance tuning, and DB refresh. Hands-on experience in database support. Hands-on expertise in data migration between databases, on-prem to MongoDB Atlas. Experience in creating clusters and databases and creating users. Good command of the DB query language and its architecture. Experience in converting a schema from one DB to another is an added advantage. Experience in database and server consolidation. Strong hands-on experience in building logical data models, data quality, and data security; understanding the application lifecycle; and building service continuity documents. Responsible for building the knowledge base: run books, cheat sheets, DR drill books, and escalation procedures. Database refreshes from the Production to the Acceptance/Development environment. Coordinating with the infrastructure/application teams to get the required information. Evidence gathering for audits.

Non-technical skills: The candidate needs to be a good team player. Ownership: should be an individual performer, able to take on deliverables and handle fresh challenges. Service/customer orientation and strong oral and written communication skills are mandatory. Should be confident and capable of speaking with clients and onsite teams. Effective interpersonal, team-building, and communication skills. Ability to collaborate: communicate clearly and concisely both to laypeople and peers, follow instructions, and make a team stronger for your presence, not weaker. Should be ready to work in rotating shifts (morning, general, and afternoon). Ability to see the bigger picture and differing perspectives; to compromise, balance competing priorities, and prioritize the user. Desire for continuous improvement, of the worthy sort: always be learning and seeking improvement; avoid change aversion and excessive conservatism; equally avoid harmful perfectionism, "not-invented-here" syndrome, and damaging pursuit of the bleeding edge for its own sake. Learn things quickly while working outside your area of expertise. Analyze a problem and realize exactly what will be affected by even the smallest change you make in the database. Ability to communicate complex technology to a non-technical audience in a simple and precise manner.
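
For context, here is a minimal sketch of driving the full-load-plus-CDC task mentioned above through the AWS DMS API from Python (boto3); the replication task is assumed to already exist, and the ARN and region are placeholders:

```python
# Start an existing DMS task (full load, then CDC) and check its status.
import boto3

dms = boto3.client("dms", region_name="ap-south-1")
TASK_ARN = "arn:aws:dms:ap-south-1:123456789012:task:EXAMPLE"  # placeholder

# 'start-replication' performs the initial full load, then applies CDC.
dms.start_replication_task(
    ReplicationTaskArn=TASK_ARN,
    StartReplicationTaskType="start-replication",
)

# Inspect the task's current state (poll this until it is steady).
status = dms.describe_replication_tasks(
    Filters=[{"Name": "replication-task-arn", "Values": [TASK_ARN]}]
)["ReplicationTasks"][0]["Status"]
print("DMS task status:", status)
```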

Posted 2 weeks ago

Apply

8.0 - 13.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Job Title: Oracle & MongoDB DBA. Experience: 8-16 years. Location: Bangalore.

Must have: a 4-year degree (Computer Science, Information Systems, or equivalent); 8+ years of overall IT experience (5+ years as a DBA).

Technical skills: Hands-on expertise in data migration between databases, on-prem to AWS cloud RDS. Experience in the use of AWS Database Migration Service (DMS). Experience in the export/import of huge database schemas, full load plus CDC. Knowledge of and experience with Unix commands and writing task-automation shell scripts. Knowledge of and experience with different backup/restore methods and backup tools. Hands-on experience with DBA skills: database monitoring, performance tuning, and DB refresh. Hands-on experience in database support. Hands-on expertise in data migration between databases, on-prem to MongoDB Atlas. Experience in creating clusters and databases and creating users. Good command of the DB query language and its architecture. Experience in converting a schema from one DB to another is an added advantage. Experience in database and server consolidation. Strong hands-on experience in building logical data models, data quality, and data security; understanding the application lifecycle; and building service continuity documents. Responsible for building the knowledge base: run books, cheat sheets, DR drill books, and escalation procedures. Database refreshes from the Production to the Acceptance/Development environment. Coordinating with the infrastructure/application teams to get the required information. Evidence gathering for audits.

Non-technical skills: The candidate needs to be a good team player. Ownership: should be an individual performer, able to take on deliverables and handle fresh challenges. Service/customer orientation and strong oral and written communication skills are mandatory. Should be confident and capable of speaking with clients and onsite teams. Effective interpersonal, team-building, and communication skills. Ability to collaborate: communicate clearly and concisely both to laypeople and peers, follow instructions, and make a team stronger for your presence, not weaker. Should be ready to work in rotating shifts (morning, general, and afternoon). Ability to see the bigger picture and differing perspectives; to compromise, balance competing priorities, and prioritize the user. Desire for continuous improvement, of the worthy sort: always be learning and seeking improvement; avoid change aversion and excessive conservatism; equally avoid harmful perfectionism, 'not-invented-here' syndrome, and damaging pursuit of the bleeding edge for its own sake. Learn things quickly while working outside your area of expertise. Analyze a problem and realize exactly what will be affected by even the smallest change you make in the database. Ability to communicate complex technology to a non-technical audience in a simple and precise manner.

Skills: Primary competency: Data Engineering - Oracle Apps DBA (75%). Secondary competency: Data Engineering - MongoDB Apps DBA (25%).

Posted 2 weeks ago

Apply

4.0 - 8.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Job Title: Oracle & MongoDB DBA. Experience: 4-8 years. Location: Bangalore.

Must have: a 4-year degree (Computer Science, Information Systems, or equivalent); 8+ years of overall IT experience (5+ years as a DBA).

Technical skills: Hands-on expertise in data migration between databases, on-prem to AWS cloud RDS. Experience in the use of AWS Database Migration Service (DMS). Experience in the export/import of huge database schemas, full load plus CDC. Knowledge of and experience with Unix commands and writing task-automation shell scripts. Knowledge of and experience with different backup/restore methods and backup tools. Hands-on experience with DBA skills: database monitoring, performance tuning, and DB refresh. Hands-on experience in database support. Hands-on expertise in data migration between databases, on-prem to MongoDB Atlas. Experience in creating clusters and databases and creating users. Good command of the DB query language and its architecture. Experience in converting a schema from one DB to another is an added advantage. Experience in database and server consolidation. Strong hands-on experience in building logical data models, data quality, and data security; understanding the application lifecycle; and building service continuity documents. Responsible for building the knowledge base: run books, cheat sheets, DR drill books, and escalation procedures. Database refreshes from the Production to the Acceptance/Development environment. Coordinating with the infrastructure/application teams to get the required information. Evidence gathering for audits.

Non-technical skills: The candidate needs to be a good team player. Ownership: should be an individual performer, able to take on deliverables and handle fresh challenges. Service/customer orientation and strong oral and written communication skills are mandatory. Should be confident and capable of speaking with clients and onsite teams. Effective interpersonal, team-building, and communication skills. Ability to collaborate: communicate clearly and concisely both to laypeople and peers, follow instructions, and make a team stronger for your presence, not weaker. Should be ready to work in rotating shifts (morning, general, and afternoon). Ability to see the bigger picture and differing perspectives; to compromise, balance competing priorities, and prioritize the user. Desire for continuous improvement, of the worthy sort: always be learning and seeking improvement; avoid change aversion and excessive conservatism; equally avoid harmful perfectionism, "not-invented-here" syndrome, and damaging pursuit of the bleeding edge for its own sake. Learn things quickly while working outside your area of expertise. Analyze a problem and realize exactly what will be affected by even the smallest change you make in the database. Ability to communicate complex technology to a non-technical audience in a simple and precise manner.

Posted 2 weeks ago

Apply

0.0 - 5.0 years

1 - 5 Lacs

Bengaluru

Work from Office

Job Title: Data Engineer - DBT (Data Build Tool). Experience: 0-5 years. Location: Bengaluru.

Job responsibilities: Assist in the design and implementation of a Snowflake-based analytics solution (data lake and data warehouse) on AWS: requirements definition, source data analysis and profiling, the logical and physical design of the data lake and data warehouse, and the design of data integration and publication pipelines. Develop Snowflake deployment and usage best practices. Help educate the rest of the team on the capabilities and limitations of Snowflake. Build and maintain data pipelines adhering to suggested enterprise architecture principles and guidelines. Design, build, test, and maintain data management systems. Work in sync with internal and external team members such as data architects, data scientists, and data analysts to handle all sorts of technical issues. Act as a technical leader within the team. Work in an Agile/Lean model. Deliver quality deliverables on time. Translate complex functional requirements into technical solutions.

Expertise and qualifications. Essential skills, education, and experience: Should have a B.E. / B.Tech. / MCA or equivalent degree along with 4-7 years of experience in data engineering. Strong experience with DBT concepts such as model building and configuration, incremental load strategies, macros, and DBT tests. Strong experience in SQL. Strong experience in AWS. Creation and maintenance of an optimum data pipeline architecture for the ingestion and processing of data. Creation of the necessary infrastructure for ETL jobs from a wide range of data sources using Talend, DBT, S3, and Snowflake. Experience with data storage technologies such as Amazon S3, SQL, and NoSQL. Technical awareness of data modeling. Experience working with stakeholders in different time zones. Good to have: AWS data services development experience; working knowledge of Big Data technologies; experience collaborating with data quality and data governance teams; exposure to reporting tools like Tableau; Apache Airflow and Apache Kafka (nice to have); payments domain knowledge; in-depth understanding of CRM, accounting, etc.; regulatory reporting exposure.

Other skills: Good communication skills. Team player. Problem solver. Willing to learn new technologies, share your ideas, and assist other team members as needed. Strong analytical and problem-solving skills: the ability to define problems, collect data, establish facts, and draw conclusions.
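
As an illustration of one ingestion step in the S3/Snowflake stack named above, here is a minimal sketch using the Snowflake Python connector; the account, stage, and table names are illustrative assumptions:

```python
# Load staged S3 files into a Snowflake staging table via COPY INTO;
# DBT models would then transform from this table downstream.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="ETL_WH", database="ANALYTICS", schema="STAGING",
)

with conn.cursor() as cur:
    cur.execute("""
        COPY INTO staging.stg_payments
        FROM @payments_s3_stage/2024/06/
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        ON_ERROR = 'ABORT_STATEMENT'
    """)
    for row in cur.fetchall():
        print(row)  # per-file load results
conn.close()
```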

Posted 2 weeks ago

Apply

0.0 - 5.0 years

3 - 5 Lacs

Bengaluru

Work from Office

We are looking for a qualified and detail-oriented GLP Archivist to support the implementation of the OECD Principles of Good Laboratory Practice (GLP). The successful candidate will be responsible for managing the archiving of scientific study records, ensuring compliance with international GLP standards, and supporting the integrity and traceability of non-clinical safety data. We invite motivated and deserving candidates with a passion for regulatory compliance and data stewardship to apply for this opportunity.

Roles and responsibilities: Responsible for the management, operations, and procedures for archiving in accordance with the OECD Principles of GLP. Creating and maintaining archives of the collection for easy retrieval of records. Maintaining a stable physical environment for the receipt, storage, and handling of the archival holdings. Knowledge of the OECD Principles of Good Laboratory Practice (GLP).

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

hyderabad, telangana

On-site

As an experienced IICS Developer, you will be responsible for supporting a critical data migration project from Oracle to Snowflake. This remote opportunity requires working night-shift hours to align with the U.S. team. Your primary focus will be on developing and optimizing ETL/ELT workflows, collaborating with architects and DBAs on schema conversion, and ensuring data quality, consistency, and validation throughout the migration process.

To excel in this role, you must possess strong hands-on experience with IICS (Informatica Intelligent Cloud Services), a solid background in Oracle databases (including SQL, PL/SQL, and data modeling), and a working knowledge of Snowflake, specifically data staging, architecture, and data loading. Your responsibilities will also include building mappings, tasks, and parameter files in IICS, as well as understanding data pipeline performance tuning to enhance efficiency. In addition, you will be expected to implement error handling, performance monitoring, and scheduling to support the migration process effectively, and to provide assistance during the go-live phase and post-migration stabilization to ensure a seamless transition.

This position offers the flexibility of engagement as either a contract or full-time role, based on availability and fit. The shift timing for this role is 7:30 PM IST to 1:30 AM EST, allowing you to collaborate effectively with the U.S. team.
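
As an illustration of the data validation step described above, here is a minimal sketch that compares row counts between the Oracle source and the Snowflake target after migration; connection details and table names are placeholders:

```python
# Row-count reconciliation between Oracle source and Snowflake target.
import oracledb
import snowflake.connector

TABLES = ["CUSTOMERS", "ORDERS", "PAYMENTS"]  # illustrative table list

ora = oracledb.connect(user="mig", password="***", dsn="orahost/ORCLPDB1")
sf = snowflake.connector.connect(account="my_account", user="mig",
                                 password="***", database="MIGRATED",
                                 schema="CORE")

for table in TABLES:
    with ora.cursor() as oc, sf.cursor() as sc:
        oc.execute(f"SELECT COUNT(*) FROM {table}")
        sc.execute(f"SELECT COUNT(*) FROM {table}")
        src, tgt = oc.fetchone()[0], sc.fetchone()[0]
        status = "OK" if src == tgt else "MISMATCH"
        print(f"{table}: oracle={src} snowflake={tgt} {status}")
```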

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

As a Data Management Consultant at the SAP Success Delivery Center, you play a crucial role in supporting customers on their digital transformation journey by implementing Data Management solutions, including Data Migrations and Master Data Governance. Working as a techno-functional consultant, you will be an integral part of project teams responsible for delivering SAP implementations to clients. Your responsibilities include being hands-on with solutions, possessing good communication skills for engaging in business discussions, and having a functional understanding of Data Management. Prior development experience is considered an added advantage. While occasional travel may be required based on customer needs, the primary focus will be on remote and offshore delivery. One of your key objectives is to own or acquire relevant SAP Business AI skills to effectively position and deliver SAP's AI offerings to customers. Your role also involves enhancing the adoption and consumption of various SAP AI offerings within customer use cases. You will be joining the Data Management Solution Area within BTP Delivery @ Scale, a robust team of over 100 professionals delivering engagements across a wide range of Data Management topics such as Data Migration, Data Integration, Data Engineering, Data Governance, and Data Quality.

At SAP, our innovations empower over four hundred thousand customers globally to collaborate more efficiently and leverage business insights effectively. Our company, known for its leadership in enterprise resource planning (ERP) software, has evolved into a market leader in end-to-end business application software and related services, including database, analytics, intelligent technologies, and experience management. With a cloud-based approach, two hundred million users, and a diverse workforce of over one hundred thousand employees worldwide, we are committed to being purpose-driven and future-focused. Our culture emphasizes collaboration, personal development, and a strong team ethic. We believe in connecting global industries, people, and platforms to provide solutions for every challenge. At SAP, you have the opportunity to bring out your best.

Diversity and inclusion are at the core of SAP's culture, with a focus on health, well-being, and flexible working models that ensure every individual, regardless of background, feels included and empowered to perform at their best. We believe in the strength that comes from the unique capabilities and qualities each person brings to our organization, and we invest in our employees to nurture confidence and unlock their full potential. SAP is dedicated to unleashing all talent and contributing to a better and more equitable world.

SAP is an equal opportunity workplace and an affirmative action employer. We uphold the values of Equal Employment Opportunity and provide accessibility accommodations to applicants with physical and/or mental disabilities. If you require accommodation or special assistance to navigate our website or complete your application, please contact the Recruiting Operations Team at Careers@sap.com. For SAP employees, only permanent roles qualify for the SAP Employee Referral Program, subject to the eligibility criteria outlined in the SAP Referral Policy. Specific conditions may apply to roles in Vocational Training. EOE AA M/F/Vet/Disability: Successful candidates may undergo a background verification with an external vendor.

Requisition ID: 422298 | Work Area: Consulting and Professional Services | Expected Travel: 0-10% | Career Status: Professional | Employment Type: Regular Full Time | Additional Locations: #LI-Hybrid

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

noida, uttar pradesh

On-site

The Customer Excellence Advisory Lead (CEAL) role focuses on empowering customers to maximize the potential of their data through top-tier architectural guidance and design. As part of the Oracle Analytics Service Excellence organization, the team comprises Solution Architects specializing in Oracle Analytics Cloud, Oracle Analytics Server, and Fusion Data Intelligence. The primary objective is to ensure the successful adoption of Oracle Analytics by engaging with customers and partners globally to build trust in the platform. Collaboration with Product Management is key to enhancing product offerings and sharing insights through various mediums such as blogs, webinars, and demonstrations. The ideal candidate will work closely with strategic FDI customers and partners to guide them towards optimized implementations and develop go-live plans geared towards achieving high usage levels. This position is classified as Career Level IC4.

Responsibilities include proactively identifying customer requirements, uncovering unaddressed needs, and devising potential solutions across different customer segments. The role involves assisting in shaping complex product and program strategies based on customer interactions, and effectively implementing scalable solutions and projects for customers operating in diverse enterprise environments. Collaboration with customers and internal stakeholders to communicate strategies, synchronize solution implementation timelines, provide updates, and adjust plans according to evolving objectives is vital. Additional responsibilities include preparing for complex product or solution-related inquiries and challenges that customers may present, gathering detailed product insights based on customer needs, and promoting understanding of customer complexities and the value propositions of various programs.

Primary skills required for this role:
- Over 4 years of experience with OBIA and Oracle Analytics
- Robust knowledge of Analytics RPD design, development, and deployment
- Strong understanding of BI/data warehouse analysis, design, development, and testing
- Extensive experience in data analysis, data profiling, data quality, data modeling, and data integration
- Proficiency in crafting complex queries and stored procedures using Oracle SQL and Oracle PL/SQL
- Skill in developing visualizations and user-friendly workbooks
- Previous experience in developing solutions incorporating AI and ML using Analytics
- Experience in enhancing report performance

Desirable skills:
- Experience with Fusion Applications (ERP/HCM/SCM/CX)
- Ability to design and develop ETL interfaces, packages, load plans, user functions, variables, and sequences in ODI for batch and real-time data integrations
- Experience with multiple cloud platforms
- Certification in FDI, OAC, and ADW

Qualifications: Career Level IC4.

About Us: Oracle, a global leader in cloud solutions, leverages cutting-edge technology to address present-day challenges. With over 40 years of experience, Oracle partners with industry leaders across various sectors and continues to thrive by operating with integrity. The company is committed to fostering an inclusive workforce that promotes opportunities for all, recognizing that true innovation flourishes when everyone is empowered to contribute. Oracle offers competitive benefits based on parity and consistency, supporting employees with flexible medical, life insurance, and retirement options. The organization also encourages community involvement through volunteer programs. Commitment to inclusivity extends to people with disabilities at all stages of the employment process. For accessibility assistance or accommodation for a disability, individuals can reach out via email at accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

haryana

On-site

You will be working as a Databricks Developer with 3-6 years of experience, located in India. Joining the data engineering and AI innovation team, your main responsibilities will include developing scalable data pipelines using Databricks and Apache Spark, implementing AI/ML workflows with tools like MLflow and AutoML, collaborating with data scientists to deploy models into production, building ETL, data transformation, and model training pipelines, managing the Delta Lake architecture, and working closely with cross-functional teams to ensure data quality and governance.
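
For context on the MLflow workflow named above, here is a minimal tracking sketch; the model, dataset, and metric are illustrative assumptions:

```python
# Train a toy model and record params, metrics, and the model in MLflow.
import mlflow
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=42)

with mlflow.start_run(run_name="baseline"):
    model = LogisticRegression(max_iter=200).fit(X_tr, y_tr)
    acc = model.score(X_te, y_te)

    mlflow.log_param("max_iter", 200)
    mlflow.log_metric("accuracy", acc)
    mlflow.sklearn.log_model(model, "model")  # versioned artifact for deployment
```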

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

pune, maharashtra

On-site

As a Data Engineer at our company, you will be responsible for handling ETL processes using PySpark, SQL, Microsoft Fabric, and other relevant technologies. You will collaborate with clients and stakeholders to understand data requirements and devise efficient data models and solutions. Additionally, optimizing and tuning existing data pipelines for enhanced performance and scalability will be a crucial part of your role. Ensuring data quality and integrity throughout the data pipeline and documenting technical designs, processes, and procedures will also be part of your responsibilities. It is essential to stay updated on emerging technologies and best practices in data engineering and to contribute to building CI/CD pipelines using GitHub.

To qualify for this role, you should hold a Bachelor's degree in computer science, engineering, or a related field, along with a minimum of 3 years of experience in data engineering or a similar role. A strong understanding of ETL concepts and best practices is required, as well as proficiency in Azure Synapse, Microsoft Fabric, and other data processing technologies. Experience with cloud-based data platforms such as Azure or AWS, knowledge of data warehousing concepts and methodologies, and proficiency in Python, PySpark, and SQL for data manipulation and scripting are also essential. Desirable qualifications include experience with data lake concepts, familiarity with data visualization tools like Power BI or Tableau, and certifications in relevant technologies such as Microsoft Certified: Azure Data Engineer Associate.

Our company offers various benefits including group medical insurance, a cab facility, meals/snacks, and a continuous learning program. Stratacent is a global IT consulting and services firm with headquarters in Jersey City, NJ, and global delivery centers in Pune and Gurugram, along with offices in the USA, London, Canada, and South Africa. Specializing in Financial Services, Insurance, Healthcare, and Life Sciences, we assist our customers in their transformation journey by providing services in Information Security, Cloud Services, Data and AI, Automation, Application Development, and IT Operations. For more information, visit our website at http://stratacent.com.
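
As an illustration of the PySpark ETL work described above, here is a minimal sketch; the storage paths and column logic are illustrative assumptions:

```python
# Read raw CSV landing data, clean and type it, write partitioned parquet.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

raw = (spark.read.option("header", True)
       .csv("abfss://landing@lake.dfs.core.windows.net/orders/"))

cleaned = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
       .dropDuplicates(["order_id"])
       .filter(F.col("amount").isNotNull())
)

(cleaned.write.mode("overwrite")
        .partitionBy("order_date")
        .parquet("abfss://curated@lake.dfs.core.windows.net/orders/"))
```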

Posted 2 weeks ago

Apply

0.0 - 4.0 years

0 Lacs

karnataka

On-site

We are looking for someone who is enthusiastic to contribute to the implementation of a metadata-driven platform managing the full lifecycle of batch and streaming Big Data pipelines. This role involves applying ML and AI techniques in data management, such as anomaly detection for identifying and resolving data quality issues, and data discovery. The platform facilitates the delivery of Visa's core data assets to both internal and external customers. You will provide Platform-as-a-Service offerings that are easy to consume, scalable, secure, and reliable, using open-source-based cloud solutions for Big Data technologies. Working at the intersection of infrastructure and software engineering, you will design and deploy data and pipeline management frameworks using open-source components like Hadoop, Hive, Spark, HBase, Kafka streaming, and other Big Data technologies. Collaboration with various teams is essential to build and maintain innovative, reliable, secure, and cost-effective distributed solutions. Facilitating knowledge transfer to the Engineering and Operations teams, you will work on technical challenges and process improvements with geographically distributed teams. Your responsibilities will include designing and implementing agile, innovative data pipeline and workflow management solutions that leverage technology advances for cost reduction, standardization, and commoditization. Driving the adoption of open-standard toolsets to reduce complexity and support operational goals for increasing automation across the enterprise is a key aspect of this role. As a champion for the adoption of open infrastructure solutions that are fit for purpose, you will keep technology relevant. The role involves spending 80% of the time writing code in different languages, frameworks, and technology stacks.

At Visa, your uniqueness is valued. Working here provides an opportunity to make a global impact, invest in your career growth, and be part of an inclusive and diverse workplace. Join our global team of disruptors, trailblazers, innovators, and risk-takers who are driving economic growth worldwide, moving the industry forward creatively, and engaging in meaningful work that brings financial literacy and digital commerce to millions of unbanked and underserved consumers. This position is hybrid, and the expectation of days in the office will be confirmed by your hiring manager.

Basic Qualifications: Minimum of 6 months of work experience or a bachelor's degree. Bachelor's degree in Computer Science, Computer Engineering, or a related field. Good understanding of data structures and algorithms. Good analytical and problem-solving skills.

Preferred Qualifications: 1 or more years of work experience or an advanced degree (e.g., Master's) in Computer Science. Excellent programming skills, with experience in at least one of the following: Python, Node.js, Java, Scala, GoLang. MVC (model-view-controller) for end-to-end development. Knowledge of SQL/NoSQL technology and familiarity with databases such as Oracle, DB2, and SQL Server. Proficiency in Unix-based operating systems and bash scripts. Strong communication skills, including clear and concise written and spoken communication with professional judgment. Team player with excellent interpersonal skills. Demonstrated ability to lead and navigate through ambiguity.
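
For context on the anomaly-detection idea mentioned above, here is a minimal sketch that flags a day's pipeline row count as suspect when it deviates sharply from recent history; the counts and threshold are illustrative assumptions:

```python
# Simple z-score check on daily load volumes as a data quality signal.
import statistics

history = [10_250, 10_400, 9_980, 10_310, 10_190, 10_450, 10_020]  # daily row counts
today = 6_150

mean = statistics.mean(history)
stdev = statistics.stdev(history)
z = (today - mean) / stdev

if abs(z) > 3:
    print(f"Anomaly: today's load of {today} rows is {z:.1f} sigma from the mean")
```

A production version would track this per table and per partition, but the principle is the same: learn the normal range from history and alert on deviations instead of hand-written thresholds.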

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

karnataka

On-site

The Junior Process & Solution Key User - Vendor Master Data role involves supporting process and solution development, improvements, and the implementation of standard processes at a local site and organizational unit. Your responsibilities will include ensuring accurate and consistent Vendor Master Data practices, supporting data governance initiatives, and maintaining data quality. You will be required to bring business knowledge and requirements from all users to the Business Process Developer/Solution Leader for process and solution development and improvement activities. Additionally, you will develop and maintain Vendor Master Data management processes and standards, conduct data quality assignments, and analyze business issues from a process and solution perspective.

As a Process & Solution Key User, you will participate in acceptance tests, approve or reject user acceptance tests for new solution releases, and identify root causes for process and solution improvement areas. You will also be responsible for collecting, analyzing, proposing, and prioritizing change requests from users, as well as communicating and anchoring process/solution improvement proposals.

To be successful in this role, you should have a minimum of 4 years of professional experience in the accounting area, with Vendor Master Data experience strongly preferred. Strong organizational and time management skills, effective communication skills (both written and verbal), and the ability to work in shifts are essential requirements. Being detail-oriented, having a professional attitude, and being reliable are also important characteristics for this role. Knowledge of various SAP ECC or S/4 systems and proficiency in Microsoft Office are necessary for this position. Additionally, you will need to ensure Internal Control compliance and External Audit requirements, perform process training, and provide support to end users.

Joining Volvo Group offers you the opportunity to work with a global team of talented individuals dedicated to shaping the future of efficient, safe, and sustainable transport solutions. As part of Group Finance, you will contribute to realizing the vision of the Volvo Group by providing expert financial services and working collaboratively with a diverse team of professionals. If you are passionate about making a difference in the world of transport and have the required skills and experience, we encourage you to apply for this opportunity and be a part of our mission to leave a positive impact on society for the next generation.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

You are a strategic thinker passionate about driving solutions in business architecture and data management. You have found the right team. As a Banking Book Product Owner Analyst in our Firmwide Finance Business Architecture (FFBA) team, you will spend each day defining, refining, and delivering set goals for our firm. You will partner with stakeholders across various lines of business and subject matter experts to understand products, data, source system flows, and business requirements related to Finance and Risk applications and infrastructure.

As a Product Owner on the Business Architecture team, you will work closely with Line of Business stakeholders, data Subject Matter Experts, consumers, and technology teams across Finance, Credit Risk & Treasury, and various Program Management teams. Your primary responsibilities will include prioritizing the traditional credit product book of work, developing roadmaps, and delivering on multiple projects and programs during monthly releases. Your expertise in data analysis will be instrumental in identifying trends, optimizing processes, and driving business growth. As our organization grows, so does our reliance on insightful, data-driven decisions. You will dissect complex datasets to unearth actionable insights, and you possess a strong understanding of data governance, data quality, and data management principles.

Responsibilities: Utilize the Agile framework to write business requirements in the form of user stories to enhance data, test execution, reporting automation, and digital analytics toolsets. Engage with development teams to translate business needs into technical specifications, ensuring acceptance criteria are met. Drive adherence to product and Release Management standards and operating models. Manage the release plan, including scope, milestones, sourcing requirements, test strategy, execution, and stakeholder activities. Collaborate with lines of business to understand products, data capture methods, and strategic data sourcing into a cloud-based big data architecture. Identify and implement solutions for business process improvements, creating supporting documentation and enhancing the end-user experience. Collaborate with implementation leads, release managers, project managers, and data SMEs to align data and system flows with Finance and Risk applications. Oversee the entire Software Development Life Cycle (SDLC), from requirements gathering to testing and deployment, ensuring seamless integration and execution.

Required qualifications, capabilities, and skills: Bachelor's degree with 3+ years of experience in project management or product ownership, with a focus on process re-engineering. Proven experience as a Product Owner with a strong understanding of agile principles and delivering complex programs. Strong analytical and problem-solving abilities, with the capacity to quickly assimilate business and technical knowledge. Experience in Finance, Risk, or Operations as a product lead. Familiarity with traditional credit products and liquidity and credit reporting data. Highly responsible, detail-oriented, and able to work to tight deadlines. Excellent written and verbal communication skills, with the ability to articulate complex concepts to diverse audiences. Strong organizational abilities to manage multiple work streams concurrently, maintaining sound judgment and a risk mindset. Solid understanding of financial and regulatory reporting processes. Energetic, adaptable, self-motivated, and effective under pressure. Basic knowledge of cloud technologies (e.g., AWS).

Preferred qualifications, capabilities, and skills: Knowledge of JIRA, SQL, the Microsoft suite of applications, Databricks, and data visualization/analytical tools (Tableau, Alteryx, Python) is a plus. Knowledge and experience of traditional credit products (loans, deposits, cash, etc.) and trading products (derivatives and securities) is a plus.

Posted 2 weeks ago

Apply

5.0 - 12.0 years

0 Lacs

chennai, tamil nadu

On-site

You are a skilled Snowflake Developer with a strong background in Data Warehousing (DWH), SQL, Informatica, Power BI, ETL, and related tools. With a minimum of 5 years of experience in data engineering, you have expertise in designing, developing, and maintaining data pipelines, integrating data across multiple platforms, and optimizing large-scale data architectures. This role offers an exciting opportunity to work with cutting-edge technologies in a collaborative environment and contribute to building scalable, high-performance data solutions.

Your responsibilities will include: developing and maintaining data pipelines using Snowflake, Fivetran, and DBT for efficient ELT processes across various data sources; writing complex, scalable SQL queries, including stored procedures, to support data transformation, reporting, and analysis; implementing advanced data modeling techniques, such as Slowly Changing Dimensions (SCD Type-2), using DBT, and designing high-performance data architectures; collaborating with business stakeholders to understand data needs and translating business requirements into technical solutions; performing root cause analysis on data-related issues, ensuring effective resolution, and maintaining high data quality standards; and working closely with cross-functional teams to integrate data solutions and create clear documentation for data processes and models.

Your qualifications should include: expertise in Snowflake for data warehousing and ELT processes; strong proficiency in SQL for relational databases and writing complex queries; experience with Informatica PowerCenter for data integration and ETL development; proficiency in using Power BI for data visualization and business intelligence reporting; familiarity with Sigma Computing, Tableau, Oracle, DBT, and cloud services like Azure, AWS, or GCP; experience with workflow management tools such as Airflow, Azkaban, or Luigi; and proficiency in Python for data processing (knowledge of other languages like Java or Scala is a plus).

The education required for this role is a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related quantitative field. This position will be based in Bangalore, Chennai, Kolkata, or Pune. If you meet the above requirements and are passionate about data engineering and analytics, this is an excellent opportunity to leverage your skills and contribute to impactful data solutions.
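
As an illustration of the SCD Type-2 pattern the listing calls for, here is a minimal sketch of the two statements a snapshot-style load effectively runs on Snowflake (expire the changed row, then insert the new version); connection details, table, and column names are illustrative assumptions:

```python
# SCD Type-2 load: close out changed current rows, insert new versions.
import snowflake.connector

conn = snowflake.connector.connect(account="my_account", user="dev",
                                   password="***", database="DWH",
                                   schema="CORE")

EXPIRE_SQL = """
UPDATE core.dim_customer tgt
SET valid_to = CURRENT_TIMESTAMP, is_current = FALSE
FROM staging.stg_customer src
WHERE tgt.customer_id = src.customer_id
  AND tgt.is_current
  AND tgt.row_hash <> src.row_hash
"""

INSERT_SQL = """
INSERT INTO core.dim_customer
  (customer_id, name, row_hash, valid_from, valid_to, is_current)
SELECT src.customer_id, src.name, src.row_hash,
       CURRENT_TIMESTAMP, NULL, TRUE
FROM staging.stg_customer src
LEFT JOIN core.dim_customer tgt
  ON tgt.customer_id = src.customer_id AND tgt.is_current
WHERE tgt.customer_id IS NULL  -- new customer, or just expired above
"""

with conn.cursor() as cur:
    cur.execute(EXPIRE_SQL)   # run first so changed rows lose is_current
    cur.execute(INSERT_SQL)
conn.close()
```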

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
