4.0 - 6.0 years
8 - 12 Lacs
Hyderabad
Work from Office
Overview
FOBO businesses in Europe, AMESA, and APAC have migrated their planning capability from XLS to MOSAIC, an integrated and digital planning tool, as a step toward the Financial Planning 2025 Vision. However, the underlying FOBO operating model limits our ability to capture benefits, given high attrition and a lack of process standardization. To become more capable, agile, and efficient, a fundamental change in the way we do FOBO Financial Planning is required, which will be addressed by establishing the FOBO Planning Central (FPC). FPC evolves the GBS approach, pivoting from a geography focus to a process focus, and allows BUs to concentrate their attention on the Bottlers. Planning services will be provided by a single team, based in HBS, led by a single leader to serve FOBO globally. The central planning team will be organized around key processes under three roles to drive efficiency and standardization:
- Navigators: single point of contact for the BU, responsible for overall planning and analysis activities
- Integrators: work with Navigators to support business closing activities, reporting, and planning
- Ecosystem Admin: owns TM1 data quality and overall system administration
This new operating model will provide a better and faster response to BUs. In addition, it will reduce overall people cost, as some positions will be eliminated through process standardization and simplification while others will migrate from BUs (RetainCo) to the FPC (at HBS).
Responsibilities
Ensure excellent TM1 data quality and timely overall system administration for the EUROPE/AMESA/APAC FOBO businesses, which includes the following activities:
TM1 Admin
- TM1 scenario management (e.g. create/officialise scenarios, copy actuals into the forecast scenario)
- Execute TM1 cube flows and export data to SPOT-Cockpit on a daily basis
- Perform systems reconciliation to ensure 100% financial data alignment between ERP, HFM, TM1, and Cockpit
Master Data
- Perform daily data quality checks, corrections, and reconciliations (before and during closing and planning cycles)
- Work closely with Navigators to keep mappings/allocations in TM1 up to date (aligning any changes with business FP&A leads)
- Maintain master data (e.g. profit centres, creation of new NPDs)
Qualifications
- 4-6 years of experience in a Finance position (experience in FOBO business a plus)
- BA required (Business/Finance or IT)
- TM1 experience is a must
- Comfortable dealing with big/complex data
- Detail oriented, with strong analytical skills (quick understanding of E2E process/data flow analysis)
- Tech savvy and passionate about systems and digital tools
- Excellent communication, interpersonal, and stakeholder management skills
- 100% fluent in English
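The systems-reconciliation duty above amounts to checking that the same financial figures appear in every system. A minimal, tool-agnostic sketch in Python; the system names, accounts, and tolerance below are illustrative, not part of the actual TM1/HFM/Cockpit toolchain:

```python
def reconcile(extracts, tolerance=0.01):
    """Compare account totals from several systems against a reference system.

    `extracts` maps system name -> {account: value}; the first system listed
    is treated as the source of truth (e.g. the ERP).
    """
    systems = list(extracts)
    reference = systems[0]
    mismatches = []
    for account, ref_value in extracts[reference].items():
        for system in systems[1:]:
            other = extracts[system].get(account)
            if other is None or abs(other - ref_value) > tolerance:
                mismatches.append(f"{account} differs in {system}: {ref_value} vs {other}")
    return mismatches

# Hypothetical daily extracts; Cockpit's opex drifted during export.
extracts = {
    "ERP":     {"revenue": 1200.50, "opex": 300.00},
    "TM1":     {"revenue": 1200.50, "opex": 300.00},
    "Cockpit": {"revenue": 1200.50, "opex": 299.10},
}
issues = reconcile(extracts)
```

In practice the dictionaries would be loaded from each system's export files before closing; the check itself stays this simple.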
Posted 1 week ago
6.0 - 9.0 years
8 - 11 Lacs
Bengaluru
Work from Office
- Optimize existing ETL processes, ensuring scalability, performance, and reliability.
- Identify data transformation opportunities and implement solutions to improve data quality, governance, and operational efficiency.
- Troubleshoot and resolve ETL failures, performance issues, and integration challenges.
- Identify performance optimization areas by analysing ETL and other connected services.
- Work closely with data architects, engineers, and business stakeholders to understand requirements and deliver solutions.
- Ensure data integrity, security, and compliance with organizational and industry standards.
- Document ETL workflows, configurations, and best practices.
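Troubleshooting ETL failures, as described above, usually starts with making each step's failures visible and retryable. A minimal, tool-agnostic sketch; the step name, retry policy, and the flaky extract function are hypothetical examples, not any specific ETL product's API:

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")

def run_step(name, fn, retries=3, backoff=0.01):
    """Run one ETL step, logging failures and retrying with exponential backoff."""
    for attempt in range(1, retries + 1):
        try:
            return fn()
        except Exception as exc:
            log.warning("step %s failed (attempt %d/%d): %s", name, attempt, retries, exc)
            if attempt == retries:
                raise  # exhausted retries: surface the failure to the scheduler
            time.sleep(backoff * 2 ** (attempt - 1))

# Hypothetical flaky source: fails twice with a timeout, then succeeds.
calls = {"n": 0}
def flaky_extract():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("source timeout")
    return [{"id": 1}, {"id": 2}]

rows = run_step("extract", flaky_extract)
```

The same wrapper pattern applies whether the step is a SQL load, an API pull, or a file transfer; the log lines are what make root-cause analysis possible later.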
Posted 1 week ago
8.0 - 12.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Job Title: Oracle & MongoDB DBA
Experience: 8-12 Years
Location: Bangalore

Technical Skills:
- Hands-on expertise in data migration from on-prem databases to AWS cloud RDS.
- Experience using AWS Data Migration Service (DMS).
- Experience in export/import of very large database schemas, full load plus CDC.
- Knowledge of and experience with Unix commands and writing task-automation shell scripts.
- Knowledge of and experience with different backup/restore methods and backup tools.
- Hands-on DBA skills: database monitoring, performance tuning, and DB refresh.
- Hands-on experience in database support.
- Hands-on expertise in data migration from on-prem databases to MongoDB Atlas.
- Experience creating clusters, databases, and users.
- Good command of database query languages and architecture.
- Experience converting schemas from one database to another is an added advantage.
- Experience in database and server consolidation.
- Strong hands-on experience in building logical data models, data quality, and data security; understanding of the application lifecycle and building service continuity documents.
- Responsible for building the knowledge base: runbooks, cheat sheets, DR drill books, and escalation procedures.
- Database refreshes from Production to Acceptance/Development environments.
- Coordinate with the infrastructure/application teams to obtain required information.
- Evidence gathering for audits.

Non-Technical Skills:
- Good team player.
- Ownership: an individual performer able to take on deliverables and handle fresh challenges.
- Service/customer orientation; strong oral and written communication skills are mandatory.
- Confident and capable of speaking with clients and onsite teams.
- Effective interpersonal, team-building, and communication skills.
- Ability to collaborate: communicate clearly and concisely to both laypeople and peers, follow instructions, and make a team stronger for your presence, not weaker.
- Ready to work in rotating shifts (morning, general, and afternoon).
- Ability to see the bigger picture and differing perspectives; to compromise, balance competing priorities, and prioritize the user.
- Desire for continuous improvement of the worthy sort: always learning and seeking improvement, avoiding change aversion and excessive conservatism, and equally avoiding harmful perfectionism, "not-invented-here" syndrome, and damaging pursuit of the bleeding edge for its own sake.
- Learns quickly, even when working outside the area of expertise.
- Able to analyze a problem and recognize exactly what will be affected by even the smallest change made in the database.
- Able to communicate complex technology to a non-technical audience in a simple and precise manner.
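The "full load plus CDC" pattern required above means copying the whole table once, then replaying change events against the target. A simplified in-memory illustration of the idea; AWS DMS does this at the replication level, and the event format shown here is made up for the sketch:

```python
def full_load(source_rows):
    """Initial copy: key every source row by its primary key."""
    return {row["id"]: row for row in source_rows}

def apply_cdc(target, events):
    """Replay ordered change-data-capture events onto the target table."""
    for op, row in events:
        if op in ("insert", "update"):
            target[row["id"]] = row          # upsert the changed row
        elif op == "delete":
            target.pop(row["id"], None)      # tolerate already-deleted rows
    return target

# Hypothetical source table and the changes captured after the full load.
source = [{"id": 1, "name": "alice"}, {"id": 2, "name": "bob"}]
target = full_load(source)
events = [
    ("update", {"id": 2, "name": "bobby"}),
    ("insert", {"id": 3, "name": "carol"}),
    ("delete", {"id": 1}),
]
apply_cdc(target, events)
```

The point of the pattern is that the source stays online during the long full load; CDC then catches the target up to a consistent cutover point.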
Posted 1 week ago
8.0 - 13.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Job Title: Oracle & MongoDB DBA
Experience: 8-16 Years
Location: Bangalore

Must have a 4-year degree (Computer Science, Information Systems, or equivalent) and 8+ years overall IT experience (5+ years as a DBA).

Technical Skills:
- Hands-on expertise in data migration from on-prem databases to AWS cloud RDS.
- Experience using AWS Data Migration Service (DMS).
- Experience in export/import of very large database schemas, full load plus CDC.
- Knowledge of and experience with Unix commands and writing task-automation shell scripts.
- Knowledge of and experience with different backup/restore methods and backup tools.
- Hands-on DBA skills: database monitoring, performance tuning, and DB refresh.
- Hands-on experience in database support.
- Hands-on expertise in data migration from on-prem databases to MongoDB Atlas.
- Experience creating clusters, databases, and users.
- Good command of database query languages and architecture.
- Experience converting schemas from one database to another is an added advantage.
- Experience in database and server consolidation.
- Strong hands-on experience in building logical data models, data quality, and data security; understanding of the application lifecycle and building service continuity documents.
- Responsible for building the knowledge base: runbooks, cheat sheets, DR drill books, and escalation procedures.
- Database refreshes from Production to Acceptance/Development environments.
- Coordinate with the infrastructure/application teams to obtain required information.
- Evidence gathering for audits.

Non-Technical Skills:
- Good team player.
- Ownership: an individual performer able to take on deliverables and handle fresh challenges.
- Service/customer orientation; strong oral and written communication skills are mandatory.
- Confident and capable of speaking with clients and onsite teams.
- Effective interpersonal, team-building, and communication skills.
- Ability to collaborate: communicate clearly and concisely to both laypeople and peers, follow instructions, and make a team stronger for your presence, not weaker.
- Ready to work in rotating shifts (morning, general, and afternoon).
- Ability to see the bigger picture and differing perspectives; to compromise, balance competing priorities, and prioritize the user.
- Desire for continuous improvement of the worthy sort: always learning and seeking improvement, avoiding change aversion and excessive conservatism, and equally avoiding harmful perfectionism, 'not-invented-here' syndrome, and damaging pursuit of the bleeding edge for its own sake.
- Learns quickly, even when working outside the area of expertise.
- Able to analyze a problem and recognize exactly what will be affected by even the smallest change made in the database.
- Able to communicate complex technology to a non-technical audience in a simple and precise manner.

Skills:
- Primary competency: Data Engineering - Oracle Apps DBA (75%)
- Secondary competency: Data Engineering - MongoDB Apps DBA (25%)
Posted 1 week ago
4.0 - 8.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Job Title: Oracle & MongoDB DBA
Experience: 4-8 Years
Location: Bangalore

Must have a 4-year degree (Computer Science, Information Systems, or equivalent) and 8+ years overall IT experience (5+ years as a DBA).

Technical Skills:
- Hands-on expertise in data migration from on-prem databases to AWS cloud RDS.
- Experience using AWS Data Migration Service (DMS).
- Experience in export/import of very large database schemas, full load plus CDC.
- Knowledge of and experience with Unix commands and writing task-automation shell scripts.
- Knowledge of and experience with different backup/restore methods and backup tools.
- Hands-on DBA skills: database monitoring, performance tuning, and DB refresh.
- Hands-on experience in database support.
- Hands-on expertise in data migration from on-prem databases to MongoDB Atlas.
- Experience creating clusters, databases, and users.
- Good command of database query languages and architecture.
- Experience converting schemas from one database to another is an added advantage.
- Experience in database and server consolidation.
- Strong hands-on experience in building logical data models, data quality, and data security; understanding of the application lifecycle and building service continuity documents.
- Responsible for building the knowledge base: runbooks, cheat sheets, DR drill books, and escalation procedures.
- Database refreshes from Production to Acceptance/Development environments.
- Coordinate with the infrastructure/application teams to obtain required information.
- Evidence gathering for audits.

Non-Technical Skills:
- Good team player.
- Ownership: an individual performer able to take on deliverables and handle fresh challenges.
- Service/customer orientation; strong oral and written communication skills are mandatory.
- Confident and capable of speaking with clients and onsite teams.
- Effective interpersonal, team-building, and communication skills.
- Ability to collaborate: communicate clearly and concisely to both laypeople and peers, follow instructions, and make a team stronger for your presence, not weaker.
- Ready to work in rotating shifts (morning, general, and afternoon).
- Ability to see the bigger picture and differing perspectives; to compromise, balance competing priorities, and prioritize the user.
- Desire for continuous improvement of the worthy sort: always learning and seeking improvement, avoiding change aversion and excessive conservatism, and equally avoiding harmful perfectionism, "not-invented-here" syndrome, and damaging pursuit of the bleeding edge for its own sake.
- Learns quickly, even when working outside the area of expertise.
- Able to analyze a problem and recognize exactly what will be affected by even the smallest change made in the database.
- Able to communicate complex technology to a non-technical audience in a simple and precise manner.
Posted 1 week ago
0.0 - 5.0 years
1 - 5 Lacs
Bengaluru
Work from Office
Job Title: Data Engineer - DBT (Data Build Tool)
Experience: 0-5 Years
Location: Bengaluru

Job Responsibilities
- Assist in the design and implementation of a Snowflake-based analytics solution (data lake and data warehouse) on AWS: requirements definition, source data analysis and profiling, logical and physical design of the data lake and data warehouse, and design of data integration and publication pipelines.
- Develop Snowflake deployment and usage best practices.
- Help educate the rest of the team on the capabilities and limitations of Snowflake.
- Build and maintain data pipelines adhering to suggested enterprise architecture principles and guidelines.
- Design, build, test, and maintain data management systems.
- Work in sync with internal and external team members such as data architects, data scientists, and data analysts to handle all kinds of technical issues.
- Act as technical leader within the team.
- Work in an Agile/Lean model and deliver quality deliverables on time.
- Translate complex functional requirements into technical solutions.

Expertise and Qualifications
Essential skills, education, and experience:
- B.E./B.Tech./MCA or equivalent degree, along with 4-7 years of experience in Data Engineering.
- Strong experience with DBT concepts: model building and configuration, incremental load strategies, macros, and DBT tests.
- Strong experience in SQL.
- Strong experience in AWS.
- Creation and maintenance of an optimal data pipeline architecture for ingestion and processing of data.
- Creation of the necessary infrastructure for ETL jobs from a wide range of data sources using Talend, DBT, S3, and Snowflake.
- Experience with data storage technologies such as Amazon S3, SQL, and NoSQL.
- Data modeling technical awareness.
- Experience working with stakeholders in different time zones.

Good to have:
- AWS data services development experience.
- Working knowledge of Big Data technologies.
- Experience collaborating with data quality and data governance teams.
- Exposure to reporting tools like Tableau.
- Apache Airflow, Apache Kafka (nice to have).
- In-depth understanding of the Payments domain (CRM, Accounting, etc.).
- Regulatory reporting exposure.

Other skills:
- Good communication skills; team player; problem solver.
- Willing to learn new technologies, share your ideas, and assist other team members as needed.
- Strong analytical and problem-solving skills: the ability to define problems, collect data, establish facts, and draw conclusions.
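DBT's incremental load strategies, listed above as a required skill, boil down to merging only rows newer than a watermark into the already-built table. A plain-Python analogue of the "incremental with unique_key" strategy; the column names and dates are illustrative, and in DBT itself this logic lives in SQL/Jinja:

```python
def incremental_merge(existing, new_batch, key="id", updated_col="updated_at"):
    """Merge only rows newer than the current watermark, upserting on the key.

    Mirrors dbt's incremental materialization with a unique_key: rows at or
    before the max updated_at already loaded are skipped.
    """
    watermark = max((r[updated_col] for r in existing), default=None)
    table = {r[key]: r for r in existing}
    for row in new_batch:
        if watermark is None or row[updated_col] > watermark:
            table[row[key]] = row            # insert new key or overwrite existing
    return sorted(table.values(), key=lambda r: r[key])

existing = [
    {"id": 1, "amount": 10, "updated_at": "2024-01-01"},
    {"id": 2, "amount": 20, "updated_at": "2024-01-02"},
]
batch = [
    {"id": 2, "amount": 25, "updated_at": "2024-01-03"},  # newer: applied
    {"id": 3, "amount": 30, "updated_at": "2024-01-03"},  # new row: applied
    {"id": 1, "amount": 99, "updated_at": "2024-01-01"},  # not newer: skipped
]
result = incremental_merge(existing, batch)
```

The win over a full rebuild is that each run touches only the changed slice of a large table, which is exactly why the strategy matters at warehouse scale.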
Posted 1 week ago
0.0 - 5.0 years
3 - 5 Lacs
Bengaluru
Work from Office
We are looking for a qualified and detail-oriented GLP Archivist to support the implementation of the OECD Principles of Good Laboratory Practice (GLP). The successful candidate will be responsible for managing the archiving of scientific study records, ensuring compliance with international GLP standards, and supporting the integrity and traceability of non-clinical safety data. We invite motivated candidates with a passion for regulatory compliance and data stewardship to apply for this opportunity.
Roles and Responsibilities
- Manage archiving operations and procedures in accordance with the OECD Principles of GLP.
- Create and maintain archives of the collection for easy retrieval of records.
- Maintain a stable physical environment for the receipt, storage, and handling of the archival holdings.
- Knowledge of the OECD Principles of Good Laboratory Practice (GLP).
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
As an experienced IICS Developer, you will be responsible for supporting a critical data migration project from Oracle to Snowflake. This remote opportunity requires working night-shift hours to align with the U.S. team. Your primary focus will be on developing and optimizing ETL/ELT workflows, collaborating with architects and DBAs on schema conversion, and ensuring data quality, consistency, and validation throughout the migration. To excel in this role, you must have strong hands-on experience with IICS (Informatica Intelligent Cloud Services), a solid background in Oracle databases (including SQL, PL/SQL, and data modeling), and working knowledge of Snowflake, specifically data staging, architecture, and data loading. Your responsibilities will also include building mappings, tasks, and parameter files in IICS, and tuning data pipeline performance to improve efficiency. In addition, you will implement error handling, performance monitoring, and scheduling to support the migration effectively, and you will assist during the go-live phase and post-migration stabilization to ensure a seamless transition. This position offers the flexibility of engagement as either a contract or full-time role, based on availability and fit. The shift timings are from 7:30 PM IST to 1:30 AM EST, allowing you to collaborate effectively with the U.S. team.
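Validating data consistency during a migration like the one above is commonly done by comparing row counts and content fingerprints between source and target tables. A simplified, database-free sketch; in the real project the rows would come from the Oracle and Snowflake connectors, and the hashing scheme shown is just one reasonable choice:

```python
import hashlib

def table_fingerprint(rows):
    """Row count plus an order-insensitive hash over every row's contents."""
    digest = 0
    for row in rows:
        h = hashlib.sha256(repr(sorted(row.items())).encode()).hexdigest()
        digest ^= int(h[:16], 16)    # XOR makes the combined hash order-insensitive
    return len(rows), digest

def validate_migration(source_rows, target_rows):
    """True when both sides have the same rows, regardless of ordering."""
    return table_fingerprint(source_rows) == table_fingerprint(target_rows)

src = [{"id": 1, "v": "a"}, {"id": 2, "v": "b"}]
ok  = validate_migration(src, list(reversed(src)))    # order should not matter
bad = validate_migration(src, [{"id": 1, "v": "a"}])  # missing row is detected
```

Count-plus-fingerprint checks are cheap enough to run after every load cycle, which is what makes continuous validation during CDC feasible.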
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
As a Data Management Consultant at the SAP Success Delivery Center, you play a crucial role in supporting customers on their digital transformation journey by implementing Data Management solutions, including Data Migrations and Master Data Governance. Working as a techno-functional consultant, you will be an integral part of project teams responsible for delivering SAP implementations to clients. Your responsibilities include being hands-on with solutions, possessing good communication skills for engaging in business discussions, and having a functional understanding of Data Management. Prior development experience is considered an added advantage. While occasional travel may be required based on customer needs, the primary focus will be on remote and offshore delivery. One of your key objectives is to own or acquire relevant SAP Business AI skills to effectively position and deliver SAP's AI offerings to customers. Your role also involves enhancing the adoption and consumption of various SAP AI offerings within customer use cases. You will be joining the Data Management Solution Area within BTP Delivery @ Scale, a robust team of over 100 professionals delivering engagements across a wide range of Data Management topics such as Data Migration, Data Integration, Data Engineering, Data Governance, and Data Quality. At SAP, our innovations empower over four hundred thousand customers globally to collaborate more efficiently and leverage business insights effectively. Our company, known for its leadership in enterprise resource planning (ERP) software, has evolved into a market leader in end-to-end business application software and related services, including database, analytics, intelligent technologies, and experience management. With a cloud-based approach, two hundred million users, and a diverse workforce of over one hundred thousand employees worldwide, we are committed to being purpose-driven and future-focused.
Our culture emphasizes collaboration, personal development, and a strong team ethic. We believe in connecting global industries, people, and platforms to provide solutions for every challenge. At SAP, you have the opportunity to bring out your best. Diversity and inclusion are at the core of SAP's culture, with a focus on health, well-being, and flexible working models that ensure every individual, regardless of background, feels included and empowered to perform at their best. We believe in the strength that comes from the unique capabilities and qualities each person brings to our organization, and we invest in our employees to nurture confidence and unlock their full potential. SAP is dedicated to unleashing all talent and contributing to a better and more equitable world. SAP is an equal opportunity workplace and an affirmative action employer. We uphold the values of Equal Employment Opportunity and provide accessibility accommodations to applicants with physical and/or mental disabilities. If you require accommodation or special assistance to navigate our website or complete your application, please contact the Recruiting Operations Team at Careers@sap.com. For SAP employees, only permanent roles qualify for the SAP Employee Referral Program, subject to the eligibility criteria outlined in the SAP Referral Policy. Specific conditions may apply to roles in Vocational Training. EOE AA M/F/Vet/Disability: Successful candidates may undergo a background verification with an external vendor. Requisition ID: 422298 | Work Area: Consulting and Professional Services | Expected Travel: 0 - 10% | Career Status: Professional | Employment Type: Regular Full Time | Additional Locations: #LI-Hybrid.
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
noida, uttar pradesh
On-site
The Customer Excellence Advisory Lead (CEAL) role focuses on empowering customers to maximize the potential of their data through top-tier architectural guidance and design. As a part of the Oracle Analytics Service Excellence organization, the team comprises Solution Architects specializing in Oracle Analytics Cloud, Oracle Analytics Server, and Fusion Data Intelligence. The primary objective is to ensure the successful adoption of Oracle Analytics by engaging with customers and partners globally to build trust in the platform. Collaboration with Product Management is key to enhancing product offerings and sharing insights through various mediums such as blogs, webinars, and demonstrations. The ideal candidate will work closely with strategic FDI customers and partners to guide them towards optimized implementations and develop Go-live plans geared towards achieving high usage levels. This position is classified as Career Level - IC4. Responsibilities include proactively identifying customer requirements, uncovering unaddressed needs, and devising potential solutions across different customer segments. The role involves assisting in shaping complex product and program strategies based on customer interactions and effectively implementing scalable solutions and projects for customers operating in diverse enterprise environments. Collaboration with customers and internal stakeholders to communicate strategies, synchronize solution implementation timelines, provide updates, and adjust plans according to evolving objectives is vital. Additionally, preparing for complex product or solution-related inquiries and challenges that customers may present, gathering detailed product insights based on customer needs, and promoting understanding of customer complexities and the value propositions of various programs are key responsibilities. 
Primary Skills required for this role include:
- Over 4 years of experience with OBIA and Oracle Analytics
- Robust knowledge of Analytics RPD design, development, and deployment
- Strong understanding of BI/data warehouse analysis, design, development, and testing
- Extensive experience in data analysis, data profiling, data quality, data modeling, and data integration
- Proficiency in crafting complex queries and stored procedures using Oracle SQL and Oracle PL/SQL
- Skilled in developing visualizations and user-friendly workbooks
- Previous experience in developing solutions incorporating AI and ML using Analytics
- Experience in enhancing report performance
Desirable Skills:
- Experience with Fusion Applications (ERP/HCM/SCM/CX)
- Ability to design and develop ETL interfaces, packages, load plans, user functions, variables, and sequences in ODI for batch and real-time data integrations
- Experience with multiple cloud platforms
- Certification on FDI, OAC, and ADW
Qualifications: Career Level - IC4
About Us:
Oracle, a global leader in cloud solutions, leverages cutting-edge technology to address present-day challenges. With over 40 years of experience, Oracle partners with industry leaders across various sectors and continues to thrive by operating with integrity. The company is committed to fostering an inclusive workforce that promotes opportunities for all, recognizing that true innovation flourishes when everyone is empowered to contribute. Oracle offers competitive benefits based on parity and consistency, supporting employees with flexible medical, life insurance, and retirement options. The organization also encourages community involvement through volunteer programs. Commitment to inclusivity extends to people with disabilities at all stages of the employment process. For accessibility assistance or accommodation for a disability, individuals can reach out via email at accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States.
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
haryana
On-site
You will be working as a Databricks Developer with 3-6 years of experience, located in India. Joining the data engineering and AI innovation team, your main responsibilities will include: developing scalable data pipelines using Databricks and Apache Spark; implementing AI/ML workflows with tools like MLflow and AutoML; collaborating with data scientists to deploy models into production; performing ETL development, data transformation, and model training pipeline work; managing the Delta Lake architecture; and working closely with cross-functional teams to ensure data quality and governance.
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
As a Data Engineer at our company, you will be responsible for handling ETL processes using PySpark, SQL, Microsoft Fabric, and other relevant technologies. You will collaborate with clients and stakeholders to understand data requirements and devise efficient data models and solutions. Optimizing and tuning existing data pipelines for performance and scalability will be a crucial part of your role, as will ensuring data quality and integrity throughout the pipeline and documenting technical designs, processes, and procedures. You are expected to stay current with emerging technologies and best practices in data engineering and to contribute to building CI/CD pipelines using GitHub. To qualify for this role, you should hold a Bachelor's degree in computer science, engineering, or a related field, along with a minimum of 3 years of experience in data engineering or a similar role. A strong understanding of ETL concepts and best practices is required, as is proficiency in Azure Synapse, Microsoft Fabric, and other data processing technologies. Experience with cloud-based data platforms such as Azure or AWS, knowledge of data warehousing concepts and methodologies, and proficiency in Python, PySpark, and SQL for data manipulation and scripting are also essential. Desirable qualifications include experience with data lake concepts, familiarity with data visualization tools like Power BI or Tableau, and certifications in relevant technologies such as Microsoft Certified: Azure Data Engineer Associate. Our company offers various benefits including group medical insurance, a cab facility, meals/snacks, and a continuous learning program. Stratacent is a global IT consulting and services firm with headquarters in Jersey City, NJ, and global delivery centers in Pune and Gurugram, along with offices in the USA, London, Canada, and South Africa.
Specializing in Financial Services, Insurance, Healthcare, and Life Sciences, we assist our customers in their transformation journey by providing services in Information Security, Cloud Services, Data and AI, Automation, Application Development, and IT Operations. For more information, you can visit our website at http://stratacent.com.
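"Ensuring data quality and integrity throughout the pipeline," as this role requires, usually means codified checks that run between pipeline stages. A minimal plain-Python sketch; in the role itself these would be PySpark or SQL checks, and the rule set and column names here are purely illustrative:

```python
def check_quality(rows, required=("id", "amount"), unique_key="id"):
    """Return a report of data-quality violations found in one batch of rows."""
    problems = {"missing_fields": [], "duplicate_keys": []}
    seen = set()
    for i, row in enumerate(rows):
        for col in required:
            if row.get(col) is None:                     # null or absent column
                problems["missing_fields"].append((i, col))
        key = row.get(unique_key)
        if key in seen:                                  # primary-key collision
            problems["duplicate_keys"].append(key)
        seen.add(key)
    return problems

# Hypothetical batch: one null amount, one duplicated id.
batch = [
    {"id": 1, "amount": 10.0},
    {"id": 2, "amount": None},
    {"id": 1, "amount": 5.0},
]
report = check_quality(batch)
```

A pipeline would typically fail fast (or quarantine the batch) when the report is non-empty, rather than letting bad rows propagate downstream.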
Posted 2 weeks ago
0.0 - 4.0 years
0 Lacs
karnataka
On-site
We are looking for someone who is enthusiastic to contribute to the implementation of a metadata-driven platform managing the full lifecycle of batch and streaming Big Data pipelines. This role involves applying ML and AI techniques in data management, such as anomaly detection for identifying and resolving data quality issues and data discovery. The platform facilitates the delivery of Visa's core data assets to both internal and external customers. You will provide Platform-as-a-Service offerings that are easy to consume, scalable, secure, and reliable using open source-based Cloud solutions for Big Data technologies. Working at the intersection of infrastructure and software engineering, you will design and deploy data and pipeline management frameworks using open-source components like Hadoop, Hive, Spark, HBase, Kafka streaming, and other Big Data technologies. Collaboration with various teams is essential to build and maintain innovative, reliable, secure, and cost-effective distributed solutions. Facilitating knowledge transfer to the Engineering and Operations team, you will work on technical challenges and process improvements with geographically distributed teams. Your responsibilities will include designing and implementing agile-innovative data pipeline and workflow management solutions that leverage technology advances for cost reduction, standardization, and commoditization. Driving the adoption of open standard toolsets to reduce complexity and support operational goals for increasing automation across the enterprise is a key aspect of this role. As a champion for the adoption of open infrastructure solutions that are fit for purpose, you will keep technology relevant. The role involves spending 80% of the time writing code in different languages, frameworks, and technology stacks. At Visa, your uniqueness is valued. 
Working here provides an opportunity to make a global impact, invest in your career growth, and be part of an inclusive and diverse workplace. Join our global team of disruptors, trailblazers, innovators, and risk-takers who are driving economic growth worldwide, moving the industry forward creatively, and engaging in meaningful work that brings financial literacy and digital commerce to millions of unbanked and underserved consumers. This position is hybrid, and the expectation of days in the office will be confirmed by your hiring manager.
**Basic Qualifications**:
- Minimum of 6 months of work experience or a bachelor's degree
- Bachelor's degree in Computer Science, Computer Engineering, or a related field
- Good understanding of data structures and algorithms
- Good analytical and problem-solving skills
**Preferred Qualifications**:
- 1 or more years of work experience or an Advanced Degree (e.g., Masters) in Computer Science
- Excellent programming skills with experience in at least one of the following: Python, Node.js, Java, Scala, GoLang
- MVC (model-view-controller) for end-to-end development
- Knowledge of SQL/NoSQL technology; familiarity with databases like Oracle, DB2, SQL Server, etc.
- Proficiency in Unix-based operating systems and bash scripts
- Strong communication skills, including clear and concise written and spoken communications with professional judgment
- Team player with excellent interpersonal skills
- Demonstrated ability to lead and navigate through ambiguity
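As a concrete instance of the anomaly-detection-for-data-quality idea this posting mentions, a z-score check can flag a daily pipeline metric that jumps far outside its recent history. A minimal sketch; the threshold and the row-count series below are made-up examples:

```python
import statistics

def zscore_anomalies(values, threshold=3.0):
    """Flag indices whose value lies more than `threshold` std-devs from the mean."""
    mean = statistics.fmean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:
        return []          # a perfectly flat series has nothing to flag
    return [i for i, v in enumerate(values) if abs(v - mean) / stdev > threshold]

# Hypothetical daily row counts for one feed; day 6 collapsed to almost nothing.
row_counts = [1000, 1020, 990, 1010, 1005, 995, 10]
alerts = zscore_anomalies(row_counts, threshold=2.0)
```

Real data-quality platforms use richer models (seasonality, per-partition baselines), but this is the core statistical move: learn a baseline, then alert on large deviations instead of hand-written thresholds.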
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
karnataka
On-site
The Junior Process & Solution Key User - Vendor Master Data role involves supporting process and solution development, improvements, and implementation of standard processes on a local site and organizational unit. Your responsibilities will include ensuring accurate and consistent Vendor Master Data practices, supporting data governance initiatives, and maintaining data quality. You will be required to bring business knowledge and requirements from all users to the Business Process Developer/Solution Leader for process and solution development and improvement activities. Additionally, you will develop and maintain Vendor master data management processes and standards, conduct data quality assignments, and analyze business issues from a process and solution perspective. As a Process & Solution Key User, you will participate in acceptance tests, approve/reject user acceptance tests for new solution releases, and identify root causes for process and solution improvement areas. You will also be responsible for collecting, analyzing, proposing, and prioritizing change requests from users, as well as communicating and anchoring process/solution improvement proposals. To be successful in this role, you should have a minimum of 4 years of professional experience in the accounting area, with Vendor Master Data experience strongly preferred. Strong organizational and time management skills, effective communication skills (both written and verbal), and the ability to work in shifts are essential requirements. Being detail-oriented, having a professional attitude, and being reliable are also important characteristics for this role. Knowledge of various SAP ECC or S/4 systems and proficiency in Microsoft Office are necessary for this position. Additionally, you will need to ensure Internal Control compliance and External Audit requirements, perform process training, and provide support to end users. 
Joining Volvo Group offers you the opportunity to work with a global team of talented individuals dedicated to shaping the future of efficient, safe, and sustainable transport solutions. As part of Group Finance, you will contribute to realizing the vision of the Volvo Group by providing expert financial services and working collaboratively with a diverse team of professionals. If you are passionate about making a difference in the world of transport and have the required skills and experience, we encourage you to apply for this opportunity and be a part of our mission to leave a positive impact on society for the next generation.
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
Karnataka
On-site
You are a strategic thinker passionate about driving solutions in business architecture and data management. You have found the right team.

As a Banking Book Product Owner Analyst in our Firmwide Finance Business Architecture (FFBA) team, you will spend each day defining, refining, and delivering set goals for our firm. You will partner with stakeholders across various lines of business and subject matter experts to understand products, data, source system flows, and business requirements related to Finance and Risk applications and infrastructure.

As a Product Owner on the Business Architecture team, you will work closely with Line of Business stakeholders, data Subject Matter Experts, Consumers, and technology teams across Finance, Credit Risk & Treasury, and various Program Management teams. Your primary responsibilities will include prioritizing the traditional credit product book of work, developing roadmaps, and delivering on multiple projects and programs during monthly releases. Your expertise in data analysis and knowledge will be instrumental in identifying trends, optimizing processes, and driving business growth. As our organization grows, so does our reliance on insightful, data-driven decisions. You will dissect complex datasets to unearth actionable insights while possessing a strong understanding of data governance, data quality, and data management principles.

Utilize Agile Framework to write business requirements in the form of user stories to enhance data, test execution, reporting automation, and digital analytics toolsets. Engage with development teams to translate business needs into technical specifications, ensuring acceptance criteria are met. Drive adherence to product and Release Management standards and operating models. Manage the release plan, including scope, milestones, sourcing requirements, test strategy, execution, and stakeholder activities.
Collaborate with lines of business to understand products, data capture methods, and strategic data sourcing into a cloud-based big data architecture. Identify and implement solutions for business process improvements, creating supporting documentation and enhancing end-user experience. Collaborate with Implementation leads, Release managers, Project managers, and data SMEs to align data and system flows with Finance and Risk applications. Oversee the entire Software Development Life Cycle (SDLC) from requirements gathering to testing and deployment, ensuring seamless integration and execution.

Required qualifications, capabilities, and skills:
- Bachelor's degree with 3+ years of experience in Project Management or Product Ownership, with a focus on process re-engineering.
- Proven experience as a Product Owner with a strong understanding of agile principles and delivering complex programs.
- Strong analytical and problem-solving abilities, with the capacity to quickly assimilate business and technical knowledge.
- Experience in Finance, Risk, or Operations as a Product Lead.
- Familiarity with Traditional Credit Products and Liquidity and Credit reporting data.
- Highly responsible, detail-oriented, and able to work with tight deadlines.
- Excellent written and verbal communication skills, with the ability to articulate complex concepts to diverse audiences.
- Strong organizational abilities to manage multiple work streams concurrently, maintaining sound judgment and a risk mindset.
- Solid understanding of financial and regulatory reporting processes.
- Energetic, adaptable, self-motivated, and effective under pressure.
- Basic knowledge of cloud technologies (e.g., AWS).

Preferred qualifications, capabilities, and skills:
- Knowledge of JIRA, SQL, the Microsoft suite of applications, Databricks, and data visualization/analytical tools (Tableau, Alteryx, Python) is a plus.
- Knowledge and experience of Traditional Credit Products (Loans, Deposits, Cash, etc.) and Trading Products (Derivatives and Securities) a plus.
Posted 2 weeks ago
5.0 - 12.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
You are a skilled Snowflake Developer with a strong background in Data Warehousing (DWH), SQL, Informatica, Power BI, ETL, and related tools. With a minimum of 5 years of experience in Data Engineering, you have expertise in designing, developing, and maintaining data pipelines, integrating data across multiple platforms, and optimizing large-scale data architectures. This role offers you an exciting opportunity to work with cutting-edge technologies in a collaborative environment and contribute to building scalable, high-performance data solutions.

Your responsibilities will include:
- Developing and maintaining data pipelines using Snowflake, Fivetran, and DBT for efficient ELT processes across various data sources.
- Writing complex, scalable SQL queries, including stored procedures, to support data transformation, reporting, and analysis.
- Implementing advanced data modeling techniques, such as Slowly Changing Dimensions (SCD Type-2), using DBT, and designing high-performance data architectures.
- Collaborating with business stakeholders to understand data needs and translating business requirements into technical solutions.
- Performing root cause analysis on data-related issues, ensuring effective resolution, and maintaining high data quality standards.
- Working closely with cross-functional teams to integrate data solutions and creating clear documentation for data processes and models.

Your qualifications should include:
- Expertise in Snowflake for data warehousing and ELT processes.
- Strong proficiency in SQL for relational databases and writing complex queries.
- Experience with Informatica PowerCenter for data integration and ETL development.
- Proficiency in using Power BI for data visualization and business intelligence reporting.
- Familiarity with Sigma Computing, Tableau, Oracle, DBT, and cloud services like Azure, AWS, or GCP.
- Experience with workflow management tools such as Airflow, Azkaban, or Luigi.
- Proficiency in Python for data processing (knowledge of other languages like Java or Scala is a plus).

Education required for this role is a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related quantitative field. This position will be based in Bangalore, Chennai, Kolkata, or Pune. If you meet the above requirements and are passionate about data engineering and analytics, this is an excellent opportunity to leverage your skills and contribute to impactful data solutions.
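The posting names Slowly Changing Dimensions (SCD Type-2) as a key modeling technique. For readers unfamiliar with it, here is a minimal plain-Python sketch of the idea (the role itself would implement this with DBT snapshots in Snowflake; the function and field names below are illustrative assumptions, not taken from the posting):

```python
from datetime import date

def scd2_merge(dim_rows, updates, key, tracked, load_date):
    """SCD Type-2: instead of overwriting a changed attribute, expire the
    old row (set valid_to) and append a new current row, preserving full
    history. Illustrative sketch only."""
    result = [dict(r) for r in dim_rows]
    # index the current (open-ended) version of each business key
    current = {r[key]: r for r in result if r["valid_to"] is None}
    for u in updates:
        cur = current.get(u[key])
        if cur is None:
            # brand-new business key: insert as the current version
            result.append({**u, "valid_from": load_date, "valid_to": None})
        elif any(cur[c] != u[c] for c in tracked):
            # a tracked attribute changed: close the old row, open a new one
            cur["valid_to"] = load_date
            result.append({**u, "valid_from": load_date, "valid_to": None})
        # unchanged rows are left untouched
    return result

dim = [{"customer_id": 1, "city": "Chennai",
        "valid_from": date(2024, 1, 1), "valid_to": None}]
updates = [{"customer_id": 1, "city": "Pune"},
           {"customer_id": 2, "city": "Mumbai"}]
history = scd2_merge(dim, updates, "customer_id", ["city"], date(2025, 1, 1))
# history holds one expired row (Chennai) and two current rows (Pune, Mumbai)
```

In DBT the same effect is achieved declaratively with a snapshot using the `check` or `timestamp` strategy; the sketch just makes the expire-and-append mechanics explicit.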
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
Karnataka
On-site
As an SAP Archiving and ILM Specialist, you will be responsible for SAP archiving and Information Lifecycle Management (ILM). You should have over 6 years of SAP experience with a primary focus on these areas. Your technical skills should include a robust understanding of SAP archiving concepts such as archiving classes, profiles, and retention periods. Experience with SAP ILM and its configuration is essential for this role, and knowledge of SAP data management, data governance, and data quality is also required. Strong analytical and problem-solving skills will be beneficial in this position. Understanding business requirements and data retention needs is crucial, as you will need to communicate technical concepts to non-technical stakeholders effectively.

This position is based in Bangalore and can be either permanent or contractual based on your preference. If you meet the experience criteria and have the necessary technical skills, kindly submit your profile to shweta@skilltasy.com for consideration.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
As a Data Operations Analyst at First Citizens India, you will play a crucial role in ensuring the smooth operations and support of key data deliverables at Silicon Valley Bank. Your responsibilities will include monitoring data quality controls, addressing data quality issues, managing user inquiries, and facilitating data certification. Additionally, you will be expected to demonstrate thought leadership and provide guidance to junior team members.

Your essential functions will involve collaborating with business and technology teams to prioritize and resolve data-related problems. You will engage partners or vendors as necessary to address issues, open service requests for remediation, and uphold a high standard of data quality. Identifying anomalies in data quality and resolving escalated issues will be key aspects of your role. Furthermore, you will support project planning and management to ensure adherence to best practices in data governance.

In this position, you will have the autonomy to define and implement data quality controls, as well as prioritize data issues for resolution. Your recommendations will focus on maintaining consistent definitions, adherence to standards, and addressing security requirements. Possessing strong analytical skills, attention to detail, and effective communication will be essential for success in this role. Proficiency in SQL query skills for independent data analysis and the ability to collaborate across different workstreams are desired qualities.

As a Data Operations Analyst, you will be expected to exhibit critical thinking, thought leadership, and a proactive approach to driving change and achieving results. Your passion for establishing data governance practices and your ability to provide guidance to junior team members will be crucial for the success of the team.
A Bachelor's degree in a relevant field, 5-7 years of experience in the financial industry, and proficiency in production data operations are required for this role. Experience with regulated systems and data governance tools will be advantageous. Join us at First Citizens India as we continue our mission of delivering tailored business solutions and driving innovation in the global banking technology and business services industry.
Posted 2 weeks ago
1.0 - 5.0 years
0 Lacs
Karnataka
On-site
At Goldman Sachs, our Engineers don't just make things - we make things possible. We change the world by connecting people and capital with ideas, solving the most challenging and pressing engineering problems for our clients. Join our engineering teams that build massively scalable software and systems, architect low latency infrastructure solutions, proactively guard against cyber threats, and leverage machine learning alongside financial engineering to continuously turn data into action. Create new businesses, transform finance, and explore a world of opportunity at the speed of markets.

Engineering, which is comprised of our Technology Division and global strategists groups, is at the critical center of our business. Our dynamic environment requires innovative strategic thinking and immediate, real solutions. If you want to push the limit of digital possibilities, start here. Goldman Sachs Engineers are innovators and problem-solvers, building solutions in risk management, big data, mobile, and more. We look for creative collaborators who evolve, adapt to change, and thrive in a fast-paced global environment.

Data plays a critical role in every facet of the Goldman Sachs business. The Data Engineering group is at the core of that offering, focusing on providing the platform, processes, and governance for enabling the availability of clean, organized, and impactful data to scale, streamline, and empower our core businesses.

As a Site Reliability Engineer (SRE) on the Data Engineering team, you will be responsible for observability, cost, and capacity with operational accountability for some of Goldman Sachs's largest data platforms. We engage in the full lifecycle of platforms from design to demise with an SRE strategy adapted to the lifecycle. We are looking for individuals with a background as a developer who can express themselves in code.
You should have a focus on Reliability, Observability, Capacity Management, DevOps, and SDLC (Software Development Lifecycle). As a self-leader comfortable with problem statements, you should structure them into data-driven deliverables. You will drive strategy with skin in the game, participate in the team's activities, drive Postmortems, and have an attitude that the problem stops with you.

**How You Will Fulfil Your Potential**
- Drive adoption of cloud technology for data processing and warehousing
- Drive SRE strategy for some of GS's largest platforms including Lakehouse and Data Lake
- Engage with data consumers and producers to match reliability and cost requirements
- Drive strategy with data

**Relevant Technologies**: Snowflake, AWS, Grafana, PromQL, Python, Java, OpenTelemetry, GitLab

**Basic Qualifications**
- A Bachelor's or Master's degree in a computational field (Computer Science, Applied Mathematics, Engineering, or a related quantitative discipline)
- 1-4+ years of relevant work experience in a team-focused environment
- 1-2 years of hands-on developer experience at some point in your career
- Understanding and experience of DevOps and SRE principles and automation, managing technical and operational risk
- Experience with cloud infrastructure (AWS, Azure, or GCP)
- Proven experience in driving strategy with data
- Deep understanding of the multi-dimensionality of data, data curation, and data quality
- In-depth knowledge of relational and columnar SQL databases, including database design
- Expertise in data warehousing concepts
- Excellent communication skills
- Independent thinker, willing to engage, challenge, or learn
- Ability to stay commercially focused and to always push for quantifiable commercial impact
- Strong work ethic, a sense of ownership and urgency
- Strong analytical and problem-solving skills
- Ability to build trusted partnerships with key contacts and users across business and engineering teams

**Preferred Qualifications**
- Understanding of Data Lake / Lakehouse technologies, including Apache Iceberg
- Experience with cloud databases (e.g., Snowflake, BigQuery)
- Understanding of data modeling concepts
- Working knowledge of open-source tools such as AWS Lambda and Prometheus
- Experience coding in Java or Python
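Since the role centers on reliability and observability, a worked example of one standard SRE calculation may help frame it: an availability SLO implies an error budget, the amount of unavailability the service is allowed over a window. This sketch is generic SRE arithmetic, not taken from the posting, and the function names are illustrative:

```python
def error_budget_minutes(slo, window_days=30):
    """Unavailability permitted by an availability SLO over a rolling
    window, in minutes. E.g. a 99.9% SLO over 30 days leaves ~43.2 min."""
    total_minutes = window_days * 24 * 60
    return (1 - slo) * total_minutes

def budget_remaining(slo, downtime_minutes, window_days=30):
    """Fraction of the error budget still unspent (negative = SLO breached)."""
    budget = error_budget_minutes(slo, window_days)
    return (budget - downtime_minutes) / budget
```

With a 99.9% SLO over 30 days, `error_budget_minutes(0.999)` is about 43.2 minutes, and 21.6 minutes of recorded downtime leaves roughly half the budget; tooling like Grafana/PromQL dashboards typically tracks this burn rate continuously.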
Posted 2 weeks ago
3.0 - 8.0 years
5 - 9 Lacs
Mumbai
Work from Office
Assist in addressing and resolving data quality issues reported by users and data stakeholders, including anomalies, inconsistencies, and data quality concerns. Participate in the monitoring, tracking, and management of data quality incidents and requests, ensuring timely and effective responses. Assist in creating and maintaining comprehensive documentation of data quality issues, resolutions, and best practices for reference and training. Support the Data Quality Service Desk Lead in managing and escalating complex data quality issues to the relevant teams, ensuring appropriate actions are taken. Assist in tracking and reporting on data quality incidents and metrics to ensure transparency and accountability.

Skills: Excel, a basic understanding of SQL, and good communication and aptitude.

Qualifications: Graduate

Additional Information: 100% Work from Office (24x7). No mobile phones or storage devices allowed on the floor. Rotational shifts. The current office location is Vikhroli; however, the team will move to Thane (GB) in a few months.
Posted 2 weeks ago
8.0 - 10.0 years
10 - 20 Lacs
Pune
Remote
Job Summary: We are seeking an experienced Azure Data Governance Specialist to design, implement, and manage data governance frameworks and infrastructure across Azure-based platforms. The ideal candidate will ensure enterprise data is high-quality, secure, compliant, and aligned with business and regulatory requirements. This role combines deep technical expertise in Azure with a strong understanding of data governance principles, MDM, and data quality management.

Key Responsibilities:
- Data Governance & Compliance: Design and enforce data governance policies, standards, and frameworks aligned with enterprise objectives and compliance requirements (e.g., GDPR, HIPAA).
- Master Data Management (MDM): Implement and manage MDM strategies and solutions within the Azure ecosystem to ensure consistency, accuracy, and accountability of key business data.
- Azure Data Architecture: Develop and maintain scalable data architecture on Azure (e.g., Azure Data Lake, Synapse, Purview, Alation, Anomalo) to support governance needs.
- Tooling & Automation: Deploy and manage Azure-native data governance tools such as Azure Purview, Microsoft Fabric, and Data Factory to classify, catalog, and monitor data assets, including third-party tools like Alation.
- Data Quality (DQ): Lead and contribute to Data Quality forums, establish DQ metrics, and integrate DQ checks and dashboards within Azure platforms.
- Security & Access Management: Collaborate with security teams to implement data security measures, role-based access controls, and data encryption in accordance with Azure best practices.
- Technical Leadership: Guide teams in best practices for designing data pipelines, metadata management, and lineage tracking with Azure tooling.
- Continuous Improvement: Drive improvements in data management processes and tooling to enhance governance efficiency and compliance posture.
- Mentorship & Collaboration: Provide technical mentorship to data engineers and analysts, promoting data stewardship and governance awareness across the organization.

Qualifications:
- Education: Bachelor's degree in Computer Science, Information Systems, or a related field.
- Experience: 8+ years of experience in data infrastructure and governance, with 3+ years focused on Azure data services and tools.
- Technical Skills: Proficiency with data governance tools: Alation, Purview, Synapse, Data Factory, Azure SQL, etc. Strong understanding of data modeling (conceptual, logical, and physical models). Experience with programming languages such as Python, C#, or Java. In-depth knowledge of SQL and metadata management.
- Leadership: Proven experience leading or influencing cross-functional teams in data governance and architecture initiatives.
- Certifications (preferred): Azure Data Engineer Associate, Azure Solutions Architect Expert, or Azure Purview-related certifications.
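The posting calls for establishing DQ metrics and integrating DQ checks. As a hedged, minimal sketch of what such a rule-based check looks like (plain Python; in practice this role would use Purview, Anomalo, or similar tooling, and the rule names and fields here are invented for illustration):

```python
def run_dq_checks(rows, checks):
    """Evaluate simple data-quality rules over a batch of records and
    return one metric per rule: the fraction of rows passing it."""
    results = {}
    for name, predicate in checks.items():
        passed = sum(1 for row in rows if predicate(row))
        results[name] = passed / len(rows) if rows else 1.0
    return results

# illustrative batch with one null email and one negative amount
rows = [
    {"id": 1, "email": "a@x.com", "amount": 10.0},
    {"id": 2, "email": None,      "amount": -5.0},
    {"id": 3, "email": "c@x.com", "amount": 7.5},
]

checks = {
    "email_not_null": lambda r: r["email"] is not None,
    "amount_non_negative": lambda r: r["amount"] >= 0,
}

metrics = run_dq_checks(rows, checks)
# each rule passes on 2 of 3 rows, so both metrics are ~0.667
```

Metrics like these are what a DQ dashboard would trend over time, with thresholds triggering the incidents and escalations described elsewhere in these postings.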
Posted 2 weeks ago
8.0 - 13.0 years
40 - 45 Lacs
Hyderabad
Work from Office
Role Summary: The Lead Data Scientist, Foundational Analytics is pivotal in ensuring that core data assets support the growth of our business. The appointed Lead will be required to apply their experience in managing a range of analytical resources and functions to develop a centre of excellence for analytic capabilities, fostering a culture of innovation, collaboration and expertise. They will work collaboratively with key stakeholders across the business to understand and set appropriate analytical strategies that support business growth, and apply these strategies to their business planning, and in particular support business growth initiatives through allocation of appropriate resources, subject matter expertise and long range planning. They will act as experts in developing the way we manage our foundational data assets and drive innovative analytical solutions that create value. They will also work with Data Engineers and implementation teams on solution design and with business stakeholders on change management. 
Role & responsibilities:
- Develop relationships with key internal and external stakeholders across Quantium to understand and support the strategic goals and key initiatives of the company
- Develop and execute strategy and business plans for the ongoing development of our foundational data assets
- Manage Lead and Senior Analysts in other verticals to achieve business plan goals on an ongoing basis across a variety of teams
- Set and manage aligned KPIs across one analytics team focussed on a single data partner
- Develop a culture of innovation and analytic excellence by focussing on development of existing team members and hiring the best talent available
- Drive accountability within the teams to achieve deadlines and the best possible analytic outcomes
- Support growth of the business through implementation of capabilities that streamline the ingestion and preparation of new data assets
- Support development of technology solutions that focus on automation of solutions, operational excellence and innovation by working with implementation and technology teams on solution design

Key activities:
- Set and manage business plans for three or more foundational data assets including data curation, customer segmentation and other foundational analytics
- Engage with business and external stakeholders to gather feedback and requirements to drive the foundational analytics roadmap for key business initiatives
- Set up regular forums to engage stakeholders across the business that use these foundational data assets to receive and give feedback on current and future initiatives and ensure they are supported appropriately
- Manage the high-level prioritization process across teams including the use of WIPs, charter cards and business plans which are reviewed by the team on a regular basis, and manage stakeholders' expectations regarding timeframes
- Develop and implement resource plans to support business plans and roadmaps
- Be responsible for team recruitment, for both external and internal candidates
- Ensure tailored performance plans and KPIs are set, monitored and managed on a regular basis for all team members
- Provide coaching and support to direct reports to develop their management and analytical capabilities
- Monitor and approve technical designs that implement analytical outcomes and models
- Drive a culture of innovation by engaging with analysts across the team through a variety of forums including one-to-ones, skip-level meetings, lunchtime sessions and analytical showcases
- Foster a cohesive team spirit through team meetings, awards and strong communication
- Take a structured and analytical approach to troubleshooting data issues and problems
- Act as performance manager for junior team members, enabling the team to succeed by resolving issues and building on strengths

Preferred candidate profile:
- Strong background and understanding of data structures, data manipulation and transformation
- Previous experience in the fields of data science or high-level data analysis; data modelling experience highly regarded
- Previous experience working directly with Big Data Engineers highly regarded
- Previous experience with project management and people management
- Proven ability and experience in optimizing performance of an existing process with a can-do attitude
- Desire to work with a team of feedback-driven analysts and developers in a cross-functional and agile environment
- Lead-level data science/data analytics experience and management of one or more teams of analysts
- A variety of analytical roles preferred, including consulting

Skills Required:
- Sound knowledge of the technical analytics discipline, including data preparation, feature engineering and foundational analytics concepts, preferably including model development and model training
- Sufficient skills in several disciplines or techniques to autonomously apply them without the need for material guidance or revision by others
- Sufficient skills in at least one discipline or technique to be an authoritative source of guidance and support for the team, and to be able to solve problems of a high technical complexity in this domain
- Strong problem-solving skills, including the ability to apply a systematic problem-solving approach
- Solid interpersonal skills, as the role will need to work in a cross-functional team with both other analysts and specialists from non-analytics disciplines
- Very strong client and stakeholder management skills
- Ability to autonomously carry out work following analytics best practice as defined at Quantium for the relevant types of tools or techniques, suggesting improvements where appropriate
- Ability to motivate peers to follow analytics best practice as defined at Quantium, by positively presenting the benefits and importance of these ways of working
- Commercial acumen to understand business needs and the commercial impacts of different analytics solutions or approaches
- Attention to detail
- Drive for continuous improvement
Posted 2 weeks ago
4.0 - 5.0 years
2 - 5 Lacs
Hyderabad
Work from Office
About Beyond Key: We are a Microsoft Gold Partner and a Great Place to Work-certified company. "Happy Team Members, Happy Clients" is a principle we hold dear. We are an international IT consulting and software services firm committed to providing cutting-edge services and products that satisfy our clients' global needs. Our company was established in 2005, and since then we've expanded our team to include more than 350 talented, skilled software professionals. Our clients come from the United States, Canada, Europe, Australia, the Middle East, and India, and we create and design IT solutions for them. If you need any more details, you can find them at https://www.beyondkey.com/about.

Job Description: We are seeking a highly skilled and motivated Senior MarkIT EDM Data Engineer to join our team. The ideal candidate will be a self-starter who can work independently and collaborate effectively with both internal and external stakeholders. This role requires a deep understanding of data engineering principles, strong technical skills, and the ability to drive projects to successful completion.

Key Responsibilities:
- Design, develop, and maintain data solutions using MarkIT EDM.
- Collaborate with internal teams and external stakeholders to gather requirements and deliver data solutions.
- Ensure data quality, integrity, and security across all data processes.
- Optimize data workflows and processes for efficiency and scalability.
- Troubleshoot and resolve data-related issues in a timely manner.

Required Skills:
- Extensive experience with MarkIT EDM and data engineering.
- Proficiency in SQL, Control-M, and database management.
- Strong understanding of data modeling and ETL processes.
- Experience with cloud platforms such as AWS, Azure, or Google Cloud.
- Excellent problem-solving and analytical skills.
- Ability to work independently and manage multiple projects simultaneously.
- Strong communication skills to interact effectively with stakeholders.

Preferred Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- Experience with programming languages such as Python or Scala.
- Familiarity with Agile methodologies and project management tools.
Posted 2 weeks ago
1.0 - 2.0 years
1 - 5 Lacs
Gurugram
Work from Office
Job Description: The Financial Operations Associate will primarily support the Operations and Finance functions and will report to the Financial Operations Manager. This role will provide comprehensive finance administration and will work closely with the Finance team and Project Managers. The Financial Operations Associate will be responsible for preparing and organizing data, ensuring data quality, and assisting with monthly reconciliations within the company's project management and accounting systems.

Responsibilities

Project Data & Revenue Coordination
- Create and update projects in the Project Management system; organize and store project SOWs and Purchase Orders
- Support project management by assigning team members to active projects while ensuring accuracy of rates, start dates, and allocations based on the approved pricing guide
- Assist with the project lifecycle by creating project billing milestones and revenue contracts for T&M and Fixed Fee projects
- Track and follow up with Project Managers so that all signed agreements are received and appropriately stored
- Monitor project contract status and update the Project Management system status
- Lead data monitoring and maintenance of the Project Pipeline database
- Reconcile project budgets, timecards, and recognized revenue schedules
- Identify, research, and resolve issues regarding project discrepancies or updates; escalate as necessary
- Send weekly timecard reminders and monthly missing-hours reminders; monitor Consultant timecard submissions and follow up as necessary
- Prepare reports such as project financial reports, resource utilization reports, and capacity reports
- Support the monthly project revenue recognition process
- Update the Project Management system with project expenses

Finance Support
- Support the Finance team during audits by compiling and organizing audit data
- Provide operational and financial reporting support for Project Managers
- Financial process documentation; create and maintain standard operating procedures
- Project Management system troubleshooting and Q&A
- Onboard and offboard users and resources in the Project Management system

Qualifications
- Bachelor's degree
- 1-2 years of experience in FinancialForce or another PSA system preferred
- Strong knowledge of MS Office products, especially Excel
- Organized, detail-oriented, and competent follow-through skills
- Ability to prioritize and multi-task in a fast-paced environment while meeting deadlines
- Ability to execute activities within complex processes

Don't meet every job requirement? That's okay! Our company is dedicated to building a diverse, inclusive, and authentic workplace. If you're excited about this role, but your experience doesn't perfectly fit every qualification, we encourage you to apply anyway. You may be just the right person for this role or others.
Posted 2 weeks ago
2.0 - 4.0 years
4 - 9 Lacs
Mumbai
Work from Office
Assist in addressing and resolving data quality issues reported by users and data stakeholders, including anomalies, inconsistencies, and data quality concerns. Participate in the monitoring, tracking, and management of data quality incidents and requests, ensuring timely and effective responses. Assist in creating and maintaining comprehensive documentation of data quality issues, resolutions, and best practices for reference and training. Support the Data Quality Service Desk Lead in managing and escalating complex data quality issues to the relevant teams, ensuring appropriate actions are taken. Assist in tracking and reporting on data quality incidents and metrics to ensure transparency and accountability.

Skills: Excel, a basic understanding of SQL, and good communication and aptitude.

Qualifications: Graduate
Posted 2 weeks ago