2.0 - 7.0 years
2 - 6 Lacs
Mumbai
Work from Office
Salesforce System Admin Associate - Vera Solutions
Location: São Paulo
Salary: R$12,716 per month
Check out our values-driven benefits below! Join our innovative, global team using technology to transform the way the social sector engages with data. We run on passion, blending diverse skill sets and experiences with a collective mission to help organizations tackle challenges and achieve greater impact. We encourage all staff to push their own boundaries and drive the company in exciting new directions. As a social enterprise and certified B Corporation, we reinvest a majority of our profits in our growth and continued pursuit of our mission and vision.
Position Overview
The Salesforce System Admin Associate will be part of Vera's growing Systems team. The Admin Associate will be an active contributor to the design, development, and implementation of new features in Vera's internal Salesforce implementation and will work with managed package products. The Admin Associate should bring a desire to learn new technologies and processes and an eagerness to develop new skills and serve internal stakeholders. The Admin Associate should have a strong work ethic, be driven to think for themselves, work independently, and balance multiple tasks. The Admin Associate will join a team of passionate individuals with diverse backgrounds and experiences, all dedicated to improving the way social impact organizations operate. We're a self-motivated, creative group, and we emphasize collaboration, flexibility, and professionalism.
Primary Responsibilities:
Manage Vera's internal Salesforce system and other tools and make improvements and updates as necessary to maintain a highly effective system and seamless user experience. Participate in the planning and analysis of business requirements for system changes and enhancements. Manage Salesforce custom apps and packages. Write and deliver training material or other Salesforce system support documentation. Provide technical support to end users while acting as a service analyst, by investigating and troubleshooting issues and applying solutions. Perform user management on internal tools such as Salesforce, Google Suite, Slack, etc., including the creation and maintenance of roles, profiles, and hierarchies. Develop custom solutions on the Salesforce platform using declarative automation. Participate in creating and implementing QA testing and test suites by performing different types of testing. Collaborate with the team on release management tasks such as sandbox creation, deployment tasks, etc. Manage and improve data quality, system integrity, and security by performing regular audits and clean-up tasks. Clearly communicate technical changes and recommendations to users on a regular basis. Perform data migration tasks between different systems using ETL tools.
Qualifications and Experience
Essential: Minimum Bachelor's degree in Engineering, IT, or Computer Science. 2+ years of Salesforce experience.
Skills Required
Technical expertise: Possesses knowledge of Salesforce administration, data management, custom development, and reporting. Issue resolution: Troubleshoots system issues, identifies root causes, and implements solutions to maintain smooth operations. Collaboration: Works closely with internal teams like development and implementation to ensure stakeholder needs are met effectively.
Desirable
Interest in social service work or international development.
Relevant Salesforce certifications: Salesforce Administrator, Platform App Builder, Salesforce Certified Associate, Salesforce Business Analyst.
Vera Solutions offers competitive compensation (including benefits), commensurate with experience and cost of living.
Employee Incentive Plan
All employees participate in our Employee Incentive Plan. Employees receive stock options, aligned to their job title and location, as part of their total compensation package. To showcase our commitment to employee growth and well-being, Vera offers a paid sabbatical to employees who complete three consecutive years. We do not cap sick leave. If you're sick, we hope you get better!
Paid Time Off
48 days of paid leave each year, including local and company-wide holidays and a December break. Vera is passionate about supporting the health of you and your family. We cover medical insurance premiums, including family coverage, up to a maximum amount.
Complete an online application by clicking apply below. Vera Solutions is committed to fostering a diverse and inclusive environment and encourages applicants from underrepresented backgrounds to apply.
What makes Vera such a great place to work? In everything we do, we're guided by our core values: Excellence, Sustainability, Teamwork, Leadership, and Impact.
Our Mission
As a certified B Corporation, we meet the highest social and environmental standards in pursuit of our mission.
Our Global Team
Our wonderfully diverse team spans 5 continents and speaks more than 15 different languages. We are invested in and committed to making training, learning, and development one of Vera's signature strengths.
Diversity & Inclusion
We value our culture of diversity and inclusion in the workplace. We bring our full selves to work and position each other to thrive.
Rising Leaders Program
We run an annual leadership development initiative for emerging leaders at Vera.
Posted 1 week ago
5.0 - 10.0 years
3 - 7 Lacs
Mumbai
Work from Office
SUMMARY
We are seeking a field-oriented Block Coordinator to join our team. This role offers an opportunity to contribute to a high-impact evaluation study focused on improving developmental outcomes in rural regions. As a Block Coordinator, you will be responsible for monitoring and supervising field activities across three blocks in Nandurbar district. You will work closely with field staff, ensure smooth implementation of data collection processes, and coordinate with local stakeholders. This role demands strong organizational skills, attention to detail, and the ability to manage field operations effectively in a dynamic environment.
Location - Nandurbar District, Maharashtra
ABOUT US - https://www.wadhwaniai.org/
Wadhwani AI is a nonprofit institute building and deploying applied AI solutions to solve critical issues in public health, agriculture, education, and urban development in underserved communities in the global south. We collaborate with governments, social sector organizations, academic and research institutions, and domain experts to identify real-world problems and develop practical AI solutions to tackle these issues with the aim of making a substantial positive impact. We have over 30 AI projects supported by leading philanthropies such as the Bill & Melinda Gates Foundation, UNICEF, and Google.org. With a team of over 200 professionals, our expertise encompasses AI/ML research and innovation, software engineering, domain knowledge, design, and user research.
In the Press: Our Founder Donors are among the Top 100 AI Influencers. G20 India's Presidency: AI Healthcare, Agriculture, & Education Solutions Showcased Globally. Unlocking the potentials of AI in Public Health. Wadhwani AI Takes an Impact-First Approach to Applying Artificial Intelligence - data.org. Winner of the H&M Foundation Global Change Award 2022. Indian Winners of the 2019 Google AI Impact Challenge, and the first in the Asia Pacific to host Google Fellows. Culture page of Wadhwani AI - https://www.wadhwaniai.org/culture/
ROLES AND RESPONSIBILITIES
Liaise with block-level officials to obtain necessary approvals and ensure smooth execution of data collection activities. Support the selection of study sites through field-based situational analysis. Conduct monthly supervision visits to Anganwadi Centers (AWCs) across the blocks. Maintain strict adherence to data confidentiality and privacy protocols. Be willing to travel extensively within the district as required. Document field insights and share regular reports. Ensure data quality, ethical standards, and timely submission of field updates. Plan field schedules efficiently to maximize AWC coverage. Handle local logistics, including scheduling interviews and managing field travel. Undertake additional responsibilities as assigned by the supervisor from time to time.
REQUIREMENTS
Education: Bachelor's degree in Statistics, Economics, Epidemiology, or Development Studies. Relevant data collection experience of at least 5 years is preferred. Skilled in collecting quantitative and qualitative data. Should have strong contextual understanding and familiarity with the local community. Strong communication skills in Marathi and Hindi are mandatory. Ability to work independently, travel extensively, and manage time well.
Proficient computer skills, including Word and Excel. We are committed to promoting diversity and the principle of equal employment opportunity for all our employees and encourage qualified candidates to apply irrespective of religion or belief, ethnic or social background, gender, gender identity, and disability.
Posted 1 week ago
5.0 - 10.0 years
6 - 10 Lacs
Bengaluru
Work from Office
We are excited to invite applicants to join a diverse team from different regions sitting in the Third-Party Management Claims team. We offer a flexible working environment where curious and adaptable people thrive. We are flexible on the location of the right candidate.
About the Role
As a direct report to the Head of Third Party Management, you will closely collaborate primarily with Corporate Solutions Claims and Property & Casualty Reinsurance (P&C Re) and Global Clients & Solutions (GC&S) teams, focusing on claims service providers. The aim of the role is to drive an efficient and effective claims external service provider management program, collaborating with the Group's global claims departments, that delivers differentiated services and propositions in clearly defined customer segments.
Leadership of Third Party Management Claims Pillar
Responsible for the Swiss Re Group-wide Claims Vendor Strategy, working closely with Claims Leadership across all business divisions. In conjunction with Compliance and Risk, develop the TPM Claims vendor risk framework. In conjunction with Legal, develop and implement the Global Contracting Strategy for all key claims vendor types. Responsibility for the Group Claims e-billing strategy, including the ownership and ongoing management of e-billing providers. Negotiate contracts, terms & conditions, and rates for panel vendors as determined by the CLE contracting strategy. Drive the data quality improvement initiative as well as the reporting metrics in place to better steer the VM spend portfolio. Provide ad-hoc support to Claims in relation to non-panel preferred vendors. Build and maintain relationships with strategic vendors. Work closely with Third Party Management on the implementation of Delegated Authority vendors.
About the Team
The Third Party Management Claims team leverages value and buying power across the Swiss Re Group by supporting business units with the onboarding of claims outside counsel (legal firms and other claims service providers). The team works with its peer team, Third Party Management, to ensure outsourcing arrangements for External Claims Service Providers are appropriately governed (identified, triaged, assessed, including appropriate due diligence, contract wordings, and ongoing support with governance and oversight). The Third Party Management Claims team is part of the Global Business Solutions (GBS) division, focusing on a strong partnership with Business Units and Group Functions.
About You
We are seeking a self-aware, strong communicator, able to challenge current structures and processes. An ideal candidate can establish trust through open, clear communication and empower others through effective knowledge sharing. Minimum 5 years of Claims Vendor Management experience required. Minimum 3 years within the insurance industry, familiar with risk and compliance topics (nice to have). Solid knowledge of international claims vendor requirements and regulations. Strong social skills, including excellent command of English. Nice to have: leadership and people management experience. University degree or equivalent preferred, but not required with targeted work experience. MCIPS qualifications desired. If you are a committed, collaborative colleague willing to help, we will be happy to get to know you!
About Swiss Re
Swiss Re is one of the world's leading providers of reinsurance, insurance and other forms of insurance-based risk transfer, working to make the world more resilient. We anticipate and manage a wide variety of risks, from natural catastrophes and climate change to cybercrime.
We cover both Property & Casualty and Life & Health. Combining experience with creative thinking and cutting-edge expertise, we create new opportunities and solutions for our clients. This is possible thanks to the collaboration of more than 14,000 employees across the world. If you are an experienced professional returning to the workforce after a career break, we encourage you to apply for open positions that match your skills and experience. Reference Code: 134600
Posted 1 week ago
3.0 - 8.0 years
5 - 9 Lacs
Mumbai
Work from Office
About The Role
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: ServiceNow Governance, Risk, and Compliance (GRC)
Good to have skills: ServiceNow IT Service Management
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for ensuring the successful delivery of projects and collaborating with cross-functional teams to achieve project goals. Your creativity and expertise will contribute to the development of innovative solutions and drive the success of our applications.
Roles & Responsibilities:
- Expected to be an SME, collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead the effort to design, build, and configure applications.
- Act as the primary point of contact for project-related activities.
- Ensure the successful delivery of projects.
- Collaborate with cross-functional teams to achieve project goals.
Professional & Technical Skills:
- Must To Have Skills: Proficiency in ServiceNow Governance, Risk, and Compliance (GRC).
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.
Additional Information:
- The candidate should have a minimum of 3 years of experience in ServiceNow Governance, Risk, and Compliance (GRC).
- This position is based in Mumbai with location flexibility for cross-location resources.
- A 15 years full-time education is required.
Qualification: 15 years full time education
Posted 1 week ago
7.0 - 12.0 years
13 - 17 Lacs
Mumbai
Work from Office
The Data Architect supports the work of ensuring that systems are designed, upgraded, managed, decommissioned, and archived in compliance with data policy across the full data life cycle. This includes complying with the data strategy, undertaking the design of data models, and supporting the management of metadata. The Data Architect's mission will integrate a focus on GDPR, with contributions to the privacy impact assessment and the Record of Processing Activities relating to personal data. The scope is CIB EMEA and CIB ASIA.
Responsibilities
Direct Responsibilities
Engage with key business stakeholders to assist with establishing fundamental data governance processes. Define key data quality metrics and indicators and facilitate the development and implementation of supporting standards. Help to identify and deploy enterprise data best practices such as data scoping, metadata standardization, data lineage, data deduplication, mapping and transformation, and business validation. Structures the information in the Information System (using any data modelling tool like Abacus), i.e. the way information is grouped, as well as the navigation methods and the terminology used within the Information Systems of the entity, as defined by the lead data architects. Creates and manages data models (business flows of personal data with the processes involved) in all their forms, including conceptual models, functional database designs, message models and others, in compliance with the data framework policy. Allows people to step logically through the Information System (and is able to train them to use tools like Abacus). Contribute to and enrich the Data Architecture framework through the material collected during analysis, projects and IT validations. Update all records in Abacus collected from stakeholder interviews/meetings.
Skill Area Expected
Communicating between the technical and the non-technical: Is able to communicate effectively across organisational, technical and political boundaries, understanding the context. Makes complex and technical information and language simple and accessible for non-technical audiences. Is able to advocate and communicate what a team does to create trust and authenticity, and can respond to challenge. Able to effectively translate and accurately communicate across technical and non-technical stakeholders, as well as facilitating discussions within a multidisciplinary team with potentially difficult dynamics.
Data Modelling (business flows of data in Abacus): Produces data models and understands where to use different types of data models. Understands different tools and is able to compare different data models. Able to reverse engineer a data model from a live system. Understands industry-recognized data modelling patterns and standards. Understands the concepts and principles of data modelling and is able to produce, maintain and update relevant data models for specific business needs.
Data Standards (rules defined to manage/maintain data): Develops and sets data standards for an organisation. Communicates the business benefit of data standards, championing and governing those standards across the organisation. Develops data standards for a specific component. Analyses where data standards have been applied or breached and undertakes an impact analysis of that breach.
Metadata Management: Understands a variety of metadata management tools. Designs and maintains the appropriate metadata repositories to enable the organization to understand their data assets. Works with metadata repositories to complete and maintain them, ensuring information remains accurate and up to date. The objective is to manage one's own learning and contribute to domain knowledge building.
Turning business problems into data design: Works with business and technology stakeholders to translate business problems into data designs. Creates optimal designs through iterative processes, aligning user needs with organisational objectives and system requirements. Designs data architecture by dealing with specific business problems and aligning it to enterprise-wide standards and principles. Works within the context of a well understood architecture and identifies appropriate patterns.
Contributing Responsibilities
It is expected that the Data Architect applies knowledge and experience of the capability, including tools and techniques, and adopts those that are most appropriate for the environment. The Data Architect needs to have knowledge of: the Functional & Application Architecture, Enterprise Architecture, and architecture rules and principles; the activities of Global Markets and/or Global Banking; market meta-models, taxonomies and ontologies (such as FpML, CDM, ISO 20022).
Skill Area Expected
Data Communication: Uses the most appropriate medium to visualise data to tell compelling and actionable stories relevant for business goals. Presents, communicates and disseminates data appropriately and with high impact. Able to create basic visuals and presentations.
Data Governance: Understands data governance and how it works in relation to other organisational governance structures. Participates in or delivers the assurance of a service. Understands what data governance is required and contributes to it.
Data Innovation: Recognises and exploits business opportunities to ensure more efficient and effective performance of organisations. Explores new ways of conducting business and organisational processes. Aware of opportunities for innovation with new tools and uses of data.
Technical & Behavioral Competencies
1. Able to effectively translate and accurately communicate across technical and non-technical stakeholders, as well as facilitating discussions within a multidisciplinary team with potentially difficult dynamics.
2. Able to create basic visuals and presentations.
3. Experience in working with enterprise tools (like Abacus, Informatica, big data, Collibra, etc.).
4. Experience in working with BI tools (like Power BI).
5. Good understanding of Excel (formulas and functions).
Specific Qualifications (if required)
Preferred: BE/BTech, BSc-IT, BSc-Comp, MSc-IT, MSc-Comp, MCA
Skills Referential
Behavioural Skills: Communication skills - oral & written; Ability to collaborate / Teamwork; Ability to deliver / Results driven; Creativity & Innovation / Problem solving.
Transversal Skills: Analytical Ability; Ability to understand, explain and support change; Ability to develop and adapt a process; Ability to anticipate business / strategic evolution.
Education Level: Bachelor Degree or equivalent
Experience Level: At least 7 years
Other/Specific Qualifications (if required)
1. Experience in GDPR (General Data Protection Regulation) or in Privacy by Design would be preferred.
2. DAMA certified.
Posted 1 week ago
7.0 - 12.0 years
6 - 10 Lacs
Mumbai
Work from Office
The Client Clearing Group manages collateral for all its clients that clear trades through centralized clearing houses. The group generates and sends margin calls to its clients on behalf of the clearing houses in which they hold accounts. The group is responsible for ensuring that BNP is covered in terms of credit risk by issuing intraday calls as per notifications from clearing houses and prefunding client accounts as per market fluctuations. The group also reconciles start-of-day balances with the clearing houses and accordingly transfers funds to ensure that there is no impact on clients' trading activities throughout the day. Apart from call and bond bookings, the group also looks after start-of-day activities, which involve sending Margin Summary Notices to clients. The process requires individually monitoring and resolving breaks that occur on each component of the Margin Summary Notice and ensuring that the Margin Call notices generated are in line with the balances published by all the clearing houses that we deal with. This is an extremely time-sensitive activity and requires liaising with IT Support, Data Storage, Legal, Middle Office and Clearing Houses.
Responsibilities
Direct Responsibilities
Performing data quality analysis; monitoring batches that generate cash flows, interest, bond prices, and other components of a margin call. Taking ownership of investigating disputed calls and liaising with the client, MO and Legal on various aspects of disputed margin calls. Review and resolve failed/missed client payments. Review and resolve all bond fails by liaising with the client, RM and Settlement Team. Being a first point of contact for escalations. Ensure interest is booked within the month and that there are no instances of carry-forward interest. Investigating interest discrepancies. Representing Operations on IT development calls. Smooth transition of BAU during IT enhancements and IT releases. Building operational solutions for new market requirements.
Contributing Responsibilities
Contributing to automation calls and meetings by actively participating. Testing new functionalities. Come up with tactical solutions to ensure calls are issued on time. Review and understand business documents raised for automations. Generate monthly client KPIs and publish them to stakeholders. Participate in stakeholder calls and present team performance. Identify open risk items in the process and look for solutions to overcome them. Create and maintain training plans for new joiners.
Technical & Behavioral Competencies
Strong communication skills, required for effective liaison with counterparties and internal stakeholders (Middle Office, marketers, traders, Risk, clearing houses, Compliance and Legal). Knowledge of derivative products (Credit, Interest Rate, FX, Equities) and the trade life cycle. Experience in clearing processes is favorable. Basic Excel skills required. Advanced presentation skills required. Basic Power BI skills.
Specific Qualifications (if required)
Skills Referential
Behavioural Skills: Attention to detail / rigor; Organizational skills; Personal Impact / Ability to influence; Critical thinking.
Transversal Skills: Ability to manage a project; Analytical Ability; Ability to develop and leverage networks; Ability to develop others & improve their skills; Ability to inspire others & generate people's commitment.
Education Level: Bachelor Degree or equivalent
Experience Level: At least 7 years
Posted 1 week ago
5.0 - 10.0 years
9 - 14 Lacs
Mumbai
Work from Office
This position reports to the Data/Process Governance Manager. The candidate will contribute to various Data/Process Governance initiatives, which will involve interactions with stakeholders across designations. The candidate will also play a pivotal role in facilitating Committee and other Data Governance Working Group meetings, and in testing the team's periodic controls. The candidate will also be involved in various Data/Process Governance initiatives such as creating and amending documents (Policies, Standards, Procedures, etc.). The role demands excellent communication skills to communicate and deliver on cross-border requirements.
Responsibilities
Direct Responsibilities
The candidate shall work closely with the Data/Process Governance and Data Management teams. Should have good hands-on knowledge of MS PowerPoint to support preparing decks/presentations for various data governance committee meetings and working groups. Prepare minutes of the meetings and circulate them. Coordinate various Data/Process Governance related initiatives with data stewards and other stakeholders. Independently manage interactions/requirements coming from the Onshore Data Governance Manager and other Leads in the Data Office team. Perform periodic governance controls testing activities. Create/amend various documents (Policies, Standards and Procedures); this requires excellent interpretation skills to understand complex documents. The candidate shall support policy administration, the documents renewal process, other admin activities related to Data Management, and data analysis.
Contributing Responsibilities
Own, contribute to, and conclude ad-hoc projects.
Technical & Behavioral Competencies
Good organizational and interpersonal skills. Attention to detail and the ability to work with a distributed multinational and multicultural team. Displays a sense of curiosity, enthusiasm, and eagerness to understand business constraints, the environment, and the impact of regulation on the financial industry. Initiative, autonomy, self-motivated, self-starter. Extremely well organized and able to ensure adherence to a strict process. Flexible in a dynamic and evolving environment. Develops open, considerate, and effective relationships with stakeholders. Ability to work under pressure and creatively address the various topics in hand.
Specific Qualifications (if required)
Bachelor Degree. 5+ years of work experience in Process Governance / Data Governance / Data Management / Internal Audits. Proficient in MS Office (Word/Excel/PowerPoint). Excellent communication skills, verbal and written. MBA holders can also apply.
Skills Referential
Behavioural Skills: Ability to collaborate / Teamwork; Ability to deliver / Results driven; Ability to share / pass on knowledge; Communication skills - oral & written.
Transversal Skills: Ability to understand, explain and support change; Analytical Ability; Ability to manage / facilitate a meeting, seminar, committee, training; Ability to develop and adapt a process.
Education Level: Bachelor Degree or equivalent
Posted 1 week ago
7.0 - 12.0 years
12 - 17 Lacs
Mumbai
Work from Office
The purpose of this position is to conduct data risk assessments, find issues in the data used for US Federal Reserve regulatory reporting, set up detective rules to catch exceptions, and set up controls in close coordination with business stakeholders. The person joining will assist business and regulatory reporting teams to raise issues that affect them, conduct root cause analysis, and take the issues to effective resolution owners to get them closed. The position also helps in identifying areas of continuous improvement and improving them, thus achieving higher efficiency, and supports audit queries from audit teams from time to time.
Direct Responsibilities
Data Risk Assessment: Conduct data risk assessments on data that is used, linked, and moves across hops, and the risks faced.
Data Analysis: Analyze large volumes of data related to investment banking domains, i.e. capital markets, corporate banking, liquidity, credit risk, market risk, client referential and securities referential, and identify issues in them via data quality checks; conduct quality checks on a periodical basis as required, present results to stakeholders for corrective action, and take them to closure. Prepare functional reports and assist in automating the process using tools as required. Undertake deep dives into issues affecting the business and come up with remediation actions. Understand how the data moves across the bank and set up data control checks based on risks. Assess the exceptions generated and share them with Data Stewards. Understand any of the US Fed reports FFIEC 002, FR Y-15, and FFIEC 009, and understand how they are filed from an accounting basis. Understand the gaps in coverage and check for the setting up of control checks. Write medium to complex SQL queries to extract data from the database for data analysis. Be able to connect the dots. Explain the functional gaps to the techno-functional team so that they can incorporate those checks in their systems. Develop test cases aimed at validating development for a given functional requirement and record them in the official test management platform. Have a continuous improvement mindset.
Data Controls: Maintain an overview and inventory of controls. Conduct a review of the controls in a periodic manner with Data Stewards and business, APS and IT stakeholders. Conduct a certification assessment of the controls on a quarterly cycle.
Prefiling Analysis: Conduct a prefiling analysis to understand whether the transactions are clubbed properly and reported in the correct MDRM in specific Fed reports.
Issue Management and Remediation: Keep a list of the issues and help conduct root cause analysis. Conduct root cause analysis on the issues identified and communicated, in close collaboration with data stakeholders.
Project Work: Work on creating requirements for automating processes. Test the solution as per the requirements.
Audit Support: Help support audit requirements and responses under stringent timelines.
Contributing Responsibilities
- Partner with data domain owners to develop new functional logic in existing SQL queries, which will enhance and support the goal of quality data being used for reporting purposes.
- Produce and distribute daily/weekly team status reports and contribute to project status reporting depending on the undertaking at hand.
- Understand the competitive landscape and adopt technology-based solutions to drive efficiency.
- Create SOPs, perform knowledge transfer, and cross-train for proper backup in the team.
Technical & Behavioral Competencies
- Able to write moderate-level SQL queries in Oracle without any assistance.
- Understanding of various data models for extraction of the required data sets to be consumed for root cause analysis.
- Experience in one or more data quality and reporting tools/applications: Power BI, Collibra.
- Proficient in MS Office applications: MS Excel, MS Word, MS PowerPoint.
Specific Qualifications (if required)
M.Com/CA/CFA along with 7 years of experience covering all of the below:
- Data quality area
- Data controls attestation expertise
- Should have supported US Fed reports
- Ability to perform root cause analysis
- Business Analyst working in a financial institution who has completed multiple SDLC lifecycles
Skills Referential
Behavioural Skills: Ability to collaborate / Teamwork; Attention to detail / rigor; Ability to deliver / Results driven; Active listening.
Transversal Skills: Ability to develop and adapt a process; Ability to understand, explain and support change; Analytical Ability; Ability to set up relevant performance indicators; Ability to develop and leverage networks.
Education Level: Master Degree or equivalent
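As a rough illustration of the SQL-based detective controls this posting describes, the sketch below runs a simple exception query against an Oracle database and returns breaching rows for review. It is a minimal, hypothetical example: the table and column names (trade_positions, counterparty_id, notional) and the connection details are illustrative assumptions, not the bank's actual schema or process.

```python
# Minimal sketch of a detective data-quality control, assuming a hypothetical
# Oracle table TRADE_POSITIONS with columns TRADE_ID, COUNTERPARTY_ID, NOTIONAL.
# Uses the python-oracledb driver; connection details below are placeholders.
import oracledb

EXCEPTION_QUERY = """
    SELECT trade_id, counterparty_id, notional
    FROM trade_positions
    WHERE counterparty_id IS NULL
       OR notional < 0
"""

def run_quality_check(dsn: str, user: str, password: str) -> list[tuple]:
    """Return rows breaching the control so they can be routed to data stewards."""
    with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
        with conn.cursor() as cur:
            cur.execute(EXCEPTION_QUERY)
            return cur.fetchall()

if __name__ == "__main__":
    # Hypothetical connection parameters for demonstration only.
    exceptions = run_quality_check("db-host/service", "report_user", "secret")
    print(f"{len(exceptions)} exception(s) found for review")
```

In practice such a check would typically be scheduled, and its exceptions logged against the control inventory mentioned above rather than printed.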
Posted 1 week ago
5.0 - 10.0 years
10 - 14 Lacs
Bengaluru
Work from Office
About The Role
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: SAP FSCM Credit Management
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for managing the team and ensuring successful project delivery. Your typical day will involve collaborating with multiple teams, making key decisions, and providing solutions to problems for your immediate team and across multiple teams.
Roles & Responsibilities:
- Expected to be an SME
- Collaborate and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute on key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead the effort to design, build, and configure applications
- Act as the primary point of contact
- Manage the team and ensure successful project delivery
Professional & Technical Skills:
- Must To Have Skills: Proficiency in SAP FSCM Credit Management
- Experience with using the Multi-Bank Connectivity (MBC) product for customer payments
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity
Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP FSCM Credit Management
- This position is based at our Bengaluru office
- A 15 years full-time education is required
Qualification: 15 years full time education
Posted 1 week ago
7.0 - 12.0 years
22 - 30 Lacs
Mumbai
Hybrid
Type of Candidate They Want: Strong experience with data governance tools (especially Informatica). Experience building data policies and quality frameworks. Knows privacy laws and regulatory standards. Has worked with cloud platforms like AWS, Azure, or GCP. Can manage cross-functional teams, conduct meetings, and influence business leaders.
Posted 1 week ago
3.0 - 8.0 years
1 - 5 Lacs
Mumbai
Work from Office
About The Role
Project Role: Application Tech Support Practitioner
Project Role Description: Act as the ongoing interface between the client and the system or application. Dedicated to quality, using exceptional communication skills to keep our world-class systems running. Can accurately define a client issue and can interpret and design a resolution based on deep product knowledge.
Must have skills: Wireless Technologies Operations
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As an Application Tech Support Practitioner, you will act as the ongoing interface between the client and the system or application. You will be dedicated to quality, using exceptional communication skills to keep our world-class systems running. With your deep product knowledge, you will accurately define client issues and design resolutions. Your typical day will involve providing ongoing support to clients, troubleshooting technical issues, and ensuring smooth system operations.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Provide ongoing support to clients, addressing their technical issues and concerns.
- Troubleshoot system or application problems and provide timely resolutions.
- Collaborate with cross-functional teams to ensure smooth system operations.
- Stay updated with the latest product knowledge and industry trends.
- Identify areas for process improvement and suggest innovative solutions.
Professional & Technical Skills:
- Must To Have Skills: Proficiency in Wireless Technologies Operations.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Wireless Technologies Operations.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.
Qualification: 15 years full time education
Posted 1 week ago
15.0 - 20.0 years
4 - 8 Lacs
Pune
Work from Office
About The Role
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Snowflake Data Warehouse, PySpark, Core Banking
Good to have skills: AWS BigData
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data processing workflow. Your role will be pivotal in enhancing the efficiency and reliability of data operations within the organization.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge in data engineering.
- Continuously evaluate and improve data processing workflows to optimize performance.
Professional & Technical Skills:
- Must To Have Skills: Proficiency in Snowflake Data Warehouse, Core Banking, PySpark.
- Good To Have Skills: Experience with AWS BigData.
- Strong understanding of data modeling and database design principles.
- Experience with data integration tools and ETL processes.
- Familiarity with cloud-based data solutions and architectures.
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Snowflake Data Warehouse.
- This position is based in Pune.
- A 15 years full time education is required.
Qualification: 15 years full time education
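To illustrate the kind of PySpark-to-Snowflake ETL step this posting describes, here is a minimal sketch. It assumes the Snowflake Spark connector is available on the cluster; the source path, table name, and connection options are hypothetical placeholders, not details of any actual project.

```python
# Minimal PySpark sketch of an ETL step loading cleaned records into Snowflake.
# All paths, table names, and connection options below are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("accounts-etl").getOrCreate()

# Extract: read raw account records from a hypothetical landing zone.
raw = spark.read.parquet("s3://landing-zone/core-banking/accounts/")

# Transform: basic cleansing to protect downstream data quality.
clean = (
    raw.dropDuplicates(["account_id"])
       .filter(F.col("account_id").isNotNull())
       .withColumn("balance", F.col("balance").cast("decimal(18,2)"))
)

# Load: write to Snowflake via the Spark connector (requires the connector
# package on the classpath; options below are placeholders, not real credentials).
sf_options = {
    "sfURL": "example.snowflakecomputing.com",
    "sfUser": "etl_user",
    "sfPassword": "secret",
    "sfDatabase": "ANALYTICS",
    "sfSchema": "CORE",
    "sfWarehouse": "ETL_WH",
}
(clean.write
      .format("net.snowflake.spark.snowflake")
      .options(**sf_options)
      .option("dbtable", "ACCOUNTS")
      .mode("append")
      .save())
```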
Posted 1 week ago
15.0 - 20.0 years
10 - 14 Lacs
Bengaluru
Work from Office
About The Role
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Stibo Product Master Data Management
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application development aligns with business objectives, overseeing project timelines, and facilitating communication among stakeholders to drive successful project outcomes. You will also engage in problem-solving activities, providing guidance and support to your team while ensuring that best practices are followed throughout the development process. Your role will be pivotal in shaping the direction of application projects and ensuring that they meet the needs of the organization.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate training and development opportunities for team members to enhance their skills.
- Monitor project progress and implement necessary adjustments to ensure timely delivery.
Professional & Technical Skills:
- Must To Have Skills: Proficiency in Stibo Product Master Data Management.
- Strong understanding of data governance principles and practices.
- Experience with application design and architecture.
- Ability to analyze complex data sets and derive actionable insights.
- Familiarity with project management methodologies and tools.
Additional Information:
- The candidate should have a minimum of 5 years of experience in Stibo Product Master Data Management.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
Qualification: 15 years full time education
Posted 1 week ago
7.0 - 12.0 years
5 - 9 Lacs
Bengaluru
Work from Office
About The Role
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Collibra Data Governance
Good to have skills: Machine Learning
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with teams to develop innovative solutions and ensure seamless application functionality.
Roles & Responsibilities:
- Expected to be an SME
- Collaborate and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute on key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead the implementation of new technologies for enhanced application performance
- Conduct regular code reviews and provide constructive feedback to team members
- Stay updated on industry trends and best practices to drive continuous improvement
Professional & Technical Skills:
- Must To Have Skills: Proficiency in Collibra Data Governance
- Good To Have Skills: Experience with Machine Learning
- Strong understanding of data governance principles and best practices
- Hands-on experience in implementing data governance solutions
- Ability to analyze complex data sets and derive actionable insights
- Excellent communication and interpersonal skills for effective collaboration
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Collibra Data Governance
- This position is based at our Mumbai office
- A 15 years full-time education is required
Qualification: 15 years full time education
Posted 1 week ago
5.0 - 10.0 years
4 - 8 Lacs
Bengaluru
Work from Office
About The Role
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Alteryx
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Data Engineer, you will design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL processes to migrate and deploy data across systems. Be involved in the end-to-end data management process.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead data solution architecture design.
- Implement data governance policies and procedures.
- Optimize data storage and retrieval processes.
Professional & Technical Skills:
- Must To Have Skills: Proficiency in Alteryx.
- Strong understanding of data modeling and database design.
- Experience with data integration tools and techniques.
- Knowledge of data warehousing concepts.
- Hands-on experience with SQL and scripting languages.
Additional Information:
- The candidate should have a minimum of 5 years of experience in Alteryx.
- This position is based at our Mumbai office.
- A 15 years full-time education is required.
Qualification: 15 years full time education
Posted 1 week ago
5.0 - 8.0 years
9 - 13 Lacs
Bengaluru
Work from Office
About The Role
Skill required: Property & Casualty - Catastrophe Risk Management
Designation: Analytics and Modeling Senior Analyst
Qualifications: Any Graduation/12th/PUC/HSC
Years of Experience: 5 to 8 years
About Accenture
Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com
What would you do
Ability to perform quarterly CAT risk aggregation and deliver insightful and actionable reporting to internal stakeholders. Understand complex treaty structures and apply them to portfolio risks. Provide stakeholders with explanations of portfolio results and trending over time, model change impacts, and data quality and limitations. Support ad-hoc reporting and corporate and unit-level reinsurance placements, and deliver quarterly CAT data to external partners. Apply CAT modeling best practices by continually understanding the evolving process within the Client and build process efficiencies. Coach and develop members within the team. Maintain good documentation of results, issues, data quality, etc. and have regular dialogue with each line of business to provide feedback. Claims settlements related to any client property they own or any accidents. Catastrophe Risk Management refers to the process of guiding insurers on how to manage risk aggregations, deploy capital, and price insurance coverage by using computer-assisted calculations to estimate the losses that could be sustained due to a catastrophic event such as a hurricane or earthquake.
What are we looking for
Minimum 5 years of catastrophe modelling experience. Strong analytical, problem-solving approach; strong knowledge of various reinsurance structures is a must. Experience in using RMS applications and/or AIR CAT model(s) is a must. Clear communicator, able to structure an argument and explain complex issues in a simple way; fast learning capabilities. Advanced knowledge of MS Office applications (Outlook, Word, Excel & PowerPoint). Willingness to work and handle multitasking in a rapidly changing environment with some firm deadlines. Should have strong SQL skills and understand the database schema; additional programming (R/VBA) is an advantage. Ability to work independently.
Roles and Responsibilities:
In this role you are required to analyze and solve increasingly complex problems. Your day-to-day interactions are with peers within Accenture. You are likely to have some interaction with clients and/or Accenture management. You will be given minimal instruction on daily work/tasks and a moderate level of instruction on new assignments. Decisions that are made by you impact your own work and may impact the work of others. In this role you would be an individual contributor and/or oversee a small work effort and/or team. Please note that this role may require you to work in rotational shifts.
Qualification: Any Graduation, 12th/PUC/HSC
Posted 1 week ago
5.0 - 10.0 years
10 - 14 Lacs
Mumbai
Work from Office
About The Role
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Spring Boot, Japanese Language
Good to have skills: Spring Application Framework
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for managing the team and ensuring successful project delivery. Your typical day will involve collaborating with multiple teams, making key decisions, and providing solutions to problems for your immediate team and across multiple teams.
Roles & Responsibilities:
- Expected to be an SME
- Collaborate and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute on key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead the effort to design, build, and configure applications
- Act as the primary point of contact
- Manage the team and ensure successful project delivery
Professional & Technical Skills:
- Must To Have Skills: Proficiency in Spring Boot and Japanese
- Good To Have Skills: Experience with Spring Application Framework
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity
Additional Information:
- The candidate should have a minimum of 5 years of experience in Spring Boot
- This position is based in Mumbai with location flexibility for cross-location resources
- A 15 years full-time education is required
Qualification: 15 years full time education
Posted 1 week ago
5.0 - 10.0 years
5 - 9 Lacs
Hyderabad
Work from Office
About The Role
Project Role: Data Governance Practitioner
Project Role Description: Establish and enforce data governance policies to ensure the accuracy, integrity, and security of organizational data. Collaborate with key stakeholders to define data standards; facilitate effective data collection, storage, access, and usage; and drive data stewardship initiatives for comprehensive and effective data governance.
Must have skills: Collibra Data Governance
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Data Governance Practitioner, you will establish and enforce data governance policies to ensure the accuracy, integrity, and security of organizational data. Collaborate with key stakeholders to define data standards; facilitate effective data collection, storage, access, and usage; and drive data stewardship initiatives for comprehensive and effective data governance.
Roles & Responsibilities:
- Expected to be an SME
- Collaborate and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute on key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead data governance initiatives within the organization
- Develop and implement data governance policies and procedures
- Ensure compliance with data governance regulations and standards
Professional & Technical Skills:
- Must To Have Skills: Proficiency in Collibra Data Governance
- Strong understanding of data governance principles
- Experience in implementing data governance frameworks
- Knowledge of data quality management practices
- Familiarity with data privacy and security regulations
Additional Information:
- The candidate should have a minimum of 5 years of experience in Collibra Data Governance.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.
Qualification: 15 years full time education
Posted 1 week ago
5.0 - 6.0 years
25 - 30 Lacs
Chennai
Work from Office
Description
Ciklum is looking for a Senior Data Engineer to join our team full-time in India. We are a custom product engineering company that supports both multinational organizations and scaling startups to solve their most complex business challenges. With a global team of over 4,000 highly skilled developers, consultants, analysts and product owners, we engineer technology that redefines industries and shapes the way people live.
About the role: As a Senior Data Engineer, you will become part of a cross-functional development team engineering the experiences of tomorrow.
Responsibilities
Develop and optimize robust, scalable data pipelines using PySpark and Python. Clean, transform, and enrich large-scale datasets from structured and unstructured sources. Implement data ingestion, ETL/ELT workflows, and integration strategies across cloud and on-prem platforms. Collaborate with data scientists, analysts, and business stakeholders to understand data requirements. Ensure data quality, integrity, and lineage throughout the data lifecycle. Participate in performance tuning, troubleshooting, and production support. Contribute to best practices in data engineering, including code versioning, testing, and CI/CD.
Requirements
Bachelor's degree in Computer Science, Data Engineering, or a related field. 3+ years of experience in data engineering with a focus on PySpark and Python. Strong hands-on experience with distributed data processing frameworks (e.g., Apache Spark). Solid understanding of SQL, data modeling, and relational databases. Experience working with cloud platforms (e.g., AWS, Azure, GCP). Familiarity with workflow orchestration tools (e.g., Airflow, Azure Data Factory).
Desirable
Java experience for supporting hybrid data platforms and legacy integrations. Exposure to data lakes, delta lakes, and modern data architectures. Knowledge of containerization (Docker), Kubernetes, and CI/CD pipelines. Familiarity with data governance, security, and compliance frameworks.
What's in it for you
Strong community: Work alongside top professionals in a friendly, open-door environment. Growth focus: Take on large-scale projects with a global impact and expand your expertise. Tailored learning: Boost your skills with internal events (meetups, conferences, workshops), Udemy access, language courses, and company-paid certifications. Endless opportunities: Explore diverse domains through internal mobility, finding the best fit to gain hands-on experience with cutting-edge technologies. Care: We've got you covered with company-paid medical insurance, mental health support, and financial & legal consultations.
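As a generic illustration of the "ensure data quality" responsibility in a PySpark pipeline, here is a minimal sketch of a quality gate that blocks a load when a null-rate threshold is breached. The dataset, column name, paths, and threshold are illustrative assumptions only, not part of the actual role or codebase.

```python
# Minimal sketch of a data-quality gate for a PySpark pipeline, assuming a
# hypothetical "events" dataset; column names, paths, and thresholds are illustrative.
from pyspark.sql import SparkSession, DataFrame, functions as F

def null_rate(df: DataFrame, column: str) -> float:
    """Fraction of rows where `column` is null."""
    total = df.count()
    if total == 0:
        return 0.0
    return df.filter(F.col(column).isNull()).count() / total

def quality_gate(df: DataFrame, column: str, max_null_rate: float = 0.01) -> DataFrame:
    """Fail fast if the null rate breaches the agreed threshold, else pass data on."""
    rate = null_rate(df, column)
    if rate > max_null_rate:
        raise ValueError(f"{column} null rate {rate:.2%} exceeds {max_null_rate:.2%}")
    return df

if __name__ == "__main__":
    spark = SparkSession.builder.appName("quality-gate-demo").getOrCreate()
    events = spark.read.json("s3://raw-zone/events/")      # hypothetical source
    validated = quality_gate(events, "user_id")            # gate before curation
    validated.write.mode("overwrite").parquet("s3://curated-zone/events/")
```

In a real pipeline, checks like this would typically run inside the orchestration tool (for example Airflow) so a breach fails the task and alerts the team rather than silently loading bad data.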
Posted 1 week ago
7.0 - 11.0 years
22 - 27 Lacs
Chennai
Work from Office
The opportunity:
As a Global Configurator Deployment Specialist, you will play a pivotal role in ensuring the smooth operation of our product customization processes. Your responsibilities encompass both configuration governance and master data management. The primary responsibility of the Global Application Specialist is to onboard various Transformer products into different types of configurators and the Web shop. You are strongly involved in all operational activities and contribute to various areas, from solution design through testing, deployment, configuration, and development to user training.
How you'll make an impact:
- Configuration Management: Lead the deployment and maintenance of configuration data, including product specifications of complex industrial goods, costing, and pricing; collaborate closely with developers to ensure business satisfaction.
- Business understanding: Develop business requirements based on company needs; drive ROI assessments for new developments.
- Testing and Validation: Rigorously test and validate configuration models and scenarios.
- Collaboration: Work closely with cross-functional teams, including product managers, engineers, and sales, to maintain consistency and accuracy in product configurations; create documentation and training material.
- Performance Monitoring: Continuously monitor and analyze configuration performance, incorporating customer feedback for improvements.
- Training and Support: Provide training and support to factory and sales teams on configurator and web shop utilization.
- Cross-Functional Project Management: Manage cross-functional projects related to configurator and web shop enhancements.
- Communication: Facilitate communication between the business and IT, resolving conflicts and gaining commitment.
- Requirement Engineering: Determine which requirements are covered by standard functionalities and identify areas for enhancement.
- Data-Driven Decision-Making: Analyze data from configurator and web shop usage to inform decision-making and improve tool effectiveness.
- Core Data Elements: Manage critical master data objects, such as products, business partners, technical assets, and enterprise structures.
- Data Compliance: Ensure compliance with data standards and policies.
- Data Quality: Perform data cleansing, validation, and enrichment to maintain high-quality master data.
- Data Integration: Support data migration, integration, and synchronization efforts.
- Issue Resolution: Address data-related issues promptly and provide data support to stakeholders.
- Compliance: Ensure compliance with applicable external and internal regulations, procedures, and guidelines.
- Safety and integrity: Live Hitachi Energy's core values of safety and integrity, which means taking responsibility for your own actions while caring for your colleagues and the business.
Your background:
- Bachelor's degree in a relevant field (e.g., Business, IT, Engineering).
- Experience working within a multinational company and managing cross-regional projects is highly desirable.
- Demonstrated excellence in project management.
- Solid background in sales and marketing.
- Experience with e-Config and Camos tools is an added advantage.
- Knowledge of configuration strategies across various business archetypes.
- Technical expertise in configuration and master data management tools.
- Established proficiency in Salesforce.com.
- Experience/knowledge of SAP SD, MM, and Pricing.
- Familiarity with MS Project and Jira is beneficial.
- Robust problem-solving capabilities and meticulous attention to detail.
- Strong communication skills for effective cross-team collaboration.
- Previous experience in pricing is highly valued.
- Comprehensive cost modelling and understanding of full cost models (highly advantageous).
- Proficiency in both spoken and written English is required.
Qualified individuals with a disability may request a reasonable accommodation if you are unable or limited in your ability to use or access the Hitachi Energy career site as a result of your disability. You may request reasonable accommodations by completing a general inquiry form on our website. Please include your contact information and specific details about your required accommodation to support you during the job application process.
Posted 1 week ago
4.0 - 8.0 years
4 - 8 Lacs
Pune, Ahmedabad, Vadodara
Work from Office
Job Description:
Job Summary: We are looking for a talented Power BI Developer with hands-on experience in Microsoft Fabric to join our data and analytics team. The ideal candidate will be responsible for building interactive dashboards, integrating various data sources using Fabric Dataflows, and implementing end-to-end BI solutions that drive business decisions. A strong understanding of the Microsoft data ecosystem, DAX, and data modeling best practices is essential. Knowledge of the manufacturing industry would be an added advantage.
Key Responsibilities:
Power BI Development:
- Design and develop visually compelling, interactive reports and dashboards using Power BI.
- Build data models, implement DAX calculations, and optimize performance for large datasets.
- Create reusable templates, themes, and standardized visuals.
Microsoft Fabric (Preferred/Good to Have):
- Utilize Microsoft Fabric components like OneLake, Lakehouse, Dataflows Gen2, Pipelines, and Notebooks.
- Connect and manage Data Warehouses and Lakehouses for unified data modeling.
- Collaborate with data engineers to create semantic models using Direct Lake, Import, or DirectQuery modes.
- Work with Notebooks for data exploration and Power BI integrations for data storytelling (see the illustrative sketch at the end of this posting).
Data Integration & Governance:
- Connect Power BI to various data sources (SQL Server, Excel, SharePoint, Fabric, Dataverse, SAP, APIs, etc.).
- Implement row-level security (RLS) and dataset certifications to ensure secure and trusted analytics.
- Ensure data quality, integrity, and documentation across the BI solution lifecycle.
Collaboration & Communication:
- Gather requirements from stakeholders and translate them into scalable BI solutions.
- Work closely with cross-functional teams including Data Engineers, Analysts, and Business Users.
- Present analytical insights to non-technical users in a meaningful way.
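Purely as an illustration of the notebook-based exploration mentioned above, a Spark notebook cell (for example in a Microsoft Fabric Lakehouse notebook) might resemble the sketch below; the table and column names are assumed placeholders.

```python
# Illustrative exploration cell for a Spark-backed notebook (for example a
# Microsoft Fabric Lakehouse notebook). The table and column names below are
# hypothetical placeholders, not part of this role description.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()     # in a notebook this returns the provided session

sales = spark.read.table("sales_orders")       # assumed Lakehouse table

summary = (
    sales.groupBy("region")
         .agg(
             F.countDistinct("order_id").alias("orders"),
             F.sum("net_amount").alias("revenue"),
         )
         .orderBy(F.desc("revenue"))
)

summary.show(10, truncate=False)               # quick sanity check before modeling in Power BI
```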
Posted 1 week ago
1.0 - 5.0 years
5 - 9 Lacs
Coimbatore
Work from Office
Responsibilities:
- Take responsible, self-driven ownership of work in the field of data engineering.
- Design, develop, and maintain data pipelines and ETL processes to ingest, process, transform, and store large volumes of data from diverse sources.
- Collaborate with business stakeholders to understand their data requirements and provide data-driven solutions and insights for decision making.
- Optimize and tune data pipelines and database performance for scalability, efficiency, and reliability.
- Implement and maintain data warehouses, data lakes, and other data storage solutions for efficient data retrieval and analysis.
- Ensure data quality, integrity, and security throughout the data lifecycle.
- Create data visualizations and reports to communicate findings effectively to technical and non-technical audiences.
- Stay updated with the latest data engineering tools and best practices.
- Show interest in data (analysis, schema, validation, visualization).
- Handle both big data and small data.
Required Skills:
- Proven experience as a Data Engineer, showcasing expertise in both domains.
- Proficiency in programming languages such as Python, with proper coding standards.
- Strong knowledge of PySpark, Impala, Hive, and Tableau.
- Strong experience with SQL and relational databases (e.g., PostgreSQL, MySQL).
- Familiarity with NoSQL databases (e.g., MongoDB, Cassandra) is a plus.
- Knowledge of data modeling, data warehousing, and data architecture.
- Experience with big data processing frameworks (e.g., Hadoop, Spark).
- In-depth knowledge of orchestrating and scheduling data engineering pipelines with Apache Airflow (see the illustrative sketch at the end of this posting).
- Experience with cloud-based data solutions (e.g., AWS, Azure, GCP) is advantageous.
- Experience with Jira, Git, and Bitbucket.
- Excellent problem-solving and analytical skills.
- Strong communication and presentation skills.
- Ability to work independently and collaboratively in a team environment.
Qualifications:
- Bachelor's or Master's degree in Computer Science or a related field.
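As a hedged illustration of the Airflow orchestration listed in the required skills, a minimal daily DAG wiring an extract step to a load step could look like the following; the DAG id, task names, and task bodies are placeholders. In practice the placeholder callables would be replaced by the team's actual extract/transform/load logic or by provider operators.

```python
# Minimal, illustrative Airflow DAG (Airflow 2.x style): one extract task feeding
# one load task on a daily schedule. The DAG id, task names, and task bodies are
# placeholders, not a description of any specific production pipeline.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**_):
    # Placeholder: pull data from a source system into staging.
    print("extracting raw data")


def load(**_):
    # Placeholder: transform staged data and load it into the warehouse.
    print("loading curated data")


with DAG(
    dag_id="example_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",        # on Airflow < 2.4 use schedule_interval instead
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task
```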
Posted 1 week ago
7.0 - 11.0 years
25 - 30 Lacs
Hosur, Bengaluru
Work from Office
Key Responsibilities
Data Architecture Design:
- Define and design Big Data architecture solutions, including data lakes, data warehouses, and real-time processing systems.
- Architect and implement scalable, secure, and high-performance data pipelines and data integration solutions.
- Ensure alignment with industry best practices and organizational goals for data architecture.
Big Data Ecosystem Management:
- Develop and manage workflows using Big Data tools like Hadoop, Spark, Kafka, Hive, and Flink.
- Leverage cloud-based Big Data services (AWS EMR, Azure Synapse, GCP BigQuery, or similar) to optimize performance and scalability.
- Oversee the implementation of streaming data platforms to support real-time analytics (see the illustrative sketch at the end of this posting).
Data Modeling and Integration:
- Design and maintain data models (conceptual, logical, and physical) that support structured and unstructured data.
- Build robust ETL/ELT processes to ingest, process, and integrate large volumes of diverse data sources.
- Implement APIs and frameworks for seamless data sharing and consumption.
Data Governance and Security:
- Establish frameworks to ensure data quality, lineage, and governance across the data lifecycle.
- Implement security measures for data at rest and in motion using encryption and access controls.
- Ensure compliance with global data regulations such as GDPR, CCPA, or similar.
- Gen AI exposure/experience is mandatory.
Collaboration and Stakeholder Engagement:
- Partner with data engineers, data scientists, business analysts, and IT teams to align architecture with business needs.
- Translate complex technical concepts into actionable insights for stakeholders.
Performance Optimization and Monitoring:
- Monitor and optimize performance of Big Data systems, ensuring low latency and high reliability.
- Troubleshoot and resolve performance bottlenecks in distributed data environments.
Emerging Technology and Innovation:
- Evaluate and implement emerging technologies, such as graph databases, NoSQL systems, and AI-driven analytics platforms.
- Continuously explore innovations in the Big Data ecosystem to drive efficiency and competitive advantage.
Success Criteria:
- Explore different tech stacks and architecture designs.
- Document supporting evidence with KPIs for decisions on solution design; document product guidelines and protocols for seamless use of the framework.
- Prior experience with ETL and Big Data to set up scalable pipelines that process data in real time and in batch.
- Develop configurable solutions to support cross-functional requirements and multiple platforms.
- Experience in managing cross-functional teams, requirements gathering, and day-to-day relationships with clients and stakeholders, supporting clients to achieve better outcomes.
Preferred Qualifications
Experience:
- 10+ years of experience in data architecture, with at least 3+ years focusing on Big Data technologies.
- 5+ years as a Data Architect, with proficiency working in environments supporting solution design.
- Proven track record of delivering end-to-end Big Data solutions in enterprise environments.
Technical Expertise:
- Strong understanding of Big Data frameworks like Hadoop, Spark, Kafka, Hive, Flink, and Presto.
- Proficiency in cloud-based Big Data platforms (AWS EMR, Azure Synapse, GCP BigQuery, or Databricks).
- Expertise in database systems, including both SQL (e.g., PostgreSQL, MySQL) and NoSQL (e.g., MongoDB, Cassandra).
- Hands-on experience with ETL tools like Talend, Informatica, or Apache NiFi.
- Familiarity with data visualization tools (e.g., Tableau, Power BI) and analytics platforms.
Certifications:
- Certifications such as AWS Certified Data Analytics, Azure Data Engineer Associate, GCP Professional Data Engineer, or Hadoop certifications are highly desirable.
Key Attributes:
- Strong analytical and problem-solving skills with a passion for data-driven innovation.
- Excellent communication and collaboration abilities to engage both technical and non-technical stakeholders.
- Strategic mindset with a focus on scalability, performance, and alignment with business objectives.
- Ability to thrive in fast-paced environments and handle multiple priorities effectively.
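As a rough illustration of the real-time streaming platforms referenced in the responsibilities, a Spark Structured Streaming job reading from Kafka might be sketched as follows; the broker address, topic, schema, and storage paths are assumptions, not details of any actual system.

```python
# Illustrative Spark Structured Streaming job: consume JSON events from Kafka
# and append them to a data-lake path for real-time analytics. The broker
# address, topic, schema, and output paths are hypothetical placeholders, and
# the job assumes the spark-sql-kafka connector package is on the classpath.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("kafka_stream_ingest").getOrCreate()

event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_ts", TimestampType()),
])

events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")   # assumed broker
         .option("subscribe", "events")                       # assumed topic
         .load()
         .select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
         .select("e.*")
)

query = (
    events.writeStream.format("parquet")
          .option("path", "s3a://lake/events/")                     # assumed sink
          .option("checkpointLocation", "s3a://lake/_chk/events/")  # required for streaming jobs
          .outputMode("append")
          .start()
)
query.awaitTermination()
```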
Posted 1 week ago
7.0 - 11.0 years
25 - 30 Lacs
Hosur, Bengaluru
Work from Office
Data Architect
Location: Bangalore / Hyderabad / Pune / Coimbatore
Position Overview
We are seeking an experienced Data Architect to design, implement, and optimize cutting-edge data solutions with a focus on Big Data technologies. This role will involve creating robust and scalable data architectures to support analytics, AI, and business intelligence initiatives. The ideal candidate will have deep expertise in data modeling, integration, governance, and the advanced tools and frameworks used in Big Data environments.
Key Responsibilities
Data Architecture Design:
- Define and design Big Data architecture solutions, including data lakes, data warehouses, and real-time processing systems.
- Architect and implement scalable, secure, and high-performance data pipelines and data integration solutions.
- Ensure alignment with industry best practices and organizational goals for data architecture.
Big Data Ecosystem Management:
- Develop and manage workflows using Big Data tools like Hadoop, Spark, Kafka, Hive, and Flink.
- Leverage cloud-based Big Data services (AWS EMR, Azure Synapse, GCP BigQuery, or similar) to optimize performance and scalability.
- Oversee the implementation of streaming data platforms to support real-time analytics.
Data Modeling and Integration:
- Design and maintain data models (conceptual, logical, and physical) that support structured and unstructured data.
- Build robust ETL/ELT processes to ingest, process, and integrate large volumes of diverse data sources.
- Implement APIs and frameworks for seamless data sharing and consumption.
Data Governance and Security:
- Establish frameworks to ensure data quality, lineage, and governance across the data lifecycle.
- Implement security measures for data at rest and in motion using encryption and access controls.
- Ensure compliance with global data regulations such as GDPR, CCPA, or similar.
- Gen AI exposure/experience is mandatory.
Collaboration and Stakeholder Engagement:
- Partner with data engineers, data scientists, business analysts, and IT teams to align architecture with business needs.
- Translate complex technical concepts into actionable insights for stakeholders.
Performance Optimization and Monitoring:
- Monitor and optimize performance of Big Data systems, ensuring low latency and high reliability.
- Troubleshoot and resolve performance bottlenecks in distributed data environments.
Emerging Technology and Innovation:
- Evaluate and implement emerging technologies, such as graph databases, NoSQL systems, and AI-driven analytics platforms.
- Continuously explore innovations in the Big Data ecosystem to drive efficiency and competitive advantage.
Success Criteria:
- Explore different tech stacks and architecture designs.
- Document supporting evidence with KPIs for decisions on solution design; document product guidelines and protocols for seamless use of the framework.
- Prior experience with ETL and Big Data to set up scalable pipelines that process data in real time and in batch.
- Develop configurable solutions to support cross-functional requirements and multiple platforms.
- Experience in managing cross-functional teams, requirements gathering, and day-to-day relationships with clients and stakeholders, supporting clients to achieve better outcomes.
Preferred Qualifications
Education:
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
Experience:
- 10+ years of experience in data architecture, with at least 3+ years focusing on Big Data technologies.
- 5+ years as a Data Architect, with proficiency working in environments supporting solution design.
- Proven track record of delivering end-to-end Big Data solutions in enterprise environments.
Technical Expertise:
- Strong understanding of Big Data frameworks like Hadoop, Spark, Kafka, Hive, Flink, and Presto.
- Proficiency in cloud-based Big Data platforms (AWS EMR, Azure Synapse, GCP BigQuery, or Databricks).
- Expertise in database systems, including both SQL (e.g., PostgreSQL, MySQL) and NoSQL (e.g., MongoDB, Cassandra).
- Hands-on experience with ETL tools like Talend, Informatica, or Apache NiFi.
- Familiarity with data visualization tools (e.g., Tableau, Power BI) and analytics platforms.
Certifications:
- Certifications such as AWS Certified Data Analytics, Azure Data Engineer Associate, GCP Professional Data Engineer, or Hadoop certifications are highly desirable.
Key Attributes:
- Strong analytical and problem-solving skills with a passion for data-driven innovation.
- Excellent communication and collaboration abilities to engage both technical and non-technical stakeholders.
- Strategic mindset with a focus on scalability, performance, and alignment with business objectives.
- Ability to thrive in fast-paced environments and handle multiple priorities effectively.
Posted 1 week ago
3.0 - 7.0 years
18 - 20 Lacs
Bengaluru
Work from Office
You are a strategic thinker passionate about driving solutions in business architecture and data management. You have found the right team.
As a Banking Book Product Owner Sr. Associate in our Firmwide Finance Business Architecture (FFBA) team, you will spend each day defining, refining, and delivering set goals for our firm. You will partner with stakeholders across various lines of business and subject matter experts to understand products, data, source system flows, and business requirements related to Finance and Risk applications and infrastructure.
As a Product Owner on the Business Architecture team, you will work closely with Line of Business stakeholders, data Subject Matter Experts, Consumers, and technology teams across Finance, Credit Risk & Treasury, and various Program Management teams. Your primary responsibilities will include prioritizing the traditional credit product book of work, developing roadmaps, and delivering on multiple projects and programs during monthly releases. Your expertise in data analysis will be instrumental in identifying trends, optimizing processes, and driving business growth. As our organization grows, so does our reliance on insightful, data-driven decisions. You will dissect complex datasets to unearth actionable insights while maintaining a strong understanding of data governance, data quality, and data management principles.
Job Responsibilities
- Utilize the Agile framework to write business requirements in the form of user stories to enhance data, test execution, reporting automation, and digital analytics toolsets.
- Engage with development teams to translate business needs into technical specifications, ensuring acceptance criteria are met.
- Drive adherence to product and Release Management standards and operating models.
- Manage the release plan, including scope, milestones, sourcing requirements, test strategy, execution, and stakeholder activities.
- Collaborate with lines of business to understand products, data capture methods, and strategic data sourcing into a cloud-based big data architecture.
- Identify and implement solutions for business process improvements, creating supporting documentation and enhancing the end-user experience.
- Collaborate with Implementation leads, Release managers, Project managers, and data SMEs to align data and system flows with Finance and Risk applications.
- Oversee the entire Software Development Life Cycle (SDLC) from requirements gathering to testing and deployment, ensuring seamless integration and execution.
Required qualifications, capabilities, and skills
- Bachelor's degree with 8+ years of experience in Project Management or Product Ownership, with a focus on process re-engineering.
- Proven experience as a Product Owner with a strong understanding of agile principles and delivering complex programs.
- Strong analytical and problem-solving abilities, with the capacity to quickly assimilate business and technical knowledge.
- Experience in Finance, Risk, or Operations as a Product Lead.
- Familiarity with Traditional Credit Products and Liquidity and Credit reporting data.
- Highly responsible, detail-oriented, and able to work with tight deadlines.
- Excellent written and verbal communication skills, with the ability to articulate complex concepts to diverse audiences.
- Strong organizational abilities to manage multiple work streams concurrently, maintaining sound judgment and a risk mindset.
- Solid understanding of financial and regulatory reporting processes.
- Energetic, adaptable, self-motivated, and effective under pressure.
- Basic knowledge of cloud technologies (e.g., AWS).
Preferred qualifications, capabilities, and skills
- Knowledge of JIRA, SQL, the Microsoft suite of applications, Databricks, and data visualization/analytical tools (Tableau, Alteryx, Python) is a plus.
- Knowledge and experience of Traditional Credit Products (Loans, Deposits, Cash, etc.) and Trading Products (Derivatives and Securities) is a plus.
Posted 1 week ago