8.0 - 13.0 years
17 - 22 Lacs
Bengaluru
Work from Office
Project description
We've been engaged by a large Australian financial institution to provide resources to their upcoming Murex Binary Upgrade project. The bank is considering moving from its current Murex version (v42) to a later version (v56 or v58), with changes mainly impacting the Collateral, Market Risk, Credit Risk, and Ops and Settlement functionalities and modules. There are also some impacts in MX technical areas, Reporting, and EODs. We require an experienced Murex FO Consultant as part of this upgrade project.

Responsibilities
- Understand the Murex system setup at the client, the organization of the support teams, end-of-day procedures, and the report delivery process
- Analyse the DataMart setup and the table structures to identify redundant objects, minimize batch execution times, and find opportunities to reuse objects
- Collect and provide detailed technical specifications / business requirement specifications
- Develop new reports and/or update existing reports based on client requirements
- Execute processing scripts and batches manually or through a scheduling tool such as Control-M
- Reconcile report extraction output with the onscreen/report output
- Analyse differences caused by adding or removing filter conditions, dynamic table flags, or launcher flags
- Create and maintain a report delivery plan with relevant traceability
- Write transformation logic to convert source data formats into a Murex-understandable format (MxML)
- Create MxML import and export workflows using MxML Exchange
- Build a reconciliation process across source and destination
- Configure messaging queues for real-time interfacing
- Document functional specifications, technical specifications, and test cases for integration
- Produce exception reports for failures
- Configure and build Murex reports for report-based interfaces
- Build custom tasks in MxML Exchange for specific processing not available through the standard task library

Skills
Must have
- 8+ years of Murex development experience
- Minimum of 4 years of MxML and 2 years of Murex DataMart experience on Murex 3.1
- Expert understanding of the Murex DataModel, Dynamic Tables, and Viewers
- Strong understanding of Murex DataMart best practices and solution design
- Good exposure to Unix shell scripting on Solaris
- Experience applying MX.3 DataMart and SQL optimization techniques
- Experience in MX.3 DataMart batch creation and scheduling
- Experience creating test cases for SIT and UAT testing
- Experience in DataMart/EOD solution design and effort estimation, with limited support required

There will be close interaction with business stakeholders. You would be expected to work largely independently as part of a Luxoft-managed team in an Agile, story-point-based environment.

Nice to have
- Functional understanding of capital markets
- Experience in other Murex modules

Other
Languages: English: C1 Advanced
Seniority: Senior
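The "transformation logic for source data format to Murex understandable format" responsibility can be illustrated with a minimal sketch: mapping a flat source trade record onto a nested XML document. This is only an illustration of the pattern, not the real Murex MxML schema; the element names (`trade`, `portfolio`, `instrument`, `notional`) are assumptions.

```python
# Minimal sketch of source-to-XML transformation logic.
# NOTE: element names are illustrative assumptions, not the
# actual Murex MxML schema.
import xml.etree.ElementTree as ET

def to_mxml(record: dict) -> str:
    """Map a flat source record onto a nested XML document."""
    root = ET.Element("trade")
    ET.SubElement(root, "portfolio").text = record["portfolio"]
    ET.SubElement(root, "instrument").text = record["instrument"]
    ET.SubElement(root, "notional").text = str(record["notional"])
    return ET.tostring(root, encoding="unicode")

src = {"portfolio": "FX_DESK", "instrument": "EURUSD", "notional": 1_000_000}
print(to_mxml(src))
```

In practice this mapping layer is what an MxML Exchange import workflow would consume; the sketch only shows the shape of the transformation step.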
Posted 16 hours ago
5.0 - 10.0 years
15 - 19 Lacs
Bengaluru
Work from Office
Project description
We require an experienced cross-skilled MxML developer with strong knowledge of Murex, experience in MxML solution design, and broad exposure to financial markets. You will be working as a subject matter expert in a team of Murex developers on a variety of tasks. This is the ideal role for someone multi-skilled across DM and MxML looking to gain more exposure to either of these technologies. It is also an opportunity to learn about Cloud (AWS) and DevOps tooling, since these are heavily used by our clients.

Responsibilities
Murex responsibilities (multi-skilled across DM and MxML; a general skillset is sufficient, deep experience in every area is not required):
- Write transformation logic to convert source data formats into a Murex-understandable format (MxML)
- Create MxML import and export workflows using MxML Exchange
- Update and create Datamart reports, including managing EOD changes
- Build a reconciliation process across source and destination
- Configure messaging queues for real-time interfacing
- Document functional specifications, technical specifications, and test cases for integration

Skills
Must have
- Around 5+ years of Murex knowledge covering MxML Exchange and Contract or Deliverable workflows
- Good exposure to writing/coding MxML formulae
- Has previously developed interfaces (deals, static data) via Murex, both upstream and downstream
- Knowledge of XML transformations, and document and template generation from MxML
- Experience and knowledge of Datamart report development and EOD processing
- Knowledge of the various tasks in MxML and how they work

Nice to have
- Murex development priority is MxML with exposure to Datamart, at mid/senior level
- Knowledge of SWIFT message generation (MT300, MT305, MT540, MT202, MT103)
- DevOps on Murex experience (Git, Jenkins, JIRA, etc.)
- Technical solution design experience and start-to-end solution ownership
- Experience with Interest Rate Derivatives and FX Derivatives
- DataMart experience

Other
Languages: English: C2 Proficient
Seniority: Senior
Posted 16 hours ago
7.0 - 12.0 years
15 - 19 Lacs
Bengaluru
Work from Office
Project description
We've been engaged by a large Australian financial institution to provide resources to their upcoming Murex Binary Upgrade project. The bank is considering moving from its current Murex version (v42) to a later version (v56 or v58), with changes mainly impacting the Collateral, Market Risk, Credit Risk, and Ops and Settlement functionalities and modules. There are also some impacts in MX technical areas, Reporting, and EODs. We require an experienced Murex FO Consultant as part of this upgrade project.

Responsibilities
- Contribute to requirement analysis in the Front Office space for various projects (upgrade, regulatory, etc.)
- Contribute to understanding the business needs, identifying business solutions, and validating the pros and cons of technical solution options
- Interact with the Front Office user base and interface between Business and IT with respect to Murex
- Configure the application as part of the implementation; this can relate to trade insertion (eTradepad, FDI, etc.), simulations, curve setup, product configuration (generators, indices, etc.)
- Work with the testing team to review test cases and coverage, and investigate any issues they encounter
- Follow up with Murex as and when necessary to resolve bugs and issues
- Solution design for the Front Office area
- Understand P&L and be able to attribute differences between the two versions

Skills
Must have
- 7+ years of Murex development experience
- 3+ years of relevant Murex (and/or other primary trading system) Front Office experience
- Good/expert knowledge of at least two asset classes (risk and pricing) of the Murex product suite, including IRD, FI, Credit, FXD, FX Cash, and Commodities
- Experienced in dealing with Risk, Product Control, or Front Office stakeholders in Markets or Treasury divisions
- Good hands-on knowledge of FO configuration: instruments, generators, curves, market data, market conventions, etc.
- Good understanding of FO modules: simulation screens, Simulation Viewer, eTradepad, P&L notepad, market operations, etc.
- Good knowledge of how to analyze P&L and sensitivity issues within Murex
- Knowledge of organization setup and user groups / access rights

Nice to have
- MReport / Datamart, pre-trade and post-trade workflows, and interfaces
- Deeper all-round knowledge of the Murex application

Other
Languages: English: C1 Advanced
Seniority: Senior
Posted 16 hours ago
3.0 - 7.0 years
15 - 19 Lacs
Bengaluru
Work from Office
Project description
Luxoft has been engaged by a leading UK financial services organization to provide Murex implementation services across a varied portfolio of projects. We require experienced integration developers with strong knowledge of Murex and broad exposure to financial markets to work on a multi-year program. You will be working in a high-performing team of Luxoft and client staff.

Responsibilities
- Technical analysis of changes, solution design, development/configuration, and unit testing of MxML workflows
- Technical analysis of changes, solution design, development, and unit testing
- Participate in fixing production and test defects
- End-to-end ownership of tasks in cooperation with business analysts and the testing team

Skills
Must have
- 4+ years of Murex MxML experience
- Advanced MxML workflow and formulae development
- Advanced SQL
- Good general financial market understanding
- Knowledge of the pre-trade framework, along with the MSL scripting language
- Unix

Nice to have
- Good Java experience, especially integrating Java code in MxML
- Pre-trade experience
- DevOps on Murex experience (Git, Jenkins, JIRA, etc.)
- Technical solution design experience and start-to-end solution ownership

Other
Languages: English: B2 Upper Intermediate
Seniority: Regular
Posted 16 hours ago
5.0 - 10.0 years
17 - 22 Lacs
Bengaluru
Work from Office
Project description
We require an experienced MxML developer with strong knowledge of Murex, experience in MxML solution design, and broad exposure to financial markets. You will be working as a subject matter expert in a team of Murex developers on a variety of tasks.

Responsibilities
- Write transformation logic to convert source data formats into a Murex-understandable format (MxML)
- Create MxML import and export workflows using MxML Exchange
- Build a reconciliation process across source and destination
- Configure messaging queues for real-time interfacing
- Document functional specifications, technical specifications, and test cases for integration
- Produce exception reports for failures
- Configure and build Murex reports for report-based interfaces
- Build custom tasks in MxML Exchange for specific processing not available through the standard task library

Skills
Must have
- Around 5+ years of Murex knowledge covering MxML Exchange and Contract or Deliverable workflows
- Good exposure to writing/coding MxML formulae
- Has previously developed interfaces (deals, static data) via Murex, both upstream and downstream
- Knowledge of XML transformations, document generation, and template generation from MxML
- Knowledge of the various tasks in MxML and how they work

Nice to have
- DevOps on Murex experience (Git, Jenkins, JIRA, etc.)
- Technical solution design experience and start-to-end solution ownership
- Experience with Interest Rate Derivatives and FX Derivatives

Other
Languages: English: C1 Advanced
Seniority: Senior
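The "reconciliation process across source and destination" named above can be sketched as a key-level comparison. This is a minimal sketch with made-up field names (`id`, `notional`); a real implementation would read the source feed and the Murex-side extract rather than in-memory lists.

```python
# Sketch of a source-vs-destination reconciliation by trade id.
# Field names ("id", "notional") are illustrative assumptions.
def reconcile(source: list, destination: list) -> dict:
    src = {r["id"]: r for r in source}
    dst = {r["id"]: r for r in destination}
    missing = sorted(src.keys() - dst.keys())      # in source, never loaded
    unexpected = sorted(dst.keys() - src.keys())   # loaded, not in source
    mismatched = sorted(
        k for k in src.keys() & dst.keys()
        if src[k]["notional"] != dst[k]["notional"]
    )
    return {"missing": missing, "unexpected": unexpected, "mismatched": mismatched}

report = reconcile(
    [{"id": 1, "notional": 100}, {"id": 2, "notional": 200}],
    [{"id": 1, "notional": 100}, {"id": 2, "notional": 250}, {"id": 3, "notional": 50}],
)
print(report)  # {'missing': [], 'unexpected': [3], 'mismatched': [2]}
```

The three buckets (missing, unexpected, mismatched) are typically what feeds the exception reports for failures mentioned in the responsibilities.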
Posted 16 hours ago
3.0 - 7.0 years
14 - 19 Lacs
Bengaluru
Work from Office
Project description
Our customer is a leading bank that provides a front-to-back integrated platform for straight-through processing and risk management. This is a multi-year initiative in which different projects run concurrently under the program's variety of milestones. These streams include new product initiatives, new entity roll-outs, and regulatory compliance. We will have key roles in projects such as managing the scope, design, and delivery of requirements from front to back office with Excelian.

Responsibilities
General:
- Self-driven and able to delegate work to team members
- Able to review work from team members
- Strong analytic and technical skills
- Good understanding of the usage of various dynamic tables
- Good knowledge of one or more development tools
- Able to configure, execute, and troubleshoot batch reports in MXG
- Able to design and optimize the usage of dynamic tables
- Able to guide team members
- Ensure all deliverables are of good quality and delivered on time
- Able to escalate issues/risks in a timely manner to the supervisor
- Good communication skills; able to work with both technical and business stakeholders

Murex application:
- Understand the Murex system setup at the client, the organization of the support teams, end-of-day procedures, and the report delivery process
- Collect and provide detailed technical specifications and business requirement specifications
- Participate in peer review of requirements, technical, and/or testing documentation, and help mentor junior members in the same
- Analyse the Datamart setup and the table structures to identify redundant objects, minimize batch execution times, and find opportunities to reuse objects
- Segregate reports by users, products, creation classes, fields needed, complexity, and frequency of execution
- Develop a generic data model, then create Datamart objects as required for the reports
- Execute processing scripts and batches manually or through a scheduling tool such as Control-M
- Reconcile report extraction output with the onscreen/report output
- Analyse differences caused by adding or removing filter conditions, dynamic table flags, or launcher flags
- Create and suggest processes, templates, and tools to streamline the development, testing, and implementation phases
- Analyse issues during the planning and test execution phases
- Provide relevant and accurate information about defects and help the business reproduce errors
- Prepare and send effective, timely periodic status reports
- Track testing progress and escalate issues well in advance
- Assist in the preparation and execution of test plans
- Identify the conditions that cause errors to occur, and escalate outstanding issues for clarification and resolution
- Escalate identified issues/risks in a timely fashion to the team lead
- Assist the team lead with effort estimation for development and UAT
- Ensure test documentation and deliverables are consistent with defined standards
- Coordinate with the development and infrastructure teams on the availability of Mx environments and databases
- Create and maintain a report delivery plan with relevant traceability
- Have excellent communication skills, both written and oral
- Help increase the team's domain and application knowledge through formal/informal sessions or discussions

Skills
Must have
- Minimum 8-10 years of Murex Datamart experience on Murex 3.1
- 8+ years of Murex development experience
- Good exposure to Datamart architecture and solution design
- Some experience with EOD and Control-M or Autosys scheduling
- Advanced SQL
- Advanced financial market understanding covering different asset classes
- Unix

Nice to have
- DevOps on Murex experience (Git, Jenkins, JIRA, etc.)
- Technical solution design experience and start-to-end solution ownership
- Understanding of other Murex modules and/or other financial markets applications

Other
Languages: English: C1 Advanced
Seniority: Senior
Posted 16 hours ago
5.0 - 8.0 years
3 - 7 Lacs
Hyderabad
Work from Office
Role Purpose
The purpose of this role is to design, test, and maintain software programs for operating systems or applications that need to be deployed at a client end, and to ensure they meet 100% of quality assurance parameters.

Responsibilities
As an ETL Developer, the candidate will be expected to help the team craft data solutions that meet business and enterprise requirements, using SQL and ETL tools (Informatica) to enhance and support a Data Warehouse.

Required skills and experience we are looking for:
- Strong SQL skills, awareness/experience of good coding practices, and the ability to articulate them
- Practical working experience with relational databases, preferably Oracle Exadata
- Familiarity with Data Warehouse / Data Mart / Data Modeling / Business Intelligence concepts
- Familiarity with Autosys / orchestration tools
- Someone who likes data, enjoys puzzles, and is intrigued by using new and existing tools to map out data transformation solutions

Desired skills:
- Exposure to / self-education in Unix scripting
- Exposure to / self-education in Python (e.g. pandas, DataFrames) and its use in data processing solutions
- Exposure to CI/CD processes and tools (e.g. Ansible, Jenkins)
- Experience designing and building data warehouses, and knowledge of the data flows involved
- Willingness to take on problems outside of the current skillset and experience
- Experience with development methodologies and working in an agile framework

Do
1. Be instrumental in understanding the requirements and design of the product/software
- Develop software solutions by studying information needs, systems flow, data usage, and work processes
- Investigate problem areas, following the software development life cycle
- Facilitate root cause analysis of system issues and the problem statement
- Identify ideas to improve system performance and impact availability
- Analyze client requirements and convert requirements into feasible designs
- Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements
- Confer with project managers to obtain information on software capabilities

2. Perform coding and ensure optimal software/module development
- Determine operational feasibility by evaluating analysis, problem definition, requirements, software development, and proposed software
- Develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases, and executing them
- Modify software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces
- Analyze information to recommend and plan the installation of new systems or modifications of an existing system
- Ensure that code is error-free, with no bugs or test failures
- Prepare reports on programming project specifications, activities, and status
- Ensure all defects are raised as per the norms defined for the project/program/account, with clear descriptions and replication patterns
- Compile timely, comprehensive, and accurate documentation and reports as requested
- Coordinate with the team on daily project status and progress, and document it
- Provide feedback on usability and serviceability, trace the results to quality risk, and report them to the concerned stakeholders

3. Status reporting and customer focus on an ongoing basis with respect to the project and its execution
- Capture all requirements and clarifications from the client for better-quality work
- Take feedback on a regular basis to ensure smooth and on-time delivery
- Participate in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members
- Consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements
- Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments, and clear code
- Document all necessary details and reports formally, for a proper understanding of the software from client proposal to implementation
- Ensure good quality of interaction with the customer w.r.t. e-mail content, fault report tracking, voice calls, business etiquette, etc.
- Respond to customer requests in a timely manner, with no instances of complaints either internally or externally

Deliver
No. | Performance parameter | Measure
1. | Continuous integration, deployment & monitoring of software | 100% error-free onboarding & implementation, throughput %, adherence to the schedule/release plan
2. | Quality & CSAT | On-time delivery, manage software, troubleshoot queries, customer experience, completion of assigned certifications for skill upgradation
3. | MIS & reporting | 100% on-time MIS & report generation

Mandatory Skills: Informatica iPaaS
Experience: 5-8 years
Posted 1 day ago
6.0 - 11.0 years
10 - 20 Lacs
Noida
Remote
Data Modeler

Responsibilities:
- Conduct requirement gathering and translate business understanding into data modeling terminology
- Perform data profiling tasks, including granularity checks, datatypes, nullability, anomalies, etc.
- Design and maintain conceptual, logical, and physical data models, as well as source-to-target mapping sheets and DDL generation
- Collaborate closely with business analysts, data architects, and other stakeholders to understand the data requirements and business rules
- Enforce data model standards and adhere to standards across Snowflake
- Maintain documentation for data models, database schemas, and changes over time
- Implement and oversee version control for data models, tracking changes and maintaining a history of modifications
- Offer support to developers, analysts, and other team members as needed
- Conduct design reviews with the central team and incorporate suggested changes

Qualifications:
- Proven experience as a Data Modeler
- Familiarity with manufacturing-industry terminology

Good to have:
- Experience with data warehouse and ETL / data engineering concepts
- Excellent collaboration and communication skills
- Self-starting abilities with minimal guidance
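The data profiling tasks listed above (datatypes, nullability, anomalies) can be sketched with stdlib Python. This is a toy illustration on made-up sample data; a real profile would run against the warehouse itself.

```python
# Sketch of basic column profiling: null rate, distinct count,
# and inferred value types (mixed types flag a datatype anomaly).
from collections import Counter

def profile_column(values: list) -> dict:
    non_null = [v for v in values if v is not None]
    types = Counter(type(v).__name__ for v in non_null)
    return {
        "null_rate": (len(values) - len(non_null)) / len(values),
        "distinct": len(set(non_null)),
        "types": dict(types),
    }

col = [10, 20, None, 20, "20"]   # made-up sample with a type anomaly
print(profile_column(col))
```

A `types` dict with more than one entry, or a `null_rate` above an agreed threshold, is the kind of finding a profiling pass would escalate before modeling begins.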
Posted 1 day ago
7.0 - 12.0 years
5 - 15 Lacs
Bengaluru
Work from Office
BE/B.Tech or equivalent

The data modeler designs, implements, and documents data architecture and data modeling solutions, including the use of relational, dimensional, and NoSQL databases. These solutions support enterprise information management, business intelligence, machine learning, data science, and other business interests.

- 7+ years of hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional, and NoSQL data platform technologies, and ETL and data ingestion protocols)
- Experience with data warehouse, data lake, and enterprise big data platforms in multi-data-center contexts required
- Good knowledge of metadata management, data modeling, and related tools (Erwin, ER/Studio, or others) required
- Experience in team management, communication, and presentation
- Be responsible for developing the conceptual, logical, and physical data models, and for implementing RDBMS, operational data stores (ODS), data marts, and data lakes on target platforms (SQL/NoSQL)
- Oversee and govern the expansion of the existing data architecture and the optimization of data query performance via best practices
- The candidate must be able to work independently and collaboratively

Responsibilities
- Implement business and IT data requirements through new data strategies and designs across all data platforms (relational, dimensional, and NoSQL) and data tools (reporting, visualization, analytics, and machine learning)
- Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models
- Define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models
- Identify the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualization
- Hands-on modeling, design, configuration, installation, performance tuning, and sandbox POCs
- Work proactively and independently to address project requirements, and articulate issues/challenges to reduce project delivery risks

Required Skills
Data modeling, dimensional modeling, Erwin, data management, RDBMS, SQL/NoSQL, ETL
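The dimensional modeling this role calls for can be illustrated with a one-fact, one-dimension star schema. SQLite stands in for the target RDBMS here, and the table and column names are invented for the example.

```python
# Tiny star-schema sketch: one dimension, one fact, and the
# typical dimension-join aggregate query.
# SQLite stands in for the target RDBMS; names are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (
    product_key  INTEGER PRIMARY KEY,
    product_name TEXT NOT NULL
);
CREATE TABLE fact_sales (
    product_key INTEGER REFERENCES dim_product(product_key),
    amount      REAL NOT NULL
);
INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget');
INSERT INTO fact_sales VALUES (1, 10.0), (1, 5.0), (2, 7.5);
""")
rows = conn.execute("""
    SELECT d.product_name, SUM(f.amount)
    FROM fact_sales f JOIN dim_product d USING (product_key)
    GROUP BY d.product_name ORDER BY d.product_name
""").fetchall()
print(rows)  # [('gadget', 7.5), ('widget', 15.0)]
```

The same shape scales out: facts hold the measures, dimensions hold the descriptive attributes, and reporting queries join through the surrogate keys.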
Posted 2 days ago
7.0 - 12.0 years
10 - 14 Lacs
Pune
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must have skills: Murex Back Office Workflows
Good to have skills: Murex Front Office Finance
Minimum 7.5 year(s) of experience is required.
Educational Qualification: NA

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your day will involve overseeing the development process, coordinating with team members, and ensuring project milestones are met.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for the immediate team and across multiple teams
- Lead the application development process
- Coordinate with team members to ensure project milestones are met

Professional & Technical Skills:
- Must-have skills: proficiency in Murex Back Office Workflows
- Strong understanding of financial systems
- Experience in leading application development projects
- Knowledge of the software development lifecycle
- Hands-on experience in configuring applications

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Murex Back Office Workflows
- This position is based at our Pune office
- A degree in a relevant field is required

Qualification: NA
Posted 2 days ago
5.0 - 7.0 years
22 - 25 Lacs
Bengaluru
Work from Office
We are looking for an energetic, self-motivated, and exceptional data engineer to work on extraordinary enterprise products based on AI and big data engineering, leveraging the AWS/Databricks tech stack. You will work with a stellar team of architects, data scientists/AI specialists, data engineers, and integration specialists.

Skills and qualifications:
- 5+ years of experience in the DWH/ETL domain; Databricks/AWS tech stack
- 2+ years of experience building data pipelines with Databricks/PySpark/SQL
- Experience writing and interpreting SQL queries, and designing data models and data standards
- Experience with SQL Server databases, Oracle, and/or cloud databases
- Experience in data warehousing and data marts, and the Star and Snowflake models
- Experience loading data into databases from databases and files
- Experience analyzing and drawing design conclusions from data profiling results
- Understanding of business processes and the relationships of systems and applications
- Must be comfortable conversing with end users
- Must be able to manage multiple projects/clients simultaneously
- Excellent analytical, verbal, and communication skills

Role and responsibilities:
- Work with business stakeholders and build data solutions to address analytical and reporting requirements
- Work with application developers and business analysts to implement and optimise Databricks/AWS-based implementations meeting data requirements
- Design, develop, and optimize data pipelines using Databricks (Delta Lake, Spark SQL, PySpark), AWS Glue, and Apache Airflow
- Implement and manage ETL workflows using Databricks notebooks, PySpark, and AWS Glue for efficient data transformation
- Develop and optimize SQL scripts, queries, views, and stored procedures to enhance data models and improve query performance on managed databases
- Conduct root cause analysis and resolve production problems and data issues
- Create and maintain up-to-date documentation of the data model, data flow, and field-level mappings
- Provide support for production problems and daily batch processing
- Provide ongoing maintenance and optimization of database schemas, data lake structures (Delta tables, Parquet), and views to ensure data integrity and performance
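The extract-transform-load responsibilities above can be sketched end to end with the stdlib. SQLite stands in for the Databricks/managed-database targets, and the `orders` schema and cents-to-dollars rule are assumptions made for the illustration.

```python
# End-to-end ETL sketch: extract raw rows, transform (filter bad
# records, normalize units), load into the target, verify the count.
# SQLite stands in for the real Databricks/managed-database target.
import sqlite3

def run_pipeline(raw_rows: list, conn: sqlite3.Connection) -> int:
    conn.execute("CREATE TABLE IF NOT EXISTS orders (id INTEGER, amount_usd REAL)")
    # Transform: drop malformed rows, convert cents to dollars.
    cleaned = [
        (r["id"], r["amount_cents"] / 100.0)
        for r in raw_rows
        if r.get("id") is not None and r.get("amount_cents", 0) >= 0
    ]
    conn.executemany("INSERT INTO orders VALUES (?, ?)", cleaned)
    conn.commit()
    return conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]

conn = sqlite3.connect(":memory:")
raw = [{"id": 1, "amount_cents": 1250},
       {"id": None, "amount_cents": 300},   # malformed: dropped
       {"id": 2, "amount_cents": 99}]
print(run_pipeline(raw, conn))  # 2
```

A production pipeline would express the same three stages as PySpark/Delta Lake jobs orchestrated by Airflow or Glue, with the post-load count check feeding the daily batch monitoring.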
Posted 3 days ago
6.0 - 11.0 years
22 - 37 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
The candidate will have experience in designing, developing, and maintaining Murex Datamart / MxML solutions, or as a Business Analyst (FO/BA), or in Murex Testing/Support, along with experience in SQL, Unix, and Oracle databases. Design, develop, and maintain Murex Datamart solutions.

Required candidate profile: Murex MxML, Murex Front Office BA / Murex FO, Murex Testing, Murex Environment Engineer, Murex Datamart. Participate in code reviews; work with business analysts to understand requirements and develop solutions.
Posted 5 days ago
8.0 - 13.0 years
25 - 30 Lacs
Hyderabad
Work from Office
Design, build, and measure complex ELT jobs to process disparate data sources and form a high-integrity, high-quality, clean data asset. Working on a range of projects including batch pipelines, data modeling, and data mart solutions, you'll be part of collaborative project teams working to implement robust data collection and processing pipelines to meet specific business needs.
Posted 6 days ago
4.0 - 9.0 years
10 - 20 Lacs
Pune
Work from Office
Job Title: Data Modelling Manager
Required Experience: 4-7 years
Job Location: Pune (hybrid; 3 days office, 2 days home)
Schedule: 1 PM - 10 PM, Mon-Fri
Note: Looking for immediate joiners or those who can join within 20 days.

About Us:
Insights is a mission-driven, start-up technology company focused on transforming the healthcare payer industry, ultimately creating a more personalized patient experience, improving health outcomes, and lowering the overall cost of healthcare. Insights provides a flexible, efficient, and secure platform that organizes and exchanges healthcare data from various sources and formats, allowing our customers to uncover differentiated insights that address their clients' needs. Our employees know that they play an active role in keeping our customers' data safe and are responsible for ensuring that our comprehensive policies and practices are met. With our deep expertise in cloud-enabled technologies and knowledge of the healthcare industry, we have built an innovative data integration and management platform that allows healthcare payers access to data that has been historically siloed and inaccessible. Through our platform, these health insurance payers can ingest and manage all the data they need to transform their business by supporting their analytical, operational, and financial needs. Since our founding in 2017, we have built a highly successful SaaS business, raising more than $81 million from leading VC firms with deep expertise in the healthcare and technology industries. We are solving problems of massive scale and complexity in an industry that is ready for disruption. We're growing quickly and would love for you to be a part of it!

About the Role:
As our lead healthcare data expert, you will guide product managers, engineers, and operations staff on all aspects of healthcare data. Your deep experience in designing, implementing, and documenting data modeling solutions will enable the team to nimbly build comprehensive products that optimally structure and organize data to meet our customers' analytical and operational use cases. Your deep knowledge of identifying key business needs, and of defining and governing data modeling design standards and best practices, will champion the usage of data with our customers. This will be key to the success of our highly differentiated, automated, and scalable data distribution capability (API catalog, data marts, etc.), which serves diverse analytical and operational data consumption needs while pushing the envelope on a cloud-first approach to healthcare data integration. All of this, coupled with your passion for improving outcomes for all healthcare stakeholders, will drive our data strategy. This is a great opportunity to blaze your own trail in a start-up organization, working in a cross-functional, talented team to move the needle on some of the most pressing challenges in healthcare.

You should expect to:
- Identify and define data requirements for the data distribution/consumption use cases required by enterprise healthcare customers, including:
  - Knowledge of data modeling, data integrity principles, and SQL
  - Attributes, metrics, and lookup/reference data required for data marts (e.g. Risk Adjustment), and the ability to explain to engineering:
    - The data orientations/organization required to build an ETL pipeline
    - Core vs. ancillary attributes/metrics in our lossless canonical models
    - Expected cardinality vs. outlier ranges for attributes/metrics
  - Attributes and calculations required to define healthcare industry metrics/KPIs/KQIs
  - Typical queries users would ask of the data: facts, dimensions, JOINs
  - Data delivery patterns using complex events or business rules
  - Data enrichment (standardization, transformations, algorithms, grouping)
  - Guaranteed data quality criteria required to meet specific SLAs, including data federation with business-specific, governed data sources within an enterprise
- Work with Engineering to manage, groom, and prioritize the product feature backlog, including creating epics and acceptance criteria
- Help define data distribution implementation practices
- Partner with health plan service providers to define and drive adoption of data distribution APIs/SDKs
- Work with Client Management on product support during client engagements

Terrific if you have experience:
In a product role with:
- Analyzing, visioning, and road-mapping
- Influencing and evangelizing key stakeholders (customers, partners, thought leaders, senior management)
- Agile: story writing, grooming, prioritizing, planning, showcasing, and delivering MVPs
In a technical role with:
- Healthcare IT
- Data analysis, modeling, and administration in cloud technologies
- APIs/SDKs, business rules processing, data federation, and data distribution technologies
- Working with engineering teams in highly technical environments

Bachelor's degree in computer science, information systems, analytics, or a related field, and/or equivalent.
Posted 1 week ago
7.0 - 9.0 years
9 - 14 Lacs
Hyderabad
Work from Office
Want to be part of the Data & Analytics organization, whose strategic goal is to create a world-class Data & Analytics company by building, embedding, and maturing a data-driven culture across Thomson Reuters?
About the Role: We are looking for a highly motivated individual with strong organizational and technical skills for the position of Lead Data Engineer / Data Engineering Manager (Snowflake). You will play a critical role working on the cutting edge of data engineering and analytics, leveraging predictive models, machine learning, and generative AI to drive business insights, facilitate informed decision-making, and help Thomson Reuters rapidly scale data-driven initiatives.
- Effectively communicate across various levels, including executives, and functions within the global organization.
- Demonstrate strong leadership skills, with the ability to drive projects/tasks to deliver value.
- Engage with stakeholders, business analysts, and the project team to understand the data requirements.
- Design analytical frameworks to provide insights into a business problem.
- Explore and visualize multiple data sets to understand the data available and prepare data for problem solving.
- Design database models (if a data mart or operational data store is required to aggregate data for modeling).
About You: You're a fit for the Lead Data Engineer / Data Engineering Manager (Snowflake) role if your background includes:
- Qualifications: B.Tech/M.Tech/MCA or equivalent
- Experience: 7-9 years of corporate experience
- Location: Bangalore, India
- Hands-on experience developing data models for large-scale data warehouses/data lakes (Snowflake, BW)
- Mapping the data journey from operational system sources through any transformations in transit to its delivery into enterprise repositories (Warehouse, Data Lake, Master Data, etc.)
- Enabling the overall master and reference data strategy, including the procedures to ensure the consistency and quality of Finance reference data
- Experience across ETL, SQL, and other emerging data technologies, with experience integrating a cloud-based analytics environment
- Building and refining end-to-end data workflows to offer actionable insights
- A fair understanding of Data Strategy and Data Governance processes
- Knowledge of BI analytics and visualization tools: Power BI, Tableau
#LI-NR1
What's in it For You:
Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected.
Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensure you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
Industry-Competitive Benefits: We offer comprehensive benefit plans that include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.
Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world-leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments.
At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward. As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.
Posted 1 week ago
7.0 - 12.0 years
15 - 30 Lacs
Pune, Chennai
Work from Office
Data Modeler- Primary Skills: Data modeling (conceptual, logical, physical), relational/dimensional/data vault modeling, ERwin/IBM Infosphere, SQL (Oracle, SQL Server, PostgreSQL, DB2), banking domain data knowledge (Retail, Corporate, Risk, Compliance), data governance (BCBS 239, AML, GDPR), data warehouse/lake design, Azure cloud. Secondary Skills: MDM, metadata management, real-time modeling (payments/fraud), big data (Hadoop, Spark), streaming platforms (Confluent), M&A data integration, data cataloguing, documentation, regulatory trend awareness. Soft skills: Attention to detail, documentation, time management, and teamwork.
Posted 2 weeks ago
10.0 - 15.0 years
12 - 17 Lacs
Pune, Bengaluru, Hinjewadi
Work from Office
Software Required Skills:
- Deep experience with Murex (version 3.1 or higher) in a production environment, focusing on reports and Datamart modules
- Strong SQL proficiency for data querying, issue analysis, and troubleshooting
- Shell scripting (Bash/sh) skills supporting issue investigation and automation
- Use of incident management tools such as ServiceNow or JIRA for tracking and reporting issues
- Familiarity with report development and data analysis in financial contexts
Preferred Skills:
- Experience with other reporting tools or frameworks, such as Tableau, Power BI, or QlikView
- Knowledge of data warehousing concepts and architecture
- Basic scripting knowledge in other languages (Python, Perl) for automation
Overall Responsibilities:
- Lead and oversee the support activities for Murex Datamart and reporting modules, ensuring operational stability and accuracy
- Provide L2/L3 technical support for report-related incidents, resolve complex issues, and perform root cause analysis
- Monitor report generation, data extraction, and reconciliation processes, ensuring timely delivery
- Collaborate with business stakeholders to address reporting queries, anomalies, and data discrepancies
- Support and coordinate system upgrades, patches, and configuration changes affecting reporting modules
- Maintain comprehensive documentation of system configurations, incident resolutions, and process workflows
- Lead problem resolution initiatives, including performance tuning and automation opportunities
- Manage support teams during shifts (24x5/24x7), ensuring effective incident escalation and stakeholder communication
- Drive continuous improvement initiatives to enhance report accuracy, data quality, and operational efficiency
Strategic objectives:
- Maximize report availability, accuracy, and reliability
- Reduce incident resolution times and recurring issues
- Strengthen reporting processes through automation and data quality enhancements
Performance outcomes:
- Minimal unplanned downtime of reporting systems
- High stakeholder satisfaction with timely, accurate reporting
- Clear documentation and proactive communication with stakeholders
Technical Skills (By Category):
- Reporting & Data Analysis (Essential): Extensive experience supporting Murex Datamart, reports, and related workflows; SQL proficiency for data extraction, troubleshooting, and validation; understanding of report structures for P&L, MV, Accounting, Risk, etc.
- Scripting & Automation (Essential): Shell scripting (Bash/sh) for issue diagnosis and process automation; experience automating routine report checks and data validations
- Databases & Data Management (Essential): Relational database management, data querying, and reconciliation; knowledge of data warehousing concepts and architecture
- Support Tools & Incident Management (Essential): Hands-on experience with ServiceNow, JIRA, or similar platforms
- Advanced & Cloud (Preferred): Familiarity with cloud data hosting, deployment, or cloud-based reporting solutions; experience with other programming languages (Python, Perl) for automation
Experience:
- 10+ years supporting Murex production environments with a focus on Datamart and reporting modules
- Proven expertise in resolving complex report issues, data discrepancies, and interface problems
- Demonstrated leadership, with experience managing or supporting L2/L3 teams
- Proven support experience in high-pressure environments, including escalations
- Industry experience within financial services, especially trading, risk, or accounting, is preferred
Alternative experience pathways: Extensive scripting, data support, and operational expertise supporting financial reports may qualify candidates with fewer years but equivalent depth of knowledge.
Day-to-Day Activities:
- Monitor system dashboards, reports, and logs for anomalies or failures
- Troubleshoot report data issues, interface failures, and system errors
- Lead incident investigations, perform root cause analysis, and document resolutions
- Collaborate with business units to clarify reporting needs and resolve discrepancies
- Support deployment, configuration changes, and upgrades affecting Report and Datamart modules
- Automate repetitive tasks, batch jobs, and data validation workflows
- Create and maintain documentation, runbooks, and best practices
- Conduct shift handovers, incident reviews, and process improvement sessions
- Proactively identify improvement opportunities in reporting reliability and performance
Qualifications:
- Bachelor's degree in Computer Science, Finance, Data Management, or a related discipline
- Strong expertise in SQL, shell scripting, and report troubleshooting
- Deep understanding of financial reporting, P&L, MV, Risk, and accounting data flows
- Support experience in high-availability, high-pressure settings
- Willingness to work shifts, including nights, weekends, or holidays as needed
Professional Competencies:
- Strong analytical and problem-solving skills for resolving complex issues
- Excellent communication skills for engaging with technical teams, business stakeholders, and vendors
- Leadership qualities to support and mentor support teams
- Ability to work independently and prioritize effectively under pressure
- Adaptability to evolving systems and technological environments
- Focus on continuous improvement and operational excellence
Posted 2 weeks ago
8.0 - 13.0 years
15 - 19 Lacs
Bengaluru
Work from Office
Project description: Our customer is a leading bank in Australia that provides a front-to-back integrated platform for straight-through processing and risk management. This is a multi-year initiative in which different projects run concurrently under the program's various milestones. These streams include new product initiatives, new entity roll-outs, and regulatory compliance. We will have key roles in projects such as managing the scope, design, and delivery of requirements from front to back office with Excelian. We are looking for talented and ambitious people. The roles are in the respective Functional, Test Management, Development, Test Support, Environment Management, and Release teams. These units will collectively undertake the scoping, design, building, testing, and implementation phases to deliver the program's various milestones. We are looking for an experienced technical business analyst for the core Treasury IT team to deliver projects for the bank's treasury division, with a focus on Commodities, FX, and MM products. Responsibilities: The Senior Technical Business Analyst looks after business engagement, functional requirements, solution design, and some system configuration for delivery of the migration projects. The role requires engagement with relevant business stakeholders for the initiatives in the approved scope, then working closely with the delivery team and relevant Technology partners to ensure the timeliness and quality of delivery. The role therefore calls for excellent business analysis abilities, as well as the ability to project-manage small to medium initiatives. This will involve leading the implementation of regional rollouts in parallel with other sub-streams. The role also includes solution design and technical configuration of the Murex 3.1 platform in cooperation with other technical teams. Hands-on work on the application will be required.
Skills. Must have: 8+ years of relevant Murex (and/or other primary trading system) Front Office experience. Good/expert knowledge of at least IRD, FI, CRD, Commodities, and/or FXMM implementation on Murex. Extensive experience dealing with front-office trading & sales stakeholders in Markets or Treasury divisions. Good hands-on knowledge of FO configuration: instruments, generators, curves, market data, market conventions, etc. Good understanding of FO modules: Pretrade workflow, Simulation screens, Simulation Viewer, eTradepad, P&L notepad, market operations, etc. Experience in the implementation of Murex 3.1 with regard to front-office capabilities. Nice to have: Experience with MReport/Datamart, post-trade workflows, and interfaces. Other Languages: English (C1 Advanced). Seniority: Senior.
Posted 2 weeks ago
5.0 - 6.0 years
9 - 16 Lacs
Gurugram
Hybrid
Role Summary: We are seeking an experienced ETL Developer with strong expertise in Informatica PowerCenter, Oracle SQL/PL-SQL, and data warehousing concepts. The ideal candidate will play a key role in developing, optimizing, and maintaining ETL workflows, ensuring seamless data integration and transformation to support business-critical applications. Experience in Snowflake and job scheduling tools such as Control-M is a plus. Key Responsibilities: Collaborate with Technical Leads, Business Analysts, and Subject Matter Experts to understand data models and business requirements. Design, develop, and implement ETL solutions using Informatica PowerCenter. Develop, optimize, and maintain complex SQL/PL-SQL scripts to support data processing in Oracle databases. Provide accurate development estimates and deliver high-quality solutions within agreed timelines. Ensure data integrity, reconciliation, and exception handling by following best practices and development standards. Participate in cross-functional team meetings to coordinate dependencies and deliverables. Implement procedures for data maintenance, monitoring, and performance optimization. Essential Skills & Experience: Technical: Minimum 3+ years of hands-on experience with Informatica PowerCenter in ETL development. Experience with the Snowflake data warehouse platform. Familiarity with source control tools (e.g., Git, SVN). Proficiency in job scheduling tools like Control-M. Strong skills in UNIX shell scripting for automation. Solid experience (minimum 2 years) in SQL/PL-SQL development, including query tuning and optimization. In-depth understanding of Data Warehousing, Datamart, and ODS concepts. Knowledge of data normalization, OLAP techniques, and Oracle performance optimization. Experience working with Oracle or SQL Server databases (3+ years), along with Windows/UNIX environment expertise. Functional: Minimum 3 years of experience in the financial services sector or related industries.
Sound understanding of data distribution, modeling, and physical database design. Ability to engage and communicate effectively with business stakeholders and data stewards. Strong problem-solving, analytical, interpersonal, and communication skills.
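The "data integrity, reconciliation, and exception handling" responsibility above can be sketched as a simple control-total check between a source and a target table. The table and column names here are illustrative, not a specific client schema:

```python
import sqlite3

# Hypothetical reconciliation sketch: compare row counts and a control total
# between a source table and the table an ETL load populated.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE src_trades (trade_id INTEGER, notional REAL);
CREATE TABLE tgt_trades (trade_id INTEGER, notional REAL);
INSERT INTO src_trades VALUES (1, 100.0), (2, 250.0);
INSERT INTO tgt_trades VALUES (1, 100.0), (2, 250.0);
""")

def reconcile(table_a, table_b):
    """Return (counts_match, totals_match) for two tables."""
    count_a, sum_a = cur.execute(
        f"SELECT COUNT(*), SUM(notional) FROM {table_a}").fetchone()
    count_b, sum_b = cur.execute(
        f"SELECT COUNT(*), SUM(notional) FROM {table_b}").fetchone()
    return count_a == count_b, sum_a == sum_b

print(reconcile("src_trades", "tgt_trades"))  # (True, True)
```

In practice the same shape of check runs after each workflow session (e.g., scheduled via Control-M), with mismatches routed to an exception report rather than printed.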
Posted 2 weeks ago
2.0 - 3.0 years
6 - 7 Lacs
Pune
Work from Office
Data Engineer Job Description: Jash Data Sciences: Letting Data Speak! Do you love solving real-world data problems with the latest and best techniques, and having fun while solving them in a team? Then come and join our high-energy team of passionate data people. Jash Data Sciences is the right place for you. We are a cutting-edge Data Sciences and Data Engineering startup based in Pune, India. We believe in continuous learning and evolving together. And we let the data speak! What will you be doing? You will be discovering trends in data sets and developing algorithms to transform raw data for further analytics. Create data pipelines to bring in data from various sources, in different formats, transform it, and finally load it to the target database. Implement ETL/ELT processes in the cloud using tools like Airflow, Glue, Stitch, Cloud Data Fusion, and DataFlow. Design and implement Data Lakes, Data Warehouses, and Data Marts in AWS, GCP, or Azure using Redshift, BigQuery, PostgreSQL, etc. Create efficient SQL queries and understand query execution plans for tuning queries on engines like PostgreSQL. Performance-tune OLAP/OLTP databases by creating indices, tables, and views. Write Python scripts for the orchestration of data pipelines. Have thoughtful discussions with customers to understand their data engineering requirements. Break complex requirements into smaller tasks for execution. What do we need from you? Strong Python coding skills with basic knowledge of algorithms/data structures and their application. Strong understanding of Data Engineering concepts, including ETL, ELT, Data Lakes, Data Warehousing, and Data Pipelines. Experience designing and implementing Data Lakes, Data Warehouses, and Data Marts that support terabyte-scale data.
A track record of implementing data pipelines on public cloud environments (AWS/GCP/Azure) is highly desirable. A clear understanding of database concepts like indexing, query performance optimization, views, and various types of schemas. Hands-on SQL programming experience with knowledge of windowing functions, subqueries, and various types of joins. Experience working with Big Data technologies like PySpark/Hadoop. A good team player with the ability to communicate with clarity. Show us your git repo/blog! Qualification: 1-2 years of experience working on Data Engineering projects for Data Engineer I; 2-5 years of experience working on Data Engineering projects for Data Engineer II. 1-5 years of hands-on Python programming experience. A Bachelor's/Master's degree in Computer Science is good to have. Courses or certifications in the area of Data Engineering will be given higher preference. Candidates who have demonstrated a drive for learning and keeping up to date with technology through continued courses/self-learning will be given high preference.
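The extract-transform-load pipeline work described above can be sketched minimally in Python; the record shapes and field names are hypothetical, and a real pipeline would read from an actual source and load to a warehouse rather than an in-memory list:

```python
# Minimal extract-transform-load sketch: raw string records are normalized
# (typed, cased) before loading to a hypothetical target store.
raw_records = [
    {"id": "1", "amount": "120.50", "currency": "inr"},
    {"id": "2", "amount": "80.00",  "currency": "usd"},
]

def transform(record):
    # Normalize types and casing so downstream queries see consistent data.
    return {
        "id": int(record["id"]),
        "amount": float(record["amount"]),
        "currency": record["currency"].upper(),
    }

target = [transform(r) for r in raw_records]
print(target[0])  # {'id': 1, 'amount': 120.5, 'currency': 'INR'}
```

In an orchestrated setup (e.g., Airflow, as named in the listing), the extract, transform, and load steps would each become a task, with this normalization logic living in the transform step.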
Posted 2 weeks ago
10.0 - 15.0 years
4 - 9 Lacs
Bengaluru
Work from Office
Req ID: 322003. We are currently seeking Sr. ETL Developers to join our team in Bangalore, Karnataka (IN-KA), India (IN).
- Strong hands-on experience in SQL and PL/SQL (procedures, functions).
- Expert-level knowledge of ETL flows and jobs (ADF pipeline experience preferred).
- Experience with MS SQL (preferred), Oracle DB, PostgreSQL, MySQL.
- Good knowledge of Data Warehouse/Data Mart concepts.
- Good knowledge of data structures/models, integrity constraints, performance tuning, etc.
- Good knowledge of the insurance domain (preferred).
Total Experience: 7-10 years.
Posted 2 weeks ago
7.0 - 12.0 years
25 - 35 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Role - Data Modeler/Senior Data Modeler Exp - 5 to 12 Yrs Locs - Hyderabad, Pune, Bengaluru Position - Permanent Must have skills: - Strong SQL - Strong Data Warehousing skills - ER/Relational/Dimensional Data Modeling - Data Vault Modeling - OLAP, OLTP - Schemas & Data Marts Good to have skills: - Data Vault - ERwin / ER Studio - Cloud Platforms (AWS or Azure)
Posted 2 weeks ago
1.0 - 3.0 years
2 - 5 Lacs
Chennai
Work from Office
Create test case documents/plans for testing the data pipelines. Check the mapping for the fields that support data staging and data marts, and the data type constraints of the fields present in Snowflake. Verify non-null fields are populated. Verify business requirements and confirm that the correct logic is implemented in the transformation layer of the ETL process. Verify stored procedure calculations and data mappings. Verify data transformations are correct based on the business rules. Verify successful execution of data loading workflows.
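The non-null and transformation-rule checks listed above can be sketched as two queries. sqlite3 stands in for Snowflake here, and the staging/mart table names, columns, and the upper-casing rule are hypothetical stand-ins for the actual mappings:

```python
import sqlite3

# Hedged sketch of two common ETL test checks: non-null population in the
# mart, and a transformation rule applied consistently from staging to mart.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE stg_orders  (order_id INTEGER, status TEXT);
CREATE TABLE mart_orders (order_id INTEGER, status TEXT NOT NULL);
INSERT INTO stg_orders  VALUES (1, 'new'), (2, 'shipped');
INSERT INTO mart_orders VALUES (1, 'NEW'), (2, 'SHIPPED');
""")
# Check 1: verify non-null fields are populated in the mart.
null_count = cur.execute(
    "SELECT COUNT(*) FROM mart_orders WHERE status IS NULL").fetchone()[0]
# Check 2: verify the transformation rule (here, upper-casing) held per row.
bad_rows = cur.execute("""
    SELECT COUNT(*) FROM stg_orders s
    JOIN mart_orders m ON m.order_id = s.order_id
    WHERE m.status <> UPPER(s.status)
""").fetchone()[0]
print(null_count, bad_rows)  # 0 0
```

Each check should return zero; any non-zero count becomes a logged defect with the offending rows attached.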
Posted 2 weeks ago
1.0 - 3.0 years
3 - 6 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Locations - Pune/Bangalore/Hyderabad/Indore. Contract duration - 6 months. Responsibilities: Be responsible for the development of conceptual, logical, and physical data models and the implementation of RDBMS, operational data stores (ODS), data marts, and data lakes on target platforms. Implement business and IT data requirements through new data strategies and designs across all data platforms (relational & dimensional required; NoSQL optional) and data tools (reporting, visualization, analytics, and machine learning). Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models. Define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models. Identify the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualization. Hands-on modeling, design, configuration, installation, performance tuning, and sandbox POCs. Work proactively and independently to address project requirements and articulate issues/challenges to reduce project delivery risks. Must have a payments background. Skills: Hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional, and NoSQL data platform technologies, and ETL and data ingestion protocols). Experience with data warehouse, data lake, and enterprise big data platforms in multi-data-center contexts required. Good knowledge of metadata management, data modeling, and related tools (Erwin, ER Studio, or others) required. Experience in team management, communication, and presentation. Experience with Erwin, Visio, or any other relevant tool.
Posted 2 weeks ago
4.0 - 8.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Primary:
- Strong experience in DW testing
- Strong experience in testing ETL jobs
- Experience in writing test scripts for Java/Python scripts
- Strong experience in writing complex SQL queries
- Strong understanding of data warehouse concepts and data mart testing, to ensure data integrity is not compromised
- Test case preparation and execution
- Transform complex business logic into SQL or PL/SQL queries
- Defect tracking experience (e.g., AzDO, Jira, Rally)
- Exposure to large data sets and understanding of the Data Quality Framework
- Automates applicable test cases for regression testing
- Good understanding of software testing methodologies, processes, and quality metrics
- Performs testing
- Programming skills in Java
- Identifies defects and works with the scrum team on resolution
- Experience in Selenium, SoapUI, or similar tools, or as stated in program tech stack requirements
- Accountable for overall test quality (functional & regression)
- Promotes an environment of collaboration within the test team
- Coordinates test planning and execution (FFT, SIT, CVT; performance tests, load tests, etc.)
- Provides reports related to product quality metrics
- Provides quality-related input on go/no-go for every release (incl. promotion to INT, CVT, and prod environments)
- Attends scrum ceremonies
- Updates status in Rally on a daily basis
- Ensures test cases cover 100% of new functionality
Posted 2 weeks ago