8.0 years
0 Lacs
Kochi, Kerala, India
On-site
We're looking for an experienced Encompass Admin/Developer with 6–8 years of hands-on experience in Encompass development, customization, and cloud upgrades. A strong understanding of Mortgage Loan Origination processes is essential. Experience integrating Encompass with Salesforce is a major plus!

Key Responsibilities & Requirements:
- Administer and manage the Encompass Loan Origination System (LOS) across multiple environments
- Lead or support cloud migration/upgrade projects, including configuration, UAT, and deployment
- Customize Encompass with Business Rules, Input/Print Forms, Custom Fields, and Persona-based workflows
- Develop and maintain Encompass SDK/API integrations and tools (using .NET/C# or equivalent)
- Design, document, and implement solutions aligned with mortgage loan origination workflows
- Collaborate with cross-functional teams on Salesforce-Encompass integrations for a seamless lead-to-loan lifecycle
- Conduct regular system performance reviews and proactively manage enhancements
- Provide ongoing production support, including backup checks, user issues, and audit tracking
- Excellent communication and documentation capabilities

Qualifications:
- 6–8 years of solid experience as an Encompass Admin/Developer
- Strong experience in Encompass customization and configuration
- Hands-on knowledge of Mortgage Loan Origination processes (disclosures, underwriting, closing, post-closing)
- Experience with cloud upgrade/migration of Encompass instances
- Familiarity with Salesforce integration – either direct API, middleware (MuleSoft, Informatica), or custom
- Experience using the Encompass SDK, API, and building plugins is a strong plus
- Knowledge of compliance requirements in mortgage systems
- Excellent problem-solving and communication skills
Posted 1 week ago
8.0 - 10.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Calling all innovators – find your future at Fiserv. We're Fiserv, a global leader in Fintech and payments, and we move money and information in a way that moves the world. We connect financial institutions, corporations, merchants, and consumers to one another millions of times a day – quickly, reliably, and securely. Any time you swipe your credit card, pay through a mobile app, or withdraw money from the bank, we're involved. If you want to make an impact on a global scale, come make a difference at Fiserv.

Job Title: Tech Lead, Systems Engineering

What does a successful Snowflake Advisor do? We are seeking a highly skilled and experienced Snowflake Advisor to take ownership of our data warehousing strategy, implementation, maintenance and support. In this role, you will design, develop, and lead the adoption of Snowflake-based solutions to ensure scalable, efficient, and secure data systems that empower our business analytics and decision-making processes. As a Snowflake Advisor, you will collaborate with cross-functional teams, lead data initiatives, and act as the subject matter expert for Snowflake across the organization.

What You Will Do
- Define and implement best practices for data modelling, schema design, and query optimization in Snowflake
- Develop and manage ETL/ELT workflows to ingest, transform and load data into Snowflake from various sources
- Integrate data from diverse systems such as databases, APIs, flat files, and cloud storage into Snowflake, using tools like StreamSets, Informatica or dbt to streamline data transformation processes
- Monitor and tune Snowflake performance, including warehouse sizing, query optimization and storage management
- Manage Snowflake caching, clustering and partitioning to improve efficiency
- Analyze and resolve query performance bottlenecks
- Monitor and resolve data quality issues within the warehouse
- Collaborate with data analysts, data engineers and business users to understand reporting and analytics needs
- Work closely with the DevOps team on automation, deployment and monitoring
- Plan and execute strategies for scaling Snowflake environments as data volume grows
- Monitor system health and proactively identify and resolve issues
- Implement automation for regular tasks
- Enable seamless integration of Snowflake with BI tools like Power BI and create dashboards
- Support ad hoc query requests while maintaining system performance
- Create and maintain documentation related to data warehouse architecture, data flow, and processes
- Provide technical support, troubleshooting, and guidance to users accessing the data warehouse
- Optimize Snowflake queries and manage performance
- Keep up to date with emerging trends and technologies in data warehousing and data management
- Good working knowledge of the Linux operating system
- Working experience with Git and other repository management solutions
- Good knowledge of monitoring tools like Dynatrace and Splunk
- Serve as a technical leader for Snowflake-based projects, ensuring alignment with business goals and timelines
- Provide mentorship and guidance to team members in Snowflake implementation, performance tuning and data management
- Collaborate with stakeholders to define and prioritize data warehousing initiatives and roadmaps
- Act as point of contact for Snowflake-related queries, issues and initiatives

What You Will Need To Have
- 8 to 10 years of experience in data management tools like Snowflake, StreamSets and Informatica
- Experience with monitoring tools like Dynatrace and Splunk
- Experience with Kubernetes cluster management, CloudWatch for monitoring and logging, and the Linux OS
- Ability to track progress against assigned tasks, report status, and proactively identify issues
- Ability to present information effectively in communications with peers and the project management team
- Highly organized; works well in a fast-paced, fluid and dynamic environment

What Would Be Great To Have
- Experience with EKS for managing Kubernetes clusters
- Containerization technologies such as Docker and Podman
- AWS CLI for command-line interactions
- CI/CD pipelines using Harness
- S3 for storage solutions and IAM for access management
- Banking and Financial Services experience
- Knowledge of software development life cycle best practices

Thank You For Considering Employment With Fiserv. Please: apply using your legal name, and complete the step-by-step profile and attach your resume (either is acceptable, both are preferable).

Our Commitment To Diversity And Inclusion: Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law.

Note To Agencies: Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions.

Warning About Fake Job Posts: Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.
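The "monitor and resolve data quality issues" responsibility above can be sketched as a simple validation pass over warehouse rows. This is a minimal illustration in plain Python, not any Snowflake-specific API; the rule set and column names are hypothetical examples.

```python
# Minimal sketch of a warehouse data-quality check: each rule is a
# (column, predicate, description) triple applied to every row.
# Column names and rules below are hypothetical.

def run_quality_checks(rows, rules):
    """Return a list of (row_index, description) for every rule violation."""
    violations = []
    for i, row in enumerate(rows):
        for column, predicate, description in rules:
            if not predicate(row.get(column)):
                violations.append((i, description))
    return violations

rules = [
    ("record_id", lambda v: v is not None, "record_id must not be null"),
    ("amount", lambda v: v is not None and v > 0, "amount must be positive"),
]

rows = [
    {"record_id": "R-1", "amount": 250000},
    {"record_id": None, "amount": 120000},
    {"record_id": "R-3", "amount": -5},
]

print(run_quality_checks(rows, rules))
# → [(1, 'record_id must not be null'), (2, 'amount must be positive')]
```

In practice such checks would run against query results after each load, with violations routed to a monitoring tool rather than printed.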
Posted 1 week ago
6.0 years
0 Lacs
Kochi, Kerala, India
On-site
Responsibilities:
- Identify automation opportunities within the system and lead discussions with stakeholders and clients.
- Conduct multiple Proofs of Concept (PoCs) on Tosca's capabilities and present demos to stakeholders and clients.
- Review software requirements and prepare test scenarios.
- Collaborate with QA Engineers to develop effective strategies, test plans, and test cases.
- Develop and execute both automated and manual test cases and scripts.
- Analyze test results for database impacts, errors, bugs, and usability issues.
- Report bugs and errors to development teams.
- Assist in troubleshooting issues.
- Work with cross-functional teams to ensure quality throughout the software development lifecycle.
- Regularly update and maintain test documentation, ensuring accuracy and completeness for all testing phases.
- Monitor, track, and report on testing activities, progress, and outcomes to stakeholders, providing insights and recommendations for improvements.
- Provide expertise and guidance in quality assurance best practices, continually improving testing methodologies and processes.

Qualifications:
- Bachelor's degree or equivalent experience.
- 6-8 years of professional experience in software quality assurance with a focus on Tosca automation.
- 2-3+ years of experience as an Automation Lead.
- Proficiency in the Tosca tool suite, including Tosca Commander and DEX.
- Strong leadership and communication skills to effectively lead a team and collaborate with stakeholders.
- Experience in creating and maintaining test automation frameworks and scripts using Tosca.
- Solid understanding of agile methodologies and the software development lifecycle (SDLC).
- Ability to analyze and interpret complex technical information and communicate it effectively to non-technical stakeholders.
- Experience in mobile, mainframe, API, and desktop-based automation.
- Test plan and test strategy creation.
- Mandatory certification: Automation Specialist (AS1, AS2, AE1, TDS1).
- Experience integrating Tosca with CI/CD pipelines and DEX Server for continuous testing.
- Familiarity with other automation testing tools and frameworks.
- Up-to-date knowledge of software test design and testing methodologies.
- Familiarity with agile frameworks.
- Ability to develop test strategies and write test cases.
- Ability to document and troubleshoot errors.
- Working knowledge of test management software.
- Industry experience with a healthcare insurance company.
- Experience testing both web and mobile applications.
- Excellent communication and critical thinking skills.
- Good organizational skills and a detail-oriented mindset.
- Analytical mind and problem-solving aptitude.

Technical Skills:
- ETL testing tools: Tableau, Cognos, Informatica, DataStage, MuleSoft, Power BI, Databricks.
- Test management tools: Spira, qTest, TestRail, HP ALM, and JIRA.
- Automation testing tools: Tosca and Selenium.
- Programming languages: Java, Python.
- Databases: Oracle, AWS Redshift.

IQVIA is a leading global provider of clinical research services, commercial insights and healthcare intelligence to the life sciences and healthcare industries. We create intelligent connections to accelerate the development and commercialization of innovative medical treatments to help improve patient outcomes and population health worldwide. Learn more at https://jobs.iqvia.com
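ETL testing of the kind listed above typically starts with source-to-target reconciliation: confirming every source record landed in the target and was transformed correctly. A minimal sketch in plain Python, independent of any particular tool such as Tosca or Informatica; the table contents and key name are hypothetical.

```python
# Minimal source-to-target reconciliation, a common ETL test:
# find keys missing from the target and keys whose values differ.
# Table contents and the key/column names below are hypothetical.

def reconcile(source, target, key):
    """Return (missing_keys, mismatched_keys), each sorted."""
    src = {row[key]: row for row in source}
    tgt = {row[key]: row for row in target}
    missing = sorted(set(src) - set(tgt))
    mismatched = sorted(k for k in set(src) & set(tgt) if src[k] != tgt[k])
    return missing, mismatched

source = [
    {"claim_id": 1, "status": "PAID"},
    {"claim_id": 2, "status": "DENIED"},
    {"claim_id": 3, "status": "PAID"},
]
target = [
    {"claim_id": 1, "status": "PAID"},
    {"claim_id": 2, "status": "PAID"},  # transformed incorrectly
]

print(reconcile(source, target, "claim_id"))
# → ([3], [2])
```

The same count-and-checksum idea scales to real warehouses by pushing the comparison into SQL rather than pulling rows into memory.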
Posted 1 week ago
12.0 - 22.0 years
20 - 25 Lacs
Bengaluru
Work from Office
The opportunity: The Quality Assurance (QA) and Control Manager will oversee the planning, coordination, and execution of QA activities for a large-scale SAP ERP setup. This role ensures that SAP Center of Expertise (SAP-CoE) deliverables meet internal quality standards, industry best practices, and business requirements. The manager will also be responsible for designing and managing governance frameworks to monitor process improvements and maintain long-term operational excellence in ERP and enabled processes, aligned to the strategic objectives of the SAP-CoE.

How you'll make an impact:
- Define and implement a comprehensive quality assurance strategy and plan specific to service management (defects/incident management and related interfaces), specification and development of new functionality, project management, and operations.
- Develop and enforce quality standards, testing protocols, and documentation procedures across SAP modules (e.g., FI/CO, MM, SD, PP, etc.).
- Conduct quality gate reviews on SAP-CoE projects.
- Monitor deliverables from SAP consultants, developers, and business stakeholders to ensure they meet agreed-upon quality criteria.
- Provide special input reviewing testing procedures and the development and execution of testing strategies, including Unit Testing, Integration Testing, User Acceptance Testing (UAT), and Regression Testing.
- Ensure a qualitative process in defects management.
- Establish control mechanisms to ensure that implemented ERP processes are compliant with internal policies and external regulations (e.g., SOX, GDPR).
- Work closely with BU/FU leads and business process owners to align SAP processes with organizational objectives and continuous improvement efforts.
- Define KPIs and dashboards to monitor process adherence and performance post-implementation.
- Implement and drive continuous improvements in the SAP-CoE.
- Maintain the quality document management system.
- Identify, document, and manage quality-related risks.
- Conduct root cause analysis for defects or process failures and ensure corrective/preventive actions are implemented.
- Conduct periodic process audits and implement corrective actions.
- Ensure process compliance through effective documentation and process traceability.
- Provide regular QA status reports to management/steering committees.
- Facilitate workshops and meetings with functional teams to ensure quality awareness and continuous engagement.
- Act as a point of contact for QA/QC-related issues and escalate critical quality risks appropriately.
- Responsible for ensuring compliance with applicable external and internal regulations, procedures, and guidelines.
- Living Hitachi Energy's core values of safety and integrity, which means taking responsibility for your own actions while caring for your colleagues and the business.

Your background:
- Bachelor's or master's degree in Information Technology, Engineering, or a related field.
- 15+ years of experience in large-scale SAP ERP implementation, with at least 7+ years in quality assurance/control in SAP/ERP projects.
- Strong understanding of SAP modules and implementation methodologies (e.g., SAP Activate, ASAP, ADO, Panaya, etc.).
- Certification in Quality Management (e.g., Six Sigma, ISO 9001) and SAP Quality Assurance.
- Knowledge of data tools (Syniti, Informatica, SAP Data Intelligence) and testing tools (Worksoft, Tricentis, Selenium, etc.).
- Proven experience in enterprise process design, process mapping, and control frameworks.
- Proficiency in both spoken and written English is required.
Posted 1 week ago
1.0 - 8.0 years
3 - 7 Lacs
Gurugram
Work from Office
A Day in Your Life at MKS: We are looking for an exceptional Senior Informatica Programmer/Analyst who can perform development, implementation and usage of Information Technology and management information systems within the Informatica Intelligent Data Management Cloud (IDMC) platform. You will work in partnership with the business relationship managers, super-users, end-users and technical team to ensure full adoption, effective usage, and efficient deployment of our IT solutions, and effectively manage the change control process, gather end-user requirements, and communicate IT priorities and delivery status to the business.

You Will Make an Impact By:
- Collaborating with business partners to understand and document integration processes and solutions
- Developing, testing, documenting and implementing solutions leveraging Informatica IDMC
- Actively demonstrating a passion for continuous improvement focused on end-user productivity and enterprise process integration
- Working with various business groups in the organization to facilitate cross-functional implementation of new or improved business process requirements for all IT-related business, financial, and operations systems critical to core organizational functions
- Effectively managing the IT change control process, gathering end-user requirements, preparing functional specifications and communicating IT priorities and delivery status to the business

Skills You Bring:
- Bachelor's degree in Computer Science, Information Technology, Information Systems or a related field
- 6+ years of Informatica development experience required, with 2 years of Informatica Intelligent Data Management Cloud experience preferred
- Strong knowledge of SQL and DDL scripting
- Strong communication skills, with experience drafting technical documents
- Dissatisfaction with the status quo and a thirst to introduce change
- Energetic team player with a can-do attitude

We can't wait for your application!
#LI-AM2 MKS is committed to working with and providing reasonable accommodations to qualified individuals with disabilities. If you need a reasonable accommodation for a specific job, please include the requisition number (ex: RXXXX), the title and location of the role.
Posted 1 week ago
4.0 - 8.0 years
4 - 8 Lacs
Pune
Work from Office
About Fusemachines: Fusemachines is a 10+ year old AI company, dedicated to delivering state-of-the-art AI products and solutions to a diverse range of industries. Founded by Sameer Maskey, Ph.D., an Adjunct Associate Professor at Columbia University, our company is on a steadfast mission to democratize AI and harness the power of global AI talent from underserved communities. With a robust presence in four countries and a dedicated team of over 400 full-time employees, we are committed to fostering AI transformation journeys for businesses worldwide. At Fusemachines, we not only bridge the gap between AI advancement and its global impact but also strive to deliver the most advanced technology solutions to the world.

About the role: This is a remote, full-time consulting position (contract) responsible for designing, building, and maintaining the infrastructure required for data integration, storage, processing, and analytics (BI, visualization and advanced analytics) to optimize digital channels and technology innovations, with the end goal of creating competitive advantages for the food services industry around the globe. We're looking for a solid lead engineer who brings fresh ideas from past experiences and is eager to tackle new challenges. We're in search of a candidate who is knowledgeable about and loves working with modern data integration frameworks, big data and cloud technologies. Candidates must also be proficient with data programming languages (Python and SQL), AWS cloud and the Snowflake Data Platform. The data engineer will build a variety of data pipelines and models to support advanced AI/ML analytics projects, with the intent of elevating the customer experience and driving revenue and profit growth globally.

Qualification & Experience: Must have a full-time Bachelor's degree in Computer Science or similar from an accredited institution. At least 3 years of experience as a data engineer with strong expertise in Python, Snowflake, PySpark, and AWS.
Proven experience delivering large-scale projects and products for Data and Analytics as a data engineer.

Skill Set Requirement:
- Vast background in all things data-related.
- 3+ years of real-world data engineering development experience in Snowflake and AWS (certifications preferred).
- Highly skilled in one or more programming languages (must have Python), and proficient in writing efficient and optimized code for data integration, storage, processing, manipulation and automation.
- Strong experience working with ELT and ETL tools and the ability to develop custom integration solutions as needed, from different sources such as APIs, databases, flat files, and event streaming, including experience with modern ETL tools such as Informatica, Matillion, or dbt; Informatica CDI is a plus.
- Strong experience with scalable and distributed data technologies such as Spark/PySpark, dbt and Kafka, to handle large volumes of data.
- Strong programming skills in SQL, with proficiency in writing efficient and optimized code for data integration, storage, processing, and manipulation.
- Strong experience in designing and implementing Data Warehousing solutions in AWS with Snowflake.
- Good understanding of data modelling and database design principles, with the ability to design and implement efficient database schemas that meet the requirements of the data architecture to support data solutions.
- Proven experience as a Snowflake Developer, with a strong understanding of Snowflake architecture and concepts.
- Proficiency in Snowflake services such as Snowpipe, stages, stored procedures, views, materialized views, tasks and streams.
- Robust understanding of data partitioning and other optimization techniques in Snowflake.
- Knowledge of data security measures in Snowflake, including role-based access control (RBAC) and data encryption.
- Experience with Kafka, Pulsar, or other streaming technologies.
- Experience orchestrating complex task flows across a variety of technologies, Apache Airflow preferred.
- Expertise in cloud computing on AWS, including deep knowledge of a variety of AWS services such as Lambda, Kinesis, S3, Lake Formation, EC2, ECS/ECR, IAM, CloudWatch, EKS, API Gateway, etc.
- Good understanding of data quality and governance, including implementation of data quality checks and monitoring processes to ensure that data is accurate, complete, and consistent.
- Good problem-solving skills: able to troubleshoot data processing pipelines and identify performance bottlenecks and other issues.

Responsibilities:
- Follow established designs and constructed data architectures.
- Develop and maintain data pipelines (streaming and batch), ensuring data flows smoothly from source (point-of-sale, back-of-house, operational platforms and more of a Global Data Hub) to destination.
- Handle ETL/ELT processes, including extracting, transforming, and loading data from various sources into Snowflake to enable best-in-class technology solutions.
- Play a key role in the Data Operations team, developing data solutions responsible for driving growth.
- Contribute to standardizing and developing a framework to extend these pipelines globally, across markets and business areas.
- Develop on a data platform by building applications using a mix of open-source frameworks (PySpark, Kubernetes, Airflow, etc.) and best-in-breed SaaS tools (Informatica Cloud, Snowflake, Domo, etc.).
- Implement and manage production support processes around data lifecycle, data quality, coding utilities, storage, reporting and other data integration points.
- Ensure the reliability, scalability, and efficiency of data systems at all times.
- Assist in the configuration and management of Snowflake data warehousing and data lake solutions, working under the guidance of senior team members.
- Work with cross-functional teams, including Product, Engineering, Data Science, and Analytics, to understand and fulfill data requirements.
- Contribute to data quality assurance through validation checks and support data governance initiatives, including cataloging and lineage tracking.
- Take ownership of the storage layer and SQL database management tasks, including schema design, indexing, and performance tuning.
- Continuously evaluate and integrate new technologies to enhance data engineering capabilities, and actively participate in our Agile team meetings and improvement activities.

Fusemachines is an Equal Opportunity employer, committed to diversity and inclusion. All qualified applicants will receive consideration for employment without regard to race, colour, religion, sex, sexual orientation, gender identity, national origin, disability, or any other characteristic protected by applicable federal, state, or local laws.
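The ETL/ELT loading responsibility above typically ends in an upsert ("merge") into Snowflake: new keys are inserted, existing keys updated. A minimal sketch of those merge semantics in plain Python rather than Snowflake SQL; the record shapes and key name are hypothetical.

```python
# Minimal sketch of upsert ("merge") semantics for a warehouse load step:
# new keys are inserted, existing keys updated. Record shapes and the
# key name are hypothetical.

def merge_into(target, batch, key):
    """Upsert each record in batch into target (a dict keyed by `key`).

    Returns (inserted_count, updated_count).
    """
    inserted = updated = 0
    for record in batch:
        k = record[key]
        if k in target:
            target[k].update(record)
            updated += 1
        else:
            target[k] = dict(record)
            inserted += 1
    return inserted, updated

target = {"s1": {"store_id": "s1", "sales": 100}}
batch = [
    {"store_id": "s1", "sales": 150},  # existing key: update
    {"store_id": "s2", "sales": 90},   # new key: insert
]

print(merge_into(target, batch, "store_id"))
# → (1, 1)
```

In Snowflake itself this step would be a `MERGE INTO … WHEN MATCHED / WHEN NOT MATCHED` statement, or a task consuming a stream, with the same insert-or-update outcome.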
Posted 1 week ago
3.0 - 6.0 years
3 - 6 Lacs
Chennai
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: SnapLogic
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using SnapLogic. Your typical day will involve working with the development team, analyzing business requirements, and developing solutions to meet those requirements.

Roles & Responsibilities:
- Design, develop, and maintain SnapLogic integrations and workflows to meet business requirements.
- Collaborate with cross-functional teams to analyze business requirements and develop solutions to meet them.
- Develop and maintain technical documentation for SnapLogic integrations and workflows.
- Troubleshoot and resolve issues with SnapLogic integrations and workflows.

Professional & Technical Skills:
- Must-have skills: strong experience in SnapLogic.
- Good-to-have skills: experience in other ETL tools like Informatica, Talend, or DataStage.
- Experience in designing, developing, and maintaining integrations and workflows using SnapLogic.
- Experience in analyzing business requirements and developing solutions to meet those requirements.
- Experience in troubleshooting and resolving issues with SnapLogic integrations and workflows.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SnapLogic.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful solutions using SnapLogic.
- This position is based at our Pune office.

Qualification: 15 years of full-time education
Posted 1 week ago
4.0 - 6.0 years
6 - 10 Lacs
Bengaluru
Work from Office
ECMS ID | Technology       | Overall Exp. | Mandatory Skills          | Relevant exp. | Relevant exp. min (cutoff) | Relevant exp. max | Bill rate | Bill rate max
531505  | DevOps Engineer  | 8+           | AWS                       | 8             | 6                          | 10                | 10000     | 11000
531505  | DevOps Engineer  | 8+           | CI/CD (Jenkins, TeamCity) | 8             | 6                          | 10                | 10000     | 11000
531505  | DevOps Engineer  | 8+           | IaC (Terraform)           | 8             | 6                          | 10                | 10000     | 11000

JD is mentioned below:
- Good working understanding of AWS cloud services
- IaC coding using Terraform Enterprise
- Scripting, preferably using Ansible and Python; shell is also acceptable
- Liaising with various teams as a consultant to understand and provide solutions to their issues/problems
- Good to know: Informatica DEI and IDMC tools
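The "IaC coding using Terraform" requirement above boils down to declaring AWS resources in HCL. A minimal, hedged sketch of such a configuration; the provider version, region, bucket name, and tags are hypothetical placeholders, not values from the posting.

```hcl
# Minimal Terraform sketch: AWS provider plus one S3 bucket.
# Region, bucket name, and tags below are hypothetical placeholders.
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 5.0"
    }
  }
}

provider "aws" {
  region = "ap-south-1" # hypothetical region
}

resource "aws_s3_bucket" "artifacts" {
  bucket = "example-devops-artifacts" # hypothetical bucket name

  tags = {
    team = "devops"
  }
}
```

A typical workflow is `terraform init`, `terraform plan`, then `terraform apply`, with the plan step wired into a CI/CD pipeline such as Jenkins or TeamCity for review before changes are applied.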
Posted 1 week ago
5.0 - 6.0 years
15 - 16 Lacs
Chennai
Work from Office
Req ID: 325293. NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Software Development Specialist to join our team in Chennai, Tamil Nadu (IN-TN), India (IN).

Responsibilities:
- 5-6 years of application support experience supporting .NET and Azure apps.
- Ability to debug and coordinate with development teams to ensure efficient issue resolution.
- Monitoring tools: Splunk, App Insights
- Management of deployments and addressing root causes for payment-related issues.
- Monitor Informatica batch job failures and provide insights for downstream dependencies.
- Proactively handle payment and application downtime issues.
- Handle deployment monitoring and validations, escalating to the development team for resolution of critical issues when necessary.

Shifts: Rotational 24x7

Mandatory Skills:
- High-level programming languages: C# (.NET MVC, .NET Core and .NET 6/7)
- UI: Angular, JavaScript, CSS, ASP.NET MVC
- API: REST Web API, Azure Functions or Azure Durable Functions
- CI/CD: Azure Pipelines, Terraform
- Scripting: PowerShell, Bash
- Database: Microsoft SQL Server or NoSQL (e.g. CosmosDB) and Oracle
- Containerization: Azure Kubernetes Service, Kubernetes (open source) and Docker
- Agile knowledge
Posted 1 week ago
5.0 - 8.0 years
6 - 10 Lacs
Chennai, Bengaluru
Work from Office
Req ID: 328612. NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Data Engineer / Developer to join our team in Chennai/Bangalore, Tamil Nadu (IN-TN), India (IN).

Data Engineer / Developer
- Primary skillset (must have): Oracle PL/SQL, SnapLogic
- Secondary skillset (good to have): Informatica, Aerospike
- Tertiary skillset (nice to have): Python scripting
- Minimum experience on key skills: 5 to 8 years

General Expectations:
1) Must have good communication.
2) Must be ready to work in the 10:30 AM to 8:30 PM shift.
3) Flexible to work at the client location: Ramanujam IT Park, Taramani, Chennai OR GV, Manyata or EGL, Bangalore.
4) Must be ready to work from the office in a hybrid work environment; full remote work is not an option.
5) Expect full return to office in 2025.

Pre-requisites before submitting profiles:
1) Must have genuine and digitally signed Form 16 for ALL employments.
2) All employment history/details must be present in UAN/PPF statements.
3) Candidates must be screened via video to ensure they are genuine and have a proper work setup.
4) Candidates must have real work experience in the mandatory skills mentioned in the JD.
5) Profiles must list the companies with which candidates are on payroll, not the client names, as their employers.
6) As these are competitive positions and the client will not wait 60 days and carry the risk of drop-outs, candidates must have 0 to 3 weeks of notice period.
7) Candidates must be screened for any gaps after education and during employment to verify the genuineness of the reasons.
Posted 1 week ago
6.0 - 11.0 years
8 - 12 Lacs
Bengaluru
Work from Office
Position: Manager, Analytics - Trust & Safety

About the Team: The PhonePe TnS team is a high-impact team which makes extensive use of data, statistical and quantitative analysis, rules-based methods, and explanatory and predictive modeling to identify and mitigate the fraud, risk and abuse patterns visible on the PhonePe platform, focusing on the consumer perspective. Our work increasingly employs specialized competencies, such as advanced analytics, data visualization, application development, and geographical trend analysis. As a Manager on the PhonePe TnS team, you will lead the major analytics projects for keeping the platform safe for consumers, as well as identify any abusive behaviour. You will also be involved in the development of new products and capabilities by working closely with the product teams. If your dream is to build processes and digital tools to better understand financial transactions and identify trends that would impact millions of customers, partnering with some of the best minds and executing on your dreams with purpose and speed, join us!

Roles & Responsibilities:
- Supervise, lead and train a team of analysts and contribute actively to team development.
- Work closely with the Operations, Product and Tech teams.
- Create, manage, and utilize high-performance relational databases (using SQL or other analytical tools).
- Query and mine large data sets to discover transaction patterns, examine financial data and filter for targeted information, using traditional as well as predictive/advanced analytic methodologies.
- Explore, ideate and develop fraud, risk and abuse detection and mitigation methodologies, validate the existing ones, and look for opportunities to optimize the existing scenario thresholds backed by data insights.
- Design and develop user-defined application modules using BI tools (QlikView/Tableau/QlikSense/Power BI, etc.), perform data quality control, develop database reports and user interfaces, and normalize relational data.
- Develop analysis on business problems and prepare and present complex written and verbal materials to senior management and stakeholders.
- Understand and follow industry best practices for fraud prevention.
- Leverage information from regulatory changes, new regulations and internal policy changes to better identify new key risk areas.
- Monitor and regulate high-risk activities for various PhonePe business verticals.
- Identify automation opportunities and suggest process improvements.
- Analyze customer transactional and behavioral data to identify patterns, trends and anomalies, and develop strategies to mitigate suspicious activity.
- Support ad-hoc reporting requirements.

Education & Preferred Qualifications include, but are not limited to:
- Bachelor's degree in Engineering or a Master's degree in Management, Mathematics, Statistics or a related quantitative discipline.
- Demonstrated experience working with fraud and risk functions, technology and analytics (payment industry experience preferred).
- 6+ years of experience in the field of analytics, preferably in identifying fraud patterns and customer risk profiling.
- Strong quantitative abilities, distinctive problem-solving and excellent analytics skills.
- Strong organizational, communication, presentation and project management skills.
- Excellent SQL and Excel skills; should be able to write queries to manipulate and consolidate data from multiple data sources.
Data analytics project experience related to one or more of the following: transaction monitoring data analysis and system implementation, development of customer risk rating models, customer segmentation, threshold tuning, customer/account/transaction data modeling, management and quality assessment, model validation, and/or Know-Your-Customer data remediation. Working experience with BI tools (QlikView/Tableau/QlikSense/PowerBI etc.) will be an added advantage. Strong background in statistical modeling and experience with tools such as R/SAS/Python or SPSS. Ability to work independently, liaise with other departments and coordinate with various stakeholders (internal/external). PhonePe Full Time Employee Benefits (Not applicable for Intern or Contract Roles): Insurance Benefits - Medical Insurance, Critical Illness Insurance, Accidental Insurance, Life Insurance. Wellness Program - Employee Assistance Program, Onsite Medical Center, Emergency Support System. Parental Support - Maternity Benefit, Paternity Benefit Program, Adoption Assistance Program, Day-care Support Program. Mobility Benefits - Relocation benefits, Transfer Support Policy, Travel Policy. Retirement Benefits - Employee PF Contribution, Flexible PF Contribution, Gratuity, NPS, Leave Encashment. Other Benefits - Higher Education Assistance, Car Lease, Salary Advance Policy. Working at PhonePe is a rewarding experience! Great people, a work environment that thrives on creativity, and the opportunity to take on roles beyond a defined job description are just some of the reasons you should work with us. Read more about PhonePe on our blog. Life at PhonePe PhonePe in the news
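To make the threshold-tuning responsibility above concrete, here is a minimal rules-based sketch in Python; the transaction layout and the z-score cutoff are illustrative assumptions, not PhonePe's actual detection logic:

```python
from statistics import mean, stdev

def flag_suspicious(txns, z_cutoff=3.0):
    """Flag transactions whose amount deviates strongly from a user's history.

    txns: list of (user_id, amount) pairs; z_cutoff is a tunable scenario
    threshold of the kind the posting describes optimizing with data insights.
    """
    by_user = {}
    for user, amount in txns:
        by_user.setdefault(user, []).append(amount)

    flagged = []
    for user, amounts in by_user.items():
        if len(amounts) < 3:
            continue  # not enough history to estimate a baseline
        mu, sigma = mean(amounts), stdev(amounts)
        for amount in amounts:
            if sigma > 0 and (amount - mu) / sigma > z_cutoff:
                flagged.append((user, amount))
    return flagged

history = [("a", x) for x in [100] * 10 + [5000]] + [("b", x) for x in [50, 60, 55]]
print(flag_suspicious(history, z_cutoff=2.5))  # → [('a', 5000)]
```

In practice such scenario thresholds would be tuned against labeled fraud outcomes rather than fixed by hand.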
Posted 1 week ago
3.0 - 8.0 years
5 - 10 Lacs
Bengaluru
Remote
Role : Data Modeler Lead Location : Remote Experience : 10+ years Healthcare experience is Mandatory Position Overview : We are seeking an experienced Data Modeler/Lead with deep expertise in health plan data models and enterprise data warehousing to drive our healthcare analytics and reporting initiatives. The candidate should have hands-on experience with modern data platforms and a strong understanding of healthcare industry data standards. Key Responsibilities : Data Architecture & Modeling : - Design and implement comprehensive data models for health plan operations, including member enrollment, claims processing, provider networks, and medical management - Develop logical and physical data models that support analytical and regulatory reporting requirements (HEDIS, Stars, MLR, risk adjustment) - Create and maintain data lineage documentation and data dictionaries for healthcare datasets - Establish data modeling standards and best practices across the organization Technical Leadership : - Lead data warehousing initiatives using modern platforms like Databricks or traditional ETL tools like Informatica - Architect scalable data solutions that handle large volumes of healthcare transactional data - Collaborate with data engineers to optimize data pipelines and ensure data quality Healthcare Domain Expertise : - Apply deep knowledge of health plan operations, medical coding (ICD-10, CPT, HCPCS), and healthcare data standards (HL7, FHIR, X12 EDI) - Design data models that support analytical, reporting and AI/ML needs - Ensure compliance with healthcare regulations including HIPAA/PHI, and state insurance regulations - Partner with business stakeholders to translate healthcare business requirements into technical data solutions Data Governance & Quality : - Implement data governance frameworks specific to healthcare data privacy and security requirements - Establish data quality monitoring and validation processes for critical health plan metrics - Lead efforts to
standardize healthcare data definitions across multiple systems and data sources Required Qualifications : Technical Skills : - 10+ years of experience in data modeling with at least 4 years focused on healthcare/health plan data - Expert-level proficiency in dimensional modeling, data vault methodology, or other enterprise data modeling approaches - Hands-on experience with Informatica PowerCenter/IICS or Databricks platform for large-scale data processing - Strong SQL skills and experience with Oracle Exadata and cloud data warehouses (Databricks) - Proficiency with data modeling tools (Hackolade, ERwin, or similar) Healthcare Industry Knowledge : - Deep understanding of health plan data structures including claims, eligibility, provider data, and pharmacy data - Experience with healthcare data standards and medical coding systems - Knowledge of regulatory reporting requirements (HEDIS, Medicare Stars, MLR reporting, risk adjustment) - Familiarity with healthcare interoperability standards (HL7 FHIR, X12 EDI) Leadership & Communication : - Proven track record of leading data modeling projects in complex healthcare environments - Strong analytical and problem-solving skills with ability to work with ambiguous requirements - Excellent communication skills with ability to explain technical concepts to business stakeholders - Experience mentoring team members and establishing technical standards Preferred Qualifications : - Experience with Medicare Advantage, Medicaid, or Commercial health plan operations - Cloud platform certifications (AWS, Azure, or GCP) - Experience with real-time data streaming and modern data lake architectures - Knowledge of machine learning applications in healthcare analytics - Previous experience in a lead or architect role within healthcare organization
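For illustration only, the dimensional modeling this role calls for can be sketched with Python's stdlib sqlite3; the table and column names below are hypothetical, not a real health plan model:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    -- Dimension: one row per member (a conformed dimension in a health plan model)
    CREATE TABLE dim_member (member_key INTEGER PRIMARY KEY, member_id TEXT, plan_type TEXT);
    -- Fact: one row per adjudicated claim line, keyed to the dimension
    CREATE TABLE fact_claim (claim_id TEXT, member_key INTEGER REFERENCES dim_member,
                             icd10_code TEXT, paid_amount REAL);
""")
con.executemany("INSERT INTO dim_member VALUES (?, ?, ?)",
                [(1, "M001", "Medicare Advantage"), (2, "M002", "Commercial")])
con.executemany("INSERT INTO fact_claim VALUES (?, ?, ?, ?)",
                [("C1", 1, "E11.9", 250.0), ("C2", 1, "I10", 100.0), ("C3", 2, "E11.9", 75.0)])

# Typical analytical rollup: paid amount by plan type, the kind of aggregate
# regulatory reporting (e.g. MLR) is built on
rows = con.execute("""
    SELECT m.plan_type, SUM(f.paid_amount)
    FROM fact_claim f JOIN dim_member m USING (member_key)
    GROUP BY m.plan_type ORDER BY m.plan_type
""").fetchall()
print(rows)  # → [('Commercial', 75.0), ('Medicare Advantage', 350.0)]
```

A production model would add slowly changing dimensions, date and provider dimensions, and lineage metadata; this sketch only shows the fact/dimension split.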
Posted 1 week ago
3.0 - 8.0 years
9 - 14 Lacs
Mumbai
Remote
Role : Data Modeler Lead Location : Remote Experience : 10+ years Healthcare experience is Mandatory Position Overview : We are seeking an experienced Data Modeler/Lead with deep expertise in health plan data models and enterprise data warehousing to drive our healthcare analytics and reporting initiatives. The candidate should have hands-on experience with modern data platforms and a strong understanding of healthcare industry data standards. Key Responsibilities : Data Architecture & Modeling : - Design and implement comprehensive data models for health plan operations, including member enrollment, claims processing, provider networks, and medical management - Develop logical and physical data models that support analytical and regulatory reporting requirements (HEDIS, Stars, MLR, risk adjustment) - Create and maintain data lineage documentation and data dictionaries for healthcare datasets - Establish data modeling standards and best practices across the organization Technical Leadership : - Lead data warehousing initiatives using modern platforms like Databricks or traditional ETL tools like Informatica - Architect scalable data solutions that handle large volumes of healthcare transactional data - Collaborate with data engineers to optimize data pipelines and ensure data quality Healthcare Domain Expertise : - Apply deep knowledge of health plan operations, medical coding (ICD-10, CPT, HCPCS), and healthcare data standards (HL7, FHIR, X12 EDI) - Design data models that support analytical, reporting and AI/ML needs - Ensure compliance with healthcare regulations including HIPAA/PHI, and state insurance regulations - Partner with business stakeholders to translate healthcare business requirements into technical data solutions Data Governance & Quality : - Implement data governance frameworks specific to healthcare data privacy and security requirements - Establish data quality monitoring and validation processes for critical health plan metrics - Lead efforts to
standardize healthcare data definitions across multiple systems and data sources Required Qualifications : Technical Skills : - 10+ years of experience in data modeling with at least 4 years focused on healthcare/health plan data - Expert-level proficiency in dimensional modeling, data vault methodology, or other enterprise data modeling approaches - Hands-on experience with Informatica PowerCenter/IICS or Databricks platform for large-scale data processing - Strong SQL skills and experience with Oracle Exadata and cloud data warehouses (Databricks) - Proficiency with data modeling tools (Hackolade, ERwin, or similar) Healthcare Industry Knowledge : - Deep understanding of health plan data structures including claims, eligibility, provider data, and pharmacy data - Experience with healthcare data standards and medical coding systems - Knowledge of regulatory reporting requirements (HEDIS, Medicare Stars, MLR reporting, risk adjustment) - Familiarity with healthcare interoperability standards (HL7 FHIR, X12 EDI) Leadership & Communication : - Proven track record of leading data modeling projects in complex healthcare environments - Strong analytical and problem-solving skills with ability to work with ambiguous requirements - Excellent communication skills with ability to explain technical concepts to business stakeholders - Experience mentoring team members and establishing technical standards Preferred Qualifications : - Experience with Medicare Advantage, Medicaid, or Commercial health plan operations - Cloud platform certifications (AWS, Azure, or GCP) - Experience with real-time data streaming and modern data lake architectures - Knowledge of machine learning applications in healthcare analytics - Previous experience in a lead or architect role within healthcare organization
Posted 1 week ago
5.0 - 10.0 years
9 - 13 Lacs
Mumbai
Work from Office
Job Title: Business Management Analyst Corporate Title: Analyst Location: Mumbai, India Role Description As a BA you are expected to design and deliver critical senior management dashboards and analytics using tools such as Excel, SQL etc. These management packs should enable management to make timely decisions for their respective businesses and create a sound foundation for the analytics. You will need to collaborate closely with senior business managers, data engineers and stakeholders from other teams to comprehend requirements and translate them into visually pleasing dashboards and reports. You will play a crucial role in analyzing business data and generating valuable insights for other strategic ad hoc exercises. What we'll offer you: 100% reimbursement under childcare assistance benefit (gender neutral), sponsorship for industry-relevant certifications and education, accident and term life insurance. Your key responsibilities: Collaborate with business users and managers to gather requirements and comprehend business needs to design optimal solutions. Perform ad hoc data analysis as per business needs to generate reports, visualizations, and presentations that help strategic decision making. You will be responsible for sourcing information from multiple sources and building a robust data pipeline model. Be able to work on large and complex data sets to produce useful insights. Perform audit checks ensuring integrity and accuracy across all spectrums before implementing findings. Ensure timely refreshes to provide the most updated information in dashboards/reports. Identify opportunities for process improvements and optimization based on data insights. Communicate project status updates and recommendations.
Your skills and experience Bachelor's degree in computer science, IT, business administration or a related field Minimum of 5 years of experience in visual reporting development, including hands-on development of analytics dashboards and working with complex data sets Excellent Microsoft Office skills, including advanced Excel skills Comprehensive understanding of data visualization best practices Experience with data analysis, modeling, and ETL processes is advantageous Excellent knowledge of database concepts and extensive hands-on experience working with SQL Strong analytical, quantitative, problem-solving and organizational skills Attention to detail and ability to coordinate multiple tasks, set priorities and meet deadlines Excellent communication and writing skills How we'll support you
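A minimal sketch of the kind of ad hoc aggregation that feeds such management dashboards, using only the Python standard library (the desk/pnl fields are invented for illustration):

```python
from collections import defaultdict

def summarize(records, group_field, value_field):
    """Roll up raw records into per-group totals, counts and averages,
    the basic shape of most management-pack report tables."""
    totals = defaultdict(lambda: [0.0, 0])  # group -> [running sum, row count]
    for rec in records:
        agg = totals[rec[group_field]]
        agg[0] += rec[value_field]
        agg[1] += 1
    return {g: {"total": s, "count": n, "avg": s / n} for g, (s, n) in totals.items()}

trades = [
    {"desk": "Rates", "pnl": 120.0},
    {"desk": "Rates", "pnl": -20.0},
    {"desk": "Credit", "pnl": 50.0},
]
print(summarize(trades, "desk", "pnl"))
```

The same rollup is what a SQL GROUP BY or an Excel pivot table would produce before the visualization layer.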
Posted 1 week ago
4.0 - 9.0 years
8 - 12 Lacs
Pune
Work from Office
Job Title: Fixed Income Production Support Engineer Location: Pune, India Corporate Title: AS Role Description The Production Support Engineer acts as a professional in the Global Fixed Income IT support team, providing front-line support to the trading and sales business community and their associated applications used by the Global Fixed Income business. The team is responsible for management of Deutsche Bank's premier trading platforms, providing execution, market data and post-trade services to external and internal clients. Each individual is expected to balance fast response (e.g. electronic trading monitoring and incident management) with longer-term activities (e.g. DevOps, release planning, security and audit reviews). A single day or even a single hour is likely to see a mix of both. Working in Production Support means you'll use both creative and critical thinking skills to maintain application systems that are crucial to the daily operations of the firm. You'll work collaboratively in teams on a wide range of projects based on your primary area of focus. While learning to fix application and data issues as they arise, you'll also gain exposure to software development, testing, deployment, maintenance and improvement, in addition to production lifecycle methodologies and risk guidelines. Finally, you'll have the opportunity to develop professionally and to grow your career in any direction you choose. What we'll offer you: 100% reimbursement under childcare assistance benefit (gender neutral), sponsorship for industry-relevant certifications and education, accident and term life insurance. Your key responsibilities Act as a Production Support engineer in the Global Fixed Income IT front office production service team, providing first and second level support to the Trading, Sales and business community for the suite of applications and their associated components.
Serve as single point of contact to the business, acting as the prime liaison for the application suite into the incident, problem, change, release, capacity, and continuity functions. The role also includes, but is not limited to: production support/engineering work - providing technical advice/resolution; ensuring that all issues are recorded and tracked in accordance with policy; reviewing and owning all outstanding issues; managing and communicating major production incidents; oversight of the release of change into the production environment; active participation in regular capacity and monitoring reviews; user queries and training; liaising with development teams on new application handover and 3rd-line escalation of issues. The role also demands an early start on Mondays (including on public holidays) and any other weekdays, and working on weekends (for releases) and public holidays. Your skills and experience This role requires a wide variety of strengths and capabilities, including: At least a Bachelor's degree in a technical/engineering subject or equivalent experience Basic knowledge of the application development lifecycle and maintenance 4-10 years of experience in the financial/investment banking domain supporting front office trading applications Ability to work collaboratively in teams and develop meaningful relationships to achieve common goals Familiarity with production management, i.e. capacity management, incident management, problem management, change management Understanding of the business fundamentals of Fixed Income trading (Rates, Credit, Intraday Risk, and EOD Risk), including post-trade functions Written and verbal communication skills sufficient to communicate directly with business and technical partners at all levels Systematic, fact-based decision making and problem solving Unix/Linux: Linux skills and process management; some exposure to shell scripting (e.g. bash), Perl and/or Python Databases: required intermediate Oracle SQL; optional Sybase, MS SQL Networks: required intermediate TCP/IP, UDP, network monitoring, low-latency networks; optional multicast, routing Messaging: one or more of LBM (Informatica UltraMessaging), Tibco EMS, Solace Team player: works with multiple teams, vendors and groups Detail oriented: can organise and work with multiple systems Ability to manage and prioritise multiple issues; work under pressure Adaptable and able to quickly learn new processes and/or technologies How we'll support you
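As a rough illustration of the monitoring side of the role, a small Python sketch that summarizes ERROR lines from an application log; the log format and alert threshold are assumptions for the example, not the bank's actual tooling:

```python
import re
from collections import Counter

# Assumed log layout: "<timestamp> <LEVEL> <message>"
LOG_LINE = re.compile(r"^(?P<ts>\S+) (?P<level>INFO|WARN|ERROR) (?P<msg>.*)$")

def error_summary(lines, threshold=2):
    """Count ERROR messages and surface any that breach an alert threshold,
    the kind of triage a support engineer does before raising an incident."""
    counts = Counter()
    for line in lines:
        m = LOG_LINE.match(line)
        if m and m.group("level") == "ERROR":
            counts[m.group("msg")] += 1
    return {msg: n for msg, n in counts.items() if n >= threshold}

sample = [
    "09:00:01 INFO engine started",
    "09:00:05 ERROR feed disconnect",
    "09:00:06 ERROR feed disconnect",
    "09:00:07 WARN latency spike",
]
print(error_summary(sample))  # → {'feed disconnect': 2}
```

Real setups would feed alerts like this into the monitoring stack rather than a script, but the pattern of parse, count, threshold is the same.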
Posted 1 week ago
4.0 - 6.0 years
15 - 25 Lacs
Hyderabad
Work from Office
Job Summary Join our dynamic team as a TL-Product Info & MDM where you will leverage your expertise in Customer Service Management and Supply Chain Analytics within the retail domain. With a hybrid work model and rotational shifts, you will play a crucial role in optimizing order management processes and enhancing customer service experiences. This position offers an exciting opportunity to contribute to our company's growth and impact the retail industry positively. Responsibilities Lead the development and implementation of product information management strategies to enhance data accuracy and accessibility. Oversee the integration of customer service management systems to improve service delivery and customer satisfaction. Provide insights and analytics on supply chain processes to optimize retail operations and drive efficiency. Collaborate with cross-functional teams to streamline order management processes and ensure timely fulfillment. Analyze customer feedback and service metrics to identify areas for improvement and implement corrective actions. Coordinate with IT teams to ensure seamless integration of MDM solutions with existing systems. Develop and maintain documentation for product information and MDM processes to ensure consistency and compliance. Monitor industry trends and best practices to continuously improve customer service and supply chain strategies. Facilitate training sessions for team members to enhance their understanding of MDM and customer service management tools. Support the development of KPIs to measure the effectiveness of customer service and supply chain initiatives. Ensure data governance and quality standards are met across all product information and MDM activities. Drive initiatives to enhance customer experience and loyalty through improved service delivery. Collaborate with stakeholders to align MDM strategies with business objectives and customer needs.
Qualifications Possess strong analytical skills with experience in supply chain analytics within the retail domain. Demonstrate expertise in customer service management and order management processes. Exhibit proficiency in MDM tools and technologies to support data management initiatives. Have a solid understanding of retail industry trends and best practices. Show excellent communication and collaboration skills to work effectively in a hybrid work model. Display the ability to work in rotational shifts and adapt to changing priorities. Hold a bachelor's degree in a relevant field or equivalent work experience.
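One of the data quality standards mentioned above, field completeness, can be sketched in a few lines of Python (the product fields here are hypothetical):

```python
def completeness(records, required_fields):
    """Per-field completeness ratio (share of records with a non-empty value),
    a common KPI for product information and MDM data quality."""
    n = len(records)
    return {
        f: sum(1 for r in records if r.get(f) not in (None, "")) / n
        for f in required_fields
    }

products = [
    {"sku": "A1", "name": "Widget", "brand": "Acme"},
    {"sku": "A2", "name": "", "brand": "Acme"},
    {"sku": "A3", "name": "Gadget", "brand": None},
]
print(completeness(products, ["sku", "name", "brand"]))
```

In an MDM platform the same check would run as a profiling rule, with results tracked over time against a target threshold.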
Posted 1 week ago
3.0 - 8.0 years
9 - 14 Lacs
Chennai
Remote
Healthcare experience is Mandatory Position Overview : We are seeking an experienced Data Modeler/Lead with deep expertise in health plan data models and enterprise data warehousing to drive our healthcare analytics and reporting initiatives. The candidate should have hands-on experience with modern data platforms and a strong understanding of healthcare industry data standards. Key Responsibilities : Data Architecture & Modeling : - Design and implement comprehensive data models for health plan operations, including member enrollment, claims processing, provider networks, and medical management - Develop logical and physical data models that support analytical and regulatory reporting requirements (HEDIS, Stars, MLR, risk adjustment) - Create and maintain data lineage documentation and data dictionaries for healthcare datasets - Establish data modeling standards and best practices across the organization Technical Leadership : - Lead data warehousing initiatives using modern platforms like Databricks or traditional ETL tools like Informatica - Architect scalable data solutions that handle large volumes of healthcare transactional data - Collaborate with data engineers to optimize data pipelines and ensure data quality Healthcare Domain Expertise : - Apply deep knowledge of health plan operations, medical coding (ICD-10, CPT, HCPCS), and healthcare data standards (HL7, FHIR, X12 EDI) - Design data models that support analytical, reporting and AI/ML needs - Ensure compliance with healthcare regulations including HIPAA/PHI, and state insurance regulations - Partner with business stakeholders to translate healthcare business requirements into technical data solutions Data Governance & Quality : - Implement data governance frameworks specific to healthcare data privacy and security requirements - Establish data quality monitoring and validation processes for critical health plan metrics - Lead efforts to standardize healthcare data definitions across multiple systems
and data sources Required Qualifications : Technical Skills : - 10+ years of experience in data modeling with at least 4 years focused on healthcare/health plan data - Expert-level proficiency in dimensional modeling, data vault methodology, or other enterprise data modeling approaches - Hands-on experience with Informatica PowerCenter/IICS or Databricks platform for large-scale data processing - Strong SQL skills and experience with Oracle Exadata and cloud data warehouses (Databricks) - Proficiency with data modeling tools (Hackolade, ERwin, or similar) Healthcare Industry Knowledge : - Deep understanding of health plan data structures including claims, eligibility, provider data, and pharmacy data - Experience with healthcare data standards and medical coding systems - Knowledge of regulatory reporting requirements (HEDIS, Medicare Stars, MLR reporting, risk adjustment) - Familiarity with healthcare interoperability standards (HL7 FHIR, X12 EDI) Leadership & Communication : - Proven track record of leading data modeling projects in complex healthcare environments - Strong analytical and problem-solving skills with ability to work with ambiguous requirements - Excellent communication skills with ability to explain technical concepts to business stakeholders - Experience mentoring team members and establishing technical standards Preferred Qualifications : - Experience with Medicare Advantage, Medicaid, or Commercial health plan operations - Cloud platform certifications (AWS, Azure, or GCP) - Experience with real-time data streaming and modern data lake architectures - Knowledge of machine learning applications in healthcare analytics - Previous experience in a lead or architect role within healthcare organization
Posted 1 week ago
3.0 - 8.0 years
9 - 14 Lacs
Kolkata
Work from Office
Position Overview : We are seeking an experienced Data Modeler/Lead with deep expertise in health plan data models and enterprise data warehousing to drive our healthcare analytics and reporting initiatives. The candidate should have hands-on experience with modern data platforms and a strong understanding of healthcare industry data standards. Key Responsibilities : Data Architecture & Modeling : - Design and implement comprehensive data models for health plan operations, including member enrollment, claims processing, provider networks, and medical management - Develop logical and physical data models that support analytical and regulatory reporting requirements (HEDIS, Stars, MLR, risk adjustment) - Create and maintain data lineage documentation and data dictionaries for healthcare datasets - Establish data modeling standards and best practices across the organization Technical Leadership : - Lead data warehousing initiatives using modern platforms like Databricks or traditional ETL tools like Informatica - Architect scalable data solutions that handle large volumes of healthcare transactional data - Collaborate with data engineers to optimize data pipelines and ensure data quality Healthcare Domain Expertise : - Apply deep knowledge of health plan operations, medical coding (ICD-10, CPT, HCPCS), and healthcare data standards (HL7, FHIR, X12 EDI) - Design data models that support analytical, reporting and AI/ML needs - Ensure compliance with healthcare regulations including HIPAA/PHI, and state insurance regulations - Partner with business stakeholders to translate healthcare business requirements into technical data solutions Data Governance & Quality : - Implement data governance frameworks specific to healthcare data privacy and security requirements - Establish data quality monitoring and validation processes for critical health plan metrics - Lead efforts to standardize healthcare data definitions across multiple systems and data sources Required
Qualifications : Technical Skills : - 10+ years of experience in data modeling with at least 4 years focused on healthcare/health plan data - Expert-level proficiency in dimensional modeling, data vault methodology, or other enterprise data modeling approaches - Hands-on experience with Informatica PowerCenter/IICS or Databricks platform for large-scale data processing - Strong SQL skills and experience with Oracle Exadata and cloud data warehouses (Databricks) - Proficiency with data modeling tools (Hackolade, ERwin, or similar) Healthcare Industry Knowledge : - Deep understanding of health plan data structures including claims, eligibility, provider data, and pharmacy data - Experience with healthcare data standards and medical coding systems - Knowledge of regulatory reporting requirements (HEDIS, Medicare Stars, MLR reporting, risk adjustment) - Familiarity with healthcare interoperability standards (HL7 FHIR, X12 EDI) Leadership & Communication : - Proven track record of leading data modeling projects in complex healthcare environments - Strong analytical and problem-solving skills with ability to work with ambiguous requirements - Excellent communication skills with ability to explain technical concepts to business stakeholders - Experience mentoring team members and establishing technical standards Preferred Qualifications : - Experience with Medicare Advantage, Medicaid, or Commercial health plan operations - Cloud platform certifications (AWS, Azure, or GCP) - Experience with real-time data streaming and modern data lake architectures - Knowledge of machine learning applications in healthcare analytics - Previous experience in a lead or architect role within healthcare organization
Posted 1 week ago
8.0 - 13.0 years
6 - 10 Lacs
Hyderabad
Work from Office
Career Category Information Systems Job Description ABOUT AMGEN Amgen harnesses the best of biology and technology to fight the world's toughest diseases, and make people's lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what's known today. ABOUT THE ROLE Role Description: We are seeking an experienced MDM Engineer with 8-12 years of experience to lead development and operations of our Master Data Management (MDM) platforms, with hands-on data engineering experience. This role will involve handling the backend data engineering solution within the MDM team. This is a technical role that will require hands-on work. To succeed in this role, the candidate must have strong data engineering experience. The candidate must have experience with technologies like SQL, Python, PySpark, Databricks, AWS, and API integrations. Roles & Responsibilities: Develop distributed data pipelines using PySpark on Databricks for ingesting, transforming, and publishing master data Write optimized SQL for large-scale data processing, including complex joins, window functions, and CTEs for MDM logic Implement match/merge algorithms and survivorship rules using Informatica MDM or Reltio APIs Build and maintain Delta Lake tables with schema evolution and versioning for master data domains Use AWS services like S3, Glue, Lambda, and Step Functions for orchestrating MDM workflows Automate data quality checks using IDQ or custom PySpark validators with rule-based profiling Integrate external enrichment sources (e.g. D&B, LexisNexis) via REST APIs and batch pipelines Design and deploy CI/CD pipelines using GitHub Actions or Jenkins for Databricks notebooks and jobs Monitor pipeline health using Databricks Jobs API, CloudWatch, and custom logging frameworks Implement fine-grained access control using Unity Catalog and attribute-based policies for MDM datasets Use MLflow for tracking model-based entity resolution experiments if ML-based matching is applied Collaborate with data stewards to expose curated MDM views via REST endpoints or Delta Sharing Basic Qualifications and Experience: 8 to 13 years of experience in Business, Engineering, IT or a related field Functional Skills: Must-Have Skills: Advanced proficiency in PySpark for distributed data processing and transformation Strong SQL skills for complex data modeling, cleansing, and aggregation logic Hands-on experience with Databricks including Delta Lake, notebooks, and job orchestration Deep understanding of MDM concepts including match/merge, survivorship, and golden record creation Experience with MDM platforms like Informatica MDM or Reltio, including REST API integration Proficiency in AWS services such as S3, Glue, Lambda, Step Functions, and IAM Familiarity with data quality frameworks and tools like Informatica IDQ or custom rule engines Experience building CI/CD pipelines for data workflows using GitHub Actions, Jenkins, or similar Knowledge of schema evolution, versioning, and metadata management in data lakes Ability to implement lineage and observability using Unity Catalog or third-party tools Comfort with Unix shell scripting or Python for orchestration and automation Hands-on experience with RESTful APIs for ingesting external data sources and enrichment feeds Good-to-Have Skills: Experience with Tableau or PowerBI for reporting MDM insights. Exposure to Agile practices and tools (JIRA, Confluence). Prior experience in Pharma/Life Sciences.
Understanding of compliance and regulatory considerations in master data. Professional Certifications: Any MDM certification (e.g. Informatica, Reltio, etc.) Any data analysis certification (SQL, Python, PySpark, Databricks) Any cloud certification (AWS or Azure) Soft Skills: Strong analytical abilities to assess and improve master data processes and solutions. Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders. Effective problem-solving skills to address data-related issues and implement scalable solutions. Ability to work effectively with global, virtual teams. EQUAL OPPORTUNITY STATEMENT We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation. GCF Level 05A.
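A minimal, pure-Python sketch of the survivorship step named in the responsibilities; the role's actual stack would be PySpark with Informatica MDM or Reltio, and the source ranks and fields here are invented for illustration:

```python
from datetime import date

def golden_record(duplicates, source_rank):
    """Merge matched duplicate records into a single golden record.

    Illustrative survivorship rules: for each field, keep the first non-empty
    value from the highest-ranked (most trusted) source, breaking ties by
    the most recent last_updated date.
    """
    def priority(rec):
        # lower rank number = more trusted; newer last_updated wins ties
        return (source_rank.get(rec["source"], 99), -rec["last_updated"].toordinal())

    golden = {}
    for rec in sorted(duplicates, key=priority):
        for field, value in rec.items():
            if field not in ("source", "last_updated") and value and field not in golden:
                golden[field] = value
    return golden

dupes = [
    {"source": "crm", "last_updated": date(2024, 1, 5), "name": "ACME Corp", "phone": ""},
    {"source": "erp", "last_updated": date(2023, 6, 1), "name": "Acme Corporation", "phone": "555-0100"},
]
print(golden_record(dupes, source_rank={"erp": 1, "crm": 2}))
# → {'name': 'Acme Corporation', 'phone': '555-0100'}
```

In a real platform the match step (deciding which records are duplicates) precedes this, and the rules are configured per attribute rather than hard-coded.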
Posted 1 week ago
2.0 - 5.0 years
6 - 10 Lacs
Mumbai
Work from Office
As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Expertise in data warehousing/information management/data integration/business intelligence using the ETL tool Informatica PowerCenter Knowledge of cloud, Power BI, and data migration on cloud Experience in Unix shell scripting and Python Experience with relational SQL, Big Data etc. Preferred technical and professional experience Knowledge of MS Azure Cloud Experience in Informatica PowerCenter Experience in Unix shell scripting and Python
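A small illustrative sketch of the cleansing step such an ETL pipeline performs, in plain Python; the input layout and date format are assumptions for the example, not an actual Informatica mapping:

```python
from datetime import datetime

def cleanse(rows):
    """Standardize a raw extract before loading: trim and case-fold text,
    normalize dates. Mirrors the kind of reusable cleansing transformation
    an ETL mapping applies row by row."""
    out = []
    for row in rows:
        out.append({
            "customer": row["customer"].strip().title(),
            # normalize an assumed DD/MM/YYYY source format to ISO for the warehouse
            "signup": datetime.strptime(row["signup"], "%d/%m/%Y").date().isoformat(),
        })
    return out

raw = [{"customer": "  jane doe ", "signup": "05/03/2024"}]
print(cleanse(raw))  # → [{'customer': 'Jane Doe', 'signup': '2024-03-05'}]
```

A production pipeline would add rejection handling for rows that fail parsing, so bad records are quarantined rather than dropped silently.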
Posted 1 week ago
5.0 - 10.0 years
7 - 11 Lacs
Hyderabad
Work from Office
As a senior SAP Consultant, you will serve as a client-facing practitioner, working collaboratively with clients to deliver high-quality solutions and acting as a trusted business advisor with a deep understanding of the SAP Accelerate delivery methodology (or equivalent) and associated work products. You will work on projects that assist clients in integrating strategy, process, technology, and information to enhance effectiveness, reduce costs, and improve profit and shareholder value. There are opportunities for you to acquire new skills, work across different disciplines, take on new challenges, and develop a comprehensive understanding of various industries. Your primary responsibilities include: Strategic SAP Solution Focus: Working across technical design, development, and implementation of SAP solutions for simplicity, amplification, and maintainability that meet client needs. Comprehensive Solution Delivery: Involvement in strategy development and solution implementation, leveraging your knowledge of SAP and working with the latest technologies. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Overall 5-12 years of relevant experience in SAP BODS/BOIS/SDI/SDQ, with 3+ years of SAP functional experience specializing in design and configuration of SAP BODS/HANA SDI modules. Experience in gathering business requirements; able to create requirement specifications based on architecture, design, and detailing of processes, and to prepare mapping sheets combining functional and technical expertise. All BODS consultants should primarily have data migration experience from different legacy systems to SAP or non-SAP systems, including data migration from SAP ECC to S/4HANA using the Migration Cockpit or other methods.
In addition to data migration experience, the consultant should have experience or strong knowledge of BOIS (BO Information Steward) for data profiling and data governance. Preferred technical and professional experience: BODS administration experience/knowledge. Working or strong knowledge of SAP Data Hub. Experience/strong knowledge of HANA SDI (Smart Data Integration) as an ETL tool, with the ability to develop flowgraphs to validate/transform data. The consultant should develop workflows and data flows based on the specifications using various stages in BODS.
Posted 1 week ago
2.0 - 5.0 years
14 - 17 Lacs
Pune
Work from Office
As a Big Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: Developing, maintaining, evaluating, and testing big data solutions. You will be involved in data engineering activities like creating source-to-target pipelines/workflows and implementing solutions that tackle clients' needs. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Big data development with Hadoop, Hive, Spark, PySpark, and strong SQL. Ability to incorporate a variety of statistical and machine learning techniques. Basic understanding of cloud platforms (AWS, Azure, etc.). Ability to use programming languages like Java, Python, and Scala to build pipelines that extract and transform data from a repository to a data consumer. Ability to use Extract, Transform, and Load (ETL), data integration, or federation tools to prepare and transform data as needed. Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop and Java. Preferred technical and professional experience: Basic understanding of or experience with predictive/prescriptive modeling. You thrive on teamwork and have excellent verbal and written communication skills, with the ability to communicate with internal and external clients to understand and define business needs and provide analytical solutions.
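The source-to-target pipelines this role describes follow an extract-filter-aggregate shape; a minimal plain-Python sketch of that shape is below (in PySpark the equivalent would be a `filter` followed by a `groupBy(...).count()`; the event records are invented for illustration):

```python
from functools import reduce

# Hypothetical source records (e.g. clickstream events pulled from a repository)
events = [
    {"user": "a", "action": "view"},
    {"user": "b", "action": "click"},
    {"user": "a", "action": "click"},
]

# Filter step: keep only the events of interest
clicks = filter(lambda e: e["action"] == "click", events)

# Aggregate step: count clicks per user, the way a groupBy/count would
per_user = reduce(
    lambda acc, e: {**acc, e["user"]: acc.get(e["user"], 0) + 1},
    clicks,
    {},
)
print(per_user)  # {'b': 1, 'a': 1}
```

A production Spark job would distribute the same logic across partitions; the sketch only demonstrates the transformation pattern on a single machine.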
Posted 1 week ago
2.0 - 5.0 years
6 - 10 Lacs
Pune
Work from Office
As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements. Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Expertise in data warehousing/information management/data integration/business intelligence using the ETL tool Informatica PowerCenter. Knowledge of cloud, Power BI, and data migration on cloud. Experience in Unix shell scripting and Python. Experience with relational SQL, big data, etc. Preferred technical and professional experience: Knowledge of MS Azure Cloud. Experience in Informatica PowerCenter. Experience in Unix shell scripting and Python.
Posted 1 week ago
0.0 - 8.0 years
13 - 15 Lacs
Chennai
Work from Office
Ford Credit IT is looking for a proficient technical anchor with excellent hands-on experience in Salesforce Service Cloud, Interaction Studio, OmniStudio, Vlocity and Mobile Studio, AMPscript, JSON/Apex, JavaScript, Lightning components, Aura Components and Lightning Web Components, along with sound software engineering practices. The Technical Anchor will build scalable, fully available technical solutions in the digital space with a team of software engineers based out of India and will be responsible for supporting NA markets. The Technical Anchor will collaborate directly and continuously with Software Engineers, Product Managers, Designers, Architects, Engineering Managers and Product Owners of the Salesforce team. Description for Internal Candidates: As a Technical Anchor working in Ford Credit IT, you will join a team that develops enterprise-scale applications and builds SaaS products on Salesforce Service Cloud/Auto Cloud. You will work on a balanced product team to define, design, develop and deploy Salesforce Service Cloud/Auto Cloud solutions, developing form data models, Customer Data Platform (CDP)/Interaction Studio/Journey Builder/Automation Studio/Email and Mobile Studio assets, Contact Builder, data extensions, data sync, sitemaps, and content blocks. Ability to productize (build/author) a document generation product as a SaaS (Software as a Service) offering hosted on MuleSoft and Google Cloud Platform (GCP). Build and maintain digital expertise by researching the latest industry trends and standards, driving innovation through PoCs and experiments. Develop Salesforce Service Cloud/Auto Cloud applications. Evaluate potential solutions using both technical and commercial criteria that support the established cost and service requirements, with a continuous improvement and innovative mindset. Develop and automate unit and integration test scripts. Integrate with MuleSoft applications for integrations around Sales/Service Clouds with Ford Credit systems.
Act as a mentor for less experienced developers through both your technical knowledge and your ability to inspire a team to build extraordinary impact together. Understand the depth of user stories and provide accurate estimates. Automate performance monitoring and notification in the event of failures using best practices and tools. Research new technologies and influence and implement enterprise technology shifts and new trends impacting Ford application delivery. Perform code deployments using CI/CD pipelines for Salesforce Sales Cloud and MuleSoft, with Service Cloud deployments via Copado. Participate in a highly collaborative environment. DevOps: Continuous Integration and Continuous Deployment (CI/CD), security (SAST/DAST), and monitoring/logging/tracing tools (Splunk, etc.). Experience deploying with source control using Visual Studio Code/GitHub repos/Copado. Strong sense of code quality, with the ability to review code using SonarQube and Checkmarx, rework, and deliver quality code. Build reusable components using LWC components, AMPscript, Server-Side JavaScript (SSJS), and SQL. Integrate Salesforce Marketing Cloud with external systems using SFMC APIs. Follow enterprise architecture processes and advise teams on cloud design, development, architecture, and service blueprints. Engage in Agile practices including, but not limited to, stand-ups, backlog grooming, sprint demos and journey mapping. B.E./B.Tech/MCA. Minimum 7 years of experience developing Salesforce Service/Auto Cloud customizations. Responsibilities for Internal Candidates: Extensive experience in AMPscript, Apex, JavaScript, Lightning components, Aura Components, Lightning Web Components, OmniScript, and Vlocity. Must have experience in Contact Builder, data extensions, data sync, sitemaps, and content blocks, and in leading Service/Auto Cloud data modeling and architecture, including data extension modeling and cross-product data architecture and mapping. Ability to integrate MuleSoft, Informatica, GraphQL, Medallia and Emplifi.
Ability to create flows, modify objects, create custom objects, write Apex and triggers, and integrate API services using an IDE. Demonstrated ability to drive development of highly technical technology services and capabilities. Experience with the Salesforce.com Apex Data Loader and Salesforce.com web services APIs (Platform Events/Change Data Capture/REST/Pub/Sub). Strong sense of code quality, with the ability to review code using SonarQube and Checkmarx, rework, and deliver quality code. Demonstrated experience in Customer Data Platform (CDP)/Interaction Studio/Journey Builder/Automation Studio/Email and Mobile Studio. Demonstrated experience establishing and maintaining data structures, data extensions and automations within Salesforce Service/Auto Cloud. Experience in enterprise data analytics, reporting and monitoring using Splunk, Dynatrace, healthnut, etc. Qualifications for Internal Candidates: 5+ years of experience in architecting and implementing fault-tolerant, highly available Service/Auto Cloud APIs (REST/SOAP, Platform Events/Pub/Sub). Salesforce Service/Auto Cloud Developer/Consultant or Salesforce Application Architect certifications will be an added advantage. Should have SQL knowledge and experience writing database scripts using DDL or queries using DML. Experience in SRE with Copado and the ability to architect services considering observability, traceability and monitoring aspects. At least 4 years of experience in the Agile Scrum software development process. Ability to work in a team in a diverse, multi-stakeholder environment. Experience in and desire to work in a global delivery environment. Excellent communication skills with the ability to adapt your communication style to the audience. Demonstrated ability to drive development of highly technical technology services and capabilities. Experience deploying with source control using change sets and CI/CD pipelines.
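The DDL/DML skill mentioned above can be illustrated with a minimal sketch using Python's built-in sqlite3 module; the table and column names are invented for the example and are not Salesforce objects:

```python
import sqlite3

# In-memory database for illustration only
conn = sqlite3.connect(":memory:")

# DDL: define the structure of a table
conn.execute("""
    CREATE TABLE service_case (
        case_id INTEGER PRIMARY KEY,
        status  TEXT NOT NULL,
        owner   TEXT
    )
""")

# DML: insert and then query rows (parameterized to avoid SQL injection)
conn.execute(
    "INSERT INTO service_case (status, owner) VALUES (?, ?)",
    ("OPEN", "agent_1"),
)
row = conn.execute(
    "SELECT status, owner FROM service_case WHERE case_id = 1"
).fetchone()
print(row)  # ('OPEN', 'agent_1')
```

The same DDL/DML split applies to scripts written directly against any relational database; sqlite3 is used here only because it requires no server setup.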
Posted 1 week ago
8.0 - 11.0 years
13 - 18 Lacs
Noida
Work from Office
Position: SAP S/4HANA Data Migration Lead. Location: Chennai/Noida. Experience: 8-11 years. Education: B.E./B.Tech/MCA. Job Description: At least one end-to-end project experience as a Data Lead. At least 8 years of IT experience in SAP data migration, data analysis, data audits, process management, business analysis, ECM, business process re-engineering, RFPs, quality assurance, data analysis and modeling, and testing of enterprise-wide client/server and web-based applications, covering all aspects of software engineering and the Systems Development Life Cycle (SDLC). Experience as an ABAP developer: able to understand ABAP and the data dictionary, read BAPI code, and debug. Certified in any of the ETL tools like Informatica/BODS/ADM Syniti. Good work experience in SAP data migration using LTMC and LTMOM. Basic development skill in VBA (Excel) or Microsoft SQL Server in case no tool is provided but file management is needed. Extensive experience in managing industrial, finance and telecom clients; manufacturing process knowledge (sales, purchasing, production, finance) needed for AMC here. Expertise in cutover planning, project planning, project design, gathering business and functional requirements, creating functional specifications, and use-case data flow diagrams. Worked in SAP Finance, SAP SD, SAP MM, SAP SCM, SAP MDM, SAP Project Systems, and business partner migration from SAP ECC or open-source systems to S/4HANA.
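The file-management side of migration prep mentioned above (checking legacy extract files before loading, e.g. via the Migration Cockpit) might look like this minimal Python sketch; the mandatory-field list uses two SAP-style field names purely as a hypothetical example:

```python
import csv
import io

# Hypothetical mandatory fields in a legacy material-master extract
REQUIRED = {"MATNR", "MAKTX"}

def validate(reader):
    """Return (row number, missing fields) for every incomplete row."""
    errors = []
    for i, row in enumerate(reader, start=1):
        missing = [f for f in REQUIRED if not row.get(f, "").strip()]
        if missing:
            errors.append((i, missing))
    return errors

# A stand-in for a file exported from the legacy system
src = io.StringIO("MATNR,MAKTX\n1000,Pump\n1001,\n")
print(validate(csv.DictReader(src)))  # [(2, ['MAKTX'])]
```

Catching gaps like this before a load run is cheaper than debugging rejected records inside the migration tool afterwards.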
Posted 1 week ago