5.0 - 10.0 years
2 - 6 Lacs
Chennai
Work from Office
Job Title: Senior Data Analyst
Experience: 5-10 Years
Location: Remote

As a Senior Data Analyst, you will provide invaluable insights to our product and business teams, working closely with our product and engineering leaders. You'll support impactful initiatives including pricing, packaging, multi-product strategies, and self-service approaches. You will collaborate cross-functionally with Engineering, Sales, Finance, Marketing, and Customer Relations to translate data needs into powerful solutions. You'll also establish best practices in analytics and mentor team members, ensuring high standards in data governance and insight generation.

Your Mission:
- Partner with product and business leaders to define the analytics roadmap and deliver key projects in customer acquisition, engagement, and retention.
- Provide actionable insights and compelling narratives to influence major decisions at the C-level.
- Build a culture of experimentation by designing and executing rigorous A/B tests to measure business impact.
- Work closely with data engineers to continuously improve the data stack, governance practices, and analysis quality.

A Little More About You:
- 5+ years of hands-on technical experience in a high-growth environment.
- Proficiency in SQL for deriving insights (required).
- Experience building Looker dashboards or similar BI-tool dashboards (required).
- Proficiency in Python is a plus.
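For illustration only (not part of the posting): a minimal sketch of the kind of A/B-test readout this role describes, using a two-proportion z-test. All counts and names are hypothetical assumptions.

```python
# Hedged sketch of an A/B-test readout; numbers are illustrative, not real data.
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical conversion counts for control (A) and variant (B)
conversions = [412, 475]        # users who converted in each arm
exposures = [10_000, 10_000]    # users exposed to each arm

stat, p_value = proportions_ztest(count=conversions, nobs=exposures)
lift = conversions[1] / exposures[1] - conversions[0] / exposures[0]

print(f"absolute lift: {lift:.2%}, p-value: {p_value:.4f}")
# A small p-value (e.g. < 0.05) suggests the lift is unlikely to be noise,
# the kind of evidence an analyst would bring to a C-level decision.
```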
Posted 2 weeks ago
6.0 - 9.0 years
4 - 8 Lacs
Chennai
Work from Office
Job Title: Data Scientist
Experience: 6-9 Years
Location: Remote

Your Mission:
- Lead groundbreaking projects that leverage AI and machine learning techniques to optimize operations and drive business success.
- Take ownership of the entire lifecycle, from model design and training through productionization and incremental enhancement.
- Foster a culture of experimentation by designing and scaling rigorous A/B tests when needed to measure business impact.
- Educate internal analytics teams on current and emerging research, technologies, and best practices for leveraging ML and LLMs in product design and development.
- Collaborate with AI Platform engineers to identify upcoming requirements and continuously evolve the AI platform offering.
- Establish best practices in data science and mentor future team members to maintain high standards in data governance and insight generation.
- Communicate effectively with stakeholders across various technical levels, articulating insights and recommendations about your solutions.
- Cultivate trust with stakeholders by consistently addressing their true fundamental needs and challenges.

A Little More About You:
- 6+ years of data science and machine learning/statistical modeling experience.
- Experience implementing theoretical models in an applied environment.
- M.S. or Ph.D. in Statistics, Computer Science, Math, Operations Research, Physics, Economics, or another quantitative field.
- Strong interest in delivering data-driven solutions using ML and AI.
- Proficiency in data querying languages (e.g., SQL), scripting languages (e.g., Python), or statistical/mathematical software.
- Solid understanding of statistics, machine learning, operations research, and causal inference.
- Knowledge of NLP or Bayesian testing is a plus.
- Insatiable curiosity and a bias toward action.
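Since the posting lists Bayesian testing as a plus, here is a hedged sketch of one common approach, a Beta-Binomial comparison of two experiment arms. Priors, counts, and sample sizes are illustrative assumptions, not anything from the posting.

```python
# Hedged sketch: Bayesian A/B comparison via Beta-Binomial conjugacy.
import numpy as np

rng = np.random.default_rng(42)

# Uniform Beta(1, 1) priors updated with hypothetical successes/failures
post_a = rng.beta(1 + 412, 1 + 10_000 - 412, size=100_000)
post_b = rng.beta(1 + 475, 1 + 10_000 - 475, size=100_000)

# Monte Carlo estimate of the probability the variant beats control
prob_b_beats_a = (post_b > post_a).mean()
print(f"P(variant beats control) = {prob_b_beats_a:.3f}")
```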
Posted 2 weeks ago
5.0 - 10.0 years
2 - 5 Lacs
Chennai
Work from Office
Job Title: Data Engineer
Experience: 5-10 Years
Location: Remote

Responsibilities:
- Design, build, and maintain core data infrastructure pieces that allow Aircall to support our many data use cases.
- Enhance the data stack, lineage monitoring, and alerting to prevent incidents and improve data quality.
- Implement best practices for data management, storage, and security to ensure data integrity and compliance with regulations.
- Own the core company data pipeline, converting business needs into efficient and reliable data pipelines.
- Participate in code reviews to ensure code quality and share knowledge.
- Lead efforts to evaluate and integrate new technologies and tools to enhance our data infrastructure.
- Define and manage evolving data models and data schemas.
- Manage SLAs for the data sets that power our company metrics.
- Mentor junior members of the team, providing guidance and support in their professional development.
- Collaborate with data scientists, analysts, and other stakeholders to drive efficiencies for their work, supporting complex data processing, storage, and orchestration.

A little more about you:
- Bachelor's degree or higher in Computer Science, Engineering, or a related field.
- 5+ years of experience in data engineering, with a strong focus on designing and building data pipelines and infrastructure.
- Proficient in SQL and Python, with the ability to translate complexity into efficient code.
- Experience with data workflow development and management tools (dbt, Airflow).
- Solid understanding of distributed computing principles and experience with cloud-based data platforms such as AWS, GCP, or Azure.
- Strong analytical and problem-solving skills, with the ability to effectively troubleshoot complex data issues.
- Excellent communication and collaboration skills, with the ability to work effectively in a cross-functional team environment.
- Experience with data tooling, data governance, business intelligence, and data privacy is a plus.
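For a sense of the pipeline-ownership work the posting names, here is a hedged, Airflow 2.x-style sketch of a minimal extract-transform-load DAG. The DAG id, schedule, and task callables are hypothetical placeholders, not Aircall's actual pipeline.

```python
# Minimal Airflow DAG sketch; everything here is an illustrative assumption.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(): print("pull raw events from source systems")
def transform(): print("apply business rules, enforce schema")
def load(): print("publish modeled tables for analytics")

with DAG(
    dag_id="core_company_pipeline",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # Airflow 2.4+ keyword
    catchup=False,
) as dag:
    # Wire the three steps into a linear dependency chain
    PythonOperator(task_id="extract", python_callable=extract) \
        >> PythonOperator(task_id="transform", python_callable=transform) \
        >> PythonOperator(task_id="load", python_callable=load)
```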
Posted 2 weeks ago
12.0 - 15.0 years
13 - 18 Lacs
Gurugram
Work from Office
Project Role: Data Architect
Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage, and integration.
Must-Have Skills: SAP Data Services Development
Good-to-Have Skills: NA
Minimum 12 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary:
As a Data Architect, you will define the data requirements and structure for the application. Your typical day will involve modeling and designing the application data structure, storage, and integration, ensuring that the data architecture aligns with the overall business objectives and technical specifications. You will collaborate with various stakeholders to gather requirements and translate them into effective data solutions, while also addressing any challenges that arise during the design and implementation phases. Your role will be pivotal in ensuring that the data architecture is robust, scalable, and capable of supporting future growth and innovation within the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate knowledge sharing and best practices among team members.
- Monitor and evaluate the effectiveness of data solutions and make necessary adjustments.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in SAP Data Migration.
- Experience with SAP Data & Development.
- Strong understanding of data modeling techniques and best practices.
- Familiarity with data integration tools and methodologies.
- Ability to design and implement data governance frameworks.

Additional Information:
- The candidate should have a minimum of 12 years of experience in SAP Data Migration.
- This position is based at our Gurugram office.
- 15 years of full-time education is required.
Posted 2 weeks ago
3.0 - 8.0 years
10 - 14 Lacs
Chennai
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-Have Skills: SAP Master Data Governance MDG Tool
Good-to-Have Skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary:
As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various stakeholders to gather requirements, overseeing the development process, and ensuring that the applications meet the specified needs. You will also engage in troubleshooting and problem-solving to enhance application performance and user experience, while mentoring team members and facilitating knowledge sharing within the team.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Facilitate training sessions for team members to enhance their skills and knowledge.
- Engage in continuous improvement initiatives to optimize application performance.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in SAP Master Data Governance MDG Tool.
- Strong understanding of data governance principles and practices.
- Experience with application design and configuration.
- Ability to troubleshoot and resolve application issues effectively.
- Familiarity with integration processes and data management.

Additional Information:
- The candidate should have a minimum of 3 years of experience in SAP Master Data Governance MDG Tool.
- This position is based at our Pune office.
- 15 years of full-time education is required.
Posted 2 weeks ago
4.0 - 8.0 years
0 - 0 Lacs
Pune
Hybrid
So, what's the role all about?
Within Actimize, the AI and Analytics Team is developing the next-generation advanced analytical cloud platform that will harness the power of data to provide maximum accuracy for our clients' Financial Crime programs. As part of the PaaS/SaaS development group, you will be responsible for developing this platform for Actimize cloud-based solutions and working with cutting-edge cloud technologies.

How will you make an impact?
NICE Actimize is the largest and broadest provider of financial crime, risk, and compliance solutions for regional and global financial institutions, and has been consistently ranked as number one in the space. At NICE Actimize, we recognize that every employee's contributions are integral to our company's growth and success. To find and acquire the best and brightest talent around the globe, we offer a challenging work environment, competitive compensation and benefits, and rewarding career opportunities. Come share, grow and learn with us – you'll be challenged, you'll have fun and you'll be part of a fast-growing, highly respected organization. This new SaaS platform will enable our customers (some of the biggest financial institutions around the world) to create solutions on the platform to fight financial crime.

Have you got what it takes?
- Design, implement, and maintain real-time and batch data pipelines for fraud detection systems.
- Automate data ingestion from transactional systems, third-party fraud intelligence feeds, and behavioral analytics platforms.
- Ensure high data quality, lineage, and traceability to support audit and compliance requirements.
- Collaborate with fraud analysts and data scientists to deploy and monitor machine learning models in production.
- Monitor pipeline performance and implement alerting for anomalies or failures.
- Ensure data security and compliance with financial regulations.

Qualifications:
- Bachelor's or master's degree in Computer Science, Data Engineering, or a related field.
- 4-6 years of experience in a DataOps role, preferably in fraud or risk domains.
- Strong programming skills in Python and SQL.
- Knowledge of financial fraud patterns, transaction monitoring, and behavioral analytics.
- Familiarity with fraud detection systems, rules engines, or anomaly detection frameworks.
- Experience with AWS cloud platforms.
- Understanding of data governance, encryption, and secure data handling practices.
- Experience with fraud analytics tools or platforms like Actimize.

What's in it for you?
Join an ever-growing, market-disrupting, global company where the teams – comprised of the best of the best – work in a fast-paced, collaborative, and creative environment! As the market leader, every day at NiCE is a chance to learn and grow, and there are endless internal career opportunities across multiple roles, disciplines, domains, and locations. If you are passionate, innovative, and excited to constantly raise the bar, you may just be our next NiCEr!

Enjoy NiCE-FLEX!
At NiCE, we work according to the NiCE-FLEX hybrid model, which enables maximum flexibility: 2 days working from the office and 3 days of remote work, each week. Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere.

Requisition ID: 7822
Reporting into: Director
Role Type: Tech Manager
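As a hedged illustration of the "alerting for anomalies or failures" duty above, here is a minimal volume check that flags a day whose ingested row count deviates sharply from recent history. The threshold, history, and alert channel are illustrative assumptions, not Actimize internals.

```python
# Toy anomaly alert on pipeline row counts; all numbers are hypothetical.
import statistics

def check_volume(history: list[int], today: int, z_threshold: float = 3.0) -> None:
    mean = statistics.mean(history)
    stdev = statistics.stdev(history) or 1.0   # guard against zero variance
    z = (today - mean) / stdev
    if abs(z) > z_threshold:
        # In production this might page on-call or post to a monitoring tool.
        print(f"ALERT: volume z-score {z:.1f} (today={today}, mean={mean:.0f})")
    else:
        print(f"OK: volume within expected range (z={z:.1f})")

check_volume(history=[98_000, 101_500, 99_700, 100_200, 102_000], today=61_000)
```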
Posted 2 weeks ago
5.0 - 7.0 years
7 - 9 Lacs
Gurugram
Work from Office
Department: Platform Engineering

Summary:
We are seeking a highly skilled Data Engineer with expertise in ontology development and knowledge graph implementation. This role will be pivotal in shaping our data infrastructure and ensuring the accurate representation and integration of complex data sets. You will leverage industry best practices, including the Basic Formal Ontology (BFO) and Common Core Ontologies (CCO), to design, develop, and maintain ontologies, semantic and syntactic data models, and knowledge graphs on the Databricks Data Intelligence Platform that drive data-driven decision-making and innovation within the company.

Responsibilities:

Ontology Development:
- Design and implement ontologies based on BFO and CCO principles, ensuring alignment with business requirements and industry standards.
- Collaborate with domain experts to capture and formalize domain knowledge into ontological structures.
- Develop and maintain comprehensive ontologies to model various business entities, relationships, and processes.

Data Modeling:
- Design and implement semantic and syntactic data models that adhere to ontological principles.
- Create data models that are scalable, flexible, and adaptable to changing business needs.
- Integrate data models with existing data infrastructure and applications.

Knowledge Graph Implementation:
- Design and build knowledge graphs based on ontologies and data models.
- Develop algorithms and tools for knowledge graph population, enrichment, and maintenance.
- Utilize knowledge graphs to enable advanced analytics, search, and recommendation systems.

Data Quality and Governance:
- Ensure the quality, accuracy, and consistency of ontologies, data models, and knowledge graphs.
- Define and implement data governance processes and standards for ontology development and maintenance.

Collaboration and Communication:
- Work closely with data scientists, software engineers, and business stakeholders to understand their data requirements and provide tailored solutions.
- Communicate complex technical concepts clearly and effectively to diverse audiences.

Qualifications:

Education:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.

Experience:
- 5+ years of experience in data engineering or a related role.
- Proven experience in ontology development using BFO and CCO or similar ontological frameworks.
- Strong knowledge of semantic web technologies, including RDF, OWL, SPARQL, and SHACL.
- Proficiency in Python, SQL, and other programming languages used for data engineering.
- Experience with graph databases (e.g., TigerGraph, JanusGraph) and triple stores (e.g., GraphDB, Stardog) is a plus.

Desired Skills:
- Familiarity with machine learning and natural language processing techniques.
- Experience with cloud-based data platforms (e.g., AWS, Azure, GCP).
- Experience with Databricks technologies including Spark, Delta Lake, Iceberg, Unity Catalog, UniForm, and Photon.
- Strong problem-solving and analytical skills.
- Excellent communication and interpersonal skills.
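To make the RDF/SPARQL requirement concrete, here is a hedged rdflib sketch that builds a tiny graph and queries it. The namespace and entities are hypothetical toys, not the company's actual ontology; BFO/CCO alignment is deliberately omitted.

```python
# Toy knowledge-graph sketch with rdflib; all identifiers are illustrative.
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/ontology/")  # hypothetical namespace
g = Graph()
g.add((EX.acme, RDF.type, EX.Organization))
g.add((EX.acme, EX.hasName, Literal("Acme Corp")))

# SPARQL query over the graph: find every organization's name
results = g.query("""
    PREFIX ex: <http://example.org/ontology/>
    SELECT ?name WHERE { ?org a ex:Organization ; ex:hasName ?name . }
""")
for row in results:
    print(row.name)  # -> Acme Corp
```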
Posted 2 weeks ago
1.0 - 4.0 years
2 - 4 Lacs
Hyderabad
Work from Office
Send CVs to shilpa.srivastava@orcapod.work with subject "Data Governance".
Notice period: Immediate to 15 days only
Excellent spoken English required
Location: Hyderabad

Responsibilities:
- Verifying account/contact information for new and existing records in databases.
- Conducting secondary research to gather information related to contact and account attributes.
- Validating the gathered information and updating relevant attributes in the CRM to improve data quality.
- Demonstrated experience in data governance, including defining business terms and implementing data quality (DQ) rules.
- Understanding how account or contact data is used within the firm, and identifying patterns and logic deficiencies that will need to be refined to improve our tools and processes (e.g., the contact-matching algorithm in the master data management platform).
- Working across multiple enterprise systems (e.g., Salesforce, Siebel CRM, proprietary master data management, relationship intelligence platform) to enhance data quality.
- Prioritizing and actioning a queue of data quality issues in contact and/or opportunity records, ensuring that Service Level Agreements are met.
- Monitoring data quality and adherence to business policies and established conventions.
- Documenting data quality processes and rules as needed.
- Liaising with other teams on proactive data enrichment/correction activities.
- Liaising with practitioners/requestors to understand requests and recommend next steps as appropriate.
- Providing support for projects as required, including participating in system enhancement projects and liaising with IT by reporting and resolving any system exceptions.
- Keeping a keen eye on patterns and scenarios that recur in the process, and suggesting solutions for automation.
- Assisting team members in executing ad-hoc requests, and providing project planning/coordination support on work assignments.
- Ensuring compliance with all data management policies.

Qualifications and Experience:
The successful candidate will meet the following criteria:
- 1-2 years' experience working in an enterprise CRM system; working knowledge/experience of Salesforce would be an asset.
- Experience or knowledge of tools such as LinkedIn, Factiva, Hoovers, etc.
- Experience in conducting secondary research (e.g., market, companies, industries).
- Excellent oral and written communication skills.
- Attention to detail, and the ability to be a self-starter.
- Ability to collaborate with culturally diverse offshore teams in different time zones.
- Working knowledge of data quality management, data entry improvement, and user requirements.
- Demonstrated ability to work effectively in cross-functional, virtual teams.
- Process oriented; must be able to work with a high degree of detail and have high quality standards.
- Ability to assist in the development and implementation of policy, standards, and procedures.
- Demonstrated PC skills: Microsoft Office (Excel, Word, Access) and querying tools like SQL.
- Strong analytical, conceptual, and problem-solving abilities.
- Ability to present ideas in user-friendly language.
- Excellent organizational and time-management skills.
- Ability to prioritize and execute tasks in a high-pressure, fast-paced environment.
- Knowledge of CASL (the Canadian anti-spam legislation) and consent-related processes.
- Experience with Tableau is preferred, particularly in measuring data synchronization and working with large data sets.
- Experience managing marketing campaigns and handling consent-related processes is desirable.
- Proficiency in French is an advantage.
- Experience in project coordination will be an asset.
- Experience with Generative AI (Gen AI) technologies would be an added advantage.

Must-Have Requirements:
- Working knowledge of Salesforce CRM for managing customer relationships and data.
- Experience in conducting secondary research.
- Understanding of data governance principles and best practices.
- Advanced proficiency in Microsoft Excel, including complex formulas and data analysis tools.
- Foundational knowledge of SQL for querying and managing relational databases.
- Awareness of data quality frameworks and techniques to ensure accurate and reliable information.

Value-Added Requirements:
- Familiarity with CASL (Canada's Anti-Spam Legislation) and its application in business communications.
- Experience creating interactive dashboards and reports using Tableau.
- Proficiency in developing process flows and diagrams using Microsoft Visio.
- Demonstrated ability to support project coordination activities across cross-functional teams.
- Exposure to Generative AI (Gen AI) technologies and their business applications.

The job description is subject to change based on business/project requirements.
Work Location: Hyderabad
Shift Timings: 2 PM to 11 PM IST
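As a hedged illustration of the contact-matching logic this role would help refine, here is a naive fuzzy match between two CRM records using the standard library. The weighting, threshold, and record fields are illustrative assumptions; real MDM matching is far richer.

```python
# Toy fuzzy contact match; fields, weights, and threshold are hypothetical.
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    # Case- and whitespace-insensitive character-level similarity in [0, 1]
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

incoming = {"name": "Jon Smith", "email": "j.smith@example.com"}
existing = {"name": "John Smith", "email": "j.smith@example.com"}

score = 0.6 * similarity(incoming["name"], existing["name"]) \
      + 0.4 * similarity(incoming["email"], existing["email"])
print(f"match score: {score:.2f}")  # above some threshold, route to review/merge
```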
Posted 2 weeks ago
5.0 - 7.0 years
7 - 9 Lacs
Noida
Work from Office
Job Title: Sr. Data Engineer - Ontology & Knowledge Graph Specialist
Department: Platform Engineering

Summary:
We are seeking a highly skilled Data Engineer with expertise in ontology development and knowledge graph implementation. This role will be pivotal in shaping our data infrastructure and ensuring the accurate representation and integration of complex data sets. You will leverage industry best practices, including the Basic Formal Ontology (BFO) and Common Core Ontologies (CCO), to design, develop, and maintain ontologies, semantic and syntactic data models, and knowledge graphs on the Databricks Data Intelligence Platform that drive data-driven decision-making and innovation within the company.

Responsibilities:

Ontology Development:
- Design and implement ontologies based on BFO and CCO principles, ensuring alignment with business requirements and industry standards.
- Collaborate with domain experts to capture and formalize domain knowledge into ontological structures.
- Develop and maintain comprehensive ontologies to model various business entities, relationships, and processes.

Data Modeling:
- Design and implement semantic and syntactic data models that adhere to ontological principles.
- Create data models that are scalable, flexible, and adaptable to changing business needs.
- Integrate data models with existing data infrastructure and applications.

Knowledge Graph Implementation:
- Design and build knowledge graphs based on ontologies and data models.
- Develop algorithms and tools for knowledge graph population, enrichment, and maintenance.
- Utilize knowledge graphs to enable advanced analytics, search, and recommendation systems.

Data Quality and Governance:
- Ensure the quality, accuracy, and consistency of ontologies, data models, and knowledge graphs.
- Define and implement data governance processes and standards for ontology development and maintenance.

Collaboration and Communication:
- Work closely with data scientists, software engineers, and business stakeholders to understand their data requirements and provide tailored solutions.
- Communicate complex technical concepts clearly and effectively to diverse audiences.

Qualifications:

Education:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.

Experience:
- 5+ years of experience in data engineering or a related role.
- Proven experience in ontology development using BFO and CCO or similar ontological frameworks.
- Strong knowledge of semantic web technologies, including RDF, OWL, SPARQL, and SHACL.
- Proficiency in Python, SQL, and other programming languages used for data engineering.
- Experience with graph databases (e.g., TigerGraph, JanusGraph) and triple stores (e.g., GraphDB, Stardog) is a plus.

Desired Skills:
- Familiarity with machine learning and natural language processing techniques.
- Experience with cloud-based data platforms (e.g., AWS, Azure, GCP).
- Experience with Databricks technologies including Spark, Delta Lake, Iceberg, Unity Catalog, UniForm, and Photon.
- Strong problem-solving and analytical skills.
- Excellent communication and interpersonal skills.
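This posting also names SHACL; a hedged sketch of shape-based validation with pySHACL follows. The shape, data, and namespace are toy assumptions used only to show the mechanics.

```python
# Toy SHACL validation with pySHACL; shapes and data are illustrative.
from pyshacl import validate
from rdflib import Graph

shapes = Graph().parse(data="""
@prefix sh: <http://www.w3.org/ns/shacl#> .
@prefix ex: <http://example.org/ontology/> .
ex:OrgShape a sh:NodeShape ;
    sh:targetClass ex:Organization ;
    sh:property [ sh:path ex:hasName ; sh:minCount 1 ] .
""", format="turtle")

data = Graph().parse(data="""
@prefix ex: <http://example.org/ontology/> .
ex:acme a ex:Organization .   # missing ex:hasName, so validation should fail
""", format="turtle")

conforms, _, report = validate(data, shacl_graph=shapes)
print(conforms)   # False: the shape requires at least one ex:hasName
print(report)     # human-readable validation report
```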
Posted 2 weeks ago
7.0 - 10.0 years
10 - 14 Lacs
Gurugram
Work from Office
About the Job:
We are seeking a highly skilled and experienced Senior Data Engineer to join our dynamic team. In this pivotal role, you will be instrumental in driving our data engineering initiatives, with a strong emphasis on leveraging Dataiku's capabilities to enhance data processing and analytics. You will be responsible for designing, developing, and optimizing robust data pipelines, ensuring seamless integration of diverse data sources, and maintaining high data quality and accessibility to support our business intelligence and advanced analytics projects. This role requires a unique blend of expertise in traditional data engineering principles, advanced data modeling, and a forward-thinking approach to integrating cutting-edge AI technologies, particularly LLM Mesh for Generative AI applications. If you are passionate about building scalable data solutions and are eager to explore the cutting edge of AI, we encourage you to apply.

Key Responsibilities:
- Dataiku Leadership: Drive data engineering initiatives with a strong emphasis on leveraging Dataiku capabilities for data preparation, analysis, visualization, and the deployment of data solutions.
- Data Pipeline Development: Design, develop, and optimize robust and scalable data pipelines to support various business intelligence and advanced analytics projects. This includes developing and maintaining ETL/ELT processes to automate data extraction, transformation, and loading from diverse sources.
- Data Modeling & Architecture: Apply expertise in data modeling techniques to design efficient and scalable database structures, ensuring data integrity and optimal performance.
- ETL/ELT Expertise: Implement and manage ETL processes and tools to ensure efficient and reliable data flow, maintaining high data quality and accessibility.
- Gen AI Integration: Explore and implement solutions leveraging LLM Mesh for Generative AI applications, contributing to the development of innovative AI-powered features.
- Programming & Scripting: Utilize programming languages such as Python and SQL for data manipulation, analysis, automation, and the development of custom data solutions.
- Cloud Platform Deployment: Deploy and manage scalable data solutions on cloud platforms such as AWS or Azure, leveraging their respective services for optimal performance and cost-efficiency.
- Data Quality & Governance: Ensure seamless integration of data sources, maintaining high data quality, consistency, and accessibility across all data assets. Implement data governance best practices.
- Collaboration & Mentorship: Collaborate closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver impactful solutions. Potentially mentor junior team members.
- Performance Optimization: Continuously monitor and optimize the performance of data pipelines and data systems.

Required Skills & Experience:
- Proficiency in Dataiku: Demonstrable expertise in Dataiku for data preparation, analysis, visualization, and building end-to-end data pipelines and applications.
- Expertise in Data Modeling: Strong understanding and practical experience in various data modeling techniques (e.g., dimensional modeling, Kimball, Inmon) to design efficient and scalable database structures.
- ETL/ELT Processes & Tools: Extensive experience with ETL/ELT processes and a proven track record of using various ETL tools (e.g., Dataiku's built-in capabilities, Apache Airflow, Talend, SSIS, etc.).
- Familiarity with LLM Mesh: Familiarity with LLM Mesh or similar frameworks for Gen AI applications, understanding its concepts and potential for integration.
- Programming Languages: Strong proficiency in Python for data manipulation, scripting, and developing data solutions. Solid command of SQL for complex querying, data analysis, and database interactions.
- Cloud Platforms: Knowledge and hands-on experience with at least one major cloud platform (AWS or Azure) for deploying and managing scalable data solutions (e.g., S3, EC2, Azure Data Lake, Azure Synapse, etc.).
- Gen AI Concepts: Basic understanding of Generative AI concepts and their potential applications in data engineering.
- Problem-Solving: Excellent analytical and problem-solving skills with a keen eye for detail.
- Communication: Strong communication and interpersonal skills to collaborate effectively with cross-functional teams.

Bonus Points (Nice to Have):
- Experience with other big data technologies (e.g., Spark, Hadoop, Snowflake).
- Familiarity with data governance and data security best practices.
- Experience with MLOps principles and tools.
- Contributions to open-source projects related to data engineering or AI.

Education:
Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related quantitative field.
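As a hedged illustration of the ETL/ELT duties listed above, here is a minimal pandas pipeline: extract from a CSV, apply a transformation, and load into a warehouse-style table. File paths, column names, and the SQLite target are illustrative stand-ins, not the employer's stack.

```python
# Minimal ETL sketch; source, columns, and target are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

def run_pipeline(source_csv: str, target_table: str) -> int:
    raw = pd.read_csv(source_csv)                      # extract
    clean = (raw.dropna(subset=["customer_id"])        # transform: drop bad rows
                .assign(amount=lambda d: d["amount"].round(2)))
    engine = create_engine("sqlite:///warehouse.db")   # stand-in for a real DW
    clean.to_sql(target_table, engine, if_exists="append", index=False)  # load
    return len(clean)

# Example wiring (hypothetical file and table names):
# rows_loaded = run_pipeline("orders.csv", "fct_orders")
```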
Posted 2 weeks ago
7.0 - 10.0 years
5 - 8 Lacs
Gurugram
Remote
Location: Remote (India)
Employment Type: Contract (Remote)
Experience Required: 7+ Years

Job Summary:
We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and the business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience in reverse engineering legacy systems like Sailfish or DDMS using DBeaver.

Good to Have:
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven with the ability to work independently in a remote setup.
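For a concrete sense of the window-function SQL this role calls for, here is a hedged sketch that keeps the latest record per key via ROW_NUMBER(), run through the Snowflake Python connector. Credentials, schema, and table names are placeholders, not real connection details.

```python
# Hedged dedupe sketch via the Snowflake connector; all names are placeholders.
import snowflake.connector

DEDUPE_SQL = """
SELECT * FROM (
    SELECT t.*,
           ROW_NUMBER() OVER (PARTITION BY customer_id
                              ORDER BY updated_at DESC) AS rn
    FROM raw.customers t
) WHERE rn = 1
"""

conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>",  # placeholders
    warehouse="<warehouse>", database="<database>",
)
try:
    rows = conn.cursor().execute(DEDUPE_SQL).fetchall()
    print(f"kept {len(rows)} latest-per-customer rows")
finally:
    conn.close()
```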
Posted 2 weeks ago
8.0 - 13.0 years
8 - 13 Lacs
Telangana
Work from Office
Key Responsibilities:

Team Leadership:
- Lead and mentor a team of Azure Data Engineers, providing technical guidance and support.
- Foster a collaborative and innovative team environment.
- Conduct regular performance reviews and set development goals for team members.
- Organize training sessions to enhance team skills and technical capabilities.

Azure Data Platform:
- Design, implement, and optimize scalable data solutions using Azure data services such as Azure Databricks, Azure Data Factory, Azure SQL Database, and Azure Synapse Analytics.
- Ensure data engineering best practices and data governance are followed.
- Stay up to date with Azure data technologies and recommend improvements to enhance data processing capabilities.

Data Architecture:
- Collaborate with data architects to design efficient and scalable data architectures.
- Define data modeling standards and ensure data integrity, security, and governance compliance.

Project Management:
- Work with project managers to define project scope, goals, and deliverables.
- Develop project timelines, allocate resources, and track progress.
- Identify and mitigate risks to ensure successful project delivery.

Collaboration & Communication:
- Collaborate with cross-functional teams including data scientists, analysts, and business stakeholders to deliver data-driven solutions.
- Communicate effectively with stakeholders to understand requirements and provide updates.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Team Lead or Manager in data engineering.
- Extensive experience with Azure data services and cloud technologies.
- Expertise in Azure Databricks, PySpark, and SQL.
- Strong understanding of data engineering best practices, data modeling, and ETL processes.
- Experience with agile development methodologies.
- Certifications in Azure data services (preferred).

Preferred Skills:
- Experience with big data technologies and data warehousing solutions.
- Familiarity with industry standards and compliance requirements.
- Ability to lead and mentor a team.
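As a hedged illustration of routine PySpark work on a platform like Azure Databricks, here is a small aggregation with a basic quality gate. Table names and columns are assumptions; on Databricks a `spark` session is provided by the runtime.

```python
# Toy PySpark aggregation; tables and columns are illustrative.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

orders = spark.table("bronze.orders")          # hypothetical source table
daily = (orders
         .filter(F.col("amount") > 0)          # basic data-quality gate
         .groupBy(F.to_date("order_ts").alias("order_date"))
         .agg(F.sum("amount").alias("revenue"),
              F.countDistinct("customer_id").alias("buyers")))

# Persist the modeled table for downstream BI consumers
daily.write.mode("overwrite").saveAsTable("silver.daily_revenue")
```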
Posted 2 weeks ago
7.0 - 10.0 years
0 - 1 Lacs
Bengaluru
Work from Office
Data Governance Engineer

Level of Experience: 8+ years of experience

Must-Have Skillset:
- Hands-on experience with Data Governance: Data Governance encompasses various components such as Data Control, Data Privacy, Data Ethics, and Data Strategy. While the candidate may not have experience in all areas, hands-on experience in at least one and an understanding of the others is essential.
- An existing understanding of EDM/DAMA/DCAM would be very useful.
- Experience in multi-team global collaboration: The CoE team is central to multiple global teams in International. The candidate should be adept at navigating these complexities.
- Experience with strategic initiatives: The I-DM CoE, particularly the Data Governance segment, is a strategic team within International. Prior experience with strategic solutioning is crucial, whereas experience in delivery roles may not be suitable.
- Strong communication skills.

Good-to-Have Skillset:
- Pharma background, as the enterprise data landscape in the pharma industry differs from other domains.
- Experience working with non-US clients.
- Consulting background.

Our Commitment to Diversity & Inclusion:
Did you know that Apexon has been Certified by Great Place To Work®, the global authority on workplace culture, in each of the three regions in which it operates: USA (for the fourth time in 2023), India (seven consecutive certifications as of 2023), and the UK. Apexon is committed to being an equal opportunity employer and promoting diversity in the workplace. We take affirmative action to ensure equal employment opportunity for all qualified individuals. Apexon strictly prohibits discrimination and harassment of any kind and provides equal employment opportunities to employees and applicants without regard to gender, race, color, ethnicity or national origin, age, disability, religion, sexual orientation, gender identity or expression, veteran status, or any other applicable characteristics protected by law. You can read about our Job Applicant Privacy policy here: Job Applicant Privacy Policy (apexon.com).

Our Perks and Benefits:
Our benefits and rewards program has been thoughtfully designed to recognize your skills and contributions, elevate your learning/upskilling experience, and provide care and support for you and your loved ones. As an Apexon Associate, you get continuous skill-based development, opportunities for career advancement, and access to comprehensive health and well-being benefits and assistance. We also offer:
- Group Health Insurance covering a family of 4
- Term Insurance and Accident Insurance
- Paid Holidays & Earned Leaves
- Paid Parental Leave
- Learning & Career Development
- Employee Wellness

About Apexon:
Apexon is a digital-first technology services firm specializing in accelerating business transformation and delivering human-centric digital experiences. We have been meeting customers wherever they are in the digital lifecycle and helping them outperform their competition through speed and innovation. Apexon brings together distinct core competencies in AI, analytics, app development, cloud, commerce, CX, data, DevOps, IoT, mobile, quality engineering, and UX, and our deep expertise in BFSI, healthcare, and life sciences to help businesses capitalize on the unlimited opportunities digital offers. Our reputation is built on a comprehensive suite of engineering services, a dedication to solving clients' toughest technology problems, and a commitment to continuous improvement.
Backed by Goldman Sachs Asset Management and Everstone Capital, Apexon now has a global presence of 15 offices (and 10 delivery centers) across four continents. We enable #HumanFirstDigital
Posted 2 weeks ago
12.0 - 20.0 years
35 - 45 Lacs
Gurugram
Work from Office
Role & Responsibilities

Type of profile we are looking for:
- 10+ years of experience driving large data programs in Banking.
- 8-10 years of experience implementing Data Governance frameworks.
- In-depth understanding of RDAR, BCBS 239, and financial & non-financial risks.
- Experience in data engineering and a good understanding of ETL & data platforms.
- Experience in risk regulatory & data programs.
- Experience creating data architectures in GCP.
- Working knowledge of Databricks.
- BFS domain experience is a must.
- Good communication skills.
- Must visit the office 3 days a week.

Key day-to-day responsibilities of the candidate:
- Work with client technology partners.
- Be the link between the engineering team & business stakeholders.
- Take reporting & data aggregation requirements from business and liaise with Tech to integrate the logical data models into Datahub.
- Assist the client tech teams in building the new data platform.
- Experience in building data models, quality controls, and data profiling.

Good to have:
- Understanding of ServiceNow.
- Worked on BCBS 239.
- Project management experience.
Posted 2 weeks ago
15.0 - 20.0 years
18 - 22 Lacs
Noida
Remote
Contract: 6 months

Position Summary:
We are seeking an experienced SAP Data Architect with a strong background in enterprise data management, SAP S/4HANA architecture, and data integration. This role demands deep expertise in designing and governing data structures, ensuring data consistency, and leading data transformation initiatives across large-scale SAP implementations.

Key Responsibilities:
- Lead the data architecture design across multiple SAP modules and legacy systems.
- Define data governance strategies, master data management (MDM), and metadata standards.
- Architect data migration strategies for SAP S/4HANA projects, including ETL, data validation, and reconciliation.
- Develop data models (conceptual, logical, and physical) aligned with business and technical requirements.
- Collaborate with functional and technical teams to ensure integration across SAP and non-SAP platforms.
- Establish data quality frameworks and monitoring practices.
- Conduct impact assessments and ensure scalability of the data architecture.
- Support reporting and analytics requirements through structured data delivery frameworks (e.g., SAP BW, SAP HANA).

Required Qualifications:
- 15+ years of experience in enterprise data architecture, with 8+ years in SAP landscapes.
- Proven experience with SAP S/4HANA data models, SAP Datasphere, SAP HANA Cloud, and SAC, and integrating these with AWS Data Lake (S3).
- Strong knowledge of data migration tools (SAP Data Services, LTMC/LSMW, BODS, etc.).
- Expertise in data governance, master data strategy, and data lifecycle management.
- Experience with cloud data platforms (Azure, AWS, or GCP) is a plus.
- Strong analytical and communication skills to work across business and IT stakeholders.

Preferred Certifications:
- SAP Certified Technology Associate (SAP S/4HANA / Datasphere)
- TOGAF or other Enterprise Architecture certifications
- ITIL Foundation (for process alignment)
Posted 2 weeks ago
8.0 - 12.0 years
10 - 14 Lacs
Noida
Work from Office
Role Responsibilities:
- Develop and design comprehensive Power BI reports and dashboards.
- Collaborate with stakeholders to understand reporting needs and translate them into functional requirements.
- Create visually appealing interfaces using Figma for an enhanced user experience.
- Utilize SQL for data extraction and manipulation to support reporting requirements.
- Implement DAX measures to ensure accurate data calculations.
- Conduct data analysis to derive actionable insights and facilitate decision-making.
- Perform user acceptance testing (UAT) to validate report performance and functionality.
- Provide training and support for end-users on dashboards and reporting tools.
- Monitor and enhance the performance of existing reports on an ongoing basis.
- Work closely with cross-functional teams to align project objectives with business goals.
- Maintain comprehensive documentation for all reporting activities and processes.
- Stay updated on industry trends and best practices related to data visualization and analytics.
- Ensure compliance with data governance and security standards.
- Participate in regular team meetings to discuss project progress and share insights.
- Assist in the development of training materials for internal stakeholders.

Qualifications:
- Minimum 8 years of experience in Power BI and Figma.
- Strong proficiency in SQL and database management.
- Extensive knowledge of data visualization best practices.
- Expertise in DAX for creating advanced calculations.
- Proven experience in designing user interfaces with Figma.
- Excellent analytical and problem-solving skills.
- Ability to communicate complex data insights to non-technical stakeholders.
- Strong attention to detail and commitment to quality.
- Experience with business analytics and reporting tools.
- Familiarity with data governance and compliance regulations.
- Ability to work independently and as part of a team in a remote setting.
- Strong time management skills and ability to prioritize tasks.
- Ability to adapt to fast-paced working environments.
- Strong interpersonal skills and stakeholder engagement capability.
- Relevant certifications in Power BI or data analytics are a plus.
Posted 2 weeks ago
7.0 - 10.0 years
5 - 8 Lacs
Noida
Remote
Job Summary:
We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and the business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience in reverse engineering legacy systems like Sailfish or DDMS using DBeaver.

Good to Have:
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven with the ability to work independently in a remote setup.
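As a hedged sketch of the validation duty named above, here is a simple source-to-target row-count reconciliation. The connection setup is elided; `run_query` stands in for any DB-API-style query function, and the table names are placeholders.

```python
# Toy reconciliation check; table names and the query callable are hypothetical.
def reconcile(run_query, source: str, target: str) -> bool:
    src = run_query(f"SELECT COUNT(*) FROM {source}")[0][0]
    tgt = run_query(f"SELECT COUNT(*) FROM {target}")[0][0]
    if src != tgt:
        print(f"MISMATCH: {source}={src} rows, {target}={tgt} rows")
        return False
    print(f"OK: {src} rows in both {source} and {target}")
    return True

# Example wiring with any DB-API connection (illustrative):
# reconcile(lambda sql: conn.cursor().execute(sql).fetchall(),
#           "raw.orders", "analytics.fct_orders")
```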
Posted 2 weeks ago
5.0 - 10.0 years
7 - 11 Lacs
Noida
Work from Office
Position Overview:
We are seeking an experienced Data Catalog Lead to lead the implementation and ongoing development of an enterprise data catalog using Collibra. This role focuses specifically on healthcare payer industry requirements, including complex regulatory compliance, member data privacy, and the multi-system data integration challenges unique to health plan operations.

Key Responsibilities:

Data Catalog Implementation & Development:
- Configure and customize Collibra workflows, data models, and governance processes to support health plan business requirements.
- Develop automated data discovery and cataloging processes for healthcare data assets including claims, eligibility, provider networks, and member information.
- Design and implement data lineage tracking across complex healthcare data ecosystems spanning core administration systems, data warehouses, and analytics platforms.

Healthcare-Specific Data Governance:
- Build specialized data catalog structures for healthcare data domains including medical coding systems (ICD-10, CPT, HCPCS), pharmacy data (NDC codes), and provider taxonomies.
- Configure data classification and sensitivity tagging for PHI (Protected Health Information) and PII data elements in compliance with HIPAA requirements.
- Implement data retention and privacy policies within Collibra that align with healthcare regulatory requirements and member consent management.
- Develop metadata management processes for regulatory reporting datasets (HEDIS, Medicare Stars, MLR reporting, risk adjustment).

Technical Integration & Automation:
- Integrate Collibra with healthcare payer core systems including claims processing platforms, eligibility systems, provider directories, and clinical data repositories.
- Implement automated data quality monitoring and profiling processes that populate the data catalog with technical and business metadata.
- Configure Collibra's REST APIs to enable integration with existing data governance tools and business intelligence platforms.

Required Qualifications:

Collibra Platform Expertise:
- 8+ years of hands-on experience with Collibra Data Intelligence Cloud platform implementation and administration.
- Expert knowledge of Collibra's data catalog, data lineage, and data governance capabilities.
- Proficiency in Collibra workflow configuration, custom attribute development, and role-based access control setup.
- Experience with Collibra Connect for automated metadata harvesting and system integration.
- Strong understanding of Collibra's REST APIs and custom development capabilities.

Healthcare Payer Industry Knowledge:
- 4+ years of experience working with healthcare payer/health plan data environments.
- Deep understanding of healthcare data types including claims (professional, institutional, pharmacy), eligibility, provider data, and member demographics.
- Knowledge of healthcare industry standards including HL7, X12 EDI transactions, and FHIR specifications.
- Familiarity with healthcare regulatory requirements (HIPAA, ACA, Medicare Advantage, Medicaid managed care).
- Understanding of healthcare coding systems (ICD-10-CM/PCS, CPT, HCPCS, NDC, SNOMED CT).

Technical Skills:
- Strong SQL skills and experience with healthcare databases (claims databases, clinical data repositories, member systems).
- Knowledge of cloud platforms (AWS, Azure, GCP) and their integration with Collibra cloud services.
- Understanding of data modeling principles and healthcare data warehouse design patterns.

Data Governance & Compliance:
- Experience implementing data governance frameworks in regulated healthcare environments.
- Knowledge of data privacy regulations (HIPAA, state privacy laws) and their implementation in data catalog tools.
- Understanding of data classification, data quality management, and master data management principles.
- Experience with audit trail requirements and compliance reporting in healthcare organizations.

Preferred Qualifications:

Advanced Healthcare Experience:
- Experience with specific health plan core systems (such as HealthEdge, Facets, QNXT, or similar platforms).
- Knowledge of Medicare Advantage, Medicaid managed care, or commercial health plan operations.
- Understanding of value-based care arrangements and their data requirements.
- Experience with clinical data integration and population health analytics.

Technical Certifications & Skills:
- Collibra certification (Data Citizen, Data Steward, or Technical User).
- Experience with additional data catalog tools (Alation, Apache Atlas, IBM Watson Knowledge Catalog).
- Knowledge of data virtualization tools and their integration with data catalog platforms.
- Experience with healthcare interoperability standards and API management.
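To illustrate the REST-API integration work mentioned above: Collibra does expose a REST API, but the endpoint path, auth scheme, and response shape in this sketch are assumptions for illustration only; a real integration should follow the instance's own API documentation.

```python
# Hedged sketch of querying a catalog over REST; host, path, and fields
# are placeholders/assumptions, not a verified Collibra contract.
import requests

BASE = "https://<your-instance>.collibra.com/rest/2.0"   # placeholder host

def find_assets(session: requests.Session, name: str) -> list[dict]:
    resp = session.get(f"{BASE}/assets", params={"name": name, "limit": 50})
    resp.raise_for_status()
    return resp.json().get("results", [])

with requests.Session() as s:
    s.auth = ("<user>", "<password>")   # placeholder credentials
    for asset in find_assets(s, "Member Claims"):
        print(asset.get("id"), asset.get("name"))
```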
Posted 2 weeks ago
7.0 - 10.0 years
10 - 14 Lacs
Noida
Work from Office
About the Job:
We are seeking a highly skilled and experienced Senior Data Engineer to join our dynamic team. In this pivotal role, you will be instrumental in driving our data engineering initiatives, with a strong emphasis on leveraging Dataiku's capabilities to enhance data processing and analytics. You will be responsible for designing, developing, and optimizing robust data pipelines, ensuring seamless integration of diverse data sources, and maintaining high data quality and accessibility to support our business intelligence and advanced analytics projects. This role requires a unique blend of expertise in traditional data engineering principles, advanced data modeling, and a forward-thinking approach to integrating cutting-edge AI technologies, particularly LLM Mesh for Generative AI applications. If you are passionate about building scalable data solutions and are eager to explore the cutting edge of AI, we encourage you to apply.

Key Responsibilities:
- Dataiku Leadership: Drive data engineering initiatives with a strong emphasis on leveraging Dataiku capabilities for data preparation, analysis, visualization, and the deployment of data solutions.
- Data Pipeline Development: Design, develop, and optimize robust and scalable data pipelines to support various business intelligence and advanced analytics projects. This includes developing and maintaining ETL/ELT processes to automate data extraction, transformation, and loading from diverse sources.
- Data Modeling & Architecture: Apply expertise in data modeling techniques to design efficient and scalable database structures, ensuring data integrity and optimal performance.
- ETL/ELT Expertise: Implement and manage ETL processes and tools to ensure efficient and reliable data flow, maintaining high data quality and accessibility.
- Gen AI Integration: Explore and implement solutions leveraging LLM Mesh for Generative AI applications, contributing to the development of innovative AI-powered features.
- Programming & Scripting: Utilize programming languages such as Python and SQL for data manipulation, analysis, automation, and the development of custom data solutions.
- Cloud Platform Deployment: Deploy and manage scalable data solutions on cloud platforms such as AWS or Azure, leveraging their respective services for optimal performance and cost-efficiency.
- Data Quality & Governance: Ensure seamless integration of data sources, maintaining high data quality, consistency, and accessibility across all data assets. Implement data governance best practices.
- Collaboration & Mentorship: Collaborate closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver impactful solutions. Potentially mentor junior team members.
- Performance Optimization: Continuously monitor and optimize the performance of data pipelines and data systems.

Required Skills & Experience:
- Proficiency in Dataiku: Demonstrable expertise in Dataiku for data preparation, analysis, visualization, and building end-to-end data pipelines and applications.
- Expertise in Data Modeling: Strong understanding and practical experience in various data modeling techniques (e.g., dimensional modeling, Kimball, Inmon) to design efficient and scalable database structures.
- ETL/ELT Processes & Tools: Extensive experience with ETL/ELT processes and a proven track record of using various ETL tools (e.g., Dataiku's built-in capabilities, Apache Airflow, Talend, SSIS, etc.).
- Familiarity with LLM Mesh: Familiarity with LLM Mesh or similar frameworks for Gen AI applications, understanding its concepts and potential for integration.
- Programming Languages: Strong proficiency in Python for data manipulation, scripting, and developing data solutions. Solid command of SQL for complex querying, data analysis, and database interactions.
- Cloud Platforms: Knowledge and hands-on experience with at least one major cloud platform (AWS or Azure) for deploying and managing scalable data solutions (e.g., S3, EC2, Azure Data Lake, Azure Synapse, etc.).
- Gen AI Concepts: Basic understanding of Generative AI concepts and their potential applications in data engineering.
- Problem-Solving: Excellent analytical and problem-solving skills with a keen eye for detail.
- Communication: Strong communication and interpersonal skills to collaborate effectively with cross-functional teams.

Bonus Points (Nice to Have):
- Experience with other big data technologies (e.g., Spark, Hadoop, Snowflake).
- Familiarity with data governance and data security best practices.
- Experience with MLOps principles and tools.
- Contributions to open-source projects related to data engineering or AI.

Education:
Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related quantitative field.
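As a hedged illustration of the dimensional-modeling (Kimball-style) requirement above, here is a toy star schema with one fact and one dimension table. The DDL, names, and grain are assumptions; SQLite stands in for a real warehouse.

```python
# Toy star-schema DDL; table names, columns, and grain are illustrative.
import sqlite3

STAR_SCHEMA_DDL = """
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,   -- surrogate key
    customer_id  TEXT NOT NULL,         -- natural/business key
    segment      TEXT
);

CREATE TABLE fct_orders (
    order_key    INTEGER PRIMARY KEY,
    customer_key INTEGER REFERENCES dim_customer (customer_key),
    order_date   DATE NOT NULL,
    amount       NUMERIC(12, 2)         -- one row per order (the grain)
);
"""

with sqlite3.connect(":memory:") as conn:   # any warehouse would do
    conn.executescript(STAR_SCHEMA_DDL)
    print("star schema created")
```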
Posted 2 weeks ago
4.0 - 7.0 years
15 - 18 Lacs
Bhubaneswar, Coimbatore, Bengaluru
Work from Office
Role & Responsibilities

The candidate must have deep expertise in data management maturity models, data governance frameworks, and regulatory requirements, ensuring businesses can maximize their data assets while complying with both local and international regulations. This is an exciting opportunity to work in a consulting environment, collaborating with industry leaders and driving data-driven business transformation. This role is based in India, with the expectation of traveling to Middle Eastern client locations as required.

1. Data Strategy & Advisory
- Develop and implement enterprise-wide data strategies aligned with business objectives.
- Assess data maturity levels using industry-standard frameworks and define roadmaps for data-driven transformation.
- Advise clients on data monetization, data quality, and data lifecycle management.

2. Data Governance & Compliance
- Define and implement data governance frameworks, policies, and best practices.
- Ensure compliance with local and international data regulations, including GDPR, HIPAA, and region-specific laws.
- Develop data stewardship programs, ensuring clear roles and responsibilities for data management.

3. Regulatory & Risk Management
- Provide expertise on data privacy, security, and risk management strategies.
- Align data strategies with regulatory frameworks such as ISO 27001, NIST, and other industry-specific compliance standards.
- Advise on data sovereignty and cross-border data transfer policies.

4. Consulting & Pre-Sales Support
- Conduct client workshops to define data strategy and governance models.
- Develop thought leadership, whitepapers, and strategic insights to support client engagements.
- Assist in business development efforts, including proposals and pre-sales discussions.

5. Team Mentorship & Leadership
- Mentor junior consultants on data governance and strategic advisory.
- Stay updated on emerging trends in data strategy, regulations, and governance technologies.
- Represent the company at industry events, conferences, and knowledge-sharing forums.

Preferred Candidate Profile

1. Education & Experience
- Bachelor's or Master's degree in Data Management, Business Analytics, Information Systems, or a related field.
- 5 years of experience in data strategy, governance, or regulatory compliance consulting.

2. Technical & Regulatory Expertise
- Deep understanding of data management maturity models (e.g., DAMA-DMBOK, CMMI for Data Management); should be DAMA certified.
- Basic proficiency in data governance tools such as Collibra, Informatica, or Azure Purview.
- Strong knowledge of local and international data regulations (e.g., GDPR, CCPA, PDPA, UAE's NDPL, KSA-NDMO, UAE DGE Data Regulations, Dubai Data Law).
Posted 2 weeks ago
9.0 - 12.0 years
16 - 21 Lacs
Pune
Hybrid
So, what's the role all about?

As a Program Manager, you will be responsible for overseeing multiple projects and initiatives that support the organization's strategic goals. You will work closely with cross-functional teams to ensure successful project execution, on-time delivery, and adherence to quality standards.

How will you make an impact?

- Define project scope, goals, and deliverables that support business goals, in collaboration with senior management and stakeholders.
- Develop and maintain a detailed project plan to track progress and ensure timely delivery of project milestones.
- Monitor project progress and performance, identify and mitigate risks and issues, and communicate status updates to stakeholders and senior management.
- Collaborate with cross-functional teams to identify and resolve project-related issues and roadblocks.
- Take end-to-end Agile project management responsibility for scope, quality, resources, and risk management, as well as timeline and organizational release readiness.
- Develop and maintain strong relationships with key stakeholders to ensure project success and alignment with business objectives.
- Ensure adherence to project management methodologies, standards, and best practices, and continuously improve project management processes and tools.
- Lead project meetings and presentations, and facilitate communication and collaboration among team members and stakeholders.

Have you got what it takes?

- 10-14 years of experience in the IT industry, with 5+ years of hands-on experience in software development project and program management.
- Strong understanding of project management methodologies, tools, and techniques.
- Proven track record of successfully managing multiple projects and initiatives simultaneously.
- Excellent communication, negotiation, and interpersonal skills.
- Ability to work collaboratively with cross-functional teams and manage multiple stakeholders.
- Strong attention to detail and ability to manage competing priorities.
- Working knowledge of methodologies such as Agile-Scrum practices.
- Ability to drive project decisions through strong data governance and metrics.
- Strong problem-solving and decision-making skills.
- Hands-on knowledge of and experience with software development and quality processes and standards, release management, and pre- and post-production product launches.
- Hands-on experience with Atlassian tools (JIRA/Confluence); basic knowledge of cloud (AWS) and DevOps practices.
- PMP certification preferred.

What's in it for you?

Join an ever-growing, market-disrupting, global company where the teams – comprised of the best of the best – work in a fast-paced, collaborative, and creative environment! As the market leader, every day at NICE is a chance to learn and grow, and there are endless internal career opportunities across multiple roles, disciplines, domains, and locations. If you are passionate, innovative, and excited to constantly raise the bar, you may just be our next NICEr!

Enjoy NICE-FLEX!

At NICE, we work according to the NICE-FLEX hybrid model, which enables maximum flexibility: 2 days working from the office and 3 days of remote work, each week. Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere.

Reporting into: Tech Manager, Program Management
Role Type: Individual Contributor
Posted 2 weeks ago
4.0 - 8.0 years
8 - 12 Lacs
Pune
Hybrid
So, what's the role all about?

We are looking for a highly driven and technically skilled Software Engineer to lead the integration of various Content Management Systems with AWS Knowledge Hub, enabling advanced Retrieval-Augmented Generation (RAG) search across heterogeneous customer data – without requiring data duplication. This role will also be responsible for expanding the scope of Knowledge Hub to support non-traditional knowledge items and enhance customer self-service capabilities. You will work at the intersection of AI, search infrastructure, and developer experience to make enterprise knowledge instantly accessible, actionable, and AI-ready. (A brief illustrative sketch of a RAG retrieval step follows this listing.)

How will you make an impact?

- Integrate CMS with AWS Knowledge Hub to allow seamless RAG-based search across diverse data types, eliminating the need to copy data into Knowledge Hub instances.
- Extend Knowledge Hub capabilities to ingest and index non-knowledge assets, including structured data, documents, tickets, logs, and other enterprise sources.
- Build secure, scalable connectors to read directly from customer-maintained indices and data repositories.
- Enable self-service capabilities for customers to manage content sources using AppFlow or Tray.ai, configure ingestion rules, and set up search parameters independently.
- Collaborate with the NLP/AI team to optimize relevance and performance for RAG search pipelines.
- Work closely with product and UX teams to design intuitive, powerful experiences around self-service data onboarding and search configuration.
- Implement data governance, access control, and observability features to ensure enterprise readiness.

Have you got what it takes?

- Proven experience with search infrastructure, RAG pipelines, and LLM-based applications.
- 5+ years' hands-on experience with AWS Knowledge Hub, AppFlow, Tray.ai, or equivalent cloud-based indexing/search platforms.
- Strong backend development skills (Python, TypeScript/Node.js, .NET/Java) and familiarity with building and consuming REST APIs.
- Knowledge of Infrastructure as Code (IaC) services such as AWS CloudFormation and CDK.
- Deep understanding of data ingestion pipelines, index management, and search query optimization.
- Experience working with unstructured and semi-structured data in real-world enterprise settings.
- Ability to design for scale, security, and multi-tenant environments.

What's in it for you?

Join an ever-growing, market-disrupting, global company where the teams – comprised of the best of the best – work in a fast-paced, collaborative, and creative environment! As the market leader, every day at NICE is a chance to learn and grow, and there are endless internal career opportunities across multiple roles, disciplines, domains, and locations. If you are passionate, innovative, and excited to constantly raise the bar, you may just be our next NICEr!

Enjoy NICE-FLEX!

At NICE, we work according to the NICE-FLEX hybrid model, which enables maximum flexibility: 2 days working from the office and 3 days of remote work, each week. Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere.

Reporting into: Tech Manager, Engineering, CX
Role Type: Individual Contributor
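As a rough, hypothetical sketch of the RAG retrieval step this role centres on: the `search_customer_index` stub stands in for a real connector that queries a customer-maintained index in place; no actual Knowledge Hub API is used or implied here.

```python
from dataclasses import dataclass

@dataclass
class Passage:
    source: str
    text: str
    score: float

def search_customer_index(query: str) -> list[Passage]:
    """Stub for a connector that reads directly from a customer-maintained
    index rather than copying documents into a central store. A real
    connector would score passages against the query."""
    corpus = [
        Passage("ticket-482", "Reset the device by holding the power button.", 0.91),
        Passage("kb-article-7", "Firmware updates are applied automatically.", 0.64),
        Passage("log-2024-01", "Unrelated maintenance window notes.", 0.12),
    ]
    return sorted(corpus, key=lambda p: p.score, reverse=True)

def build_rag_prompt(query: str, top_k: int = 2) -> str:
    """Retrieve the best-scoring passages and assemble a grounded prompt
    for an LLM; source tags let the answer cite where facts came from."""
    passages = search_customer_index(query)[:top_k]
    context = "\n".join(f"[{p.source}] {p.text}" for p in passages)
    return (
        f"Answer using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {query}"
    )

print(build_rag_prompt("How do I reset the device?"))
```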
Posted 2 weeks ago
8.0 - 13.0 years
13 - 17 Lacs
Mumbai
Work from Office
Project description

Our client is a leading commodity trading and logistics company. They are committed to building and maintaining world-class IT applications and infrastructure. The Trading IT group directly supports the trading business, and this business has started a far-reaching programme to enhance and improve its trading applications using an innovative architecture to support business growth across the full range of business lines and geographies, and to enable the sharing of systems across different businesses. This programme is aimed at delivering functional capabilities, enhancements, and technical infrastructure upgrades to enable continued business growth and enhanced profitability for the firm. The client is looking to replace the existing reconciliation system, Gresham, with Exceptor, which will be an enterprise-wide reconciliation platform across FO, MO, and BO. (A brief illustrative sketch of a basic reconciliation check follows this listing.)

Responsibilities

a) Determine and define project scope and objectives.
b) Predict the resources needed to reach objectives and manage those resources effectively and efficiently.
c) Develop and manage a detailed project schedule and work plan.
d) Provide consistent project updates to various stakeholders about strategy, adjustments, and progress.
e) Manage vendor and stakeholder tasks and communicate expected deliverables.
f) Utilize industry best practices, techniques, and standards throughout project execution.
g) Monitor progress and make adjustments as needed.
h) Measure project performance to identify areas for improvement.
i) Maintain the roadmap and resource allocation/utilization.

Skills

Must have – Knowledge & Experience:

- Overall 8+ years of experience, of which at least 5 years in the OTC derivatives space.
- Minimum 5 years of experience as a project manager.
- Knows how to handle project complexity in terms of stakeholder management, conflict management, change management, etc.
- Understands concepts such as static data, industry codes, data governance and control, as well as financial reporting.
- Has worked in a finance department and understands basic reporting concepts.
- Experience working in team engagements to finalize new operating models and roadmaps for change across people, process, data, and technology.
- Reviews processes, bypasses, and challenges ahead, and proposes a proxy approach.
- Adaptable to an evolving scope of tasks; comfortable with uncertainty as well as changing global requirements.
- Leads by example on change management best practice for initiatives driven by the workstreams.
- Familiarity with Agile methodologies.
- Knowledge of project planning tools; familiar with and able to apply project management methodologies (for example, PMI, PRINCE2, and Agile).
- Good understanding of current and emerging technologies and how other enterprises are employing them to drive digital business.
- Exceptional verbal and written communication skills; expertise in setting and managing customer expectations.
- Distinctive blend of business, IT, financial, and communication skills, as this is a highly visible position with substantial impact.
- Effective influencing and negotiating skills in an environment where this role may not directly control resources.

Nice to have: Prior experience in reconciliation.

Other Languages: English – C2 Proficient
Seniority: Senior
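For context on what a reconciliation platform does, here is a minimal, hypothetical sketch of a two-way trade reconciliation – matching records by trade ID and flagging breaks. The records and tolerance are illustrative only and do not reflect Gresham or Exceptor specifics.

```python
# Minimal two-way reconciliation: match trades by ID across two systems
# and flag breaks (missing records, or amount mismatches beyond tolerance).
internal = {"T1": 100.00, "T2": 250.50, "T3": 75.25}   # trade_id -> amount
external = {"T1": 100.00, "T2": 250.75, "T4": 60.00}

TOLERANCE = 0.01  # illustrative matching tolerance

breaks = []
for trade_id in sorted(internal.keys() | external.keys()):
    ours, theirs = internal.get(trade_id), external.get(trade_id)
    if ours is None:
        breaks.append((trade_id, "missing internally"))
    elif theirs is None:
        breaks.append((trade_id, "missing externally"))
    elif abs(ours - theirs) > TOLERANCE:
        breaks.append((trade_id, f"amount break: {ours} vs {theirs}"))

for trade_id, reason in breaks:
    print(f"{trade_id}: {reason}")
# T2: amount break: 250.5 vs 250.75
# T3: missing externally
# T4: missing internally
```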
Posted 2 weeks ago
3.0 - 10.0 years
0 Lacs
thane, maharashtra
On-site
We are seeking a dynamic and experienced individual to manage Call Centre operations for service as a Contact Centre Manager. As the Contact Centre Manager, you will be responsible for optimizing Customer Response Centre (CRC) processes, leading a team to achieve service excellence, and managing added responsibilities such as Warranty Administration and Service Master Data management. Your role will involve ensuring compliance with manufacturer and company policies, maintaining accurate records, and facilitating excellent customer support both internally and externally. Your strong interpersonal skills will enable you to build and maintain positive relationships with colleagues, clients, and stakeholders, fostering a collaborative and supportive work environment.

To qualify for this role, you should have a Bachelor's degree in Business Administration, Electronics and Telecommunications, Electrical Engineering, or a related field, along with at least 10 years of experience in contact centre operations, including a minimum of 3 years in a managerial role.

Your key responsibilities will include overseeing the daily operations of the subcontracted contact centre, developing customer service strategies, coaching and managing a team of customer service representatives, monitoring key performance indicators, handling escalated customer issues, analyzing call centre data, developing training programs, ensuring compliance with company policies and industry regulations, and working closely with service and IT teams to improve customer support processes.

In addition, you will be responsible for reviewing extended warranty claims, communicating with Service Engineers and manufacturers, tracking and monitoring warranty claims, maintaining detailed records, assisting customers and internal teams with warranty-related inquiries, staying updated on internal policies and warranty guidelines, and supporting service department operations as needed. You will also be involved in developing, implementing, and maintaining master data management policies, collaborating with cross-functional teams, managing data lifecycle processes, resolving data quality issues, enforcing data governance frameworks, generating reports from master data, providing training to business users, and using enterprise resource planning tools to log and track warranty claims and service requests.

To excel in this role, you should have proven experience in contact centre management or a similar leadership role, a strong understanding of customer service principles and call centre technologies, excellent leadership and team-building skills, the ability to analyze data and make strategic decisions, proficiency in Oracle E-Business Suite, call centre software, and workforce management tools, and the ability to handle high-pressure situations and multitask effectively.

Furthermore, you should possess good domain knowledge of the field service and service sales domains, including an understanding of Service Level Agreements (SLAs), Key Performance Indicators (KPIs), service processes, and sales processes, along with problem-solving skills and critical thinking. Your soft skills should include strong communication and presentation skills, collaboration skills, attention to detail, curiosity, continuous learning, and the ability to work in an interruption-driven environment.
Travel may be required up to 5% (domestic and international), and the successful candidate will be expected to embrace Vertiv's Core Principles & Behaviors to help execute the company's Strategic Priorities. Please note that Vertiv will only employ those who are legally authorized to work in the United States, and this position does not offer sponsorship for work authorization.
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
As a Data Quality Engineer, you will collaborate with product, engineering, and customer teams to gather requirements and develop a comprehensive data quality strategy. You will lead data governance processes, including data preparation, obfuscation, integration, slicing, and quality control. Testing data pipelines, ETL processes, APIs, and system performance to ensure reliability and accuracy will be a key responsibility. Additionally, you will prepare test data sets, conduct data profiling, and perform benchmarking to identify inconsistencies or inefficiencies.

Creating and implementing strategies to verify the quality of data products and ensuring alignment with business standards will be crucial. You will set up data quality environments and applications in compliance with defined standards, contributing to CI/CD process improvements. Participation in the design and maintenance of data platforms, as well as building automation frameworks for data quality testing and resolving potential issues, will be part of your role (a brief illustrative sketch of such automated checks follows this listing). Providing support in troubleshooting data-related issues to ensure timely resolution is also expected. It is essential to ensure that all data quality processes and tools align with organizational goals and industry best practices. Collaboration with stakeholders to enhance data platforms and optimize data quality workflows will be necessary to drive success in this role.

Requirements:
- Bachelor's degree in Computer Science or a related technical field involving coding, such as physics or mathematics.
- At least three years of hands-on experience in Data Management, Data Quality verification, Data Governance, or Data Integration.
- Strong understanding of data pipelines, Data Lakes, and ETL testing methodologies.
- Proficiency in CI/CD principles and their application in data processing.
- Comprehensive knowledge of SQL, including aggregation and window functions.
- Experience in scripting with Python or similar programming languages.
- Databricks and Snowflake experience is a must, with good exposure to notebooks, the SQL editor, etc.
- Experience in developing test automation frameworks for data quality assurance.
- Familiarity with Big Data principles and their application in modern data systems.
- Experience in data analysis and requirements validation, including gathering and interpreting business needs.
- Experience in maintaining QA environments to ensure smooth testing and deployment processes.
- Hands-on experience in test planning, test case design, and test result reporting in data projects.
- Strong analytical skills, with the ability to approach problems methodically and communicate solutions effectively.
- English proficiency at B2 level or higher, with excellent verbal and written communication skills.

Nice to have:
- Familiarity with advanced data visualization tools to enhance reporting and insights.
- Experience working with distributed data systems and frameworks such as Hadoop.
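As a minimal, hypothetical sketch of the kind of automated data quality checks described above – the dataset, column names, and thresholds are illustrative and do not represent any specific framework's API:

```python
# Simple automated data-quality checks of the kind a test framework
# would run against each batch: completeness, uniqueness, and validity.
rows = [
    {"id": 1, "email": "a@example.com", "amount": 10.0},
    {"id": 2, "email": None,            "amount": 25.0},
    {"id": 2, "email": "c@example.com", "amount": -5.0},  # duplicate id, bad amount
]

def check_completeness(rows, column, max_null_rate=0.0):
    """Pass when the share of NULLs in a column stays within tolerance."""
    nulls = sum(1 for r in rows if r[column] is None)
    return nulls / len(rows) <= max_null_rate

def check_uniqueness(rows, column):
    """Pass when every value in the column is distinct."""
    values = [r[column] for r in rows]
    return len(values) == len(set(values))

def check_validity(rows, column, predicate):
    """Pass when every non-null value satisfies a business rule."""
    return all(predicate(r[column]) for r in rows if r[column] is not None)

checks = {
    "email completeness": check_completeness(rows, "email"),
    "id uniqueness": check_uniqueness(rows, "id"),
    "amount non-negative": check_validity(rows, "amount", lambda v: v >= 0),
}
for name, passed in checks.items():
    print(f"{name}: {'PASS' if passed else 'FAIL'}")
```

In a CI/CD pipeline, failing checks like these would typically block promotion of the batch and open an incident for triage.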
Posted 2 weeks ago