
168 Aggregations Jobs

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

7.0 years

5 - 10 Lacs

Ahmedabad

On-site

Source: Glassdoor

Job title: Technical Lead (EPM) - Ahmedabad, India
Ref #: 212477
Location: India - Ahmedabad
Job family: Corporate & Commercial
Closing date: 13-Jul-2025

About the role:
In this role, you will serve as the technical subject matter expert, pushing technologies to their limits. You will work with our cross-technical teams to design and build next-generation applications, keeping a mobile-first approach where the business demands it, while adhering to standard coding practices.

Key Responsibilities:
- Analyze business requirements; define solution parameters and specifications.
- Maintain and support the implemented Oracle EPM Cloud application modules: Enterprise Planning & Budgeting Cloud Services (EPBCS), Financial Consolidation & Close Cloud Services (FCCS), and Enterprise Performance Reporting Cloud Services (EPRCS).
- Solution EPM opportunities and integrate with various ERP/bespoke applications using EPM Agent and EPM Automate (see the sketch after this posting).
- Ensure EPBCS operates to meet the organization's planning and budgeting objectives, including support, maintenance, testing, and developing functional/technical specifications.
- Build and maintain Essbase outlines, calculation scripts, business rules, dimension builds, data loads, and batch automation scripts.
- Manage and monitor the monthly load process, integration with ERP and non-ERP systems, data validation, and load issues.
- Optimize the application to enhance performance for data loads, aggregations and calculations, and data retrievals.
- Manage master data management and governance for Oracle EPM Cloud.
- Provide primary support in the development, testing, and maintenance of reports and dashboards using ad-hoc queries, Smart View, Financial Reporting, and other tools as required.
- Evaluate and test monthly Oracle EPM Cloud patches.

Be part of an extraordinary story: Your skills. Your imagination. Your ambition. Here, there are no boundaries to your potential and the impact you can make. You'll find infinite opportunities to grow and work on the biggest, most rewarding challenges that will build your skills and experience. You have the chance to be a part of our future and build the life you want while being part of an international community. Our best is here and still to come. To us, impossible is only a challenge. Join us as we dare to achieve what's never been done before. Together, everything is possible.

Qualifications (About you):
- Bachelor's degree and overall 7 years of hands-on experience implementing Oracle EPM Cloud applications.
- At least two full-cycle Oracle EPM Cloud implementations in a techno-functional role, covering EPBCS, FCCS, EPRCS, and data integration.
- Expertise in integration between Oracle EPM Cloud and on-prem/cloud ERP and non-ERP systems using EPM Agent.
- Good understanding of multidimensional database management.
- Experience implementing other Oracle EPM Cloud modules such as Account Reconciliation and Tax Reporting.
- Product knowledge of the full suite of cloud-based EPM products.
- Good planning, organizing, communication, and presentation skills.
- Good problem-solving and analytical skills.
- Familiarity with core business processes including, but not limited to, General Ledger, Accounts Payable, Accounts Receivable, Fixed Assets, Cash Management (bank reconciliations), Projects, Budgeting, Forecasting, and Financial Consolidation.
- Oracle Cloud skills and experience with Oracle Integration Cloud (OIC) components and capabilities would be an added advantage.
- Airline working experience would be an added advantage.

About Qatar Airways Group:
Our story started with four aircraft. Today, we deliver excellence across 12 different businesses coming together as one. We've grown fast, broken records, and set trends that others follow. We are not slowed down by fear of failure; instead, we dare to achieve what's never been done before. So whether you're creating a unique experience for our customers or innovating behind the scenes, every person contributes to our proud story: a story of spectacular growth and determination. Now is the time to bring your best ideas and passion to a place where your ambition will know no boundaries, and be part of a truly global community.

Apply: https://aa115.taleo.net/careersection/QA_External_CS/jobapply.ftl?lang=en&job=212477
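For readers unfamiliar with the EPM Automate integration work described above, here is a minimal, hypothetical sketch of driving the EPM Automate CLI from Python for a scheduled data load. The service URL, credentials, load-rule name, and the exact rundatarule arguments are assumptions; consult the EPM Automate documentation for your own environment.

```python
# Hypothetical sketch: driving Oracle EPM Automate from Python to automate a
# nightly EPBCS data load. Command names (login, uploadfile, rundatarule,
# logout) follow the EPM Automate CLI; exact arguments vary by version and
# environment, so treat this as illustrative only.
import subprocess

EPM_URL = "https://example-epbcs.oraclecloud.com"        # placeholder URL
USER, PASSWORD_FILE = "svc_epm_loader", "password.epw"   # assumed credentials

def epm(*args: str) -> None:
    """Run one EPM Automate command, failing loudly on a non-zero exit."""
    subprocess.run(["epmautomate", *args], check=True)

def nightly_load(data_file: str, load_rule: str, period: str) -> None:
    epm("login", USER, PASSWORD_FILE, EPM_URL)
    try:
        epm("uploadfile", data_file)                   # stage the extract
        epm("rundatarule", load_rule, period, period,  # run the DM load rule
            "REPLACE", "STORE_DATA", data_file)
    finally:
        epm("logout")                                  # always release session

if __name__ == "__main__":
    nightly_load("gl_actuals.csv", "GL_TO_EPBCS", "Jul-25")
```

Wrapping each CLI call in a helper that checks the exit code is what makes this safe to schedule: a failed upload aborts the run instead of silently loading stale data.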

Posted 11 hours ago

Apply

5.0 years

0 Lacs

Delhi

On-site

Source: Glassdoor

The Role Context:
This is an exciting opportunity to join a dynamic and growing organization working at the forefront of technology trends in the social impact sector. The Wadhwani Center for Government Digital Transformation (WGDT) works with government ministries and state departments in India with a mission of "enabling digital transformation to enhance the impact of government policy, initiatives and programs".

We are seeking a highly motivated and detail-oriented Data Engineer with experience in designing, constructing, and maintaining the architecture and infrastructure needed for data generation, storage, and processing, to contribute to the successful implementation of digital government policies and programs. You will play a key role in developing robust, scalable, and efficient systems to manage large volumes of data, making it accessible for analysis and decision-making, and driving innovation and optimized operations across government ministries and state departments in India.

Key Responsibilities:
a. Data Architecture Design: Design, develop, and maintain scalable data pipelines and infrastructure for ingesting, processing, storing, and analyzing large volumes of data efficiently. This involves understanding business requirements and translating them into technical solutions.
b. Data Integration: Integrate data from sources such as databases, APIs, streaming platforms, and third-party systems. Ensure data is collected reliably and efficiently, maintaining quality and integrity throughout the process per ministry/government data standards.
c. Data Modeling: Design and implement data models to organize and structure data for efficient storage and retrieval, using techniques such as dimensional modeling, normalization, and denormalization depending on project requirements.
d. Data Pipeline Development/ETL (Extract, Transform, Load): Develop pipelines/ETL processes to extract data from source systems, transform it into the desired format, and load it into target systems, writing scripts or using ETL tools to automate the process and ensure accuracy and consistency (see the sketch after this posting).
e. Data Quality and Governance: Implement data quality checks and governance policies to ensure accuracy, consistency, and regulatory compliance. Design and track data lineage, data stewardship, metadata management, business glossaries, etc.
f. Data Lakes and Warehousing: Design and maintain data lakes and warehouses to store and manage structured data from relational databases, semi-structured data such as JSON or XML, and unstructured data such as text documents, images, and video at any scale. Integrate with big data processing frameworks such as Apache Hadoop, Apache Spark, and Apache Flink, as well as machine learning and data visualization tools.
g. Data Security: Implement security practices, technologies, and policies that protect data from unauthorized access, alteration, or destruction throughout its lifecycle, covering data access, encryption, masking and anonymization, data loss prevention, and compliance with regulations such as the DPDP Act and GDPR.
h. Database Management: Administer and optimize relational and NoSQL databases to manage large volumes of data effectively.
i. Data Migration: Plan and execute data migration projects to transfer data between systems while ensuring consistency and minimal downtime.
j. Performance Optimization: Optimize data pipelines and queries for performance and scalability. Identify and resolve bottlenecks, tune database configurations, and implement caching and indexing strategies to improve processing speed and efficiency.
k. Collaboration: Collaborate with data scientists, analysts, and other stakeholders to understand their data requirements and provide access to the necessary resources; work closely with IT operations to deploy and maintain data infrastructure in production environments.
l. Documentation and Reporting: Document data models, data pipelines/ETL processes, and system configurations; create documentation and train other team members to keep data systems sustainable and maintainable.
m. Continuous Learning: Stay current with data engineering technologies and trends through training programs, conferences, and engagement with the data engineering community.

Desired Skills/Competencies:
- Education: Bachelor's or Master's degree in Computer Science, Software Engineering, Data Science, or equivalent, with at least 5 years of experience.
- Database Management: Strong expertise with SQL databases (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB, Cassandra).
- Big Data Technologies: Familiarity with Apache Hadoop, Spark, and related ecosystem components for processing and analyzing large-scale datasets.
- ETL Tools: Experience with ETL tools (e.g., Apache NiFi, Apache Airflow, Talend Open Studio, Pentaho, InfoSphere) for designing and orchestrating data workflows.
- Data Modeling and Warehousing: Knowledge of data modeling techniques and experience with data warehousing solutions (e.g., Amazon Redshift, Google BigQuery, Snowflake).
- Data Governance and Security: Understanding of data governance principles and best practices for ensuring data quality and security.
- Cloud Computing: Experience with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services for scalable, cost-effective storage and processing.
- Streaming Data Processing: Familiarity with real-time processing frameworks (e.g., Apache Kafka, Apache Flink) for handling streaming data.

KPIs:
- Data Pipeline Efficiency: processing time, throughput, and resource utilization, e.g., average time to process data, ingestion rates, and pipeline latency.
- Data Quality Metrics: completeness, accuracy, consistency, and timeliness, e.g., error rates, missing values, duplication rates, and validation failures.
- System Uptime and Availability: uptime of databases, warehouses, and processing systems, e.g., uptime percentage, mean time between failures (MTBF), and mean time to repair (MTTR).
- Data Storage Efficiency: e.g., storage utilization rates, compression ratios, and storage cost per unit.
- Data Security and Compliance: adherence to security policies and regulations such as the DPDP Act, GDPR, HIPAA, or PCI DSS, e.g., security incident rates, data access permissions, and compliance audit findings.
- Data Processing Performance: e.g., processing time, CPU usage, and memory consumption for ETL processes, transformations, and aggregations.
- Scalability and Performance Tuning: behavior under varying workloads and data volumes, e.g., scalability benchmarks, response times under load, and improvements achieved through tuning.
- Resource Utilization and Cost Optimization: e.g., cost per data unit processed, cost per query, and savings achieved through optimization.
- Incident Response and Resolution: e.g., incident response time, time to diagnose and resolve issues, and customer satisfaction ratings for support services.
- Documentation and Knowledge Sharing: e.g., documentation coverage, update frequency, and knowledge-sharing activities such as internal training sessions or knowledge-base contributions.

Role details:
- Years of experience of the current role holder: new position
- Ideal years of experience: 3-5 years
- Career progression for this role: CTO, WGDT (Head of Incubation Centre)

Our Culture:
WF is a global not-for-profit that works like a start-up, at a fast-moving, dynamic pace where change is the only constant and flexibility is the key to success. Three mantras that we practice across job roles, levels, functions, programs, and initiatives are Quality, Speed, and Scale, in that order. We are an ambitious and inclusive organization where everyone is encouraged to contribute and ideate. We are intensely focused on driving excellence in everything we do. We want individuals with the drive for excellence and the passion to do whatever it takes to deliver world-class outcomes to our beneficiaries. We set our own standards, often more rigorous than what our beneficiaries demand, and we want individuals who love it this way. We have a creative and highly energetic environment, one in which we look to each other to innovate new solutions not only for our beneficiaries but for ourselves too. Individuals open to collaborating with a borderless mentality, often going beyond the hierarchy and siloed definitions of functional KRAs, will thrive in our environment. This is a workplace where expertise is shared with colleagues around the globe. Individuals uncomfortable with change, constant innovation, and short learning cycles, and those looking for stability and orderly working days, may not find WF to be the right place. Finally, we want individuals who want to do greater good for society, leveraging their expertise, skills, and experience.

The foundation is an equal opportunity employer with no bias based on gender, race, colour, ethnicity, country, language, age, or any other dimension. Join us and be a part of us!

Education: Bachelor's in Technology / Master's in Technology
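As a companion to the ETL responsibilities above, here is a minimal sketch of the extract-transform-load pattern: pull records from a source API, apply basic quality checks, and load them into a target table. All names (the source URL, table, and columns) are illustrative assumptions, not part of the posting.

```python
# Minimal ETL sketch: extract from a (hypothetical) source API, transform
# with basic data-quality checks, and load into a SQLite target table.
import json
import sqlite3
import urllib.request

SOURCE_URL = "https://api.example.gov/v1/disbursements"  # placeholder source

def extract(url: str) -> list[dict]:
    """Pull raw JSON records from the source system."""
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

def transform(rows: list[dict]) -> list[tuple]:
    """Drop rows missing key fields and normalize casing/types."""
    return [
        (r["id"], r["district"].strip().title(), float(r["amount"]))
        for r in rows
        if r.get("id") and r.get("amount") is not None
    ]

def load(rows: list[tuple], db_path: str = "warehouse.db") -> None:
    """Idempotently upsert the cleaned rows into the target table."""
    with sqlite3.connect(db_path) as con:
        con.execute(
            "CREATE TABLE IF NOT EXISTS disbursements "
            "(id TEXT PRIMARY KEY, district TEXT, amount REAL)"
        )
        con.executemany(
            "INSERT OR REPLACE INTO disbursements VALUES (?, ?, ?)", rows
        )

if __name__ == "__main__":
    load(transform(extract(SOURCE_URL)))
```

The INSERT OR REPLACE keyed on a primary key is what makes reruns safe, one of the consistency guarantees the responsibilities section asks for.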

Posted 1 day ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

As an employee at Thomson Reuters, you will play a role in shaping and leading the global knowledge economy. Our technology drives global markets and helps professionals around the world make decisions that matter. As the world's leading provider of intelligent information, we want your unique perspective to create the solutions that advance our business and your career.

Our Service Management function is transforming into a truly global, data- and standards-driven organization, employing best-in-class tools and practices across all disciplines of Technology Operations. This will drive ever-greater stability and consistency of service across the technology estate as we drive towards the optimal customer and employee experience.

About the role: In this opportunity as Application Support Analyst, you will:
- Support Informatica development, extractions, and loading; fix data discrepancies and take care of performance monitoring.
- Collaborate with stakeholders such as business teams, product owners, and project management in defining roadmaps for applications and processes.
- Drive continual service improvement and innovation in productivity, software quality, and reliability, including meeting/exceeding SLAs.
- Apply a thorough understanding of ITIL processes related to incident management, problem management, application lifecycle management, and operational health management.
- Support applications built on modern application architecture and cloud infrastructure: Informatica PowerCenter/IDQ, JavaScript frameworks and libraries, HTML/CSS/JS, Node.js, TypeScript, jQuery, Docker, AWS/Azure.

About You: You're a fit for the role of Application Support Analyst - Informatica if your background includes:
- 3 to 8+ years of experience as an Informatica developer and support engineer, responsible for implementing ETL methodology in data extraction, transformation, and loading.
- Knowledge of ETL design: designing new or changed mappings and workflows with the team and preparing technical specifications.
- Experience creating ETL mappings, mapplets, workflows, and worklets using Informatica PowerCenter 10.x, and preparing the corresponding documentation.
- Designing and building integrations supporting standard data warehousing objects (type-2 dimensions, aggregations, star schema, etc.); a sketch of the type-2 pattern follows this posting.
- Performing source system analysis as required.
- Working with DBAs and data architects to plan and implement an appropriate data partitioning strategy in the enterprise data warehouse.
- Implementing versioning of the ETL repository and supporting code as necessary.
- Developing stored procedures, database triggers, and SQL queries where needed; implementing best practices and tuning SQL code for optimization.
- Loading data from SF Power Exchange to a relational database using Informatica.
- Working with XML, the XML parser, Java, and HTTP transformations within Informatica.
- Experience integrating various data sources such as Oracle, SQL Server, DB2, Salesforce, and flat files in formats like fixed width, CSV, and Excel.
- In-depth knowledge and experience implementing best practices for designing and developing data warehouses using star schema and snowflake schema design concepts.
- Experience in performance tuning of sources, targets, mappings, transformations, and sessions.
- Support and development activities in a relational database environment: designing tables, procedures/functions, packages, triggers, and views, and using SQL proficiently in database programming using SNFL.

What's in it For You?
- Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office, depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected.
- Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
- Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
- Industry Competitive Benefits: We offer comprehensive benefit plans including flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
- Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
- Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
- Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.

About Us:
Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world-leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting?
Join us and help shape the industries that move society forward. As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.
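The "type-2 dimension" object named in the qualifications above preserves history: when a tracked attribute changes, the current dimension row is expired and a new row is inserted. A minimal sketch follows, with made-up table and column names; real warehouses typically express this as a MERGE statement or ETL-tool logic rather than hand-written Python.

```python
# Slowly-changing-dimension (type 2) sketch: expire the current row and
# insert a new version so attribute history is preserved. Toy schema.
import sqlite3
from datetime import date

con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE dim_customer (
        sk INTEGER PRIMARY KEY AUTOINCREMENT,   -- surrogate key
        customer_id TEXT, city TEXT,
        valid_from TEXT, valid_to TEXT, is_current INTEGER
    )""")

def scd2_upsert(customer_id: str, city: str, as_of: str) -> None:
    cur = con.execute(
        "SELECT sk, city FROM dim_customer "
        "WHERE customer_id = ? AND is_current = 1", (customer_id,)
    ).fetchone()
    if cur and cur[1] == city:
        return                       # no attribute change, nothing to do
    if cur:                          # expire the outgoing version
        con.execute(
            "UPDATE dim_customer SET valid_to = ?, is_current = 0 WHERE sk = ?",
            (as_of, cur[0]),
        )
    con.execute(                     # insert the new current version
        "INSERT INTO dim_customer "
        "(customer_id, city, valid_from, valid_to, is_current) "
        "VALUES (?, ?, ?, '9999-12-31', 1)",
        (customer_id, city, as_of),
    )

scd2_upsert("C001", "Bengaluru", str(date(2024, 1, 1)))
scd2_upsert("C001", "Pune", str(date(2025, 6, 1)))   # creates version 2
print(con.execute("SELECT * FROM dim_customer").fetchall())
```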

Posted 1 day ago

Apply

3.0 - 6.0 years

14 - 30 Lacs

Delhi, India

On-site

Source: LinkedIn

Industry & Sector: A fast-growing services provider in the enterprise data analytics and business-intelligence sector, we deliver high-throughput data pipelines, warehouses, and BI insights that power critical decisions for global BFSI, retail, and healthcare clients. Our on-site engineering team in India ensures the reliability, accuracy, and performance of every dataset that reaches production.

Role & Responsibilities:
- Design, execute, and maintain end-to-end functional, regression, and performance test suites for ETL workflows across multiple databases and file systems.
- Validate source-to-target mappings, data transformations, and incremental loads to guarantee 100% data integrity and reconciliation (a reconciliation sketch follows this posting).
- Develop SQL queries, Unix shell scripts, and automated jobs to drive repeatable test execution, logging, and reporting.
- Identify, document, and triage defects using JIRA/HP ALM, partnering with data engineers to resolve root causes quickly.
- Create reusable test data sets and environment configurations that accelerate Continuous Integration/Continuous Deployment (CI/CD) cycles.
- Contribute to test strategy, coverage metrics, and best-practice playbooks while mentoring junior testers on ETL quality standards.

Skills & Qualifications:
Must-Have:
- 3-6 years of hands-on ETL testing experience in data warehouse or big-data environments.
- Advanced SQL for complex joins, aggregations, and data profiling.
- Exposure to leading ETL tools such as Informatica, DataStage, or Talend.
- Proficiency in the Unix/Linux command line and shell scripting for job orchestration.
- Solid understanding of SDLC, STLC, and Agile ceremonies; experience with JIRA or HP ALM.
Preferred:
- Automation with Python, Selenium, or Apache Airflow for data pipelines.
- Knowledge of cloud data platforms (AWS Redshift, Azure Synapse, or GCP BigQuery).
- Performance testing of large datasets and familiarity with BI tools like Tableau or Power BI.

Benefits & Culture Highlights:
- Merit-based growth path with dedicated ETL automation upskilling programs.
- Collaborative, process-mature environment that values quality engineering over quick fixes.
- Comprehensive health cover, on-site cafeteria, and generous leave policy to support work-life balance.

Workplace Type: On-site | Location: India | Title Used Internally: ETL Test Engineer.

Skills: agile methodologies, aws redshift, jira, hp alm, datastage, apache airflow, test automation, power bi, selenium, advanced sql, data warehouse, unix/linux, azure synapse, stlc, gcp bigquery, shell scripting, sql, performance testing, agile, python, sdlc, tableau, defect tracking, informatica, etl testing, dimension modeling, talend
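The source-to-target validation this role centers on often reduces to comparing row counts and column totals between staging and warehouse tables. Here is a hedged sketch of that check; the table and column names are placeholders, and a real suite would run one such check per mapping against the production databases.

```python
# Source-to-target reconciliation sketch: compare row counts and an amount
# checksum between a staging table and its warehouse target.
import sqlite3

RECON_SQL = """
SELECT s.cnt, t.cnt, s.total, t.total
FROM (SELECT COUNT(*) AS cnt, SUM(amount) AS total FROM stg_orders) AS s,
     (SELECT COUNT(*) AS cnt, SUM(amount) AS total FROM dw_orders)  AS t
"""

def reconcile(con: sqlite3.Connection) -> None:
    src_rows, tgt_rows, src_amt, tgt_amt = con.execute(RECON_SQL).fetchone()
    assert src_rows == tgt_rows, f"row count mismatch: {src_rows} vs {tgt_rows}"
    assert src_amt == tgt_amt, f"amount mismatch: {src_amt} vs {tgt_amt}"

if __name__ == "__main__":
    con = sqlite3.connect(":memory:")   # toy stand-in for source/target DBs
    con.executescript("""
        CREATE TABLE stg_orders (id INTEGER, amount REAL);
        CREATE TABLE dw_orders  (id INTEGER, amount REAL);
        INSERT INTO stg_orders VALUES (1, 10.0), (2, 15.5);
        INSERT INTO dw_orders  VALUES (1, 10.0), (2, 15.5);
    """)
    reconcile(con)
    print("source and target reconcile")
```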

Posted 2 days ago

Apply

3.0 - 6.0 years

14 - 30 Lacs

Greater Kolkata Area

On-site

Source: LinkedIn

Industry & Sector: A fast-growing services provider in the enterprise data analytics and business-intelligence sector, we deliver high-throughput data pipelines, warehouses, and BI insights that power critical decisions for global BFSI, retail, and healthcare clients. Our on-site engineering team in India ensures the reliability, accuracy, and performance of every dataset that reaches production.

Role & Responsibilities:
- Design, execute, and maintain end-to-end functional, regression, and performance test suites for ETL workflows across multiple databases and file systems.
- Validate source-to-target mappings, data transformations, and incremental loads to guarantee 100% data integrity and reconciliation.
- Develop SQL queries, Unix shell scripts, and automated jobs to drive repeatable test execution, logging, and reporting.
- Identify, document, and triage defects using JIRA/HP ALM, partnering with data engineers to resolve root causes quickly.
- Create reusable test data sets and environment configurations that accelerate Continuous Integration/Continuous Deployment (CI/CD) cycles.
- Contribute to test strategy, coverage metrics, and best-practice playbooks while mentoring junior testers on ETL quality standards.

Skills & Qualifications:
Must-Have:
- 3-6 years of hands-on ETL testing experience in data warehouse or big-data environments.
- Advanced SQL for complex joins, aggregations, and data profiling.
- Exposure to leading ETL tools such as Informatica, DataStage, or Talend.
- Proficiency in the Unix/Linux command line and shell scripting for job orchestration.
- Solid understanding of SDLC, STLC, and Agile ceremonies; experience with JIRA or HP ALM.
Preferred:
- Automation with Python, Selenium, or Apache Airflow for data pipelines.
- Knowledge of cloud data platforms (AWS Redshift, Azure Synapse, or GCP BigQuery).
- Performance testing of large datasets and familiarity with BI tools like Tableau or Power BI.

Benefits & Culture Highlights:
- Merit-based growth path with dedicated ETL automation upskilling programs.
- Collaborative, process-mature environment that values quality engineering over quick fixes.
- Comprehensive health cover, on-site cafeteria, and generous leave policy to support work-life balance.

Workplace Type: On-site | Location: India | Title Used Internally: ETL Test Engineer.

Skills: agile methodologies, aws redshift, jira, hp alm, datastage, apache airflow, test automation, power bi, selenium, advanced sql, data warehouse, unix/linux, azure synapse, stlc, gcp bigquery, shell scripting, sql, performance testing, agile, python, sdlc, tableau, defect tracking, informatica, etl testing, dimension modeling, talend

Posted 2 days ago

Apply

3.0 - 6.0 years

14 - 30 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Industry & Sector: A fast-growing services provider in the enterprise data analytics and business-intelligence sector, we deliver high-throughput data pipelines, warehouses, and BI insights that power critical decisions for global BFSI, retail, and healthcare clients. Our on-site engineering team in India ensures the reliability, accuracy, and performance of every dataset that reaches production.

Role & Responsibilities:
- Design, execute, and maintain end-to-end functional, regression, and performance test suites for ETL workflows across multiple databases and file systems.
- Validate source-to-target mappings, data transformations, and incremental loads to guarantee 100% data integrity and reconciliation.
- Develop SQL queries, Unix shell scripts, and automated jobs to drive repeatable test execution, logging, and reporting.
- Identify, document, and triage defects using JIRA/HP ALM, partnering with data engineers to resolve root causes quickly.
- Create reusable test data sets and environment configurations that accelerate Continuous Integration/Continuous Deployment (CI/CD) cycles.
- Contribute to test strategy, coverage metrics, and best-practice playbooks while mentoring junior testers on ETL quality standards.

Skills & Qualifications:
Must-Have:
- 3-6 years of hands-on ETL testing experience in data warehouse or big-data environments.
- Advanced SQL for complex joins, aggregations, and data profiling.
- Exposure to leading ETL tools such as Informatica, DataStage, or Talend.
- Proficiency in the Unix/Linux command line and shell scripting for job orchestration.
- Solid understanding of SDLC, STLC, and Agile ceremonies; experience with JIRA or HP ALM.
Preferred:
- Automation with Python, Selenium, or Apache Airflow for data pipelines.
- Knowledge of cloud data platforms (AWS Redshift, Azure Synapse, or GCP BigQuery).
- Performance testing of large datasets and familiarity with BI tools like Tableau or Power BI.

Benefits & Culture Highlights:
- Merit-based growth path with dedicated ETL automation upskilling programs.
- Collaborative, process-mature environment that values quality engineering over quick fixes.
- Comprehensive health cover, on-site cafeteria, and generous leave policy to support work-life balance.

Workplace Type: On-site | Location: India | Title Used Internally: ETL Test Engineer.

Skills: agile methodologies, aws redshift, jira, hp alm, datastage, apache airflow, test automation, power bi, selenium, advanced sql, data warehouse, unix/linux, azure synapse, stlc, gcp bigquery, shell scripting, sql, performance testing, agile, python, sdlc, tableau, defect tracking, informatica, etl testing, dimension modeling, talend

Posted 2 days ago

Apply

6.0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site

Source: LinkedIn

Ascentt is building cutting-edge data analytics & AI/ML solutions for global automotive and manufacturing leaders. We turn enterprise data into real-time decisions using advanced machine learning and GenAI. Our team solves hard engineering problems at scale, with real-world industry impact. We're hiring passionate builders to shape the future of industrial intelligence.

Sr. Tableau Developer (6+ years of experience in Tableau development)

Key Responsibilities:
- Build and maintain complex Tableau dashboards with drill-down capabilities, filters, actions, and KPI indicators.
- Write advanced calculations, such as Level of Detail (LOD) expressions, to address business logic like aggregations at different dimensions (see the LOD sketch after this posting).
- Design and implement table calculations for running totals, percent change, rankings, etc.
- Perform data blending and joins across multiple sources, ensuring data accuracy and integrity.
- Optimize Tableau workbook performance by managing extracts, minimizing dashboard load time, and tuning calculations.
- Use parameters, dynamic filters, and action filters for interactive user experiences.
- Design dashboard wireframes and prototypes using Tableau or other tools like Figma.
- Manage publishing, scheduling, and permissions in Tableau Server/Cloud.
- Collaborate with data engineering to design performant, scalable data sources.
- Document data logic, dashboard specs, and technical workflows for governance.
- Provide mentorship and technical guidance to junior Tableau developers.
- Experience with another BI reporting tool (Power BI, Looker, QuickSight, Alteryx) is a plus.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, Analytics, or a related field.
- 6+ years of experience in Tableau development; Tableau Desktop Certified Professional preferred.
- Experience with enterprise BI projects and stakeholder engagement.
- SQL proficiency: ability to write complex joins, CTEs, subqueries, and window functions.
- Experience working with large datasets in tools such as Snowflake, Amazon Redshift, Google BigQuery, Azure Synapse, or SQL Server.
- Data preparation tools experience (preferred but not required): Tableau Prep, Alteryx, dbt, or equivalent.
- Knowledge of Tableau Server/Cloud administration (publishing, permissions, data source refreshes).
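The LOD expressions mentioned above compute an aggregate at a stated level of detail regardless of what is in the view; for example, `{ FIXED [Customer ID] : SUM([Sales]) }` attaches total sales per customer to every row. A rough pandas analogue, with invented columns, shows the mechanic:

```python
# Rough pandas analogue of a Tableau FIXED LOD expression:
#   { FIXED [Customer ID] : SUM([Sales]) }
# groupby(...).transform("sum") broadcasts the per-customer total back onto
# every row, mirroring how the FIXED LOD ignores the view's other dimensions.
import pandas as pd

df = pd.DataFrame({
    "customer_id": ["C1", "C1", "C2", "C2", "C2"],
    "region":      ["N",  "S",  "N",  "N",  "S"],
    "sales":       [100,  50,   80,   20,   60],
})

df["customer_total"] = df.groupby("customer_id")["sales"].transform("sum")
print(df)   # every C1 row carries 150; every C2 row carries 160
```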

Posted 2 days ago

Apply

3.0 - 6.0 years

14 - 30 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

Industry & Sector: A fast-growing services provider in the enterprise data analytics and business-intelligence sector, we deliver high-throughput data pipelines, warehouses, and BI insights that power critical decisions for global BFSI, retail, and healthcare clients. Our on-site engineering team in India ensures the reliability, accuracy, and performance of every dataset that reaches production.

Role & Responsibilities:
- Design, execute, and maintain end-to-end functional, regression, and performance test suites for ETL workflows across multiple databases and file systems.
- Validate source-to-target mappings, data transformations, and incremental loads to guarantee 100% data integrity and reconciliation.
- Develop SQL queries, Unix shell scripts, and automated jobs to drive repeatable test execution, logging, and reporting.
- Identify, document, and triage defects using JIRA/HP ALM, partnering with data engineers to resolve root causes quickly.
- Create reusable test data sets and environment configurations that accelerate Continuous Integration/Continuous Deployment (CI/CD) cycles.
- Contribute to test strategy, coverage metrics, and best-practice playbooks while mentoring junior testers on ETL quality standards.

Skills & Qualifications:
Must-Have:
- 3-6 years of hands-on ETL testing experience in data warehouse or big-data environments.
- Advanced SQL for complex joins, aggregations, and data profiling.
- Exposure to leading ETL tools such as Informatica, DataStage, or Talend.
- Proficiency in the Unix/Linux command line and shell scripting for job orchestration.
- Solid understanding of SDLC, STLC, and Agile ceremonies; experience with JIRA or HP ALM.
Preferred:
- Automation with Python, Selenium, or Apache Airflow for data pipelines.
- Knowledge of cloud data platforms (AWS Redshift, Azure Synapse, or GCP BigQuery).
- Performance testing of large datasets and familiarity with BI tools like Tableau or Power BI.

Benefits & Culture Highlights:
- Merit-based growth path with dedicated ETL automation upskilling programs.
- Collaborative, process-mature environment that values quality engineering over quick fixes.
- Comprehensive health cover, on-site cafeteria, and generous leave policy to support work-life balance.

Workplace Type: On-site | Location: India | Title Used Internally: ETL Test Engineer.

Skills: agile methodologies, aws redshift, jira, hp alm, datastage, apache airflow, test automation, power bi, selenium, advanced sql, data warehouse, unix/linux, azure synapse, stlc, gcp bigquery, shell scripting, sql, performance testing, agile, python, sdlc, tableau, defect tracking, informatica, etl testing, dimension modeling, talend

Posted 2 days ago

Apply

6.0 years

0 Lacs

India

Remote

Source: LinkedIn

Position: Search Engineer
Location: Remote
Experience: 6+ years

Job Description:
We are looking for a highly skilled Search Engineer with deep expertise in designing, implementing, and optimizing search solutions using Apache Solr, Elasticsearch, and Apache Spark. The role requires substantial experience handling big-data search and document-based retrieval, with a strong focus on writing complex queries and indexing strategies for large-scale systems.

Key Responsibilities:
· Design and implement robust, scalable search architectures using Solr and Elasticsearch.
· Write, optimize, and maintain complex search queries (including full-text, faceted, fuzzy, geospatial, and nested queries) using the Solr query parsers and the Elasticsearch DSL (see the query sketch after this posting).
· Work with business stakeholders to understand search requirements and translate them into performant and accurate queries.
· Build and manage custom analyzers, tokenizers, filters, and index mappings/schemas tailored to domain-specific search needs.
· Develop and optimize indexing pipelines using Apache Spark for processing large-scale structured and unstructured datasets.
· Perform query tuning and search relevance optimization based on precision, recall, and user engagement metrics.
· Create and maintain query templates and search APIs for integration with enterprise applications.
· Monitor, troubleshoot, and improve search performance and infrastructure reliability.
· Conduct evaluations and benchmarking of search quality, query latency, and index refresh times.

Required Skills and Qualifications:
· 4 to 5 years of hands-on experience with Apache Solr and/or Elasticsearch in production environments.
· Proven ability to write and optimize complex Solr queries (standard, dismax, and edismax parsers) and Elasticsearch Query DSL, including:
  o Full-text search with analyzers
  o Faceted and filtered search
  o Boolean and range queries
  o Aggregations and suggesters
  o Nested and parent/child queries
· Strong understanding of indexing principles, Lucene internals, and relevance scoring mechanisms (BM25, TF-IDF).
· Proficiency with Apache Spark for custom indexing workflows and large-scale data processing.
· Experience with document parsing and extraction (JSON, XML, PDFs, etc.) for search indexing.
· Experience integrating search into web applications or enterprise software platforms.
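As a concrete illustration of the Query DSL work above, here is a sketch of an Elasticsearch request that combines a fuzzy full-text clause, a filter, and a facet-style terms aggregation, sent to the REST _search endpoint. The host, index, and field names are assumptions for the example.

```python
# Elasticsearch Query DSL sketch: fuzzy full-text match plus a price filter,
# faceted by brand via a terms aggregation. Assumes a local cluster with a
# "products" index; adjust host/index/fields for a real deployment.
import json
import urllib.request

query = {
    "query": {
        "bool": {
            "must": [{"match": {"title": {"query": "wireless headphones",
                                          "fuzziness": "AUTO"}}}],
            "filter": [{"range": {"price": {"lte": 200}}}],
        }
    },
    "aggs": {"by_brand": {"terms": {"field": "brand.keyword", "size": 10}}},
    "size": 20,
}

req = urllib.request.Request(
    "http://localhost:9200/products/_search",   # assumed cluster and index
    data=json.dumps(query).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    body = json.load(resp)
    print(body["hits"]["total"], body["aggregations"]["by_brand"]["buckets"])
```

The bool query's must/filter split matters for relevance tuning: must clauses contribute to the BM25 score, while filter clauses only narrow the result set and are cacheable.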

Posted 3 days ago

Apply

8.0 - 10.0 years

25 - 30 Lacs

Hyderābād

On-site

Source: Glassdoor

Primary Skill: Native Android Development
Secondary Skills: iOS, Hybrid Apps, API Integration, Cloud Technologies
Domain: Healthcare

Role Summary:
· This role will be instrumental in building and maintaining robust, scalable, and reliable data pipelines using Confluent Kafka, ksqlDB, Kafka Connect, and Apache Flink.
· The ideal candidate will have a strong understanding of data streaming concepts, experience with real-time data processing, and a passion for building high-performance data solutions.
· This role requires excellent analytical skills, attention to detail, and the ability to work collaboratively in a fast-paced environment.

Essential Responsibilities:
· Design and develop data pipelines for real-time and batch data ingestion and processing using Confluent Kafka, ksqlDB, Kafka Connect, and Apache Flink.
· Build and configure Kafka Connect connectors to ingest data from various sources (databases, APIs, message queues, etc.) into Kafka.
· Develop Flink applications for complex event processing, stream enrichment, and real-time analytics.
· Develop and optimize ksqlDB queries for real-time data transformations, aggregations, and filtering (see the ksqlDB sketch after this posting).
· Implement data quality checks and monitoring to ensure data accuracy and reliability throughout the pipeline.
· Monitor and troubleshoot data pipeline performance, identify bottlenecks, and implement optimizations.
· Automate data pipeline deployment, monitoring, and maintenance tasks.
· Stay up to date with the latest advancements in data streaming technologies and best practices.
· Contribute to the development of data engineering standards and best practices within the organization.
· Participate in code reviews and contribute to a collaborative and supportive team environment.
· Work closely with other architects and tech leads in India and the US; create POCs and MVPs.
· Provide regular updates on tasks, status, and risks to the project manager.

The experience we are looking to add to our team:

Required:
· Bachelor's degree or higher from a reputed university.
· 8 to 10 years of total experience, the majority of it related to ETL/ELT, big data, Kafka, etc.
· Proficiency in developing Flink applications for stream processing and real-time analytics.
· Strong understanding of data streaming concepts and architectures.
· Extensive experience with Confluent Kafka, including Kafka brokers, producers, consumers, and Schema Registry.
· Hands-on experience with ksqlDB for real-time data transformations and stream processing.
· Experience with Kafka Connect and building custom connectors.
· Extensive experience implementing large-scale data ingestion and curation solutions.
· Good hands-on experience with a big data technology stack on any cloud platform.
· Excellent problem-solving, analytical, and communication skills.
· Ability to work independently and as part of a team.

Good to have:
· Experience in Google Cloud.
· Healthcare industry experience.
· Experience in Agile.

Job Type: Full-time
Pay: ₹2,500,000.00 - ₹3,000,000.00 per year
Schedule: Day shift
Work Location: In person
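For the ksqlDB work above, here is a hedged sketch of submitting a statement to ksqlDB's REST API (the /ksql endpoint, port 8088 by default) to build a windowed real-time aggregation. The stream, table, and column names are invented for illustration; they are not from the posting.

```python
# ksqlDB sketch: create a materialized table that counts events per key in
# hourly tumbling windows, submitted over the ksqlDB REST API. Stream and
# column names are hypothetical.
import json
import urllib.request

KSQLDB = "http://localhost:8088/ksql"   # assumed ksqlDB server address

statement = """
CREATE TABLE visits_per_member AS
  SELECT member_id, COUNT(*) AS visit_count
  FROM clinic_visits_stream
  WINDOW TUMBLING (SIZE 1 HOUR)
  GROUP BY member_id
  EMIT CHANGES;
"""

req = urllib.request.Request(
    KSQLDB,
    data=json.dumps({"ksql": statement, "streamsProperties": {}}).encode(),
    headers={"Content-Type": "application/vnd.ksql.v1+json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.load(resp))   # server echoes the statement's execution status
```

Because the result is a table rather than a stream, downstream services can query the latest hourly count per member with a pull query instead of re-reading the topic.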

Posted 3 days ago

Apply

5.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Source: LinkedIn

Who You'll Work With
You are someone who thrives in a high-performance environment, bringing a growth mindset and entrepreneurial spirit to tackle meaningful challenges that have a real impact. In return for your drive, determination, and curiosity, we'll provide the resources, mentorship, and opportunities to help you quickly broaden your expertise, grow into a well-rounded professional, and contribute to work that truly makes a difference.

When you join us, you will have:
- Continuous learning: Our learning and apprenticeship culture, backed by structured programs, is all about helping you grow while creating an environment where feedback is clear, actionable, and focused on your development. The real magic happens when you take the input from others to heart and embrace the fast-paced learning experience, owning your journey.
- A voice that matters: From day one, we value your ideas and contributions. You'll make a tangible impact by offering innovative ideas and practical solutions. We not only encourage diverse perspectives, but they are critical in driving us toward the best possible outcomes.
- Global community: With colleagues across 65+ countries and over 100 different nationalities, our firm's diversity fuels creativity and helps us come up with the best solutions. Plus, you'll have the opportunity to learn from exceptional colleagues with diverse backgrounds and experiences.
- Exceptional benefits: In addition to a competitive salary (based on your location, experience, and skills), we offer a comprehensive benefits package, including medical, dental, mental health, and vision coverage for you, your spouse/partner, and children.

Your Impact
As a Search Engineer, you will design, develop, and optimize search systems to improve relevance, efficiency, and scalability. Your work will involve building and refining search algorithms, indexing, ranking, and retrieval systems, leveraging Generative AI to deliver precise and meaningful results. You will design search architectures, including components for indexing, ranking, and query parsing, and optimize pipelines for real-time and batch processing. Additionally, you will enhance query understanding and result ranking by implementing machine learning models and natural language processing (NLP) techniques.

Being a part of McKinsey's Technology and Digital group, you'll develop internal software solutions to support the firm's business leaders in their global client work. Leading small, agile teams, you'll deliver tailored tools efficiently while integrating cutting-edge open-source technologies with traditional enterprise software.

Your Qualifications and Skills
- Bachelor's degree in computer science, IT, computer engineering, or a related field, or equivalent experience
- 5+ years of technical experience working in a professional services firm
- Expertise in OpenSearch/Elasticsearch, including text analyzers, cluster management, query DSL, aggregations, highlighting, percolation, and scaling
- Experience designing indexing processes for large datasets
- Proficiency in deploying web applications, ingestion/search pipelines, and microservices in cloud-based production environments
- Strong skills in Java Spring Boot microservices and the ability to troubleshoot independently
- Knowledge of Python programming and familiarity with Solr are a plus
- Understanding of semantic search and vector databases (see the vector-search sketch after this posting)
- Ability to translate business needs into effective technical solutions
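On the semantic search point above, a common shape is OpenSearch's k-NN query: documents are indexed with an embedding vector field and retrieved by nearest neighbours to a query embedding. The sketch below is an assumption-laden illustration; the host, index, field name, toy 4-dimensional vector, and the requirement that the index be created with a knn_vector mapping are all outside the posting itself.

```python
# OpenSearch k-NN query sketch for semantic/vector search. Assumes a
# "documents" index whose "embedding" field is mapped as knn_vector and a
# pre-computed query embedding (a toy 4-dim vector here; real embeddings
# are hundreds of dimensions from a sentence-encoder model).
import json
import urllib.request

knn_query = {
    "size": 5,
    "query": {
        "knn": {
            "embedding": {                            # assumed vector field
                "vector": [0.12, -0.44, 0.91, 0.03],  # toy query embedding
                "k": 5,
            }
        }
    },
}

req = urllib.request.Request(
    "http://localhost:9200/documents/_search",   # assumed local cluster
    data=json.dumps(knn_query).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    for hit in json.load(resp)["hits"]["hits"]:
        print(hit["_score"], hit["_source"].get("title"))
```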

Posted 3 days ago

Apply

3.0 years

7 - 10 Lacs

Noida

On-site

Source: Glassdoor

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data, and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits, and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities:
- Business Data Analysts act as a bridge between the technology and business (non-technology) teams, providing business growth and expansion to a capability
- Understand and document the business requirements based on the customer's needs, including understanding and evaluating the request, the business process or use case, and general system capabilities
- Determine operational objectives by defining business functions, gathering information, and evaluating output requirements and formats
- Analyze large amounts of data and business processes to form ideas and solutions to fix the problem
- Ensure business rules are clearly defined
- Ensure metrics are clear, measurable, and have supporting data
- Ensure requirement collection is standardized (data selection or transformations)
- Ensure Data Governance standards are incorporated
- Research and gain a high-level understanding of data elements and sources
- Define the source of data: type of ingestion, relationships, tables, variables, and transformations that may be needed
- Become an SME in how the data will be used and how it functions
- Test data profiling and creation
- Ensure that individual requirements do not contradict each other or describe the same requirement using different wording; individual requirements must never be unclear or ambiguous
- Group related requirements together so that requirements remain modifiable; this characteristic is exhibited by a logical structuring of the requirements
- Ensure there is a way to prove that a requirement has been fulfilled: each requirement should be testable, meaning it must be possible to design a test case that can determine whether a solution has met it

Presentation and Handoff to IT:
- The BDA creates and ensures that artifacts (SAD) are complete per the business, and that each artifact is ready for review with IT teams to begin the solutioning phase of the Software Development Life Cycle (SDLC)
- Present ideas and findings in meetings
- Complete the necessary reviews to hand off to development
- Create process flows to simplify handoff from business to IT resources
- Partner and collaborate with developers during the solutioning/development phase

QA/UAT and Operational Readiness:
- Develop the QA and User Acceptance Testing (UAT) plan and testing in coordination with project and business leads
- Issue resolution: proactively troubleshoot data quality issues by identifying the root cause of data discrepancies and determining and implementing recommendations for resolution
- Work with business counterparts on operational readiness activities to ensure that necessary training and procedure updates have occurred

Data Troubleshooting and Issue Management:
- Identify, analyze, and resolve data inconsistencies, errors, and discrepancies in databases and reporting systems
- Utilize troubleshooting techniques to diagnose the root causes of data-related problems
- Collaborate with cross-functional teams, including IT, data analysts, and business stakeholders, to investigate and resolve data issues
- Develop and implement data quality control procedures and perform regular audits to maintain data accuracy and integrity
- Document data troubleshooting processes, solutions, and resolutions for future reference and knowledge sharing
- Provide technical support and guidance to system users regarding data-related issues
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any directives (such as, but not limited to, transfer and/or reassignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary, or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

Required Qualifications:
- Bachelor's degree
- 3+ years of experience supporting data-related initiatives (system integration, data warehouse build, data mart build, or related)
- 3+ years of business analyst experience
- 3+ years of QA/UAT experience
- 3+ years of problem solving and troubleshooting issues
- Proven ability to write and understand complex SQL queries involving advanced unions, joins, aggregations, and groups (see the SQL example after this posting)

Preferred Qualifications:
- Experience with healthcare data
- Experience with marketing or reporting capabilities
- Project management experience
- Knowledge of JSON, Azure, or SAS
- Knowledge of Adobe products: Adobe Experience Platform, Adobe Journey Optimizer, or Customer Journey Analytics
- Proven solid communication skills

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location, and income, deserves the opportunity to live their healthiest life.
Today, however, there are still far too many barriers to good health that are disproportionately experienced by people of color, historically marginalized groups, and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.
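To make the SQL qualification above concrete, here is an illustrative query of that kind: a UNION ALL across two claim feeds, joined to a member dimension, then grouped and aggregated by region. The healthcare-flavoured schema is a toy assumption, not Optum's.

```python
# Illustrative "advanced unions, joins, aggregations, and groups" query,
# run against a toy in-memory schema so the example is self-contained.
import sqlite3

SQL = """
WITH all_claims AS (
    SELECT member_id, amount FROM medical_claims
    UNION ALL
    SELECT member_id, amount FROM pharmacy_claims
)
SELECT m.region,
       COUNT(*)      AS claim_count,
       SUM(c.amount) AS total_paid
FROM all_claims AS c
JOIN members   AS m ON m.member_id = c.member_id
GROUP BY m.region
ORDER BY total_paid DESC
"""

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE members         (member_id TEXT, region TEXT);
    CREATE TABLE medical_claims  (member_id TEXT, amount REAL);
    CREATE TABLE pharmacy_claims (member_id TEXT, amount REAL);
    INSERT INTO members VALUES ('M1', 'North'), ('M2', 'South');
    INSERT INTO medical_claims VALUES ('M1', 120.0), ('M2', 80.0);
    INSERT INTO pharmacy_claims VALUES ('M1', 20.0);
""")
for region, n, paid in con.execute(SQL):
    print(region, n, paid)   # North 2 140.0 / South 1 80.0
```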

Posted 4 days ago

Apply

7.0 years

0 Lacs

Bengaluru, Karnataka, India

Remote

Source: LinkedIn

Syniverse is the world's most connected company. Whether we're developing the technology that enables intelligent cars to safely react to traffic changes or freeing travelers to explore by keeping their devices online wherever they go, we believe in leading the world forward. Which is why we work with some of the world's most recognized brands: eight of the top 10 banks, four of the top 5 global technology companies, and over 900 communications providers. And that is how we're able to provide our incredible talent with an innovative culture and great benefits.

Who We're Looking For
As a Lead Quality Assurance Engineer, you will be in charge of designing and developing the organization's QA management systems and tools, defining test requirements, and automating test procedures to help create and maintain an exceptional user experience for Syniverse's customers. The ideal candidate will be responsible for conducting tests before product launches to ensure software runs smoothly and meets client needs, while being cost-effective.

Some Of What You'll Do
Duties and responsibilities:
- Work from Jira requirements to write test plans.
- Provide input to team leads, managers, and stakeholders as needed to manage test plans and schedules.
- Assign tasks and track project deliverables.
- Consult with Development to identify test data.
- Properly diagnose test results and document product defects.
- Execute test scripts and cases and initiate modifications if necessary.
- Provide daily statuses on testing progress and issues.
- Actively participate in product and project team meetings.
- Keep abreast of business needs and stay current with technology trends.

ETL Testing:
- Design and execute comprehensive test plans and test cases to validate ETL processes.
- Ensure data extraction, transformation, and loading operations meet business requirements and data quality standards.
- Identify and report data anomalies, discrepancies, and inconsistencies.

BI Testing:
- Design and execute BI test cases to validate reports and dashboards.

Data Validation:
- Develop and maintain data validation scripts and procedures to verify data integrity.
- Perform data reconciliation between source and target systems.
- Validate data transformations, aggregations, and calculations.

Performance Testing:
- Conduct performance and scalability testing of ETL processes to ensure optimal data flow.
- Identify bottlenecks and optimize ETL workflows for efficiency.

Regression Testing:
- Establish and maintain regression test suites to prevent regressions in ETL pipelines.
- Automate regression testing where possible to streamline validation (see the sketch after this posting).

Documentation:
- Document test cases, test results, and testing procedures.
- Maintain documentation for ETL processes and data mappings.
- Create errors and defects in Jira and fully document them, including a description, steps to recreate, and attached failed-test collateral.

Collaboration:
- Collaborate with data engineers, data analysts, and business stakeholders to understand data requirements and business logic.
- Work closely with the development team to ensure ETL code changes are tested thoroughly.

Issue Resolution:
- Investigate and troubleshoot data-related issues and defects.
- Work with the development team to resolve identified problems.

Requirements
- 7-12 years of software engineering experience.
- Experience with big data technologies: Hadoop, Impala, and Kafka, plus knowledge of Flink jobs.
- Expertise in formal software testing methodologies.
- Scripting and automated software testing tools experience is a plus.
- 3+ years' experience working with industry-standard testing tools like JMeter, Zephyr, Cucumber, and Postman.
- Strong understanding of platforms (UNIX experience preferred).
- Programming knowledge in any scripting or programming language is a plus.
- Jira knowledge preferred.

Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Data ETL/BI Test Engineer or in a similar role.
- Proficiency in SQL for data validation and querying.
- Good understanding of ETL processes and data warehousing concepts.
- Knowledge of data quality best practices and testing methodologies.
- Working knowledge of DB triggers and procedures is a plus.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.
- Knowledge of data governance and data privacy regulations.
- Experience with big data technologies (e.g., Hadoop, Spark) is a plus.
- Familiarity with version control systems (e.g., Git).
- Certification in software testing (e.g., ISTQB) is a plus.

Why You Should Join Us
Join us as we write a new chapter, guided by world-class leadership. Come be a part of an exciting and growing organization where we offer competitive total compensation, flexible/remote work, and a leadership team committed to fostering an inclusive, collaborative, and transparent organizational culture.

At Syniverse, connectedness is at the core of our business. We believe diversity, equity, and inclusion among our employees is crucial to our success as a global company as we seek to recruit, develop, and retain the most talented people who want to help us connect the world.

Know someone at Syniverse? Be sure to have them submit you as a referral prior to applying for this position.
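For the automated regression testing described above, one pragmatic approach is a pytest suite where each test asserts one invariant of the loaded data. This is a small sketch under assumed names: the tables, the in-memory sqlite connection, and the invariants themselves are stand-ins for a real warehouse and its rules.

```python
# pytest-style ETL regression sketch: each test checks one data invariant,
# so a broken load fails fast with a specific message. Toy schema.
import sqlite3
import pytest

@pytest.fixture()
def con():
    c = sqlite3.connect(":memory:")     # stand-in for the real warehouse
    c.executescript("""
        CREATE TABLE stg_cdr (msisdn TEXT, minutes REAL);
        CREATE TABLE dw_cdr  (msisdn TEXT, minutes REAL);
        INSERT INTO stg_cdr VALUES ('911234', 3.5), ('911235', 1.0);
        INSERT INTO dw_cdr  VALUES ('911234', 3.5), ('911235', 1.0);
    """)
    yield c
    c.close()

def test_row_counts_match(con):
    src = con.execute("SELECT COUNT(*) FROM stg_cdr").fetchone()[0]
    tgt = con.execute("SELECT COUNT(*) FROM dw_cdr").fetchone()[0]
    assert src == tgt

def test_no_negative_minutes(con):
    bad = con.execute(
        "SELECT COUNT(*) FROM dw_cdr WHERE minutes < 0"
    ).fetchone()[0]
    assert bad == 0
```

Run with `pytest` against each new load; wiring the suite into the CI pipeline is what turns these checks into the regression gate the posting asks for.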

Posted 4 days ago

Apply

3.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Linkedin logo

Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together. Primary Responsibilities Business Data Analysts act as a bridge between the technology and business (non-technology) teams, supporting the growth and expansion of a capability. Understand and document the business requirements based on the customer’s needs. This includes understanding and evaluating the request, the business process or use case, and general system capabilities. Determine operational objectives by defining business functions, gathering information, and evaluating output requirements and formats. Analyze large amounts of data and business processes to form ideas and solutions to fix the problem. Ensure business rules are clearly defined. Ensure metrics are clear and have supporting data. Ensure metrics are measurable. Ensure requirement collection is standardized (data selection or transformations). Ensure Data Governance standards are incorporated. Research and gain a high-level understanding of data elements and sources. Define the source of data: the type of ingestion, relationships, tables, variables, and transformations that may be needed. Become an SME in how the data will be used and how it functions. Test Data Profiling and Creation Ensure that individual requirements do not contradict each other or describe the same requirement using different wording. Individual requirements must never be unclear or ambiguous. Related requirements must be grouped together so that the set remains modifiable; this characteristic is exhibited by a logical structuring of the requirements. There must be a way to prove that a requirement has been fulfilled. 
Each requirement should be testable: it must be possible to design a test case that can be used to determine if a solution has met the requirement. Presentation and Handoff to IT The BDA creates artifacts (SAD) and ensures they are complete per the business and ready for review with IT teams to begin the Solutioning phase of the Software Development Life Cycle (SDLC). Present ideas and findings in meetings. Complete the necessary reviews to hand off to development. Create process flows to simplify handoff from business to IT resources. Partner and collaborate with developers during the solutioning/development phase. QA/UAT and Operational Readiness Develop QA and user acceptance testing (UAT) plans and coordinate testing with project and business leads. Issue resolution: proactively troubleshoot data quality issues by identifying the root cause of data discrepancies and determining and implementing recommendations for resolution. Work with business counterparts on operational readiness activities to ensure that necessary training and procedure updates have occurred. Data troubleshooting and issue management Identify, analyze, and resolve data inconsistencies, errors, and discrepancies in databases and reporting systems. Utilize troubleshooting techniques to diagnose the root causes of data-related problems. Collaborate with cross-functional teams, including IT, data analysts, and business stakeholders, to investigate and resolve data issues. Develop and implement data quality control procedures and perform regular audits to maintain data accuracy and integrity. Document data troubleshooting processes, solutions, and resolutions for future reference and knowledge sharing. Provide technical support and guidance to system users regarding data-related issues. Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so. Required Qualifications Bachelor's degree 3+ years of experience supporting data-related initiatives (system integration, data warehouse build, data mart build or related) 3+ years of business analyst experience 3+ years of QA/UAT experience 3+ years of problem solving and troubleshooting issues Proven ability to write and understand complex SQL queries involving advanced unions, joins, aggregations, and groups Preferred Qualifications Experience with healthcare data Experience with marketing or reporting capabilities Project management experience Knowledge of JSON Knowledge of Azure Knowledge of SAS Knowledge of Adobe products: Adobe Experience Platform, Adobe Journey Optimizer or Customer Journey Analytics Proven communication skills At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location, and income, deserves the opportunity to live their healthiest life. 
Today, however, there are still far too many barriers to good health, which are disproportionately experienced by people of color, historically marginalized groups, and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.
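As an illustration of the SQL proficiency this role asks for (unions, joins, aggregations, and groups), here is a small self-contained sketch; the claims/members schema and the data are hypothetical.

```python
# Illustrative only: one query combining UNION ALL, a JOIN, and a GROUP BY,
# of the kind the qualifications above describe. Schema and rows are made up.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE claims_2023 (member_id INT, paid REAL);
CREATE TABLE claims_2024 (member_id INT, paid REAL);
CREATE TABLE members (member_id INT, plan TEXT);
INSERT INTO claims_2023 VALUES (1, 120.0), (2, 80.0);
INSERT INTO claims_2024 VALUES (1, 200.0), (3, 50.0);
INSERT INTO members VALUES (1, 'HMO'), (2, 'PPO'), (3, 'HMO');
""")

sql = """
SELECT m.plan, COUNT(*) AS claim_count, SUM(c.paid) AS total_paid
FROM (
    SELECT member_id, paid FROM claims_2023
    UNION ALL
    SELECT member_id, paid FROM claims_2024   -- stack both years of claims
) AS c
JOIN members AS m ON m.member_id = c.member_id
GROUP BY m.plan                               -- aggregate per plan
ORDER BY total_paid DESC;
"""
for row in conn.execute(sql):
    print(row)  # ('HMO', 3, 370.0) then ('PPO', 1, 80.0)
```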

Posted 4 days ago

Apply

3.0 years

5 - 9 Lacs

Hyderābād

On-site

GlassDoor logo

Recruitment Fraud Alert We’ve learned that scammers are impersonating Commvault team members—including HR and leadership—via email or text. These bad actors may conduct fake interviews and ask for personal information, such as your social security number. What to know: Commvault does not conduct interviews by email or text. We will never ask you to submit sensitive documents (including banking information, SSN, etc.) before your first day. If you suspect a recruiting scam, please contact us at www.recruitingteam@commvault.com. About Commvault Commvault (NASDAQ: CVLT) is the gold standard in cyber resilience. The company empowers customers to uncover, take action, and rapidly recover from cyberattacks – keeping data safe and businesses resilient. The company’s unique AI-powered platform combines best-in-class data protection, exceptional data security, advanced data intelligence, and lightning-fast recovery across any workload or cloud at the lowest TCO. For over 25 years, more than 100,000 organizations and a vast partner ecosystem have relied on Commvault to reduce risks, improve governance, and do more with data. We are currently seeking a Senior Developer – Gainsight to lead the technical development and optimization of our Customer Success Platform. This role requires deep hands-on experience in Gainsight configuration, data integration, and workflow automation across the customer lifecycle. The ideal candidate brings both platform expertise and a data-first mindset to drive proactive retention, onboarding, adoption, and renewal strategies. You’ll partner with Customer Success, Revenue Ops, and Salesforce teams to deliver insights, campaigns, and alerts that power our CS motion. Experience with SQL, system integrations, and AI-based engagement scoring is highly valued. Key responsibilities include but are not limited to the following: Gainsight Configuration & Development Own configuration of Gainsight features: Rules Engine, Journey Orchestrator, CTAs, Scorecards, Programs, and Playbooks. Design and implement customer lifecycle programs to automate onboarding, health scoring, renewal risk, and expansion signals. Collaborate with stakeholders to turn customer insights into operational cadences, campaigns, and escalations. Build business-centric reports and dashboards using Reporting 2.0 and CX Center. Data Integration & Orchestration Build and maintain end-to-end integrations between Gainsight and platforms such as Salesforce and product telemetry systems. Translate customer data into actionable playbooks using Data Designer, Rules Engine, and Bionic Rules. Understand and document customer journey workflows and their associated data flows. AI & Automation Enablement Implement and maintain sentiment analysis, churn prediction, and AI-based prioritization within Gainsight or external AI services. Work with teams to embed AI-generated insights into CTAs, dashboards, and health models. Explore and prototype with tools like Gainsight Horizon AI, OpenAI integrations, or Azure AI where applicable. Collaboration & Support Support end users by resolving daily platform issues and training power users. Proactively document solutions, data models, and integration logic. Evangelize Customer Success tools and best practices to technical and non-technical stakeholders. Essential Skills & Experience Bachelor’s degree in Computer Science, Engineering, or equivalent. 3–6 years of experience with Gainsight development and administration. 
Proven ability to configure Rules Engine, Programs, Scorecards, and Playbooks. Strong SQL skills, especially with joins, aggregations, and nested queries. Gainsight Level 3 Admin Certification or equivalent experience. Experience integrating multiple systems and working with APIs. Excellent communication skills and the ability to work in a globally distributed team. Preferred Skills & Experience Familiarity with CRM platforms like Salesforce or Dynamics 365. Expertise in working with web services and APIs (RESTful and SOAP). Ability to produce architecture diagrams and integration specifications. Exposure to AI/ML tools like OpenAI, Azure Cognitive Services, or Gainsight AI. Git version control or similar ALM tooling. Commvault is an equal opportunity workplace and an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status, and we will not discriminate on the basis of such characteristics or any other status protected by the laws or regulations in the locations where we work. Commvault’s goal is to make interviewing inclusive and accessible to all candidates and employees. If you have a disability or special need that requires accommodation to participate in the interview process or apply for a position at Commvault, please email accommodations@commvault.com. For any inquiries not related to an accommodation, please reach out to wwrecruitingteam@commvault.com.
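As a small illustration of the nested-query SQL skills listed above, here is a self-contained sketch; the logins table and the churn-risk framing are invented for the example.

```python
# Illustrative nested aggregation (names are made up): find accounts whose
# login count is below the average across all accounts, the kind of query
# that might feed a churn-risk scorecard rule.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE logins (account TEXT, n INT);
INSERT INTO logins VALUES ('acme', 40), ('globex', 5), ('initech', 12);
""")
sql = """
SELECT account, n
FROM logins
WHERE n < (SELECT AVG(n) FROM logins)   -- nested aggregate subquery
ORDER BY n;
"""
print(conn.execute(sql).fetchall())  # [('globex', 5), ('initech', 12)]
```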

Posted 5 days ago

Apply

8.0 years

12 Lacs

Hyderābād

On-site

GlassDoor logo

Position : Full Stack Data Engineer Locations : Hybrid – Hyderabad, Noida, Bangalore, Indore Experience : 8+ Years Budget : ₹1 Lakh per Month Job Summary We are looking for an experienced Full Stack Data Engineer with a strong background in Big Data technologies, PySpark, and cloud-based data platforms. The ideal candidate will have hands-on experience in building and maintaining scalable data pipelines and integrating various components of the data ecosystem using Azure and Databricks. Key Responsibilities Design, develop, and maintain large-scale data processing systems using PySpark and Databricks. Create and orchestrate data workflows using Azure Data Factory (ADF). Work with distributed data systems including Hadoop and Big Data platforms. Perform data transformations and aggregations using Hive and other query engines. Develop and maintain efficient, reusable, and reliable data codebases. Optimize data systems for performance and scalability. Collaborate with cross-functional teams including Data Scientists, Analysts, and DevOps. Job Types: Full-time, Contractual / Temporary Contract length: 6 months Pay: Up to ₹100,000.00 per month Benefits: Flexible schedule Schedule: Day shift Monday to Friday Work Location: In person
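A minimal sketch of the transform-and-aggregate work this posting describes, assuming a local PySpark installation; the events data and column names are invented, and a real job would read from ADLS or Hive rather than an in-memory list.

```python
# Minimal PySpark transform-and-aggregate sketch (illustrative data).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

events = spark.createDataFrame(
    [("2024-01-01", "web", 3), ("2024-01-01", "mobile", 5), ("2024-01-02", "web", 7)],
    ["event_date", "channel", "clicks"],
)

daily = (
    events
    .withColumn("event_date", F.to_date("event_date"))  # transformation
    .groupBy("event_date")                               # aggregation
    .agg(F.sum("clicks").alias("total_clicks"))
    .orderBy("event_date")
)
daily.show()
# A production pipeline would write the result out instead of showing it,
# e.g. to a Delta table on Azure, and be orchestrated by an ADF trigger.
```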

Posted 5 days ago

Apply

2.0 years

3 - 4 Lacs

Mohali

On-site

GlassDoor logo

We at Zoptal Solutions Pvt. Ltd. are seeking a skilled and passionate MERN Stack Developer to join our dynamic team. You will be responsible for developing and maintaining scalable, efficient, and user-friendly web applications using the MERN (MongoDB, Express.js, React.js, Node.js) stack. Key Responsibilities : Design, develop, and maintain full-stack web applications using the MERN stack. Build responsive and intuitive user interfaces using React.js. Create and manage back-end services and APIs using Node.js and Express.js. Integrate applications with MongoDB for efficient database management. Collaborate with cross-functional teams to define and implement project requirements. Debug and optimize applications for performance and scalability. Stay updated with emerging web development trends and technologies. Ensure application security, including implementing data protection protocols. Required Skills : Strong experience in MongoDB, Express.js, React.js, and Node.js. Proficiency in JavaScript and ES6+ features. Hands-on experience with RESTful APIs and JSON. Knowledge of front-end frameworks and responsive design principles. Familiarity with version control tools like Git. Understanding of deployment processes and cloud services (e.g., AWS, Heroku). Problem-solving skills and a collaborative mindset. Preferred Qualifications : Experience with state management libraries like Redux. Familiarity with CI/CD pipelines and DevOps tools. Knowledge of TypeScript or Next.js is a plus. Previous experience in Agile/Scrum development environments. Why Join Us? Opportunity to work on exciting projects in a supportive team environment. Competitive salary and benefits package. Professional development and growth opportunities. 5-day work week. Experience : 2 years or above Location : 8B-Mohali, Punjab Work Type: On-Site only Interested candidates can share their CVs with us at our email address or call us directly at our number. Best Regards, Kavita Rai HR Manager Job Type: Full-time Pay: ₹25,000.00 - ₹40,000.00 per month Schedule: Day shift Monday to Friday Ability to commute/relocate: Mohali, Punjab: Reliably commute or planning to relocate before starting work (Required) Experience: MERN Stack: 2 years (Required) MongoDB Aggregations: 2 years (Required) Payment gateway integration: 2 years (Required) Sockets: 2 years (Required) Project Deployment: 1 year (Preferred) AWS: 1 year (Preferred) Work Location: In person
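For a sense of the MongoDB aggregation skill this posting tests for, here is a hedged sketch using PyMongo (the same pipeline stages apply unchanged from Node.js); the connection string, the shop.orders collection, and its fields are hypothetical.

```python
# Sketch of a MongoDB aggregation pipeline: filter, group, sort, limit.
# Collection and field names are invented for illustration.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # placeholder connection
orders = client["shop"]["orders"]

pipeline = [
    {"$match": {"status": "paid"}},                  # filter first
    {"$group": {"_id": "$customerId",                # then aggregate
                "total": {"$sum": "$amount"},
                "orders": {"$sum": 1}}},
    {"$sort": {"total": -1}},
    {"$limit": 10},                                  # top 10 spenders
]
for doc in orders.aggregate(pipeline):
    print(doc)
```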

Posted 5 days ago

Apply

7.0 years

0 Lacs

Kochi, Kerala, India

On-site

Linkedin logo

Introduction A career in IBM Software means you’ll be part of a team that transforms our customer’s challenges into solutions. Seeking new possibilities and always staying curious, we are a team dedicated to creating the world’s leading AI-powered, cloud-native software solutions for our customers. Our renowned legacy creates endless global opportunities for our IBMers, so the door is always open for those who want to grow their career. We are seeking a skilled back-end developer to join our IBM Software team. As part of our team, you will be responsible for developing and maintaining high-quality software products, working with a variety of technologies and programming languages. IBM’s product and technology landscape includes Research, Software, and Infrastructure. Entering this domain positions you at the heart of IBM, where growth and innovation thrive. IBM Planning Analytics® is an enterprise financial planning software platform used by a significant number of Global 500 companies. IBM Planning Analytics® provides a real-time approach to consolidating, viewing, and editing enormous volumes of multidimensional data. At the heart of the IBM Planning Analytics solution is TM1® Server, a patented, 64-bit, in-memory functional database server that can perform real-time complex calculations and aggregations over massive data spaces while allowing concurrent data editing. The IBM TM1 Server development team is a dynamic and forward-thinking team, and we are looking for a Senior Software Developer with significant experience in designing and developing enterprise-scale software products to join us. Your Role and Responsibilities You, the ideal candidate, are expected to have strong technical, critical thinking, and communication skills. You are creative and are not afraid of bringing forward ideas and running with them. If you are already product-focused, are excited by new technological developments that help users solve their problems, and enjoy and appreciate teamwork with people across the globe, then you will be at home with our team. As a key member of our dynamic team, you will play a vital role in crafting exceptional software experiences. Your responsibilities will encompass the design and implementation of innovative features, fine-tuning and sustaining existing code for optimal performance, and guaranteeing top-notch quality through rigorous testing and debugging. Collaboration is at the heart of what we do, and you’ll be working closely with fellow developers, designers, and product managers to ensure our software aligns seamlessly with user expectations. Preferred Education Master's Degree Required Technical And Professional Expertise 7+ years of experience developing high-performance, highly scalable C/C++ applications. Multi-threaded programming, high-performance data structures, and algorithms. Experience developing and debugging software across multiple platforms, including Microsoft Windows and Linux. Experience with Agile software development. Preferred Technical And Professional Experience Degree in Computer Science, Engineering, or equivalent professional experience. In addition to the required skills, knowledge of MDX, OLAP technologies, and multidimensional modeling is a plus.

Posted 5 days ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

JOB DESCRIPTION/RESPONSIBILITIES – ISU Billing & Invoice Expert-level skill in SAP ISU Billing master data configurations. Expert-level skill in RTP Billing, EDM configuration, EDM Profile Management, and custom formulas. Billing & invoicing processes such as Collectives, Bill Aggregations, Unmetered, Main & Sub meters, Primary & Secondary meters billing, Budget Billing Plan, and electronic bill outputs like PDF, EDI & TVEI using the SAP ISU PWB. System Configuration: Configure and customize the SAP ISU billing module to meet the specific requirements of the utility company. This involves setting up billing schemas, rates, pricing conditions, invoicing cycles, and billing document types. Billing Process Management: Oversee the end-to-end billing process, including meter reading, billing determinants, bill creation, bill correction, bill printing, and bill distribution. Ensure accurate and timely billing of utility services to customers. Customer Master Data Management: Manage customer master data in the SAP ISU system. This includes creating and maintaining customer records, meter data, contract details, tariff structures, and billing addresses. Billing Issue Resolution: Investigate and resolve billing-related issues, discrepancies, and errors. Collaborate with other teams, such as customer service and finance, to resolve customer billing inquiries and disputes. Testing and Quality Assurance: Perform system testing to ensure accurate billing calculations, proper integration with other modules, and compliance with business requirements. Conduct end-to-end testing for billing processes and support user acceptance testing (UAT) activities. Documentation and Training: Create and maintain documentation for billing processes, configuration, and user manuals. Provide training and guidance to end users on SAP ISU billing processes and functionalities. Enhancements and Upgrades: Participate in system enhancements and upgrades related to billing processes. Analyze business requirements, propose solutions, and collaborate with development teams to implement changes. Data Analysis and Reporting: Analyze billing data and generate reports for management, finance, and regulatory purposes. Provide insights and recommendations based on data analysis to improve billing accuracy and efficiency. Compliance and Regulatory Requirements: Stay updated with industry regulations and compliance requirements related to utility billing. Ensure that billing processes adhere to legal and regulatory standards. Stakeholder Collaboration: Collaborate with internal stakeholders, such as finance, customer service, and IT teams, to ensure effective communication, coordination, and alignment of billing processes with overall business objectives. Continuous Improvement: Identify opportunities for process improvement, automation, and efficiency gains in billing operations. Propose and implement enhancements to streamline billing processes and enhance customer experience. Very sound knowledge of Device Management. Good working experience in FICA and BPEM configurations. Good knowledge of SAP CRM, C4C, and ISU integrations.

Posted 5 days ago

Apply


4.0 - 8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

We are looking for a detail-oriented and technically strong ETL Quality Engineer to join our data engineering and QA team. The ideal candidate will be responsible for ensuring the accuracy, integrity, and reliability of data pipelines and ETL processes. You will work closely with data engineers, business analysts, and developers to validate and verify end-to-end data flows and transformations. Key Responsibilities Review and analyze data requirements, source-to-target mappings, and business rules. Design, develop, and execute comprehensive ETL test plans, test cases, and test scripts. Perform data validation, transformation testing, and reconciliation across source and target systems. Identify and document defects, inconsistencies, and data quality issues. Validate performance of ETL jobs and data pipelines under various workloads. Participate in code reviews, defect triage meetings, and QA strategy planning. Use SQL to query, validate, and compare large datasets across environments. Maintain and enhance test automation frameworks for data pipeline validation. Required Technical Skills Strong experience with ETL testing tools such as Informatica, Talend, SSIS, DataStage, or equivalent. Proficiency in SQL for complex queries, joins, aggregations, and data validation. Experience working with data warehouses, data lakes, or cloud-based data platforms (e.g., Snowflake, Redshift, BigQuery, Azure Synapse). Hands-on experience with test automation tools and frameworks related to data testing (e.g., Python, PyTest, DBT, Great Expectations). Knowledge of data profiling, data cleansing, and data governance practices. Familiarity with version control systems (e.g., Git) and CI/CD pipelines (e.g., Jenkins, Azure DevOps). Exposure to API testing for data integrations and ingestion pipelines (Postman, SoapUI, REST/SOAP APIs). Candidate Profile Bachelor's degree in Computer Science, Information Technology, or a related field. 4-8 years of experience in data quality engineering or ETL QA roles. Excellent analytical and problem-solving skills. Strong communication and documentation abilities. Experience working in Agile/Scrum teams. Preferred Qualifications Experience with cloud platforms like AWS, Azure, or GCP. Familiarity with Big Data ecosystems (e.g., Hadoop, Spark, Hive). DataOps or DevOps experience is a plus. Certification in data or QA-related domains (ISTQB, Microsoft, AWS Data Analytics, etc.) Why Join Us? Work with modern data platforms and contribute to enterprise data quality initiatives. Be a key player in ensuring trust and confidence in business-critical data. Collaborate with cross-functional data, engineering, and analytics teams. Enjoy a culture that promotes growth, learning, and innovation (ref:hirist.tech)
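A small sketch of the validation duties above (null checks, duplicate detection, and a source-to-target aggregate comparison) using pandas; the DataFrames stand in for extracts from the real source and target systems.

```python
# Illustrative data-validation checks; frames stand in for real extracts.
import pandas as pd

source = pd.DataFrame({"id": [1, 2, 3], "amount": [10.0, 20.0, 30.0]})
target = pd.DataFrame({"id": [1, 2, 3], "amount": [10.0, 20.0, 30.0]})

issues = []
if target["id"].isna().any():
    issues.append("null keys in target")                # completeness check
if target["id"].duplicated().any():
    issues.append("duplicate keys in target")           # uniqueness check
if round(source["amount"].sum(), 2) != round(target["amount"].sum(), 2):
    issues.append("amount totals diverge source vs. target")  # reconciliation

print("PASS" if not issues else f"FAIL: {issues}")
```

In a real suite, each check would live in its own test case so failures are reported individually.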

Posted 5 days ago

Apply

4.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Position : Power BI Developer Experience : 4 to 5 years Gender : Male Location : Chennai, OMR Mandatory Skills 4+ years of experience in Power BI development, including dashboards, reports, and DAX. Required Skills & Qualifications Strong expertise in SQL (writing complex queries, joins, aggregations, CTEs). Hands-on experience with Oracle databases. Strong experience with Power BI (design, development, and deployment). Understanding of ETL processes and data warehousing concepts. Experience in performance tuning of SQL queries and Power BI reports. Ability to integrate Power BI with various data sources (ODBC, APIs, cloud services, etc.). Knowledge of Power BI Service, Power Query, and Power Automate. Use of Power Query and DAX for data transformation and calculations. Strong analytical and problem-solving skills.
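As an illustration of the CTE-plus-aggregation SQL this role lists, here is a self-contained sketch run through sqlite3; in practice such a query would target the Oracle source feeding Power BI, and the sales schema is invented.

```python
# Illustrative CTE + aggregation (schema and data are made up).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (region TEXT, amount REAL);
INSERT INTO sales VALUES ('North', 100), ('North', 150), ('South', 90);
""")
sql = """
WITH region_totals AS (                 -- CTE: pre-aggregate per region
    SELECT region, SUM(amount) AS total
    FROM sales
    GROUP BY region
)
SELECT region, total,
       ROUND(100.0 * total / (SELECT SUM(total) FROM region_totals), 1) AS pct
FROM region_totals
ORDER BY total DESC;
"""
for row in conn.execute(sql):
    print(row)  # ('North', 250.0, 73.5) then ('South', 90.0, 26.5)
```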

Posted 6 days ago

Apply


0 years

7 - 8 Lacs

Bengaluru

On-site

GlassDoor logo

Would you like to be part of a team that delivers high-quality software to our customers? Are you a highly visible champion with a ‘can do’ attitude and enthusiasm that inspires others? About Our Team Our global team supports products that educate and provide electronic health records, introducing students to digital charting and preparing them to document care in today’s modern clinical environment. We have a stable product that we strive to maintain while valuing trust, respect, collaboration, agility, and quality in our team. About the Role This position is performed by an experienced professional who will undertake difficult research, design, and software development assignments within a software functional area or product line, and who provides direct input to project plans, schedules, and methodology in the development of cross-functional software products under the guidance of more senior members of the squad. This position performs software design, typically across multiple components; develops the skills of mentoring more junior members of the team; and works with others to talk to users/customers and translate their requests into solutions. Responsibilities Working in an agile scrum team that works in a T-shape model to achieve a common goal and delivers value in 2-week sprints. Taking advantage of XP techniques such as test-driven development, pair programming, and continuous delivery. Delivering software using modern technology stacks (microservices, single-page applications). Developing test automation, including unit and integration testing. Suggesting and implementing solutions using AWS Cloud infrastructure. Working closely with Frontend, Data, System, and Quality Engineers to build the product; receiving support from the entire team when needed, and willing to support them when necessary. Actively reviewing the implementations done in your team, and also other teams. Requirements Hands-on experience writing complex queries/aggregations on Elasticsearch. Solid experience with Java and modern Java frameworks. Strong knowledge of data structures, algorithms, and designing for performance, scalability, availability, and security. Experience with relational and NoSQL data sources. Nice to have: Scala and Spark. Experience designing/building complex software systems that have been successfully delivered to customers. A clear understanding of CI/CD and building and running pipelines (ideally Jenkins, GitHub Actions). Hands-on experience with test-driven development and automated integration testing. Cloud experience (ideally AWS). Way that Works for You We promote a healthy work-life balance across the organization. We offer numerous well-being initiatives, shared parental leave, study assistance, and sabbaticals to help you meet both your immediate responsibilities and long-term goals. Working for You We understand that your well-being and happiness are essential to a successful career. Here are some benefits we offer: Comprehensive Health Insurance. Enhanced Health Insurance Options. Group Life Insurance. Group Accident Insurance. Flexible Working Arrangements. Employee Assistance Program. Medical Screening. Modern Family Benefits including maternity, paternity, and adoption support. Long-Service Awards. Celebratory New Baby Gift. Subsidized Meals (location-specific). Various Paid Time Off options including Casual Leave, Sick Leave, Privilege Leave, Compassionate Leave, Special Sick Leave, and Gazetted Public Holidays. 
Free Transport for home-office-home commutes (location-specific). About the Business We are a global leader in information and analytics, assisting researchers and healthcare professionals in advancing science and improving health outcomes. We combine quality information and extensive data sets with analytics to support science and research, health education, and interactive learning. At our company, your work contributes to addressing the world's grand challenges and fostering a sustainable future. We utilize innovative technologies to support science and healthcare, partnering for a better world. We are committed to providing a fair and accessible hiring process. If you have a disability or other need that requires accommodation or adjustment, please let us know by completing our Applicant Request Support Form or contact 1-855-833-5120. Criminals may pose as recruiters asking for money or personal information. We never request money or banking details from job applicants. Learn more about spotting and avoiding scams here. Please read our Candidate Privacy Policy. We are an equal opportunity employer: qualified applicants are considered for and treated during employment without regard to race, color, creed, religion, sex, national origin, citizenship status, disability status, protected veteran status, age, marital status, sexual orientation, gender identity, genetic information, or any other characteristic protected by law. USA Job Seekers: EEO Know Your Rights.
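For a sense of the Elasticsearch aggregation work in the requirements above, here is a hedged sketch posting a terms aggregation with a nested average to the standard _search endpoint; the cluster URL, index, and field names are hypothetical.

```python
# Illustrative Elasticsearch aggregation via the standard _search HTTP API:
# a terms bucket per course with a nested average score. Index and field
# names are invented; the query DSL shape is standard Elasticsearch.
import requests

query = {
    "size": 0,                                    # aggregation results only
    "aggs": {
        "by_course": {
            "terms": {"field": "course_id", "size": 10},
            "aggs": {"avg_score": {"avg": {"field": "score"}}},
        }
    },
}

resp = requests.post(
    "http://localhost:9200/chart-activity/_search",  # placeholder cluster/index
    json=query,
    timeout=30,
)
resp.raise_for_status()
for bucket in resp.json()["aggregations"]["by_course"]["buckets"]:
    print(bucket["key"], bucket["avg_score"]["value"])
```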

Posted 6 days ago

Apply


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies