
5099 Informatica Jobs - Page 36

JobPe aggregates results for easy access; you apply directly on the original job portal.

15.0 - 20.0 years

5 - 9 Lacs

Gurugram

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: SAP BW/4HANA Data Modeling & Development
Good-to-have skills: NA
Minimum 2 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications function seamlessly to support organizational goals. You will also participate in testing and refining applications to enhance user experience and efficiency, while staying updated on industry trends and best practices to continuously improve your contributions to the team.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application specifications and user guides.
- Engage in troubleshooting and resolving application issues to ensure optimal performance.

Professional & Technical Skills:
- Must-have skills: Proficiency in SAP BW/4HANA.
- Strong understanding of data modeling and data warehousing concepts.
- Experience with SAP BusinessObjects for reporting and analytics.
- Familiarity with ETL processes and tools.
- Knowledge of SQL for database querying and manipulation.

Additional Information:
- The candidate should have a minimum of 2 years of experience in SAP BW/4HANA.
- This position is based at our Gurugram office.
- 15 years of full-time education is required.

Qualification: 15 years full-time education

Posted 1 week ago

Apply

8.0 - 12.0 years

7 - 11 Lacs

Pune

Work from Office

Experience with ETL processes and data warehousing. Proficient in SQL and Python/Java/Scala. Team lead experience.

Posted 1 week ago

Apply

2.0 - 4.0 years

4 - 8 Lacs

Pune

Work from Office

Experience with ETL processes and data warehousing. Proficient in SQL.

Posted 1 week ago

Apply

5.0 - 8.0 years

11 - 21 Lacs

Pune

Work from Office

This role is accountable for developing, expanding and optimizing Data Management Architecture, Design & Implementation under Singtel Data Platform & Management. Design, develop and implement data governance and management solutions, data quality, privacy, protection and associated control technology solutions as per best industry practice. Review, evaluate and implement Data Management standards, primarily Data Classification and Data Retention, across systems. Design, develop and implement Automated Data Discovery rules to identify the presence of PII attributes. Drive development, optimization, testing and tooling to improve overall data control management (security, data privacy, protection, data quality). Review, analyze, benchmark, and approve solution designs from product companies, internal teams, and vendors. Ensure that proposed solutions are aligned and conform to the data landscape, big data architecture guidelines and roadmap.

SECTION B: KEY RESPONSIBILITIES AND RESULTS
1. Design and implement data management standards such as Catalog Management, Data Quality, Data Classification, Data Retention
2. Drive BAU processes, testing and tooling to improve data security, privacy, and protection
3. Identify, design, and implement internal process improvements: automating manual processes, controlling and optimizing data technology service delivery
4. Implement and support Data Management Technology solutions throughout the lifecycle, including user onboarding, upgrades, fixes, access management, etc.

SECTION C: QUALIFICATIONS / EXPERIENCE / KNOWLEDGE REQUIRED
Education and Qualifications: Diploma in Data Analytics, Data Engineering, IT, Computer Science, Software Engineering, or equivalent.
Work Experience: Exposure to Data Management and Big Data concepts; knowledge of and experience with Data Management, Data Integration, and Data Quality products.
Technical Skills: Informatica CDGC, Collibra, Alation, Informatica Data Quality, Data Privacy Management, Azure Databricks.
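For context on what an automated data-discovery rule of this kind can look like, here is a minimal Python sketch that flags columns whose sampled values resemble common PII patterns. The pattern set, threshold, and column names are illustrative assumptions, not part of any Informatica CDGC or Collibra configuration.

```python
import re

# Hypothetical regex rules for common PII attributes (illustrative only)
PII_PATTERNS = {
    "email": re.compile(r"^[\w.+-]+@[\w-]+\.[\w.]+$"),
    "phone": re.compile(r"^\+?\d[\d\s-]{7,14}$"),
    "nric":  re.compile(r"^[STFG]\d{7}[A-Z]$"),  # Singapore NRIC-style identifier
}

def classify_column(column_name, sample_values, threshold=0.8):
    """Flag a column as a PII candidate if most sampled values match a pattern."""
    hits = {}
    for label, pattern in PII_PATTERNS.items():
        matches = sum(1 for v in sample_values if v and pattern.match(str(v).strip()))
        if sample_values and matches / len(sample_values) >= threshold:
            hits[label] = round(matches / len(sample_values), 2)
    return {"column": column_name, "pii_candidates": hits}

# Example usage with made-up sample data
print(classify_column("contact_email", ["a@x.com", "b@y.org", "c@z.net"]))
```

In a real catalog-driven setup, the sampled values would come from profiling jobs and the resulting flags would feed the classification and retention policies described above.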

Posted 1 week ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Senior Software Engineer - Snowflake

As a Senior Software Engineer – Snowflake at Convera, with experience in Snowflake, SQL and AWS, you will be responsible for installing, configuring, and maintaining Snowflake environments across development, testing, and production.

Responsibilities

Snowflake Administration & Maintenance: Install, configure, and maintain Snowflake environments across development, testing, and production. Manage roles, users, access controls, and permissions to enforce security best practices. Monitor and optimize compute resources, storage usage, and query performance. Set up and manage virtual warehouses to balance cost and efficiency.

Data Management & Integration: Design and optimize schemas, tables, views, and materialized views for performance. Implement data ingestion pipelines using Snowpipe, COPY commands, and external tables. Integrate Snowflake with ETL/ELT tools (Informatica, DBT, Airflow, Fivetran, etc.). Work with cloud storage (AWS S3, Azure Blob, Google Cloud Storage) for data integration.

Performance Tuning & Optimization: Monitor and tune query performance using Query Profile and warehouse best practices. Optimize clustering, caching, and auto-scaling for cost efficiency. Implement data partitioning, pruning, and compression for better performance.

Security & Compliance: Ensure compliance with data governance policies and industry regulations (GDPR, HIPAA, etc.). Implement role-based access control (RBAC) and multi-factor authentication (MFA). Configure data masking, row-level security, and encryption for sensitive data. Set up auditing and logging to track user activities and data access.

Backup, Recovery, and Disaster Planning: Manage Time Travel & Fail-safe features for data recovery. Implement backup and retention policies to meet business continuity requirements. Develop strategies for disaster recovery and high availability.

Automation & Scripting: Automate administrative tasks using Python, SQL, PowerShell, or Bash. Develop and manage CI/CD pipelines for Snowflake using DevOps tools. Implement Infrastructure as Code (IaC) using Terraform or CloudFormation.

Support & Documentation: Provide technical support and troubleshoot Snowflake-related issues. Collaborate with data engineers, analysts, and business users to optimize workflows. Document best practices, guidelines, and operational procedures.

You Should Apply If You Have: Strong experience with Snowflake administration, architecture, and security. Proficiency in SQL and query optimization. Knowledge of cloud platforms (AWS, Azure, GCP). Familiarity with ETL/ELT tools (dbt, Talend, Informatica, Airflow, etc.). Experience with BI tools (Tableau, Power BI, Looker) is a plus. Scripting skills in Python, PowerShell, or Shell scripting. Understanding of data modeling and warehousing concepts (Star Schema, Snowflake Schema). Strong problem-solving and troubleshooting skills. Excellent communication and collaboration with cross-functional teams. Ability to work independently and in an Agile environment.

Preferred Qualifications: Bachelor’s degree in Computer Science, Data Science, Information Systems, or a related field. Snowflake SnowPro Core or Advanced Certification (preferred). Experience with Kubernetes, Terraform, or CI/CD tools is a plus.

About Convera: Convera is the largest non-bank B2B cross-border payments company in the world.
Formerly Western Union Business Solutions, we leverage decades of industry expertise and technology-led payment solutions to deliver smarter money movements to our customers – helping them capture more value with every transaction. Convera serves more than 30,000 customers ranging from small business owners to enterprise treasurers to educational institutions to financial institutions to law firms to NGOs. Our teams care deeply about the value we bring to our customers, which makes Convera a rewarding place to work. This is an exciting time for our organization as we build our team with growth-minded, result-oriented people who are looking to move fast in an innovative environment. As a truly global company with employees in over 20 countries, we are passionate about diversity; we seek and celebrate people from different backgrounds, lifestyles, and unique points of view. We want to work with the best people and ensure we foster a culture of inclusion and belonging. We offer an abundance of competitive perks and benefits including: Competitive salary Opportunity to earn an annual bonus. Great career growth and development opportunities in a global organization A flexible approach to work There are plenty of amazing opportunities at Convera for talented, creative problem solvers who never settle for good enough and are looking to transform Business to Business payments. Apply now if you’re ready to unleash your potential.
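For context, the warehouse, RBAC, and automation duties above are usually expressed as Snowflake SQL; the sketch below shows one way such setup statements might be scripted with the snowflake-connector-python package. The account, role, warehouse, database, and schema names are placeholders, not Convera's actual configuration.

```python
import snowflake.connector

# Placeholder credentials -- in practice these would come from a secrets manager
conn = snowflake.connector.connect(
    account="my_account", user="admin_user", password="***",
    role="SYSADMIN", warehouse="ADMIN_WH",
)

setup_statements = [
    # Cost-controlled virtual warehouse: auto-suspend after 60 seconds idle
    "CREATE WAREHOUSE IF NOT EXISTS REPORTING_WH "
    "WITH WAREHOUSE_SIZE = 'XSMALL' AUTO_SUSPEND = 60 AUTO_RESUME = TRUE",
    # Basic RBAC: a reader role limited to one schema
    "CREATE ROLE IF NOT EXISTS ANALYTICS_READER",
    "GRANT USAGE ON WAREHOUSE REPORTING_WH TO ROLE ANALYTICS_READER",
    "GRANT USAGE ON DATABASE ANALYTICS TO ROLE ANALYTICS_READER",
    "GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS.MARTS TO ROLE ANALYTICS_READER",
]

cur = conn.cursor()
try:
    for stmt in setup_statements:
        cur.execute(stmt)  # run each administrative statement in order
finally:
    cur.close()
    conn.close()
```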

Posted 1 week ago

Apply

7.0 - 10.0 years

18 - 22 Lacs

Bengaluru

Work from Office

Roles and Responsibilities:
- Development and implementation of DBT models, ensuring efficient data transformation workflows.
- Collaborate with data engineers, analysts, and stakeholders to gather requirements and translate them into robust DBT solutions.
- Optimize DBT pipelines for performance, scalability, and maintainability.
- Enforce best practices in version control, testing, and documentation within the DBT environment.
- Monitor and troubleshoot DBT workflows to ensure reliability and timely delivery of data products.
- Provide guidance and mentorship to the team on DBT practices and advanced modeling techniques.
- Stay updated on the latest DBT features and incorporate them into the data transformation ecosystem.

Critical Skills to Possess: Snowflake and DBT; 7+ years of experience.

Preferred Qualifications: BS degree in Computer Science or Engineering, or equivalent experience.
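For context, dbt models are normally SQL select statements, and recent dbt versions also support Python models on warehouses such as Snowflake. The sketch below is a hedged example of a Python model; the upstream staging models stg_orders and stg_customers are hypothetical and the snippet is illustrative rather than a drop-in artifact for this role.

```python
# models/marts/fct_customer_orders.py -- a hypothetical dbt Python model
def model(dbt, session):
    # Materialize the result as a table in the target warehouse
    dbt.config(materialized="table")

    orders = dbt.ref("stg_orders")        # assumed upstream staging model
    customers = dbt.ref("stg_customers")  # assumed upstream staging model

    # On Snowflake these refs are Snowpark DataFrames; join on the shared key
    # and return the result -- dbt handles writing it to the warehouse.
    return orders.join(customers, "customer_id")
```

Version control, tests, and documentation for a model like this would live alongside it in the dbt project, which is what the best-practice enforcement above refers to.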

Posted 1 week ago

Apply

2.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Our world is transforming, and PTC is leading the way. Our software brings the physical and digital worlds together, enabling companies to improve operations, create better products, and empower people in all aspects of their business. Our people make all the difference in our success. Today, we are a global team of nearly 7,000 and our main objective is to create opportunities for our team members to explore, learn, and grow – all while seeing their ideas come to life and celebrating the differences that make us who we are and the work we do possible.

About PTC: PTC (NASDAQ: PTC) enables global manufacturers to drive digital transformation and achieve operational excellence through cutting-edge software solutions. Whether deployed on-premises, in the cloud, or via SaaS, PTC empowers customers to innovate faster, work smarter, and boost performance. At PTC, we don’t just imagine a better world—we enable it.

Role Overview: As a Product Specialist, you will be part of a high-performing Technical Support team that helps customers resolve technical issues, understand our products, and maximize the value they receive from our solutions. This role is ideal for someone with a solid technical foundation who is eager to grow in the enterprise software support space. You will learn to work across teams, improve support processes, and grow into a trusted technical advisor for customers.

Key Responsibilities: Investigate and troubleshoot customer-reported technical issues. Provide timely resolutions or workarounds to ensure customer satisfaction. Escalate complex issues to senior engineers or product teams with detailed analysis. Document resolutions and contribute to knowledge base articles for customer self-help. Collaborate with peers and cross-functional teams to support issue resolution. Manage and track assigned cases using Salesforce. Participate in internal training, knowledge-sharing sessions, and workshops. Follow established processes and contribute to continuous improvement initiatives. Be available to work 24x7 on a rotational basis and willing to support weekend shifts when scheduled, ensuring readiness for global support needs.

Required Skills & Competencies: Basic to intermediate experience with SQL (Oracle or SQL Server preferred). Familiarity with application server environments (e.g., Apache Tomcat, web server setups). Exposure to Java-based enterprise applications (from a support or academic background). Ability to analyze logs, perform root cause analysis, and provide actionable insights. Experience with ETL tools (e.g., Informatica, Kettle or IICS). Good problem-solving skills with a focus on delivering customer value. Strong communication and documentation skills.

Preferred Qualifications (Nice to Have): Exposure to UNIX/Linux operating systems and basic command-line knowledge. Basic familiarity with cloud platforms like AWS. Interest in or foundational knowledge of machine learning concepts. Bachelor's degree in Computer Science, Information Systems, or related field. 2+ years of relevant experience in technical support, application support, or similar role.

Why Join PTC? Be part of a company that values learning, inclusion, and innovation. Work with a supportive team and opportunities for skill development and career growth.
Benefits include: best-in-class insurance policies; generous leave and PTO; flexible work hours and casual dress code; birthday leave and no probation period; employee stock options and support for higher education.

Life at PTC is about more than working with today’s most cutting-edge technologies to transform the physical world. It’s about showing up as you are and working alongside some of today’s most talented industry leaders to transform the world around you. If you share our passion for problem-solving through innovation, you’ll likely become just as passionate about the PTC experience as we are. Are you ready to explore your next career move with us?

We respect the privacy rights of individuals and are committed to handling Personal Information responsibly and in accordance with all applicable privacy and data protection laws. Review our Privacy Policy here.
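As a rough illustration of the log-analysis and root-cause work mentioned above, the sketch below counts ERROR lines per logger in an application log to show where to look first; the file path and log format are assumptions, not a PTC or Servigistics artifact.

```python
from collections import Counter
import re

LOG_FILE = "application.log"  # hypothetical log location
ERROR_RE = re.compile(r"ERROR\s+\[(?P<logger>[^\]]+)\]\s+(?P<message>.+)")

def summarize_errors(path):
    """Count ERROR lines per logger to spot the noisiest component first."""
    counts = Counter()
    with open(path, encoding="utf-8", errors="replace") as handle:
        for line in handle:
            match = ERROR_RE.search(line)
            if match:
                counts[match.group("logger")] += 1
    return counts.most_common(10)

if __name__ == "__main__":
    for logger, count in summarize_errors(LOG_FILE):
        print(f"{count:6d}  {logger}")
```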

Posted 1 week ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Job Description – Business Intelligence Developer (OAC, PowerBI, ETL, Data Modelling) Competency: Oracle ERP Analytics We are seeking an experienced Business Intelligence Developer with 8+ years of experience having expertise in Oracle Analytics Cloud (OAC), PowerBI, ETL tools, and Data Modelling to join our dynamic team. The successful candidate will be responsible for developing and maintaining scalable data models, creating insightful analytics dashboards, and managing ETL workflows to support data-driven decision-making across the organization. They will work closely with customers, data architects, software developers, and business analysts for suitable product development. The candidate will be highly skilled individual and will be accountable for their career development and growth in EY. Responsibilities: Collaborate with stakeholders to understand data requirements and translate business needs into data models. Design and implement effective data models to support business intelligence activities. Develop and maintain ETL processes to ensure data accuracy and availability. Create interactive dashboards and reports using Oracle Analytics Cloud (OAC) and PowerBI. Work with stakeholders to gather requirements and translate business needs into technical specifications. Optimize data retrieval and develop dashboard visualizations for performance efficiency. Ensure data integrity and compliance with data governance and security policies. Collaborate with IT and data teams to integrate BI solutions into the existing data infrastructure. Conduct data analysis to identify trends, patterns, and insights that can inform business strategies. Provide training and support to end-users on BI tools and dashboards. Document all processes, models, and activities to maintain transparency and facilitate knowledge sharing. Stay up to date with the latest BI technologies and best practices to drive continuous improvement. Qualifications: Bachelor’s degree in computer science, Information Systems, Business Analytics, or a related field. Proven experience with Oracle Analytics Cloud (OAC), PowerBI, and other BI tools. Strong experience in ETL (SSIS, Informatica, Dell Boomi etc) processes and data warehousing solutions. Proficiency in data modelling techniques and best practices. Solid understanding of SQL and experience with relational databases. Familiarity with cloud platforms and services (e.g., AWS, Azure, Google Cloud). Excellent analytical, problem-solving, and project management skills. Ability to communicate complex data concepts to non-technical stakeholders. Detail-oriented with a strong focus on accuracy and quality. Well-developed business acumen, analytical and strong problem-solving attitude with the ability to visualize scenarios, possible outcomes & operating constraints. Strong consulting skills with proven experience in client and stakeholder management and collaboration abilities. Good communication skills both written and oral, ability to make impactful presentations & expertise at using excel & PPTs. Detail-oriented with a commitment to quality and accuracy. 
Good to have: knowledge of data security and controls to address customers’ data privacy needs in line with regional regulations such as GDPR, CCPA, etc.

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
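To illustrate the ETL and data-modelling work this role describes, here is a small pandas sketch that derives a customer dimension and a sales fact from a source extract. The file and column names are hypothetical, and a production pipeline would use the ETL tools named above (SSIS, Informatica, Dell Boomi, etc.) rather than a standalone script.

```python
import pandas as pd

# Hypothetical source extract from an ERP system
sales = pd.read_csv("erp_sales_extract.csv", parse_dates=["invoice_date"])

# Dimension: one row per customer, with a surrogate key
dim_customer = (
    sales[["customer_id", "customer_name", "region"]]
    .drop_duplicates("customer_id")
    .reset_index(drop=True)
)
dim_customer["customer_key"] = dim_customer.index + 1

# Fact: measures at invoice grain, joined to the surrogate key
fact_sales = (
    sales.merge(dim_customer[["customer_id", "customer_key"]], on="customer_id")
         .loc[:, ["customer_key", "invoice_date", "invoice_id", "net_amount"]]
)

dim_customer.to_parquet("dim_customer.parquet", index=False)
fact_sales.to_parquet("fact_sales.parquet", index=False)
```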

Posted 1 week ago

Apply

3.0 years

4 - 7 Lacs

Hyderābād

On-site

About the job Our Team: The Digital Product Technical Complaint (PTC) Team is part of Product Line Compliance Digital Quality & HSE in Digital Manufacturing and Supply department within Sanofi. Our mission is to develop, configure, maintain METEOR product (our Salesforce CRM PTC solution) and deploy new countries using a SaaS platform based on Salesforce service Cloud, using Sanofi standard technologies & tools, in accordance with Sanofi policies. We are looking for a Senior Salesforce config specialist knowing Salesforce Service Cloud, being Salesforce certified. You will work to implement METEOR PTC CRM product Backlog in collaboration with AIMS team doing Salesforce configuration, maintaining existing solution (specific developments) and eventually developing new ones in link with Product Owner & Scrum master. You will work on the identification of the impacts in METEOR at each Salesforce release (3x a year). What you will be doing: Collaborate closely with AIMS Salesforce & Informatica teams (developers and testers), by securing the quality of the delivery Based on User requirements, design, implement, and maintain Salesforce METEOR configuration & developments Analyze METEOR Impact at each Salesforce release and propose a remediation plan Review METEOR Impact assessment at each Release/Change request Work with AIMS team on some INC that requires Salesforce Expertise or to create a case at Salesforce vendor Document in required specifications, Jira & Confluence or Veeva/QualiPSO or any other agreed documentations. Work with Product Owner and partner systems POC to define and follow-up Release schedule including countries rollout Follow GxP rules / guidelines Perform code/config & IT testing reviews and ensure quality and adherence to standards Troubleshoot and debug software issues when necessary and identify patterns with the support of AIMS Participate in team meetings and backlog grooming Continuously learn and keep up to date with the latest technologies and industry trends Main responsibilities: Soft skills : Excellent communication skills, with experience in influencing, listening actively and negotiating within a team environment. Self-motivated, highly organized, able to work in a collaborative Global Team environment Knowledge on Agile, Scrum methodology, Kanban Capacity to lead a team Technical skills : Salesforce Configuration and APEX programming language Education : Bachelor's degree in Computer Science or related field (or equivalent experience) Preferred Qualification: Salesforce Certified Experience in GxP environments Experience with Agile methodologies Experience with JIRA & Confluence Experience in Software development processes and procedures Experience in change management and release processes Experience : Salesforce experience of 3 years minimum (Config and APEX) Languages : English Why choose us? Bring the miracles of science to life alongside a supportive, future-focused team. Discover endless opportunities to grow your talent and drive your career, whether it’s through a promotion or lateral move, at home or internationally. Enjoy a thoughtful, well-crafted rewards package that recognizes your contribution and amplifies your impact. Take good care of yourself and your family, with a wide range of health and wellbeing benefits including high-quality healthcare, prevention and wellness programs and at least 14 weeks’ gender-neutral parental leave. 
Opportunity to work in an international environment, collaborating with diverse business teams and vendors, working in a dynamic team, and fully empowered to propose and implement innovative ideas. Pursue Progress . Discover Extraordinary . Progress doesn’t happen without people – people from different backgrounds, in different locations, doing different roles, all united by one thing: a desire to make miracles happen. You can be one of those people. Chasing change, embracing new ideas and exploring all the opportunities we have to offer. Let’s pursue Progress. And let’s discover Extraordinary together. At Sanofi, we provide equal opportunities to all regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, or gender identity. Watch our ALL IN video and check out our Diversity Equity and Inclusion actions at sanofi.com!

Posted 1 week ago

Apply

3.0 years

4 - 8 Lacs

Hyderābād

On-site

About the Job : We are looking for a detail-oriented and technically skilled MDM – Data Analyst to join our Customer Master Data Management (MDM) team. This role will primarily focus on data integration activities that ensure high-quality, consistent, and accurate customer data across enterprise systems. You will work closely with data engineers, MDM architects, business stakeholders, and external vendors to support seamless ingestion, transformation, and publishing of customer data. What you will be doing: Key Responsibilities: Support the integration of customer data from multiple source systems (e.g., SAP, Veeva, IQVIA) into the MDM platform (e.g., Semarchy, Informatica). Analyze source-to-target mappings, conduct data profiling, and identify integration gaps or inconsistencies. Collaborate with Data Integration (DI) and Product teams to define and document business and technical requirements for new data pipelines. Perform data validation, transformation checks, and support UAT activities during integration cycles. Monitor and support batch/real-time data flows to and from the MDM platform, ensuring accuracy, completeness, and timeliness. Support change request analysis and data onboarding efforts for new markets and use cases. Work with stakeholders to address data quality issues and contribute to data governance initiatives. Assist in the creation of reporting and analytics dashboards (e.g., Power BI) to track integration and data health KPIs. About You: Qualifications & Skills: Overall excellent understanding of the customer MDM & key features 3+ years of experience as a Data Analyst, preferably in MDM or data integration environments. Hands-on experience with SQL, Snowflake, and data profiling/ETL tools. Familiarity with MDM platforms (e.g., Semarchy, Informatica MDM, Reltio) and CRM systems (e.g., Veeva, Salesforce) is a strong plus. Experience working with structured data from enterprise platforms like SAP, IQVIA, or similar. Strong analytical, problem-solving, and documentation skills. Experience in Agile/Scrum environments is preferred. Excellent communication and stakeholder management skills. Bachelor’s degree in engineering Nice to Have: Exposure to cloud data platforms (e.g., IICS, Azure, AWS, GCP) Knowledge of data privacy regulations (e.g., GDPR) Prior experience working in the Life Sciences or Healthcare domain Pursue Progress Discover Extraordinary Better is out there. Better medications, better outcomes, better science. But Progress doesn’t happen without people – people from different backgrounds, in different locations, doing different roles, all united by one thing: a desire to make miracles happen. So, let’s be those people.
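As a rough illustration of the data-profiling and source-to-target validation responsibilities above, the pandas sketch below computes completeness and uniqueness for key columns and reconciles source records against an MDM extract. All file and column names are placeholders, not an actual SAP or Semarchy/Informatica mapping.

```python
import pandas as pd

source = pd.read_csv("sap_customers_extract.csv")   # hypothetical source extract
target = pd.read_csv("mdm_customer_golden.csv")     # hypothetical MDM export

def profile(df, key_columns):
    """Completeness and uniqueness per key column -- typical MDM DQ checks."""
    return pd.DataFrame({
        "completeness": 1 - df[key_columns].isna().mean(),
        "uniqueness": df[key_columns].nunique() / len(df),
    })

print(profile(source, ["customer_id", "country_code", "email"]))

# Basic source-to-target reconciliation: did every source record land in MDM?
missing = set(source["customer_id"]) - set(target["source_customer_id"])
print(f"records in source but not in MDM target: {len(missing)}")
```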

Posted 1 week ago

Apply

8.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Location: Chennai (Work from Office) | Experience Level: 8-10 years | Tier: T2

We are seeking a highly skilled and experienced Senior Data Engineer to lead the design and development of scalable, secure, and high-performance data pipelines hosted on a cloud platform. The ideal candidate will have deep expertise in Databricks, Data Fabric, MDM, Informatica, and Unity Catalog, and a strong foundation in data modelling, software engineering, and DevOps practices. This role is critical to building a next-generation healthcare data platform that will power advanced analytics, operational efficiency, and business innovation.

Key Responsibilities:
1. Data Pipeline Design & Development: Translate business requirements into actionable technical specifications, defining application components, enhancement needs, data models, and integration workflows. Design, develop, and optimize end-to-end data pipelines using Databricks and related cloud-native tools. Create and maintain detailed technical design documentation and provide accurate estimations for storage, compute resources, cost efficiency, and operational readiness. Implement reusable and scalable ingestion, transformation, and orchestration patterns for structured and unstructured data sources. Ensure pipelines meet functional and non-functional requirements such as latency, throughput, fault tolerance, and scalability.
2. Cloud Platform Architecture: Build and deploy data solutions on Microsoft Azure (Azure Fabric), leveraging Data Lake and Unity Catalog. Integrate pipelines with Data Fabric and Master Data Management (MDM) platforms for consistent and governed data delivery. Follow best practices in cloud security, encryption, access controls, and identity management.
3. Data Modeling & Metadata Management: Design robust and extensible data models supporting analytics, AI/ML, and operational reporting. Ensure metadata is cataloged, documented, and accessible through Unity Catalog and MDM frameworks. Collaborate with data architects and analysts to ensure alignment with business requirements.
4. DevOps & CI/CD Automation: Adopt DevOps best practices for data pipelines, including automated testing, deployment, monitoring, and rollback strategies. Work closely with platform engineers to manage infrastructure as code, containerization, and CI/CD pipelines. Ensure compliance with enterprise SDLC, security, and data governance policies.
5. Collaboration & Continuous Improvement: Partner with data analysts and product teams to understand data needs and translate them into technical solutions. Continuously evaluate and integrate new tools, frameworks, and patterns to improve pipeline performance and maintainability.

Key Skills & Technologies Required: Databricks (Delta Lake, Spark, Unity Catalog); Azure Data Platform (Data Factory, Data Lake, Azure Functions, Azure Fabric); Unity Catalog for metadata and data governance; strong programming skills in Python and SQL; experience with data modeling, data warehousing, and star/snowflake schema design; proficiency in DevOps tools (Git, Azure DevOps, Jenkins, Terraform, Docker).

Preferred: Experience with healthcare or regulated-industry data environments; familiarity with data security standards (e.g., HIPAA, GDPR).
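To make the pipeline work described above concrete, here is a minimal PySpark sketch of a bronze-to-silver promotion on Databricks with a partitioned Delta write. The catalog, table, and column names are hypothetical, and a Databricks/Delta runtime is assumed.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

# Bronze: raw ingested claims records (hypothetical Unity Catalog table)
bronze = spark.read.table("healthcare.bronze.claims_raw")

# Silver: typed, de-duplicated, partitioned by service month
silver = (
    bronze.withColumn("service_date", F.to_date("service_date", "yyyy-MM-dd"))
          .withColumn("service_month", F.date_format("service_date", "yyyy-MM"))
          .dropDuplicates(["claim_id"])
          .filter(F.col("claim_amount") >= 0)
)

(silver.write
       .format("delta")
       .mode("overwrite")
       .partitionBy("service_month")
       .saveAsTable("healthcare.silver.claims"))
```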

Posted 1 week ago

Apply

6.0 years

0 Lacs

Gurgaon

On-site

Our world is transforming, and PTC is leading the way. Our software brings the physical and digital worlds together, enabling companies to improve operations, create better products, and empower people in all aspects of their business. Our people make all the difference in our success. Today, we are a global team of nearly 7,000 and our main objective is to create opportunities for our team members to explore, learn, and grow – all while seeing their ideas come to life and celebrating the differences that make us who we are and the work we do possible.

About PTC: PTC (NASDAQ: PTC) enables global manufacturers to achieve significant digital transformation through our market-leading software solutions. We empower customers to innovate faster, improve operations, and drive business growth—whether on-premises, in the cloud, or through our SaaS platform. At PTC, we don’t just imagine a better world—we enable it.

Role Overview: As a Senior Technical Support Specialist, you will serve as a key technical advisor and escalation point within the Servigistics Support organization. You will bring your rich industry experience to drive strategic customer success, mentor junior team members, and lead complex troubleshooting efforts. You will work cross-functionally with engineering, product management, and customer teams to ensure seamless and proactive technical support delivery.

Key Responsibilities: Serve as the primary technical contact for high-priority and complex customer escalations. Lead resolution of mission-critical issues involving product functionality, performance, and deployment. Partner with global cross-functional teams to ensure holistic and timely resolution of customer challenges. Proactively identify and drive improvements in support processes and product usability. Contribute to and review KCS-aligned knowledge articles and promote customer self-service strategies. Collaborate with product and engineering teams to influence product roadmap based on customer feedback and insights. Mentor and guide junior technical support engineers; provide coaching and best practices. Represent support in customer meetings, escalations, and business reviews. Maintain high SLA compliance for enterprise customers with complex environments. Be available to work 24x7 on a rotational basis and willing to support weekend shifts when scheduled, ensuring readiness for global support needs.

Required Skills & Competencies: Strong experience in diagnosing and resolving enterprise-grade application issues across multiple layers (web, application, and database). Deep expertise in SQL (Oracle and SQL Server), with ability to write and optimize complex queries. Hands-on experience with ETL tools (Informatica, IICS, Kettle/Pentaho) and resolving batch job failures. Solid understanding of open-source web technologies such as Apache Tomcat and Apache Web Server. Experience in performance tuning, server configuration, log analysis, and application scalability. Knowledge of Java-based enterprise applications and implementation or support lifecycle. Familiarity with enterprise IT environments (networks, load balancing, security protocols, integrations). Proven ability to work independently under pressure while managing multiple complex issues.

Preferred Qualifications: Experience with UNIX/Linux environments and command-line utilities. Knowledge of cloud platforms such as AWS, including services such as S3.
Exposure to machine learning concepts and their integration within enterprise systems. Bachelor’s or Master’s degree in Computer Science, Engineering, or related field. 6+ years of relevant technical support, implementation, or consulting experience in enterprise software. Excellent written and verbal communication skills; able to interact confidently with senior stakeholders.

Why Join PTC? Work with innovative products and talented global teams. Collaborative and inclusive culture where your voice matters. Extensive benefits including: best-in-class insurance; employee stock purchase plan and RSUs; generous PTO and paid parental leave; flexible work hours and no probation clause; career growth opportunities and higher education support.

Life at PTC is about more than working with today’s most cutting-edge technologies to transform the physical world. It’s about showing up as you are and working alongside some of today’s most talented industry leaders to transform the world around you. If you share our passion for problem-solving through innovation, you’ll likely become just as passionate about the PTC experience as we are. Are you ready to explore your next career move with us?

We respect the privacy rights of individuals and are committed to handling Personal Information responsibly and in accordance with all applicable privacy and data protection laws. Review our Privacy Policy here.

Posted 1 week ago

Apply

2.0 years

7 - 10 Lacs

Gurgaon

On-site

Our world is transforming, and PTC is leading the way. Our software brings the physical and digital worlds together, enabling companies to improve operations, create better products, and empower people in all aspects of their business. Our people make all the difference in our success. Today, we are a global team of nearly 7,000 and our main objective is to create opportunities for our team members to explore, learn, and grow – all while seeing their ideas come to life and celebrating the differences that make us who we are and the work we do possible.

About PTC: PTC (NASDAQ: PTC) enables global manufacturers to drive digital transformation and achieve operational excellence through cutting-edge software solutions. Whether deployed on-premises, in the cloud, or via SaaS, PTC empowers customers to innovate faster, work smarter, and boost performance. At PTC, we don’t just imagine a better world—we enable it.

Role Overview: As a Product Specialist, you will be part of a high-performing Technical Support team that helps customers resolve technical issues, understand our products, and maximize the value they receive from our solutions. This role is ideal for someone with a solid technical foundation who is eager to grow in the enterprise software support space. You will learn to work across teams, improve support processes, and grow into a trusted technical advisor for customers.

Key Responsibilities: Investigate and troubleshoot customer-reported technical issues. Provide timely resolutions or workarounds to ensure customer satisfaction. Escalate complex issues to senior engineers or product teams with detailed analysis. Document resolutions and contribute to knowledge base articles for customer self-help. Collaborate with peers and cross-functional teams to support issue resolution. Manage and track assigned cases using Salesforce. Participate in internal training, knowledge-sharing sessions, and workshops. Follow established processes and contribute to continuous improvement initiatives. Be available to work 24x7 on a rotational basis and willing to support weekend shifts when scheduled, ensuring readiness for global support needs.

Required Skills & Competencies: Basic to intermediate experience with SQL (Oracle or SQL Server preferred). Familiarity with application server environments (e.g., Apache Tomcat, web server setups). Exposure to Java-based enterprise applications (from a support or academic background). Ability to analyze logs, perform root cause analysis, and provide actionable insights. Experience with ETL tools (e.g., Informatica, Kettle or IICS). Good problem-solving skills with a focus on delivering customer value. Strong communication and documentation skills.

Preferred Qualifications (Nice to Have): Exposure to UNIX/Linux operating systems and basic command-line knowledge. Basic familiarity with cloud platforms like AWS. Interest in or foundational knowledge of machine learning concepts. Bachelor's degree in Computer Science, Information Systems, or related field. 2+ years of relevant experience in technical support, application support, or similar role.

Why Join PTC? Be part of a company that values learning, inclusion, and innovation. Work with a supportive team and opportunities for skill development and career growth.
Benefits include: best-in-class insurance policies; generous leave and PTO; flexible work hours and casual dress code; birthday leave and no probation period; employee stock options and support for higher education.

Life at PTC is about more than working with today’s most cutting-edge technologies to transform the physical world. It’s about showing up as you are and working alongside some of today’s most talented industry leaders to transform the world around you. If you share our passion for problem-solving through innovation, you’ll likely become just as passionate about the PTC experience as we are. Are you ready to explore your next career move with us?

We respect the privacy rights of individuals and are committed to handling Personal Information responsibly and in accordance with all applicable privacy and data protection laws. Review our Privacy Policy here.

Posted 1 week ago

Apply

3.0 - 8.0 years

9 - 14 Lacs

Noida

Remote

Role: Data Modeler Lead
Location: Remote
Experience: 10+ years (healthcare experience is mandatory)

Position Overview: We are seeking an experienced Data Modeler/Lead with deep expertise in health plan data models and enterprise data warehousing to drive our healthcare analytics and reporting initiatives. The candidate should have hands-on experience with modern data platforms and a strong understanding of healthcare industry data standards.

Key Responsibilities:

Data Architecture & Modeling:
- Design and implement comprehensive data models for health plan operations, including member enrollment, claims processing, provider networks, and medical management
- Develop logical and physical data models that support analytical and regulatory reporting requirements (HEDIS, Stars, MLR, risk adjustment)
- Create and maintain data lineage documentation and data dictionaries for healthcare datasets
- Establish data modeling standards and best practices across the organization

Technical Leadership:
- Lead data warehousing initiatives using modern platforms like Databricks or traditional ETL tools like Informatica
- Architect scalable data solutions that handle large volumes of healthcare transactional data
- Collaborate with data engineers to optimize data pipelines and ensure data quality

Healthcare Domain Expertise:
- Apply deep knowledge of health plan operations, medical coding (ICD-10, CPT, HCPCS), and healthcare data standards (HL7, FHIR, X12 EDI)
- Design data models that support analytical, reporting and AI/ML needs
- Ensure compliance with healthcare regulations including HIPAA/PHI and state insurance regulations
- Partner with business stakeholders to translate healthcare business requirements into technical data solutions

Data Governance & Quality:
- Implement data governance frameworks specific to healthcare data privacy and security requirements
- Establish data quality monitoring and validation processes for critical health plan metrics
- Lead efforts to standardize healthcare data definitions across multiple systems and data sources

Required Qualifications:

Technical Skills:
- 10+ years of experience in data modeling with at least 4 years focused on healthcare/health plan data
- Expert-level proficiency in dimensional modeling, data vault methodology, or other enterprise data modeling approaches
- Hands-on experience with Informatica PowerCenter/IICS or the Databricks platform for large-scale data processing
- Strong SQL skills and experience with Oracle Exadata and cloud data warehouses (Databricks)
- Proficiency with data modeling tools (Hackolade, ERwin, or similar)

Healthcare Industry Knowledge:
- Deep understanding of health plan data structures including claims, eligibility, provider data, and pharmacy data
- Experience with healthcare data standards and medical coding systems
- Knowledge of regulatory reporting requirements (HEDIS, Medicare Stars, MLR reporting, risk adjustment)
- Familiarity with healthcare interoperability standards (HL7 FHIR, X12 EDI)

Leadership & Communication:
- Proven track record of leading data modeling projects in complex healthcare environments
- Strong analytical and problem-solving skills with the ability to work with ambiguous requirements
- Excellent communication skills with the ability to explain technical concepts to business stakeholders
- Experience mentoring team members and establishing technical standards

Preferred Qualifications:
- Experience with Medicare Advantage, Medicaid, or Commercial health plan operations
- Cloud platform certifications (AWS, Azure, or GCP)
- Experience with real-time data streaming and modern data lake architectures
- Knowledge of machine learning applications in healthcare analytics
- Previous experience in a lead or architect role within a healthcare organization
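To make the dimensional-modelling expectation concrete, the sketch below creates a tiny, simplified health-plan star schema. It uses SQLite purely so the example runs end to end; the role itself targets Oracle Exadata and Databricks, and the columns are placeholders rather than a real health plan model.

```python
import sqlite3

# Simplified star-schema DDL (placeholder columns, illustrative only)
DDL = [
    """CREATE TABLE dim_member (
           member_key     INTEGER PRIMARY KEY,
           member_id      TEXT,
           birth_date     DATE,
           gender         TEXT,
           plan_code      TEXT
       )""",
    """CREATE TABLE dim_provider (
           provider_key   INTEGER PRIMARY KEY,
           npi            TEXT,
           specialty      TEXT
       )""",
    """CREATE TABLE fact_claim_line (
           claim_id       TEXT,
           line_number    INTEGER,
           member_key     INTEGER REFERENCES dim_member(member_key),
           provider_key   INTEGER REFERENCES dim_provider(provider_key),
           service_date   DATE,
           cpt_code       TEXT,
           icd10_dx_code  TEXT,
           paid_amount    NUMERIC
       )""",
]

with sqlite3.connect(":memory:") as conn:
    for statement in DDL:
        conn.execute(statement)  # build the dimensions and the claim-line fact
```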

Posted 1 week ago

Apply

8.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Job Description – Business Intelligence Developer (OAC, PowerBI, ETL, Data Modelling) Competency: Oracle ERP Analytics We are seeking an experienced Business Intelligence Developer with 8+ years of experience having expertise in Oracle Analytics Cloud (OAC), PowerBI, ETL tools, and Data Modelling to join our dynamic team. The successful candidate will be responsible for developing and maintaining scalable data models, creating insightful analytics dashboards, and managing ETL workflows to support data-driven decision-making across the organization. They will work closely with customers, data architects, software developers, and business analysts for suitable product development. The candidate will be highly skilled individual and will be accountable for their career development and growth in EY. Responsibilities: Collaborate with stakeholders to understand data requirements and translate business needs into data models. Design and implement effective data models to support business intelligence activities. Develop and maintain ETL processes to ensure data accuracy and availability. Create interactive dashboards and reports using Oracle Analytics Cloud (OAC) and PowerBI. Work with stakeholders to gather requirements and translate business needs into technical specifications. Optimize data retrieval and develop dashboard visualizations for performance efficiency. Ensure data integrity and compliance with data governance and security policies. Collaborate with IT and data teams to integrate BI solutions into the existing data infrastructure. Conduct data analysis to identify trends, patterns, and insights that can inform business strategies. Provide training and support to end-users on BI tools and dashboards. Document all processes, models, and activities to maintain transparency and facilitate knowledge sharing. Stay up to date with the latest BI technologies and best practices to drive continuous improvement. Qualifications: Bachelor’s degree in computer science, Information Systems, Business Analytics, or a related field. Proven experience with Oracle Analytics Cloud (OAC), PowerBI, and other BI tools. Strong experience in ETL (SSIS, Informatica, Dell Boomi etc) processes and data warehousing solutions. Proficiency in data modelling techniques and best practices. Solid understanding of SQL and experience with relational databases. Familiarity with cloud platforms and services (e.g., AWS, Azure, Google Cloud). Excellent analytical, problem-solving, and project management skills. Ability to communicate complex data concepts to non-technical stakeholders. Detail-oriented with a strong focus on accuracy and quality. Well-developed business acumen, analytical and strong problem-solving attitude with the ability to visualize scenarios, possible outcomes & operating constraints. Strong consulting skills with proven experience in client and stakeholder management and collaboration abilities. Good communication skills both written and oral, ability to make impactful presentations & expertise at using excel & PPTs. Detail-oriented with a commitment to quality and accuracy. 
Good to have: knowledge of data security and controls to address customers’ data privacy needs in line with regional regulations such as GDPR, CCPA, etc.

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 11 Lacs

Telangana

Work from Office

Key Responsibilities ETL Development: Design and implement ETL processes using Informatica PowerCenter, Cloud Data Integration, or other Informatica tools. Data Integration: Integrate data from various sources, ensuring data accuracy, consistency, and high availability. Performance Optimization: Optimize ETL processes for performance and efficiency, ensuring minimal downtime and maximum throughput.

Posted 1 week ago

Apply

3.0 - 5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

As a Data Engineer , you are required to: Design, build, and maintain data pipelines that efficiently process and transport data from various sources to storage systems or processing environments while ensuring data integrity, consistency, and accuracy across the entire data pipeline. Integrate data from different systems, often involving data cleaning, transformation (ETL), and validation. Design the structure of databases and data storage systems, including the design of schemas, tables, and relationships between datasets to enable efficient querying. Work closely with data scientists, analysts, and other stakeholders to understand their data needs and ensure that the data is structured in a way that makes it accessible and usable. Stay up-to-date with the latest trends and technologies in the data engineering space, such as new data storage solutions, processing frameworks, and cloud technologies. Evaluate and implement new tools to improve data engineering processes. Qualification : Bachelor's or Master's in Computer Science & Engineering, or equivalent. Professional Degree in Data Science, Engineering is desirable. Experience level : At least 3 - 5 years hands-on experience in Data Engineering Desired Knowledge & Experience : Spark: Spark 3.x, RDD/DataFrames/SQL, Batch/Structured Streaming Knowing Spark internals: Catalyst/Tungsten/Photon Databricks: Workflows, SQL Warehouses/Endpoints, DLT, Pipelines, Unity, Autoloader IDE: IntelliJ/Pycharm, Git, Azure Devops, Github Copilot Test: pytest, Great Expectations CI/CD Yaml Azure Pipelines, Continuous Delivery, Acceptance Testing Big Data Design: Lakehouse/Medallion Architecture, Parquet/Delta, Partitioning, Distribution, Data Skew, Compaction Languages: Python/Functional Programming (FP) SQL: TSQL/Spark SQL/HiveQL Storage: Data Lake and Big Data Storage Design additionally it is helpful to know basics of: Data Pipelines: ADF/Synapse Pipelines/Oozie/Airflow Languages: Scala, Java NoSQL: Cosmos, Mongo, Cassandra Cubes: SSAS (ROLAP, HOLAP, MOLAP), AAS, Tabular Model SQL Server: TSQL, Stored Procedures Hadoop: HDInsight/MapReduce/HDFS/YARN/Oozie/Hive/HBase/Ambari/Ranger/Atlas/Kafka Data Catalog: Azure Purview, Apache Atlas, Informatica Required Soft skills & Other Capabilities : Great attention to detail and good analytical abilities. Good planning and organizational skills Collaborative approach to sharing ideas and finding solutions Ability to work independently and also in a global team environment.
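As an illustration of the Autoloader and Structured Streaming items in the list above, the sketch below ingests files from a landing path into a Delta table on Databricks. The paths, schema location, and table name are placeholders, and a Databricks runtime is assumed.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # supplied by the Databricks runtime

SOURCE_PATH = "abfss://landing@mydatalake.dfs.core.windows.net/events/"  # placeholder
TARGET_TABLE = "bronze.events"                                           # placeholder

stream = (
    spark.readStream
         .format("cloudFiles")                        # Databricks Autoloader
         .option("cloudFiles.format", "json")
         .option("cloudFiles.schemaLocation", "/tmp/schemas/events")
         .load(SOURCE_PATH)
)

(stream.writeStream
       .format("delta")
       .option("checkpointLocation", "/tmp/checkpoints/events")
       .trigger(availableNow=True)      # process the current backlog, then stop
       .toTable(TARGET_TABLE))
```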

Posted 1 week ago

Apply

5.0 - 7.0 years

25 - 40 Lacs

Gurugram

Work from Office

Our world is transforming, and PTC is leading the way. Our software brings the physical and digital worlds together, enabling companies to improve operations, create better products, and empower people in all aspects of their business. Our people make all the difference in our success. Today, we are a global team of nearly 7,000 and our main objective is to create opportunities for our team members to explore, learn, and grow – all while seeing their ideas come to life and celebrating the differences that make us who we are and the work we do possible.

About PTC: PTC (NASDAQ: PTC) enables global manufacturers to achieve significant digital transformation through our market-leading software solutions. We empower customers to innovate faster, improve operations, and drive business growth—whether on-premises, in the cloud, or through our SaaS platform. At PTC, we don’t just imagine a better world—we enable it.

Role Overview: As a Senior Technical Support Specialist, you will serve as a key technical advisor and escalation point within the Servigistics Support organization. You will bring your rich industry experience to drive strategic customer success, mentor junior team members, and lead complex troubleshooting efforts. You will work cross-functionally with engineering, product management, and customer teams to ensure seamless and proactive technical support delivery.

Key Responsibilities: Serve as the primary technical contact for high-priority and complex customer escalations. Lead resolution of mission-critical issues involving product functionality, performance, and deployment. Partner with global cross-functional teams to ensure holistic and timely resolution of customer challenges. Proactively identify and drive improvements in support processes and product usability. Contribute to and review KCS-aligned knowledge articles and promote customer self-service strategies. Collaborate with product and engineering teams to influence product roadmap based on customer feedback and insights. Mentor and guide junior technical support engineers; provide coaching and best practices. Represent support in customer meetings, escalations, and business reviews. Maintain high SLA compliance for enterprise customers with complex environments. Be available to work 24x7 on a rotational basis and willing to support weekend shifts when scheduled, ensuring readiness for global support needs.

Required Skills & Competencies: Strong experience in diagnosing and resolving enterprise-grade application issues across multiple layers (web, application, and database). Deep expertise in SQL (Oracle and SQL Server), with ability to write and optimize complex queries. Hands-on experience with ETL tools (Informatica, IICS, Kettle/Pentaho) and resolving batch job failures. Solid understanding of open-source web technologies such as Apache Tomcat and Apache Web Server. Experience in performance tuning, server configuration, log analysis, and application scalability. Knowledge of Java-based enterprise applications and implementation or support lifecycle. Familiarity with enterprise IT environments (networks, load balancing, security protocols, integrations). Proven ability to work independently under pressure while managing multiple complex issues.

Preferred Qualifications: Experience with UNIX/Linux environments and command-line utilities. Knowledge of cloud platforms such as AWS, including services such as S3.
Exposure to machine learning concepts and their integration within enterprise systems. Bachelor’s or Master’s degree in Computer Science, Engineering, or related field. 6+ years of relevant technical support, implementation, or consulting experience in enterprise software. Excellent written and verbal communication skills; able to interact confidently with senior stakeholders.

Why Join PTC? Work with innovative products and talented global teams. Collaborative and inclusive culture where your voice matters. Extensive benefits including: best-in-class insurance; employee stock purchase plan and RSUs; generous PTO and paid parental leave; flexible work hours and no probation clause; career growth opportunities and higher education support.

Life at PTC is about more than working with today’s most cutting-edge technologies to transform the physical world. It’s about showing up as you are and working alongside some of today’s most talented industry leaders to transform the world around you. If you share our passion for problem-solving through innovation, you’ll likely become just as passionate about the PTC experience as we are. Are you ready to explore your next career move with us?

We respect the privacy rights of individuals and are committed to handling Personal Information responsibly and in accordance with all applicable privacy and data protection laws. Review our Privacy Policy here.

Posted 1 week ago

Apply

3.0 - 6.0 years

0 Lacs

Andhra Pradesh

On-site

We are seeking a Data Engineer with strong expertise in SQL and ETL processes to support banking data pipelines, regulatory reporting, and data quality initiatives. The role involves building and optimizing data structures, implementing validation rules, and collaborating with governance and compliance teams. Experience in the banking domain and tools like Informatica and Azure Data Factory is essential. Strong proficiency in SQL for writing complex queries, joins, data transformations, and aggregations. Proven experience in building tables, views, and data structures within enterprise Data Warehouses and Data Lakes. Strong understanding of data warehousing concepts, such as Slowly Changing Dimensions (SCDs), data normalization, and star/snowflake schemas. Practical experience in Azure Data Factory (ADF) for orchestrating data pipelines and managing ingestion workflows. Exposure to data cataloging, metadata management, and lineage tracking using Informatica EDC or Axon. Experience implementing Data Quality rules for banking use cases such as completeness, consistency, uniqueness, and validity. Familiarity with banking systems and data domains such as Flexcube, HRMS, CRM, Risk, Compliance, and IBG reporting. Understanding of regulatory and audit readiness needs for Central Bank and internal governance forums. Write optimized SQL scripts to extract, transform, and load (ETL) data from multiple banking source systems. Design and implement staging and reporting layer structures, aligned to business requirements and regulatory frameworks. Apply data validation logic based on predefined business rules and data governance requirements. Collaborate with Data Governance, Risk, and Compliance teams to embed lineage, ownership, and metadata into datasets. Monitor scheduled jobs and resolve ETL failures to ensure SLA adherence for reporting and operational dashboards. Support production deployment, UAT sign-off, and issue resolution for data products across business units. 3 to 6 years in banking-focused data engineering roles with hands-on SQL, ETL, and DQ rule implementation. Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or related fields. Banking domain experience is mandatory, especially in areas related to regulatory reporting, compliance, and enterprise data governance. About Virtusa: Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
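To illustrate the completeness, uniqueness, and validity checks this role calls for, here is a minimal pandas sketch of rule-based data quality profiling. The account columns, sample values, and e-mail pattern are purely illustrative assumptions, not details of any banking system named above.

import re
import pandas as pd

# Hypothetical banking extract; column names and rules are assumptions for illustration.
accounts = pd.DataFrame({
    "customer_id": [101, 102, 102, 104, None],
    "email": ["a@bank.com", "b@bank", "c@bank.com", None, "e@bank.com"],
})

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def run_dq_checks(df: pd.DataFrame) -> pd.DataFrame:
    """Return one row per data-quality rule with a pass rate between 0 and 1."""
    total = len(df)
    results = [
        ("completeness", "customer_id not null",
         df["customer_id"].notna().sum() / total),
        ("uniqueness", "customer_id unique",
         (~df["customer_id"].duplicated(keep=False) | df["customer_id"].isna()).sum() / total),
        ("validity", "email matches pattern",
         df["email"].fillna("").apply(lambda v: bool(EMAIL_RE.match(v))).sum() / total),
    ]
    return pd.DataFrame(results, columns=["dimension", "rule", "pass_rate"])

print(run_dq_checks(accounts))

In practice a rule set like this would usually be driven from metadata (for example, rules registered in a governance catalog) rather than hard-coded, with the pass rates feeding DQ dashboards and exception reports.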

Posted 1 week ago

Apply

8.0 years

0 Lacs

Kanayannur, Kerala, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Job Description – Business Intelligence Developer (OAC, PowerBI, ETL, Data Modelling) Competency: Oracle ERP Analytics We are seeking an experienced Business Intelligence Developer with 8+ years of experience and expertise in Oracle Analytics Cloud (OAC), PowerBI, ETL tools, and Data Modelling to join our dynamic team. The successful candidate will be responsible for developing and maintaining scalable data models, creating insightful analytics dashboards, and managing ETL workflows to support data-driven decision-making across the organization. They will work closely with customers, data architects, software developers, and business analysts for suitable product development. The candidate will be a highly skilled individual and will be accountable for their career development and growth in EY. Responsibilities: Collaborate with stakeholders to understand data requirements and translate business needs into data models. Design and implement effective data models to support business intelligence activities. Develop and maintain ETL processes to ensure data accuracy and availability. Create interactive dashboards and reports using Oracle Analytics Cloud (OAC) and PowerBI. Work with stakeholders to gather requirements and translate business needs into technical specifications. Optimize data retrieval and develop dashboard visualizations for performance efficiency. Ensure data integrity and compliance with data governance and security policies. Collaborate with IT and data teams to integrate BI solutions into the existing data infrastructure. Conduct data analysis to identify trends, patterns, and insights that can inform business strategies. Provide training and support to end-users on BI tools and dashboards. Document all processes, models, and activities to maintain transparency and facilitate knowledge sharing. Stay up to date with the latest BI technologies and best practices to drive continuous improvement. Qualifications: Bachelor’s degree in Computer Science, Information Systems, Business Analytics, or a related field. Proven experience with Oracle Analytics Cloud (OAC), PowerBI, and other BI tools. Strong experience in ETL processes (SSIS, Informatica, Dell Boomi, etc.) and data warehousing solutions. Proficiency in data modelling techniques and best practices. Solid understanding of SQL and experience with relational databases. Familiarity with cloud platforms and services (e.g., AWS, Azure, Google Cloud). Excellent analytical, problem-solving, and project management skills. Ability to communicate complex data concepts to non-technical stakeholders. Detail-oriented with a strong focus on accuracy and quality. Well-developed business acumen and a strong analytical, problem-solving attitude, with the ability to visualize scenarios, possible outcomes & operating constraints. Strong consulting skills with proven experience in client and stakeholder management and collaboration abilities. Good communication skills, both written and oral; ability to make impactful presentations & expertise in using Excel & PPTs. Detail-oriented with a commitment to quality and accuracy.
Good to have: knowledge of data security and controls to address customers’ data privacy needs in line with regional regulations such as GDPR, CCPA, etc. EY | Building a better working world: EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
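As a small illustration of the data modelling and ETL responsibilities described above, the following pandas sketch splits a flat extract into a customer dimension and a sales fact table with a surrogate key, one common dimensional-modelling pattern. The column names and sample rows are assumptions invented for the example, not part of the EY role.

import pandas as pd

# Hypothetical flat extract from a source system; columns are illustrative only.
sales = pd.DataFrame({
    "order_id":   [1001, 1002, 1003],
    "order_date": ["2024-01-05", "2024-01-06", "2024-01-06"],
    "customer":   ["Acme Ltd", "Globex", "Acme Ltd"],
    "region":     ["North", "South", "North"],
    "amount":     [250.0, 480.0, 125.5],
})

def build_star_schema(df: pd.DataFrame) -> tuple[pd.DataFrame, pd.DataFrame]:
    """Split a flat extract into a customer dimension and a sales fact table."""
    # Dimension: one row per distinct customer, with a surrogate key.
    dim_customer = (
        df[["customer", "region"]]
        .drop_duplicates()
        .reset_index(drop=True)
        .rename_axis("customer_key")
        .reset_index()
    )
    # Fact: measures plus the foreign key pointing back to the dimension.
    fact_sales = (
        df.merge(dim_customer, on=["customer", "region"], how="left")
          .loc[:, ["order_id", "order_date", "customer_key", "amount"]]
    )
    return dim_customer, fact_sales

dim_customer, fact_sales = build_star_schema(sales)
print(dim_customer)
print(fact_sales)

In a production pipeline the same separation would typically be done in the ETL layer (SSIS, Informatica, or similar) and surfaced to OAC or PowerBI through the resulting star schema.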

Posted 1 week ago

Apply

7.0 years

0 Lacs

Kanayannur, Kerala, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Job Description – Business Intelligence Developer (OAC, PowerBI, ETL, Data Modelling) Competency: Oracle ERP Analytics We are seeking an experienced Business Intelligence Developer with 7+ years of experience and expertise in Oracle Analytics Cloud (OAC), PowerBI, ETL tools, and Data Modelling to join our dynamic team. The successful candidate will be responsible for developing and maintaining scalable data models, creating insightful analytics dashboards, and managing ETL workflows to support data-driven decision-making across the organization. They will work closely with customers, data architects, software developers, and business analysts for suitable product development. The candidate will be a highly skilled individual and will be accountable for their career development and growth in EY. Responsibilities: Collaborate with stakeholders to understand data requirements and translate business needs into data models. Design and implement effective data models to support business intelligence activities. Develop and maintain ETL processes to ensure data accuracy and availability. Create interactive dashboards and reports using Oracle Analytics Cloud (OAC) and PowerBI. Work with stakeholders to gather requirements and translate business needs into technical specifications. Optimize data retrieval and develop dashboard visualizations for performance efficiency. Ensure data integrity and compliance with data governance and security policies. Collaborate with IT and data teams to integrate BI solutions into the existing data infrastructure. Conduct data analysis to identify trends, patterns, and insights that can inform business strategies. Provide training and support to end-users on BI tools and dashboards. Document all processes, models, and activities to maintain transparency and facilitate knowledge sharing. Stay up to date with the latest BI technologies and best practices to drive continuous improvement. Qualifications: Bachelor’s degree in Computer Science, Information Systems, Business Analytics, or a related field. Proven experience with Oracle Analytics Cloud (OAC), PowerBI, and other BI tools. Strong experience in ETL processes (SSIS, Informatica, Dell Boomi, etc.) and data warehousing solutions. Proficiency in data modelling techniques and best practices. Solid understanding of SQL and experience with relational databases. Familiarity with cloud platforms and services (e.g., AWS, Azure, Google Cloud). Excellent analytical, problem-solving, and project management skills. Ability to communicate complex data concepts to non-technical stakeholders. Detail-oriented with a strong focus on accuracy and quality. Well-developed business acumen and a strong analytical, problem-solving attitude, with the ability to visualize scenarios, possible outcomes & operating constraints. Strong consulting skills with proven experience in client and stakeholder management and collaboration abilities. Good communication skills, both written and oral; ability to make impactful presentations & expertise in using Excel & PPTs. Detail-oriented with a commitment to quality and accuracy.
Good to have: knowledge of data security and controls to address customers’ data privacy needs in line with regional regulations such as GDPR, CCPA, etc. EY | Building a better working world: EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 week ago

Apply

8.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Job Description – Business Intelligence Developer (OAC, PowerBI, ETL, Data Modelling) Competency: Oracle ERP Analytics We are seeking an experienced Business Intelligence Developer with 8+ years of experience and expertise in Oracle Analytics Cloud (OAC), PowerBI, ETL tools, and Data Modelling to join our dynamic team. The successful candidate will be responsible for developing and maintaining scalable data models, creating insightful analytics dashboards, and managing ETL workflows to support data-driven decision-making across the organization. They will work closely with customers, data architects, software developers, and business analysts for suitable product development. The candidate will be a highly skilled individual and will be accountable for their career development and growth in EY. Responsibilities: Collaborate with stakeholders to understand data requirements and translate business needs into data models. Design and implement effective data models to support business intelligence activities. Develop and maintain ETL processes to ensure data accuracy and availability. Create interactive dashboards and reports using Oracle Analytics Cloud (OAC) and PowerBI. Work with stakeholders to gather requirements and translate business needs into technical specifications. Optimize data retrieval and develop dashboard visualizations for performance efficiency. Ensure data integrity and compliance with data governance and security policies. Collaborate with IT and data teams to integrate BI solutions into the existing data infrastructure. Conduct data analysis to identify trends, patterns, and insights that can inform business strategies. Provide training and support to end-users on BI tools and dashboards. Document all processes, models, and activities to maintain transparency and facilitate knowledge sharing. Stay up to date with the latest BI technologies and best practices to drive continuous improvement. Qualifications: Bachelor’s degree in Computer Science, Information Systems, Business Analytics, or a related field. Proven experience with Oracle Analytics Cloud (OAC), PowerBI, and other BI tools. Strong experience in ETL processes (SSIS, Informatica, Dell Boomi, etc.) and data warehousing solutions. Proficiency in data modelling techniques and best practices. Solid understanding of SQL and experience with relational databases. Familiarity with cloud platforms and services (e.g., AWS, Azure, Google Cloud). Excellent analytical, problem-solving, and project management skills. Ability to communicate complex data concepts to non-technical stakeholders. Detail-oriented with a strong focus on accuracy and quality. Well-developed business acumen and a strong analytical, problem-solving attitude, with the ability to visualize scenarios, possible outcomes & operating constraints. Strong consulting skills with proven experience in client and stakeholder management and collaboration abilities. Good communication skills, both written and oral; ability to make impactful presentations & expertise in using Excel & PPTs. Detail-oriented with a commitment to quality and accuracy.
Good to have: knowledge of data security and controls to address customers’ data privacy needs in line with regional regulations such as GDPR, CCPA, etc. EY | Building a better working world: EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 week ago

Apply

7.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Job Description – Business Intelligence Developer (OAC, PowerBI, ETL, Data Modelling) Competency: Oracle ERP Analytics We are seeking an experienced Business Intelligence Developer with 7+ years of experience and expertise in Oracle Analytics Cloud (OAC), PowerBI, ETL tools, and Data Modelling to join our dynamic team. The successful candidate will be responsible for developing and maintaining scalable data models, creating insightful analytics dashboards, and managing ETL workflows to support data-driven decision-making across the organization. They will work closely with customers, data architects, software developers, and business analysts for suitable product development. The candidate will be a highly skilled individual and will be accountable for their career development and growth in EY. Responsibilities: Collaborate with stakeholders to understand data requirements and translate business needs into data models. Design and implement effective data models to support business intelligence activities. Develop and maintain ETL processes to ensure data accuracy and availability. Create interactive dashboards and reports using Oracle Analytics Cloud (OAC) and PowerBI. Work with stakeholders to gather requirements and translate business needs into technical specifications. Optimize data retrieval and develop dashboard visualizations for performance efficiency. Ensure data integrity and compliance with data governance and security policies. Collaborate with IT and data teams to integrate BI solutions into the existing data infrastructure. Conduct data analysis to identify trends, patterns, and insights that can inform business strategies. Provide training and support to end-users on BI tools and dashboards. Document all processes, models, and activities to maintain transparency and facilitate knowledge sharing. Stay up to date with the latest BI technologies and best practices to drive continuous improvement. Qualifications: Bachelor’s degree in Computer Science, Information Systems, Business Analytics, or a related field. Proven experience with Oracle Analytics Cloud (OAC), PowerBI, and other BI tools. Strong experience in ETL processes (SSIS, Informatica, Dell Boomi, etc.) and data warehousing solutions. Proficiency in data modelling techniques and best practices. Solid understanding of SQL and experience with relational databases. Familiarity with cloud platforms and services (e.g., AWS, Azure, Google Cloud). Excellent analytical, problem-solving, and project management skills. Ability to communicate complex data concepts to non-technical stakeholders. Detail-oriented with a strong focus on accuracy and quality. Well-developed business acumen and a strong analytical, problem-solving attitude, with the ability to visualize scenarios, possible outcomes & operating constraints. Strong consulting skills with proven experience in client and stakeholder management and collaboration abilities. Good communication skills, both written and oral; ability to make impactful presentations & expertise in using Excel & PPTs. Detail-oriented with a commitment to quality and accuracy.
Good to have: knowledge of data security and controls to address customers’ data privacy needs in line with regional regulations such as GDPR, CCPA, etc. EY | Building a better working world: EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 week ago

Apply

8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Job Description – Business Intelligence Developer (OAC, PowerBI, ETL, Data Modelling) Competency: Oracle ERP Analytics We are seeking an experienced Business Intelligence Developer with 8+ years of experience and expertise in Oracle Analytics Cloud (OAC), PowerBI, ETL tools, and Data Modelling to join our dynamic team. The successful candidate will be responsible for developing and maintaining scalable data models, creating insightful analytics dashboards, and managing ETL workflows to support data-driven decision-making across the organization. They will work closely with customers, data architects, software developers, and business analysts for suitable product development. The candidate will be a highly skilled individual and will be accountable for their career development and growth in EY. Responsibilities: Collaborate with stakeholders to understand data requirements and translate business needs into data models. Design and implement effective data models to support business intelligence activities. Develop and maintain ETL processes to ensure data accuracy and availability. Create interactive dashboards and reports using Oracle Analytics Cloud (OAC) and PowerBI. Work with stakeholders to gather requirements and translate business needs into technical specifications. Optimize data retrieval and develop dashboard visualizations for performance efficiency. Ensure data integrity and compliance with data governance and security policies. Collaborate with IT and data teams to integrate BI solutions into the existing data infrastructure. Conduct data analysis to identify trends, patterns, and insights that can inform business strategies. Provide training and support to end-users on BI tools and dashboards. Document all processes, models, and activities to maintain transparency and facilitate knowledge sharing. Stay up to date with the latest BI technologies and best practices to drive continuous improvement. Qualifications: Bachelor’s degree in Computer Science, Information Systems, Business Analytics, or a related field. Proven experience with Oracle Analytics Cloud (OAC), PowerBI, and other BI tools. Strong experience in ETL processes (SSIS, Informatica, Dell Boomi, etc.) and data warehousing solutions. Proficiency in data modelling techniques and best practices. Solid understanding of SQL and experience with relational databases. Familiarity with cloud platforms and services (e.g., AWS, Azure, Google Cloud). Excellent analytical, problem-solving, and project management skills. Ability to communicate complex data concepts to non-technical stakeholders. Detail-oriented with a strong focus on accuracy and quality. Well-developed business acumen and a strong analytical, problem-solving attitude, with the ability to visualize scenarios, possible outcomes & operating constraints. Strong consulting skills with proven experience in client and stakeholder management and collaboration abilities. Good communication skills, both written and oral; ability to make impactful presentations & expertise in using Excel & PPTs. Detail-oriented with a commitment to quality and accuracy.
Good to have: knowledge of data security and controls to address customers’ data privacy needs in line with regional regulations such as GDPR, CCPA, etc. EY | Building a better working world: EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 week ago

Apply