15.0 - 20.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must have skills: SAP Master Data Migration
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are optimized for performance and usability. You will also engage in problem-solving activities, providing support and guidance to your team members while continuously seeking opportunities for improvement and innovation in application development.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with business goals.

Professional & Technical Skills:
- Must have skills: Proficiency in SAP Master Data Migration.
- Strong understanding of data governance and data quality principles.
- Experience with data mapping and transformation processes.
- Familiarity with SAP data management tools and methodologies.
- Ability to troubleshoot and resolve data migration issues efficiently.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP Master Data Migration.
- This position is based in Hyderabad.
- 15 years of full-time education is required.
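The data-quality skills this listing asks for (data mapping, troubleshooting migration issues) often come down to reconciliation checks after a load. A minimal sketch of such a check, using pandas; the `MATNR` key column and the sample frames are illustrative assumptions, not anything from the listing:

```python
import pandas as pd

def validate_migration(source: pd.DataFrame, target: pd.DataFrame, key: str) -> dict:
    """Basic post-load checks for a master data migration:
    record counts match, target keys are unique, no keys were dropped."""
    missing = set(source[key]) - set(target[key])
    return {
        "count_match": len(source) == len(target),
        "target_keys_unique": bool(target[key].is_unique),
        "missing_keys": sorted(missing),
    }

# Hypothetical material-master extracts; MATNR is the illustrative key column.
source = pd.DataFrame({"MATNR": ["M001", "M002", "M003"]})
target = pd.DataFrame({"MATNR": ["M001", "M002"]})
report = validate_migration(source, target, "MATNR")
print(report)
```

A real migration would run checks like this per object type and per field, but the count/uniqueness/completeness trio is the usual starting point.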
Posted 1 week ago
15.0 - 20.0 years
4 - 8 Lacs
Pune
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, load) processes to migrate and deploy data across systems.
Must have skills: IBM Netezza
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Continuously evaluate and improve data processes to enhance efficiency.

Professional & Technical Skills:
- Must have skills: Proficiency in IBM Netezza.
- Good to have skills: Experience with data warehousing concepts and practices.
- Strong understanding of ETL processes and data integration techniques.
- Familiarity with data modeling and database design principles.
- Experience with performance tuning and optimization of data queries.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in IBM Netezza.
- This position is based in Pune.
- 15 years of full-time education is required.
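The ETL work described here is essentially SQL-driven: extract from a source table, apply transformation rules, load into a target. A toy sketch of that shape; `sqlite3` stands in for a Netezza connection (which in practice would go through ODBC/JDBC), and the table names and the NULL-handling rule are invented for illustration:

```python
import sqlite3

# Minimal ETL sketch: extract from a source table, transform, load into a target.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src (id INTEGER, amount REAL)")
cur.executemany("INSERT INTO src VALUES (?, ?)", [(1, 10.0), (2, None), (3, 5.5)])

cur.execute("CREATE TABLE tgt (id INTEGER, amount REAL)")
# Transform during the load: replace NULL amounts with 0 (a simple data-quality rule).
cur.execute("INSERT INTO tgt SELECT id, COALESCE(amount, 0) FROM src")
conn.commit()

rows = cur.execute("SELECT id, amount FROM tgt ORDER BY id").fetchall()
print(rows)
```

On a warehouse appliance the same INSERT...SELECT pattern is where performance tuning (distribution keys, zone maps) comes into play, which is presumably what the "performance tuning and optimization of data queries" requirement refers to.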
Posted 1 week ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Informatica Developer (Informatica, Informatica PowerCenter)
Experience: 5 to 9 yrs
Notice period: Immediate to 30 days
Location: Hyderabad & Noida

Thanks & Regards,
Indumati N
Indumati.nakate@alphacom.in
https://www.linkedin.com/company/alphacom-systems-and-solutions-pvt-ltd/
www.alphacom.in
Posted 1 week ago
2.0 - 7.0 years
0 Lacs
India
On-site
WHO WE ARE
Sapaad is a global leader in all-in-one unified commerce platforms, dedicated to delivering world-class software solutions. Its flagship product, Sapaad, has seen tremendous success in the last decade, with thousands of customers worldwide and many more signing on. Driven by a team of passionate developers and designers, Sapaad is constantly innovating, introducing cutting-edge features that reshape the industry. Headquartered in Singapore, with offices across five countries, Sapaad is backed by technology veterans with deep expertise in web, mobility, and e-commerce, making it a key player in the tech landscape.

THE OPPORTUNITY
Sapaad PTE LTD is seeking a Data Engineer who will take charge of constructing our distributed processing and big data infrastructure, as well as the accompanying applications and tools. We're looking for someone with a fervent drive to tackle intricate data challenges and collaborate closely with the data team, all while staying abreast of the latest features and tools in Big Data and Data Science. The successful candidate will play a pivotal role in supporting software developers, data architects, data analysts, and data scientists in various data initiatives, and will ensure the smooth and efficient delivery of data across ongoing projects. We require an individual who is self-directed and capable of adeptly managing the data requirements of multiple teams, systems, and products.

ROLES AND RESPONSIBILITIES
- Create and maintain optimal data pipeline architecture.
- Assemble large, complex data sets that meet functional and non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS 'big data' technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into key business performance metrics.
- Work with stakeholders, including the Executive, Product, Data Architect, and Design teams, to assist with data-related technical issues and support their data infrastructure needs.
- Keep the data separated and secure.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics experts to strive for greater functionality in our data systems.

ROLE REQUIREMENTS
- 2 to 7 years of experience in a Data Engineer role, with a graduate degree in Computer Science, IT, Statistics, or another quantitative field.
- Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), and working familiarity with a variety of databases.
- Experience building and optimizing 'big data' data pipelines, architectures, and data sets.
- Strong analytic skills related to working with unstructured datasets.
- Ability to build processes supporting data transformation, data structures, metadata, dependency, and workload management.
- Working knowledge of message queuing, stream processing, and highly scalable 'big data' data stores.
- Good communication skills and a team player.
- Experience supporting and working with cross-functional teams in a dynamic environment.

Preferred experience with the following software/tools:
- Big data tools: Hadoop, Spark, Kafka, etc.
- Relational SQL and NoSQL databases.
- Data pipeline/ETL tools like Informatica and DataStage, and cloud tools like Azure Data Factory (ADF), etc.
- Data engineering on cloud services like Azure, AWS, or GCP.
- Stream-processing systems: Storm, Spark Streaming, etc.
- Object-oriented/object function scripting languages: Python, Java, Scala, etc.
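The stream-processing requirement above (Kafka, Storm, Spark Streaming) boils down to aggregating events over time windows. A pure-Python sketch of the tumbling-window idea those systems implement at scale; the event tuples and the 60-second window are illustrative assumptions:

```python
from collections import defaultdict

def windowed_counts(events, window_seconds=60):
    """Tumbling-window event counts: the core aggregation idea behind
    stream processors such as Spark Streaming, in plain Python.
    Each event is a (timestamp, key) pair."""
    counts = defaultdict(int)
    for timestamp, key in events:
        # Align the timestamp to the start of its window.
        window_start = timestamp - (timestamp % window_seconds)
        counts[(window_start, key)] += 1
    return dict(counts)

# Hypothetical event stream: seconds-since-start paired with an event type.
events = [(5, "order"), (42, "order"), (61, "refund"), (75, "order")]
result = windowed_counts(events)
print(result)
```

Real stream processors add the hard parts this sketch omits: out-of-order events, watermarks, and fault-tolerant state.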
Posted 1 week ago
2.0 - 4.0 years
7 - 11 Lacs
Hyderabad
Remote
We are hiring a Python-based Data Engineer to develop ETL processes and data pipelines.

Key Responsibilities:
- Build and optimize ETL/ELT data pipelines.
- Integrate APIs and large-scale data ingestion systems.
- Automate data workflows using Python and cloud tools.
- Collaborate with data science and analytics teams.

Required Qualifications:
- 2+ years in data engineering using Python.
- Familiarity with tools like Airflow, Pandas, and SQL.
- Experience with cloud data services (AWS/GCP/Azure).
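Since the listing names Pandas for ETL work, here is a minimal extract-transform-filter sketch of the kind of pipeline step it describes. The column names and the revenue rule are invented for illustration; a real pipeline would read from an API or warehouse rather than an in-memory CSV:

```python
import io
import pandas as pd

# Hypothetical CSV source standing in for an extracted file.
raw = io.StringIO("order_id,qty,unit_price\n1,2,9.5\n2,1,20.0\n3,4,2.5\n")

df = pd.read_csv(raw)                          # extract
df["revenue"] = df["qty"] * df["unit_price"]   # transform: derive a column
high_value = df[df["revenue"] >= 19.0]         # filter before load
print(high_value["order_id"].tolist())
```

In an Airflow deployment each of these steps would typically be a task in a DAG, with the load step writing `high_value` to a warehouse table.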
Posted 1 week ago
2.0 - 4.0 years
7 - 11 Lacs
Mumbai
Remote
We are hiring a Python-based Data Engineer to develop ETL processes and data pipelines.

Key Responsibilities:
- Build and optimize ETL/ELT data pipelines.
- Integrate APIs and large-scale data ingestion systems.
- Automate data workflows using Python and cloud tools.
- Collaborate with data science and analytics teams.

Required Qualifications:
- 2+ years in data engineering using Python.
- Familiarity with tools like Airflow, Pandas, and SQL.
- Experience with cloud data services (AWS/GCP/Azure).
Posted 1 week ago
2.0 - 4.0 years
7 - 11 Lacs
Kolkata
Remote
We are hiring a Python-based Data Engineer to develop ETL processes and data pipelines.

Key Responsibilities:
- Build and optimize ETL/ELT data pipelines.
- Integrate APIs and large-scale data ingestion systems.
- Automate data workflows using Python and cloud tools.
- Collaborate with data science and analytics teams.

Required Qualifications:
- 2+ years in data engineering using Python.
- Familiarity with tools like Airflow, Pandas, and SQL.
- Experience with cloud data services (AWS/GCP/Azure).
Posted 1 week ago
2.0 - 4.0 years
7 - 11 Lacs
Bengaluru
Remote
We are hiring a Python-based Data Engineer to develop ETL processes and data pipelines.

Key Responsibilities:
- Build and optimize ETL/ELT data pipelines.
- Integrate APIs and large-scale data ingestion systems.
- Automate data workflows using Python and cloud tools.
- Collaborate with data science and analytics teams.

Required Qualifications:
- 2+ years in data engineering using Python.
- Familiarity with tools like Airflow, Pandas, and SQL.
- Experience with cloud data services (AWS/GCP/Azure).
Posted 1 week ago
6.0 - 8.0 years
6 - 9 Lacs
Hyderabad
On-site
Informatica MDM – Sr. Data Engineer, DT-US PxE DnA

The Informatica MDM – Sr. Data Engineer is an integral part of the MDM engineering team focused on implementing master data management and governance solutions as part of the US Data Strategy initiative. The role requires expertise in master data management, data integration, data quality, and data governance, along with experience with Informatica platform solutions.

Work you will do
This is a unique opportunity to be part of a growing data operations and enterprise controls team that works on new and existing high-end data management platforms within Deloitte. You will be responsible for leading the development, implementation, and support of Informatica solutions that underpin organizational master data management and data governance capabilities in a complex hybrid, multi-cloud environment.

Responsibilities include:
- Understand the business rules and source data from multiple source systems.
- Plan and lead MDM requirement analysis sessions with business users.
- Coordinate MDM program activities for the Customer and Product data domains.
- Conduct source system analysis and data profiling for lifecycle management.
- Define measurable metrics and the minimal required attributes for each domain or subject area to support a robust and successful deployment of an MDM platform.
- Demonstrate business acumen and the ability to apply technology solutions to solve business problems.
- Automate and schedule cloud jobs to run daily, with email notifications for any failures and timeout errors.
- Document the sources, targets, and mappings.
- Maintain exposure to, and a conceptual understanding of, data integration tools and technologies such as Informatica PowerCenter, IICS CDI, CIH, CAI, Microsoft SSIS, and RDBMS concepts and methods within SQL Server.
- Perform testing using SQL queries to validate the data after loading.
- Maintain expertise in Informatica IDMC products and capabilities.
- Develop data quality and data governance solutions as part of an onshore/offshore team working in an agile environment.
- Execute work assignments and keep the team updated on status, issues, and challenges.
- Work closely with cross-functional teams; align and influence stakeholders and resources to deliver data solutions to market.
- Understand project requirements; communicate with application and vendor teams as required.
- Perform troubleshooting and problem resolution when needed.
- Work with other team members to provide 24/7 support for production issues and outages.
- Create and maintain technical documentation.
- Design, develop, and support our data applications and infrastructure, utilizing various technologies to process large volumes of data.
- Create solutions and environments that enable analytics and business intelligence capabilities.
- Implement data quality checks and standardization in the code.
- Deploy mappings that will run in a scheduled batch or real-time environment.
- Document all mappings, mapplets, and rules in detail and hand over the documentation to the customer.

Education: Bachelor's degree in Computer Science or Business Information Systems, MCA, or an equivalent degree.
Qualifications:
- 6-8 years of direct experience in MDM Hub 10.5 design, development, support, and operations with the Informatica MDM tool suite.
- Hands-on expertise with Informatica tools and utilities such as Data Controls.
- Hands-on experience developing with the Informatica Master Data Management tool.
- Extensive knowledge and experience in defining Master Data Management strategies; implementing MDM services and capabilities, Metadata Management, Data Governance, and Data Quality solutions.
- Familiarity with enterprise data integration technologies and methods such as IICS CDI, CAI, CIH, data replication, and data services.
- Experience with enterprise-level data analysis and integration work and/or providing data-focused systems integration solutions.
- Configure, customize, and develop inbound/outbound data loads.
- Build the data model on the Oracle customer data master and Informatica MDM.
- Build integrations and run match/merge processes.
- Able to manage customers and other technical stakeholders as well as support engineers.
- Able to lead the team along with other technical stakeholders.
- Understand and configure MDM components such as Merge Tasks, Hierarchies, the Provisioning Tool, external components (DaaS), ActiveVOS, the SOAP SIF API framework, the BES REST API framework, etc.
- Extensive experience with Informatica Workflow Manager and Workflow Monitor for creating and monitoring workflows, worklets, and sessions.
- Experience in maintenance and enhancement of existing data extraction, cleansing, transformation, and loading processes to improve efficiency.
- Experience with SQL Server stored procedures and with loading data into data warehouses/data marts using Informatica, SQL*Loader, and Export/Import utilities.
- Strong analytical, problem-solving, and multitasking skills, as well as communication and interpersonal skills.
- Strong verbal and written communication skills, with an ability to express complex business concepts in non-technical terms.
- Experience in technical delivery using Agile methodologies.
- Experience working with cross-functional teams in the delivery of new products or services.
- Strong collaboration skills; self-motivated, taking ownership of deliverables and responsibilities.
- Strong systems and application delivery ability.

The Team
Information Technology Services (ITS) helps power Deloitte's success. ITS drives Deloitte, which serves many of the world's largest, most respected organizations. We develop and deploy cutting-edge internal and go-to-market solutions that help Deloitte operate effectively and lead in the market. Our reputation is built on a tradition of delivering with excellence. The ~3,000 professionals in ITS deliver services including:
- Security, risk & compliance
- Technology support
- Infrastructure
- Applications
- Relationship management
- Strategy
- Deployment
- PMO
- Financials
- Communications

DT-US Product Engineering (PxE)
Product Engineering (PxE) is the internal software and applications development team responsible for delivering leading-edge technologies to Deloitte professionals. Their broad portfolio includes web and mobile productivity tools that empower our people to log expenses, enter timesheets, book travel, and more, anywhere, anytime. PxE enables our client service professionals through a comprehensive suite of applications across the business lines. In addition to application delivery, PxE offers full-scale design services, a robust mobile portfolio, cutting-edge analytics, and innovative custom development.

How you will grow
At Deloitte, we have invested a great deal to create a rich environment in which our professionals can grow. We want all our people to develop in their own way, playing to their own strengths as they hone their leadership skills.
And, as a part of our efforts, we provide our professionals with a variety of learning and networking opportunities, including exposure to leaders, sponsors, coaches, and challenging assignments, to help accelerate their careers along the way. No two people learn in exactly the same way. So, we provide a range of resources, including live classrooms, team-based learning, and eLearning. Deloitte University (DU): The Leadership Center in India, our state-of-the-art, world-class learning center in the Hyderabad office, is an extension of the DU in Westlake, Texas, and represents a tangible symbol of our commitment to our people's growth and development. Explore DU: The Leadership Center in India.

Benefits
At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Deloitte's culture
Our positive and supportive culture encourages our people to do their best work every day. We celebrate individuals by recognizing their uniqueness and offering them the flexibility to make daily choices that can help them to be healthy, centered, confident, and aware. We offer well-being programs and are continuously looking for new ways to maintain a culture that is inclusive, invites authenticity, leverages our diversity, and where our people excel and lead healthy, happy lives. Learn more about Life at Deloitte.

Corporate citizenship
Deloitte is led by a purpose: to make an impact that matters. This purpose defines who we are and extends to relationships with our clients, our people, and our communities. We believe that business has the power to inspire and transform. We focus on education, giving, skill-based volunteerism, and leadership to help drive positive social impact in our communities. Learn more about Deloitte's impact on the world.
Disclaimer: Please note that this description is subject to change based on business/engagement requirements and at the discretion of management. #EAG-Technology

Our purpose
Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Professional development
At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.

Benefits to help you thrive
At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment, and/or other criteria. Learn more about what working at Deloitte can mean for you.

Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Requisition code: 302583
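The match/merge processes named in the qualifications above are the heart of MDM: deciding which records describe the same real-world entity. In Informatica MDM this is configured declaratively, but the underlying idea can be sketched in a few lines of Python; the similarity measure, threshold, and sample records are illustrative assumptions only:

```python
from difflib import SequenceMatcher

def match_score(a: str, b: str) -> float:
    """Toy name-similarity score; real MDM match rules combine many
    attributes with tuned weights, not a single string ratio."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def merge_candidates(records, threshold=0.85):
    """Pair up records whose names are similar enough to review for merging."""
    pairs = []
    for i in range(len(records)):
        for j in range(i + 1, len(records)):
            if match_score(records[i]["name"], records[j]["name"]) >= threshold:
                pairs.append((records[i]["id"], records[j]["id"]))
    return pairs

# Hypothetical customer-master records.
records = [
    {"id": 1, "name": "Acme Corp"},
    {"id": 2, "name": "ACME Corp."},
    {"id": 3, "name": "Globex Ltd"},
]
print(merge_candidates(records))
```

Production match engines add blocking (so they never compare all pairs), survivorship rules for the merged golden record, and human review queues for borderline scores.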
Posted 1 week ago
3.0 years
6 - 9 Lacs
Hyderabad
On-site
Data Analytics Engineer – CL3

Role Overview:
As a Data Analytics Engineer, you will actively engage in your engineering craft, taking a hands-on approach to multiple high-visibility projects. Your expertise will be pivotal in delivering solutions that delight customers and users, while also driving tangible value for Deloitte's business investments. You will leverage your extensive engineering craftsmanship across multiple programming languages and modern frameworks, consistently demonstrating your strong track record in delivering high-quality, outcome-focused solutions. The ideal candidate will be a dependable team player, collaborating with cross-functional teams to design, develop, and deploy advanced software solutions.

Key Responsibilities:
- Outcome-Driven Accountability: Embrace and drive a culture of accountability for customer and business outcomes. Develop engineering solutions that solve complex problems with valuable outcomes, ensuring high-quality, lean designs and implementations.
- Technical Leadership and Advocacy: Serve as the technical advocate for products, ensuring code integrity, feasibility, and alignment with business and customer goals. Lead requirement analysis, component design, development, unit testing, integrations, and support.
- Engineering Craftsmanship: Maintain accountability for the integrity of code design, implementation, quality, data, and ongoing maintenance and operations. Stay hands-on, self-driven, and continuously learn new approaches, languages, and frameworks. Create technical specifications, and write high-quality, supportable, scalable code, ensuring all quality KPIs are met or exceeded. Demonstrate collaborative skills to work effectively with diverse teams.
- Customer-Centric Engineering: Develop lean engineering solutions through rapid, inexpensive experimentation to solve customer needs. Engage with customers and product teams before, during, and after delivery to ensure the right solution is delivered at the right time.
- Incremental and Iterative Delivery: Adopt a mindset that favors action and evidence over extensive planning. Utilize a learning-forward approach to navigate complexity and uncertainty, delivering lean, supportable, and maintainable solutions.
- Cross-Functional Collaboration and Integration: Work collaboratively with empowered, cross-functional teams including product management, experience, and delivery. Integrate diverse perspectives to make well-informed decisions that balance feasibility, viability, usability, and value. Foster a collaborative environment that enhances team synergy and innovation.
- Advanced Technical Proficiency: Possess deep expertise in modern software engineering practices and principles, including Agile methodologies and DevSecOps, to deliver daily product deployments using full automation from code check-in to production, with all quality checks across the SDLC lifecycle. Strive to be a role model, leveraging these techniques to optimize solutioning and product delivery. Demonstrate understanding of the full product development lifecycle, focusing on continuous improvement and learning.
- Domain Expertise: Quickly acquire domain-specific knowledge relevant to the business or product. Translate business/user needs, architectures, and data designs into technical specifications and code. Be a valuable, flexible, and dedicated team member, supportive of teammates, and focused on quality and tech-debt payoff.
- Effective Communication and Influence: Exhibit exceptional communication skills, capable of articulating complex technical concepts clearly and compellingly. Inspire and influence teammates and product teams through well-structured arguments and trade-offs supported by evidence. Create coherent narratives that align technical solutions with business objectives.
- Engagement and Collaborative Co-Creation: Engage and collaborate with product engineering teams at all organizational levels, including customers as needed.
Build and maintain constructive relationships, fostering a culture of co-creation and shared momentum towards achieving product goals. Align diverse perspectives and drive consensus to create feasible solutions.

The team:
US Deloitte Technology Product Engineering has modernized software and product delivery, creating a scalable, cost-effective model that focuses on value and outcomes and leverages a progressive and responsive talent structure. As Deloitte's primary internal development team, Product Engineering delivers innovative digital solutions to businesses, service lines, and internal operations with proven bottom-line results and outcomes. It helps power Deloitte's success. It is the engine that drives Deloitte, serving many of the world's largest, most respected companies. We develop and deploy cutting-edge internal and go-to-market solutions that help Deloitte operate effectively and lead in the market. Our reputation is built on a tradition of delivering with excellence.

Key Qualifications:
- A bachelor's degree in computer science, software engineering, or a related discipline. An advanced degree (e.g., MS) is preferred but not required; experience is the most relevant factor.
- Strong data engineering foundation with a deep understanding of data structures, algorithms, code instrumentation, etc.
- 3+ years of experience with data integration/governance technologies such as Informatica IDMC (CDI, CAI, CDGC, CDMP/Marketplace), PowerCenter, IICS, and Metadata Manager.
- Hands-on experience with Secure Agent management (installation, configuration, monitoring, upgrades, automation).
- Proficiency in data governance tools: Informatica Metadata Manager, CDGC, CDMP (Marketplace).
- Experience with metadata management, data lineage, and impact analysis.
- Strong knowledge of data governance principles, frameworks, and best practices.
- Proficiency in scripting (Python, Bash, Shell) for automation.
- Strong skills in writing complex SQL queries, procedures, etc., using MS SQL Server or similar databases.
- Familiarity with cloud platforms (Google Cloud, Azure) and containerization (Kubernetes).
- Strong preference will be given to candidates with experience in AI/ML and GenAI.
- Excellent interpersonal and organizational skills, with the ability to handle diverse situations, complex projects, and changing priorities, behaving with passion, empathy, and care.

How you will grow:
At Deloitte, our professional development plans focus on helping people at every level of their career to identify and use their strengths to do their best work every day and excel in everything they do.

Our purpose
Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Professional development
At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people.
Explore DU: The Leadership Center in India.

Benefits to help you thrive
At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment, and/or other criteria. Learn more about what working at Deloitte can mean for you.

Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Requisition code: 306871
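The data lineage and impact analysis skills this role asks for reduce to a graph question: given a changed upstream dataset, which downstream datasets can be affected? A small sketch of that traversal; the table names and the lineage map are invented for illustration (tools like Informatica Metadata Manager maintain this graph automatically):

```python
# Toy lineage graph: each dataset maps to the datasets built directly from it.
lineage = {
    "crm_accounts": ["stg_accounts"],
    "stg_accounts": ["dim_customer"],
    "dim_customer": ["sales_mart", "churn_features"],
}

def downstream(node, graph):
    """Depth-first walk collecting every dataset reachable from `node`,
    i.e. the impact set of a change to that dataset."""
    seen = set()
    stack = [node]
    while stack:
        current = stack.pop()
        for child in graph.get(current, []):
            if child not in seen:
                seen.add(child)
                stack.append(child)
    return sorted(seen)

print(downstream("crm_accounts", lineage))
```

Reversing the edges gives the complementary question (upstream provenance: where did this dashboard's data come from?), which is the other half of lineage analysis.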
Posted 1 week ago
3.0 - 4.0 years
6 - 9 Lacs
Hyderabad
On-site
Summary
We are seeking a highly skilled and experienced Marketing Cloud Testing professional to join our Marketing Automation team, which works closely with brand teams; understands various data sources; is adept at building data ingestion pipelines; and is skilled in testing end-to-end data ingestion layers, data models, and visualization dashboards based on previously built test scripts.

About the Role
Key Responsibilities:
- Build end-to-end test scripts for each release based on user epics across the data value chain: ingestion, data model, and visualization.
- Post development, run the test scripts using a testing platform such as Proton.
- Document results, highlight any bugs/errors to the development team, and work closely with the development team to resolve the issues.
- Audit technical developments and solutions and validate that source data matches MCI.
- Create and update knowledge documents in the repository as needed.
- Work closely with the Technical Lead and Business Analysts to help design the testing strategy and test design as part of pre-build activities.
- Participate in data exploration and data mapping activities, along with the technical lead and business and DDIT architects, for any new data ingestion needs from the business, together with the development team.
- Build and maintain standard SOPs to run smooth operations that enable proper upkeep of visualization data and insights.

Qualifications:
- Minimum of 3-4 years of hands-on development experience in Dataroma/MCI.
- Prior experience in a visualization platform such as Tableau, Qlik, or Power BI as a core developer is a plus.
- Experience working on Data Cloud and other data platforms is a plus.
- Hands-on experience using ETL tools such as Informatica, Alteryx, or DataIKU preferred.
- Prior experience with testing automation platforms preferred.
- Excellent written and verbal skills; strong interpersonal and analytical skills.
- Ability to provide efficient, timely, reliable, and courteous service to customers.
- Ability to effectively present information
- Demonstrated knowledge of the Data Engineering & Business Intelligence ecosystem
- Salesforce MCI certification
- Familiarity with AppExchange deployment, Flow, Aura components and Lightning Web Components is a plus
Commitment to Diversity and Inclusion: Novartis is committed to building an outstanding, inclusive work environment and diverse teams representative of the patients and communities we serve.
Accessibility and accommodation: Novartis is committed to working with and providing reasonable accommodation to individuals with disabilities. If, because of a medical condition or disability, you need a reasonable accommodation for any part of the recruitment process, or in order to perform the essential functions of a position, please send an e-mail to diversityandincl.india@novartis.com and let us know the nature of your request and your contact information. Please include the job requisition number in your message.
Why Novartis: Helping people with disease and their families takes more than innovative science. It takes a community of smart, passionate people like you. Collaborating, supporting and inspiring each other. Combining to achieve breakthroughs that change patients' lives. Ready to create a brighter future together? https://www.novartis.com/about/strategy/people-and-culture
Join our Novartis Network: Not the right Novartis role for you?
Sign up to our talent community to stay connected and learn about suitable career opportunities as soon as they come up: https://talentnetwork.novartis.com/network
Benefits and Rewards: Read our handbook to learn about all the ways we'll help you thrive personally and professionally: https://www.novartis.com/careers/benefits-rewards
Division: US
Business Unit: Universal Hierarchy Node
Location: India
Site: Hyderabad (Office)
Company / Legal Entity: IN10 (FCRS = IN010) Novartis Healthcare Private Limited
Functional Area: Marketing
Job Type: Full time
Employment Type: Regular
Shift Work: No
Posted 1 week ago
1.0 - 2.0 years
1 - 5 Lacs
Gurgaon
On-site
Job Description
Alimentation Couche-Tard Inc. (ACT) is a global Fortune 200 company and a leader in the convenience store and fuel space, with over 16,700 stores across 31 countries and territories. The Circle K India Data & Analytics team is an integral part of ACT's Global Data & Analytics team, and the Associate ML Ops Analyst will be a key player on this team, helping grow analytics globally at ACT. The hired candidate will partner with multiple departments, including Global Marketing, Merchandising, Global Technology, and Business Units.
About the role
The incumbent will be responsible for implementing Azure data services to deliver scalable and sustainable solutions, and for building model deployment and monitoring pipelines to meet business needs.
Roles & Responsibilities
Development and Integration
- Collaborate with data scientists to deploy ML models into production environments
- Implement and maintain CI/CD pipelines for machine learning workflows
- Use version control tools (e.g., Git) and ML lifecycle management tools (e.g., MLflow) for model tracking, versioning, and management
- Design, build and optimize containerized applications, orchestrated with Docker and Kubernetes, on cloud platforms like AWS or Azure
Automation & Monitoring
- Automate pipelines using Apache Spark and ETL tools like Informatica PowerCenter, Informatica BDM or DEI, StreamSets and Apache Airflow
- Implement model monitoring and alerting systems to track model performance, accuracy, and data drift in production environments
Collaboration and Communication
- Work closely with data scientists to ensure that models are production-ready
- Collaborate with Data Engineering and Tech teams to ensure infrastructure is optimized for scaling ML applications
Optimization and Scaling
- Optimize ML pipelines for performance and cost-effectiveness
Operational Excellence
- Help the Data teams leverage best practices to implement enterprise-level solutions
- Follow industry coding standards and the programming life cycle to ensure standard practices across the project
- Help define common coding standards and model-monitoring best practices
- Continuously evaluate the latest packages and frameworks in the ML ecosystem
- Build automated model deployment and data engineering pipelines from plain Python/PySpark code
Stakeholder Engagement
- Collaborate with Data Scientists, Data Engineers, and cloud platform and application engineers to create and implement cloud policies and governance for the ML model life cycle
Job Requirements
Education & Relevant Experience
- Bachelor's degree required, preferably with a quantitative focus (Statistics, Business Analytics, Data Science, Math, Economics, etc.)
- Master's degree preferred (MBA/MS Computer Science/M.Tech Computer Science, etc.)
- 1-2 years of relevant working experience in MLOps
Behavioural Skills
- Delivery excellence
- Business disposition
- Social intelligence
- Innovation and agility
Knowledge
- Core computer science concepts such as common data structures and algorithms, and OOP
- Programming languages (R, Python, PySpark, etc.)
- Big data technologies & frameworks (AWS, Azure, GCP, Hadoop, Spark, etc.)
- Enterprise reporting systems, relational (MySQL, Microsoft SQL Server, etc.) and non-relational (MongoDB, DynamoDB) database management systems, and Data Engineering tools
- Exposure to ETL tools and version control
- Experience building and maintaining CI/CD pipelines for ML models
- Understanding of machine learning, information retrieval or recommendation systems
- Familiarity with DevOps tools (Docker, Kubernetes, Jenkins, GitLab)
#LI-DS1
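The model-monitoring responsibility mentioned above (tracking data drift in production) can be illustrated with a tiny, framework-free sketch: flag a feature whose production mean has shifted far outside the training distribution. The data, feature, and alert threshold are all invented for the example; a production pipeline would typically use a library-backed statistic (e.g. PSI or KS) rather than this simple mean-shift score.

```python
# Illustrative data-drift check: how many training standard deviations the
# production mean has moved. Values and thresholds are made up for the sketch.
import statistics

def drift_score(train_values, prod_values):
    """Shift of the production mean, measured in training standard deviations."""
    mu = statistics.mean(train_values)
    sigma = statistics.stdev(train_values)
    return abs(statistics.mean(prod_values) - mu) / sigma

train = [10.0, 11.0, 9.5, 10.5, 10.2, 9.8]
prod_ok = [10.1, 10.4, 9.9]
prod_drifted = [14.0, 15.2, 14.8]

print(drift_score(train, prod_ok))        # small shift -> no alert
print(drift_score(train, prod_drifted))   # large shift -> raise alert
```

In a real deployment this score would feed the alerting system rather than stdout.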
Posted 1 week ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description
About Us
At Bank of America, we are guided by a common purpose to help make financial lives better through the power of every connection. Responsible Growth is how we run our company and how we deliver for our clients, teammates, communities, and shareholders every day. One of the keys to driving Responsible Growth is being a great place to work for our teammates around the world. We're devoted to being a diverse and inclusive workplace for everyone. We hire individuals with a broad range of backgrounds and experiences and invest heavily in our teammates and their families by offering competitive benefits to support their physical, emotional, and financial well-being. Bank of America believes both in the importance of working together and offering flexibility to our employees. We use a multi-faceted approach for flexibility, depending on the various roles in our organization. Working at Bank of America will give you a great career with opportunities to learn, grow and make an impact, along with the power to make a difference. Join us!
Global Business Services
Global Business Services delivers Technology and Operations capabilities to Lines of Business and Staff Support Functions of Bank of America through a centrally managed, globally integrated delivery model and globally resilient operations. Global Business Services is recognized for flawless execution, sound risk management, operational resiliency, operational excellence, and innovation. In India, we are present in five locations and operate as BA Continuum India Private Limited (BACI), a non-banking subsidiary of Bank of America Corporation and the operating company for India operations of Global Business Services.
Development
Build FinTech solutions for banking, trading, and finance across all segments of the global market.
These include award-winning web & mobile applications, data science and analytics, complex event processing, cloud solutions, low-latency applications, and responsive experiences.
- Work with global development teams and business partners across the USA, UK, Europe and Asia Pacific
- Capture and translate business/functional requirements for banking, trading and markets
- Good problem-solving and quantitative skills
- Design and architect solutions based on requirements or on your innovative ideas
- Develop software in agile and iterative cycles using continuous improvement tools and techniques
- Test software using test-driven development and embedded QA teams
- Identify, escalate, and resolve incidents and issues
- Participate in innovation programs, developer forums, hackathons
- Good written and verbal communication skills with a positive attitude
We work on cutting-edge technologies like AI, Machine Learning, Hadoop, Python, Scala, Pega, .NET, Java, Angular, React, Cassandra, memSQL, Tableau and ETL, among several others.
Business Analysis
Change enabler in an organizational context, defining needs and recommending solutions that deliver value to clients.
- Good problem-solving and quantitative skills
- Work closely with the business to capture requirements
- Analyze business and functional requirements provided by the business
- Document functional and operational impacts to associates and customers
- Assist in completion and documentation of designs (functional and technical)
- Provide expert knowledge on assigned application(s), functionality and associate/customer processes
- Develop expert knowledge of business processes, rules, and regulations
- Document the interaction of data, functions and business processes for selected functionality
- Prepare the analysis schedule
- Conduct the feasibility study of the current system
- Track issues / reporting
- Good written and verbal communication skills with a positive attitude
- Opportunity to utilize tools like Microsoft Visio (diagramming) and cutting-edge change management / wireframing tools (mockups)
Testing
Functional & technical specialist in discovering the unexpected and bringing confidence in software
- Good problem-solving and quantitative skills
- Verify that the application meets all functional business requirements
- Ensure that all component changes are tested against areas impacted and that solutions work from an integration/operations perspective
- Include the scope, test cycles, risks, regression testing approach, environment requirements, data requirements, metrics, and work plan
- Develop test conditions and build test scripts based on functional design specifications and the test approach
- Confirm the architectural stability of the system with a focus on functional, load, fail-over/recoverability and operational testing.
- In some systems, also monitor, measure, and optimize individual and combined hardware and/or software components for optimal performance
- Perform unit testing and component integration testing
- Design and develop the technical test approach, load tests, fail-over and recoverability tests, and operational tests
- Document and execute test scripts and report execution progress
- Identify and escalate stoppers/concerns/issues to the project management team early
- Ability to work as a team player in an agile way of working
- Serve as a quality gatekeeper for application releases
- Opportunity to validate applications using the latest tools & technologies like Selenium, Appium, SpecFlow, Lettuce, Cucumber, UFT, qTest, LoadRunner, SOA Tester, TOSCA, Test Complete, Java, Python, VBScript & JIRA
Infrastructure Operations
Infrastructure & environment control specialists supporting all streams
- Support the efforts of development teams through development and testing environment creation, hardware and software configuration, build and migration coordination, and technical support
- Handle escalated production support issues
- Configure software for supporting specific developer applications
- Coordinate the migration of configuration changes across environments
- Migrate code from component integration test to systems integration test
- Install and configure server applications
- Track issues
- Good written and verbal communication skills with a positive attitude
- Opportunity to handle SVN, Citrix, Informatica, Autosys, SQL servers, Coral 8, TeamCity, Jenkins, AS 400, Unix, Oracle
Production Support
Front face of the IT department and an all-rounder in support
- Provide application support to the production environment
- Maintain detailed support processes and an operations framework to ensure application availability 24/7
- Production control to ensure applications are available and running at peak efficiency
- All aspects required to process batch production within application services
- Proactively monitor application availability, performance, response time, exceptions, faults and failures using a range of proprietary as well as third-party monitoring tools
- Provide usage trend analysis and status reports
- Be part of incident triages, providing relevant information and proper communication to stakeholders
- Good written and verbal communication skills with a positive attitude
- Opportunity to monitor & control using Geneos, Citrix, Sybase Central, SQL Server, Coral 8, Tibco, Quartz, BOB job monitor, Appwatch, PEGA
Cyber Security Defense and Assessment
Front face for Cyber Security events and incidents, and an all-rounder in technical & operational support
- Regular analysis of Cyber Security information
- Replying to general Cyber Security queries
- Assisting in Cyber Security investigations
- Supporting Identity and Access Management
- Identifying vulnerabilities in Cyber Security which require remediation
- Recording and responding to Cyber Security events and incidents in a timely fashion
- Reviewing, monitoring and maintaining Cyber Security controls and their implementation
- Auditing systems, services and processes against policy, best practice and standards in a methodical and clearly documented fashion
- Opportunity to work on different Cyber Security tools, like DLP products, Data Classification tools, Splunk, and SIEM tools, e.g., ArcSight
Cyber Security Technology
- Responsible for defining, documenting, and publicizing the strategic roadmap for various cyber security technology stacks at Bank of America
- Contributing to the development of innovative software capabilities to secure Bank products using DevSecOps pipelines and automation
- Participating in rapid prototyping and product security software research and development projects
- Innovating new software-based capabilities to secure software containers from internal and external cyber-attacks, able to detect, respond, and recover without human intervention or mission degradation
- Participating in the development of algorithms, interfaces and designs for cyber-secure and resilient software systems
- Performing collaborative design & development with other engineers and suppliers
- Joining a team performing cyber risk assessments and developing risk mitigation plans
- Performing analysis of systems and components for risks, vulnerabilities, and threats
- Supporting incident response and mitigation
- Monitoring networks for security breaches and investigating violations when they occur
- Developing security standards and best practices
- Assisting with maintaining a strong cybersecurity posture
- Assisting in developing new policies, design processes and procedures, and developing technical designs to secure the development environment and trainer systems
- Assessing system vulnerabilities, implementing risk mitigation strategies, validating secure systems, and testing security products and systems to detect security weaknesses
We work on cutting-edge technologies like Machine Learning, Hadoop, Python, Scala, Pega, .NET, Java, Angular, React, Cassandra, Tableau and ETL, among several others, with exposure to web application security and secure platform development.
Job Locations: Mumbai, Chennai, Gurugram, Gandhinagar (GIFT), Hyderabad.
Campus Hiring Eligibility for students is as listed below: ✓ Final year Graduates from the Class of 2025 ONLY ✓ Must Have Major Specialization in Computer Science & Information Technology ONLY ✓ Must have scored 60% in the last semester OR CGPA of 6 on a scale of 10 in the last semester ✓ No Active Backlogs in any of the current or prior semesters Campus Job Description - Tech ✓ Students should be willing to join any of the roles/skills/segment as per company requirement ✓ Students should be willing to work in any shifts/night shifts as per company requirement ✓ Students should be willing to work in any locations namely – Mumbai, Chennai, Gurugram, Gandhinagar (GIFT), Hyderabad as per company requirement
Posted 1 week ago
5.0 years
6 - 10 Lacs
Noida
On-site
Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.
Primary Responsibilities:
- Become part of the Operations, Support & Maintenance team
- Review existing scripts and code to debug, fix and enhance them; write intermediate-level SQL and MongoDB queries to support customers with their issues and enhancement requests
- Support applications/products/platforms during testing & post-production
- Develop new code/scripts (not heads-down development)
- Analyze & report on data; manage data (data validation, data clean-up)
- Monitor scheduled jobs and take proactive action; resolve job failures and communicate to stakeholders
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
Required Qualifications:
- Bachelor's degree
- 5+ years of relational database experience (Oracle, SQL Server, DB2, etc.)
- 3+ years of ETL tool experience (Talend, SQL Server SSIS, Informatica, etc.)
- 3+ years of programming experience (e.g., Java, JavaScript, Visual Basic, etc.)
- 2+ years of experience with NoSQL (e.g., MongoDB)
- 2+ years of experience in the SDLC development process
- 1+ years of experience with a job scheduler (e.g., Rundeck, Tivoli)
- Thorough understanding of production control, such as the change control process
- Thorough understanding of REST API services
Preferred Qualifications:
- Understanding of vaults such as CyberArk and HashiCorp
- Document management such as Nuxeo
- Version control tooling (Git)
- Healthcare terminology
- Atlassian tools such as JIRA / Bitbucket / Crowd / Confluence
- ETL/ELT tools such as Fivetran
- Understanding of Agile methodology
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
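The "intermediate-level MongoDB queries" mentioned in the responsibilities can be illustrated by the filter, projection, and sort documents such a query is built from. The collection and field names below are invented for the example; with a live pymongo connection the documents would be passed to `find(...).sort(...)`.

```python
# Hypothetical MongoDB query documents: find failed jobs in a date window,
# project a few fields, newest first. Names are illustrative only.
from datetime import datetime

query = {
    "status": "FAILED",
    "run_date": {
        "$gte": datetime(2024, 1, 1),
        "$lt": datetime(2024, 2, 1),
    },
}
projection = {"_id": 0, "job_name": 1, "status": 1, "error_message": 1}
sort_spec = [("run_date", -1)]

# With a live connection this would be roughly:
#   db.job_runs.find(query, projection).sort(sort_spec)
print(query["status"], sort_spec)
```

The same shape (filter + projection + sort) covers most support-style lookups; `$gte`/`$lt` ranges on dates are the usual way to scope an incident window.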
Posted 1 week ago
8.0 years
4 - 6 Lacs
Noida
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
Job Description - Business Intelligence Developer (OAC, Power BI, ETL, Data Modelling)
Competency: Oracle ERP Analytics
We are seeking an experienced Business Intelligence Developer with 8+ years of experience and expertise in Oracle Analytics Cloud (OAC), Power BI, ETL tools, and data modelling to join our dynamic team. The successful candidate will be responsible for developing and maintaining scalable data models, creating insightful analytics dashboards, and managing ETL workflows to support data-driven decision-making across the organization. They will work closely with customers, data architects, software developers, and business analysts on product development. The candidate will be a highly skilled individual, accountable for their career development and growth at EY.
Responsibilities:
- Collaborate with stakeholders to understand data requirements and translate business needs into data models
- Design and implement effective data models to support business intelligence activities
- Develop and maintain ETL processes to ensure data accuracy and availability
- Create interactive dashboards and reports using Oracle Analytics Cloud (OAC) and Power BI
- Gather requirements from stakeholders and translate business needs into technical specifications
- Optimize data retrieval and develop dashboard visualizations for performance efficiency
- Ensure data integrity and compliance with data governance and security policies
- Collaborate with IT and data teams to integrate BI solutions into the existing data infrastructure
- Conduct data analysis to identify trends, patterns, and insights that can inform business strategies
- Provide training and support to end users on BI tools and dashboards
- Document all processes, models, and activities to maintain transparency and facilitate knowledge sharing
- Stay up to date with the latest BI technologies and best practices to drive continuous improvement
Qualifications:
- Bachelor's degree in Computer Science, Information Systems, Business Analytics, or a related field
- Proven experience with Oracle Analytics Cloud (OAC), Power BI, and other BI tools
- Strong experience with ETL processes (SSIS, Informatica, Dell Boomi, etc.) and data warehousing solutions
- Proficiency in data modelling techniques and best practices
- Solid understanding of SQL and experience with relational databases
- Familiarity with cloud platforms and services (e.g., AWS, Azure, Google Cloud)
- Excellent analytical, problem-solving, and project management skills
- Ability to communicate complex data concepts to non-technical stakeholders
- Detail-oriented with a strong focus on accuracy and quality
- Well-developed business acumen and a strong analytical, problem-solving attitude, with the ability to visualize scenarios, possible outcomes & operating constraints
- Strong consulting skills, with proven experience in client and stakeholder management and collaboration
- Good communication skills, both written and oral; ability to make impactful presentations; expertise with Excel & PowerPoint
- Good to have: knowledge of data security and controls to address customers' data privacy needs, in line with regional regulations such as GDPR, CCPA, etc.
EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets.
Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 week ago
2.0 - 4.0 years
6 - 10 Lacs
Kolkata
Remote
We're hiring a Data Analyst with expert-level SQL skills to deliver critical insights and business intelligence.
Key Responsibilities:
- Analyze large datasets using advanced SQL queries
- Build reports and dashboards for business stakeholders
- Identify patterns, anomalies, and business trends
- Collaborate with engineering and product teams
Required Qualifications:
- 2+ years in data analysis with SQL as a core skill
- Proficiency in writing complex joins, CTEs, and window functions
- Familiarity with reporting tools (e.g., Tableau; Power BI is a plus)
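The CTE and window-function skills listed above can be shown in one small query. This sketch runs against an in-memory SQLite database so it is self-contained; the table and column names are invented for the example.

```python
# Self-contained illustration of a CTE plus a window function:
# find each region's best sales month. Data is made up for the sketch.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE sales (region TEXT, month TEXT, amount INT);
    INSERT INTO sales VALUES
        ('North', '2024-01', 100), ('North', '2024-02', 150),
        ('South', '2024-01', 80),  ('South', '2024-02', 60);
""")

rows = con.execute("""
    WITH ranked AS (                                        -- CTE
        SELECT region, month, amount,
               RANK() OVER (PARTITION BY region
                            ORDER BY amount DESC) AS rnk    -- window function
        FROM sales
    )
    SELECT region, month, amount FROM ranked
    WHERE rnk = 1
    ORDER BY region
""").fetchall()

print(rows)
```

The same `RANK() OVER (PARTITION BY ... ORDER BY ...)` pattern carries over unchanged to SQL Server, Postgres, or a BI tool's SQL layer.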
Posted 1 week ago
2.0 - 4.0 years
6 - 10 Lacs
Bengaluru
Remote
We're hiring a Data Analyst with expert-level SQL skills to deliver critical insights and business intelligence.
Key Responsibilities:
- Analyze large datasets using advanced SQL queries
- Build reports and dashboards for business stakeholders
- Identify patterns, anomalies, and business trends
- Collaborate with engineering and product teams
Required Qualifications:
- 2+ years in data analysis with SQL as a core skill
- Proficiency in writing complex joins, CTEs, and window functions
- Familiarity with reporting tools (e.g., Tableau; Power BI is a plus)
Posted 1 week ago
2.0 - 4.0 years
6 - 10 Lacs
Mumbai
Remote
We're hiring a Data Analyst with expert-level SQL skills to deliver critical insights and business intelligence.
Key Responsibilities:
- Analyze large datasets using advanced SQL queries
- Build reports and dashboards for business stakeholders
- Identify patterns, anomalies, and business trends
- Collaborate with engineering and product teams
Required Qualifications:
- 2+ years in data analysis with SQL as a core skill
- Proficiency in writing complex joins, CTEs, and window functions
- Familiarity with reporting tools (e.g., Tableau; Power BI is a plus)
Posted 1 week ago
2.0 - 4.0 years
6 - 10 Lacs
Hyderabad
Remote
We're hiring a Data Analyst with expert-level SQL skills to deliver critical insights and business intelligence.
Key Responsibilities:
- Analyze large datasets using advanced SQL queries
- Build reports and dashboards for business stakeholders
- Identify patterns, anomalies, and business trends
- Collaborate with engineering and product teams
Required Qualifications:
- 2+ years in data analysis with SQL as a core skill
- Proficiency in writing complex joins, CTEs, and window functions
- Familiarity with reporting tools (e.g., Tableau; Power BI is a plus)
Posted 1 week ago
7.0 years
0 Lacs
Pune, Maharashtra, India
Remote
Job Title: Manager/Sr Manager - ETL - PySpark
Requisition ID:
Job Location: Pune
Job Summary:
This role will be responsible for developing and maintaining data models to support data warehouse and reporting requirements. It requires a strong background in data engineering, excellent leadership capabilities, and the ability to drive projects to successful completion.
Job Responsibilities:
- Working experience building Data Lake and DWH architecture on the Databricks platform
- Engage with the client to participate in requirement gathering, give status updates on work and UAT, and be the key partner in the overall engagement
- Participate in ETL design, using any Python framework, of new or changing mappings and workflows with the team, and prepare technical specifications
- Craft ETL mappings, mapplets, workflows and worklets using Informatica PowerCenter
- Write complex SQL queries with performance tuning and optimization
- Handle tasks independently and lead the team
- Responsible for unit testing, integration testing and UAT as and when required
- Good communication skills
- Coordinate with cross-functional teams to ensure project objectives are met
- Collaborate with data architects and engineers to design and implement data models
- Manage projects in a fast-paced agile ecosystem, ensuring quality deliverables within stringent timelines
- Responsible for risk management, maintaining the risk documentation and mitigation plans
- Drive continuous improvement in a Lean/Agile environment, implementing DevOps delivery approaches encompassing CI/CD, build automation and deployments
- Communication & Logical Thinking: demonstrates strong analytical skills, employing a systematic and logical approach to data analysis, problem-solving, and situational assessment; capable of effectively presenting and defending team viewpoints, while securing buy-in from both technical and client stakeholders
- Handle Client Relationship: manage client relationships and client expectations independently, and deliver results back to the client independently; excellent communication skills
Job Requirements:
- 7+ years of working experience in ETL & data warehousing
- Advanced knowledge of PySpark/Python, pandas and NumPy frameworks
- Minimum 4 years of extensive experience in the design, build and deployment of Spark/PySpark for data integration
- Deep experience developing data processing tasks using PySpark, such as reading data from external sources, merging data, performing data enrichment and loading into target data destinations
- Create Spark jobs for data transformation and aggregation
- Spark query tuning and performance optimization; good understanding of different file formats (ORC, Parquet, Avro) and compression techniques to optimize queries/processing
- Deep understanding of distributed systems (e.g., CAP theorem, partitioning, replication, consistency, and consensus)
- Experience in modular programming & robust programming methodologies
- ETL knowledge, with ETL development using any PySpark/Python framework
- Advanced SQL knowledge
- Ability to perform multiple tasks in a continually changing environment
- Experience with Redshift/Synapse/Snowflake preferable
- Good understanding of and experience with the SDLC phases: requirements specification, analysis, design, implementation, testing, deployment and maintenance
Qualification: BE/B.Tech/M.Tech/MBA
Must-have skills:
- Expertise in the pharma commercial domain
- Proficiency in ETL using PySpark
- Strong experience in data warehousing
Skills that give you an edge:
- Experience in AWS or Azure cloud and its service offerings
- Excellent interpersonal/communication skills (both oral and written), with the ability to communicate at various levels with clarity & precision
We will provide (Employee Value Proposition):
- An inclusive environment that encourages diverse perspectives and ideas
- Challenging and unique opportunities to contribute to the success of a transforming organization
- Opportunity to work on technical challenges that may impact across geographies
- Vast opportunities for self-development: online Axtria Institute, knowledge-sharing opportunities globally, learning opportunities through external certifications
- Sponsored Tech Talks & Hackathons
- Possibility of relocating to any Axtria office for short- and long-term projects
- Benefit package: health benefits, retirement benefits, paid time off, flexible benefits, hybrid / full-time office / remote
Axtria is an equal-opportunity employer that values diversity and inclusiveness in the workplace.
Who we are: Axtria's 14-year journey | Axtria, Great Place to Work | Life at Axtria | Axtria Diversity
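The ingest, merge, enrich, load flow described in the requirements can be sketched in plain Python so it runs anywhere; in the role itself each step would be a PySpark DataFrame operation (`spark.read`, `join`, `withColumn`, `write`). All data and field names below are invented for the illustration.

```python
# Plain-Python sketch of a PySpark-style pipeline step:
# left-join fact rows to reference data (merge), then derive a column (enrich).
orders = [
    {"order_id": 1, "cust_id": "C1", "amount": 120.0},
    {"order_id": 2, "cust_id": "C2", "amount": 80.0},
]
customers = {"C1": {"segment": "Retail"}, "C2": {"segment": "Hospital"}}

enriched = []
for o in orders:
    row = dict(o)
    # Merge: look up the customer's reference attributes (left join semantics).
    row["segment"] = customers.get(o["cust_id"], {}).get("segment", "Unknown")
    # Enrich: derive a band from the order amount.
    row["band"] = "HIGH" if o["amount"] >= 100 else "LOW"
    enriched.append(row)

for row in enriched:
    print(row["order_id"], row["segment"], row["band"])
```

In PySpark the same two steps would be `orders_df.join(customers_df, "cust_id", "left")` followed by `withColumn("band", when(col("amount") >= 100, "HIGH").otherwise("LOW"))`, before writing to the target destination.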
Posted 1 week ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
About Atos
Atos is a global leader in digital transformation with c. 78,000 employees and annual revenue of c. €10 billion. European number one in cybersecurity, cloud and high-performance computing, the Group provides tailored end-to-end solutions for all industries in 68 countries. A pioneer in decarbonization services and products, Atos is committed to a secure and decarbonized digital for its clients. Atos is a SE (Societas Europaea) and listed on Euronext Paris. The purpose of Atos is to help design the future of the information space. Its expertise and services support the development of knowledge, education and research in a multicultural approach and contribute to the development of scientific and technological excellence. Across the world, the Group enables its customers and employees, and members of societies at large, to live, work and develop sustainably in a safe and secure information space.
Roles & Responsibilities
- Strong Informatica PowerCenter development experience
- Performance tuning skills with SQL queries and Informatica
- Identify and analyse data discrepancies and data quality issues
- Proficient in designing and developing mappings, transformations, sessions and workflows, and deploying integration solutions on Informatica PowerCenter
- Strong experience with Microsoft SQL Server; SQL/PL-SQL, database design and programming skills are required
Requirements
- Primary skills: Informatica PowerCenter, SQL
- Secondary / good-to-have skills: insurance domain experience
- Total experience range: 3-5 years
Our Offering
- Global cutting-edge IT projects that shape the future of digital and have a positive impact on the environment
- Wellbeing programs & work-life balance: integration and passion-sharing events
- Attractive salary and company initiative benefits
- Courses and conferences
- Hybrid work culture
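The SQL performance-tuning skill called out above can be illustrated in a tool-agnostic way: run the same lookup before and after adding an index and inspect the query plan. The sketch uses SQLite's `EXPLAIN QUERY PLAN` because it is self-contained; on SQL Server the equivalent check would be an execution plan. Table and index names are invented.

```python
# Before/after illustration of index-driven tuning, using SQLite's
# EXPLAIN QUERY PLAN. Names and data are made up for the example.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE policy (policy_id INT, holder TEXT)")
con.executemany("INSERT INTO policy VALUES (?, ?)",
                [(i, f"holder{i}") for i in range(1000)])

def plan(sql):
    """Return the query plan detail text for a statement."""
    return " ".join(r[3] for r in con.execute("EXPLAIN QUERY PLAN " + sql))

q = "SELECT holder FROM policy WHERE policy_id = 500"
p1 = plan(q)   # expect a full-table SCAN
con.execute("CREATE INDEX idx_policy_id ON policy(policy_id)")
p2 = plan(q)   # expect SEARCH ... USING INDEX
print(p1)
print(p2)
```

The habit being demonstrated (read the plan, then change the physical design and re-read it) is the same one applied when tuning Informatica source qualifiers or stored-procedure SQL.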
Posted 1 week ago
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Role: Manager / Sr Manager - MDM
Experience: 7-12 years
Job Location: Gurgaon/Noida/Bangalore/Hyderabad

Your responsibilities include, but are not limited to:
- Participate in overall architecture, capacity planning, development, and implementation of Master Data Management (MDM) solutions
- Use MDM technologies and tools across the enterprise to enable the management and integration of master data
- Understand the technical landscape, both current and desired future state
- Assess the current-state architecture and understand current business processes for managing master data
- Assess the functional and non-functional requirements of the desired future-state MDM solution
- Prepare the to-be architecture, including data ingestion, data quality rules, data model, match/merge, workflows, UI, batch integration and real-time services
- Extensive hands-on experience in installation and configuration of core Informatica MDM Hub components such as Hub Console, Hub Store, Hub Server, Cleanse/Match Server and Cleanse Adapter
- Deliver full-lifecycle MDM projects for clients, including data modeling, metadata management, and design and configuration of matching, merging, standardization, cleansing and deduplication rules
- Create design documents and data models addressing business needs for the client MDM environment; contribute to reusable assets and accelerators for MDM platforms
- Integrate and transfer data across multiple systems, streamlining data processes and providing access to MDM data across the enterprise
- Make technology decisions related to the client MDM environment; interpret requirements and architect MDM solutions
- Provide subject matter expertise on data architecture and data integration implementations across various downstream systems
- Coordinate with project managers and participate in project planning and recurring meetings
- Collaborate with other team members to review prototypes and develop iterative revisions

Must have Skills:
- 5-12 years of experience, with hands-on experience working on MDM projects in one or more MDM tools such as Informatica or Reltio, and expertise in defining matching, merging and survivorship rules
- Strong commercial knowledge of key business processes and compliance requirements within the pharma industry, across multiple master data domains such as Physician and Product
- Hands-on experience with industry data quality tools such as Informatica IDQ and IBM Data Quality
- Proficient in reading and understanding data models, with experience working with data and databases
- Strong technical experience in Master Data Management, metadata management, data quality, data governance, data integration (ETL) and data security
- Experience with all stages of the MDM SDLC: planning, designing, building, deploying and maintaining scalable, highly available, mission-critical enterprise-wide applications for large enterprises
- Experience integrating MDM with data warehouses and data lakes
- Excellent query-writing skills, with working knowledge of Oracle, SQL Server and other major databases
- Good knowledge of SOA/real-time integration, the pub-sub model, and data integration with various CRM systems such as Veeva and Siebel
- Expertise in engaging with business users to understand business requirements and articulate the value proposition
- Experience working with 3rd-party data providers such as IQVIA, SHS and Veeva
- Solid experience configuring 3rd-party address standardization tools like Address Doctor or Loqate
- Excellent communication skills, both written and verbal, and innovative presentation skills

Education: BE/B.Tech, MCA, M.Sc., M.
Tech, MBA with 60%+

Why Axtria:
Axtria (www.Axtria.com) is a new-age software product unicorn, the first of its kind in providing cloud software and data analytics to the Life Sciences industry globally. We help Life Sciences companies transform the product commercialization journey to drive sales growth and improve healthcare outcomes for patients. We are acutely aware that our work impacts millions of patients, and we lead passionately to improve their lives. Since our founding in 2010, technology innovation has been our winning differentiation, and we continue to leapfrog the competition with platforms that deploy Artificial Intelligence and Machine Learning. Our cloud-based platforms - Axtria DataMAX™, Axtria InsightsMAX™, Axtria SALESIQ™, Axtria CUSTOMERIQ™ and Axtria MarketingIQ - enable customers to efficiently manage data, leverage data science to deliver insights for sales and marketing planning, and manage end-to-end commercial operations. With customers in over 20 countries, Axtria is one of the biggest global commercial solutions providers in the Life Sciences industry. We continue to win industry recognition for growth and are featured in some of the most aspirational lists: INC 5000, Deloitte FAST 500, NJBiz FAST 50, SmartCEO Future 50, Red Herring 100, and several other growth and technology awards.

Axtria is looking for exceptional talent to join our rapidly growing global team. People are our biggest perk! Our transparent and collaborative culture offers a chance to work with some of the brightest minds in the industry. Our data analytics and software platforms support data science, commercial operations, and cloud information management. We enable commercial excellence through our cloud-based sales planning and operations platform, and we are leaders in managing data using the latest cloud information management and big data technologies. Axtria Institute, our in-house university, offers the best training in the industry and an opportunity to learn in a structured environment. A customized career progression plan ensures every associate is set up for success and able to do meaningful work in a fun environment. We want our legacy to be the leaders we produce for the industry. 3500+ employees worldwide, growing rapidly and strengthening our product engineering team; we expect to almost double our India headcount in the coming year.
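The matching, merging and survivorship rules this role defines can be illustrated in miniature. The sketch below is plain Python assuming a simple "latest non-null value wins" survivorship policy; the record layout, field names and sources are invented for illustration, not any specific Informatica or Reltio configuration:

```python
from datetime import date

# Two source records matched to the same physician, as might arrive from a
# CRM and a claims feed. All names and values are illustrative.
records = [
    {"source": "crm",    "updated": date(2024, 3, 1),
     "name": "Dr. A. Rao",    "npi": "1234567890", "specialty": None},
    {"source": "claims", "updated": date(2024, 6, 15),
     "name": "Dr. Anil Rao",  "npi": "1234567890", "specialty": "Cardiology"},
]

def survive(matched, fields):
    """Build a golden record: for each field, take the most recently
    updated non-null value across the matched source records."""
    golden = {}
    for field in fields:
        candidates = [r for r in matched if r[field] is not None]
        candidates.sort(key=lambda r: r["updated"], reverse=True)
        golden[field] = candidates[0][field] if candidates else None
    return golden

golden = survive(records, ["name", "npi", "specialty"])
print(golden)
# {'name': 'Dr. Anil Rao', 'npi': '1234567890', 'specialty': 'Cardiology'}
```

Real MDM hubs let you configure this per attribute (source-system trust ranking, recency, completeness); the point here is only that survivorship is a per-field selection over the matched cluster.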
Posted 1 week ago
7.0 years
7 - 8 Lacs
Bengaluru, Karnataka, India
On-site
Job Title: Associate Analyst - MDM Operations
Location: Bangalore / Pune
Shift: Open to shifts

Key Responsibilities
- Master data maintenance and cleansing (Vendor, Customer, Material, Pricing) in SAP/S4
- Execute create, update and change processes for master data
- Perform data audits, validations, and reconciliation
- Support data migration using LSMW, LTMC, Winshuttle
- Prepare data design documents (data models, standards, CRUD matrix)
- Collaborate on projects and process improvement initiatives

Skills & Experience
- 2-7 years in Master Data Management
- Strong SAP functional knowledge (tables, T-codes, MDG)
- Familiar with ETL, data transformation and error handling
- Advanced MS Excel skills (pivots, INDEX/MATCH, etc.)
- Exposure to MDM tools like Stibo, Collibra, Informatica (optional)
- Knowledge of the P2P process (added advantage)

Education
- BE in Mechanical, Electrical, or Electronics
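Much of the cleansing work in a vendor or customer master comes down to grouping near-duplicate records under a normalized match key. A minimal Python sketch of that idea, with invented vendor names and a deliberately simple normalization rule (real SAP/MDG cleansing rules are far richer):

```python
import re
from collections import defaultdict

# Toy vendor master rows; the duplicates differ only in case, punctuation
# and legal-form suffixes. IDs and names are invented for illustration.
vendors = [
    ("V001", "Acme Tools Pvt. Ltd."),
    ("V002", "ACME TOOLS PVT LTD"),
    ("V003", "Bharat Bearings"),
]

def match_key(name):
    """Uppercase, strip punctuation, and drop a few common legal suffixes."""
    key = re.sub(r"[^A-Z0-9 ]", "", name.upper())
    key = re.sub(r"\b(PVT|LTD|LLP|INC)\b", "", key)
    return " ".join(key.split())

# Group vendor IDs by match key; any group with more than one ID is a
# duplicate cluster that a data audit would flag for review.
groups = defaultdict(list)
for vendor_id, name in vendors:
    groups[match_key(name)].append(vendor_id)

duplicates = {k: ids for k, ids in groups.items() if len(ids) > 1}
print(duplicates)  # {'ACME TOOLS': ['V001', 'V002']}
```

The same pattern scales to Excel (a helper column with the normalized key plus a pivot/COUNTIF) or to a dedicated MDM tool's match rules.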
Posted 1 week ago
10.0 - 15.0 years
27 - 30 Lacs
Nagpur, Pune
Work from Office
- 10+ years in ETL / Data Engineering
- Strong in SQL, Python, Unix Shell, PL/I
- Tools: DataStage, Informatica, Databricks, Talend
- Experience with AWS, Spark, Hadoop
- Certifications (CSM, CSPO) and an MTech in Data Science are a plus

Required Candidate Profile
- Strong in SQL, Python, Unix Shell, PL/I
- Certifications (CSM, CSPO) and an MTech in Data Science are a plus
Posted 1 week ago
8.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Mercer Marsh Benefits
Mercer Marsh Benefits is seeking candidates to support its vision of driving value to clients through data- and analytics-based insights. The following position is based in Mumbai.

Mercer Marsh Benefits Analytics - Senior Principal Engineer / Manager - Data Quality Engineer

Mercer Marsh Benefits™ (MMB) is part of the Marsh McLennan family, bringing together a broad spectrum of expertise to help clients navigate the complex world of people risks, cost management and employee benefits. MMB is a global leader in the health and benefits marketplace. Operating in 135 countries, our team of specialists designs benefits solutions that meet the needs of businesses and their people, drawing from global intelligence and adapting that wealth of experience to local markets. Mercer Marsh Benefits Analytics is the specialized and technologically advanced data analytics outfit curated to provide data-driven insights to our clients in the health and employee benefits space.

What can you expect?
- Joining a rapidly growing organization with an environment that fosters personal and professional development
- Opportunity to learn new tools and technology
- Participate in building a solid data-driven foundation for the organization
- Work on breakthrough data and analytics products and projects that will create significant value for our clients
- Opportunity to make a difference in the world by being part of the health and benefits industry
- A chance to work with industry leaders and global clients, with access to the latest trends in the industry

What is in it for you?
Discover what's great about working at Marsh and McLennan Companies - from the opportunities that our size brings to our commitment to our communities - and understand the benefits you'll receive.
We are four businesses with one PURPOSE: building the confidence to thrive through the power of perspective. As a global leader in insurance broking and risk management, we are devoted to finding diverse individuals who are committed to the success of our clients and our organization. Joining us will provide a solid foundation for you to accelerate your career in the risk and insurance industry. We can promise you extraordinary challenges, extraordinary colleagues, and the opportunity to make a difference. Our rich history has created a client service culture that we believe is second to none. Our commitments to diversity and inclusion, corporate social responsibility, and sustainability demonstrate our commitment to stand for what is right.

As a Marsh and McLennan Company colleague, you will also receive additional benefits such as:
- A competitive salary
- Employee-friendly policies
- Health care and insurance for you and your dependents
- A healthy work-life balance
- A great working environment
- Flexible benefits packages to suit your needs and lifestyle
- Future career opportunities across a global organization

We will count on you to:
Bring your experience to help mature data quality offerings and lead data quality initiatives across the MMB business. This role reports to the MMB Analytics Data Governance Leader. Roles and responsibilities will include:

Data Quality Management:
- Support the team in implementing and driving an end-to-end data quality framework and operating model using relevant DQ tools and capabilities
- Help the business drive governance associated with data quality by enforcing policies and standards at both the technical and functional level
- Understand the data landscape, data flows and respective regulatory requirements throughout the organization, working with regions to overcome bottlenecks
- Implement strategies to improve data quality for various analytics products and platforms using methods such as data profiling, data cleansing, data enrichment and data validation
- Work with the MMB business to understand and build data quality rules that address end-to-end business and data governance requirements

Data Quality Monitoring & Issue Remediation:
- Develop solutions for automated DQ checks and threshold alerts
- Establish data quality metrics and data quality audits; support issue identification, root cause analysis (RCA) and remediation
- Use data profiling techniques to identify DQ issues, leveraging data quality dimensions
- Provide recommendations and guidelines for periodic data quality checks, and drive governance to ensure long-term data trust and delivery of meaningful analytics
- Create scorecards and a reporting process to support data governance councils in monitoring progress and tracking blockers; track KPIs around DQ dimensions

Data Quality Implementation:
- Support projects with end-to-end DQ rules implementation and development activities

Collaboration:
- Collaborate with data engineers, data analysts, and data scientists to ensure data quality across all stages of the data lifecycle
- Develop relationships with stakeholders in various markets (APAC, UK & Europe, Latin America, Middle East & Canada) to drive and enforce data quality initiatives
- Work with data stewards (business and technical) to assess and apply the latest developments in data management and standards
- Maintain MMB data and its metadata; bring clarity to what the data means, who owns it, where it is stored, and what its quality is
- Assist with data classification, data retention and disposal policies
- Define and translate data security requirements into data policies and rules to meet data privacy requirements
- Support stakeholders in improving data literacy, and advocate adoption of data governance and data quality management

What you need to have:
- 8+ years of hands-on experience in data profiling, DQ rules implementation, DQ issue management, data quality management, DQ metrics reporting and automation activities
- Proven experience as a Data Quality Team Lead, Data Quality Engineer or similar role
- Master's or bachelor's degree in information sciences/engineering, or equivalent
- Strong experience with an enterprise data quality and cataloging tool (Informatica IDQ, IDMC Cloud Data Quality/Cloud Data Integration, IICS, Talend, Databricks, etc.) preferred
- Familiarity with AWS, Azure, or GCP and their associated DQ capabilities; understanding of cloud-based data storage and data pipeline architecture
- Familiarity with languages such as SQL, Python and R
- AI/ML: a basic understanding of AI/ML algorithms can help in building predictive models for data quality
- Excellent written and verbal communication skills, with demonstrated experience working with international stakeholders

What makes you stand out:
- Strong functional and technical skills in data quality management
- Experience in the healthcare/insurance industry
- Experience building relationships with stakeholders across the globe
- Demonstrated experience executing data governance processes and improving data quality for analytics through the right tools and capabilities
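The automated DQ checks and threshold alerts described in this role reduce to a small recurring pattern: compute a metric per column against a data quality dimension, then alert when it falls below a threshold. A sketch in plain Python using the completeness dimension; the rows, column names and 95% threshold are invented for illustration, not MMB's actual framework:

```python
# Toy member-benefit rows; fields and values are invented for illustration.
rows = [
    {"member_id": "M1", "dob": "1990-04-01", "plan": "Gold"},
    {"member_id": "M2", "dob": None,         "plan": "Silver"},
    {"member_id": "M3", "dob": "1985-11-20", "plan": None},
    {"member_id": "M4", "dob": "1978-02-14", "plan": "Gold"},
]

def completeness(rows, column):
    """Fraction of rows with a non-null value in `column`
    (the completeness dimension of data quality)."""
    filled = sum(1 for r in rows if r[column] is not None)
    return filled / len(rows)

# Profile every column, then raise an alert for any score below threshold.
THRESHOLD = 0.95  # illustrative; real thresholds are agreed with the business
report = {c: completeness(rows, c) for c in ("member_id", "dob", "plan")}
alerts = [c for c, score in report.items() if score < THRESHOLD]

print(report)  # {'member_id': 1.0, 'dob': 0.75, 'plan': 0.75}
print(alerts)  # ['dob', 'plan']
```

Tools like Informatica IDQ or Talend package exactly this loop (plus validity, uniqueness, consistency and other dimensions) behind configurable rules and scorecards; scheduling the check and routing `alerts` into a ticketing flow gives the issue-remediation pipeline the listing describes.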
Posted 1 week ago