Jobs
Interviews

23587 Etl Jobs

Set up a job alert
JobPe aggregates results for easy access, but you apply directly on the employer's job portal.

5.0 years

0 Lacs

India

On-site

Coursera was launched in 2012 by Andrew Ng and Daphne Koller with a mission to provide universal access to world-class learning. It is now one of the largest online learning platforms in the world, with 183 million registered learners as of June 30, 2025. Coursera partners with over 350 leading university and industry partners to offer a broad catalog of content and credentials, including courses, Specializations, Professional Certificates, and degrees. Coursera’s platform innovations enable instructors to deliver scalable, personalized, and verified learning experiences to their learners. Institutions worldwide rely on Coursera to upskill and reskill their employees, citizens, and students in high-demand fields such as GenAI, data science, technology, and business. Coursera is a Delaware public benefit corporation and a B Corp.

Join us in our mission to create a world where anyone, anywhere can transform their life through access to education. We're seeking talented individuals who share our passion and drive to revolutionize the way the world learns. At Coursera, we are committed to building a globally diverse team and are thrilled to extend employment opportunities to individuals in any country where we have a legal entity. We require candidates to possess eligible working rights and a compatible timezone overlap with their team to facilitate seamless collaboration. Coursera is committed to enabling flexibility and workspace choices for employees. Our interviews and onboarding are entirely virtual, providing a smooth and efficient experience for our candidates. As an employee, you can select your main way of working, whether from home, one of our offices or hubs, or a co-working space near you.

Job Overview: Does architecting high-quality, scalable data pipelines that power business-critical applications excite you?
How about working with cutting-edge technologies alongside some of the brightest and most collaborative individuals in the industry? Join us in our mission to bring the best learning to every corner of the world! We’re looking for a passionate and talented individual with a keen eye for data to join the Data Engineering team at Coursera. Data Engineering plays a crucial role in building a robust and reliable data infrastructure that enables data-driven decision-making, as well as various data analytics and machine learning initiatives within Coursera. In addition, Data Engineering today owns many external-facing data products that drive revenue and boost partner and learner satisfaction.

You firmly believe in Coursera's potential to make a significant impact on the world, and align with our core values:
- Learners first: Champion the needs, potential, and progress of learners everywhere.
- Play for team Coursera: Excel as an individual and win as a team. Put Coursera’s mission and results before personal goals.
- Maximize impact: Increase leverage by focusing on things that produce bigger results with less effort.
- Learn, change, and grow: Move fast, take risks, innovate, and learn quickly. Invite and offer feedback with respect, courage, and candor.
- Love without limits: Celebrate the diversity and dignity of every one of our employees, learners, customers, and partners.

Your Responsibilities:
- Architect scalable data models and construct high-quality ETL pipelines that act as the backbone of our core data lake, with cutting-edge technologies such as Airflow, DBT, Databricks, Redshift, and Spark. Your work will lay the foundation for our data-driven culture.
- Design, build, and launch self-serve analytics products. Your creations will empower our internal and external customers, providing them with rich insights to make informed decisions.
- Be a technical leader for the team. Your guidance in technical and architectural designs for major team initiatives will inspire others.
- Help shape the future of Data Engineering at Coursera and foster a culture of continuous learning and growth.
- Partner with data scientists, business stakeholders, and product engineers to define, curate, and govern high-fidelity data.
- Develop new tools and frameworks in collaboration with other engineers. Your innovative solutions will enable our customers to understand and access data more efficiently, while adhering to high standards of governance and compliance.
- Work cross-functionally with product managers, engineers, and business teams to enable major product and feature launches.

Your Skills:
- 5+ years of experience in data engineering, with expertise in data architecture and pipelines
- Strong programming skills in Python
- Proficiency with relational databases, data modeling, and SQL
- Experience with big data technologies (e.g., Hive, Spark, Presto)
- Familiarity with batch and streaming architectures preferred
- Hands-on experience with some of: AWS, Databricks, Delta Lake, Airflow, DBT, Redshift, Datahub, Elementary
- Knowledge of data governance and compliance best practices
- Ability to communicate technical concepts clearly and concisely
- Independence and a passion for innovation and learning new technologies

If this opportunity interests you, you might like these courses on Coursera:
- Big Data Specialization
- Data Warehousing for Business Intelligence
- IBM Data Engineering Professional Certificate

Coursera is an Equal Employment Opportunity Employer and considers all qualified applicants without regard to race, color, religion, sex, sexual orientation, gender identity, age, marital status, national origin, protected veteran status, disability, or any other legally protected class. If you are an individual with a disability and require a reasonable accommodation to complete any part of the application process, please contact us at accommodations@coursera.org. For California candidates, please review our CCPA Applicant Notice here.
For our Global Candidates, please review our GDPR Recruitment Notice here.
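Pipelines like those described in this posting are typically modeled as directed acyclic graphs (DAGs) of dependent tasks, which is the core abstraction in Airflow. As a rough, stdlib-only sketch of the idea (not Airflow itself, and with hypothetical task names), a scheduler simply runs each task once all of its upstream tasks have finished:

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical daily ETL pipeline: each task maps to its upstream dependencies.
pipeline = {
    "extract_events": set(),
    "extract_users": set(),
    "transform_sessions": {"extract_events", "extract_users"},
    "load_warehouse": {"transform_sessions"},
}

def run_task(name, log):
    # Stand-in for real extract/transform/load work.
    log.append(name)

def run_pipeline(pipeline):
    log = []
    # static_order() yields tasks so that dependencies always come first.
    for task in TopologicalSorter(pipeline).static_order():
        run_task(task, log)
    return log

order = run_pipeline(pipeline)
print(order)  # upstream tasks always precede downstream ones
```

In Airflow the same dependency graph would be declared with operators and the `>>` operator, with the scheduler additionally handling retries, backfills, and monitoring.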

Posted 5 hours ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

The BI Data Engineer is a key role within the Enterprise Data team. We are looking for an expert Azure data engineer with deep data engineering, ADF integration, and database development experience. This is a unique opportunity to be involved in delivering leading-edge business analytics using the latest cutting-edge BI tools, such as cloud-based databases, self-service analytics, and leading visualisation tools, enabling the company’s aim to become a fully digital organisation.

Job Description:

Key Responsibilities:
- Build enterprise data engineering and integration solutions using the latest Azure platform: Azure Data Factory, Azure SQL Database, Azure Synapse, and Azure Fabric
- Develop enterprise ETL and integration routines using ADF
- Evaluate emerging data engineering technologies, standards, and capabilities
- Partner with business stakeholders, product managers, and data scientists to understand business objectives and translate them into technical solutions
- Work with DevOps, engineering, and operations teams to implement CI/CD pipelines and ensure smooth deployment of data engineering solutions

Required Skills and Experience

Technical Expertise:
- Expertise in the Azure platform, including Azure Data Factory, Azure SQL Database, Azure Synapse, and Azure Fabric
- Exposure to Databricks and lakehouse architecture and technologies
- Extensive knowledge of data modeling, ETL processes, and data warehouse design principles
- Experience with machine learning and AI services in Azure

Professional Experience:
- 5+ years of experience in database development using SQL
- 5+ years of integration and data engineering experience
- 5+ years of experience using Azure SQL DB, ADF, and Azure Synapse
- 2+ years of experience using Power BI
- Comprehensive understanding of data modelling
- Relevant certifications in data engineering, machine learning, or AI

Key Competencies:
- Expertise in data engineering and database development.
- Familiarity with Microsoft Fabric technologies, including OneLake, Lakehouse, and Data Factory
- Strong understanding of data governance, compliance, and security frameworks
- Proven ability to drive innovation in data strategy and cloud solutions
- A deep understanding of business intelligence workflows and the ability to align technical solutions with them
- Strong database design skills, including an understanding of both normalised-form and dimensional-form databases
- In-depth knowledge and experience of data-warehousing strategies and techniques, e.g., Kimball data warehousing
- Experience with cloud-based data integration tools such as Azure Data Factory
- Experience with Azure DevOps or Jira is a plus
- Experience working with finance data is highly desirable
- Familiarity with agile development techniques and objectives

Location: Pune
Brand: Dentsu
Time Type: Full time
Contract Type: Permanent
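The Kimball-style dimensional warehousing mentioned above splits flat source records into fact and dimension tables joined by surrogate keys. A toy, stdlib-only sketch of that split (table names, columns, and data are all hypothetical):

```python
# Hypothetical flat sales records, as they might land from an ETL extract.
sales = [
    {"order_id": 1, "customer": "Acme", "country": "IN", "amount": 120.0},
    {"order_id": 2, "customer": "Acme", "country": "IN", "amount": 80.0},
    {"order_id": 3, "customer": "Globex", "country": "UK", "amount": 200.0},
]

def build_star_schema(rows):
    """Split flat rows into a customer dimension and a sales fact table."""
    dim_customer = {}  # natural key (customer, country) -> surrogate key
    fact_sales = []
    for row in rows:
        key = (row["customer"], row["country"])
        if key not in dim_customer:
            dim_customer[key] = len(dim_customer) + 1  # assign surrogate key
        # The fact row keeps measures plus a foreign key into the dimension.
        fact_sales.append({
            "order_id": row["order_id"],
            "customer_key": dim_customer[key],
            "amount": row["amount"],
        })
    return dim_customer, fact_sales

dim, fact = build_star_schema(sales)
```

In a real warehouse the same shape would be materialised as `DimCustomer` and `FactSales` tables, with the ETL (e.g., an ADF pipeline) performing the key lookups.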

Posted 6 hours ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Talent Worx is thrilled to announce an exciting opportunity for the roles of Snowflake and Spark Developers! Join us in revolutionizing the data analytics landscape as we partner with one of the Big 4 firms in India.

What impact will you make? Your contributions will play a vital role in shaping our clients' success stories by utilizing innovative technologies and frameworks. Envision a dynamic culture that supports inclusion, collaboration, and exceptional performance. With us, you will discover unrivaled opportunities to accelerate your career and achieve your goals.

The Team: In our Analytics & Cognitive (A&C) practice, you will find a dedicated team committed to unlocking the value hidden within large datasets. Our globally connected network ensures that our clients gain actionable insights that support fact-driven decision-making, leveraging advanced techniques including big data, cloud computing, cognitive capabilities, and machine learning.

Work you will do: As a key player in our organization, you will contribute directly to enhancing our clients' competitive positioning and performance with innovative and sustainable solutions. We expect you to collaborate closely with our teams and clients to deliver outstanding results across various projects.
Requirements:
- 5+ years of relevant experience in Spark and Snowflake, with practical experience in at least one project implementation
- Strong experience in developing ETL pipelines and data processing workflows using Spark
- Expertise in Snowflake architecture, including data loading and unloading processes, table structures, and virtual warehouses
- Proficiency in writing complex SQL queries in Snowflake for data transformation and analysis
- Experience with data integration tools and techniques, ensuring the seamless ingestion of data
- Familiarity with building and monitoring data pipelines in a cloud environment
- Exposure to Agile methodology and tools like Jira and Confluence
- Strong analytical and problem-solving skills, with meticulous attention to detail
- Excellent communication and interpersonal skills to foster collaboration with clients and team members
- Ability to travel as required by project demands

Qualifications:
- Snowflake certification or equivalent qualification is a plus
- Prior experience working with both Snowflake and Spark in a corporate setting
- Formal education in Computer Science, Information Technology, or a related field
- Proven track record of working with cross-functional teams

Benefits:
- Work with one of the Big 4 firms in India
- Healthy work environment
- Work-life balance
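To give a flavor of the transformations such Spark/Snowflake pipelines run, here is a stdlib-only sketch of a grouped aggregation, the kind of logic one would express with Spark's `groupBy().agg()` or a Snowflake `GROUP BY` (the data and column names are made up):

```python
from collections import defaultdict

# Hypothetical raw events, as a Spark DataFrame or a Snowflake stage might hold.
events = [
    {"user": "a", "category": "video", "seconds": 30},
    {"user": "a", "category": "video", "seconds": 45},
    {"user": "b", "category": "quiz", "seconds": 10},
]

def total_seconds_by_user(rows):
    """Equivalent of: SELECT user, SUM(seconds) FROM events GROUP BY user."""
    totals = defaultdict(int)
    for row in rows:
        totals[row["user"]] += row["seconds"]
    return dict(totals)

print(total_seconds_by_user(events))  # {'a': 75, 'b': 10}
```

In Spark this shuffle-and-aggregate happens in parallel across executors; in Snowflake the equivalent SQL runs inside a virtual warehouse, but the logical operation is the same.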

Posted 6 hours ago

Apply

6.0 years

4 - 6 Lacs

Hyderābād

On-site

We are seeking a Senior Data Engineer for our Marketing team at Thomson Reuters. You will design and develop our data transformation initiatives as we build the data foundation that drives our marketing strategy, enhancing our internal and external customer experiences and personalization. This is a mission-critical role with substantial scope, complexity, and executive visibility, and a large opportunity for impact. You will play a critical role in ensuring that customer data is effectively managed and utilized to drive business insights, facilitating informed decision-making and helping Thomson Reuters rapidly scale our digital customer experiences.

About the Role

In this role as a Senior Data Engineer, you will:
- Independently own and manage assigned projects and meet deadlines, clearly communicating progress and barriers to your manager and stakeholders.
- Serve as a visible subject matter expert on our Customer Data Platform, maintaining up-to-date awareness of industry trends, cutting-edge technologies, and best practices on relevant topics including unified customer profiles, deterministic and probabilistic matching, identity graphs, and data enrichment.
- Design and implement data ingestion pipelines to collect and ingest customer data into the Customer Data Platform from various sources. This involves setting up data pipelines, APIs, and ETL (Extract, Transform, Load) processes.
- Create and design data models, schemas, and database structures in Snowflake and the Customer Data Platform.
- Carry out comprehensive data analysis across system sources to yield enhanced insights into customer behavior and preferences.
- Gather and analyze data from various touchpoints, including online interactions, transactional systems, and customer feedback channels, creating a comprehensive customer profile that presents a 360-degree view.
- Ensure the launch of new data, segmentation, and profile capabilities, as well as evolutions of the platform, go smoothly.
This includes testing, post-launch monitoring, and overall setup for long-term success.
- Collaborate with marketers and other stakeholders to understand their data needs and translate those needs into technical requirements.
- Actively identify and propose innovations in data practices that evolve capabilities, improve efficiency or standardization, and better support stakeholders.

Shift Timings: 2 PM to 11 PM (IST). Work from office 2 days a week (mandatory).

About You

You’re a fit for the role of Senior Data Engineer if your background includes:
- Bachelor’s or master’s degree in data science, business, technology, or an equivalent field.
- Strong data engineering background with 6+ years of experience working on large data transformation projects related to customer data platforms, identity resolution, and identity graphs.
- Solid foundation in SQL and familiarity with other query engines, along with hands-on experience with Snowflake, AWS Cloud, DBT, and real-time APIs.
- Expertise in using Presto for querying data across multiple sources and Digdag for workflow management, including the ability to create, schedule, and monitor data workflows.
- Proficiency in configuring and implementing an industry-leading customer data platform, including data integration, segmentation, and activations, is a must.
- Experience using marketing data sources such as CRM (especially Salesforce), marketing automation platforms (especially Eloqua), and web tracking (Adobe Analytics) is a plus.
- Exposure to GenAI, with the ability to leverage AI solutions to address complex data challenges.
- Excellent oral, written, and visual (PowerPoint) communication skills, especially in breaking down complex information into understandable pieces, telling stories with data, and translating technical concepts for non-technical audiences.
- Strong ability to organize, prioritize, and complete tasks with high attention to detail, even in the face of ambiguity and environmental barriers.
- Knowledge of marketing or digital domains and of the professional services industry, especially legal, tax, and accounting, is a plus.
- Experience working in iterative development and a solid grasp of agile practices.

#LI-GS2

What’s in it For You?
- Hybrid Work Model: We’ve adopted a flexible hybrid working environment (2-3 days a week in the office, depending on the role) for our office-based roles, while delivering a seamless experience that is digitally and physically connected.
- Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
- Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow’s challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensure you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
- Industry-Competitive Benefits: We offer comprehensive benefit plans including flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
- Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
- Social Impact: Make an impact in your community with our Social Impact Institute.
We offer employees two paid volunteer days off annually and opportunities to get involved with pro bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
- Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.

About Us

Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world-leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward. As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals.
To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.
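The deterministic identity resolution mentioned in the role above links records that share an exact identifier (for example, the same email address or phone number) into one unified customer profile. A minimal, stdlib-only sketch using union-find; the record fields and source systems are hypothetical:

```python
# Hypothetical customer records arriving from different source systems.
records = [
    {"id": "crm-1", "email": "pat@example.com", "phone": None},
    {"id": "web-7", "email": "pat@example.com", "phone": "555-0101"},
    {"id": "crm-2", "email": "sam@example.com", "phone": "555-0199"},
    {"id": "web-9", "email": None, "phone": "555-0101"},
]

def resolve_identities(records):
    """Group records that share an email or phone (deterministic matching)."""
    parent = {r["id"]: r["id"] for r in records}

    def find(x):  # union-find with path compression
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    seen = {}  # identifier value -> first record id that carried it
    for r in records:
        for key in ("email", "phone"):
            value = r[key]
            if value is None:
                continue
            if value in seen:
                union(r["id"], seen[value])  # shared identifier: same person
            else:
                seen[value] = r["id"]

    groups = {}
    for r in records:
        groups.setdefault(find(r["id"]), []).append(r["id"])
    return sorted(sorted(g) for g in groups.values())

print(resolve_identities(records))  # [['crm-1', 'web-7', 'web-9'], ['crm-2']]
```

Note the transitive merge: `crm-1` and `web-9` share no identifier directly, but both link to `web-7`. Probabilistic matching extends this by scoring fuzzy similarities (names, addresses) instead of requiring exact identifier equality.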

Posted 6 hours ago

Apply

0 years

3 - 6 Lacs

Hyderābād

On-site

Company Description

Experian is a global data and technology company, powering opportunities for people and businesses around the world. We help to redefine lending practices, uncover and prevent fraud, simplify healthcare, and create marketing solutions, all using our unique combination of data, analytics, and software. We also assist millions of people to realise their financial goals and help them save time and money. We operate across a range of markets, from financial services to healthcare, automotive, agribusiness, insurance, and many more industry segments. We invest in people and new advanced technologies to unlock the power of data. As a FTSE 100 Index company listed on the London Stock Exchange (EXPN), we have a team of 22,500 people across 32 countries. Our corporate headquarters are in Dublin, Ireland. Learn more at experianplc.com.

Job Description

We are looking for a Senior Software Engineer to join our Ascend Cloud Foundation Platform team.

Background: We unlock the power of data to create opportunities for consumers, businesses, and society. At life’s big moments – from buying a home or car, to sending a child to university, to growing a business exponentially by connecting it with new customers – we empower consumers and our clients to manage their data with confidence so they can maximize every opportunity. We require a senior software engineer in Hyderabad, India to work alongside our UK colleagues to deliver business outcomes for the UK&I region. You will join an established agile technical team, where you will work with the Lead Engineer and Product Owner to help develop the consumer data attributes and work with data analytics to validate the accuracy of the calculations, whilst ensuring that you work to the highest technical standards.

Key responsibilities:
- Design, develop, and maintain scalable and efficient data pipelines and ETL processes to extract, transform, and load data from various sources into our data lake or warehouse.
- Collaborate with cross-functional teams, including data scientists, analysts, and software engineers, to understand data requirements, define data models, and implement solutions that meet business needs.
- Ensure the security, integrity, and quality of data throughout the data lifecycle, implementing best practices for data governance, encryption, and access control.
- Develop and maintain data infrastructure components such as data warehouses, data lakes, and data processing frameworks, leveraging cloud services (e.g., AWS, Azure, GCP) and containerization technologies (e.g., Docker, Kubernetes).
- Implement monitoring, logging, and alerting mechanisms to ensure the reliability and availability of data pipelines and systems, and to proactively identify and address issues.
- Work closely with stakeholders to understand business requirements, prioritize tasks, and deliver solutions in a timely manner within an Agile working environment.
- Collaborate with the risk, security, and compliance teams to ensure adherence to regulatory requirements (e.g., GDPR, PCI DSS) and industry standards related to data privacy and security.
- Stay updated on emerging technologies, tools, and best practices in the field of data engineering, and propose innovative solutions to improve efficiency, performance, and scalability.
- Mentor and coach junior engineers, fostering a culture of continuous learning and professional development within the team.
- Participate in code reviews, design discussions, and other Agile ceremonies to promote collaboration, transparency, and continuous improvement.
Qualifications

Qualified to degree, HND, or HNC standard in a software engineering and/or data engineering discipline, or able to demonstrate equivalent commercial experience.

Required skills/experience:
- Experience of the full development lifecycle
- Strong communication skills, with the ability to explain solutions to technical and non-technical audiences
- Writes clean, scalable, and re-usable code that implements SOLID principles and common design patterns where applicable, and adheres to published coding standards
- Excellent attention to detail; able to analyse, investigate, and compare large data sets when required
- 3 or more years of programming using Scala
- 2 or more years of programming using Python
- Some experience of using Terraform to provision and deploy cloud services and components
- Experience of developing on Apache Spark
- Experience of developing with AWS cloud services, including (but not limited to) AWS Glue, S3, Step Functions, Lambda, EventBridge, and SQS
- BDD / TDD experience
- Jenkins CI/CD experience
- Application lifecycle management tools: Bitbucket and Jira
- Performing pull request reviews
- Understanding of Agile methodologies
- Automated testing tools

Advantageous experience:
- Mentoring or coaching junior engineers
- Cloud solution architecture
- Document databases
- Relational databases
- Experience with container technologies (e.g. Kubernetes)

Would consider alternative skills and experience:
- Java (rather than Scala)
- Google Cloud or Microsoft Azure (rather than AWS)
- Azure Pipelines or TeamCity (rather than Jenkins)
- GitHub (rather than Bitbucket)
- Azure DevOps (rather than Jira)
- CloudFormation (rather than Terraform)

Additional Information

Our uniqueness is that we celebrate yours. Experian's culture and people are important differentiators. We take our people agenda very seriously and focus on what matters: DEI, work/life balance, development, authenticity, collaboration, wellness, reward & recognition, volunteering... the list goes on.
Experian's people-first approach is award-winning: World's Best Workplaces™ 2024 (Fortune Global Top 25), Great Place To Work™ in 24 countries, and Glassdoor Best Places to Work 2024, to name a few. Check out Experian Life on social or our Careers Site and Glassdoor to understand why. Experian is proud to be an Equal Opportunity and Affirmative Action employer. Innovation is a critical part of Experian's DNA and practices, and our diverse workforce drives our success. Everyone can succeed at Experian and bring their whole self to work, irrespective of their gender, ethnicity, religion, color, sexuality, physical ability, or age. If you have a disability or special need that requires accommodation, please let us know at the earliest opportunity.

Benefits

Experian cares for employees' work-life balance, health, safety, and wellbeing. In support of this endeavour, we offer best-in-class family well-being benefits, enhanced medical benefits, and paid time off.

Experian Careers - Creating a better tomorrow together. Find out what it's like to work for Experian by clicking here.
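As a flavor of the TDD practice this role calls for, here is a minimal test-first sketch in Python: the assertions state the behaviour we want before any refinement of the implementation. The helper and its rules are hypothetical, not an Experian API:

```python
def normalise_postcode(raw: str) -> str:
    """Hypothetical data-attribute helper: canonicalise a UK-style postcode."""
    cleaned = raw.strip().upper().replace(" ", "")
    # Re-insert the single space before the inward code (final three characters).
    return cleaned[:-3] + " " + cleaned[-3:]

# TDD-style: expected behaviours written as executable checks first.
assert normalise_postcode("sw1a 1aa") == "SW1A 1AA"
assert normalise_postcode("  EC1A1BB ") == "EC1A 1BB"
```

In a BDD setting the same expectations would be phrased as given/when/then scenarios (e.g., with pytest-bdd or Cucumber for the JVM) and run in the Jenkins pipeline on every pull request.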

Posted 6 hours ago

Apply

5.0 - 7.0 years

4 - 10 Lacs

Hyderābād

On-site

Description

The U.S. Pharmacopeial Convention (USP) is an independent scientific organization that collaborates with the world's top authorities in health and science to develop quality standards for medicines, dietary supplements, and food ingredients. USP's fundamental belief that Equity = Excellence manifests in our core value of Passion for Quality through our more than 1,300 hard-working professionals across twenty global locations to deliver the mission to strengthen the supply of safe, quality medicines and supplements worldwide.

At USP, we value inclusivity for all. We recognize the importance of building an organizational culture with meaningful opportunities for mentorship and professional growth. From the standards we create, the partnerships we build, and the conversations we foster, we affirm the value of Diversity, Equity, Inclusion, and Belonging in building a world where everyone can be confident of quality in health and healthcare. USP is proud to be an equal employment opportunity employer (EEOE) and affirmative action employer. We are committed to creating an inclusive environment in all aspects of our work—an environment where every employee feels fully empowered and valued irrespective of, but not limited to, race, ethnicity, physical and mental abilities, education, religion, gender identity and expression, life experience, sexual orientation, country of origin, regional differences, work experience, and family status. We are committed to working with and providing reasonable accommodation to individuals with disabilities.

Brief Job Overview

The Digital & Innovation group at USP is seeking Full Stack Developers with programming skills in cloud technologies to build innovative digital products. We are seeking someone who understands the power of digitization and can help drive an amazing digital experience for our customers.

How will YOU create impact here at USP?
In this role at USP, you contribute to USP's public health mission of increasing equitable access to high-quality, safe medicine and improving global health through public standards and related programs. In addition, as part of our commitment to our employees, Global, People, and Culture, in partnership with the Equity Office, regularly invests in the professional development of all people managers. This includes training in inclusive management styles and other competencies necessary to ensure engaged and productive work environments.

The Sr. Software Engineer/Software Engineer has the following responsibilities:
- Build scalable applications and platforms using cutting-edge cloud technologies.
- Constantly review and upgrade systems based on governance principles and security policies.
- Participate in code reviews, architecture discussions, and agile development processes to ensure high-quality, maintainable, and scalable code.
- Document and communicate technical designs, processes, and solutions to both technical and non-technical stakeholders.

Who is USP Looking For?

The successful candidate will have a demonstrated understanding of our mission, a commitment to excellence through inclusive and equitable behaviors and practices, and the ability to quickly build credibility with stakeholders, along with the following competencies and experience:

Education:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field

Experience:
- Sr. Software Engineer: 5-7 years of experience in software development, with a focus on cloud computing
- Software Engineer: 2-4 years of experience in software development, with a focus on cloud computing
- Strong knowledge of cloud platforms (e.g., AWS, Azure, Google Cloud) and services, including compute, storage, networking, and security
- Extensive knowledge of Java Spring Boot applications and design principles
Strong programming skills in languages such as Python Good experience with AWS / Azure services, such as EC2, S3, IAM, Lambda, RDS, DynamoDB, API Gateway, and Cloud Formation Knowledge of cloud architecture patterns, best practices, and security principles Familiarity with data pipeline / ETL / Orchestration tools, such as Apache NiFi, AWS Glue, or Apache Airflow. Good experience with front end technologies like React.js/Node.js etc Strong experience in micro services, automated testing practices. Experience leading initiatives related to continuous improvement or implementation of new technologies. Works independently on most deliverables Strong analytical and problem-solving skills, with the ability to develop creative solutions to complex problems Ability to manage multiple projects and priorities in a fast-paced, dynamic environment Additional Desired Preferences Experience with scientific chemistry nomenclature or prior work experience in life sciences, chemistry, or hard sciences or degree in sciences Experience with pharmaceutical datasets and nomenclature Experience with containerization technologies, such as Docker and Kubernetes, is a plus Experience working with knowledge graphs Ability to explain complex technical issues to a non-technical audience Self-directed and able to handle multiple concurrent projects and prioritize tasks independently Able to make tough decisions when trade-offs are required to deliver results Strong communication skills required: Verbal, written, and interpersonal Supervisory Responsibilities No Benefits USP provides the benefits to protect yourself and your family today and tomorrow. From company-paid time off and comprehensive healthcare options to retirement savings, you can have peace of mind that your personal and financial well-being is protected Who is USP? The U.S. 
Pharmacopeial Convention (USP) is an independent scientific organization that collaborates with the world's top authorities in health and science to develop quality standards for medicines, dietary supplements, and food ingredients. USP's fundamental belief that Equity = Excellence manifests in our core value of Passion for Quality through our more than 1,300 hard-working professionals across twenty global locations, who deliver our mission to strengthen the supply of safe, quality medicines and supplements worldwide.

At USP, we value inclusivity for all. We recognize the importance of building an organizational culture with meaningful opportunities for mentorship and professional growth. From the standards we create, to the partnerships we build, to the conversations we foster, we affirm the value of Diversity, Equity, Inclusion, and Belonging in building a world where everyone can be confident of quality in health and healthcare.

USP is proud to be an equal employment opportunity employer (EEOE) and affirmative action employer. We are committed to creating an inclusive environment in all aspects of our work—an environment where every employee feels fully empowered and valued irrespective of, but not limited to, race, ethnicity, physical and mental abilities, education, religion, gender identity and expression, life experience, sexual orientation, country of origin, regional differences, work experience, and family status. We are committed to working with and providing reasonable accommodation to individuals with disabilities.
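Postings like this one center on building cloud data pipelines with tools such as AWS Glue or Airflow. As a minimal, library-agnostic sketch of the extract-transform-load pattern these roles describe (the CSV schema, field names, and table are invented for illustration):

```python
import csv
import io
import sqlite3

def extract(csv_text):
    # extract: read raw records from a source (here, an in-memory CSV)
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    # transform: normalize names, cast amounts, drop incomplete records
    out = []
    for r in rows:
        if not r.get("amount"):
            continue
        out.append({"name": r["name"].strip().title(),
                    "amount": float(r["amount"])})
    return out

def load(rows, conn):
    # load: write the cleaned records into a warehouse table
    conn.execute("CREATE TABLE IF NOT EXISTS sales (name TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (:name, :amount)", rows)
    conn.commit()

raw = "name,amount\nalice smith,10.5\nbob jones,\ncarol lee,3"
conn = sqlite3.connect(":memory:")
load(transform(extract(raw)), conn)
total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(total)  # 13.5 (the row with a missing amount is dropped)
```

In a managed service the same three stages map onto a Glue job or an Airflow DAG of tasks; the separation of stages is what makes the pipeline testable and rerunnable.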

Posted 6 hours ago


10.0 years

5 - 10 Lacs

Hyderābād

Remote

Join Amgen's Mission to Serve Patients
If you feel like you’re part of something bigger, it’s because you are. At Amgen, our shared mission—to serve patients—drives all that we do. It is key to our becoming one of the world’s leading biotechnology companies. We are global collaborators who achieve together—researching, manufacturing, and delivering ever-better products that reach over 10 million patients worldwide. It’s time for a career you can be proud of.

Specialist IS Software Engineer

Live
What you will do
Let’s do this. Let’s change the world. In this vital role, we are looking for a creative and technically skilled Specialist IS Software Engineer - Data Management Lead. This role will be responsible for leading data management initiatives and collaborating across business, IT, and data governance teams. The ideal candidate will have extensive experience in configuring and implementing Collibra products, an established track record of building high-quality data governance and data quality solutions, and strong hands-on design and engineering skills. The candidate must also possess strong analytical and communication skills. As a Collibra Lead Developer, you will play a key role in the design, implementation, and management of our Collibra Data Governance and Data Quality platform. You will work closely with stakeholders across the organization to ensure the successful deployment of data governance processes, solutions, and best practices.

Responsibilities:
- Build and integrate information systems to meet the company’s needs.
- Design and implement data governance frameworks, policies, and procedures within Collibra.
- Configure, implement, and maintain Collibra Data Quality Center to support enterprise-wide data quality initiatives.
- Lead the implementation and configuration of the Collibra Data Governance platform.
- Develop, customize, and maintain Collibra workflows, dashboards, and business rules.
- Collaborate with data stewards, data owners, and business analysts to understand data governance requirements and translate them into technical solutions.
- Provide technical expertise and support to business users and IT teams on Collibra Data Quality functionalities.
- Collaborate with data engineers and architects to implement data quality solutions within data pipelines and data warehouses.
- Participate in data quality improvement projects, identifying root causes of data issues and implementing corrective actions.
- Integrate Collibra with other enterprise data management systems (e.g., data catalogs, BI tools, data lakes).
- Provide technical leadership and mentoring to junior developers and team members.
- Troubleshoot and resolve issues with the Collibra environment and data governance processes.
- Assist with training and enablement of business users on Collibra platform features and functionalities.
- Stay up to date with new releases, features, and best practices in Collibra and data governance.

Basic Qualifications:
- Master’s degree in computer science & engineering preferred, with 10+ years of software development experience, OR Bachelor’s degree in computer science & engineering preferred, with 10+ years of software development experience
- Proven experience (7+ years) in data governance or data management roles
- Strong experience with the Collibra Data Governance platform, including design, configuration, and development
- Hands-on experience with Collibra workflows, the rules engine, and data stewardship processes
- Experience with integrations between Collibra and other data management tools
- Proficiency in SQL and scripting languages (e.g., Python, JavaScript)
- Strong problem-solving and troubleshooting skills
- Excellent communication and collaboration skills to work with both technical and non-technical stakeholders
- Self-starter with strong communication and collaboration skills to work effectively with cross-functional teams
- Excellent problem-solving skills and attention to detail
- Domain knowledge of the life sciences industry
- Recent experience working in a Scaled Agile environment with Agile tools, e.g., Jira, Confluence

Preferred Qualifications:
- Deep expertise in the Collibra platform, including Data Governance and Data Quality
- In-depth knowledge of data governance principles, data stewardship processes, data quality concepts, and data profiling and validation methodologies, techniques, and best practices
- Hands-on experience implementing and configuring Collibra Data Governance and Collibra Data Quality, including developing metadata ingestion, data quality rules, scorecards, and workflows
- Strong experience configuring and connecting to various data sources for metadata, data lineage, data profiling, and data quality
- Experience integrating data management capabilities (MDM, reference data)
- Good experience with Azure cloud services, Azure data technologies, and Databricks
- Solid understanding of relational database concepts and ETL processes
- Proficient use of tools and techniques, including programming languages (Python, PySpark, SQL, etc.), for data profiling and validation
- Data modeling with tools like Erwin, and knowledge of insurance industry standards (e.g., ACORD) and insurance data (policy, claims, underwriting, etc.)
- Familiarity with data visualization tools like Power BI

Good to Have Skills:
- Willingness to work on AI applications
- Experience with popular large language models

Soft Skills:
- Excellent analytical and troubleshooting skills
- Strong verbal and written communication skills
- Ability to work effectively with global, remote teams
- High degree of initiative and self-motivation
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals
- Strong presentation and public speaking skills
Thrive
What you can expect of us
As we work to develop treatments that take care of others, we also work to care for our teammates’ professional and personal growth and well-being. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Apply now for a career that defies imagination
In our quest to serve patients above all else, Amgen is the first to imagine, and the last to doubt. Join us. careers.amgen.com

Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.
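Collibra's own workflow and rules APIs are proprietary, but the data quality concepts this posting lists (profiling, completeness and uniqueness rules, scorecards) can be sketched in plain Python. The record layout, field names, and rule threshold below are hypothetical:

```python
def profile(rows, field):
    # basic data profiling: completeness and uniqueness of one field
    values = [r.get(field) for r in rows]
    non_null = [v for v in values if v not in (None, "")]
    return {
        "completeness": len(non_null) / len(values),
        "uniqueness": len(set(non_null)) / len(non_null) if non_null else 0.0,
    }

records = [
    {"patient_id": "P1", "country": "US"},
    {"patient_id": "P2", "country": ""},
    {"patient_id": "P2", "country": "DE"},  # duplicate id, for illustration
]
stats = profile(records, "patient_id")
print(stats["completeness"])  # 1.0 (every record has a patient_id)
print(stats["uniqueness"])    # ~0.667 (P2 appears twice)

# a "scorecard" rule: flag the dataset if a metric falls below a threshold
RULE_THRESHOLD = 0.95  # hypothetical governance policy value
passed = stats["uniqueness"] >= RULE_THRESHOLD
print(passed)  # False
```

In a governance platform, the same metrics would be computed by configured rules and surfaced on dashboards rather than hand-rolled; the sketch only shows what the numbers mean.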

Posted 6 hours ago


14.0 years

0 Lacs

Hyderābād

On-site

We are seeking a highly experienced Salesforce Architect with over 14 years of expertise in designing and implementing scalable Salesforce solutions. The ideal candidate will possess a deep understanding of Salesforce platform capabilities, architecture best practices, and enterprise application integration. You will play a pivotal role in defining architectural roadmaps, ensuring optimal performance, and leading technical teams to deliver business-critical solutions.

Key Responsibilities:

1. Architecture and Design
- Define and design scalable Salesforce architecture, ensuring alignment with business goals and IT strategy.
- Lead the technical design process and ensure compliance with architectural standards, security policies, and governance frameworks.
- Evaluate and select appropriate Salesforce tools, technologies, and APIs to build robust solutions.
- Develop and maintain architectural blueprints and technical documentation.

2. Solution Implementation and Integration
- Lead end-to-end Salesforce implementation projects, including configuration, custom development, and integration with enterprise applications.
- Define and implement data models, security models, and sharing rules across Salesforce platforms.
- Design and oversee the integration of Salesforce with other enterprise systems such as ERP, marketing automation, and custom applications using APIs, middleware, and integration tools.

3. Technical Leadership and Governance
- Provide technical leadership to development teams, ensuring adherence to Salesforce best practices and coding standards.
- Collaborate with stakeholders, business analysts, and product owners to translate business requirements into scalable technical solutions.
- Conduct code reviews, troubleshoot performance issues, and provide guidance on optimizing Salesforce implementations.

4. Security and Compliance
- Ensure Salesforce solutions comply with security standards, data privacy regulations, and industry best practices.
- Implement role-based security, object-level permissions, and data encryption to protect sensitive information.

5. Continuous Improvement and Innovation
- Stay up to date with Salesforce product releases, new features, and industry trends.
- Drive the adoption of new tools and technologies to enhance Salesforce platform efficiency and performance.

Required Skills and Experience:

Technical Skills:
- Strong expertise in Salesforce Service Cloud, Experience Cloud, Sales Cloud, and Marketing Cloud.
- Proficiency in Apex, Agentforce, Visualforce, Aura, Lightning Web Components (LWC), SOQL, and SOSL.
- Experience with Salesforce API integrations (REST/SOAP), middleware, and ETL tools.
- Hands-on experience with CI/CD pipelines, version control (Git), and deployment tools (Copado).
- Knowledge of data migration strategies and tools such as Data Loader, MuleSoft, and Informatica.

Architectural Expertise:
- Strong understanding of Salesforce architecture patterns, multi-org strategy, and governance models.
- Expertise in designing multi-cloud solutions and integrating Salesforce with enterprise systems.
- Experience with Salesforce DevOps, release management, and sandbox management.

Certifications:
- Salesforce Certified Technical Architect (CTA) (preferred, or willingness to pursue)
- Salesforce Certified Application Architect
- Salesforce Certified System Architect
- Other relevant certifications (e.g., Platform Developer II, Integration Architecture Designer)

Soft Skills:
- Strong leadership and mentorship skills to guide development teams.
- Excellent communication and collaboration skills to engage with business and technical stakeholders.
- Ability to manage multiple projects and prioritize tasks effectively.
- Analytical mindset with problem-solving capabilities.

Education:
- Bachelor's or Master’s degree in Computer Science, Information Technology, or a related field.

Preferred Qualifications:
- Experience working in Agile environments with a strong understanding of Agile delivery frameworks (Scrum/SAFe).
- Hands-on experience with Salesforce Einstein Analytics, CPQ, and Field Service Lightning is a plus.

Work Environment:
- Opportunity to work on cutting-edge Salesforce implementations and enterprise-level solutions.
- Work from office per policy guidelines, and work with teams across EU, APAC, and US time zones.

Education: Bachelor's degree in computer science, engineering, information systems, and/or equivalent formal training or work experience. A relevant Master’s degree, TOGAF certification, and SAFe Agile certification are strongly preferred.

Experience: Eight (8) years of equivalent work experience in an information technology or engineering environment, with direct responsibility for strategy formulation and solution/technical architecture, as well as designing, architecting, developing, implementing, and monitoring efficient and effective solutions to diverse and complex business problems.

Knowledge, Skills and Abilities
- Fluency in English
- Accuracy & Attention to Detail
- Influencing & Persuasion
- Planning & Organizing
- Problem Solving
- Project Management

FedEx was built on a philosophy that puts people first, one we take seriously. We are an equal opportunity/affirmative action employer and we are committed to a diverse, equitable, and inclusive workforce in which we enforce fair treatment, and provide growth opportunities for everyone. All qualified applicants will receive consideration for employment regardless of age, race, color, national origin, genetics, religion, gender, marital status, pregnancy (including childbirth or a related medical condition), physical or mental disability, or any other characteristic protected by applicable laws, regulations, and ordinances.

Our Company
FedEx is one of the world's largest express transportation companies and has consistently been selected as one of the top 10 World’s Most Admired Companies by "Fortune" magazine.
Every day, FedEx delivers for its customers with transportation and business solutions, serving more than 220 countries and territories around the globe. We can serve this global network thanks to our outstanding team of FedEx team members, who are tasked with making every FedEx experience outstanding.

Our Philosophy
The People-Service-Profit philosophy (P-S-P) describes the principles that govern every FedEx decision, policy, or activity. FedEx takes care of our people; they, in turn, deliver the impeccable service demanded by our customers, who reward us with the profitability necessary to secure our future. The essential element in making the People-Service-Profit philosophy such a positive force for the company is where we close the circle, returning these profits back into the business and investing back in our people. Our success in the industry is attributed to our people. Through our P-S-P philosophy, we have a work environment that encourages team members to be innovative in delivering the highest possible quality of service to our customers. We care for their well-being and value their contributions to the company.

Our Culture
Our culture is important for many reasons, and we intentionally bring it to life through our behaviors, actions, and activities in every part of the world. The FedEx culture and values have been a cornerstone of our success and growth since we began in the early 1970s. While other companies can copy our systems, infrastructure, and processes, our culture makes us unique and is often a differentiating factor as we compete and grow in today’s global marketplace.
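Much of the Salesforce REST integration work described above amounts to issuing SOQL queries over HTTPS with an OAuth bearer token. A minimal sketch of building such a request with the standard library (the instance hostname is a placeholder, and the API version is an assumption; real code would also attach the `Authorization: Bearer <token>` header and send the request):

```python
import urllib.parse

def soql_query_url(instance, soql, api_version="v59.0"):
    # Salesforce exposes SOQL via its REST "query" resource:
    #   GET https://<instance>/services/data/<version>/query?q=<SOQL>
    # the query string must be URL-encoded
    return (f"https://{instance}/services/data/{api_version}/query?"
            + urllib.parse.urlencode({"q": soql}))

url = soql_query_url(
    "example.my.salesforce.com",  # hypothetical org instance
    "SELECT Id, Name FROM Account WHERE CreatedDate = LAST_N_DAYS:30",
)
print(url)
```

Responses are JSON with a `records` list and, for large result sets, a `nextRecordsUrl` for pagination; middleware such as MuleSoft wraps this same resource.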

Posted 6 hours ago


4.0 years

0 Lacs

Hyderābād

On-site

About Us:
Location - Hyderabad, India
Department - Product R&D
Level - Support
Working Pattern - Work from office
Benefits - Benefits at Ideagen
Salary - This will be discussed at the next stage of the process; if you have any questions, please feel free to reach out!

As a Level 2 Software Engineer, you will build high-quality, innovative, and fully performing integration solutions that comply with coding standards and technical design. You will contribute to the design and implementation of several integrations across different Ideagen modules. You will take ownership of the entire B2B integration lifecycle, from preliminary planning through requirements gathering, design, development, documentation, testing, deployment, and ongoing maintenance of integration solutions. Finally, you will contribute positively within the Agile development team and demonstrate an enthusiastic ‘can-do’ attitude.

Responsibilities:
- Design, develop, and manage integration solutions, building robust data integration pipelines using BizTalk and C#/.NET technologies.
- Develop and support B2B solutions using BizTalk Server 2020, with excellent knowledge of BizTalk artifacts such as schemas, maps, pipelines, orchestrations, and adapters.
- Use XML and XSLT extensively for data transformation and enrichment between systems.
- Deploy integration solutions to the BizTalk Server Console.
- Configure and manage secure file transfers (SFTP, FTPS) for data exchange.
- Implement and support data exchange using HL7 and other industry standards.
- Develop and optimize T-SQL queries, stored procedures, and scripts to support data processing and transformation.
- Design and maintain end-to-end solutions that connect with several vendors and partners, establishing secure, scalable integration frameworks using BizTalk adapters and .NET code.
- Create robust ETL workflows using tools like SSIS, Azure Data Factory, or other ETL platforms.
- Design and consume RESTful and SOAP APIs for real-time and batch data integration.
- Develop custom connectors and middleware when needed.
- Troubleshoot integration solutions and ensure timely delivery.
- Monitor and support integration flows, ensuring error handling, logging, and alerts are in place.
- Develop and maintain technical design documents for all integration processes and solutions.
- Collaborate with multiple cross-functional teams, including Dev, QA, Infra, and Business teams, to understand customers' technical requirements and deliver robust solutions.
- Work within an Agile development team using, e.g., the Scrum framework.
- Provide unit tests to support and validate any development work undertaken.
- Perform tasks with limited supervision, requiring substantial use of independent judgment within scope.

Skills and Experience:
- A minimum of 4 years of hands-on experience in a data integration role is highly preferred.
- Primary skills: BizTalk Server 2020, C#, .NET, SQL Server 2017, exposure to REST APIs.
- Secondary skills: SSIS (good to have).
- A proven ability to deliver end-to-end integration solutions with BizTalk Server 2020, T-SQL, and REST APIs.
- Demonstrated proficiency in writing T-SQL queries, stored procedures, and scripts for efficient data processing and transformation.
- Experience deploying solutions to the BizTalk Server Console.
- Experience using XML and XSLT to develop robust solutions.
- Hands-on experience with BizTalk artifacts (schemas, maps, pipelines, orchestrations, adapters) for developing real-time B2B solutions.
- Strong understanding of C#/.NET and experience with its use in integration solutions.
- Solid understanding of RESTful APIs and experience with their integration.
- Experience using source control, preferably Bitbucket and Git.
- Exceptional communication and presentation skills in English, both verbal and written, are essential for this role.
- Ability to write unit test cases and perform unit testing.
- Ability to create technical design documents with optimal design.
- Understanding of Agile software development methodologies/frameworks such as Scrum.

Desirable:
- Database development experience, preferably SQL Server and MongoDB.
- Exposure to ETL platforms like SSIS and Azure Data Factory.
- Exposure to AWS/Azure and Postman.

About Ideagen
Ideagen is the invisible force behind many things we rely on every day - from keeping airplanes soaring in the sky, to ensuring the food on our tables is safe, to helping doctors and nurses care for the sick. So, when you think of Ideagen, think of it as the silent teammate that's always working behind the scenes to help the people who make our lives safer and better. Every day, millions of people are kept safe using Ideagen software. We have offices all over the world, including America, Australia, Malaysia, and India, with people doing lots of different and exciting jobs. We’re building a future-ready team, and AI is part of how we work smarter. If you're curious, adaptable, and open to using AI to improve how you work, you’ll thrive at Ideagen!

What is next?
If your application meets the requirements for this role, our Talent Acquisition team will be in touch to guide you through the next steps. To ensure a flexible and inclusive process, please let us know if you require any reasonable adjustments by contacting us at recruitment@ideagen.com. All matters will be treated with strict confidence. At Ideagen, we value the importance of work-life balance and welcome candidates seeking flexible or part-time working arrangements. If this is something you are interested in, please let us know during the application process. Enhance your career and make the world a safer place! #LI-FullTime
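The core task in this posting, mapping XML between a partner's schema and an internal one (done in BizTalk with maps and XSLT), can be illustrated with Python's standard library. The element names (`order`, `Invoice`, etc.) are invented for the example:

```python
import xml.etree.ElementTree as ET

def transform_order(src_xml):
    # map a hypothetical partner <order> schema onto an internal <Invoice> schema,
    # the same shape of work a BizTalk map or XSLT stylesheet performs
    src = ET.fromstring(src_xml)
    inv = ET.Element("Invoice")
    ET.SubElement(inv, "Customer").text = src.findtext("buyer/name")
    ET.SubElement(inv, "Total").text = src.findtext("amount")
    return ET.tostring(inv, encoding="unicode")

out = transform_order(
    "<order><buyer><name>Acme</name></buyer><amount>42.00</amount></order>")
print(out)  # <Invoice><Customer>Acme</Customer><Total>42.00</Total></Invoice>
```

A production map would also validate against the target schema and handle missing or repeated elements; the sketch only shows the source-to-target field mapping.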

Posted 6 hours ago


3.0 years

6 - 8 Lacs

Hyderābād

On-site

- 3+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc.
- Experience with data visualization using Tableau, QuickSight, or similar tools
- Experience with data modeling, warehousing, and building ETL pipelines
- Experience with statistical analysis packages such as R, SAS, and MATLAB
- Experience using SQL to pull data from a database or data warehouse, and scripting experience (Python) to process data for modeling

The ShipTech BI team is looking for a smart and ambitious individual to support developing the operational reporting structure in Amazon Logistics. The potential candidate will support the analysis, improvement, and creation of metrics and dashboards for Transportation by Amazon. In addition, they will work with internal customers at all levels of the organization - Operations, Customer Service, HR, Technology, and Operational Research. The potential candidate will enjoy the challenges and rewards of working in a fast-growing organization. This is a high-visibility position.

As an Amazon Data Business Intelligence Engineer, you will be working in one of the world's largest and most complex data warehouse environments. You should have deep expertise in the design, creation, management, and business use of extremely large datasets. You should have excellent business and communication skills, so you can work with business owners to develop and define key business questions and build data sets that answer those questions. You should be expert at designing, implementing, and operating stable, scalable, low-cost solutions to flow data from production systems into the data warehouse and into end-user-facing applications. You should be able to work with business customers to understand the business requirements and implement reporting solutions. Above all, you should be passionate about bringing large datasets together to answer business questions and drive change.
Key Responsibilities:
- Design automated solutions for recurrent reporting (daily/weekly/monthly).
- Design automated processes for in-depth analysis databases.
- Design automated data control processes.
- Collaborate with the software development team to build the designed solutions.
- Learn, publish, analyze, and improve management information dashboards, operational business metrics decks, and key performance indicators.
- Improve tools and processes, scale existing solutions, and create new solutions as required based on stakeholder needs.
- Provide in-depth analysis to management with the support of accounting, finance, transportation, and supply chain teams.
- Participate in annual budgeting and forecasting efforts.
- Perform monthly variance analysis and identify risks & opportunities.

- Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift
- Experience in data mining, ETL, etc., and using databases in a business environment with large-scale, complex datasets

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
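A recurrent daily/weekly report of the kind listed above is, at bottom, an aggregation query scheduled against the warehouse. A small sketch using an in-memory SQLite database as a stand-in for Redshift (the table and column names are invented):

```python
import sqlite3

# stand-in for a warehouse table of shipment events
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE shipments (ship_day TEXT, region TEXT, on_time INTEGER);
INSERT INTO shipments VALUES
  ('2025-01-01', 'south', 1),
  ('2025-01-01', 'south', 0),
  ('2025-01-01', 'north', 1);
""")

# the shape of a recurring KPI query: daily on-time rate per region
rows = conn.execute("""
    SELECT ship_day, region, AVG(on_time) AS on_time_rate
    FROM shipments
    GROUP BY ship_day, region
    ORDER BY region
""").fetchall()
print(rows)  # [('2025-01-01', 'north', 1.0), ('2025-01-01', 'south', 0.5)]
```

In production, the same query would run on a schedule, write to a reporting table, and feed a Tableau or QuickSight dashboard; automating that loop is what "design automated solutions for recurrent reporting" means in practice.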

Posted 6 hours ago


2.0 - 3.0 years

0 Lacs

Telangana

On-site

Role: ML Engineer (Associate / Senior)
Experience: 2-3 years (Associate); 4-5 years (Senior)
Mandatory skills: Python, MLOps, Docker and Kubernetes, FastAPI or Flask, CI/CD, Jenkins, Spark, SQL, RDB, Cosmos, Kafka, ADLS, API, Databricks
Other skills: Azure, LLMOps, ADF, ETL
Location: Bangalore
Notice period: less than 60 days

Job Description:
We are seeking a talented and passionate Machine Learning Engineer to join our team and play a pivotal role in developing and deploying cutting-edge machine learning solutions. You will work closely with other engineers and data scientists to bring machine learning models from proof of concept to production, ensuring they deliver real-world impact and solve critical business challenges.

- Collaborate with data scientists, model developers, software engineers, and other stakeholders to translate business needs into technical solutions.
- Experience deploying ML models to production.
- Create high-performance real-time inferencing APIs and batch inferencing pipelines to serve ML models to stakeholders.
- Integrate machine learning models seamlessly into existing production systems.
- Continuously monitor and evaluate model performance, and retrain the models automatically or periodically.
- Streamline existing ML pipelines to increase throughput.
- Identify and address security vulnerabilities in existing applications proactively.
- Design, develop, and implement machine learning models, preferably for insurance-related applications.
- Well versed with the Azure ecosystem.
- Knowledge of NLP and generative AI techniques; relevant experience will be a plus.
- Knowledge of machine learning algorithms and libraries (e.g., TensorFlow, PyTorch) will be a plus.
- Stay up to date on the latest advancements in machine learning and contribute to ongoing innovation within the team.
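FastAPI and Spark specifics aside, the serve-monitor-retrain loop this posting describes can be sketched with a stub model. Everything here is a placeholder: the "model" is a dict, and the accuracy threshold is an arbitrary example value:

```python
def predict(model, x):
    # stand-in for real model inference (a trained artifact would be loaded here)
    return 1 if x >= model["threshold"] else 0

def batch_infer(model, xs):
    # batch inferencing pipeline: score a whole dataset at once
    return [predict(model, x) for x in xs]

def needs_retrain(preds, labels, min_accuracy=0.8):
    # monitoring step: compare live predictions against ground-truth labels
    # and trigger retraining when accuracy drifts below a threshold
    acc = sum(p == y for p, y in zip(preds, labels)) / len(labels)
    return acc < min_accuracy

model = {"threshold": 0.5}  # hypothetical trained-model artifact
xs, labels = [0.2, 0.7, 0.9, 0.4], [0, 1, 0, 0]

preds = batch_infer(model, xs)
print(preds)                          # [0, 1, 1, 0]
print(needs_retrain(preds, labels))   # True (accuracy 0.75 < 0.8)
```

In a real deployment, `predict` would sit behind a FastAPI endpoint for the real-time path, `batch_infer` would run as a scheduled Spark or Databricks job, and `needs_retrain` would be a monitoring alert that kicks off a retraining pipeline.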

Posted 6 hours ago


10.0 years

3 - 5 Lacs

Hyderābād

On-site

If you are a current employee who is interested in applying to this position, please navigate to the internal Careers site to apply.

Disclaimer: MarketStar is committed to ensuring integrity and transparency in our recruitment practices. We DO NOT charge any fees at any stage of the recruitment process. If you receive any unsolicited requests for payment, please report it immediately.

Job Title: Manager - Business Intelligence
Experience: 10+ years
Location: Hyderabad, India

About MarketStar:
MarketStar is a global sales and marketing outsourcing company that helps businesses drive growth through innovative sales solutions and customer engagement strategies. With over 30 years of experience, we partner with leading brands to accelerate sales performance and deliver exceptional customer experiences.

Role Overview:
In everything we do, we believe in creating growth, for our clients, employees, and community. For the past 35+ years, we have been generating revenue for the most innovative tech companies globally through our outsourced B2B demand, sales, customer success, and revenue operations solutions. We are passionate about cultivating career advancement for our people and supporting them through mentorship, leadership, and career development programs. We provide service and support to our communities through the MarketStar Foundation. Our exceptional team is the cornerstone of MarketStar's accomplishments. We are proud of our award-winning workplace culture and of being named a top employer. These achievements are a testament to our six core values, embraced by our 3,000+ employees worldwide. From our headquarters in Utah, USA, to our global offices in India, Ireland, Bulgaria, Mexico, the Philippines, and Australia, we all work together to drive innovation and success. We are excited to have you apply to join our MarketStar team and can’t wait to discuss how we can help you find growth!
Key Responsibilities & What you will need to succeed in this role
- Hands-on experience managing BI/BA/data teams and working closely with all stakeholders, including senior management.
- End-to-end knowledge of operations management and business analysis.
- Collaborate with cross-functional teams to understand business needs, define data requirements, and ensure data accuracy and integrity.
- Collect, analyze, and interpret complex data from multiple sources to identify trends, patterns, and opportunities for business improvement.
- Conduct ad-hoc analysis to support business initiatives, such as market research, competitive analysis, and customer segmentation.
- Develop and maintain interactive dashboards, reports, and visualizations using BI tools such as Tableau, Power BI, or similar, to provide actionable insights to stakeholders.
- Support business leaders in making data-driven decisions by presenting insights and recommendations in a clear and concise manner.
- Conduct ad-hoc data analysis and research to address specific business questions and challenges.
- Monitor key performance indicators (KPIs) and create alerts to proactively identify anomalies or potential issues.
- Drive continuous improvement of BI processes, tools, and methodologies to enhance the overall effectiveness of the BI function.
- Provide training and support to team members and end users on BI tools and data analysis techniques.
- Stay informed about industry trends, best practices, and emerging technologies in business intelligence and data analytics.
- Partner with sales leaders and account managers to understand business needs and develop actionable insights and recommendations.
- Create business review decks and present them to the leadership team.

Qualifications:
- A minimum of 10-12 years of experience monitoring, managing, manipulating, and drawing insights from data, with at least 5 years of experience leading a team.
- Degree in Business Analytics, Business Administration, Business Communication, or Marketing.
- Proven work experience in business intelligence, data analysis, or related roles.
- Experience with data visualization tools such as Tableau, Power BI, or similar.
- Solid understanding of data concepts, data modeling, and database design principles.
- Excellent analytical and problem-solving skills, with the ability to think critically and draw insights from data.
- Strong business acumen and the ability to understand and interpret business requirements.
- Exceptional communication and presentation skills, with the ability to convey complex data insights to non-technical stakeholders.
- Experience in project management and the ability to prioritize and manage multiple tasks effectively.
- Familiarity with ETL (Extract, Transform, Load) processes and data integration techniques.

Must-have skills:
- End-to-end knowledge of operations management and business analysis.
- Knowledge of operational analytics and modeling techniques.
- Experience with BI platforms (Power BI, SQL, and macros).
- Understanding of data governance and data quality principles.
- Understanding of business concepts and business financials.
- Creating business review decks and presenting to the leadership team.

What’s in it for you?
- Constant learning and an entrepreneurial growth mindset.
- An employee-centric benefits plan including, but not limited to, comprehensive health insurance, a generous leave policy, Covid support, vaccination drives, well-being sessions, real-time grievance redressal, and work flexibility.
- We are a people-first organization with policies and processes that help you bring the best version of yourself to work, including fast-track growth for high-potential people.
- An opportunity to be associated with the world’s leading brands as clients.
- Being part of an organization with more than 60% homegrown leaders.
- Customized training programs catered to personal and professional development.

We are an equal opportunities employer. We believe passionately that employing a diverse workforce is central to our success. If you're up for this position, hit the Apply Now button!

Posted 6 hours ago

Apply

4.0 years

4 - 6 Lacs

Hyderābād

On-site

Overview: We have an exciting role heading our creative studio for one of Omnicom’s largest advertising agencies. This leadership role requires you to lead and drive world-class advertising, creative, and studio deliverables, working with global brands and agency leaders, with overall responsibility for production, practice, and people management.

About Omnicom Global Solutions

Omnicom Global Solutions (OGS) is an agile innovation hub of Omnicom Group, a leading global marketing and corporate communications company. Guided by the principles of Knowledge, Innovation, and Transformation, OGS is designed to deliver scalable, customized, and contextualized solutions that meet the evolving needs of our Practice Areas within Omnicom. OGS India plays a key role for our group companies and global agencies by providing stellar products, solutions, and services in the areas of Creative Services, Technology, Marketing Science (Data & Analytics), Advanced Analytics, Market Research, Business Support Services, Media Services, and Project Management. We currently have 4000+ awesome colleagues in OGS India who are committed to solving our clients’ pressing business issues. We are growing rapidly and looking for talented professionals like you to be part of this journey. Let us build this, together!

Responsibilities:

About our Agency: Omnicom Health Shared Services

Omnicom Health Group is the world’s largest and most diverse global healthcare network, pioneering solutions that shape a healthier future for all. At OHG, you’re not just part of a network—you’re part of a movement. Our ambition is to be the case study others aspire to, challenging the status quo and redefining what’s possible. With flagship locations globally, we deliver local expertise and groundbreaking healthcare solutions across consulting, strategy, creative, media, and more. Our 29 specialized companies work seamlessly to drive innovation with precision and impact.
Know more at: https://omnicomhealthgroup.com/

The OGS-OH partnership empowers some of the world’s iconic brands with Knowledge, Innovation, and Transformation. When you join, you become part of a dynamic team that delivers high-impact solutions in the healthcare marketing and communications space. Here’s what makes us unique:

- We are a growing community that blends creativity, technology, and data-driven insights to transform healthcare.
- Bringing you the best of both worlds: our team partners with key OH strategists while staying rooted in OGS’ culture and values.
- Access to top healthcare and biopharmaceutical brands.
- Helping you own your career: unlock diverse learning and upskilling opportunities, along with personalized talent development programs.
- Empowering you with an inclusive, rewarding, and engaging work environment centred around your well-being.

Qualifications (JD shared by agency): Reporting & Insights – Specialist (Subject Matter Expert)

Function: Market Science
Level: SME
Experience Required: 4–6 years of experience in marketing analytics, reporting architecture, data pipeline optimization, or performance intelligence strategy

1. Role Summary

As a Specialist (SME) in the Reporting & Insights team within Market Science, you will serve as a domain expert in building robust reporting frameworks, optimizing data flows, and enabling scalable reporting systems across clients and platforms. You will lead reporting innovations, consult on best practices, and ensure governance across measurement and dashboarding processes. Your expertise will directly influence the development of strategic performance reporting for Omnicom Health clients, ensuring insights are timely, trusted, and actionable.

2. Key Responsibilities

- Architect reporting ecosystems using BI tools and advanced analytics workflows.
- Standardize KPIs, data definitions, and visualization best practices across clients.
- Collaborate with data engineering teams to enhance data warehousing/reporting infrastructure.
- Drive adoption of reporting automation, modular dashboards, and scalable templates.
- Ensure compliance with data governance, privacy, and client reporting SLAs.
- Act as the go-to expert for dashboarding tools, marketing KPIs, and campaign analytics.
- Conduct training and peer reviews to improve reporting maturity across teams.

3. Skills & Competencies

Skill / Competency                   | Proficiency Level | Must-Have / Good-to-Have | Criticality Index
BI Tools Mastery (Power BI, Tableau) | Advanced          | Must-Have                | High
Data Architecture & ETL              | Intermediate      | Must-Have                | High
Cross-Platform Reporting Logic       | Advanced          | Must-Have                | High
Stakeholder Consulting               | Advanced          | Must-Have                | High
Data Governance & QA                 | Intermediate      | Must-Have                | High
Leadership & Influence               | Intermediate      | Must-Have                | Medium
Training & Enablement                | Intermediate      | Good-to-Have             | Medium

4. Day-to-Day Deliverables Will Include

- Designing and reviewing dashboards for performance, scalability, and accuracy
- Standardizing metrics, filters, and visualizations across platforms and markets
- Troubleshooting data discrepancies and establishing QA protocols
- Supporting onboarding of new clients or business units into the reporting framework
- Publishing playbooks and SOPs on reporting automation and delivery standards
- Conducting stakeholder walkthroughs and enablement sessions

5. Key Attributes for Success in This Role

- Strategic thinker with a hands-on approach to reporting and automation
- High attention to detail and process consistency
- Confident in translating business needs into scalable BI solutions
- Adaptable to changing client needs, tools, and data environments
- Collaborative, yet assertive in driving reporting excellence

6. Essential Tools/Platforms & Certifications

Tools: Power BI, Advanced Excel, Redshift, Alteryx (basics)
Certifications: Power BI/Tableau Professional; Data Engineering/ETL certifications preferred
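The day-to-day deliverables above mention troubleshooting data discrepancies and establishing QA protocols. As a rough illustration only (not part of the posting), a minimal reconciliation check of the kind such protocols automate might look like the following Python sketch, where the keys, figures, and tolerance are entirely invented:

```python
# Minimal reconciliation check: aggregate a metric from a source
# extract and compare it against the figures a dashboard reports,
# flagging any key whose values diverge beyond a tolerance.
# All data here is hypothetical illustration, not a real client feed.

def reconcile(source_rows, dashboard_rows, tolerance=0.01):
    """Return the set of keys whose metric values disagree."""
    source_totals = {}
    for key, value in source_rows:
        source_totals[key] = source_totals.get(key, 0.0) + value

    mismatches = set()
    for key, reported in dashboard_rows:
        expected = source_totals.get(key, 0.0)
        if abs(expected - reported) > tolerance:
            mismatches.add(key)
    return mismatches

source = [("US", 120.0), ("US", 30.0), ("EU", 75.0)]
dashboard = [("US", 150.0), ("EU", 80.0)]  # EU figure drifted by 5.0
print(reconcile(source, dashboard))  # flags only "EU"
```

In practice the two inputs would come from a warehouse query and a dashboard export rather than inline literals, and mismatches would feed an alert or a QA ticket.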

Posted 6 hours ago

Apply

12.0 - 15.0 years

2 - 4 Lacs

Hyderābād

Remote

Join Amgen's Mission to Serve Patients

If you feel like you’re part of something bigger, it’s because you are. At Amgen, our shared mission—to serve patients—drives all that we do. It is key to our becoming one of the world’s leading biotechnology companies. We are global collaborators who achieve together—researching, manufacturing, and delivering ever-better products that reach over 10 million patients worldwide. It’s time for a career you can be proud of.

Principal IS Architect

Live

What you will do

Let’s do this. Let’s change the world. In this vital role, we are seeking a visionary and technically exceptional Principal IS Architect to lead the design and development of enterprise-wide intelligent search solutions. This is a senior-level IT professional who designs and oversees the implementation of robust and scalable data and AI solutions, often utilizing the Java programming language and related technologies. The role requires a strong understanding of both data architecture principles and AI/ML concepts, along with expertise in Java development and cloud platforms. You’ll lead by example—mentoring engineers, setting standards, and driving the technical vision for our next-generation search capabilities. This person will also be responsible for defining the roadmap for products. They will work closely with development teams and act as a bridge between product owners and development teams to perform proofs of concept on provided designs and technology, develop reusable components, etc.
This is a senior role in the organization which, along with a team of other architects, will help design the future state of technology at Amgen India.

- Design and Strategy: Develop and maintain foundational architecture for data and AI initiatives, define the technical roadmap, and translate business requirements into technical specifications.
- Data Architecture: Design and implement data models, database designs, and ETL processes, and lead the design of scalable data architectures. The role also includes establishing best practices for data management and ensuring data security and compliance.
- AI Architecture and Implementation: Architect and oversee the implementation of AI/ML frameworks and solutions, potentially with a focus on generative AI models, and define processes for AI/ML development and MLOps.
- Develop end-to-end solution architectures for data-driven and AI-focused applications, ensuring alignment with business objectives and technology strategy.
- Lead architecture design efforts across data pipelines, machine learning models, AI applications, and analytics platforms in our Gap Data Platform area.
- Collaborate closely with business partners, product managers, data scientists, software engineers, and the broader Global Technology Solutions teams in vetting solution designs and delivering business value.
- Provide technical leadership and mentoring in data engineering and AI best practices.
- Evaluate and recommend emerging data technologies, AI techniques, and cloud services to enhance business capabilities.
- Ensure the scalability, performance, and security of data and AI architectures.
- Establish and maintain architectural standards, including patterns and guidelines for data and AI projects.
- Create architecture artifacts (concept, system, and data architecture) for data and AI projects/initiatives.
- Create and oversee an architecture center of excellence for the data and AI area to coach and mentor resources working in this area.
- Set technical direction, best practices, and coding standards for search engineering across the organization.
- Review designs, mentor senior and mid-level engineers, and champion architecture decisions aligned with product goals and compliance needs.
- Own performance, scalability, observability, and reliability of search services in production.
- Resolve technical problems as they arise.
- Provide technical guidance and mentorship to junior developers.
- Continually research current and emerging technologies and propose changes where needed.
- Assess the business impact that certain technical choices have.
- Provide updates to stakeholders on product development processes, costs, and budgets.
- Work closely with Information Technology professionals within the company to ensure hardware is available for projects and working properly.
- Work closely with project management teams to successfully monitor the progress of initiatives.
- Current understanding of best practices regarding system security measures.
- Positive outlook in meeting challenges and working to a high level.
- Advanced understanding of business analysis techniques and processes.
- Account for possible project challenges and constraints, including risks, time, resources, and scope.
- Possess strong rapid prototyping skills and quickly translate concepts into working code.
- Take ownership of complex software projects from conception to deployment; manage software delivery scope, risk, and timeline.
- Participate in both front-end and back-end development using cloud technology.
- Develop innovative solutions using generative AI technologies.
- Define and implement robust software architectures on the cloud, AWS preferred.
- Conduct code reviews to ensure code quality and alignment to best practices.
- Create and maintain documentation on software architecture, design, deployment, disaster recovery, and operations.
- Identify and resolve technical challenges effectively.
- Stay updated with the latest trends and advancements.
- Work closely with the product team, business team, and other key partners.

Basic Qualifications:

- Master’s degree in computer science & engineering preferred, with 12-15 years of software development experience; OR Bachelor’s degree in computer science & engineering preferred, with 11-15 years of software development experience.
- Minimum of 7 years of professional experience in technology, including at least 3 years in a data architecture and AI solution architect role.
- Strong expertise in cloud platforms, preferably Azure and GCP, and associated data and AI services.
- Proven experience in architecting and deploying scalable data solutions, including data lakes, warehouses, and streaming platforms.
- Working knowledge of tools/technologies like Azure Data Factory, Confluent Kafka, Spark, Databricks, BigQuery, and Vertex AI.
- Deep understanding of AI/ML frameworks and tools such as TensorFlow, PyTorch, Spark ML, or Azure ML.

Preferred Qualifications:

- Programming languages: proficiency in multiple languages (e.g., Python, Java, or Scala) is crucial.
- Experience with API integration, serverless, and microservices architecture.
- Proficiency with Azure Data Factory, Confluent Kafka, Spark, Databricks, BigQuery, and Vertex AI.
- Proficiency with AI/ML frameworks and tools such as TensorFlow, PyTorch, Spark ML, or Azure ML.
- Solid understanding of data governance, security, privacy, and compliance standards.
- Exceptional communication, presentation, and stakeholder management skills.
- Experience working in agile project environments.

Good-to-Have Skills:

- Willingness to work on AI applications.
- Experience with popular large language models.
- Experience with the LangChain or LlamaIndex frameworks for language models.
- Experience with prompt engineering and model fine-tuning.
- Knowledge of NLP techniques for text analysis and sentiment analysis.

Soft Skills:

- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, remote teams.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Strong presentation and public speaking skills.

Thrive

What you can expect of us

As we work to develop treatments that take care of others, we also work to care for our teammates’ professional and personal growth and well-being. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Apply now for a career that defies imagination. In our quest to serve patients above all else, Amgen is the first to imagine, and the last to doubt. Join us. careers.amgen.com

Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.

Posted 6 hours ago

Apply

8.0 years

3 - 6 Lacs

Hyderābād

Remote

Company Description

It all started in sunny San Diego, California in 2004 when a visionary engineer, Fred Luddy, saw the potential to transform how we work. Fast forward to today — ServiceNow stands as a global market leader, bringing innovative AI-enhanced technology to over 8,100 customers, including 85% of the Fortune 500®. Our intelligent cloud-based platform seamlessly connects people, systems, and processes to empower organizations to find smarter, faster, and better ways to work. But this is just the beginning of our journey. Join us as we pursue our purpose to make the world work better for everyone.

Job Description

What you get to do in this role:

- Develop and maintain AI-powered internal tools that automate workflows and boost stakeholders’ productivity, with a specialized focus on sales analytics and strategic planning operations.
- Build and deliver ETL pipelines for Power BI/Snowflake datasets optimized for LLM consumption, enabling efficient AI-driven analysis and empowering power users.
- Collaborate cross-functionally with Data & Analytics teams and Sales Operations teams to identify high-value AI use cases and rapidly prototype AI-enabled utilities that align with business goals.
- Transform enterprise data from Power BI and Snowflake into LLM-optimized formats while ensuring data integrity and reliable performance across AI-driven solutions.
- Manage the complete AI agent development lifecycle, from ideation and testing through production deployment and user adoption, while implementing continuous integration and documenting best practices.
- Champion organizational AI adoption by ensuring seamless system integration, demonstrating clear business value, and maintaining high standards for performance and user experience.

Qualifications

In order to be successful in this role:

- Experience in leveraging or critically thinking about how to integrate AI into work processes, decision-making, or problem-solving.
This may include using AI-powered tools, automating workflows, analyzing AI-driven insights, or exploring AI's potential impact on the function or industry.

- 8+ years of proven track record supporting sales organizations or sales business processes through analytics, automation, or technical solutions.
- Demonstrated history of building and deploying AI agents or automation tools in real-world business settings for sales workflow automation.
- Proven experience with semantic modeling in Power BI or Snowflake, plus familiarity with transforming sales data models for LLM integration and sales analytics optimization (including data restructuring for LLM integration).
- Strong understanding of data engineering, APIs, and cloud-based architecture, with experience in sales data.
- Ability to function both independently and as part of cross-functional teams, including sales teams and business stakeholders, in fast-paced environments.
- Hands-on experience with rapid prototyping, iterative testing, and agile methodologies, specifically applied to sales tools and business process improvements.

Additional Information

Work Personas

We approach our distributed world of work with flexibility and trust. Work personas (flexible, remote, or required in office) are categories that are assigned to ServiceNow employees depending on the nature of their work and their assigned work location. Learn more here. To determine eligibility for a work persona, ServiceNow may confirm the distance between your primary residence and the closest ServiceNow office using a third-party service.

Equal Opportunity Employer

ServiceNow is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, creed, religion, sex, sexual orientation, national origin or nationality, ancestry, age, disability, gender identity or expression, marital status, veteran status, or any other category protected by law.
In addition, all qualified applicants with arrest or conviction records will be considered for employment in accordance with legal requirements.

Accommodations

We strive to create an accessible and inclusive experience for all candidates. If you require a reasonable accommodation to complete any part of the application process, or are unable to use this online application and need an alternative method to apply, please contact globaltalentss@servicenow.com for assistance.

Export Control Regulations

For positions requiring access to controlled technology subject to export control regulations, including the U.S. Export Administration Regulations (EAR), ServiceNow may be required to obtain export control approval from government authorities for certain individuals. All employment is contingent upon ServiceNow obtaining any export license or other approval that may be required by relevant export control authorities.

From Fortune. ©2025 Fortune Media IP Limited. All rights reserved. Used under license.
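The ServiceNow role above centers on transforming Power BI/Snowflake data into LLM-optimized formats. One common approach is to serialize rows into compact, self-describing text records. The Python sketch below is an illustration only: the column names and figures are invented, and a real pipeline would read from Snowflake or a Power BI dataset rather than inline literals.

```python
# Sketch: flatten tabular rows into compact, self-describing text
# records that a language model can consume directly. Columns and
# sample data are invented for illustration.

def rows_to_llm_records(columns, rows):
    """Serialize each row as 'col=value' pairs on one line."""
    records = []
    for row in rows:
        pairs = [f"{col}={val}" for col, val in zip(columns, row)]
        records.append("; ".join(pairs))
    return records

columns = ["region", "quarter", "pipeline_usd"]
rows = [("AMER", "Q1", 1200000), ("EMEA", "Q1", 950000)]
for record in rows_to_llm_records(columns, rows):
    print(record)
# region=AMER; quarter=Q1; pipeline_usd=1200000
# region=EMEA; quarter=Q1; pipeline_usd=950000
```

Keeping column names inline with each value preserves context when rows are later chunked for a prompt, at the cost of a few extra tokens per row.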

Posted 6 hours ago

Apply

2.0 - 6.0 years

3 - 8 Lacs

Hyderābād

On-site

Join Amgen’s Mission of Serving Patients

At Amgen, if you feel like you’re part of something bigger, it’s because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we’ve helped pioneer the world of biotech in our fight against the world’s toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you’ll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you’ll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

Associate IS Engineer - Veeva Vault PromoMats_MedComms

What you will do

Let’s do this. Let’s change the world. In this vital role on the Veeva Vault team, you will be responsible for designing, developing, and maintaining software applications and solutions in Amgen’s Vault PromoMats and Vault MedComms that meet business needs, and for ensuring the availability and performance of critical systems and applications. This role involves working closely with product managers, designers, and other engineers to create high-quality, scalable software solutions, automating operations, monitoring system health, and responding to incidents to minimize downtime.

Roles & Responsibilities:

- Possess strong rapid prototyping skills and quickly translate concepts into working code.
- Lead day-to-day operations and maintenance of Amgen’s Vault PromoMats and Vault MedComms and their hosted applications.
- Stay updated with the latest trends, advancements, and standard processes for the Veeva Vault Platform ecosystem.
- Design, develop, and implement applications and modules, including custom reports, SDKs, interfaces, and enhancements.
- Analyze and understand the functional and technical requirements of applications, solutions, and systems, and translate them into software architecture and design specifications.
- Develop and complete unit tests, integration tests, and other testing strategies to ensure software quality, following the IS change control and GxP Validation process while exhibiting expertise in Risk-Based Validation methodology.
- Work closely with multi-functional teams, including product management, design, and QA, to deliver high-quality software on time.
- Maintain detailed documentation of software designs, code, and development processes.
- Work on integrating with other systems and platforms to ensure seamless data flow and functionality.
- Stay up to date on Veeva Vault features, new releases, and standard methodologies around Veeva Platform governance.

What we expect of you

Basic Qualifications and Experience:

- Bachelor’s degree and 2 to 6 years of Information Systems experience or a related field.

Functional Skills:

Must-Have Skills:

- Experience with Amgen’s Vault PromoMats and Vault MedComms, including Veeva configuration settings and custom builds.
- Strong knowledge of information systems and network technologies.
- Experience in building configured and custom solutions on the Veeva Vault Platform.
- Experience in managing systems and in implementing and validating projects in GxP-regulated environments.
- Extensive expertise in the SDLC, including requirements, design, testing, data analysis, and creating and managing change controls.
- Proficiency in programming languages such as Python, JavaScript, etc.
- Solid understanding of software development methodologies, including Agile and Scrum.
- Experience with version control systems such as Git.

Good-to-Have Skills:

- Familiarity with relational databases (such as MySQL, SQL Server, PostgreSQL, etc.)
- Proficiency in programming languages such as Python, JavaScript, or others.
- Outstanding written and verbal communication skills, and the ability to translate technical concepts for non-technical audiences.
- Experience with ETL tools (Informatica, Databricks).
- Experience with API integrations such as MuleSoft.
- Solid understanding of, and proficiency in writing, SQL queries.
- Hands-on experience with reporting tools such as Tableau, Spotfire, and Power BI.

Professional Certifications:

- Veeva Vault Platform Administrator or equivalent Vault certification (must-have)
- SAFe for Teams (preferred)

Soft Skills:

- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, virtual teams.
- Team-oriented, with a focus on achieving team goals.
- Strong presentation and public speaking skills.

Work Hours:

This position requires you to work a later shift and may be assigned a second- or third-shift schedule. Candidates must be willing and able to work during evening or night shifts, as required. Potential shifts (subject to change based on business requirements): Second Shift: 2:00 pm – 10:00 pm IST; Third Shift: 10:00 pm – 7:00 am IST.

What you can expect of us

As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we’ll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Apply now and make a lasting impact with the Amgen team. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed, and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients.
Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 6 hours ago

Apply

4.0 years

8 - 8 Lacs

Hyderābād

On-site

Hyderabad, India | Technology | In-Office | 11047

Job Description

Job Purpose

The Property Data Engineer is responsible for developing and maintaining data conversion programs that transform raw property assessment data into standardized formats based on specifications provided by Property Data Analysts and Senior Analysts. This role requires not only advanced programming and ETL skills but also a deep understanding of the structure, nuances, and business context of assessment data. Even with clear and well-documented conversion instructions, engineers without prior exposure to this domain often face significant challenges in interpreting and transforming the data accurately. The Data Engineer plays a critical role in ensuring the accuracy, efficiency, and scalability of the data processing pipelines that support the Assessor Operations.

Responsibilities

Depending on the specific team and role, the Property Data Engineer may be responsible for some or all of the following tasks:

- Develop and maintain data conversion programs using C#, Python, JavaScript, and SQL.
- Implement ETL workflows using tools such as Pentaho Kettle, SSIS, and internal applications.
- Collaborate with Analysts and Senior Analysts to interpret conversion instructions and translate them into executable code.
- Troubleshoot and resolve issues identified during quality control reviews.
- Recommend and implement automation strategies to improve data processing efficiency.
- Perform quality checks on converted data and ensure alignment with business rules and standards.
- Contribute to the development of internal tools and utilities to support data transformation tasks.
- Maintain documentation for code, workflows, and processes to support team knowledge sharing.

Programming (Skill Level: Advanced to Expert)

- Create and maintain conversion programs in SQL and in Visual Studio using C#, Python, or JavaScript.
- Use JavaScript within Pentaho Kettle workflows and SSIS for data transformation.
- Build and enhance in-house tools to support custom data processing needs.
- Ensure code is modular, maintainable, and aligned with internal development standards.
- Ensure code quality through peer reviews, testing, and adherence to development standards.

ETL Execution (Skill Level: Advanced to Expert)

- Execute and troubleshoot ETL processes using tools like Kettle, SSIS, and proprietary tools.
- Input parameters, execute jobs, and perform quality checks on output files.
- Troubleshoot ETL failures and optimize performance.
- Recommend and implement automation strategies to improve data processing efficiency and accuracy.

Data File Manipulation (Skill Level: Advanced to Expert)

- Work with a wide variety of file formats (CSV, Excel, TXT, XML, etc.) to prepare data for conversion.
- Apply advanced techniques to clean, merge, and structure data.
- Develop scripts and tools to automate repetitive data preparation tasks.
- Ensure data is optimized for downstream ETL and analytical workflows.

Data Analysis (Skill Level: Supportive – Applied)

- Leverage prior experience in data analysis to independently review and interpret source data when developing or refining conversion programs.
- Analyze data structures, field patterns, and anomalies to improve the accuracy and efficiency of conversion logic.
- Use SQL queries, Excel tools, and internal utilities to validate assumptions and enhance the clarity of analyst-provided instructions.
- Collaborate with Analysts and Senior Analysts to clarify ambiguous requirements and suggest improvements based on technical feasibility and data behavior.
- Conduct targeted research using public data sources (e.g., assessor websites) to resolve data inconsistencies or fill in missing context during development.

Quality Control (Skill Level: Engineer-Level)

- Perform initial quality control on converted data outputs before formal review by Associates, Analysts, or Senior Analysts.
- Validate that the program output aligns with conversion instructions and meets formatting and structural expectations.
- Use standard scripts, ad-hoc SQL queries, and internal tools to identify and correct discrepancies in the data.
- Address issues identified during downstream QC reviews by updating conversion logic or collaborating with analysts to refine requirements.
- Ensure that all deliverables meet internal quality standards prior to release or further review.

Knowledge and Experience

Minimum Education: Bachelor’s degree in Computer Science, Information Systems, Software Engineering, Data Engineering, or a related technical field; or equivalent practical experience in software development or data engineering.

Preferred Education: Bachelor’s degree (as above) plus additional coursework or certifications in:

- Data Engineering
- ETL Development
- Cloud Data Platforms (e.g., AWS, Azure, GCP)
- SQL and Database Management
- Programming (C#, Python, JavaScript)

- 4+ years of experience in software development, data engineering, or ETL pipeline development.
- Expert-level proficiency in languages and environments such as SQL, C# in Visual Studio, Python, and JavaScript.
- Experience with ETL tools such as Pentaho Kettle, SSIS, or similar platforms.
- Strong understanding of data structures, file formats (CSV, Excel, TXT, XML), and data transformation techniques.
- Familiarity with relational databases and SQL for data querying and validation.
- Ability to read and interpret technical documentation and conversion instructions.
- Strong problem-solving skills and attention to detail.
- Ability to work independently and collaboratively in a fast-paced environment.
- Familiarity with property assessment, GIS, tax, or public property records data.

Preferred Skills

- Experience developing and maintaining data conversion programs in Visual Studio.
- Experience with property assessment, GIS, tax, or public records data.
- Experience building internal tools or utilities to support data transformation workflows.
Knowledge of version control (e.g., Git) and issue-tracking tools (e.g., Jira), along with agile development practices. Exposure to cloud-based data platforms or services (e.g., Azure Data Factory, AWS Glue). Ability to troubleshoot and optimize ETL performance and data quality. Strong written and verbal communication skills for cross-functional collaboration.

Posted 6 hours ago

Apply

7.0 years

5 - 18 Lacs

India

Remote

Role: Senior Appian Developer (Hybrid) Position Type: Full-Time Contract (40hrs/week) Contract Duration: Long Term Work Schedule: 8 hours/day (Mon-Fri) Location: Hyderabad, India - Hybrid (3 days/week on site) What You'll Do: Troubleshoot and resolve technical issues related to Appian applications, ensuring minimal downtime and optimal performance. Diagnose and fix problems in Talend workflows, focusing on data extraction, transformation, and loading processes. Manage and troubleshoot SQL Server databases, ensuring data integrity, performance, and security. Maintain a working knowledge of automations built on Power Automate from a troubleshooting and maintenance perspective. Handle Autosys job scheduling and automation, ensuring smooth execution of batch jobs and workflows. Collaborate with cross-functional teams to gather requirements, design solutions, and implement troubleshooting strategies. Document and track issues, resolutions, and best practices to improve the overall troubleshooting process. Provide technical support during production releases and maintenance windows, working closely with the Operations team. Stay up-to-date with the latest industry trends and best practices in troubleshooting and technical support. Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. Talents Needed for Success: Minimum of 7 years of experience in technical troubleshooting and support. Proven experience in troubleshooting Appian applications, with a strong understanding of Appian architecture and integration patterns. Expertise in Talend, including designing and troubleshooting ETL processes. Proficiency in SQL Server, including database design, optimization, and performance tuning. Experience with Autosys job scheduling and automation, including setting up and managing jobs using Autosys. Strong analytical and problem-solving skills, with the ability to work independently and as part of a team. 
Excellent communication and interpersonal skills, with the ability to collaborate effectively with stakeholders at all levels. Additional Skills: Experience with version control systems (e.g., Git) and collaboration tools (e.g., Jira, Confluence). Knowledge of scripting languages such as Python and Shell/Batch programming is a plus. Understanding of Agile processes and methodologies, with experience working in an Agile framework using Scrum. Job Types: Full-time, Contractual / Temporary Contract length: 12 months Pay: ₹543,352.07 - ₹1,855,655.80 per year Application Question(s): How many years of experience do you have troubleshooting Appian applications? How much experience do you have in Talend, including designing and troubleshooting ETL processes? How much experience do you have with Autosys job scheduling and automation? Are you comfortable working 3 days onsite and 2 days remote each week? How soon can you join us? License/Certification: Appian L2 certification (Required) Location: Hyderabad Jubilee Ho, Hyderabad, Telangana (Required)

Posted 6 hours ago

Apply

0 years

0 Lacs

Telangana

On-site

We are looking for an experienced and motivated Senior Data Engineer to join our dynamic team. This role primarily focuses on MDM and associated ETL and real-time feeds monitoring and support. This engineer will be part of the global L1/L2 production support team, which is split between Chubb Engineering Centers in India and Mexico. Key responsibilities will include monitoring ETL processes, handling automated issues, and ensuring compliance with security policies. A good understanding of MDM Informatica and Data Factory is preferred. The ideal candidate will have experience with PowerCenter, MDM, and Azure Data Factory, and be able to identify and resolve data quality issues, proactively monitor production systems, and address performance bottlenecks and other ETL-related problems. Responsibilities: Monitor ETL jobs including PowerCenter/IICS, Kafka-based near real-time updates, and batch processes. Troubleshoot production incidents. Understand data mapping and data modeling methodologies including normal form, star, and snowflake to reduce data redundancy and improve data integrity. Maintain knowledge of current and emerging developments/trends for assigned area(s) of responsibility, assess the impact, and collaborate with the Scrum Team and Leadership to

Posted 6 hours ago

Apply

5.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Cortex is urgently hiring for the role: "Data Engineer" Experience: 5 to 8 years Location: Bangalore, Noida, and Hyderabad (Hybrid; 2 days per week in office required) NP: Immediate to 10 days only Key skills: Candidates must have experience in Python, Kafka Streams, PySpark, and Azure Databricks Role Overview We are looking for a highly skilled engineer with expertise in Kafka, Python, and Azure Databricks (preferred) to drive our healthcare data engineering projects. The ideal candidate will have deep experience in real-time data streaming, cloud-based data platforms, and large-scale data processing. This role requires strong technical leadership, problem-solving abilities, and the ability to collaborate with cross-functional teams. Key Responsibilities Lead the design, development, and implementation of real-time data pipelines using Kafka, Python, and Azure Databricks. Architect scalable data streaming and processing solutions to support healthcare data workflows. Develop, optimize, and maintain ETL/ELT pipelines for structured and unstructured healthcare data. Ensure data integrity, security, and compliance with healthcare regulations (HIPAA, HITRUST, etc.). Collaborate with data engineers, analysts, and business stakeholders to understand requirements and translate them into technical solutions. Troubleshoot and optimize Kafka streaming applications, Python scripts, and Databricks workflows. Mentor junior engineers, conduct code reviews, and ensure best practices in data engineering. Stay updated with the latest cloud technologies, big data frameworks, and industry trends. If you are interested, kindly send your resume to us by clicking "Easy Apply". This job is posted by Aishwarya.K Business HR - Day recruitment Cortex Consultants LLC (US) | Cortex Consulting Pvt Ltd (India) | Tcell (Canada) US | India | Canada

Posted 6 hours ago

Apply

10.0 years

20 - 30 Lacs

Hyderābād

On-site

Job Title: SAP FICO Consultant (Carve-out) Experience required: 10+ Years Location: Hyderabad Work mode: Onsite Availability: Immediate to 15 days Job Description: All candidates must have worked on carve-outs. 10+ years of experience in SAP FICO implementation and support. At least 2–3 full-lifecycle carve-out projects or M&A separation projects in an SAP environment. Strong understanding of SAP Financial Accounting and Controlling, including: GL, AP, AR, Asset Accounting Cost Center Accounting, Internal Orders, Product Costing, and Profitability Analysis (COPA) Experience with SAP S/4HANA is highly desirable. Deep knowledge of legal entity structuring, company code creation, and data partitioning. Experience with cross-module integration (SD, MM, PP). Strong data migration, cleansing, and mapping skills. Excellent communication and stakeholder management skills. Understanding of compliance (IFRS/GAAP), SOX controls, and audit readiness during separation. Responsibilities: Lead or support the SAP FICO stream in carve-out or divestiture projects, ensuring smooth financial separation and reporting. Perform financial impact analysis, legal entity setup, and company code restructuring. Design and configure SAP FICO modules (GL, AR, AP, AA, CO, PCA, CCA, COPA) for the new entity or separated business unit. Manage data separation, including historical and open financial transactions, master data, and cost objects. Work with SAP migration tools (LSMW, BODS, or third-party ETL tools) to extract and transform financial data for the new entity. Coordinate closely with the Basis, Security, SD/MM/PP teams, and external stakeholders to ensure a complete functional carve-out. Support cutover planning, testing (SIT/UAT), and hypercare phases. Provide advisory support on taxation, intercompany transactions, and financial consolidation implications. Document business process design, configurations, and user guides. 
Job Types: Full-time, Permanent Pay: ₹2,000,000.00 - ₹3,000,000.00 per year Schedule: Day shift Experience: SAP Finance & Controlling: 10 years (Required) SAP S/4HANA: 8 years (Required) Data migration: 10 years (Required) Carve-Out Project: 4 years (Required) SAP FICO: 10 years (Required) Location: Hyderabad, Telangana (Preferred) Work Location: In person

Posted 6 hours ago

Apply

4.0 years

0 Lacs

Telangana

On-site

We are looking for an experienced and motivated Senior Data Engineer to join our dynamic team. This role primarily focuses on MDM and associated ETL and real-time feeds monitoring and support. This engineer will be part of the global L1/L2 production support team, which is split between Chubb Engineering Centers in India and Mexico. Key responsibilities will include monitoring ETL processes, handling automated issues, and ensuring compliance with security policies. A good understanding of MDM Informatica and Data Factory is preferred. The ideal candidate will have experience with PowerCenter, MDM, and Azure Data Factory, and be able to identify and resolve data quality issues, proactively monitor production systems, and address performance bottlenecks and other ETL-related problems. Responsibilities: Monitor ETL jobs including PowerCenter/IICS, Kafka-based near real-time updates, and batch processes. Troubleshoot production incidents. Understand data mapping and data modeling methodologies including normal form, star, and snowflake to reduce data redundancy and improve data integrity. Maintain knowledge of current and emerging developments/trends for assigned area(s) of responsibility, assess the impact, and collaborate with the Scrum Team and Leadership. Four-year/bachelor’s degree or equivalent work experience (4 years of experience in lieu of a bachelor's). At least 5+ years with a strong understanding of ETL development concepts and tools (e.g., PowerCenter and/or IICS, Informatica MDM, Azure Data Factory, Snowflake). Experience with Data Warehousing and Business Intelligence concepts and technologies. Knowledge of SQL and advanced programming languages such as Python and Java. Demonstrated critical thinking skills and the ability to identify and resolve data quality issues, performance bottlenecks, and other ETL-related problems. Experience with Agile methodologies and project-management skills. Excellent communication and interpersonal skills. 2+ years of experience in scheduling jobs using Autosys (or a comparable distributed scheduler). 3+ years of experience writing Unix/Linux or Windows scripts in tools such as Perl, shell script, Python, etc. 3+ years of experience in creating complex technical specifications from business requirements/specifications

Posted 6 hours ago

Apply

3.0 years

6 - 7 Lacs

Hyderābād

Remote

Working with Us Challenging. Meaningful. Life-changing. Those aren't words that are usually associated with a job. But working at Bristol Myers Squibb is anything but usual. Here, uniquely interesting work happens every day, in every department. From optimizing a production line to the latest breakthroughs in cell therapy, this is work that transforms the lives of patients, and the careers of those who do it. You'll get the chance to grow and thrive through opportunities uncommon in scale and scope, alongside high-achieving teams. Take your career farther than you thought possible. Bristol Myers Squibb recognizes the importance of balance and flexibility in our work environment. We offer a wide variety of competitive benefits, services and programs that provide our employees with the resources to pursue their goals, both at work and in their personal lives. Read more: careers.bms.com/working-with-us . Roles & Responsibilities Develop, maintain, and manage advanced reporting, analytics, dashboards and other BI solutions for HR stakeholders Partner with senior analysts to build visualizations to communicate insights and recommendations to stakeholders at various levels of the organization Partner with HR senior analysts to implement statistical models, decision support models, and optimization techniques to solve complex business problems. Collaborate with cross-functional teams to gather/analyse data, define problem statements and identify KPIs for decision-making Perform and document data analysis, data validation, and data mapping/design. Collaborate with HR stakeholders to understand business objectives and translate them into projects and actionable recommendations Stay up to date with industry trends, emerging methodologies, and best practices related to reporting analytics / visualization optimization and decision support The HR Data Analyst will play a critical role in ensuring the availability and integrity of HR data to drive informed decision-making. 
Skills and competencies Strong analytical thinking and problem-solving skills, with working knowledge of statistical analysis, optimization techniques, and decision support models. Ability to present complex information to non-technical stakeholders in a clear and concise manner; skilled in creating relevant and engaging PowerPoint presentations. Proficiency in data analysis techniques, including the use of Tableau, ETL tools (Python, R, Domino), and statistical software packages. Advanced skills in Power BI, Power Query, DAX, and data visualization best practices. Experience with data modelling, ETL processes, and connecting to various data sources. Solid understanding of SQL and relational databases. Exceptional attention to detail, with the ability to proactively detect data anomalies and ensure data accuracy. Ability to work collaboratively in cross-functional teams and manage multiple projects simultaneously. Strong capability to work with large datasets, ensuring the accuracy and reliability of analyses. Strong business acumen, with the ability to translate analytical findings into actionable insights and recommendations. Working knowledge of data modelling to support analytics needs. Experience conducting thorough Exploratory Data Analysis (EDA) to summarize, visualize, and validate data quality and trends. Ability to apply foundational data science or basic machine learning techniques (such as regression, clustering, or forecasting) when appropriate. Experience: Bachelor's or master's degree in a relevant field such as Statistics, Mathematics, Economics, Operations Research or a related discipline. Minimum of 3+ years of total relevant experience Business experience with visualization tools (e.g., PowerBI) Experience with data querying languages (e.g., SQL), scripting languages (Python) Problem-solving skills with understanding and practical experience across most Statistical Modelling and Machine Learning Techniques. 
Academic knowledge alone is also acceptable. Ability to handle, and maintain the confidentiality of, highly sensitive information Experience initiating and completing analytical projects with minimal guidance Experience communicating results of analysis using compelling and persuasive oral and written storytelling techniques Hands-on experience working with large datasets, statistical software packages (e.g., R, Python), and data visualization tools such as Tableau and Power BI. Experience with ETL processes, writing complex SQL queries, and data manipulation techniques. Experience in HR analytics is a nice-to-have If you come across a role that intrigues you but doesn't perfectly line up with your resume, we encourage you to apply anyway. You could be one step away from work that will transform your life and career. Uniquely Interesting Work, Life-changing Careers With a single vision as inspiring as Transforming patients' lives through science™ , every BMS employee plays an integral role in work that goes far beyond ordinary. Each of us is empowered to apply our individual talents and unique perspectives in a supportive culture, promoting global participation in clinical trials, while our shared values of passion, innovation, urgency, accountability, inclusion and integrity bring out the highest potential of each of our colleagues. On-site Protocol BMS has an occupancy structure that determines where an employee is required to conduct their work. This structure includes site-essential, site-by-design, field-based and remote-by-design jobs. The occupancy type that you are assigned is determined by the nature and responsibilities of your role: Site-essential roles require 100% of shifts onsite at your assigned facility. Site-by-design roles may be eligible for a hybrid work model with at least 50% onsite at your assigned facility. 
For these roles, onsite presence is considered an essential job function and is critical to collaboration, innovation, productivity, and a positive Company culture. For field-based and remote-by-design roles the ability to physically travel to visit customers, patients or business partners and to attend meetings on behalf of BMS as directed is an essential job function. BMS is dedicated to ensuring that people with disabilities can excel through a transparent recruitment process, reasonable workplace accommodations/adjustments and ongoing support in their roles. Applicants can request a reasonable workplace accommodation/adjustment prior to accepting a job offer. If you require reasonable accommodations/adjustments in completing this application, or in any part of the recruitment process, direct your inquiries to adastaffingsupport@bms.com . Visit careers.bms.com/ eeo -accessibility to access our complete Equal Employment Opportunity statement. BMS cares about your well-being and the well-being of our staff, customers, patients, and communities. As a result, the Company strongly recommends that all employees be fully vaccinated for Covid-19 and keep up to date with Covid-19 boosters. BMS will consider for employment qualified applicants with arrest and conviction records, pursuant to applicable laws in your area. If you live in or expect to work from Los Angeles County if hired for this position, please visit this page for important additional information: https://careers.bms.com/california-residents/ Any data processed in connection with role applications will be treated in accordance with applicable data privacy policies and regulations.

Posted 6 hours ago

Apply

2.0 - 4.0 years

6 - 9 Lacs

Hyderābād

On-site

Summary As a Data Analyst, you will be responsible for designing, developing, and maintaining efficient and scalable data pipelines for data ingestion, transformation, and storage. About the Role Location – Hyderabad #LI-Hybrid About the Role: As a Data Analyst, you will be responsible for designing, developing, and maintaining efficient and scalable data pipelines for data ingestion, transformation, and storage. Key Responsibilities: Design, develop, and maintain efficient and scalable data pipelines for data ingestion, transformation, and storage. Collaborate with cross-functional teams, including data analysts, business analysts, and BI, to understand data requirements and design appropriate solutions. Build and maintain data infrastructure in the cloud, ensuring high availability, scalability, and security. Write clean, efficient, and reusable code in scripting languages, such as Python or Scala, to automate data workflows and ETL processes. Implement real-time and batch data processing solutions using streaming technologies like Apache Kafka, Apache Flink, or Apache Spark. Perform data quality checks and ensure data integrity across different data sources and systems. Optimize data pipelines for performance and efficiency, identifying and resolving bottlenecks and performance issues. Collaborate with DevOps teams to deploy, automate, and maintain data platforms and tools. Stay up to date with industry trends, best practices, and emerging technologies in data engineering, scripting, streaming data, and cloud technologies Essential Requirements: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field with an overall experience of 2-4 Years. Proven experience as a Data Engineer or similar role, with a focus on scripting, streaming data pipelines, and cloud technologies like AWS, GCP or Azure. Strong programming and scripting skills in languages like Python, Scala, or SQL. 
Experience with cloud-based data technologies, such as AWS, Azure, or Google Cloud Platform. Hands-on experience with streaming technologies, such as AWS Streamsets, Apache Kafka, Apache Flink, or Apache Spark Streaming. Strong experience with Snowflake (Required) Proficiency in working with big data frameworks and tools, such as Hadoop, Hive, or HBase. Knowledge of SQL and experience with relational and NoSQL databases. Familiarity with data modelling and schema design principles. Strong problem-solving skills and the ability to work in a fast-paced, collaborative environment. Excellent communication and teamwork skills. Commitment to Diversity and Inclusion: Novartis is committed to building an outstanding, inclusive work environment and diverse teams' representative of the patients and communities we serve. Accessibility and accommodation: Novartis is committed to working with and providing reasonable accommodation to individuals with disabilities. If, because of a medical condition or disability, you need a reasonable accommodation for any part of the recruitment process, or in order to perform the essential functions of a position, please send an e-mail to diversityandincl.india@novartis.com and let us know the nature of your request and your contact information. Please include the job requisition number in your message Why Novartis: Helping people with disease and their families takes more than innovative science. It takes a community of smart, passionate people like you. Collaborating, supporting and inspiring each other. Combining to achieve breakthroughs that change patients’ lives. Ready to create a brighter future together? https://www.novartis.com/about/strategy/people-and-culture Join our Novartis Network: Not the right Novartis role for you? 
Sign up to our talent community to stay connected and learn about suitable career opportunities as soon as they come up: https://talentnetwork.novartis.com/network Benefits and Rewards: Read our handbook to learn about all the ways we’ll help you thrive personally and professionally: https://www.novartis.com/careers/benefits-rewards Division US Business Unit Universal Hierarchy Node Location India Site Hyderabad (Office) Company / Legal Entity IN10 (FCRS = IN010) Novartis Healthcare Private Limited Functional Area Marketing Job Type Full time Employment Type Regular Shift Work No

Posted 6 hours ago

Apply

3.0 years

6 - 7 Lacs

Hyderābād

On-site

Job Title: Data Engineer Total Experience: 3+ Years Location: Hyderabad Job Type: Contract Work Mode: On-site Notice Period: Immediate to 15 Days Work Timings: Monday to Friday, 10 am to 7 pm (IST) Interview Process Level 1: HR Screening (Personality Assessment) Level 2: Technical Round Level 3: Final Round (Note: The interview levels may vary) Company Overview Compileinfy Technology Solutions Pvt. Ltd. is a fast-growing IT services and consulting company delivering tailored digital solutions across industries. At Compileinfy, we promote a culture of ownership, critical thinking, and technological excellence. Job Summary We are seeking a highly motivated Data Engineer to join our expanding Data & AI team. This role offers the opportunity to design and develop robust, scalable data pipelines and infrastructure, ensuring the delivery of high-quality, timely, and accessible data throughout the organization. As a Data Engineer, you will collaborate across teams to build and optimize data solutions that support analytics, reporting, and business operations. The ideal candidate combines deep technical expertise, strong communication, and a drive for continuous improvement. Who You Are: Experienced in designing and building data pipelines for ingestion, transformation, and loading (ETL/ELT) of data from diverse sources to data warehouses or lakes. Proficient in SQL and at least one programming language, such as Python, Java, or Scala. Skilled at working with both relational databases (e.g., PostgreSQL, MySQL) and big data platforms (e.g., Hadoop, Spark, Hive, EMR). Competent in cloud environments (AWS, GCP, Azure), data lake, and data warehouse solutions. Comfortable optimizing and managing the quality, reliability, and timeliness of data flows. Ability to translate business requirements into technical specifications and collaborate effectively with stakeholders, including data scientists, analysts, and engineers. 
Detail-oriented, with strong documentation skills and a commitment to data governance, security, and compliance. Proactive, agile, and adaptable to a fast-paced environment with evolving business needs. What You Will Do: Design, build, and manage scalable ETL/ELT pipelines to ingest, transform, and deliver data efficiently from diverse sources to centralized repositories such as lakes or warehouses. Implement validation, monitoring, and cleansing procedures to ensure data consistency, integrity, and adherence to organizational standards. Develop and maintain efficient database architectures, optimize data storage, and streamline data integration flows for business intelligence and analytics. Work closely with data scientists, analysts, and business users to gather requirements and deliver tailored data solutions supporting business objectives. Document data models, dictionaries, pipeline architectures, and data flows to ensure transparency and knowledge sharing. Implement and enforce data security and privacy measures, ensuring compliance with regulatory requirements and best practices. Monitor, troubleshoot, and resolve issues in data pipelines and infrastructure to maintain high availability and performance. Preferred Qualifications: Bachelor’s or higher degree in Computer Science, Information Technology, Engineering, or a related field. 3-4 years of experience in data engineering, ETL development, or related areas. Strong SQL and data modeling expertise with hands-on experience in data warehousing or business intelligence projects. Familiarity with AWS data integration tools (e.g., Glue, Athena), messaging/streaming platforms (e.g., Kafka, AWS MSK), and big data tools (Spark, Databricks). Proficiency with version control, testing, and deployment tools for maintaining code and ensuring best practices. Experience in managing data security, quality, and operational support in a production environment. 
What You Deliver Comprehensive data delivery documentation (data dictionary, mapping documents, models). Optimized, reliable data pipelines and infrastructure supporting the organization’s analytics and reporting needs. Operations support and timely resolution of data-related issues aligned with service level agreements. Interdependencies / Internal Engagement Actively engage with cross-functional teams to align on requirements, resolve issues, and drive improvements in data delivery, architecture, and business impact. Become a trusted partner in fostering a data-centric culture and ensuring the long-term scalability and integrity of our data ecosystem Why Join Us? At Compileinfy, we value innovation, collaboration, and professional growth. You'll have the opportunity to work on exciting, high-impact projects and be part of a team that embraces cutting-edge technologies. We provide continuous learning and career advancement opportunities in a dynamic, inclusive environment. Perks and Benefits Competitive salary and benefits package Flexible work environment Opportunities for professional development and training A supportive and collaborative team culture Application Process Submit your resume with the subject line: “Data Engineer Application – [Your Name]” to recruitmentdesk@compileinfy.com Job Types: Full-time, Contractual / Temporary Contract length: 12 months Pay: ₹600,000.00 - ₹700,000.00 per year Benefits: Health insurance Provident Fund Work Location: In person

Posted 6 hours ago

Apply

Exploring ETL Jobs in India

The ETL (Extract, Transform, Load) job market in India is thriving with numerous opportunities for job seekers. ETL professionals play a crucial role in managing and analyzing data effectively for organizations across various industries. If you are considering a career in ETL, this article will provide you with valuable insights into the job market in India.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Pune
  4. Hyderabad
  5. Chennai

These cities are known for their thriving tech industries and often have a high demand for ETL professionals.

Average Salary Range

The average salary range for ETL professionals in India varies based on experience levels. Entry-level positions typically start at around ₹3-5 lakhs per annum, while experienced professionals can earn upwards of ₹10-15 lakhs per annum.

Career Path

In the ETL field, a typical career path may include roles such as:

  • Junior ETL Developer
  • ETL Developer
  • Senior ETL Developer
  • ETL Tech Lead
  • ETL Architect

As you gain experience and expertise, you can progress to higher-level roles within the ETL domain.

Related Skills

Alongside ETL, professionals in this field are often expected to have skills in:

  • SQL
  • Data Warehousing
  • Data Modeling
  • ETL Tools (e.g., Informatica, Talend)
  • Database Management Systems (e.g., Oracle, SQL Server)

Having a strong foundation in these related skills can enhance your capabilities as an ETL professional.
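To see how these skills fit together, here is a minimal, self-contained sketch of an extract-transform-load flow using only Python's standard library (csv and sqlite3). The data, table name, and cleaning rules are invented for illustration; real pipelines would read from files or databases and use a dedicated ETL tool or warehouse.

```python
import csv
import io
import sqlite3

# Extract: hypothetical CSV export (in practice, read from a file or API).
source = io.StringIO("id,name,amount\n1,alice,10.5\n2,bob,\n3,carol,7.25\n")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, name TEXT, amount REAL)")

rows = []
for rec in csv.DictReader(source):
    # Transform: default missing amounts to 0.0 and normalize name casing.
    amount = float(rec["amount"]) if rec["amount"] else 0.0
    rows.append((int(rec["id"]), rec["name"].strip().title(), amount))

# Load: bulk insert the cleaned rows into the target table.
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)
conn.commit()

print(conn.execute("SELECT COUNT(*), SUM(amount) FROM sales").fetchone())  # (3, 17.75)
```

The same three stages appear in every ETL tool; only the scale and tooling change.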

Interview Questions

Here are 25 interview questions that you may encounter in ETL job interviews:

  • What is ETL and why is it important? (basic)
  • Explain the difference between ETL and ELT processes. (medium)
  • How do you handle incremental loads in ETL processes? (medium)
  • What is a surrogate key in the context of ETL? (basic)
  • Can you explain the concept of data profiling in ETL? (medium)
  • How do you handle data quality issues in ETL processes? (medium)
  • What are some common ETL tools you have worked with? (basic)
  • Explain the difference between a full load and an incremental load. (basic)
  • How do you optimize ETL processes for performance? (medium)
  • Can you describe a challenging ETL project you worked on and how you overcame obstacles? (advanced)
  • What is the significance of data cleansing in ETL? (basic)
  • How do you ensure data security and compliance in ETL processes? (medium)
  • Have you worked with real-time data integration in ETL? If so, how did you approach it? (advanced)
  • What are the key components of an ETL architecture? (basic)
  • How do you handle data transformation requirements in ETL processes? (medium)
  • What are some best practices for ETL development? (medium)
  • Can you explain the concept of change data capture in ETL? (medium)
  • How do you troubleshoot ETL job failures? (medium)
  • What role does metadata play in ETL processes? (basic)
  • How do you handle complex transformations in ETL processes? (medium)
  • What is the importance of data lineage in ETL? (basic)
  • Have you worked with parallel processing in ETL? If so, explain your experience. (advanced)
  • How do you ensure data consistency across different ETL jobs? (medium)
  • Can you explain the concept of slowly changing dimensions in ETL? (medium)
  • How do you document ETL processes for knowledge sharing and future reference? (basic)
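To make one of the medium questions concrete, incremental loads are commonly driven by a high-watermark column: each run pulls only source rows newer than the latest row already loaded. A hedged sketch using Python's sqlite3 (the table and column names are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src (id INTEGER PRIMARY KEY, updated_at TEXT, payload TEXT);
CREATE TABLE tgt (id INTEGER PRIMARY KEY, updated_at TEXT, payload TEXT);
INSERT INTO src VALUES
  (1, '2024-01-01', 'a'),
  (2, '2024-01-02', 'b'),
  (3, '2024-01-03', 'c');
""")

def incremental_load(conn):
    # High watermark: the newest timestamp already present in the target.
    (wm,) = conn.execute("SELECT COALESCE(MAX(updated_at), '') FROM tgt").fetchone()
    # Pull only source rows newer than the watermark.
    cur = conn.execute(
        "INSERT OR REPLACE INTO tgt SELECT * FROM src WHERE updated_at > ?", (wm,)
    )
    conn.commit()
    return cur.rowcount

print(incremental_load(conn))  # first run loads all 3 rows -> 3
conn.execute("INSERT INTO src VALUES (4, '2024-01-04', 'd')")
print(incremental_load(conn))  # second run picks up only the new row -> 1
```

The same pattern underlies change data capture: production systems typically use a log-based CDC feed rather than a timestamp scan, but the "load only what changed since last time" idea is identical.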

Closing Remarks

As you explore ETL jobs in India, remember to showcase your skills and expertise confidently during interviews. With the right preparation and a solid understanding of ETL concepts, you can embark on a rewarding career in this dynamic field. Good luck with your job search!
