
3070 Informatica Jobs - Page 10

JobPe aggregates listings for easy access; you apply directly on the original job portal.

4.0 - 5.0 years

6 - 7 Lacs

Bengaluru

Work from Office

Source: Naukri

Manage and maintain OBIEE (Oracle Business Intelligence Enterprise Edition) environments. Oversee installation, configuration, and performance tuning of OBIEE systems to ensure high availability and efficient reporting.

Posted 2 days ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Mumbai

Work from Office

Source: Naukri

Develop ETL solutions using Informatica PowerCenter.

Posted 2 days ago

Apply

5.0 - 6.0 years

7 - 8 Lacs

Pune

Work from Office

Source: Naukri

Develop and manage data solutions using Snowflake, focusing on optimizing data storage, integration, and processing. Ensure data consistency and provide analytical insights through Snowflake’s cloud data platform.

Posted 2 days ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Hyderabad

Work from Office

Source: Naukri

Design and implement data integration and management solutions using Informatica Big Data Management (BDM). Ensure efficient handling of large data sets, optimizing performance and ensuring seamless data flow across systems.

Posted 2 days ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Hyderabad

Work from Office

Source: Naukri

Develop and manage ETL processes using Informatica, ensuring smooth data extraction, transformation, and loading across multiple systems. Optimize data workflows to ensure high-quality data management.

Posted 2 days ago

Apply
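Several of the listings above call for ETL development (extraction, transformation, and loading). As a minimal sketch of that pattern in plain Python (an illustrative toy, not Informatica itself; the record fields and helper names are invented for the example):

```python
# Toy ETL pipeline: extract raw records, transform them, load into a target store.
def extract(source):
    # Extract: read raw records from a source (here, an in-memory list).
    return list(source)

def transform(rows):
    # Transform: normalize names, cast amounts, and drop incomplete records.
    return [
        {"name": r["name"].strip().title(), "amount": float(r["amount"])}
        for r in rows
        if r.get("name") and r.get("amount") is not None
    ]

def load(rows, target):
    # Load: append cleaned records into the target store.
    target.extend(rows)
    return len(rows)

source = [
    {"name": "  alice ", "amount": "10.5"},
    {"name": None, "amount": "3"},  # dropped by transform: missing name
    {"name": "bob", "amount": "2"},
]
warehouse = []
loaded = load(transform(extract(source)), warehouse)
```

In a real ETL tool the same three stages exist, but extraction reads from databases or files, transformations are declared as mappings, and loading writes to a warehouse table.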

4.0 - 8.0 years

6 - 10 Lacs

Mumbai

Work from Office

Source: Naukri

Design and optimize ETL workflows using Talend. Ensure data integrity and process automation.

Posted 2 days ago

Apply

7.0 years

Salary not disclosed

Pune, Maharashtra, India

On-site

Source: LinkedIn

About ProcDNA
ProcDNA is a global consulting firm. We fuse design thinking with cutting-edge technology to create game-changing Commercial Analytics and Technology solutions for our clients. We're a passionate team of 275+ across 6 offices, all growing and learning together since our launch during the pandemic. Here, you won't be stuck in a cubicle - you'll be out in the open water, shaping the future with brilliant minds. At ProcDNA, innovation isn't just encouraged; it's ingrained in our DNA. Ready to join our epic growth journey?

What We Are Looking For
You'll drive the adoption of the latest technologies in our solutions, bring thought leadership to guide clients on complex data management problems, and drive business performance. You will work with the leadership team to bring subject-matter expertise in areas such as Big Data, ETL, Reporting, CRM, Data Warehousing, MDM, DevOps, and Software Development. We are seeking an individual who not only possesses the requisite expertise but also thrives in the dynamic landscape of a fast-paced global firm.

What You'll Do
Lead end-to-end data management solution projects for multiple clients across data engineering and BI technologies. Create the project management plan and ensure adherence to project timelines. Integrate multiple data sources into one visualization to tell a story. Interact with customers to understand their business problems and provide best-in-class analytics solutions. Interact with data platform leaders and understand the data flows that feed Tableau/analytics. Understand data governance, quality, and security, and integrate analytics with these enterprise platforms. Interact with global UX/UI functions to design best-in-class visualizations for customers, harnessing all product capabilities.

Must Have
7-10 years of experience in data warehousing and data engineering. Experience interacting directly with Life Science clients, discussing requirements, and managing stakeholders. Experience in requirement gathering and designing enterprise warehouse solutions from scratch. Hands-on experience with ETL tools such as ADF, Databricks, and Informatica; with data pipeline and workflow management tools such as Azkaban, Luigi, and Airflow; and with data warehouses: SQL/NoSQL, Amazon Redshift, Snowflake, Apache Hive, HDFS, etc. BI tools knowledge and experience leading dashboard implementations. Deep understanding of data governance and data quality management frameworks. Strong communication and presentation skills with a strong problem-solving attitude. Excellent analytical, problem-solving, and debugging skills, with the ability to quickly learn and comprehend business processes and problems and develop effective technical solutions for them.

Skills: MDM, SQL, HDFS, data warehousing, big data, DevOps, cloud, Amazon Redshift, Snowflake, pharmaceutical consulting, data management, Apache Hive, Azure, reporting, problem-solving, Luigi, Informatica, analytical skills, presentation skills, data governance, ADF, data engineering, CRM, Databricks, BI technologies, Airflow, team management, business technology, AWS, Azkaban, software development, ETL, client management, data quality management, life science

Posted 2 days ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Mumbai

Work from Office

Source: Naukri

Lead data governance initiatives using Collibra. Manage data policies, procedures, and standards across the organization. Ensure data quality, compliance, and accessibility. Collaborate with data stewards and business units to define data ownership and accountability. Provide training on data governance tools and principles, and support the implementation of Collibra across business units.

Posted 2 days ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Mumbai

Work from Office

Source: Naukri

Design and implement data integration solutions using IBM Cognos. Focus on extracting, transforming, and loading data between various systems to provide accurate and actionable business insights.

Posted 2 days ago

Apply

4.0 - 5.0 years

6 - 7 Lacs

Hyderabad

Work from Office

Source: Naukri

Lead the development, implementation, and management of Power BI reporting solutions, focusing on data modeling, dashboards, and business intelligence.

Posted 2 days ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Source: Naukri

Design and optimize ETL processes using Informatica PowerCenter, PL/SQL, and Oracle 10g. Improve data integration performance.

Posted 2 days ago

Apply

4.0 - 5.0 years

6 - 7 Lacs

Bengaluru

Work from Office

Source: Naukri

Implement and manage Collibra’s data governance platform, ensuring the proper classification, access, and compliance of data across the organization. Provide support and training to users.

Posted 2 days ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Hyderabad

Work from Office

Source: Naukri

Design and implement data architectures and models, focusing on data warehouses and Snowflake-based environments. Ensure that data is structured for efficient querying and analysis, aligning with business goals and performance requirements.

Posted 2 days ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Bengaluru

Work from Office

Source: Naukri

Develop and optimize Business Intelligence solutions using QlikView. Perform data visualization, reporting, and analytics to help businesses make data-driven decisions. Ensure data integration and database administration are handled effectively.

Posted 2 days ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Bengaluru

Work from Office

Source: Naukri

Implement and configure Informatica PowerCenter for data integration, transformation, and ETL processes. Leverage Informatica CDI for cloud data integration and ensure data quality and consistency.

Posted 2 days ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Kochi

Work from Office

Source: Naukri

Implement and manage SAP BusinessObjects Data Services (BODS) solutions for data integration, transformation, and cleansing. You will optimize data workflows and ensure high-quality data. Expertise in SAP BODS is required.

Posted 2 days ago

Apply

4.0 - 5.0 years

6 - 7 Lacs

Bengaluru

Work from Office

Source: Naukri

Design, implement, and manage data integration solutions using Oracle Data Integrator (ODI). Focus on automating data flows, transforming data between systems, and ensuring data quality across various platforms.

Posted 2 days ago

Apply

2.0 years

Salary not disclosed

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Job title: Analyst - Data & Process Management
Location: Hyderabad
% of travel expected: Travel required as per business need, if any
Job type: Permanent and full time

About The Job
Our Team: Sanofi Business Operations (SBO) is an internal Sanofi resource organization based in India, set up to centralize processes and activities that support the Specialty Care, Vaccines, General Medicines, CHC, CMO, and R&D, Data & Digital functions. SBO strives to be a strategic and functional partner for tactical deliveries to Medical, HEVA, and Commercial organizations across Sanofi globally.

Main Responsibilities
At Sanofi we are leveraging analytics and technology on behalf of patients around the world. We are seeking those who have a passion for using data, analytics, and insights to drive decision making that will allow us to tackle some of the world's greatest health threats. Within our Commercial Insights, Analytics, and Data organization we are transforming to better power decision-making across our end-to-end commercialization process, from business development to late lifecycle management. Deliverables support planning and decision making across multiple commercial areas such as analytics, campaign operations, and market mix. In addition to ensuring high-quality deliverables, our team drives synergies across the franchise, fosters innovation and best practices, and creates solutions to bring speed, scale, and shareability to our planning processes. We are seeking a dynamic talent for the role of "Analyst - Data & Process Management" to support our US-based data management team. Robust data management is a priority for our businesses, as product potential has major implications for a wide range of disciplines. It is essential to have someone who understands and aspires to implement innovative techniques to drive our data management strategies across franchises.

The analyst will ensure on-time and accurate delivery of data requirements by collaborating with relevant stakeholders; ensure data availability, quality, and completeness are maintained as per requirements and delivered in a timely manner; ensure data consistency across sources and downstream teams; proactively identify data requirements and gaps in the system; develop SOPs for data processes and lead process enhancements; provide training to end users on usage of data across the sources; and build advanced tools to automate or improve processes for analytical and other needs.

People: Maintain effective relationships with the end stakeholders within the allocated GBU and tasks, with the objective of developing education and communication content as required. Actively lead and develop SBO operations associates and ensure new technologies are leveraged. Initiate the contracting process and related documents within defined timelines. Collaborate with global stakeholders on project planning, setting timelines, and maintaining budget.

Performance: Ensure data supplied by CDM is used effectively by stakeholders in commercial operations processes (forecasting, targeting, call planning, alignments, field reporting, incentive compensation). Administer CDM activities related to the sales operations quarterly cycle. Monitor data quality reports and investigate problems. Maintain requirements documents, business rules, and metadata. Provide first-level support for sales data inquiries and ad-hoc support to US CDM colleagues. Work to develop deal-tracking analytics and reporting capabilities. Collaborate with Digital to enhance data access across various sources and develop tools, technology, and processes to constantly improve quality and productivity.

Process: Contribute to overall quality enhancement by ensuring high standards for the output produced by the digital and data management team, and secure adherence to compliance procedures and internal/operational risk controls in accordance with all applicable standards. Refresh reports on a set cycle (weekly/monthly/quarterly/annually), with QC checks for each refresh. Manage opt-out compliance for the universe of healthcare professionals.

Stakeholder: Work closely with global teams and/or external vendors to ensure effective end-to-end project delivery with complete data availability.

About You
Experience: 2+ years of experience with pharmaceutical commercial omnichannel datasets, data governance, and data stewardship. In-depth knowledge of common databases such as IQVIA, APLD, SFMC, Google Analytics, HCP engagement data, and execution data.

Soft skills: Strong learning agility; ability to manage ambiguous environments and adapt to the changing needs of the business; good interpersonal and communication skills; strong presentation skills; a curious, dynamic, result-oriented team player who can work collaboratively; ability to think strategically in an ambiguous environment; ability to operate effectively in an international matrix environment and across time zones; demonstrated leadership in driving innovation and automation leveraging advanced statistical and analytical techniques.

Technical skills: At least 2 years of direct experience with pharmaceutical sales data and data management, with emphasis on syndicated data, specialty pharmacy, and digital/omnichannel data. Strong technical background in AWS, Snowflake, Databricks, SQL, Python, Informatica, Dataiku, etc. Strong knowledge of pharmaceutical sales and marketing data sources (IQVIA, Veeva, etc.). Knowledge of and/or experience in pharmaceutical sales operations; understands how data is applied in a pharmaceutical commercial operations context. Ability to translate business needs into data requirements. Understands the basic principles of data management and data processing, and of data governance and data stewardship (data quality). Strong communication skills, including the ability to communicate data management subject matter to a non-technical internal customer. Experience with analytical tools like Power BI, VBA, and Alteryx is a plus. Proficient in Excel/Word/PowerPoint. An aptitude for problem solving and strategic thinking, ensuring high-quality data output with strong quality assurance. Ability to synthesize complex information into clear and actionable insights. Proven ability to work effectively across all levels of stakeholders and diverse functions. Solid understanding of pharmaceutical development, manufacturing, supply chain, and marketing functions. Effective collaboration across differing levels of management, functions, and roles. Strong decision-making skills: identifying key issues, developing solutions, and gaining commitment.

Education: Advanced degree in areas such as Management, Statistics, Decision Sciences, Engineering, Life Sciences, or Business Analytics (e.g., PhD/MBA/Masters).

Languages: Excellent knowledge of English and strong communication skills, written and spoken.

Pursue progress, discover extraordinary. Better is out there. Better medications, better outcomes, better science. But progress doesn't happen without people - people from different backgrounds, in different locations, doing different roles, all united by one thing: a desire to make miracles happen. So, let's be those people. At Sanofi, we provide equal opportunities to all regardless of race, colour, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, or gender identity. Watch our ALL IN video and check out our Diversity, Equity and Inclusion actions at sanofi.com!

Posted 2 days ago

Apply

5.0 years

Salary not disclosed

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Job title: Senior Analyst - Data & Process Management
Location: Hyderabad
% of travel expected: Travel required as per business need, if any
Job type: Permanent and full time

Our Team
Sanofi Business Operations (SBO) is an internal Sanofi resource organization based in India, set up to centralize processes and activities that support the Specialty Care, Vaccines, General Medicines, CHC, CMO, and R&D, Data & Digital functions. SBO strives to be a strategic and functional partner for tactical deliveries to Medical, HEVA, and Commercial organizations across Sanofi globally.

Main Responsibilities
At Sanofi we are leveraging analytics and technology on behalf of patients around the world. We are seeking those who have a passion for using data, analytics, and insights to drive decision making that will allow us to tackle some of the world's greatest health threats. Within our Commercial Insights, Analytics, and Data organization we are transforming to better power decision-making across our end-to-end commercialization process, from business development to late lifecycle management. Deliverables support planning and decision making across multiple functional areas such as finance, manufacturing, product development, and commercial. In addition to ensuring high-quality deliverables, our team drives synergies across the franchise, fosters innovation and best practices, and creates solutions to bring speed, scale, and shareability to our planning processes. We are seeking a dynamic talent for the role of "Senior Analyst - Data & Process Management" to support our US-based data management team. Robust data management is a priority for our businesses, as product potential has major implications for a wide range of disciplines. It is essential to have someone who understands and aspires to implement innovative techniques to drive our data management strategies across franchises.

The senior analyst will ensure on-time and accurate delivery of data requirements by collaborating with relevant stakeholders; ensure data availability, quality, and completeness are maintained as per requirements and delivered in a timely manner; ensure data consistency across sources and downstream teams; proactively identify data requirements and gaps in the system; develop SOPs for data processes and lead process enhancements; provide training to end users on usage of data across the sources; and build advanced tools to automate or improve processes for analytical and other needs.

People: Maintain effective relationships with the end stakeholders within the allocated GBU and tasks, with the objective of developing education and communication content as required. Actively lead and develop SBO operations associates and ensure new technologies are leveraged. Initiate the contracting process and related documents within defined timelines. Collaborate with global stakeholders on project planning, setting timelines, and maintaining budget.

Performance: Ensure data supplied by MDM is used effectively by stakeholders in commercial operations processes (forecasting, targeting, call planning, alignments, field reporting, incentive compensation). Administer MDM activities related to the sales operations quarterly cycle. Monitor data quality reports and investigate problems. Maintain requirements documents, business rules, and metadata. Provide first-level support for sales data inquiries and ad-hoc support to US MDM colleagues. Work to develop deal-tracking analytics and reporting capabilities. Collaborate with Digital to enhance data access across various sources and develop tools, technology, and processes to constantly improve quality and productivity.

Process: Contribute to overall quality enhancement by ensuring high standards for the output produced by the digital and data management team. Secure adherence to compliance procedures and internal/operational risk controls in accordance with all applicable standards. Manage quarterly operations for customer data on a set frequency (weekly/monthly/quarterly/annually), with QC checks for each refresh. Manage opt-out compliance for the universe of healthcare professionals.

Stakeholder: Work closely with global teams and/or external vendors to ensure effective end-to-end project delivery with complete data availability.

About You
Experience: 5+ years of experience with pharmaceutical commercial sales and customer datasets, data governance, and data stewardship. In-depth knowledge of common databases such as Onekey, DHC, Medpro, DDD, IQVIA, XPO, LAAD, CRM, and APLD.

Soft skills: Strong learning agility; ability to manage ambiguous environments and adapt to the changing needs of the business; good interpersonal and communication skills; strong presentation skills; a curious, dynamic, result-oriented team player who can work collaboratively; ability to think strategically in an ambiguous environment; ability to operate effectively in an international matrix environment and across time zones; demonstrated leadership in driving innovation and automation leveraging advanced statistical and analytical techniques.

Technical skills: At least 5 years of direct experience with pharmaceutical sales data and data management, with emphasis on syndicated data, specialty pharmacy, customer data, patient data, sales data, and vaccines. Strong technical background in Reltio, SQL, Python, AWS, Snowflake, Informatica, Dataiku, etc. Strong knowledge of pharmaceutical sales and marketing data sources (IQVIA, Veeva, etc.). Knowledge of and/or experience in pharmaceutical sales operations; understands how data is applied in a pharmaceutical commercial operations context. Ability to translate business needs into data requirements. Understands the basic principles of data management and data processing, and of data governance and data stewardship (data quality). Strong communication skills, including the ability to communicate data management subject matter to a non-technical internal customer. Experience with analytical tools like Power BI, VBA, and Alteryx is a plus. Proficient in Excel/Word/PowerPoint. An aptitude for problem solving and strategic thinking, ensuring high-quality data output with strong quality assurance. Ability to synthesize complex information into clear and actionable insights. Proven ability to work effectively across all levels of stakeholders and diverse functions. Solid understanding of pharmaceutical development, manufacturing, supply chain, and marketing functions. Effective collaboration across differing levels of management, functions, and roles. Strong decision-making skills: identifying key issues, developing solutions, and gaining commitment.

Education: Bachelor's degree from an accredited four-year college or university; advanced degree in areas such as Management, Statistics, Decision Sciences, Engineering, Life Sciences, or Business Analytics (e.g., PhD/MBA/Masters).

Languages: Excellent knowledge of English and strong communication skills, written and spoken.

Pursue progress, discover extraordinary. Better is out there. Better medications, better outcomes, better science. But progress doesn't happen without people - people from different backgrounds, in different locations, doing different roles, all united by one thing: a desire to make miracles happen. So, let's be those people. At Sanofi, we provide equal opportunities to all regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, or gender identity. Watch our ALL IN video and check out our Diversity, Equity and Inclusion actions at sanofi.com!

Posted 2 days ago

Apply

15.0 years

45 - 55 Lacs

Greater Kolkata Area

On-site

Source: LinkedIn

About The Company
e.l.f. Beauty, Inc. stands with every eye, lip, face and paw. Our deep commitment to clean, cruelty-free beauty at an incredible value has fueled the success of our flagship brand e.l.f. Cosmetics since 2004 and driven our portfolio expansion. Today, our multi-brand portfolio includes e.l.f. Cosmetics, e.l.f. SKIN, pioneering clean beauty brand Well People, Keys Soulcare, a groundbreaking lifestyle beauty brand created with Alicia Keys, and Naturium, high-performance, biocompatible, clinically effective and accessible skincare. In our fiscal year 2024, we had net sales of $1 billion, and our business performance has been nothing short of extraordinary, with 24 consecutive quarters of net sales growth. We are the #2 mass cosmetics brand in the US and the fastest-growing mass cosmetics brand among the top 5. Our total compensation philosophy offers every full-time new hire competitive pay and benefits, bonus eligibility (200% of target over the last four fiscal years), equity, flexible time off, year-round half-day Fridays, and a hybrid work environment (3 days in office, 2 days at home). We believe the combination of our unique culture, total compensation, workplace flexibility, and care for the team is unmatched, not just in beauty but in any industry. Visit our Career Page to learn more about our team: https://www.elfbeauty.com/work-with-us

Position Summary
We are seeking a skilled Sr. Data Engineer to join our dynamic team. The Sr. Data Engineer will be responsible for designing, developing, and maintaining our data pipelines, integrations, and data warehouse infrastructure. The successful candidate will work closely with data scientists, analysts, and business stakeholders to ensure that our data is accurate, secure, and accessible for all users.

Responsibilities
Design and build scalable data pipeline architecture that can handle large volumes of data. Develop ELT/ETL pipelines to extract, load, and transform data from various sources into our data warehouse. Optimize and maintain the data infrastructure to ensure high availability and performance. Collaborate with data scientists and analysts to identify and implement improvements to our data pipelines and models. Develop and maintain data models to support business needs. Ensure data security and compliance with data governance policies. Identify and troubleshoot data quality issues. Automate and streamline processes related to data management. Stay up to date with emerging data technologies and trends to ensure the continuous improvement of our data infrastructure and architecture. Analyze data products and requirements to align with the data strategy. Assist in extracting or researching data for cross-functional business partners in consumer insights, supply chain, and finance. Enhance the efficiency, automation, and accuracy of existing reports. Follow best practices in data querying and manipulation to ensure data integrity.

Requirements
Bachelor's or Master's degree in Computer Science, Data Science, or a related field. 15+ years of experience as a Data Engineer or in a related role. Must have experience with Snowflake, including building, maintaining, and documenting data pipelines. Expertise in Snowflake concepts such as RBAC management, virtual warehouses, file formats, streams, zero-copy clone, and time travel, and understanding of how to use these features. Strong SQL development experience, including SQL queries and stored procedures. Strong knowledge of ELT/ETL no-code/low-code tools like Informatica or SnapLogic. Well versed in data standardization, cleansing, enrichment, and modeling. Proficiency in one or more programming languages such as Python, Java, or C#. Experience with cloud computing platforms such as AWS, Azure, or GCP. Knowledge of ELT/ETL processes, data warehousing, and data modeling. Familiarity with data security and governance best practices. Excellent problem-solving and analytical skills, with hands-on experience improving the performance of processes. Strong communication and collaboration skills.

Minimum work experience: 15 years. Maximum work experience: 20 years.

This job description is intended to describe the general nature and level of work being performed in this position. It reflects the general details considered necessary to describe the principal functions of the job identified and shall not be considered a detailed description of all the work inherent in the job. It is not an exhaustive list of responsibilities, and it is subject to changes and exceptions at the supervisor's discretion. e.l.f. Beauty respects your privacy. Please see our Job Applicant Privacy Notice (www.elfbeauty.com/us-job-applicant-privacy-notice) for how your personal information is used and shared.

Posted 2 days ago

Apply
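The Snowflake features named in the requirements above (zero-copy clone, time travel) are exercised through ordinary SQL. A small Python sketch of the statements involved (the helper names are invented for illustration; the SQL syntax is Snowflake's):

```python
# Sketch of the SQL behind two Snowflake features: zero-copy clone and time travel.
# Helper functions only build the statement text; running them requires a
# Snowflake connection, which is out of scope here.

def zero_copy_clone(src_table: str, dst_table: str) -> str:
    # Zero-copy clone: the new table shares the source's underlying
    # micro-partitions, so no data is physically copied at clone time.
    return f"CREATE TABLE {dst_table} CLONE {src_table}"

def time_travel_query(table: str, seconds_ago: int) -> str:
    # Time travel: query the table as it existed N seconds ago,
    # within the configured data-retention window.
    return f"SELECT * FROM {table} AT(OFFSET => -{seconds_ago})"

print(zero_copy_clone("sales", "sales_backup"))
print(time_travel_query("sales", 3600))
```

Clones are commonly combined with time travel, e.g. cloning a table as of a past point in time to recover from a bad load.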

8.0 years

35 - 45 Lacs

Greater Kolkata Area

On-site

Source: LinkedIn

About The Company e.l.f. Beauty, Inc. stands with every eye, lip, face and paw. Our deep commitment to clean, cruelty free beauty at an incredible value has fueled the success of our flagship brand e.l.f. Cosmetics since 2004 and driven our portfolio expansion. Today, our multi-brand portfolio includes e.l.f. Cosmetics, e.l.f. SKIN, pioneering clean beauty brand Well People, Keys Soulcare, a groundbreaking lifestyle beauty brand created with Alicia Keys and Naturium, high-performance, biocompatible, clinically-effective and accessible skincare. In our Fiscal year 24, we had net sales of $1 Billion and our business performance has been nothing short of extraordinary with 24 consecutive quarters of net sales growth. We are the #2 mass cosmetics brand in the US and are the fastest growing mass cosmetics brand among the top 5. Our total compensation philosophy offers every full-time new hire competitive pay and benefits, bonus eligibility (200% of target over the last four fiscal years), equity, flexible time off, year-round half-day Fridays, and a hybrid 3 day in office, 2 day at home work environment. We believe the combination of our unique culture, total compensation, workplace flexibility and care for the team is unmatched across not just beauty but any industry. Visit our Career Page to learn more about our team: https://www.elfbeauty.com/work-with-us Position Summary We are seeking a skilled Data Engineer to join our dynamic team. The Data Engineer will be responsible for designing, developing, and maintaining our data pipelines, integrations, and data warehouse infrastructure. The successful candidate will work closely with data scientists, analysts, and business stakeholders to ensure that our data is accurate, secure, and accessible for all users. 
Responsibilities
- Design and build scalable data pipeline architecture that can handle large volumes of data
- Develop ELT/ETL pipelines to extract, load, and transform data from various sources into our data warehouse
- Optimize and maintain the data infrastructure to ensure high availability and performance
- Collaborate with data scientists and analysts to identify and implement improvements to our data pipelines and models
- Develop and maintain data models to support business needs
- Ensure data security and compliance with data governance policies
- Identify and troubleshoot data quality issues
- Automate and streamline data management processes
- Stay up to date with emerging data technologies and trends to continuously improve our data infrastructure and architecture
- Analyze data products and requirements to align with the data strategy
- Assist cross-functional business partners (consumer insights, supply chain, and finance teams) with data extraction and research
- Enhance the efficiency, automation, and accuracy of existing reports
- Follow best practices in data querying and manipulation to ensure data integrity

Requirements
- Bachelor's or master's degree in Computer Science, Data Science, or a related field
- Must have 8+ years of experience as a Snowflake Data Engineer or in a related role
- Strong Snowflake experience building, maintaining, and documenting data pipelines
- Expertise in Snowflake concepts such as RBAC management, virtual warehouses, file formats, streams, zero-copy cloning, and time travel, with a working understanding of when to use each
- Strong SQL development experience, including complex queries and stored procedures
- Strong knowledge of no-code/low-code ELT/ETL tools such as Informatica or SnapLogic
- Well versed in data standardization, cleansing, enrichment, and modeling
- Proficiency in one or more programming languages such as Python, Java, or C#
- Experience with cloud computing platforms such as AWS, Azure, or GCP
- Knowledge of ELT/ETL processes, data warehousing, and data modeling
- Familiarity with data security and governance best practices
- Excellent problem-solving and analytical skills, with hands-on experience improving process performance
- Strong communication and collaboration skills

Minimum Work Experience: 8
Maximum Work Experience: 13

This job description is intended to describe the general nature and level of work being performed in this position. It reflects the general details considered necessary to describe the principal functions of the job identified and shall not be considered a detailed description of all the work inherent in the job. It is not an exhaustive list of responsibilities, and it is subject to changes and exceptions at the supervisor's discretion.

e.l.f. Beauty respects your privacy. Please see our Job Applicant Privacy Notice (www.elfbeauty.com/us-job-applicant-privacy-notice) for how your personal information is used and shared.
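The Snowflake features this posting names (zero-copy cloning, time travel, streams) are all driven through plain SQL. As a hedged illustration only, the sketch below assembles the kinds of statements a pipeline might issue; every object name (ANALYTICS_DB, ORDERS, and so on) is hypothetical, and real use requires a live Snowflake connection rather than printed strings:

```python
# Sketch of the Snowflake SQL a data engineer might issue for the features
# listed above. Object names are hypothetical; statements are built as
# strings so their shape can be shown without a Snowflake connection.

def clone_sql(source: str, target: str) -> str:
    # Zero-copy clone: instant, storage-free copy of a table for dev/test.
    return f"CREATE TABLE {target} CLONE {source};"

def time_travel_sql(table: str, minutes_back: int) -> str:
    # Time travel: query the table as it existed N minutes ago
    # (OFFSET is expressed in seconds, counted backwards).
    return f"SELECT * FROM {table} AT(OFFSET => -60 * {minutes_back});"

def stream_sql(stream: str, table: str) -> str:
    # Stream: change-tracking object used to process only new/changed rows.
    return f"CREATE STREAM {stream} ON TABLE {table};"

print(clone_sql("ANALYTICS_DB.PUBLIC.ORDERS", "ANALYTICS_DB.DEV.ORDERS_CLONE"))
print(time_travel_sql("ANALYTICS_DB.PUBLIC.ORDERS", 30))
print(stream_sql("ORDERS_STREAM", "ANALYTICS_DB.PUBLIC.ORDERS"))
```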


Posted 2 days ago

Apply

8.0 years

35 - 45 Lacs

Ahmedabad, Gujarat, India

On-site


About The Company
e.l.f. Beauty, Inc. stands with every eye, lip, face and paw. Our deep commitment to clean, cruelty-free beauty at an incredible value has fueled the success of our flagship brand e.l.f. Cosmetics since 2004 and driven our portfolio expansion. Today, our multi-brand portfolio includes e.l.f. Cosmetics, e.l.f. SKIN, pioneering clean beauty brand Well People, Keys Soulcare, a groundbreaking lifestyle beauty brand created with Alicia Keys, and Naturium, high-performance, biocompatible, clinically effective, and accessible skincare. In fiscal year 2024, we had net sales of $1 billion, and our business performance has been nothing short of extraordinary, with 24 consecutive quarters of net sales growth. We are the #2 mass cosmetics brand in the US and the fastest-growing mass cosmetics brand among the top 5.

Our total compensation philosophy offers every full-time new hire competitive pay and benefits, bonus eligibility (200% of target over the last four fiscal years), equity, flexible time off, year-round half-day Fridays, and a hybrid work environment (3 days in office, 2 days at home). We believe the combination of our unique culture, total compensation, workplace flexibility, and care for the team is unmatched across not just beauty but any industry. Visit our Career Page to learn more about our team: https://www.elfbeauty.com/work-with-us

Position Summary
We are seeking a skilled Data Engineer to join our dynamic team. The Data Engineer will be responsible for designing, developing, and maintaining our data pipelines, integrations, and data warehouse infrastructure. The successful candidate will work closely with data scientists, analysts, and business stakeholders to ensure that our data is accurate, secure, and accessible to all users.
Responsibilities
- Design and build scalable data pipeline architecture that can handle large volumes of data
- Develop ELT/ETL pipelines to extract, load, and transform data from various sources into our data warehouse
- Optimize and maintain the data infrastructure to ensure high availability and performance
- Collaborate with data scientists and analysts to identify and implement improvements to our data pipelines and models
- Develop and maintain data models to support business needs
- Ensure data security and compliance with data governance policies
- Identify and troubleshoot data quality issues
- Automate and streamline data management processes
- Stay up to date with emerging data technologies and trends to continuously improve our data infrastructure and architecture
- Analyze data products and requirements to align with the data strategy
- Assist cross-functional business partners (consumer insights, supply chain, and finance teams) with data extraction and research
- Enhance the efficiency, automation, and accuracy of existing reports
- Follow best practices in data querying and manipulation to ensure data integrity

Requirements
- Bachelor's or master's degree in Computer Science, Data Science, or a related field
- Must have 8+ years of experience as a Snowflake Data Engineer or in a related role
- Strong Snowflake experience building, maintaining, and documenting data pipelines
- Expertise in Snowflake concepts such as RBAC management, virtual warehouses, file formats, streams, zero-copy cloning, and time travel, with a working understanding of when to use each
- Strong SQL development experience, including complex queries and stored procedures
- Strong knowledge of no-code/low-code ELT/ETL tools such as Informatica or SnapLogic
- Well versed in data standardization, cleansing, enrichment, and modeling
- Proficiency in one or more programming languages such as Python, Java, or C#
- Experience with cloud computing platforms such as AWS, Azure, or GCP
- Knowledge of ELT/ETL processes, data warehousing, and data modeling
- Familiarity with data security and governance best practices
- Excellent problem-solving and analytical skills, with hands-on experience improving process performance
- Strong communication and collaboration skills

Minimum Work Experience: 8
Maximum Work Experience: 13

This job description is intended to describe the general nature and level of work being performed in this position. It reflects the general details considered necessary to describe the principal functions of the job identified and shall not be considered a detailed description of all the work inherent in the job. It is not an exhaustive list of responsibilities, and it is subject to changes and exceptions at the supervisor's discretion.

e.l.f. Beauty respects your privacy. Please see our Job Applicant Privacy Notice (www.elfbeauty.com/us-job-applicant-privacy-notice) for how your personal information is used and shared.
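The "data standardization, cleansing, enrichment" requirement above is concrete enough to sketch. The snippet below is a minimal, self-contained illustration; the field names and rules are hypothetical, not taken from the posting:

```python
# Minimal cleansing sketch: standardize, cleanse, and enrich one raw record
# before it is loaded into a warehouse. Field names and rules are illustrative.

def cleanse(record: dict) -> dict:
    cleaned = dict(record)
    # Standardize: trim whitespace and normalize casing on text fields.
    cleaned["email"] = cleaned.get("email", "").strip().lower()
    cleaned["country"] = cleaned.get("country", "").strip().upper()
    # Cleanse: coerce the amount to a float, defaulting invalid values to 0.0.
    try:
        cleaned["amount"] = float(cleaned.get("amount", 0))
    except (TypeError, ValueError):
        cleaned["amount"] = 0.0
    # Enrich: derive a simple flag from the cleaned value.
    cleaned["is_high_value"] = cleaned["amount"] >= 1000.0
    return cleaned

row = cleanse({"email": "  Ana@Example.COM ", "country": "in", "amount": "1250.50"})
print(row["email"], row["country"], row["is_high_value"])  # ana@example.com IN True
```

In a real pipeline, the same per-record function would typically be applied inside the ELT tool or a warehouse transformation step rather than in standalone Python.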

Posted 2 days ago

Apply

1.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Description
Amazon strives to be the world's most customer-centric company, where customers can research and purchase anything they might want online. We set big goals and are looking for people who can help us reach and exceed them.

The CPT Data Engineering & Analytics (DEA) team builds and maintains critical data infrastructure that enhances the seller experience and protects the privacy of Amazon business partners throughout their lifecycle. We are looking for a strong Data Engineer to join our team. The Data Engineer I will work with well-defined requirements to develop and maintain data pipelines that help internal teams gather the insights they need for business decisions in a timely and accurate manner. You will collaborate with a team of Data Scientists, Business Analysts, and other Engineers to build solutions that reduce investigation defects and assess the health of our Operations business while ensuring data quality and regulatory compliance.

The ideal candidate is passionate about building reliable data infrastructure, detail-oriented, and driven to help protect Amazon's customers and business partners. They will be an individual contributor who works effectively with guidance from senior team members to successfully implement data solutions. The candidate must be proficient in SQL and at least one scripting language (e.g., Python, Perl, Scala), with a strong understanding of data management fundamentals and distributed systems concepts.

Key job responsibilities
- Build and optimize physical data models and data pipelines for simple datasets
- Write secure, stable, testable, maintainable code with minimal defects
- Troubleshoot existing datasets and maintain data quality
- Participate in team design, scoping, and prioritization discussions
- Document solutions to ensure ease of use and maintainability
- Handle data in accordance with Amazon policies and security requirements

Basic Qualifications
- 1+ years of data engineering experience
- Experience with data modeling, warehousing, and building ETL pipelines
- Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala)
- Experience with one or more scripting languages (e.g., Python, KornShell)

Preferred Qualifications
- Experience with big data technologies such as Hadoop, Hive, Spark, and EMR
- Experience with an ETL tool such as Informatica, ODI, SSIS, BODI, or DataStage

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Company: ADCI - BLR 14 SEZ
Job ID: A3018752
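The "building ETL pipelines" qualification above follows a fixed three-stage shape. The toy sketch below shows that shape over in-memory data; the source rows, column names, and list-based "warehouse" are all illustrative stand-ins for real stores:

```python
# Toy extract-transform-load pipeline over in-memory data, illustrating the
# ETL shape the role calls for. A real pipeline would read from and write to
# actual data stores instead of Python lists.

def extract() -> list[dict]:
    # Extract: pull raw rows from a (here, hard-coded) source system.
    return [
        {"order_id": 1, "qty": "2", "unit_price": "9.99"},
        {"order_id": 2, "qty": "1", "unit_price": "24.50"},
    ]

def transform(rows: list[dict]) -> list[dict]:
    # Transform: cast string fields to numbers and derive a total per order.
    return [
        {"order_id": r["order_id"],
         "total": int(r["qty"]) * float(r["unit_price"])}
        for r in rows
    ]

def load(rows: list[dict], warehouse: list[dict]) -> None:
    # Load: append the transformed rows to the target table.
    warehouse.extend(rows)

warehouse: list[dict] = []
load(transform(extract()), warehouse)
print(warehouse)
```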

Posted 2 days ago

Apply

15.0 years

45 - 55 Lacs

Ahmedabad, Gujarat, India

On-site


About The Company
e.l.f. Beauty, Inc. stands with every eye, lip, face and paw. Our deep commitment to clean, cruelty-free beauty at an incredible value has fueled the success of our flagship brand e.l.f. Cosmetics since 2004 and driven our portfolio expansion. Today, our multi-brand portfolio includes e.l.f. Cosmetics, e.l.f. SKIN, pioneering clean beauty brand Well People, Keys Soulcare, a groundbreaking lifestyle beauty brand created with Alicia Keys, and Naturium, high-performance, biocompatible, clinically effective, and accessible skincare. In fiscal year 2024, we had net sales of $1 billion, and our business performance has been nothing short of extraordinary, with 24 consecutive quarters of net sales growth. We are the #2 mass cosmetics brand in the US and the fastest-growing mass cosmetics brand among the top 5.

Our total compensation philosophy offers every full-time new hire competitive pay and benefits, bonus eligibility (200% of target over the last four fiscal years), equity, flexible time off, year-round half-day Fridays, and a hybrid work environment (3 days in office, 2 days at home). We believe the combination of our unique culture, total compensation, workplace flexibility, and care for the team is unmatched across not just beauty but any industry. Visit our Career Page to learn more about our team: https://www.elfbeauty.com/work-with-us

Position Summary
We are seeking a skilled Sr. Data Engineer to join our dynamic team. The Sr. Data Engineer will be responsible for designing, developing, and maintaining our data pipelines, integrations, and data warehouse infrastructure. The successful candidate will work closely with data scientists, analysts, and business stakeholders to ensure that our data is accurate, secure, and accessible to all users.
Responsibilities
- Design and build scalable data pipeline architecture that can handle large volumes of data
- Develop ELT/ETL pipelines to extract, load, and transform data from various sources into our data warehouse
- Optimize and maintain the data infrastructure to ensure high availability and performance
- Collaborate with data scientists and analysts to identify and implement improvements to our data pipelines and models
- Develop and maintain data models to support business needs
- Ensure data security and compliance with data governance policies
- Identify and troubleshoot data quality issues
- Automate and streamline data management processes
- Stay up to date with emerging data technologies and trends to continuously improve our data infrastructure and architecture
- Analyze data products and requirements to align with the data strategy
- Assist cross-functional business partners (consumer insights, supply chain, and finance teams) with data extraction and research
- Enhance the efficiency, automation, and accuracy of existing reports
- Follow best practices in data querying and manipulation to ensure data integrity

Requirements
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field
- Must have 15+ years of experience as a Data Engineer or in a related role
- Strong Snowflake experience building, maintaining, and documenting data pipelines
- Expertise in Snowflake concepts such as RBAC management, virtual warehouses, file formats, streams, zero-copy cloning, and time travel, with a working understanding of when to use each
- Strong SQL development experience, including complex queries and stored procedures
- Strong knowledge of no-code/low-code ELT/ETL tools such as Informatica or SnapLogic
- Well versed in data standardization, cleansing, enrichment, and modeling
- Proficiency in one or more programming languages such as Python, Java, or C#
- Experience with cloud computing platforms such as AWS, Azure, or GCP
- Knowledge of ELT/ETL processes, data warehousing, and data modeling
- Familiarity with data security and governance best practices
- Excellent problem-solving and analytical skills, with hands-on experience improving process performance
- Strong communication and collaboration skills

Minimum Work Experience: 15
Maximum Work Experience: 20

This job description is intended to describe the general nature and level of work being performed in this position. It reflects the general details considered necessary to describe the principal functions of the job identified and shall not be considered a detailed description of all the work inherent in the job. It is not an exhaustive list of responsibilities, and it is subject to changes and exceptions at the supervisor's discretion.

e.l.f. Beauty respects your privacy. Please see our Job Applicant Privacy Notice (www.elfbeauty.com/us-job-applicant-privacy-notice) for how your personal information is used and shared.
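The duty to "identify and troubleshoot data quality issues" usually means running rule-based checks over incoming rows. The sketch below is a hedged illustration; the rules and column names (customer_id, amount, email) are hypothetical:

```python
# Sketch of a rule-based data-quality check: count how many rows violate
# each rule so bad loads can be flagged. Rules and columns are illustrative.

def quality_report(rows: list[dict]) -> dict:
    issues = {"missing_id": 0, "negative_amount": 0, "bad_email": 0}
    for r in rows:
        # Completeness: every row needs a non-empty customer id.
        if not r.get("customer_id"):
            issues["missing_id"] += 1
        # Validity: amounts should never be negative.
        if isinstance(r.get("amount"), (int, float)) and r["amount"] < 0:
            issues["negative_amount"] += 1
        # Format: a crude well-formedness check on the email field.
        if "@" not in r.get("email", ""):
            issues["bad_email"] += 1
    return issues

report = quality_report([
    {"customer_id": "C1", "amount": 10.0, "email": "a@b.com"},
    {"customer_id": None, "amount": -5.0, "email": "not-an-email"},
])
print(report)  # {'missing_id': 1, 'negative_amount': 1, 'bad_email': 1}
```

In practice, checks like these are often expressed as SQL assertions or as rules in a data-quality framework rather than hand-rolled loops.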

Posted 2 days ago

Apply

5.0 - 9.0 years

8 - 18 Lacs

Pune

Work from Office


Job Title: Senior Data Engineer / Module Lead
Location: Pune, Maharashtra, India
Experience Level: 5-8 Years
Work Model: Full-time

About the Role:
We are seeking a highly skilled and experienced Senior Data Engineer to join our growing team in Pune. The ideal candidate will have a strong background in data engineering, with a particular focus on Google Cloud Platform (GCP) data services and Apache Airflow. You will be responsible for designing, developing, and maintaining robust and scalable data pipelines, ensuring data quality, and optimizing data solutions for performance and cost. This role requires a hands-on approach and the ability to work independently and collaboratively within an agile environment.

Responsibilities:
- Design, develop, and deploy scalable and efficient data pipelines using GCP data services, specifically BigQuery, Cloud Composer (Apache Airflow), and Cloud Storage.
- Develop, deploy, and manage complex DAGs in Apache Airflow to orchestrate data workflows.
- Write complex SQL and PL/SQL queries, stored procedures, and functions for data manipulation, transformation, and analysis.
- Optimize BigQuery queries for performance, cost efficiency, and scalability.
- Ensure data quality, integrity, and reliability across all data solutions.
- Collaborate with cross-functional teams, including data scientists, analysts, and other engineers, to understand data requirements and deliver effective solutions.
- Participate in code reviews and contribute to data engineering best practices.
- Troubleshoot and resolve data-related issues in a timely manner.
- Manage and maintain version control for data engineering projects using Git.
- Stay up to date with the latest industry trends and technologies in data engineering and GCP.

Required Skills and Qualifications:
- 5-8 years of hands-on experience in data engineering roles.
- Strong hands-on experience with GCP data services, specifically BigQuery, Cloud Composer (Apache Airflow), and Cloud Storage.
- Mandatory expertise in Apache Airflow, including designing, developing, and deploying complex DAGs.
- Mandatory strong proficiency in SQL and PL/SQL for data manipulation, stored procedures, functions, and complex query writing.
- Experience with Informatica.
- Proven ability to optimize BigQuery queries for performance and cost.
- Familiarity with version control systems (e.g., Git).
- Excellent problem-solving, analytical, and communication skills.
- Ability to work independently and collaboratively in an agile environment.
- Bachelor's degree in Computer Science, Engineering, or a related field.
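The Airflow duties in this listing revolve around DAGs: tasks that run only after their upstream dependencies complete. The toy scheduler below is not Airflow itself; it uses Python's standard-library topological sorter over a hypothetical pipeline to show the ordering idea that Airflow enforces:

```python
# Toy illustration of DAG ordering (not Airflow): each task maps to the set
# of upstream tasks it depends on, and a topological sort yields a valid
# execution order. Task names are hypothetical.

from graphlib import TopologicalSorter

dag = {
    "extract_orders": set(),
    "extract_customers": set(),
    "transform_join": {"extract_orders", "extract_customers"},
    "load_bigquery": {"transform_join"},
}

# static_order() guarantees every task appears after all its dependencies.
order = list(TopologicalSorter(dag).static_order())
print(order)
```

In a real Cloud Composer deployment, the same dependency structure would be declared with Airflow operators and `>>` chaining, and the scheduler would handle retries, backfills, and parallelism.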

Posted 2 days ago

Apply