
31 Matillion Jobs

JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

2.0 - 6.0 years

0 Lacs

Telangana

On-site

You will provide analytics support to Novartis internal customers (CPOs and regional marketing and sales teams) on various low- to medium-complexity analytical reports, and will support and facilitate data-enabled decision-making by providing and communicating qualitative and quantitative analytics. You will also support the GBS - GCO business in building the practice through initiatives such as knowledge sharing, onboarding and training support, supporting the team lead in business-related tasks and activities, and building process documentation and knowledge repositories. In addition, you will be an integral part of a comprehensive design team responsible for designing promotional marketing materials.

As an Analyst at Novartis, your key responsibilities will include: creating and delivering Field Excellence insights per agreed SLAs; designing, developing, and/or maintaining ETL-based solutions that optimize field excellence activities; delivering services through an Agile project management approach; maintaining standard operating procedures (SOPs) and quality checklists; and developing and maintaining knowledge repositories that collect qualitative and quantitative data on field excellence trends across Novartis operating markets.

Essential requirements: 2 years of experience in SQL and Excel, learning agility, the ability to manage multiple stakeholders, experience with pharma datasets, and experience in Python or another scripting language. Desirable: a university/advanced degree, ideally a Master's degree or equivalent experience in fields such as business administration, finance, computer science, or a technical field. At least 3 years of experience using ETL tools (Alteryx, Dataiku, Matillion, etc.) and hands-on experience with cloud-based platforms like Snowflake are mandatory.

Novartis's purpose is to reimagine medicine to improve and extend people's lives, with a vision to become the most valued and trusted medicines company in the world. By joining Novartis, you will be part of a mission-driven organization where associates drive the company to reach its ambitions. If you are passionate about making a difference in patients' lives and want to be part of a community of smart and dedicated individuals, consider joining Novartis. For more information about benefits and rewards at Novartis, refer to the Novartis Life Handbook at https://www.novartis.com/careers/benefits-rewards. To stay connected with Novartis and learn about future career opportunities, join the Novartis Network here: https://talentnetwork.novartis.com/network.
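
As a loose illustration of the ETL-style scripting this posting asks for, here is a minimal Python sketch; the file names and columns are invented for the example, not taken from the listing:

```python
# Minimal sketch of the kind of ETL scripting this role describes.
# File names and columns are hypothetical examples.
import pandas as pd

def extract(path: str) -> pd.DataFrame:
    # Read a raw field-activity extract (hypothetical CSV layout).
    return pd.read_csv(path, parse_dates=["call_date"])

def transform(df: pd.DataFrame) -> pd.DataFrame:
    # Standardise territory codes and aggregate calls per rep per month.
    df["territory"] = df["territory"].str.upper().str.strip()
    return (
        df.groupby([df["call_date"].dt.to_period("M"), "rep_id", "territory"])
          .size()
          .reset_index(name="call_count")
    )

def load(df: pd.DataFrame, path: str) -> None:
    # In practice this would land in a warehouse table; here we write CSV.
    df.to_csv(path, index=False)

if __name__ == "__main__":
    load(transform(extract("field_calls_raw.csv")), "field_calls_monthly.csv")
```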

Posted 2 days ago

Apply

4.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Data Engineer with 4 to 6 years of hands-on experience in Microsoft Fabric, Snowflake, and Matillion, you will be a valuable asset to our team. Your primary responsibility will be supporting MS Fabric and leading the migration to Snowflake and Matillion. Your expertise and attention to detail will play a crucial role in the success of these projects.

Posted 4 days ago

Apply

4.0 - 8.0 years

15 - 27 Lacs

Indore, Hyderabad

Hybrid

Data Engineer - D365 OneLake Integration Specialist

Position Overview: We are seeking an experienced Data Engineer with expertise in Microsoft D365 ERP and OneLake integration to support a critical acquisition integration project. The successful candidate will assess existing data integrations, collaborate with our data team to migrate pipelines to Snowflake using Matillion, and ensure seamless data flow for go-live critical reports by November 2025.

Role & responsibilities:
- Assessment & Documentation: Analyze and document existing D365 to OneLake/Fabric integrations and data flows
- Data Pipeline Migration: Collaborate with the current data team to redesign and migrate data integrations from D365 to Snowflake using Matillion
- Integration Architecture: Understand and map current Power BI reporting dependencies and data sources
- Go-Live Support: Identify critical reports for go-live and recommend optimal data integration strategies
- Technical Collaboration: Work closely with the existing data engineering team to leverage current Snowflake and Matillion expertise
- Knowledge Transfer: Document findings and provide recommendations on existing vs. new integration approaches
- ERP Implementation Support: Support the acquired company's ERP go-live timeline and requirements

Required Qualifications - Technical Skills:
- 3+ years of experience with Microsoft Dynamics 365 ERP data integrations
- 2+ years of hands-on experience with the Microsoft OneLake and Fabric ecosystem
- Strong experience with the Snowflake data warehouse platform
- Proficiency in the Matillion ETL tool for data pipeline development
- Experience with Power BI data modeling and reporting architecture
- Strong SQL skills and data modeling expertise
- Knowledge of Azure Data Factory or similar cloud ETL tools
- Experience with REST APIs and data connector frameworks

Business & Soft Skills:
- Experience supporting ERP implementation projects and go-live activities
- Strong analytical and problem-solving skills for complex data integration challenges
- Excellent documentation and communication skills
- Ability to work in fast-paced, deadline-driven environments
- Experience in M&A integration projects (preferred)
- Project management skills and the ability to prioritize go-live critical deliverables

Preferred candidate profile:
- Microsoft Azure certifications (DP-203, DP-900)
- Snowflake SnowPro certification
- Previous experience with acquisition integration projects
- Knowledge of financial and operational reporting requirements
- Familiarity with data governance and compliance frameworks

Posted 6 days ago

Apply

4.0 - 8.0 years

0 Lacs

Pune, Maharashtra

On-site

You will be joining a fast-growing data-analytics consultancy focused on Life Sciences / pharmaceutical commercial analytics. Our team specializes in building cloud-native data platforms that provide sales, marketing, and patient-centric insights for top global pharma brands, ensuring compliant, high-impact solutions at enterprise scale.

As a Data Engineer, you will architect, build, and optimize Snowflake data warehouses and ELT pipelines using SQL, Streams, Tasks, UDFs, and stored procedures to serve complex commercial-analytics workloads. You will integrate pharma data sources such as Veeva, Salesforce, IQVIA, Symphony, RWD, and patient-services feeds through Fivetran, ADF, or Python-based frameworks to ensure end-to-end data quality. Your duties will include establishing robust data models (star, snowflake, Data Vault) tailored for sales reporting, market-share analytics, and AI/ML use cases. You will drive governance and compliance (HIPAA, GDPR, GxP) by implementing fine-grained access controls, masking, lineage, and metadata management, and you will lead code reviews, mentor engineers, optimize performance, and ensure cost-efficient compute usage. Collaborating with business stakeholders to translate commercial objectives into scalable data solutions and actionable insights is a key aspect of the role.

Requirements: at least 7 years of data-engineering/warehousing experience, including a minimum of 4 years of hands-on Snowflake design and development; expertise in SQL, data modeling (Dimensional, Data Vault), and ETL/ELT optimization; proficiency in Python (or similar) for automation, API integrations, and orchestration; strong governance/security acumen within regulated industries (HIPAA, GDPR, PII); a Bachelor's degree in Computer Science, Engineering, or Information Systems (Master's preferred); and excellent client-facing communication and problem-solving skills in fast-paced, agile environments. Preferred: direct experience with pharma commercial datasets; cloud-platform depth (AWS, Azure, or GCP); familiarity with tools like Matillion, dbt, Airflow, and Git; Snowflake certifications (SnowPro Core / Advanced); and knowledge of Tableau, Power BI, or Qlik connectivity.

This is a full-time position that requires in-person work. If you are interested in this opportunity, please speak with the employer at +91 9008078505.
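
The Streams and Tasks pipelines this posting describes typically follow a pattern like the sketch below (Python with the Snowflake connector). This is a hedged illustration only: the account credentials and the SALES_RAW / SALES_CLEAN tables are placeholders, not from the listing.

```python
# Hedged sketch of a Snowflake Stream + Task ELT pattern.
# Connection parameters and table names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # placeholder
    user="etl_user",           # placeholder
    password="***",            # use a secrets manager in practice
    warehouse="ANALYTICS_WH",
    database="COMMERCIAL",
    schema="SALES",
)

statements = [
    # Capture inserts/updates on the raw landing table.
    "CREATE OR REPLACE STREAM SALES_RAW_STREAM ON TABLE SALES_RAW",
    # Hourly task that drains the stream into the curated table,
    # but only when the stream actually holds new rows.
    """
    CREATE OR REPLACE TASK MERGE_SALES_TASK
      WAREHOUSE = ANALYTICS_WH
      SCHEDULE = '60 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('SALES_RAW_STREAM')
    AS
      INSERT INTO SALES_CLEAN (ORDER_ID, AMOUNT, LOADED_AT)
      SELECT ORDER_ID, AMOUNT, CURRENT_TIMESTAMP()
      FROM SALES_RAW_STREAM
      WHERE METADATA$ACTION = 'INSERT'
    """,
    # Tasks are created suspended; resume to start the schedule.
    "ALTER TASK MERGE_SALES_TASK RESUME",
]

cur = conn.cursor()
try:
    for sql in statements:
        cur.execute(sql)
finally:
    cur.close()
    conn.close()
```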

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

Hyderabad, Telangana

On-site

As an Enterprise Snowflake L1/L2 AMS Support engineer, your primary responsibilities will include monitoring and supporting Snowflake data warehouse performance, optimizing queries, and overseeing job execution. You will troubleshoot data-loading failures, manage access control, and address role-based security issues. You will also carry out patching, software upgrades, and security compliance checks while upholding SLA commitments for query execution and system performance. To excel in this role, you should have 2-5 years of experience with Snowflake architecture, SQL scripting, and query optimization. Familiarity with ETL tools such as Talend, Matillion, and Alteryx for Snowflake integration is beneficial.
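
As a rough illustration of the monitoring duty above, a support engineer might poll Snowflake's QUERY_HISTORY table function for slow queries. The connection details below are placeholders; the 60-second threshold is an assumed example:

```python
# Hedged sketch of an L1/L2-style health check: list the slowest recent
# queries via Snowflake's INFORMATION_SCHEMA.QUERY_HISTORY table function.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="ops_user", password="***",  # placeholders
    warehouse="OPS_WH", database="ANALYTICS", schema="PUBLIC",
)

SLOW_QUERIES = """
SELECT query_id,
       user_name,
       total_elapsed_time / 1000 AS elapsed_s,   -- ms -> seconds
       LEFT(query_text, 120)     AS query_snippet
FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY(RESULT_LIMIT => 1000))
WHERE execution_status = 'SUCCESS'
  AND total_elapsed_time > 60000                 -- slower than 60 s
ORDER BY total_elapsed_time DESC
LIMIT 20
"""

with conn.cursor() as cur:
    for qid, user, secs, snippet in cur.execute(SLOW_QUERIES):
        print(f"{qid} {user} {secs:.1f}s {snippet}")
conn.close()
```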

Posted 1 week ago

Apply

12.0 - 16.0 years

0 Lacs

Hyderabad, Telangana

On-site

YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we are a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth: bringing real positive change in an increasingly virtual world, and it drives us beyond generational gaps and the disruptions of the future.

We are currently seeking Snowflake professionals with at least 12 years of experience in the following areas:
- Strong communication and proactive skills; ability to lead conversations
- Experience architecting and delivering solutions on AWS
- Hands-on experience with cloud warehouses like Snowflake
- Strong knowledge of data integrations, data modeling (Dimensional and Data Vault), and visualization practices
- Good understanding of data management (Data Quality, Data Governance, etc.)
- Zeal to pick up new technologies, conduct PoCs, and present PoVs

Technical skills (strong experience in at least one item in each category):
- Cloud: AWS
- Data Integration: Qlik Replicate, SnapLogic, Matillion, Informatica
- Visualization: Power BI, ThoughtSpot
- Storage & Databases: Snowflake, AWS

Certifications in Snowflake and SnapLogic would be considered a plus.

At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided by technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles:
- Flexible work arrangements, free spirit, and emotional positivity
- Agile self-determination, trust, transparency, and open collaboration
- All support needed for the realization of business goals
- Stable employment with a great atmosphere and an ethical corporate culture

Posted 1 week ago

Apply

9.0 - 14.0 years

30 - 40 Lacs

Pune, Chennai

Work from Office

Designing, implementing, and optimizing data solutions on both Azure and Snowflake. Experience working with the Matillion tool and with Azure and Snowflake, including data modeling, ETL processes, and data warehousing. Proficiency in SQL and data integration tools.

Posted 2 weeks ago

Apply

5.0 - 8.0 years

10 - 20 Lacs

Hyderabad

Work from Office

7+ years of experience as a Data Engineer or Snowflake Developer. Expert-level knowledge of SQL (joins, subqueries, CTEs). Experience with ETL tools (e.g., Informatica, Talend, Matillion). Experience with cloud platforms like AWS, Azure, or GCP.
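
The SQL depth this listing names (joins, subqueries, CTEs) looks roughly like the self-contained sketch below. It uses SQLite in memory so it runs anywhere; the role itself targets warehouse SQL, where the same shape applies. Tables and values are invented for the demo:

```python
# Self-contained demo of CTE + join + aggregate-subquery SQL,
# using SQLite in memory; table contents are made up.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL);
INSERT INTO orders VALUES
  (1, 'acme', 120.0), (2, 'acme', 80.0), (3, 'globex', 200.0);
""")

# CTE computes per-customer averages; the outer query joins back to
# flag orders above their customer's average.
sql = """
WITH totals AS (
    SELECT customer, AVG(amount) AS avg_amount
    FROM orders
    GROUP BY customer
)
SELECT o.id, o.customer, o.amount
FROM orders o
JOIN totals t ON t.customer = o.customer
WHERE o.amount > t.avg_amount
"""
for row in conn.execute(sql):
    print(row)   # -> (1, 'acme', 120.0)
```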

Posted 2 weeks ago

Apply

0.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Ready to shape the future of work? At Genpact, we don't just adapt to change, we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's industry-first accelerator is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. Our breakthrough solutions, from large-scale models onwards, tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today.

Inviting applications for the role of Consultant - Sr. Data Engineer (DBT + Snowflake)! In this role, the Sr. Data Engineer is responsible for providing technical direction and leading a group of one or more developers to address a goal.

Job Description:
- Develop, implement, and optimize data pipelines using Snowflake, with a focus on Cortex AI capabilities.
- Extract, transform, and load (ETL) data from various sources into Snowflake, ensuring data integrity and accuracy.
- Implement Conversational AI solutions using Snowflake Cortex AI to facilitate data interaction through chatbot agents.
- Collaborate with data scientists and AI developers to integrate predictive analytics and AI models into data workflows.
- Monitor and troubleshoot data pipelines to resolve data discrepancies and optimize performance.
- Utilize Snowflake's advanced features, including Snowpark, Streams, and Tasks, to enable data processing and analysis.
- Develop and maintain data documentation, best practices, and data governance protocols.
- Ensure data security, privacy, and compliance with organizational and regulatory guidelines.

Responsibilities:
- Bachelor's degree in Computer Science, Data Engineering, or a related field.
- Experience in data engineering, including experience working with Snowflake.
- Proven experience in Snowflake Cortex AI, focusing on data extraction, chatbot development, and Conversational AI.
- Strong proficiency in SQL, Python, and data modeling.
- Experience with data integration tools (e.g., Matillion, Talend, Informatica).
- Knowledge of cloud platforms such as AWS, Azure, or GCP.
- Excellent problem-solving skills, with a focus on data quality and performance optimization.
- Strong communication skills and the ability to work effectively in a cross-functional team.
- Proficiency in using DBT's testing and documentation features to ensure the accuracy and reliability of data transformations.
- Understanding of data lineage and metadata management concepts, and the ability to track and document data transformations using DBT's lineage capabilities.
- Understanding of software engineering best practices and the ability to apply these principles to DBT development, including version control, code reviews, and automated testing.
- Experience building data ingestion pipelines.
- Experience with Snowflake utilities such as SnowSQL, Snowpipe, bulk copy, Snowpark, tables, Tasks, Streams, Time Travel, Cloning, Optimizer, Metadata Manager, data sharing, stored procedures and UDFs, and Snowsight.
- Good experience implementing CDC or SCD Type 2.
- Proficiency in working with Airflow or other workflow management tools for scheduling and managing ETL jobs.
- Good to have: experience with repository tools like GitHub/GitLab or Azure Repos.

Qualifications/Minimum qualifications: B.E./Master's in Computer Science, Information Technology, or Computer Engineering, or an equivalent degree, with good IT experience and relevant working experience as a Sr. Data Engineer with DBT + Snowflake skills.

Skill Matrix: DBT (Core or Cloud), Snowflake, AWS/Azure, SQL, ETL concepts, Airflow or any orchestration tool, Data Warehousing concepts.

Why join Genpact?
- Be a transformation leader: work at the cutting edge of AI, automation, and digital innovation.
- Make an impact: drive change for global enterprises and solve business challenges that matter.
- Accelerate your career: get hands-on experience, mentorship, and continuous learning opportunities.
- Work with the best: join 140,000+ bold thinkers and problem-solvers who push boundaries every day.
- Thrive in a values-driven culture: our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress.

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
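
For readers unfamiliar with the CDC / SCD Type 2 requirement above, here is a minimal sketch of the pattern as Snowflake SQL driven from Python. The connection details and the DIM_CUSTOMER / STG_CUSTOMER tables (tracking a single "segment" attribute) are hypothetical, not from the posting:

```python
# Hedged sketch of SCD Type 2: close out changed dimension rows, then
# insert fresh "current" rows. Tables and columns are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",  # placeholders
    warehouse="ETL_WH", database="DW", schema="CORE",
)

# Step 1: close out current rows whose tracked attribute changed.
CLOSE_CHANGED = """
MERGE INTO DIM_CUSTOMER d
USING STG_CUSTOMER s
  ON d.customer_id = s.customer_id AND d.is_current = TRUE
WHEN MATCHED AND d.segment <> s.segment THEN UPDATE SET
  is_current = FALSE,
  valid_to   = CURRENT_TIMESTAMP()
"""

# Step 2: insert a new current row for any customer that now lacks one
# (brand-new customers, plus those just closed out in step 1).
INSERT_CURRENT = """
INSERT INTO DIM_CUSTOMER (customer_id, segment, valid_from, valid_to, is_current)
SELECT s.customer_id, s.segment, CURRENT_TIMESTAMP(), NULL, TRUE
FROM STG_CUSTOMER s
LEFT JOIN DIM_CUSTOMER d
  ON d.customer_id = s.customer_id AND d.is_current = TRUE
WHERE d.customer_id IS NULL
"""

with conn.cursor() as cur:
    cur.execute(CLOSE_CHANGED)
    cur.execute(INSERT_CURRENT)
conn.close()
```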

Posted 2 weeks ago

Apply

0.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Ready to shape the future of work? At Genpact, we don't just adapt to change, we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's industry-first accelerator is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. Our breakthrough solutions, from large-scale models onwards, tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today.

Inviting applications for the role of Principal Consultant - Sr. Data Engineer (DBT + Snowflake)! In this role, the Sr. Data Engineer is responsible for providing technical direction and leading a group of one or more developers to address a goal.

Job Description:
- Develop, implement, and optimize data pipelines using Snowflake, with a focus on Cortex AI capabilities.
- Extract, transform, and load (ETL) data from various sources into Snowflake, ensuring data integrity and accuracy.
- Implement Conversational AI solutions using Snowflake Cortex AI to facilitate data interaction through chatbot agents.
- Collaborate with data scientists and AI developers to integrate predictive analytics and AI models into data workflows.
- Monitor and troubleshoot data pipelines to resolve data discrepancies and optimize performance.
- Utilize Snowflake's advanced features, including Snowpark, Streams, and Tasks, to enable data processing and analysis.
- Develop and maintain data documentation, best practices, and data governance protocols.
- Ensure data security, privacy, and compliance with organizational and regulatory guidelines.

Roles and Responsibilities:
- Bachelor's degree in Computer Science, Data Engineering, or a related field.
- Experience in data engineering, including experience working with Snowflake.
- Proven experience in Snowflake Cortex AI, focusing on data extraction, chatbot development, and Conversational AI.
- Strong proficiency in SQL, Python, and data modeling.
- Experience with data integration tools (e.g., Matillion, Talend, Informatica).
- Knowledge of cloud platforms such as AWS, Azure, or GCP.
- Excellent problem-solving skills, with a focus on data quality and performance optimization.
- Strong communication skills and the ability to work effectively in a cross-functional team.
- Proficiency in using DBT's testing and documentation features to ensure the accuracy and reliability of data transformations.
- Understanding of data lineage and metadata management concepts, and the ability to track and document data transformations using DBT's lineage capabilities.
- Understanding of software engineering best practices and the ability to apply these principles to DBT development, including version control, code reviews, and automated testing.
- Experience building data ingestion pipelines.
- Experience with Snowflake utilities such as SnowSQL, Snowpipe, bulk copy, Snowpark, tables, Tasks, Streams, Time Travel, Cloning, Optimizer, Metadata Manager, data sharing, stored procedures and UDFs, and Snowsight.
- Good experience implementing CDC or SCD Type 2.
- Proficiency in working with Airflow or other workflow management tools for scheduling and managing ETL jobs.
- Good to have: experience with repository tools like GitHub/GitLab or Azure Repos.

Qualifications/Minimum qualifications: B.E./Master's in Computer Science, Information Technology, or Computer Engineering, or an equivalent degree, with good IT experience and relevant working experience as a Sr. Data Engineer with DBT + Snowflake skills.

Skill Matrix: DBT (Core or Cloud), Snowflake, AWS/Azure, SQL, ETL concepts, Airflow or any orchestration tool, Data Warehousing concepts.

Why join Genpact?
- Be a transformation leader: work at the cutting edge of AI, automation, and digital innovation.
- Make an impact: drive change for global enterprises and solve business challenges that matter.
- Accelerate your career: get hands-on experience, mentorship, and continuous learning opportunities.
- Work with the best: join 140,000+ bold thinkers and problem-solvers who push boundaries every day.
- Thrive in a values-driven culture: our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress.

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Posted 3 weeks ago

Apply

5.0 - 20.0 years

10 - 35 Lacs

Hyderabad, Pune, Delhi / NCR

Work from Office

Mandatory Skills - Snowflake, Matillion

Posted 3 weeks ago

Apply

5.0 - 6.0 years

12 - 16 Lacs

Gurugram

Work from Office

Role: Senior Data Architect - Snowflake & Matillion (Remote). Design and implement data architecture, support analytics, and collaborate with stakeholders. Minimum 5 years of experience. Strong data modeling, analytics, and communication skills required.

Posted 3 weeks ago

Apply

8.0 - 13.0 years

3 - 18 Lacs

Pune, Maharashtra, India

On-site

Your specific responsibilities will include:
- Design and implementation of last-mile data products using up-to-date technologies and software/data/DevOps engineering practices
- Enabling data science and analytics teams to drive data modeling and feature engineering activities aligned with business questions, utilizing datasets in an optimal way
- Developing deep domain expertise and business acumen to ensure that all specificities and pitfalls of data sources are accounted for
- Building data products based on automated data models, aligned with use-case requirements, and advising data scientists, analysts, and visualization developers on how to use these data models
- Developing analytical data products for reusability, governance, and compliance by design
- Aligning with organization strategy and implementing a semantic layer for analytics data products
- Supporting data stewards and other engineers in maintaining data catalogs, data quality measures, and governance frameworks

Education: B.Tech/B.S., M.Tech/M.S., or PhD in Engineering, Computer Science, Pharmaceuticals, Healthcare, Data Science, Business, or a related field.

Required experience:
- 8+ years of relevant work experience in the pharmaceutical/life sciences industry, with demonstrated hands-on experience analyzing, modeling, and extracting insights from commercial/marketing analytics datasets (specifically, real-world datasets)
- High proficiency in SQL, Python, and AWS
- Experience creating/adopting data models to meet requirements from Marketing, Data Science, and Visualization stakeholders
- Experience with feature engineering
- Experience with cloud-based (AWS/GCP/Azure) data management platforms and typical storage/compute services (Databricks, Snowflake, Redshift, etc.)
- Experience with modern data stack tools such as Matillion, Starburst, ThoughtSpot, and low-code tools (e.g., Dataiku)
- Excellent interpersonal and communication skills, with the ability to quickly establish productive working relationships with a variety of stakeholders
- Experience in analytics use cases of pharmaceutical products and vaccines
- Experience in market analytics and related use cases

Preferred experience:
- Experience in analytics use cases focused on informing marketing strategies and commercial execution of pharmaceutical products and vaccines
- Experience with Agile ways of working, leading or working as part of scrum teams
- Certifications in AWS and/or modern data technologies
- Knowledge of the commercial/marketing analytics data landscape and key data sources/vendors
- Experience building data models for data science and visualization/reporting products, in collaboration with data scientists, report developers, and business stakeholders
- Experience with data visualization technologies (e.g., Power BI)

Posted 1 month ago

Apply

0.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Ready to shape the future of work? At Genpact, we don't just adapt to change, we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's industry-first accelerator is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. Our breakthrough solutions, from large-scale models onwards, tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today.

Inviting applications for the role of Senior Associate - Sr. Data Engineer (DBT + Snowflake)! In this role, the Sr. Data Engineer is responsible for providing technical direction and leading a group of one or more developers to address a goal.

Job Description:
- Develop, implement, and optimize data pipelines using Snowflake, with a focus on Cortex AI capabilities.
- Extract, transform, and load (ETL) data from various sources into Snowflake, ensuring data integrity and accuracy.
- Implement Conversational AI solutions using Snowflake Cortex AI to facilitate data interaction through chatbot agents.
- Collaborate with data scientists and AI developers to integrate predictive analytics and AI models into data workflows.
- Monitor and troubleshoot data pipelines to resolve data discrepancies and optimize performance.
- Utilize Snowflake's advanced features, including Snowpark, Streams, and Tasks, to enable data processing and analysis.
- Develop and maintain data documentation, best practices, and data governance protocols.
- Ensure data security, privacy, and compliance with organizational and regulatory guidelines.

Responsibilities:
- Bachelor's degree in Computer Science, Data Engineering, or a related field.
- Experience in data engineering, including experience working with Snowflake.
- Proven experience in Snowflake Cortex AI, focusing on data extraction, chatbot development, and Conversational AI.
- Strong proficiency in SQL, Python, and data modeling.
- Experience with data integration tools (e.g., Matillion, Talend, Informatica).
- Knowledge of cloud platforms such as AWS, Azure, or GCP.
- Excellent problem-solving skills, with a focus on data quality and performance optimization.
- Strong communication skills and the ability to work effectively in a cross-functional team.
- Proficiency in using DBT's testing and documentation features to ensure the accuracy and reliability of data transformations.
- Understanding of data lineage and metadata management concepts, and the ability to track and document data transformations using DBT's lineage capabilities.
- Understanding of software engineering best practices and the ability to apply these principles to DBT development, including version control, code reviews, and automated testing.
- Experience building data ingestion pipelines.
- Experience with Snowflake utilities such as SnowSQL, Snowpipe, bulk copy, Snowpark, tables, Tasks, Streams, Time Travel, Cloning, Optimizer, Metadata Manager, data sharing, stored procedures and UDFs, and Snowsight.
- Good experience implementing CDC or SCD Type 2.
- Proficiency in working with Airflow or other workflow management tools for scheduling and managing ETL jobs.
- Good to have: experience with repository tools like GitHub/GitLab or Azure Repos.

Qualifications/Minimum qualifications: B.E./Master's in Computer Science, Information Technology, or Computer Engineering, or an equivalent degree, with good IT experience and relevant working experience as a Sr. Data Engineer with DBT + Snowflake skills.

Skill Matrix: DBT (Core or Cloud), Snowflake, AWS/Azure, SQL, ETL concepts, Airflow or any orchestration tool, Data Warehousing concepts.

Why join Genpact?
- Be a transformation leader: work at the cutting edge of AI, automation, and digital innovation.
- Make an impact: drive change for global enterprises and solve business challenges that matter.
- Accelerate your career: get hands-on experience, mentorship, and continuous learning opportunities.
- Work with the best: join 140,000+ bold thinkers and problem-solvers who push boundaries every day.
- Thrive in a values-driven culture: our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress.

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Posted 1 month ago

Apply

7.0 - 10.0 years

20 - 30 Lacs

Hyderabad, Chennai

Work from Office

Proficient in designing and delivering data pipelines in cloud data warehouses (e.g., Snowflake, Redshift) using various ETL/ELT tools such as Matillion, dbt, Striim, etc. Solid understanding of database systems (relational/NoSQL) and data modeling techniques.

Required candidate profile: candidates with strong experience in data architecture. Potential companies: Tiger Analytics, Tredence, Quantiphi, the data engineering groups within Infosys/TCS/Cognizant, Deloitte Consulting.

Perks and benefits: 5 working days - onsite

Posted 1 month ago

Apply

5.0 - 10.0 years

19 - 30 Lacs

Hyderabad

Work from Office

For Data Engineer: 3-5 years of experience, 2 openings. For Sr. Data Engineer: 6-10 years of experience, 2 openings.

About Us: Logic Pursuits provides companies with innovative technology solutions for everyday business problems. Our passion is to help clients become intelligent, information-driven organizations where fact-based decision-making is embedded into daily operations, leading to better processes and outcomes. Our team combines strategic consulting services with growth-enabling technologies to evaluate risk, manage data, and leverage AI and automated processes more effectively. With deep, Big Four consulting experience in business transformation and efficient processes, Logic Pursuits is a game-changer in any operations strategy.

Job Description: We are looking for an experienced and results-driven Data Engineer to join our growing Data Engineering team. The ideal candidate will be proficient in building scalable, high-performance data transformation pipelines using Snowflake and dbt, and able to work effectively in a consulting setup. In this role, you will be instrumental in ingesting, transforming, and delivering high-quality data to enable data-driven decision-making across the client's organization.

Key Responsibilities:
- Design and build robust ELT pipelines using dbt on Snowflake, including ingestion from relational databases, APIs, cloud storage, and flat files.
- Reverse-engineer and optimize SAP Data Services (SAP DS) jobs to support scalable migration to cloud-based data platforms.
- Implement layered data architectures (e.g., staging, intermediate, and mart layers) to enable reliable and reusable data assets.
- Enhance dbt/Snowflake workflows through performance optimization techniques such as clustering, partitioning, query profiling, and efficient SQL design.
- Use orchestration tools like Airflow, dbt Cloud, and Control-M to schedule, monitor, and manage data workflows.
- Apply modular SQL practices, testing, documentation, and Git-based CI/CD workflows for version-controlled, maintainable code.
- Collaborate with data analysts, scientists, and architects to gather requirements, document solutions, and deliver validated datasets.
- Contribute to internal knowledge sharing through reusable dbt components and participate in Agile ceremonies to support consulting delivery.

Required Qualifications - Data Engineering Skills:
- 3-5 years of experience in data engineering, with hands-on experience in Snowflake and basic to intermediate proficiency in dbt.
- Capable of building and maintaining ELT pipelines using dbt and Snowflake, with guidance on architecture and best practices.
- Understanding of ELT principles and foundational knowledge of data modeling techniques (preferably Kimball/Dimensional).
- Intermediate experience with SAP Data Services (SAP DS), including extracting, transforming, and integrating data from legacy systems.
- Proficient in SQL for data transformation and basic performance tuning in Snowflake (e.g., clustering, partitioning, materializations).
- Familiar with workflow orchestration tools like dbt Cloud, Airflow, or Control-M.
- Experience using Git for version control and exposure to CI/CD workflows in team environments.
- Exposure to cloud storage solutions such as Azure Data Lake, AWS S3, or GCS for ingestion and external staging in Snowflake.
- Working knowledge of Python for basic automation and data manipulation tasks.
- Understanding of Snowflake's role-based access control (RBAC), data security features, and general data privacy practices such as GDPR.

Data Quality & Documentation:
- Familiar with dbt testing and documentation practices (e.g., dbt tests, dbt docs).
- Awareness of standard data validation and monitoring techniques for reliable pipeline development.

Soft Skills & Collaboration:
- Strong problem-solving skills and the ability to debug SQL and transformation logic effectively.
- Able to document work clearly and communicate technical solutions to a cross-functional team.
- Experience working in Agile settings, participating in sprints, and handling shifting priorities.
- Comfortable collaborating with analysts, data scientists, and architects across onshore/offshore teams.
- High attention to detail, a proactive attitude, and adaptability in dynamic project environments.

Nice to Have:
- Experience working in client-facing or consulting roles.
- Exposure to AI/ML data pipelines or tools like feature stores and MLflow.
- Familiarity with enterprise-grade data quality tools.

Education: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field. Certifications such as Snowflake SnowPro or dbt Certified Developer are a plus.

Additional Information - Why Join Us?
- Opportunity to work on diverse and challenging projects in a consulting environment.
- Collaborative work culture that values innovation and curiosity.
- Access to cutting-edge technologies and a focus on professional development.
- Competitive compensation and benefits package.
- Be part of a dynamic team delivering impactful data solutions.

Required Qualification: Bachelor of Engineering - Bachelor of Technology (B.E./B.Tech.)
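
For context on the dbt testing practice mentioned above: dbt's built-in not_null and unique tests compile to assertion queries that must return zero offending rows. A hand-rolled Python equivalent, with hypothetical table and column names (in a real dbt project these checks live in schema.yml), might look like this:

```python
# Hedged sketch of what dbt's not_null / unique tests boil down to:
# assertion queries expected to return a zero count. Names are hypothetical.
import snowflake.connector

TESTS = {
    "stg_orders.order_id not_null":
        "SELECT COUNT(*) FROM STG_ORDERS WHERE order_id IS NULL",
    "stg_orders.order_id unique":
        """SELECT COUNT(*) FROM (
             SELECT order_id FROM STG_ORDERS
             GROUP BY order_id HAVING COUNT(*) > 1
           )""",
}

conn = snowflake.connector.connect(
    account="my_account", user="ci_user", password="***",  # placeholders
    warehouse="CI_WH", database="DW", schema="STAGING",
)
failures = []
with conn.cursor() as cur:
    for name, sql in TESTS.items():
        bad_rows = cur.execute(sql).fetchone()[0]
        if bad_rows:
            failures.append((name, bad_rows))
conn.close()

if failures:
    for name, n in failures:
        print(f"FAILED {name}: {n} offending rows")
    raise SystemExit(1)
print("all tests passed")
```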

Posted 1 month ago

Apply

12.0 - 22.0 years

3 - 6 Lacs

Chennai, Tamil Nadu, India

On-site

We are hiring an ESA Solution Architect - COE for a CMMI Level 5 client. If you have relevant experience and are looking for a challenging opportunity, we invite you to apply.

Key Responsibilities:
- Design and implement enterprise solutions that align with business and technical requirements.
- Lead migration projects from on-premise to cloud or cloud-to-cloud (preferably Snowflake).
- Provide expertise in ETL technologies such as Informatica, Matillion, and Talend.
- Develop Snowflake-based solutions and optimize data architectures.
- Analyze project constraints, mitigate risks, and recommend process improvements.
- Act as a liaison between technical teams and stakeholders, translating business needs into technical solutions.
- Conduct architectural system evaluations to ensure scalability and efficiency.
- Define processes and procedures to streamline solution delivery.
- Create solution prototypes and participate in technology selection.
- Ensure compliance with strategic guidelines, technical standards, and business objectives.
- Oversee solution development and collaborate closely with project management and IT teams.

Required Skills & Experience:
- 10+ years of experience in technical solutioning and enterprise solution architecture.
- Proven experience in cloud migration projects (on-prem to cloud / cloud-to-cloud).
- Strong expertise in Snowflake architecture and solutioning.
- Hands-on experience with ETL tools such as Informatica, Matillion, and Talend.
- Excellent problem-solving and risk mitigation skills.
- Ability to work with cross-functional teams and align technical solutions with business goals.

If you are interested, please share your updated profile.

Posted 1 month ago

Apply

5.0 - 10.0 years

16 - 31 Lacs

Pune, Chennai, Bengaluru

Work from Office

Hi connections, I am looking for a Matillion Lead for one of our MNC clients. Experience: 5+ years. Please email your resume to parul@mounttalent.com. Skills required: Matillion, Python, SQL. Location: Pune, Mumbai, Noida, Chennai, Bangalore, Hyderabad.

Posted 1 month ago

Apply

5.0 - 8.0 years

22 - 25 Lacs

Pune, Chennai, Coimbatore

Hybrid

Candidate skills: SQL, MySQL, SQL Server, Python, AWS, Rundeck, Matillion, Tableau, Salesforce, ETL, Change Management, Web Application Support.

We are seeking a highly skilled Senior Full Stack Developer with expertise in .NET Core, React, and Azure Cloud to design, develop, and deploy scalable applications. The ideal candidate should have a strong technical background, a problem-solving mindset, and the ability to collaborate effectively with cross-functional teams.

Key Responsibilities:
- Design, develop, and maintain web applications using .NET Core and React.
- Architect and optimize Azure cloud-based solutions for performance, security, and scalability.
- Implement best practices and design patterns to ensure high-quality, maintainable code.
- Collaborate with cross-functional teams, including designers, testers, and DevOps engineers, for seamless integration and deployment.
- Troubleshoot and resolve technical issues in cloud and application environments.
- Stay updated with the latest technologies and trends in .NET, React, and cloud computing.

Required Skills:
- .NET Core: strong experience building backend services and APIs.
- React: hands-on expertise developing dynamic front-end applications.
- Azure Cloud: proficiency deploying and managing applications on Azure.
- Experience with SQL and NoSQL databases.
- Familiarity with CI/CD pipelines, Git, and DevOps best practices.
- Strong problem-solving and debugging skills.
- Excellent communication and teamwork abilities.

Nice to Have:
- Experience with microservices architecture.
- Knowledge of Docker and Kubernetes.
- Understanding of Agile methodologies.

Posted 1 month ago

Apply

3.0 - 8.0 years

22 - 25 Lacs

Mohali, Panchkula

Work from Office

The Customer Support Database Engineer is responsible for enabling the day-to-day operations associated with our SaaS offerings. The team is responsible for frequent loads of client data, monitoring ETL and data processing, removing/resolving points of failure, and determining methods to improve query performance. This role will work both independently and as a team member, performing a large variety of tasks. The role is customer facing: the individual will provide expertise to customer business and IT departments, 2nd- and 3rd-level support will be required, and the individual will be responsible for investigating and resolving easy to extremely complex issues.

What you'll do:
- Be a key player in the delivery of the SLA and work with the customer support team to define and execute a 24x7 customer support plan
- Work queued cases from internal and external customers
- Triage cases, assist customers, and resolve issues and bugs
- Assist in coordinating responses to major incidents, including post-incident root cause analysis
- Act as an escalation point to resolve critical and major client-related issues
- Monitor ETL processing using Rundeck, AWS tools, Matillion, and other services
- SQL, Tableau, and script development for ongoing improvements
- Work closely with the professional services team to understand the needs of each client implementation
- Assist with the onboarding of new customers

What you need:
- This role requires weekend duties, and you may be asked to work an alternative schedule in the evenings
- Experience in technical support, issue management, and conflict resolution
- Intermediate to expert knowledge of SQL (MySQL, SQL Server) and experience using Python
- Demonstrable experience diagnosing bugs/issues in customized software solutions
- Experience with Salesforce and Salesforce-based apps; Salesforce administration and/or development a plus
- Experience with Tableau or other data visualization tools
- Great organization, collaboration, communication, and coordination skills
- Ability to work across the organization and collaborate with customers, sales, services, and account management
- Experience working in a structured change management process for a highly available environment a plus
- Experience supporting a critical client-facing web application

Technical Skills: SQL, MySQL, SQL Server, Python, AWS, Rundeck, Matillion, Tableau, Salesforce, ETL, Change Management, Web Application Support

Posted 1 month ago

Apply

5.0 - 10.0 years

12 - 18 Lacs

Pune, Bengaluru, Delhi / NCR

Work from Office

SQL, Snowflake, Tableau
SQL, Snowflake, DBT, data warehousing
SQL, Snowflake, Python, DBT, data warehousing
SQL, Snowflake, data warehousing, any ETL tool (Matillion preferred)

Posted 1 month ago

Apply

3.0 - 20.0 years

10 - 40 Lacs

Pune, Delhi / NCR, Greater Noida

Work from Office

Mandatory Skills - Snowflake, Matillion

Posted 1 month ago

Apply

6.0 - 11.0 years

6 - 11 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Experience in end-to-end data pipeline development and troubleshooting using Snowflake and Matillion Cloud: 5+ years of experience in DWH, with 2-4 years implementing DWH on Snowflake using Matillion. You will design, develop, and maintain ETL processes using Matillion to extract, transform, and load data into Snowflake, and develop and debug ETL programs primarily in Matillion Cloud. You will collaborate with data architects and business stakeholders to understand data requirements and translate them into technical solutions. We seek a skilled technical professional to lead end-to-end system and architecture design for our application and infrastructure. Responsibilities include data validation and end-to-end testing of ETL objects, source data analysis, and data profiling; troubleshooting and resolving issues related to Matillion development and data integration; collaborating with business users to create architecture aligned with business needs; and collaborating on project requirements for end-to-end data integration using ETL across structured, semi-structured, and unstructured data. A strong understanding of ELT/ETL and integration concepts and design best practices is required, as is experience performance-tuning Matillion Cloud data pipelines and the ability to troubleshoot issues quickly. Experience with SnowSQL and Snowpipe is an added advantage.
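
As a small illustration of the source-data profiling step this posting describes, a quick pandas sketch follows; the input file and its columns are a hypothetical extract, not from the listing:

```python
# Hedged sketch of source-data profiling before building a pipeline:
# per-column null / distinct stats plus numeric ranges.
import pandas as pd

df = pd.read_csv("source_extract.csv")  # hypothetical extract

profile = pd.DataFrame({
    "dtype":      df.dtypes.astype(str),
    "null_count": df.isna().sum(),
    "null_pct":   (df.isna().mean() * 100).round(2),
    "distinct":   df.nunique(),
})
print(profile)

# Numeric ranges help spot out-of-bounds values before load.
print(df.describe().T[["min", "max", "mean"]])
```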

Posted 2 months ago

Apply

6.0 - 11.0 years

6 - 11 Lacs

Pune, Maharashtra, India

On-site

Experience in end-to-end data pipeline development and troubleshooting using Snowflake and Matillion Cloud: 5+ years of experience in DWH, with 2-4 years implementing DWH on Snowflake using Matillion. You will design, develop, and maintain ETL processes using Matillion to extract, transform, and load data into Snowflake, and develop and debug ETL programs primarily in Matillion Cloud. You will collaborate with data architects and business stakeholders to understand data requirements and translate them into technical solutions. We seek a skilled technical professional to lead end-to-end system and architecture design for our application and infrastructure. Responsibilities include data validation and end-to-end testing of ETL objects, source data analysis, and data profiling; troubleshooting and resolving issues related to Matillion development and data integration; collaborating with business users to create architecture aligned with business needs; and collaborating on project requirements for end-to-end data integration using ETL across structured, semi-structured, and unstructured data. A strong understanding of ELT/ETL and integration concepts and design best practices is required, as is experience performance-tuning Matillion Cloud data pipelines and the ability to troubleshoot issues quickly. Experience with SnowSQL and Snowpipe is an added advantage.

Posted 2 months ago

Apply

6.0 - 11.0 years

6 - 11 Lacs

Hyderabad / Secunderabad, Telangana, Telangana, India

On-site

Experience in end-to-end data pipeline development and troubleshooting using Snowflake and Matillion Cloud: 5+ years of experience in DWH, with 2-4 years implementing DWH on Snowflake using Matillion. You will design, develop, and maintain ETL processes using Matillion to extract, transform, and load data into Snowflake, and develop and debug ETL programs primarily in Matillion Cloud. You will collaborate with data architects and business stakeholders to understand data requirements and translate them into technical solutions. We seek a skilled technical professional to lead end-to-end system and architecture design for our application and infrastructure. Responsibilities include data validation and end-to-end testing of ETL objects, source data analysis, and data profiling; troubleshooting and resolving issues related to Matillion development and data integration; collaborating with business users to create architecture aligned with business needs; and collaborating on project requirements for end-to-end data integration using ETL across structured, semi-structured, and unstructured data. A strong understanding of ELT/ETL and integration concepts and design best practices is required, as is experience performance-tuning Matillion Cloud data pipelines and the ability to troubleshoot issues quickly. Experience with SnowSQL and Snowpipe is an added advantage.

Posted 2 months ago

Apply
Page 1 of 2