3.0 - 6.0 years
0 Lacs
Andhra Pradesh, India
On-site
We are seeking a Data Engineer with strong expertise in SQL and ETL processes to support banking data pipelines, regulatory reporting, and data quality initiatives. The role involves building and optimizing data structures, implementing validation rules, and collaborating with governance and compliance teams. Experience in the banking domain and tools like Informatica and Azure Data Factory is essential.

Key skills:
Strong proficiency in SQL for writing complex queries, joins, data transformations, and aggregations.
Proven experience in building tables, views, and data structures within enterprise Data Warehouses and Data Lakes.
Strong understanding of data warehousing concepts, such as Slowly Changing Dimensions (SCDs), data normalization, and star/snowflake schemas.
Practical experience in Azure Data Factory (ADF) for orchestrating data pipelines and managing ingestion workflows.
Exposure to data cataloging, metadata management, and lineage tracking using Informatica EDC or Axon.
Experience implementing Data Quality rules for banking use cases such as completeness, consistency, uniqueness, and validity (see the illustrative sketch after this posting).
Familiarity with banking systems and data domains such as Flexcube, HRMS, CRM, Risk, Compliance, and IBG reporting.
Understanding of regulatory and audit readiness needs for Central Bank and internal governance forums.

Responsibilities:
Write optimized SQL scripts to extract, transform, and load (ETL) data from multiple banking source systems.
Design and implement staging and reporting layer structures, aligned to business requirements and regulatory frameworks.
Apply data validation logic based on predefined business rules and data governance requirements.
Collaborate with Data Governance, Risk, and Compliance teams to embed lineage, ownership, and metadata into datasets.
Monitor scheduled jobs and resolve ETL failures to ensure SLA adherence for reporting and operational dashboards.
Support production deployment, UAT sign-off, and issue resolution for data products across business units.

Qualifications:
3 to 6 years in banking-focused data engineering roles with hands-on SQL, ETL, and DQ rule implementation.
Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or related fields.
Banking domain experience is mandatory, especially in areas related to regulatory reporting, compliance, and enterprise data governance.
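To make the data-quality expectation above concrete, here is a minimal sketch (not part of the posting) of how completeness, uniqueness, and validity rules might be expressed as SQL run from Python. The table and column names are hypothetical placeholders, and any DB-API 2.0 connection (pyodbc, psycopg2, etc.) is assumed.

```python
# Illustrative only: simple completeness, uniqueness, and validity checks over
# a hypothetical banking staging table, using a generic DB-API 2.0 connection.

def run_dq_checks(conn, table="stg_customer_accounts", key_col="account_id",
                  mandatory_cols=("customer_id", "branch_code", "open_date")):
    """Return a dict of rule name -> violation count for one staging table."""
    results = {}
    cur = conn.cursor()

    # Completeness: mandatory columns must not be NULL.
    for col in mandatory_cols:
        cur.execute(f"SELECT COUNT(*) FROM {table} WHERE {col} IS NULL")
        results[f"completeness_{col}"] = cur.fetchone()[0]

    # Uniqueness: the business key must not repeat.
    cur.execute(
        f"SELECT COUNT(*) FROM (SELECT {key_col} FROM {table} "
        f"GROUP BY {key_col} HAVING COUNT(*) > 1) dup"
    )
    results[f"uniqueness_{key_col}"] = cur.fetchone()[0]

    # Validity: account open date must not be in the future.
    cur.execute(f"SELECT COUNT(*) FROM {table} WHERE open_date > CURRENT_DATE")
    results["validity_open_date"] = cur.fetchone()[0]

    return results
```

In practice each rule would be driven by governance metadata rather than hard-coded, but the shape of the checks stays the same.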
Posted 1 week ago
3.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Ignition Application Administrator Position: We are seeking a highly motivated Ignition Application Administrator to join the Enterprise Services – Data team. Working very closely with peer platform administrators, developers, Product/Project Seniors and Customers, you will play an active role in administering the existing analytics platforms. You will join a team of platform administrators who are specialized in one tool, but cross-trained on other tools. While you will focus on Ignition, administration knowledge of these other platforms is beneficial – Qlik Sense, Tableau, PowerBI, SAP Business Objects, Matillion, Snowflake, Informatica (EDC, IDQ, Axon), Alteryx, HVR or Databricks. This role requires a willingness to dive into complex problems to help the team find elegant solutions. How you communicate and approach problems is important to us. We are looking for team players, who are willing to bring people across the disciplines together. This position will provide the unique opportunity to operate in a start-up-like environment within a Fortune 50 company. Our digital focus is geared towards releasing the insights inherent to our best-in-class products and services. Together we aim to achieve new levels of productivity by changing the way we work and identifying new sources of growth for our customers. Responsibilities include, but are not limited to, the following: Install and configure Ignition. Monitor the Ignition platform, including integration with observability and alerting solutions, and recommend platform improvements. Troubleshoot and resolve Ignition platform issues. Configure data source connections and manage asset libraries. Identify and raise system capacity related issues (storage, licenses, performance threshold). Define best practices for Ignition deployment. Integrate Ignition with other ES Data platforms and Business Unit installations of Ignition. Participate in overall data platform architecture and strategy. Research and recommend alternative actions for problem resolution based on best practices and application functionality with minimal direction. Knowledge and Skills: 3+ years working in customer success or in a customer-facing engineering capacity is required. Large scale implementation experience with complex solutions environment. Experience in customer-facing positions, preferably industry experience in technology-based solutions. Experience being able to navigate, escalate and lead efforts on complex customer/partner requests or projects. Experience with Linux command line. An aptitude for both analysing technical concepts and translating them into business terms, as well as for mapping business requirements into technical features. Knowledge of the software development process and of software design methodologies helpful 3+ years’ experience in a cloud ops / Kubernetes application deployment and management role, working with an enterprise software or data product. Experience with Attribute-based Access Control (ABAC), Virtual Director Services (VDS), PING Federate or Azure Active Directory (AAD) helpful. Cloud platform architecture, administration and programming experience desired. 
Experience with Helm, Argo CD, Docker, and cloud networking. Excellent communication skills: interpersonal, written, and verbal. Education and Work Experience: This position requires a minimum of a BA/BS Degree (or equivalent) in technology, computing, or another related field of study. Experience in lieu of education may be considered if the individual has three (3+) or more years of relevant experience. Hours: Normal work schedule hours may vary, Monday through Friday. May be required to work flexible hours and/or weekends, as needed, to meet deadlines or to fulfil application administration obligations. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 week ago
3.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Ignition Application Administrator Position: We are seeking a highly motivated Ignition Application Administrator to join the Enterprise Services – Data team. Working very closely with peer platform administrators, developers, Product/Project Seniors and Customers, you will play an active role in administering the existing analytics platforms. You will join a team of platform administrators who are specialized in one tool, but cross-trained on other tools. While you will focus on Ignition, administration knowledge of these other platforms is beneficial – Qlik Sense, Tableau, PowerBI, SAP Business Objects, Matillion, Snowflake, Informatica (EDC, IDQ, Axon), Alteryx, HVR or Databricks. This role requires a willingness to dive into complex problems to help the team find elegant solutions. How you communicate and approach problems is important to us. We are looking for team players, who are willing to bring people across the disciplines together. This position will provide the unique opportunity to operate in a start-up-like environment within a Fortune 50 company. Our digital focus is geared towards releasing the insights inherent to our best-in-class products and services. Together we aim to achieve new levels of productivity by changing the way we work and identifying new sources of growth for our customers. Responsibilities include, but are not limited to, the following: Install and configure Ignition. Monitor the Ignition platform, including integration with observability and alerting solutions, and recommend platform improvements. Troubleshoot and resolve Ignition platform issues. Configure data source connections and manage asset libraries. Identify and raise system capacity related issues (storage, licenses, performance threshold). Define best practices for Ignition deployment. Integrate Ignition with other ES Data platforms and Business Unit installations of Ignition. Participate in overall data platform architecture and strategy. Research and recommend alternative actions for problem resolution based on best practices and application functionality with minimal direction. Knowledge and Skills: 3+ years working in customer success or in a customer-facing engineering capacity is required. Large scale implementation experience with complex solutions environment. Experience in customer-facing positions, preferably industry experience in technology-based solutions. Experience being able to navigate, escalate and lead efforts on complex customer/partner requests or projects. Experience with Linux command line. An aptitude for both analysing technical concepts and translating them into business terms, as well as for mapping business requirements into technical features. Knowledge of the software development process and of software design methodologies helpful 3+ years’ experience in a cloud ops / Kubernetes application deployment and management role, working with an enterprise software or data product. Experience with Attribute-based Access Control (ABAC), Virtual Director Services (VDS), PING Federate or Azure Active Directory (AAD) helpful. Cloud platform architecture, administration and programming experience desired. 
Experience with Helm, Argo CD, Docker, and cloud networking. Excellent communication skills: interpersonal, written, and verbal. Education and Work Experience: This position requires a minimum of a BA/BS Degree (or equivalent) in technology, computing, or another related field of study. Experience in lieu of education may be considered if the individual has three (3+) or more years of relevant experience. Hours: Normal work schedule hours may vary, Monday through Friday. May be required to work flexible hours and/or weekends, as needed, to meet deadlines or to fulfil application administration obligations. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 week ago
4.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Title: Sr. Data Engineer / ETL Developer

Job Summary: We are looking for a talented Data Engineer cum Database Developer with a strong background in the banking sector. The ideal candidate will have experience with SQL Server, AWS PostgreSQL, AWS Glue, and ETL tools, along with expertise in data ingestion frameworks and Control-M scheduling.

Key Responsibilities:
Design, develop, and maintain scalable data pipelines to support data ingestion and transformation processes.
Collaborate with cross-functional teams to gather requirements and implement solutions tailored to banking applications.
Utilize SQL Server and AWS PostgreSQL for database development, optimization, and management.
Implement data ingestion frameworks to ensure efficient and reliable data flow.
Develop and maintain ETL processes using AWS Glue or other ETL tools, with Control-M for scheduling (see the illustrative sketch after this posting).
Ensure data quality and integrity through validation and testing processes.
Monitor and optimize system performance to support business analytics and reporting needs.
Document data architecture, processes, and workflows for reference and compliance purposes.
Stay updated on industry trends and best practices related to data engineering and management.

Qualifications:
Bachelor's degree in Computer Science, Information Technology, or a related field.
4+ years of experience in data engineering and database development, preferably in the banking sector.
Proficiency in SQL Server and AWS PostgreSQL.
Experience with Databricks, AWS Glue, or other ETL tools (e.g., Informatica, ADF).
Strong understanding of data ingestion frameworks and methodologies.
Excellent problem-solving skills and attention to detail.
Knowledge of securitization in the banking industry would be a plus.
Strong communication skills for effective collaboration with stakeholders.
Familiarity with cloud-based data architectures and services.
Experience with data warehousing concepts and practices.
Knowledge of data privacy and security regulations in banking.
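As a rough illustration of the Glue-plus-scheduler responsibility above, the sketch below triggers an AWS Glue job from Python with boto3 and polls it to completion, roughly what a Control-M wrapper script might do. The job name and argument are hypothetical; only standard boto3 Glue calls are used.

```python
# Illustrative sketch: start a (hypothetical) Glue ETL job and wait for it to finish.
import time
import boto3

glue = boto3.client("glue", region_name="ap-south-1")

run = glue.start_job_run(
    JobName="ingest_loans_daily",                  # hypothetical Glue job name
    Arguments={"--load_date": "2025-07-15"},       # hypothetical job parameter
)
run_id = run["JobRunId"]

while True:
    job_run = glue.get_job_run(JobName="ingest_loans_daily", RunId=run_id)["JobRun"]
    state = job_run["JobRunState"]
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        break
    time.sleep(30)   # poll every 30 seconds

print(f"Glue run {run_id} finished with state {state}")
```

A non-zero exit on failure would normally be added so the scheduler can mark the job red and trigger retries or alerts.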
Posted 1 week ago
3.0 - 8.0 years
5 - 10 Lacs
Pune
Work from Office
We are seeking a highly skilled Senior Marketing Data Analyst to join our team. This role will involve analyzing and interpreting marketing data to identify trends, patterns, and actionable insights.

About the Role
The successful candidate will be responsible for:
Analyzing marketing data to support campaign targeting and optimize marketing efforts.
Collaborating with cross-functional teams to understand data needs and provide analytical solutions.
Evaluating marketing campaign performance and providing recommendations for improvement (see the illustrative query after this posting).
Ensuring data quality and accuracy through validation and cleanup.
Conducting ad-hoc analysis to solve business problems and provide actionable insights.

Requirements
To be successful in this role, you will need:
A minimum of 3 years of experience in data analysis.
Proficiency in SQL, Snowflake, and Python, used on a daily basis.
Ability to work with cross-functional teams and create relevant insights.
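A hedged example of the daily SQL/Snowflake/Python workflow this posting describes: pulling recent campaign performance from Snowflake and computing a conversion rate. The connection parameters and the marketing_events table are placeholders, not details from the posting.

```python
# Illustrative only: query a hypothetical campaign events table in Snowflake.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="analyst", password="***",
    warehouse="ANALYTICS_WH", database="MARKETING", schema="PUBLIC",
)

sql = """
    SELECT campaign_id,
           COUNT_IF(event_type = 'click')      AS clicks,
           COUNT_IF(event_type = 'conversion') AS conversions
    FROM marketing_events
    WHERE event_date >= DATEADD(day, -30, CURRENT_DATE())
    GROUP BY campaign_id
"""

for campaign_id, clicks, conversions in conn.cursor().execute(sql):
    rate = conversions / clicks if clicks else 0.0
    print(f"{campaign_id}: {rate:.1%} conversion over the last 30 days")
```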
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
India
On-site
Job Title - ETL Testing
Location - Hyderabad, Chennai, Bangalore, Pune, Mumbai, Noida, Coimbatore, Kolkata
Total exp: 8 to 12 years
Interview Mode: 1 virtual; 1 face to face

Job Description - We are looking for a seasoned QA professional with strong expertise in Data Warehousing and ETL testing to join our team. The ideal candidate will have experience working in Agile environments, handling multiple stakeholders, and executing complex testing strategies for data and cloud-based solutions.

Key Responsibilities:
Minimum of 7 years of experience in Data Warehousing and ETL testing.
Design, plan, and develop comprehensive test cases based on business requirements.
Work closely with multiple stakeholders and ensure clear communication throughout the project lifecycle.
Participate actively in Agile ceremonies including sprint planning, estimation, and testing of user stories and acceptance criteria.
Collaborate with QA team members, developers, and product owners to ensure high-quality deliverables.
Execute cloud readiness checks and ensure adherence to defined standards.
Maintain and track identified issues, vulnerabilities, and inventory records.
Ensure timely communication of test metrics, statuses, and issues to relevant stakeholders.
Work extensively with SQL (ANSI SQL and PL/SQL) to validate database functionality (see the reconciliation sketch after this posting).
Participate in all phases of the Software Development Life Cycle (SDLC) and Software Testing Life Cycle (STLC).
Analyze Business Requirement Documents (BRDs) to extract testing requirements.
Ensure effective defect reporting and evaluation documentation.
Contribute to product or design testability discussions.
Troubleshoot test reporting and inventory-related issues.

Preferred Skills:
Hands-on experience with IICS (Informatica Intelligent Cloud Services) is a plus.
Exposure to test automation tools like TOSCA is an added advantage.
Strong communication skills are a must.
Ability to work independently and collaboratively within a team.
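Not from the posting, but a minimal sketch of one common ETL validation: reconciling a row count and a column checksum between a source table and its warehouse target. It assumes any two DB-API 2.0 connections; all table and column names are hypothetical.

```python
# Illustrative source-to-target reconciliation test for an ETL pipeline.

def _scalar(conn, sql):
    """Run a single-value query and return the result."""
    cur = conn.cursor()
    cur.execute(sql)
    return cur.fetchone()[0]

def reconcile(src_conn, tgt_conn,
              src_table="sales_src.orders", tgt_table="dwh.fact_orders"):
    checks = {
        "row_count": "SELECT COUNT(*) FROM {t}",
        "amount_sum": "SELECT COALESCE(SUM(order_amount), 0) FROM {t}",
    }
    failures = []
    for name, template in checks.items():
        src_val = _scalar(src_conn, template.format(t=src_table))
        tgt_val = _scalar(tgt_conn, template.format(t=tgt_table))
        if src_val != tgt_val:
            failures.append(f"{name}: source={src_val}, target={tgt_val}")
    assert not failures, "Reconciliation failed: " + "; ".join(failures)
```

A real test suite would add column-level profiling and business-rule checks, but this captures the core source-vs-target comparison pattern.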
Posted 1 week ago
0 years
0 Lacs
Thane, Maharashtra, India
On-site
Role: ETL Developer
Location: Thane
Work Mode: Work From Office only
Workdays: Monday to Friday

About the company: It is a cutting-edge FinTech and RegTech company headquartered in Atlanta, USA, with an R&D center in Thane, India. We specialize in solving complex problems in the banking and payments industry using AI, machine learning, and big data analytics. Our flagship product is an AI-powered platform designed for banks, fintechs, and payment processors. It simplifies operations across four critical areas:
✅ Compliance
✅ Fraud Detection
✅ Reconciliation
✅ Analytics
Built on the powerful HPCC Systems platform, it helps financial institutions improve data accuracy, reduce risk, and increase operational efficiency.

Job Description:
• Design, implement, and continuously expand data pipelines by performing extraction, transformation, and loading activities.
• Investigate data to identify potential issues within ETL pipelines, notify end-users, and propose adequate solutions (a brief illustrative check follows this posting).
• Develop and implement data collection systems and other strategies that optimize statistical efficiency and data quality.
• Acquire data from primary or secondary data sources and maintain databases/data systems.
• Identify, analyze, and interpret trends or patterns in complex data sets.
• Work closely with management to prioritize business and information needs.
• Locate and define new process improvement opportunities.
• Prepare documentation for further reference.
• Perform quality testing and data assurance.
• High attention to detail.
• Passionate about complex data structures and problem solving.

Qualifications:
• Bachelor's degree in computer science, electrical engineering, or information technology.
• Experience working in IT.
• Experience working with complex data sets.
• Knowledge of at least one ETL tool (SSIS, Informatica, Talend, etc.).
• Knowledge of HPCC Systems and C++ preferred.
• Familiarity with Kafka on-premise architectures and ELK.
• Understanding of cross-cluster replication, index lifecycle management, and hot-warm architectures.
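Purely as an illustration of investigating data issues in a streaming stage of a pipeline, the sketch below spot-checks messages on a Kafka topic for missing mandatory fields. It assumes the kafka-python package, a local broker, and a hypothetical "payments.transactions" topic with JSON messages.

```python
# Illustrative only: flag Kafka records that are missing mandatory fields.
import json
from kafka import KafkaConsumer

REQUIRED = {"txn_id", "amount", "currency", "timestamp"}

consumer = KafkaConsumer(
    "payments.transactions",
    bootstrap_servers=["localhost:9092"],
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
    consumer_timeout_ms=10_000,   # stop iterating after 10s of inactivity
)

bad = 0
for message in consumer:
    missing = REQUIRED - message.value.keys()
    if missing:
        bad += 1
        print(f"offset {message.offset}: missing fields {sorted(missing)}")

print(f"{bad} problematic record(s) found")
```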
Posted 1 week ago
0 years
0 Lacs
Sadar, Uttar Pradesh, India
On-site
We are seeking a seasoned Informatica CDGC expert to work with Informatica team and lead the implementation and optimization of Informatica Cloud Data Governance and Catalog solutions. The ideal candidate will establish best practices, drive data governance initiatives, and mentor a team of data professionals to ensure a scalable and efficient governance framework aligned with business objectives
Posted 1 week ago
5.0 years
15 - 20 Lacs
Bengaluru, Karnataka, India
On-site
Required skills:
Good communication.
New-generation firewall (Palo Alto/Fortigate/Checkpoint/FTD).
Basic operational knowledge of wireless.
Knowledge of F5 and SD-WAN.
Good understanding and operational knowledge of switches (preferably Nexus).

Interview Process:
1st round: virtual.
2nd round: face-to-face at the Informatica office, Bagmane Tech Park, CV Raman Nagar. Please confirm with candidates that they can attend the mandatory face-to-face round before sharing their profiles; otherwise, do not submit them.
Note: Do not share profiles of NOC Engineers or TAC Engineers, as they are not suitable for this Network L2 role.

Network L2 screening questionnaire (experience: 5+ years; mandatory: routing, switching, firewall, wireless):
How many years of experience do you have?
Firewall: read or write access? (Reject if read-only access.)
Load balancers: do you have experience or knowledge?
SD-WAN: do you have experience or knowledge?
Wireless: do you have experience or knowledge?
Nexus: do you have experience or knowledge?

Skills: new-generation firewall (Palo Alto/Fortigate/Checkpoint/FTD), Fortigate, Checkpoint, FTD, Palo Alto firewall, F5, SD-WAN, wireless, load balancers, operational knowledge of switches (preferably Nexus), routing, switching, network engineer, good communication
Posted 1 week ago
5.0 years
20 - 25 Lacs
Chennai, Tamil Nadu, India
On-site
Exp: 5 - 12 Yrs Work Mode: Hybrid Location: Bangalore, Chennai, Kolkata, Pune and Gurgaon Primary Skills: Snowflake, SQL, DWH, Power BI, ETL and Informatica. We are seeking a skilled Snowflake Developer with a strong background in Data Warehousing (DWH), SQL, Informatica, Power BI, and related tools to join our Data Engineering team. The ideal candidate will have 5+ years of experience in designing, developing, and maintaining data pipelines, integrating data across multiple platforms, and optimizing large-scale data architectures. This is an exciting opportunity to work with cutting-edge technologies in a collaborative environment and help build scalable, high-performance data solutions. Key Responsibilities Minimum of 5+ years of hands-on experience in Data Engineering, with a focus on Data Warehousing, Business Intelligence, and related technologies. Data Integration & Pipeline Development: Develop and maintain data pipelines using Snowflake, Fivetran, and DBT for efficient ELT processes (Extract, Load, Transform) across various data sources. SQL Query Development & Optimization: Write complex, scalable SQL queries, including stored procedures, to support data transformation, reporting, and analysis. Data Modeling & ELT Implementation: Implement advanced data modeling techniques, such as Slowly Changing Dimensions (SCD Type-2), using DBT. Design and optimize high-performance data architectures. Business Requirement Analysis: Collaborate with business stakeholders to understand data needs and translate business requirements into technical solutions. Troubleshooting & Data Quality: Perform root cause analysis on data-related issues, ensuring effective resolution and maintaining high data quality standards. Collaboration & Documentation: Work closely with cross-functional teams to integrate data solutions. Create and maintain clear documentation for data processes, data models, and pipelines. Skills & Qualifications Expertise in Snowflake for data warehousing and ELT processes. Strong proficiency in SQL for relational databases and writing complex queries. Experience with Informatica PowerCenter for data integration and ETL development. Experience using Power BI for data visualization and business intelligence reporting. Experience with Fivetran for automated ELT pipelines. Familiarity with Sigma Computing, Tableau, Oracle, and DBT. Strong data analysis, requirement gathering, and mapping skills. Familiarity with cloud services such as Azure (RDBMS, Data Bricks, ADF), with AWS or GCP Experience with workflow management tools such as Airflow, Azkaban, or Luigi. Proficiency in Python for data processing (other languages like Java, Scala are a plus). Education- Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related quantitative field Skills: data integration,data analysis,data warehousing,snowflake,etl,data modeling,workflow management tools,informatica,power bi,python,dbt,aws,sql,pipelines,azure,banking domain,dwh,gcp,fivetran
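As a hedged sketch of the SCD Type-2 pattern this posting mentions, the snippet below expresses the close-and-insert logic as plain Snowflake SQL run from Python (in practice this would usually live in a DBT snapshot). The dim_customer and stg_customer tables and their columns are hypothetical names, not details from the posting.

```python
# Illustrative SCD Type-2 load: close out changed rows, then insert new current versions.
import snowflake.connector

conn = snowflake.connector.connect(account="my_account", user="etl", password="***",
                                    warehouse="LOAD_WH", database="DWH", schema="CORE")
cur = conn.cursor()

# 1. Close out current dimension rows whose tracked attributes changed in staging.
cur.execute("""
    UPDATE dim_customer
    SET    valid_to = CURRENT_TIMESTAMP(), is_current = FALSE
    FROM   stg_customer s
    WHERE  dim_customer.customer_id = s.customer_id
      AND  dim_customer.is_current
      AND  (dim_customer.segment <> s.segment
            OR dim_customer.branch_code <> s.branch_code)
""")

# 2. Insert a new current version for customers with no open row (new or just closed).
cur.execute("""
    INSERT INTO dim_customer (customer_id, segment, branch_code,
                              valid_from, valid_to, is_current)
    SELECT s.customer_id, s.segment, s.branch_code,
           CURRENT_TIMESTAMP(), NULL, TRUE
    FROM   stg_customer s
    LEFT JOIN dim_customer d
           ON d.customer_id = s.customer_id AND d.is_current
    WHERE  d.customer_id IS NULL
""")
conn.commit()
```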
Posted 1 week ago
5.0 years
20 - 25 Lacs
Gurugram, Haryana, India
On-site
Exp: 5 - 12 Yrs Work Mode: Hybrid Location: Bangalore, Chennai, Kolkata, Pune and Gurgaon Primary Skills: Snowflake, SQL, DWH, Power BI, ETL and Informatica. We are seeking a skilled Snowflake Developer with a strong background in Data Warehousing (DWH), SQL, Informatica, Power BI, and related tools to join our Data Engineering team. The ideal candidate will have 5+ years of experience in designing, developing, and maintaining data pipelines, integrating data across multiple platforms, and optimizing large-scale data architectures. This is an exciting opportunity to work with cutting-edge technologies in a collaborative environment and help build scalable, high-performance data solutions. Key Responsibilities Minimum of 5+ years of hands-on experience in Data Engineering, with a focus on Data Warehousing, Business Intelligence, and related technologies. Data Integration & Pipeline Development: Develop and maintain data pipelines using Snowflake, Fivetran, and DBT for efficient ELT processes (Extract, Load, Transform) across various data sources. SQL Query Development & Optimization: Write complex, scalable SQL queries, including stored procedures, to support data transformation, reporting, and analysis. Data Modeling & ELT Implementation: Implement advanced data modeling techniques, such as Slowly Changing Dimensions (SCD Type-2), using DBT. Design and optimize high-performance data architectures. Business Requirement Analysis: Collaborate with business stakeholders to understand data needs and translate business requirements into technical solutions. Troubleshooting & Data Quality: Perform root cause analysis on data-related issues, ensuring effective resolution and maintaining high data quality standards. Collaboration & Documentation: Work closely with cross-functional teams to integrate data solutions. Create and maintain clear documentation for data processes, data models, and pipelines. Skills & Qualifications Expertise in Snowflake for data warehousing and ELT processes. Strong proficiency in SQL for relational databases and writing complex queries. Experience with Informatica PowerCenter for data integration and ETL development. Experience using Power BI for data visualization and business intelligence reporting. Experience with Fivetran for automated ELT pipelines. Familiarity with Sigma Computing, Tableau, Oracle, and DBT. Strong data analysis, requirement gathering, and mapping skills. Familiarity with cloud services such as Azure (RDBMS, Data Bricks, ADF), with AWS or GCP Experience with workflow management tools such as Airflow, Azkaban, or Luigi. Proficiency in Python for data processing (other languages like Java, Scala are a plus). Education- Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related quantitative field Skills: data integration,data analysis,data warehousing,snowflake,etl,data modeling,workflow management tools,informatica,power bi,python,dbt,aws,sql,pipelines,azure,banking domain,dwh,gcp,fivetran
Posted 1 week ago
5.0 years
20 - 25 Lacs
Greater Kolkata Area
On-site
Exp: 5 - 12 Yrs Work Mode: Hybrid Location: Bangalore, Chennai, Kolkata, Pune and Gurgaon Primary Skills: Snowflake, SQL, DWH, Power BI, ETL and Informatica. We are seeking a skilled Snowflake Developer with a strong background in Data Warehousing (DWH), SQL, Informatica, Power BI, and related tools to join our Data Engineering team. The ideal candidate will have 5+ years of experience in designing, developing, and maintaining data pipelines, integrating data across multiple platforms, and optimizing large-scale data architectures. This is an exciting opportunity to work with cutting-edge technologies in a collaborative environment and help build scalable, high-performance data solutions. Key Responsibilities Minimum of 5+ years of hands-on experience in Data Engineering, with a focus on Data Warehousing, Business Intelligence, and related technologies. Data Integration & Pipeline Development: Develop and maintain data pipelines using Snowflake, Fivetran, and DBT for efficient ELT processes (Extract, Load, Transform) across various data sources. SQL Query Development & Optimization: Write complex, scalable SQL queries, including stored procedures, to support data transformation, reporting, and analysis. Data Modeling & ELT Implementation: Implement advanced data modeling techniques, such as Slowly Changing Dimensions (SCD Type-2), using DBT. Design and optimize high-performance data architectures. Business Requirement Analysis: Collaborate with business stakeholders to understand data needs and translate business requirements into technical solutions. Troubleshooting & Data Quality: Perform root cause analysis on data-related issues, ensuring effective resolution and maintaining high data quality standards. Collaboration & Documentation: Work closely with cross-functional teams to integrate data solutions. Create and maintain clear documentation for data processes, data models, and pipelines. Skills & Qualifications Expertise in Snowflake for data warehousing and ELT processes. Strong proficiency in SQL for relational databases and writing complex queries. Experience with Informatica PowerCenter for data integration and ETL development. Experience using Power BI for data visualization and business intelligence reporting. Experience with Fivetran for automated ELT pipelines. Familiarity with Sigma Computing, Tableau, Oracle, and DBT. Strong data analysis, requirement gathering, and mapping skills. Familiarity with cloud services such as Azure (RDBMS, Data Bricks, ADF), with AWS or GCP Experience with workflow management tools such as Airflow, Azkaban, or Luigi. Proficiency in Python for data processing (other languages like Java, Scala are a plus). Education- Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related quantitative field Skills: data integration,data analysis,data warehousing,snowflake,etl,data modeling,workflow management tools,informatica,power bi,python,dbt,aws,sql,pipelines,azure,banking domain,dwh,gcp,fivetran
Posted 1 week ago
5.0 years
20 - 25 Lacs
Pune, Maharashtra, India
On-site
Exp: 5 - 12 Yrs Work Mode: Hybrid Location: Bangalore, Chennai, Kolkata, Pune and Gurgaon Primary Skills: Snowflake, SQL, DWH, Power BI, ETL and Informatica. We are seeking a skilled Snowflake Developer with a strong background in Data Warehousing (DWH), SQL, Informatica, Power BI, and related tools to join our Data Engineering team. The ideal candidate will have 5+ years of experience in designing, developing, and maintaining data pipelines, integrating data across multiple platforms, and optimizing large-scale data architectures. This is an exciting opportunity to work with cutting-edge technologies in a collaborative environment and help build scalable, high-performance data solutions. Key Responsibilities Minimum of 5+ years of hands-on experience in Data Engineering, with a focus on Data Warehousing, Business Intelligence, and related technologies. Data Integration & Pipeline Development: Develop and maintain data pipelines using Snowflake, Fivetran, and DBT for efficient ELT processes (Extract, Load, Transform) across various data sources. SQL Query Development & Optimization: Write complex, scalable SQL queries, including stored procedures, to support data transformation, reporting, and analysis. Data Modeling & ELT Implementation: Implement advanced data modeling techniques, such as Slowly Changing Dimensions (SCD Type-2), using DBT. Design and optimize high-performance data architectures. Business Requirement Analysis: Collaborate with business stakeholders to understand data needs and translate business requirements into technical solutions. Troubleshooting & Data Quality: Perform root cause analysis on data-related issues, ensuring effective resolution and maintaining high data quality standards. Collaboration & Documentation: Work closely with cross-functional teams to integrate data solutions. Create and maintain clear documentation for data processes, data models, and pipelines. Skills & Qualifications Expertise in Snowflake for data warehousing and ELT processes. Strong proficiency in SQL for relational databases and writing complex queries. Experience with Informatica PowerCenter for data integration and ETL development. Experience using Power BI for data visualization and business intelligence reporting. Experience with Fivetran for automated ELT pipelines. Familiarity with Sigma Computing, Tableau, Oracle, and DBT. Strong data analysis, requirement gathering, and mapping skills. Familiarity with cloud services such as Azure (RDBMS, Data Bricks, ADF), with AWS or GCP Experience with workflow management tools such as Airflow, Azkaban, or Luigi. Proficiency in Python for data processing (other languages like Java, Scala are a plus). Education- Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related quantitative field Skills: data integration,data analysis,data warehousing,snowflake,etl,data modeling,workflow management tools,informatica,power bi,python,dbt,aws,sql,pipelines,azure,banking domain,dwh,gcp,fivetran
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
As a Senior ETL & Data Streaming Engineer at DataFlow Group, a global provider of Primary Source Verification solutions and background screening services, you will be a key player in the design, development, and maintenance of robust data pipelines. With over 10 years of experience, you will leverage your expertise in both batch ETL processes and real-time data streaming technologies to ensure efficient data extraction, transformation, and loading into our Data Lake and Data Warehouse. Your responsibilities will include designing and implementing highly scalable ETL processes using industry-leading tools, as well as architecting batch and real-time data streaming solutions with technologies like Talend, Informatica, Apache Kafka, or AWS Kinesis. You will collaborate with data architects, data scientists, and business stakeholders to understand data requirements and translate them into effective pipeline solutions, ensuring data quality, integrity, and security across all storage solutions. Monitoring, troubleshooting, and optimizing existing data pipelines for performance, cost-efficiency, and reliability will be a crucial part of your role. Additionally, you will develop comprehensive documentation for all ETL and streaming processes, contribute to data governance policies, and mentor junior engineers to foster a culture of technical excellence and continuous improvement. To excel in this position, you should have 10+ years of progressive experience in data engineering, with a focus on ETL, ELT, and data pipeline development. Your deep expertise in ETL tools like Talend, proficiency in Data Streaming Technologies such as AWS Glue and Apache Kafka, and extensive experience with AWS data services like S3, Glue, and Lake Formation will be essential. Strong knowledge of traditional data warehousing concepts, dimensional modeling, programming languages like SQL and Python, and relational and NoSQL databases will also be required. If you are a problem-solver with excellent analytical skills, strong communication abilities, and a passion for staying updated on emerging technologies and industry best practices in data engineering, ETL, and streaming, we invite you to join our team at DataFlow Group and make a significant impact in the field of data management.,
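To illustrate the real-time streaming side of this role, here is a minimal, hedged producer sketch that pushes change events onto an AWS Kinesis stream with boto3. The stream name and event shape are hypothetical; only standard boto3 Kinesis calls are used.

```python
# Illustrative only: publish a (hypothetical) verification event to Kinesis.
import json
import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")

def publish_event(event: dict, stream: str = "verification-events") -> str:
    response = kinesis.put_record(
        StreamName=stream,
        Data=json.dumps(event).encode("utf-8"),
        PartitionKey=str(event["applicant_id"]),  # keeps one applicant's events ordered
    )
    return response["SequenceNumber"]

seq = publish_event({"applicant_id": 12345, "status": "VERIFIED", "source": "PSV"})
print("published with sequence number", seq)
```

A downstream consumer (Kinesis Data Analytics, a Lambda, or a Glue streaming job) would then transform these events into the Data Lake or warehouse.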
Posted 1 week ago
8.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
Location Chennai, Tamil Nadu, India Job ID R-231552 Date posted 15/07/2025 Job Title: Senior Consultant - Coupa GCL -D3 Introduction to role Are you ready to disrupt an industry and change lives? As a Senior Consultant specializing in the Coupa Platform, you'll leverage your technical expertise to support the delivery of life-changing solutions. You'll act as the technical domain expert, driving program management and scaling efforts by collaborating with key stake holders. Your role will be pivotal in transforming our ability to develop medicines that impact lives. Accountabilities Technical Ownership: Support Coupa technical solution design and implementation in alignment with design decisions. Participate in design discussions and contribute towards decisions. Engage in the full lifecycle of Coupa technical delivery—from concept to design to deployment and post-implementation stabilization. Gather high-level business requirements, perform analysis, define Coupa technology requirements, and design solutions based on completed analysis. Integration & Middleware Oversight: Support mapping between legacy and target systems (Coupa), coordinating with middleware and interface teams. Define and validate integration points across systems, applications, and services. Support development and testing of APIs, messaging frameworks, error handling, push-pull mechanisms, and data pipelines. Data Migration Execution: Support large-volume data migration activities, including mock runs, cutover rehearsal, and production go-live support. Ensure data cleansing, mapping rules, and exception handling are well-documented and implemented. Collaborate with business stake holders to define data acceptance criteria and validation plans. DevOps Skills: Demonstrate strong knowledge about the Coupa platform and its integrations. Actively assess system enhancements and deploy them in accordance with the latest platform product release. Identify process improvements and implement changes with clear outcomes of improvement and standardization. Undertake diagnostic work to understand specific technical issues or problems in greater depth. Manage change management end-to-end and support testing activities by triaging, scenario setting, etc. Deliver platform-based projects to improve adoption of the latest features. Resolve issues by partnering with technical, finance, procurement teams, and vendors. Essential Skills/Experience Coupa certified 8+ years of overall IT experience with solid background on Coupa Technical delivery roles, with proven experience in large-scale data migration and middleware integration. Experience with integration technologies such as MuleSoft, Dell Boomi, Azure Integration Services, Kafka, or similar. Proficient in ETL tools and practices (e.g., Informatica, Talend, SQL-based scripting). Familiarity with cloud platforms (AWS, Azure, GCP) and hybrid integration strategies. Strong problem-solving and analytical skills in technical and data contexts. Ability to translate complex technical designs into business-aligned delivery outcomes. Leadership in cross-functional and cross-technology environments. Effective communicator capable of working with developers, data engineers, testers, and business stake holders. Experienced with IT Service Management tools like ServiceNow & Jira Experience in managing and developing 3rd party business relationships - UG - B. Tech /B.E. 
or other equivalent technical qualifications When we put unexpected teams in the same room, we unleash bold thinking with the power to inspire life-changing medicines. In-person working gives us the platform we need to connect, work at pace and challenge perceptions. That's why we work, on average, a minimum of three days per week from the office. But that doesn't mean we're not flexible. We balance the expectation of being in the office while respecting individual flexibility. Join us in our unique and ambitious world. At AstraZeneca, we empower our teams to innovate and take ownership of their work. Our dynamic environment encourages experimentation with innovative technology while tackling challenges that have never been addressed before. With a focus on collaboration across diverse areas, we drive simplicity and frictionless interactions. Here you can design your own path with the support needed to thrive and develop. Our commitment to lifelong learning ensures that every day is an opportunity for growth. Ready to make a meaningful impact? Apply now to join us on this exciting journey! Date Posted 16-Jul-2025 Closing Date 29-Jul-2025 AstraZeneca embraces diversity and equality of opportunity. We are committed to building an inclusive and diverse team representing all backgrounds, with as wide a range of perspectives as possible, and harnessing industry-leading skills. We believe that the more inclusive we are, the better our work will be. We welcome and consider applications to join our team from all qualified candidates, regardless of their characteristics. We comply with all applicable laws and regulations on non-discrimination in employment (and recruitment), as well as work authorization and employment eligibility verification requirements.
Posted 2 weeks ago
6.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
Location Chennai, Tamil Nadu, India Job ID R-231549 Date posted 15/07/2025 Title : Analyst - Coupa GCL : C3 Job Title: Analyst - Coupa Introduction to role: Are you ready to disrupt an industry and change lives? As an Analyst specializing in the Coupa Platform, you'll leverage your technical expertise to support the design, implementation, and integration of this progressive technology. You'll be the technical domain expert, leading change and scaling solutions by collaborating with stake holders. Your work will directly impact our ability to develop life-changing medicines and empower the business to perform at its peak. Accountabilities: Technical Ownership: Support Coupa technical solution design & implementation in alignment with design decisions. Participate in design discussions and supply towards decisions. Engage in the full lifecycle of Coupa technical delivery—from concept to design to deployment and post-implementation stabilization. Gather high-level business requirements, perform analysis, define Coupa technology requirements, and design solutions based on completed analysis. Integration & Middleware Oversight: Support mapping between legacy and target systems (Coupa), coordinating with middleware and interface teams. Define and validate integration points across systems, applications, and services. Support development and testing of APIs, messaging frameworks, error handling, push-pull mechanisms, and data pipelines. Data Migration Execution: Support large-volume data migration activities, including mock runs, cutover rehearsal, and production release support. Ensure data cleansing, mapping rules, and exception handling are well-documented and implemented. Collaborate with business stake holders to define data acceptance criteria and validation plans. DevOps Skills: Demonstrate strong knowledge about the Coupa platform and its integrations. Actively assess system enhancements and deploy them in accordance with the latest platform product release. Identify process improvements and implement change with clear outcomes of improvement and standardization. Undertake diagnostic work to understand specific technical issues or problems in greater depth. Manage change management end-to-end and support testing activities by triaging, scenario setting, etc. Deliver platform-based projects to improve adoption of the latest features. Resolve issues by partnering with technical, finance, procurement teams, and vendors. Essential Skills/Experience: Coupa certified 6+ years of overall IT experience with good background on Coupa Technical delivery roles Experience with integration technologies such as MuleSoft, Dell Boomi, Azure Integration Services, Kafka, or similar Proficient in ETL tools and practices (e.g., Informatica, Talend, SQL-based scripting) Familiarity with cloud platforms (AWS, Azure, GCP) and hybrid integration strategies Strong problem-solving and analytical skills in technical and data contexts Ability to translate complex technical designs into business-aligned delivery outcomes Leadership in cross-functional and cross-technology environments Effective communicator capable of working with developers, data engineers, testers, and business stake holders Experienced with IT Service Management tools like ServiceNow & Jira Experience in managing and developing 3rd party business relationships Educational Qualifications: - UG - B. Tech /B.E. 
or other equivalent technical qualifications When we put unexpected teams in the same room, we unleash bold thinking with the power to inspire life-changing medicines. In-person working gives us the platform we need to connect, work at pace and challenge perceptions. That's why we work, on average, a minimum of three days per week from the office. But that doesn't mean we're not flexible. We balance the expectation of being in the office while respecting individual flexibility. Join us in our unique and ambitious world. At AstraZeneca, you'll be part of a dynamic environment where innovation thrives. Our commitment to innovative science combined with leading digital technology platforms empowers us to make a significant impact. With a spirit of experimentation and collaboration across diverse teams, we drive cross-company change to disrupt the industry. Here, you can explore new technologies, shape your own path, and contribute to developing life-changing medicines. Ready to make a difference? Apply now to join our journey! Date Posted 16-Jul-2025 Closing Date 29-Jul-2025 AstraZeneca embraces diversity and equality of opportunity. We are committed to building an inclusive and diverse team representing all backgrounds, with as wide a range of perspectives as possible, and harnessing industry-leading skills. We believe that the more inclusive we are, the better our work will be. We welcome and consider applications to join our team from all qualified candidates, regardless of their characteristics. We comply with all applicable laws and regulations on non-discrimination in employment (and recruitment), as well as work authorization and employment eligibility verification requirements.
Posted 2 weeks ago
12.0 years
0 Lacs
Tharamani, Chennai, Tamil Nadu
On-site
Job Information:
Job Type: Permanent
Date Opened: 07/15/2025
Work Shift: UK Shift
Work Experience: 12+ years
Industry: IT Services
Work Location: Chennai - OMR
State/Province: Tamil Nadu
City: Tharamani
Zip/Postal Code: 600113
Country: India

Job Description - IDMC Administrator

Configure and Maintain Platform: Set up and configure the IDMC platform according to business and enterprise requirements. Manage platform components, including connections, services, and integrations. Ensure high availability, scalability, and optimal performance. Leverage “isolation” features to separate enterprise projects and workloads. Install, configure, and administer Linux-based software components, such as Informatica Intelligent Cloud Services (IICS), Data Catalog, Marketplace, Data Quality, Data Profiling, and Metadata Management.

Scale, Tune and Optimize Environment: Use cloud-specific scaling, including auto-scaling, elastic load balancing, and serverless computing, to facilitate larger load volumes, concurrency, and high availability. Monitor system performance and identify bottlenecks. Fine-tune configurations for optimal resource utilization. Implement best practices for performance optimization. At the OS level (Linux), adjust kernel parameters, manage and tune file systems, and use tools to monitor system health and identify bottlenecks (a brief illustrative check follows this posting). Optimize network integration to ensure efficient data flow between on-premises and cloud environments.

Integrate with Enterprise and Cloud Systems: Collaborate with other IT teams to integrate IDMC with existing enterprise systems (e.g., ERP, CRM, databases). Configure connectors and APIs to facilitate seamless data exchange. Ensure compatibility with the Azure cloud platform for compute and storage infrastructure deployment, security integration, network integration, and monitoring.

Administer and Control Platform Security: Implement security measures to safeguard data and prevent unauthorized access. Manage user access, roles, and permissions. Monitor security logs and address any security incidents promptly. Implement, monitor, test, and document security controls to abide by applicable regulations.

Support Project Development Lifecycle: Collaborate with development teams to understand project requirements. Assist in designing and implementing data integration workflows. Implement and execute procedures for code migration and deployment of data pipelines. Troubleshoot issues during development and deployment phases.

Work with Enterprise Architects to Design and Implement Best Practices: Collaborate with solution architects to define and implement data integration strategies. Incorporate industry best practices for data governance, data quality, and performance. Participate in architectural reviews and contribute to platform enhancements.
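Not from the posting: a minimal sketch of the kind of host-level health check an IDMC administrator might schedule on the Linux nodes that run the platform's agents. The thresholds are arbitrary examples, and psutil is assumed to be available.

```python
# Illustrative Linux host health check: CPU, memory, and root-disk pressure.
import psutil

THRESHOLDS = {"cpu_percent": 85.0, "memory_percent": 90.0, "disk_percent": 80.0}

def check_host():
    metrics = {
        "cpu_percent": psutil.cpu_percent(interval=1),
        "memory_percent": psutil.virtual_memory().percent,
        "disk_percent": psutil.disk_usage("/").percent,
    }
    alerts = [f"{name}={value:.1f}% exceeds {THRESHOLDS[name]}%"
              for name, value in metrics.items() if value > THRESHOLDS[name]]
    return metrics, alerts

metrics, alerts = check_host()
print(metrics)
for alert in alerts:
    print("ALERT:", alert)   # in practice this would feed the alerting/observability stack
```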
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
Are you passionate about supply chain management and master data Do you thrive in a dynamic environment where accuracy and efficiency are key If you are ready to take your career to the next level, we have the perfect opportunity for you! Join us as an Associate Business Analyst at Novo Nordisk and help shape the future of our supply chain operations. As an Associate Business Analyst at Novo Nordisk, you will be responsible for creating and maintaining master data in SAP and Winshuttle in alignment with established business processes and rules. You will handle Change Requests (CR-cases) and Development Requests (DV) related to master data creation. Your responsibilities will include creating and maintaining master data for raw, semi-finished, and finished goods in SAP ECC and Winshuttle, managing material master data across the product life cycle, performing data cleansing, identifying process improvement opportunities, managing stakeholder relationships effectively, and contributing to training and onboarding of new joiners. To qualify for this role, you should have a Bachelor's degree in supply chain management, production, mechanical engineering, or equivalent from a well-recognized institute. You should have 5 to 7 years of experience within SAP master data, preferably within pharma or supply chain, and be proficient in S/4HANA, Winshuttle, SAP ECC, and Master Data Management. Additionally, you should have a good understanding of supply chain concepts, be a proficient user of Microsoft Office, and have experience in automation with advanced Excel or ETL knowledge. The Supply Chain department at Novo Nordisk focuses on business through offshoring and aims to consolidate Supply Chain activities across the organization. As part of the department, you will play a crucial role in optimizing costs and reducing complexity by operating an effective supply chain. Novo Nordisk is a leading global healthcare company committed to defeating serious chronic diseases. Our success relies on the collaboration of our employees worldwide, and we value the unique skills and perspectives they bring to the table. Join us at Novo Nordisk, where we work toward something bigger than ourselves and strive to make a collective impact on millions of lives worldwide. To apply for this position, please upload your CV online by the 21st of July 2025. Novo Nordisk is committed to an inclusive recruitment process and equality of opportunity for all job applicants.,
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
Pune, Maharashtra
On-site
Join us as a Product Test Engineer at Barclays where you'll spearhead the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences. As a part of a team of developers, you will deliver technology stack, using strong analytical and problem-solving skills to understand the business requirements and deliver quality solutions. To be successful as a Product Test Engineer you should have experience with: Hands-on experience in one or more technical skills under any of the technology platforms as below: - Mainframe: COBOL, IMS, CICS, DB2, VSAM, JCL, TWS, File-Aid, REXX - Open Systems and tools: Selenium, Java, Jenkins, J2EE, Web-services, APIs, XML, JSON, Parasoft/SoaTest Service Virtualization - API Testing Tools: SOAP UI, Postman, Insomnia - Mid-Tier technology: MQ, WebSphere, UNIX, API 3rd party hosting platforms - Data warehouse: ETL, Informatica, Ab-initio, Oracle, Hadoop - Good knowledge of API Architecture and API Concepts - Experience in JIRA and similar test management tools - Test Automation Skills - Hands-on Experience of Test Automation using Java or any other Object-Oriented Programming Language - Hands-on Experience of Automation Framework Creation and Optimization - Good understanding of Selenium, Appium, SeeTest, JQuery, JavaScript, and Cucumber - Experience of working Build tools like Apache Ant, Maven, Gradle - Knowledge/previous experience of DevOps and Continuous Integration using Jenkins, GIT, Dockers - Experience In API Automation Framework Like RestAssured, Karate Some other highly valued skills may include: - E2E Integration Testing Experience - Previous Barclays Experience - Understanding of Mainframes and Barclays Systems will be an added advantage - Understanding of Cloud Technologies like AWS, Azure - Hands-on Experience in Agile methodology - Domain/Testing/Technical certification will be an advantage You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based in Pune. Purpose of the Role: To design, develop, and execute testing strategies to validate functionality, performance, and user experience, while collaborating with cross-functional teams to identify and resolve defects, and continuously improve testing processes and methodologies, to ensure software quality and reliability. 
Accountabilities: - Development and implementation of comprehensive test plans and strategies to validate software functionality and ensure compliance with established quality standards - Creation and execution automated test scripts, leveraging testing frameworks and tools to facilitate early detection of defects and quality issues - Collaboration with cross-functional teams to analyze requirements, participate in design discussions, and contribute to the development of acceptance criteria, ensuring a thorough understanding of the software being tested - Root cause analysis for identified defects, working closely with developers to provide detailed information and support defect resolution - Collaboration with peers, participate in code reviews, and promote a culture of code quality and knowledge sharing - Stay informed of industry technology trends and innovations and actively contribute to the organization's technology communities to foster a culture of technical excellence and growth Analyst Expectations: To perform prescribed activities in a timely manner and to a high standard consistently driving continuous improvement. Requires in-depth technical knowledge and experience in their assigned area of expertise. They lead and supervise a team, guiding and supporting professional development, allocating work requirements, and coordinating team resources. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviors to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviors are: L - Listen and be authentic, E - Energize and inspire, A - Align across the enterprise, D - Develop others. OR for an individual contributor, they develop technical expertise in the work area, acting as an advisor where appropriate. Will have an impact on the work of related teams within the area. Partner with other functions and business areas. Takes responsibility for end results of a team's operational processing and activities. Escalate breaches of policies/procedures appropriately. Take responsibility for embedding new policies/procedures adopted due to risk mitigation. Advise and influence decision making within their area of expertise. Take ownership for managing risk and strengthening controls in relation to the work they own or contribute to. Deliver work and areas of responsibility in line with relevant rules, regulation, and codes of conduct. Maintain and continually build an understanding of how their own sub-function integrates with the function, alongside knowledge of the organization's products, services, and processes within the function. Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organization sub-function. Resolve problems by identifying and selecting solutions through the application of acquired technical experience and will be guided by precedents. Guide and persuade team members and communicate complex/sensitive information. Act as a contact point for stakeholders outside of the immediate function, while building a network of contacts outside the team and external to the organization. All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset to Empower, Challenge, and Drive the operating manual for how we behave.,
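As a hedged illustration of the API testing this role involves (not the team's actual framework), the check below expresses in Python, using the requests library, the kind of assertion a RestAssured or Karate test would make in Java. The endpoint, URL, and field names are hypothetical acceptance criteria, not a real contract.

```python
# Illustrative API test: verify status code and basic response schema.
import requests

BASE_URL = "https://api.example.test"   # placeholder environment URL

def test_get_account_returns_expected_schema():
    response = requests.get(
        f"{BASE_URL}/accounts/12345",
        headers={"Accept": "application/json"},
        timeout=10,
    )
    assert response.status_code == 200
    body = response.json()
    assert body["accountId"] == "12345"
    assert body["currency"] in {"GBP", "USD", "EUR"}
    assert "balance" in body
```

Such tests would typically run under pytest in a CI pipeline (e.g., Jenkins) alongside UI and service-virtualization suites.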
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Data Engineer specializing in Snowflake architecture, you will be responsible for designing and implementing scalable data warehouse architectures, including schema modeling and data partitioning. Your role will involve leading or supporting data migration projects to Snowflake from on-premise or legacy cloud platforms. You will develop ETL/ELT pipelines and integrate data using tools such as DBT, Fivetran, Informatica, and Airflow, and define and implement best practices for data modeling, query optimization, and storage efficiency within Snowflake.

Collaboration with cross-functional teams, including data engineers, analysts, BI developers, and stakeholders, will be crucial to align architectural solutions effectively. You will ensure data governance, compliance, and security by implementing RBAC, masking policies, and access control within Snowflake, and work closely with DevOps teams to enable CI/CD pipelines, monitoring, and infrastructure as code for Snowflake environments. Your role will also involve optimizing resource utilization, monitoring workloads, and managing the cost-effectiveness of the platform, while staying current with Snowflake features, cloud vendor offerings, and best practices to drive continuous improvement in data architecture.

Qualifications & Skills:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field
- 5+ years of experience in data engineering, data warehousing, or analytics architecture
- 3+ years of hands-on experience in Snowflake architecture, development, and administration
- Strong knowledge of cloud platforms such as AWS, Azure, or GCP
- Solid understanding of SQL, data modeling, and data transformation principles
- Experience with ETL/ELT tools, orchestration frameworks, and data integration
- Familiarity with data privacy regulations (GDPR, HIPAA, etc.) and compliance

Additional Qualifications:
- Snowflake certification (SnowPro Core / Advanced)
- Experience in building data lakes, data mesh architectures, or streaming data platforms
- Familiarity with tools like Power BI, Tableau, or Looker for downstream analytics
- Experience with Agile delivery models and CI/CD workflows

This role offers an exciting opportunity to work on cutting-edge data architecture projects and collaborate with diverse teams to drive impactful business outcomes.
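As an illustrative sketch of the RBAC and masking-policy work described above, the snippet below applies a masking policy and read-only grants through the snowflake-connector-python package; the account, credentials, roles, and table/column names are hypothetical placeholders, not the employer's environment.

```python
# Illustrative sketch only: masking policy plus role-based grants in Snowflake.
# All names and credentials below are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>",
    user="<user>",
    password="<password>",
    warehouse="ANALYTICS_WH",   # hypothetical warehouse
    database="DEMO_DB",         # hypothetical database
    schema="STAGING",           # hypothetical schema
)

statements = [
    # Mask customer e-mail for every role except a privileged reader role.
    """
    CREATE MASKING POLICY IF NOT EXISTS email_mask AS (val STRING) RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() IN ('PII_READER') THEN val ELSE '***MASKED***' END
    """,
    "ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY email_mask",
    # Role-based access control: read-only grants for a reporting role.
    "GRANT USAGE ON SCHEMA STAGING TO ROLE REPORTING_RO",
    "GRANT SELECT ON ALL TABLES IN SCHEMA STAGING TO ROLE REPORTING_RO",
]

with conn.cursor() as cur:
    for stmt in statements:
        cur.execute(stmt)
conn.close()
```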
Posted 2 weeks ago
5.0 - 12.0 years
0 Lacs
Hyderabad, Telangana
On-site
You are a highly experienced Senior Reporting & ETL Analyst responsible for designing, building, and optimizing data integration and reporting solutions. Your expertise in handling large datasets, performing complex ETL operations, developing interactive dashboards, and collaborating across teams to deliver insights is crucial for this role. Experience in the healthcare or insurance domain is essential.

Your key responsibilities include designing, developing, and maintaining robust ETL pipelines using Informatica, Azure Data Factory (ADF), and SSIS; creating interactive, visually rich dashboards using Power BI and Tableau to support business decision-making; and writing and optimizing advanced SQL queries to support complex data warehousing operations. You will collaborate with stakeholders from the healthcare and insurance domains to understand requirements and deliver actionable insights, participate in Agile/Scrum ceremonies, contribute to sprint planning, and ensure timely delivery of tasks.

Requirements:
- ETL tools: expert-level experience (10+ years) in Informatica, ADF, and SSIS
- Reporting tools: expert-level experience (8+ years) in Power BI and Tableau
- SQL and data warehousing: expert-level proficiency (12+ years) in writing complex SQL queries and supporting data warehouse design and maintenance
- Domain knowledge: deep understanding of healthcare and/or insurance data systems and regulations (6+ years)
- Agile: proven experience (5+ years) in Agile development environments and sprint-based work
Posted 2 weeks ago
3.0 - 8.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
We are seeking a highly skilled and experienced Snowflake Architect to take charge of designing, developing, and deploying enterprise-grade cloud data solutions. The ideal candidate will have a robust background in data architecture, cloud data platforms, and Snowflake implementation, with hands-on experience in end-to-end data pipeline and data warehouse design.

Your responsibilities will include leading the architecture, design, and implementation of scalable Snowflake-based data warehousing solutions; defining data modeling standards, best practices, and governance frameworks; and designing and optimizing ETL/ELT pipelines using tools such as Snowpipe, Azure Data Factory, Informatica, or DBT. You will collaborate with stakeholders to understand data requirements and translate them into robust architectural solutions, implement data security, privacy, and role-based access controls within Snowflake, and guide development teams on performance tuning, query optimization, and cost management. Ensuring high availability, fault tolerance, and compliance across data platforms, as well as mentoring developers and junior architects on Snowflake capabilities, will also fall under your purview.

Skills & Experience:
- 8+ years of overall experience in data engineering, BI, or data architecture, with a minimum of 3+ years of hands-on Snowflake experience
- Expertise in Snowflake architecture, data sharing, virtual warehouses, clustering, and performance optimization
- Strong proficiency in SQL, Python, and cloud data services (e.g., AWS, Azure, or GCP)
- Hands-on experience with ETL/ELT tools such as ADF, Informatica, Talend, DBT, or Matillion
- Good understanding of data lakes, data mesh, and modern data stack principles
- Experience with CI/CD for data pipelines, DevOps, and data quality frameworks is a plus
- Solid knowledge of data governance, metadata management, and cataloging

Preferred qualifications:
- Snowflake certification (e.g., SnowPro Core/Advanced Architect)
- Familiarity with Apache Airflow, Kafka, or event-driven data ingestion
- Knowledge of data visualization tools such as Power BI, Tableau, or Looker
- Experience in healthcare, BFSI, or retail domain projects

If you meet these requirements and are ready to take on a challenging and rewarding role as a Snowflake Architect, we encourage you to apply.
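For illustration only, the sketch below shows the kind of Snowflake DDL an architect in this role might apply for performance and cost management: a clustering key for partition pruning, an auto-suspending warehouse, and a Snowpipe definition for continuous ingestion. All stage, table, and warehouse names are hypothetical.

```python
# Illustrative sketch only: performance- and cost-oriented Snowflake DDL.
# Stage, table, warehouse names, and credentials are hypothetical placeholders.
import snowflake.connector

ddl_statements = [
    # Clustering key on the columns most queries filter by, to improve pruning.
    "ALTER TABLE sales_fact CLUSTER BY (sale_date, region)",
    # Cost control: a small warehouse that suspends after 60 seconds of idling.
    """
    CREATE WAREHOUSE IF NOT EXISTS REPORTING_WH
      WAREHOUSE_SIZE = 'XSMALL' AUTO_SUSPEND = 60 AUTO_RESUME = TRUE
    """,
    # Continuous ingestion from an external stage via Snowpipe.
    """
    CREATE PIPE IF NOT EXISTS sales_pipe AUTO_INGEST = TRUE AS
      COPY INTO sales_fact FROM @landing_stage/sales/
      FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
    """,
]

conn = snowflake.connector.connect(
    account="<account_identifier>",
    user="<user>",
    password="<password>",
    database="DEMO_DB",
    schema="ANALYTICS",
)
try:
    with conn.cursor() as cur:
        for stmt in ddl_statements:
            cur.execute(stmt)
finally:
    conn.close()
```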
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
Karnataka
On-site
You will be responsible for the practice in Asia, taking accountability for driving quality, sales, recruiting, account management, consulting, and all operational aspects. Your primary responsibilities will include:
- Driving overall growth of the practice area through a combination of business development, pre-sales and estimating, delivery work, and thought leadership
- Maximizing team performance by implementing an effective team approach that enhances productivity and job satisfaction
- Managing the allocation of offshore resources to local projects
- Managing engagement risk, project economics, planning and budgeting, defining deliverable content, and ensuring buy-in of proposed solutions from top management levels at the client
- Managing and delivering MuleSoft engagements while building the practice
- Ensuring profitability of all MuleSoft offerings, with revenue management expectations
- Building and developing relationships with MuleSoft Alliance and Sales teams
- Owning joint sales pursuits in partnership with MuleSoft
- Identifying opportunities for growth and maturation of MuleSoft offerings
- Providing oversight and governance of all sold and managed MuleSoft projects
- Driving business development with the necessary information, tools, and subject matter expertise to sell engagements within the offering
- Building and developing relationships/partnerships with local market teams, aligning on sales pursuits, resource capacity and capabilities, and awareness across the region
- Developing or supporting the development of case studies and training materials, conducting brown bags, and providing guidance for the MuleSoft Practice
- Developing or supporting the development and delivery of best practices, delivery templates, and point-of-view papers
- Overseeing quality assurance of project delivery
- Facilitating client satisfaction surveys where applicable
- Ensuring alignment of global resources to projects based on appropriate skills and availability, while being responsible for the overall utilization numbers of the team
- Supporting recruiting and onboarding of new employees

Profile:
- Minimum Bachelor's Degree in Software Development or Engineering
- 10+ years of experience in a large consulting environment
- Deep technical understanding in the Integration and API Management space
- 6+ years of prior experience leading projects built on integration platforms, preferably MuleSoft, Boomi, Informatica, or TIBCO
- Expert at project delivery, including all aspects of program management and the SDLC
- Expert in business development skills and managing relationships with clients and internal stakeholders
- Expert in communication, both verbal and written
- Expert in business operations such as invoicing, SOWs, margins, and utilization
- Skilled at managing multiple clients
- Excellent mentoring and leadership skills
Posted 2 weeks ago
8.0 years
0 Lacs
Greater Kolkata Area
On-site
Job Description
We are seeking a seasoned MS Dynamics CRM Developer with deep expertise in Dynamics 365 Online solutions and custom development. The ideal candidate should have a strong background in integrating CRM applications, creating plugins, working with Power Platform tools, and providing architectural and functional guidance to project teams.

Key Responsibilities
- Design, develop, and troubleshoot CRM applications using Microsoft Dynamics 365
- Implement Dynamics 365 Online solutions with a strong understanding of architecture, capabilities, and deployment
- Develop JavaScript and HTML web resources using Angular, jQuery, etc.
- Create and customize plugins, workflows, PCF controls, Power Automate flows, and Power Apps (canvas/model-driven)
- Integrate Dynamics CRM with external systems using CRM APIs, REST/OData, and SOAP endpoints
- Develop reusable components using .NET, Azure Functions, and Logic Apps
- Configure security groups, roles, and team-based access controls
- Work with source control systems including Git, Azure DevOps, and TFS
- Clearly articulate solution designs and ensure alignment with customer requirements
- Maintain clear and organized documentation for all technical solutions

Must-Have Skills
- 8+ years of hands-on experience in Microsoft Dynamics CRM development
- Minimum 4 years of recent experience with Dynamics 365 Online implementations
- Strong experience in .NET, JavaScript, Power Platform, and Azure integrations
- Proficiency with Visual Studio and source control tools (Git, TFS, Azure DevOps)
- Deep functional knowledge of core D365 CRM modules
- Good understanding of security configuration in D365
- Excellent communication skills (written & verbal)
- Experience with IICS (Informatica Intelligent Cloud Services) or CAI (Cloud Application Integration)

Qualifications:
- Microsoft D365 Certifications
- Experience in Dynamics CRM integrations and tools
- Familiarity with DevOps pipelines and deployment best practices

(ref:hirist.tech)
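As a hedged illustration of the REST/OData integration skill listed above, the sketch below queries Dynamics 365 accounts through the Dataverse Web API using the requests library; the organization URL, bearer token, and selected fields are placeholders, and a real token would be obtained separately via Azure AD/MSAL.

```python
# Illustrative sketch only: an OData query against the Dynamics 365 (Dataverse)
# Web API. The org URL, token, and query are hypothetical placeholders.
import requests

ORG_URL = "https://yourorg.crm.dynamics.com"   # placeholder environment URL
ACCESS_TOKEN = "<oauth2-bearer-token>"         # placeholder token

headers = {
    "Authorization": f"Bearer {ACCESS_TOKEN}",
    "OData-MaxVersion": "4.0",
    "OData-Version": "4.0",
    "Accept": "application/json",
}

# Standard OData query options keep the payload small: select two columns,
# filter to active records (statecode eq 0), and cap the result at 10 rows.
query = "$select=name,accountnumber&$filter=statecode%20eq%200&$top=10"
resp = requests.get(f"{ORG_URL}/api/data/v9.2/accounts?{query}", headers=headers)
resp.raise_for_status()

for account in resp.json().get("value", []):
    print(account.get("name"), account.get("accountnumber"))
```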
Posted 2 weeks ago
6170 Jobs | New Delhi