
219 ETL Processes Jobs - Page 5

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

2.0 - 7.0 years

3 - 6 Lacs

Pune, Maharashtra, India

On-site


Description: We are seeking a skilled Marketing and Insights Manager (MIM) to join our team in India. The ideal candidate will be responsible for developing and executing marketing strategies that enhance brand presence and drive business growth.

Responsibilities:
- Develop and implement marketing strategies to drive brand awareness and engagement.
- Conduct market research and analyze trends to identify new opportunities for growth.
- Collaborate with cross-functional teams to create cohesive marketing campaigns.
- Manage social media platforms and create content to engage with target audiences.
- Monitor and report on the performance of marketing initiatives and optimize as needed.

Skills and Qualifications:
- Bachelor's degree in Marketing, Business Administration, or a related field.
- 2-7 years of experience in marketing or related roles.
- Strong analytical skills with the ability to interpret data and make data-driven decisions.
- Proficient in digital marketing tools and platforms, including SEO, PPC, and social media marketing.
- Excellent communication and interpersonal skills, with the ability to work collaboratively in a team environment.

Posted 2 weeks ago

Apply

6.0 - 8.0 years

2 - 5 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site


Job Summary: The Azure Data Engineer (Standard) is a senior-level role responsible for designing and implementing complex data processing solutions on the Azure platform. They work with other data engineers and architects to develop scalable, reliable, and efficient data pipelines that meet business requirements.

Job Description:
- Experience with Databricks, Spark, SQL, and handling large volumes of data.
- Strong experience in data migration, ETL, and data integration processes.
- Knowledge of data warehousing and data lake concepts.
- Familiarity with CI/CD processes for data workflows.
- Experience with Databricks-specific features such as Delta Lake and Databricks SQL.
- Ability to communicate effectively with both technical and non-technical stakeholders.
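For context on what this kind of pipeline work involves: a core Delta Lake operation is the merge/upsert (MERGE INTO), which applies incoming changes to a target table by key. A minimal plain-Python sketch of that pattern, with illustrative table contents (the field names are invented, not from the listing):

```python
def merge_upsert(target, updates, key="id"):
    """Upsert rows from `updates` into `target`, matching on `key`
    (a plain-Python analogue of Delta Lake's MERGE INTO)."""
    index = {row[key]: i for i, row in enumerate(target)}
    for row in updates:
        if row[key] in index:
            target[index[row[key]]].update(row)  # matched -> update in place
        else:
            target.append(row)                   # not matched -> insert
    return target

customers = [{"id": 1, "city": "Pune"}, {"id": 2, "city": "Chennai"}]
changes = [{"id": 2, "city": "Bengaluru"}, {"id": 3, "city": "Noida"}]
merged = merge_upsert(customers, changes)
```

In Databricks itself this would be a single `MERGE INTO target USING updates ON ...` SQL statement; the sketch only shows the matching logic.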

Posted 2 weeks ago

Apply

6.0 - 10.0 years

4 - 7 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site


Responsibilities:
- Develop and maintain complex Power BI reports and dashboards to provide actionable insights for various stakeholders.
- Conduct in-depth data analysis using SQL to extract, clean, and transform large datasets.
- Collaborate with business users to understand their reporting needs and translate them into effective data visualizations.
- Identify trends, patterns, and anomalies in data to uncover opportunities for improvement.
- Develop and implement data quality standards and processes.
- Automate routine reporting and analysis tasks through scripting or other means.
- Provide data-driven recommendations to improve business performance.
- Mentor and guide junior data analysts.

Qualifications:
- Bachelor's degree in Computer Science, Statistics, Mathematics, or a related field.
- Minimum of 8 years of experience in data analysis and reporting.
- Advanced proficiency in Power BI, including data modeling, DAX, and report development.
- Strong SQL skills, including writing complex queries and performance optimization.
- Experience with Python programming (preferred).
- Excellent analytical and problem-solving skills.
- Strong attention to detail and accuracy.
- Ability to communicate complex technical information to non-technical audiences.
- Experience working in an Agile environment (preferred).

Desired Skills:
- Experience with data visualization tools (Power BI, etc.).
- Knowledge of data warehousing and ETL processes.
- Experience with cloud-based data platforms (AWS, Azure, GCP).
- Understanding of statistical methods and data mining techniques.

Posted 2 weeks ago

Apply

6.0 - 11.0 years

4 - 7 Lacs

Gurgaon / Gurugram, Haryana, India

On-site


Description: We are seeking an experienced SAS Developer with 6-11 years of expertise in Base SAS, Macros, SQL, and PROC SQL to join our dynamic team. The ideal candidate will be responsible for developing and maintaining robust SAS programs to support data analysis and reporting needs. You will play a key role in transforming data into actionable insights and collaborating with various teams to enhance our data-driven decision-making processes.

Responsibilities:
- Develop and maintain SAS programs using Base SAS, Macros, and SQL for data analysis and reporting.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
- Perform data extraction, transformation, and loading (ETL) processes using SAS tools.
- Create and optimize complex queries using PROC SQL to manipulate and analyze large datasets.
- Ensure data integrity and accuracy by performing thorough testing and validation of SAS programs.
- Generate insightful reports and dashboards to support business decision-making.
- Document technical processes, workflows, and procedures for future reference and knowledge sharing.
- Troubleshoot and resolve any issues related to SAS programming and data processing.

Skills and Qualifications:
- Bachelor's or Master's degree in Computer Science, Statistics, Mathematics, or a related field.
- 6-11 years of hands-on experience in SAS programming, specifically with Base SAS, Macros, and SQL.
- Strong proficiency in PROC SQL for data manipulation and analysis.
- Experience in data management and ETL processes using SAS tools.
- Familiarity with SAS Enterprise Guide and SAS Studio is a plus.
- Knowledge of statistical analysis techniques and methodologies.
- Ability to work with large datasets and perform data cleaning and validation.
- Excellent problem-solving skills and attention to detail.
- Strong communication skills to effectively collaborate with team members and stakeholders.

Posted 2 weeks ago

Apply

6.0 - 10.0 years

3 - 6 Lacs

Noida, Uttar Pradesh, India

On-site


Description: Infogain is seeking a skilled SAS Developer with 6-10 years of experience to join our dynamic team. This is a hybrid work position based in India. The ideal candidate will have a strong background in SAS programming, data analysis, and a passion for delivering high-quality data solutions to our clients. You will play a critical role in analyzing complex data sets, generating insights, and assisting in decision-making processes.

Responsibilities:
- Develop, test, and implement SAS programs for data analysis and reporting.
- Collaborate with data analysts and business users to gather requirements and translate them into technical specifications.
- Optimize and enhance existing SAS programs to improve performance and efficiency.
- Design and maintain data models and databases to support data analysis.
- Ensure data integrity and accuracy throughout the data processing lifecycle.
- Prepare and present reports and visualizations to stakeholders.

Skills and Qualifications:
- Proficient in SAS programming and analytics tools.
- Strong knowledge of SQL and experience with relational databases.
- Experience with data visualization tools like Tableau or Power BI is a plus.
- Familiarity with ETL processes and tools.
- Understanding of statistical analysis and data mining techniques.
- Excellent problem-solving and analytical skills.
- Ability to work in a collaborative team environment and communicate effectively with non-technical stakeholders.

Posted 2 weeks ago

Apply

6.0 - 11.0 years

2 - 5 Lacs

Noida, Uttar Pradesh, India

On-site


- Good hands-on experience in Base SAS, SAS Macros, and other SAS tools.
- Strong skill in writing and debugging SAS SQL/PROC SQL code.
- Experience in writing and optimizing SAS programs for data extraction, transformation, and analysis.
- Experience in handling large datasets and performing data cleansing.
- Knowledge of PySpark is a plus.
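For readers unfamiliar with PROC SQL: it embeds standard SQL inside a SAS program, so the same aggregation logic translates directly to any SQL engine. A rough analogue using Python's built-in sqlite3 module (the claims table and its columns are invented for illustration):

```python
import sqlite3

# Build a tiny in-memory table to query against.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (region TEXT, amount REAL)")
conn.executemany("INSERT INTO claims VALUES (?, ?)",
                 [("North", 100.0), ("North", 50.0), ("South", 75.0)])

# Equivalent in spirit to:
#   PROC SQL;
#     SELECT region, SUM(amount) AS total
#     FROM claims GROUP BY region;
#   QUIT;
rows = conn.execute(
    "SELECT region, SUM(amount) AS total FROM claims "
    "GROUP BY region ORDER BY region"
).fetchall()
```

The SAS-specific parts of the skill set (macros, data steps) have no direct SQL equivalent; this only illustrates the PROC SQL portion.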

Posted 2 weeks ago

Apply

4.0 - 9.0 years

4 - 7 Lacs

Delhi NCR, India

On-site


Description: We are seeking an experienced Data Engineer with 4-9 years of hands-on experience to join our dynamic team. The ideal candidate will be responsible for designing and implementing robust data pipelines and ensuring data quality for analytics and reporting purposes. You will work closely with cross-functional teams to support data-driven decision-making and enhance our data infrastructure. If you are passionate about data and possess strong technical skills, we would love to hear from you.

Responsibilities:
- Design, develop, and maintain scalable data pipelines and ETL processes to support business analytics and reporting needs.
- Collaborate with data scientists and analysts to understand data requirements and ensure the availability of clean, structured, and reliable data.
- Optimize and automate data ingestion, transformation, and storage processes.
- Monitor and troubleshoot data pipeline performance and implement improvements as necessary.
- Implement data governance and data quality best practices to ensure data integrity and security.
- Work with cloud-based data platforms (e.g., AWS, Azure, Google Cloud) to manage data infrastructure and storage solutions.

Skills and Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 4-9 years of experience as a Data Engineer or in a similar role.
- Proficiency in programming languages such as Python, Java, or Scala for data processing and manipulation.
- Experience with SQL and NoSQL databases (e.g., MySQL, PostgreSQL, MongoDB, Cassandra).
- Strong knowledge of data warehousing concepts and experience with ETL tools (e.g., Apache NiFi, Talend, Informatica).
- Familiarity with big data technologies (e.g., Hadoop, Spark, Kafka) and data processing frameworks.
- Experience with cloud platforms and services (e.g., AWS Redshift, Google BigQuery, Azure Data Lake).
- Understanding of data modeling and data architecture principles.
- Excellent problem-solving skills and ability to work with complex data sets.
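The "data quality best practices" item above typically means automated checks run inside the pipeline. A minimal sketch of such a validator in plain Python, assuming simple rule types (the column names and rules are made up for illustration):

```python
def check_quality(rows, required, not_null):
    """Return a list of data-quality issues found in `rows`:
    missing columns and null values in mandatory fields."""
    issues = []
    for i, row in enumerate(rows):
        for col in required:
            if col not in row:
                issues.append((i, col, "missing column"))
            elif col in not_null and row.get(col) in (None, ""):
                issues.append((i, col, "null value"))
    return issues

records = [
    {"order_id": 1, "amount": 250.0},      # clean row
    {"order_id": None, "amount": 99.0},    # null in a mandatory field
    {"amount": 10.0},                      # missing key column entirely
]
problems = check_quality(records,
                         required=["order_id", "amount"],
                         not_null=["order_id"])
```

Production pipelines usually delegate this to a framework (e.g. Great Expectations or dbt tests) rather than hand-rolled checks, but the rule structure is the same.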

Posted 2 weeks ago

Apply

6.0 - 11.0 years

4 - 7 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site


Description: We are seeking a skilled Azure Data Engineer with 6-11 years of experience to join our dynamic team. The ideal candidate will be responsible for designing, building, and maintaining data solutions on the Azure platform. You will work closely with cross-functional teams to ensure the efficient processing and management of data, while also driving data-driven decision-making within the organization. If you are passionate about data and have a strong technical background in Azure services, we would love to hear from you.

Responsibilities:
- Design and implement data solutions using Azure services such as Azure Data Factory, Azure Databricks, and Azure SQL Database.
- Develop and maintain data pipelines for data ingestion, transformation, and storage.
- Ensure data quality and integrity by implementing data validation and cleansing processes.
- Collaborate with data scientists and analysts to understand data requirements and provide data access.
- Optimize performance of data processing and storage solutions in Azure.
- Monitor and troubleshoot data workflows and pipelines to ensure reliability and efficiency.
- Implement security and compliance measures for data handling and storage.
- Document data architecture and processes for future reference and onboarding.

Skills and Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 6-11 years of experience in data engineering or a related field.
- Strong experience with Azure data services (Azure Data Factory, Azure Databricks, Azure Synapse Analytics).
- Proficiency in SQL and experience with relational databases (e.g., Azure SQL Database, SQL Server).
- Knowledge of programming languages such as Python or Scala for data processing and ETL tasks.
- Experience with data modeling and database design principles.
- Familiarity with big data technologies (e.g., Apache Spark, Hadoop) and data warehousing concepts.
- Understanding of data governance and best practices in data security.
- Experience with CI/CD processes and DevOps practices for data solutions.

Posted 2 weeks ago

Apply

4.0 - 9.0 years

3 - 7 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site


Job Description:
- Experience with Databricks, PySpark, and handling large volumes of data.
- Strong experience in data migration, ETL, and data integration processes.
- Proficient in Python, SQL, PySpark, and Scala (knowledge of Java is a plus).
- Knowledge of data warehousing and data lake concepts.
- Familiarity with CI/CD processes for data workflows.
- Experience with Databricks-specific features such as Delta Lake and Databricks SQL.
- Ability to communicate effectively with both technical and non-technical stakeholders.

Posted 2 weeks ago

Apply

2.0 - 8.0 years

2 - 8 Lacs

Hyderabad / Secunderabad, Telangana, India

On-site


Description: We are looking for a skilled Analyst to join our team in India. The ideal candidate will have 2-8 years of experience in data analysis and will be responsible for analyzing complex data sets, developing reports, and providing insights to support business decisions.

Responsibilities:
- Analyze and interpret complex data sets to provide actionable insights.
- Develop and maintain dashboards and reports for management.
- Assist in the preparation of presentations for stakeholders.
- Collaborate with cross-functional teams to drive data-driven decision making.
- Identify trends and patterns in data to support strategic initiatives.

Skills and Qualifications:
- Bachelor's degree in Finance, Economics, Statistics, or a related field.
- Proficiency in data analysis tools such as SQL, Python, or R.
- Experience with data visualization tools such as Tableau or Power BI.
- Strong analytical skills with attention to detail.
- Excellent communication skills, both verbal and written.
- Ability to work independently and as part of a team.
- Familiarity with statistical analysis and modeling techniques.

Skillset:
- Learning-domain experience is required; LMS (Learning Management System) experience is a must. Cornerstone (CSOD) experience is good to have and would be especially beneficial.
- The candidate must relocate to Hyderabad and adapt to a hybrid working model.
- The candidate must have good communication skills.
- Experience in the Pharma/Biopharma industry is an added advantage.

Posted 2 weeks ago

Apply

0.0 - 5.0 years

0 - 5 Lacs

Pune, Maharashtra, India

On-site


In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your role and responsibilities: As a Software Developer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
- Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements.
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors.
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Very good experience with the Continuous Flow Graph tool used for point-based development.
- Design, develop, and maintain ETL processes using Ab Initio tools.
- Write, test, and deploy Ab Initio graphs, scripts, and other necessary components.
- Troubleshoot and resolve data processing issues and improve performance.
- Data Integration: Extract, transform, and load data from various sources into data warehouses, operational data stores, or other target systems. Work with different data formats, including structured, semi-structured, and unstructured data.

Preferred technical and professional experience:
- You thrive on teamwork and have excellent verbal and written communication skills.
- Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions.
- Ability to communicate results to technical and non-technical audiences.
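A recurring step in the data-integration work described above is turning semi-structured records (e.g. nested JSON) into flat rows a warehouse can store. A minimal plain-Python sketch of that flattening, with an invented sample record (Ab Initio itself does this with graph components, not Python):

```python
import json

def flatten(record, parent=""):
    """Flatten a nested JSON record into a single dict with dotted
    keys, a common step when loading semi-structured data."""
    flat = {}
    for key, value in record.items():
        name = f"{parent}.{key}" if parent else key
        if isinstance(value, dict):
            flat.update(flatten(value, name))  # recurse into nested objects
        else:
            flat[name] = value
    return flat

raw = json.loads('{"id": 7, "address": {"city": "Pune", "pin": "411001"}}')
row = flatten(raw)
```

Lists and deeply repeated structures need extra handling (exploding into child tables), which this sketch deliberately omits.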

Posted 2 weeks ago

Apply

3.0 - 10.0 years

5 - 14 Lacs

Ahmedabad, Gujarat, India

On-site


Description: We are seeking a skilled Data Analyst to join our team in India. The ideal candidate will have 3-10 years of experience in data analysis, with a strong background in extracting insights from data to drive business decisions.

Responsibilities:
- Analyze and interpret complex data sets to inform strategic decision-making.
- Develop and maintain dashboards and reports to track key performance metrics.
- Collaborate with cross-functional teams to understand data needs and provide insights.
- Conduct data validation and ensure data integrity across multiple sources.
- Utilize statistical methods to analyze data trends and patterns.

Skills and Qualifications:
- Bachelor's degree in Data Science, Statistics, Mathematics, Computer Science, or a related field.
- Proficiency in data analysis tools such as SQL, R, Python, or Excel.
- Experience with data visualization tools like Tableau, Power BI, or similar.
- Strong analytical and problem-solving skills with attention to detail.
- Excellent communication skills to present findings and insights clearly.

Posted 2 weeks ago

Apply

8.0 - 10.0 years

1 Lacs

Chennai, Tamil Nadu, India

On-site


As a Technical Specialist - Azure Databricks, you will play a crucial role in designing, building, and maintaining scalable data pipelines and infrastructure. You will work with a variety of technologies, including Scala, Python, Spark, AWS (Amazon Web Services) services, and SQL, to support our data processing and analytics needs.

Responsibilities:
- Develop robust data platforms and contribute to the organization's data-driven growth.
- Utilize expertise in Apache Airflow for orchestrating and automating complex data workflows.
- Help the other team members in developing solutions.
- Design solutions on Databricks, including using Delta Lake/Delta tables, data warehouses, and more.
- Apply best practices while developing the solutions.
- Work towards creating reusable components.
- Ensure that the pipelines are designed to keep the operating cost low.

Educational Qualifications: Engineering Degree BE/ME/BTech/MTech/BSc/MSc. Technical certification in multiple technologies is desirable.

Skills - Mandatory Technical Skills:
- Hands-on experience in designing, developing, and optimizing scalable large-volume pipelines to support the processing of structured and semi-structured data.
- In-depth knowledge of SQL, PySpark, Databricks, Delta Lake, Delta tables, Azure Data Factory, Azure services, Python, DevOps and Git, and relational databases (Oracle, SQL Server, and more).
- Understanding of designing and implementing data solutions involving real-time streaming technologies, such as Apache Kafka, AWS MSK, Kinesis, and Azure Event Hubs, ensuring seamless integration of streaming data into processing pipelines.
- Exposure to CI/CD pipelines for Azure and AWS resources.
- Excellent communication skills, with the ability to effectively convey complex analytical findings.
- Understanding of FinOps/observability patterns and data governance best practices.
- AWS/GCP/Azure.
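On the Airflow orchestration item above: Airflow models a workflow as a directed acyclic graph of tasks and runs them in dependency order. The core idea, a topological sort over task dependencies, can be sketched with Python's standard-library graphlib (the task names are invented; a real Airflow DAG would use `airflow.DAG` and operators, not this):

```python
from graphlib import TopologicalSorter

# Each task maps to the set of tasks it depends on,
# mirroring Airflow's `upstream >> downstream` wiring.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "quality_check": {"transform"},
    "load": {"quality_check"},
}
order = list(TopologicalSorter(dag).static_order())
```

Airflow adds scheduling, retries, and parallel execution of independent branches on top of this ordering; the sketch only shows the dependency resolution.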

Posted 2 weeks ago

Apply

8.0 - 10.0 years

1 Lacs

Chennai, Tamil Nadu, India

On-site


As a Solution Architect - Snowflake, you will play a crucial role in designing, building, and maintaining scalable data pipelines and infrastructure. You will work with a variety of technologies, including Scala, Python, Spark, AWS services, and SQL, to support our data processing and analytics needs.

Responsibilities:
- Collaborate with stakeholders to finalize the scope of enhancements and development projects, and gather detailed requirements.
- Apply expertise in ETL/ELT processes and tools to design and implement data pipelines that fulfil business requirements.
- Provide expertise as a technical resource to solve complex business issues that translate into data integration and database systems designs.
- Migrate and modernize existing legacy ETL jobs for Snowflake, ensuring data integrity and optimal performance.
- Analyze existing ETL jobs and identify opportunities for creating reusable patterns and components to expedite future development.
- Develop and implement a configuration-driven data ingestion framework that enables efficient onboarding of new source tables.
- Collaborate with cross-functional teams, including business analysts and solution architects, to align data engineering initiatives with business goals.
- Drive continuous improvement initiatives to enhance data engineering processes, tools, and frameworks.
- Ensure compliance with data quality, security, and privacy standards across all data engineering activities.
- Participate in code reviews, provide constructive feedback, and ensure high-quality, maintainable code.
- Prepare and present technical documentation, including data flow diagrams, ETL specifications, and architectural designs.

Educational Qualifications: Engineering Degree BE/ME/BTech/MTech/BSc/MSc. Cloud certifications (AWS, etc.) and relevant technical certification in multiple technologies are desirable.

Skills - Mandatory Technical Skills:
- Strong experience in Snowflake; must have executed development and migration projects involving Snowflake.
- Strong working experience with ETL tools (Matillion/DBT/Fivetran/ADF preferably).
- Experience in SQL writing, including flattening tables; experience with JSON is good to have; able to write complex queries.
- Strong understanding of SQL queries, good coding experience in Python, and experience deploying data warehousing pipelines into Snowflake.
- Experience with large databases.
- Working knowledge of AWS (S3, KMS, and more) or Azure/GCP.
- Design, develop, and thoroughly test new ETL/ELT code, ensuring accuracy, reliability, and adherence to best practices.
- Snowflake, Python/Spark/JavaScript, AWS/Azure/GCP, SQL.

Good to have skills:
- CI/CD (DevOps)
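On the "configuration-driven data ingestion framework" responsibility above: a common approach is to describe each source table in a config and render the load SQL from it, so onboarding a new table means adding config, not code. A toy sketch that renders a Snowflake-style MERGE statement from such a config (all table and column names are placeholders, not from the listing):

```python
def merge_sql(cfg):
    """Render a Snowflake-style MERGE statement from a table config.
    A toy version of a configuration-driven ingestion framework."""
    cols = cfg["columns"]
    on = " AND ".join(f"t.{k} = s.{k}" for k in cfg["keys"])
    sets = ", ".join(f"t.{c} = s.{c}" for c in cols if c not in cfg["keys"])
    return (
        f"MERGE INTO {cfg['target']} t USING {cfg['staging']} s ON {on} "
        f"WHEN MATCHED THEN UPDATE SET {sets} "
        f"WHEN NOT MATCHED THEN INSERT ({', '.join(cols)}) "
        f"VALUES ({', '.join('s.' + c for c in cols)})"
    )

cfg = {"target": "dw.orders", "staging": "stg.orders",
       "keys": ["order_id"], "columns": ["order_id", "status", "amount"]}
sql = merge_sql(cfg)
```

A real framework would add type mapping, incremental watermarks, and execution against Snowflake (e.g. via the Snowflake connector); this only shows the config-to-SQL rendering idea.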

Posted 2 weeks ago

Apply

11.0 - 13.0 years

1 Lacs

Pune, Maharashtra, India

On-site


As a Senior Solution Architect - Azure Databricks, you will play a crucial role in designing, building, and maintaining scalable data pipelines and infrastructure. You will work with a variety of technologies, including Scala, Python, Spark, AWS services, and SQL, to support our data processing and analytics needs.

Responsibilities:
- Enable end-to-end program planning/WBS and execution.
- Guide the team technically and lead technical reviews with the project team; client-facing.
- Drive technical/solution and architecture reviews with the client business and IT stakeholders.
- Deploy CitiusTech accelerators, reusable assets, and frameworks for the engagements.
- Proactively identify technical/solution delivery risks and plan mitigation steps.
- Work with both technical and business users, platform/product partners, and other stakeholders to design and deliver quality outcomes for data on Cloud/platform modernization engagements.

Educational Qualifications: Engineering Degree BE/ME/BTech/MTech/BSc/MSc. Technical certification in multiple technologies is desirable. Should be certified on at least one of the data platforms (Snowflake or Databricks), plus a hyperscaler/cloud certification.

Skills - Mandatory Technical Skills:
- Over 5 years of experience in cloud data architecture and analytics.
- Proficient in Azure, Snowflake, SQL, and DBT.
- Extensive experience in designing and developing data integration solutions using DBT and other data pipeline tools.
- Strong understanding of data validation, cleansing, and enrichment processes.
- Experience in creating low-level design documents and unit test strategies.
- Ability to perform code reviews and unit test plan reviews.
- Excellent communication and teamwork skills.
- Self-initiated problem solver with a strong sense of ownership.
- At least 12 to 13 years of IT experience, primarily in leading tech delivery of data and analytics programs.
- Should have led at least 3 large legacy EDW/data platform modernization and migration engagements to Snowflake/Databricks/data on Cloud in the last 5 years.
- Experience in leading all aspects of the project/program life cycle, including strategy, roadmap, architecture, design, development, and implementation for multi-phase/multi-year engagements and rollouts.
- Should have led large teams of up to 50 members.
- Good to have: experience in DataOps.
- Snowflake, Python/Spark/JavaScript, AWS/Azure/GCP, SQL.

Good to Have Skills:
- CI/CD (DevOps), Terraform/Airflow

Posted 2 weeks ago

Apply

5.0 - 8.0 years

0 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site


As a Technical Lead - Data Build Tool (DBT), you will be part of an Agile team building healthcare applications and implementing new features while adhering to the best coding development standards.

Responsibilities:
- Design and develop data integration solutions using Data Build Tool (DBT) and Azure Data Factory (ADF) ETL tools.
- Create DBT models and maintain dependencies across the models.
- Build and maintain the CI/CD pipeline for DBT deployments.
- Write custom test scripts and implement various hooks techniques.
- Integrate various sources using Snowpipe (GCP, AWS, Azure).
- Analyze and validate data in the Snowflake warehouse.
- Build the metric model at the semantic layer.
- Work with complex SQL functions and enable transformation of data on large data sets in Snowflake.
- Design and implement data models; develop pipelines for incremental data updates and historical data capture; optimize performance; ensure data quality; and collaborate with team members to support the data needs.
- Collaborate with cross-functional teams to understand business requirements and design effective cloud migration solutions.
- Perform data validation, testing, and troubleshooting during and after migration to ensure data integrity and quality.

Experience: 5 - 8 Years
Location: Bangalore / Pune / Mumbai / Chennai
Educational Qualifications: Engineering Degree BE/ME/BTech/MTech/BSc/MSc. Technical certification in multiple technologies is desirable.

Skills - Mandatory Technical Skills:
- DBT, Snowflake, DevOps
- Strong verbal and written communication skills along with problem-solving skills

Good to have skills:
- ETL/ELT, Azure Data Factory, cloud migration experience, US healthcare
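On the "pipelines for incremental data updates" responsibility: a dbt incremental model only processes rows newer than the target's current high-water mark, then merges them in by key. A minimal plain-Python sketch of that logic (the column names `id`/`updated` are illustrative; in dbt this is SQL with `is_incremental()` and a `unique_key` config):

```python
from datetime import date

def incremental_load(existing, source, key, updated_col):
    """Apply only source rows newer than the current high-water mark,
    mimicking a dbt incremental model with a merge strategy."""
    watermark = max((r[updated_col] for r in existing), default=date.min)
    fresh = [r for r in source if r[updated_col] > watermark]
    by_key = {r[key]: r for r in existing}
    for r in fresh:
        by_key[r[key]] = r  # insert new keys, overwrite changed rows
    return sorted(by_key.values(), key=lambda r: r[key])

target = [{"id": 1, "updated": date(2024, 1, 1)}]
src = [{"id": 1, "updated": date(2024, 1, 1)},   # at watermark: skipped
       {"id": 2, "updated": date(2024, 2, 1)}]   # newer: applied
rows = incremental_load(target, src, key="id", updated_col="updated")
```

The watermark filter is what keeps incremental runs cheap on large tables: only the delta since the last run is scanned and merged.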

Posted 2 weeks ago

Apply

5.0 - 8.0 years

0 Lacs

Pune, Maharashtra, India

On-site


As a Technical Lead - Data Build Tool (DBT), you will be part of an Agile team building healthcare applications and implementing new features while adhering to the best coding development standards.

Responsibilities:
- Design and develop data integration solutions using Data Build Tool (DBT) and Azure Data Factory (ADF) ETL tools.
- Create DBT models and maintain dependencies across the models.
- Build and maintain the CI/CD pipeline for DBT deployments.
- Write custom test scripts and implement various hooks techniques.
- Integrate various sources using Snowpipe (GCP, AWS, Azure).
- Analyze and validate data in the Snowflake warehouse.
- Build the metric model at the semantic layer.
- Work with complex SQL functions and enable transformation of data on large data sets in Snowflake.
- Design and implement data models; develop pipelines for incremental data updates and historical data capture; optimize performance; ensure data quality; and collaborate with team members to support the data needs.
- Collaborate with cross-functional teams to understand business requirements and design effective cloud migration solutions.
- Perform data validation, testing, and troubleshooting during and after migration to ensure data integrity and quality.

Experience: 5 - 8 Years
Location: Bangalore / Pune / Mumbai / Chennai
Educational Qualifications: Engineering Degree BE/ME/BTech/MTech/BSc/MSc. Technical certification in multiple technologies is desirable.

Skills - Mandatory Technical Skills:
- DBT, Snowflake, DevOps
- Strong verbal and written communication skills along with problem-solving skills

Good to have skills:
- ETL/ELT, Azure Data Factory, cloud migration experience, US healthcare

Posted 2 weeks ago

Apply

5.0 - 8.0 years

0 Lacs

Navi Mumbai, Maharashtra, India

On-site


As a Technical Lead - Data Build Tool (DBT), you will be part of an Agile team building healthcare applications and implementing new features while adhering to the best coding development standards.

Responsibilities:
- Design and develop data integration solutions using Data Build Tool (DBT) and Azure Data Factory (ADF) ETL tools.
- Create DBT models and maintain dependencies across the models.
- Build and maintain the CI/CD pipeline for DBT deployments.
- Write custom test scripts and implement various hooks techniques.
- Integrate various sources using Snowpipe (GCP, AWS, Azure).
- Analyze and validate data in the Snowflake warehouse.
- Build the metric model at the semantic layer.
- Work with complex SQL functions and enable transformation of data on large data sets in Snowflake.
- Design and implement data models; develop pipelines for incremental data updates and historical data capture; optimize performance; ensure data quality; and collaborate with team members to support the data needs.
- Collaborate with cross-functional teams to understand business requirements and design effective cloud migration solutions.
- Perform data validation, testing, and troubleshooting during and after migration to ensure data integrity and quality.

Experience: 5 - 8 Years
Location: Bangalore / Pune / Mumbai / Chennai
Educational Qualifications: Engineering Degree BE/ME/BTech/MTech/BSc/MSc. Technical certification in multiple technologies is desirable.

Skills - Mandatory Technical Skills:
- DBT, Snowflake, DevOps
- Strong verbal and written communication skills along with problem-solving skills

Good to have skills:
- ETL/ELT, Azure Data Factory, cloud migration experience, US healthcare

Posted 2 weeks ago

Apply

5.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


As a Technical Lead - Data Build Tool (DBT), you will be part of an Agile team building healthcare applications and implementing new features while adhering to the best coding development standards.

Responsibilities:
- Design and develop data integration solutions using Data Build Tool (DBT) and Azure Data Factory (ADF) ETL tools.
- Create DBT models and maintain dependencies across the models.
- Build and maintain the CI/CD pipeline for DBT deployments.
- Write custom test scripts and implement various hooks techniques.
- Integrate various sources using Snowpipe (GCP, AWS, Azure).
- Analyze and validate data in the Snowflake warehouse.
- Build the metric model at the semantic layer.
- Work with complex SQL functions and enable transformation of data on large data sets in Snowflake.
- Design and implement data models; develop pipelines for incremental data updates and historical data capture; optimize performance; ensure data quality; and collaborate with team members to support the data needs.
- Collaborate with cross-functional teams to understand business requirements and design effective cloud migration solutions.
- Perform data validation, testing, and troubleshooting during and after migration to ensure data integrity and quality.

Experience: 5 - 8 Years
Location: Bangalore / Pune / Mumbai / Chennai
Educational Qualifications: Engineering Degree BE/ME/BTech/MTech/BSc/MSc. Technical certification in multiple technologies is desirable.

Skills - Mandatory Technical Skills:
- DBT, Snowflake, DevOps
- Strong verbal and written communication skills along with problem-solving skills

Good to have skills:
- ETL/ELT, Azure Data Factory, cloud migration experience, US healthcare

Posted 2 weeks ago

Apply

3.0 - 5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Foundit logo

As a Senior Software Engineer - SQL Development, you will be part of an Agile team building healthcare applications and implementing new features while adhering to the best coding and development standards.
Responsibilities:
Design, develop, and maintain the company's data applications.
Work with the product management team and analysts to build and deploy data-driven solutions.
Develop and implement data models and data quality.
Work with stakeholders to understand data needs and requirements.
Provide technical leadership to the data engineering team.
Stay up to date on the latest data engineering trends and technologies.
Perform periodic code reviews to ensure that code is rigorously designed, coded, and effectively tuned for performance.
About the program:
The Data Academy program is designed to accelerate your career and enhance your expertise in modern data engineering tech stacks, focusing on developing cutting-edge solutions in the healthcare space. A few key highlights:
Comprehensive training in healthcare data migration, integration, and modernization
Access to cutting-edge cloud infrastructure and advanced data engineering tools
Use of a modern tech stack, including Databricks and Snowflake
Mentorship from CitiusTech SMEs
Engagement in real-world projects in the healthcare domain
A certification that is recognized worldwide, boosting your career prospects
Experience: 3 - 5 Years
Location: Pune
Educational Qualifications: Engineering Degree (BE/ME/BTech/MTech/BSc/MSc). Technical certification in multiple technologies is desirable.
Skills:
Mandatory Technical Skills:
Strong knowledge of Python and SQL
Development experience on at least one ETL/ELT platform, building data pipelines (real-time, batch)
Knowledge of distributed computing, big data concepts, and PySpark
Experience in integrating and processing semi-structured/unstructured files; data processing, data preparation, and data quality
Strong understanding of data warehousing and data lakes
Excellent communication and interpersonal skills
Ability to work under pressure and meet deadlines
Good to Have Skills:
Hands-on experience with at least one cloud hyperscaler's (AWS/Azure) data services
Hands-on experience in Azure Cloud (Azure HDInsight, Azure Data Factory, Synapse) or AWS Cloud (AWS EMR, Athena, Glue, Kinesis, Firehose, AWS Step Functions, Amazon QuickSight, Redshift)
Experience in designing and developing large-scale ETL data pipelines
Knowledge of data engineering tools and technologies
Experience in handling healthcare clients and data
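The batch ETL work this role calls for follows a standard extract-transform-load shape. A minimal sketch, using SQLite as a stand-in warehouse; the `claims` table, column names, and quality rule are illustrative assumptions, not part of the posting:

```python
import csv
import io
import sqlite3

def run_batch_etl(raw_csv, conn):
    # Extract: parse the raw CSV feed.
    rows = list(csv.DictReader(io.StringIO(raw_csv)))
    # Transform: normalise names, drop rows failing a basic quality rule
    # (missing amount), and cast types.
    clean = [
        {"name": r["name"].strip().title(), "amount": float(r["amount"])}
        for r in rows
        if r["amount"].strip()
    ]
    # Load: write the cleaned batch into the warehouse table.
    conn.execute("CREATE TABLE IF NOT EXISTS claims (name TEXT, amount REAL)")
    conn.executemany("INSERT INTO claims VALUES (:name, :amount)", clean)
    return len(clean)

conn = sqlite3.connect(":memory:")
feed = "name,amount\n alice ,10.5\nBOB,\ncarol,7\n"
loaded = run_batch_etl(feed, conn)
```

In a real pipeline the transform step would run on a distributed engine such as PySpark, but the extract/transform/load separation is the same.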

Posted 2 weeks ago

Apply

1.0 - 3.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Foundit logo

As a Software Engineer - SQL Development, you will be part of an Agile team building healthcare applications and implementing new features while adhering to the best coding and development standards.
Responsibilities:
Design, develop, and maintain the company's data applications.
Develop and implement data models and data quality.
Work with stakeholders to understand data needs and requirements.
Stay up to date on the latest data engineering trends and technologies.
About the program:
The Data Academy program is designed to accelerate your career and enhance your expertise in modern data engineering tech stacks, focusing on developing cutting-edge solutions in the healthcare space. A few key highlights:
Comprehensive training in healthcare data migration, integration, and modernization
Access to cutting-edge cloud infrastructure and advanced data engineering tools
Use of a modern tech stack, including Databricks and Snowflake
Mentorship from CitiusTech SMEs
Engagement in real-world projects in the healthcare domain
A certification that is recognized worldwide, boosting your career prospects
Experience: 1 - 3 Years
Location: Pune
Educational Qualifications: Engineering Degree (BE/ME/BTech/MTech/BSc/MSc). Technical certification in multiple technologies is desirable.
Skills:
Mandatory Technical Skills:
Good knowledge of Python and SQL
Understanding of RDBMS and data warehousing
Development experience on at least one ETL/ELT platform, building data pipelines (real-time, batch)
Knowledge of distributed computing and big data concepts
Experience in integrating and processing semi-structured/unstructured files; data processing, data preparation, and data quality
Excellent communication and interpersonal skills
Ability to work under pressure and meet deadlines
Good to Have Skills:
Knowledge of at least one cloud hyperscaler's (AWS/Azure) data services
Knowledge of data engineering tools and technologies
Experience in handling healthcare clients and data
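"Processing semi-structured files" in practice often means flattening nested JSON into flat columns before it lands in an RDBMS. A small sketch of that step; the record shape and dotted-name convention are illustrative assumptions:

```python
import json

def flatten(record, parent="", sep="."):
    """Flatten one nested JSON record into dotted column names,
    a common step when landing semi-structured files in an RDBMS."""
    out = {}
    for key, value in record.items():
        name = f"{parent}{sep}{key}" if parent else key
        if isinstance(value, dict):
            out.update(flatten(value, name, sep))
        else:
            out[name] = value
    return out

raw = '{"id": 7, "patient": {"name": "Ann", "plan": {"tier": "gold"}}}'
row = flatten(json.loads(raw))
```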

Posted 2 weeks ago

Apply

4.0 - 6.0 years

3 - 6 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Foundit logo

Diensten Tech Limited is looking for an ODI Developer to join our dynamic team and embark on a rewarding career journey.
Responsible for designing, developing, and maintaining data integration solutions using ODI technology.
The ODI Developer works with business analysts and other stakeholders to understand data integration requirements and designs solutions that meet those requirements.
Responsible for developing data integration solutions using ODI technology, including coding and testing mappings, transformations, and workflows.
The ODI Developer is responsible for providing technical support to users of data integration solutions, including resolving issues related to data quality, performance, and usability.
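The "mappings and transformations" an ODI developer builds boil down to declarative source-to-target column rules. A tool-agnostic sketch in Python (not ODI code; the mapping spec and column names are hypothetical):

```python
# Hypothetical source-to-target mapping spec: target column -> (source column, transform).
MAPPING = {
    "customer_id": ("CUST_NO", int),
    "full_name": ("NAME", str.strip),
    "country": ("CTRY", str.upper),
}

def apply_mapping(source_row):
    """Apply a declarative column mapping, the kind of transformation an
    ODI mapping performs when moving a row from source to target."""
    return {tgt: fn(source_row[src]) for tgt, (src, fn) in MAPPING.items()}

target = apply_mapping({"CUST_NO": "42", "NAME": " Ada Lovelace ", "CTRY": "uk"})
```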

Posted 2 weeks ago

Apply

5.0 - 12.0 years

5 - 12 Lacs

Ahmedabad, Gujarat, India

On-site

Foundit logo

Your role and responsibilities
As a senior SAP Consultant, you will serve as a client-facing practitioner, working collaboratively with clients to deliver high-quality solutions, and be a trusted business advisor with a deep understanding of the SAP Accelerate delivery methodology (or equivalent) and associated work products. You will work on projects that assist clients in integrating strategy, process, technology, and information to enhance effectiveness, reduce costs, and improve profit and shareholder value. There are opportunities for you to acquire new skills, work across different disciplines, take on new challenges, and develop a comprehensive understanding of various industries. Your primary responsibilities include:
Strategic SAP solution focus: working across technical design, development, and implementation of SAP solutions for simplicity, amplification, and maintainability that meet client needs.
Comprehensive solution delivery: involvement in strategy development and solution implementation, leveraging your knowledge of SAP and working with the latest technologies.
Required technical and professional expertise
Overall 5 - 12 years of relevant experience in SAP BODS/BOIS/SDI/SDQ, with 3+ years of SAP functional experience specializing in the design and configuration of SAP BODS/HANA SDI modules.
Experience in gathering business requirements; able to create requirement specifications based on architecture, design, and detailing of processes.
Able to prepare a mapping sheet combining functional and technical expertise.
Data migration experience from different legacy systems to SAP or non-SAP systems.
Data migration experience from SAP ECC to S/4HANA using the Migration Cockpit or other methods.
In addition to data migration experience, experience with or strong knowledge of BOIS (BO Information Steward) for data profiling and data governance.
Preferred technical and professional experience
BODS administration experience or knowledge; working or strong knowledge of SAP Data Hub.
Experience with or strong knowledge of HANA SDI (Smart Data Integration) as an ETL tool, with the ability to develop flowgraphs to validate and transform data.
Ability to develop workflows and data flows based on specifications using various stages in BODS.
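The data-profiling step mentioned above (what BO Information Steward automates before a migration) typically means computing completeness and distinct counts per column. A tool-agnostic sketch in Python; this is not BODS or Information Steward code, and the legacy-table columns are hypothetical:

```python
def profile_columns(rows):
    """Compute per-column completeness and distinct counts -- the kind of
    summary a data-profiling pass produces before a legacy migration."""
    columns = {k for r in rows for k in r}
    total = len(rows)
    profile = {}
    for col in sorted(columns):
        values = [r.get(col) for r in rows]
        non_null = [v for v in values if v not in (None, "")]
        profile[col] = {
            "completeness": len(non_null) / total,
            "distinct": len(set(non_null)),
        }
    return profile

legacy = [
    {"material": "M1", "plant": "P1"},
    {"material": "M2", "plant": ""},
    {"material": "M2", "plant": "P2"},
]
report = profile_columns(legacy)
```

Low completeness or unexpected distinct counts flagged by a report like this are what drive the cleansing rules in the subsequent migration workflows.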

Posted 2 weeks ago

Apply

5.0 - 12.0 years

5 - 12 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Foundit logo

Your role and responsibilities
As a senior SAP Consultant, you will serve as a client-facing practitioner, working collaboratively with clients to deliver high-quality solutions, and be a trusted business advisor with a deep understanding of the SAP Accelerate delivery methodology (or equivalent) and associated work products. You will work on projects that assist clients in integrating strategy, process, technology, and information to enhance effectiveness, reduce costs, and improve profit and shareholder value. There are opportunities for you to acquire new skills, work across different disciplines, take on new challenges, and develop a comprehensive understanding of various industries. Your primary responsibilities include:
Strategic SAP solution focus: working across technical design, development, and implementation of SAP solutions for simplicity, amplification, and maintainability that meet client needs.
Comprehensive solution delivery: involvement in strategy development and solution implementation, leveraging your knowledge of SAP and working with the latest technologies.
Required technical and professional expertise
Overall 5 - 12 years of relevant experience in SAP BODS/BOIS/SDI/SDQ, with 3+ years of SAP functional experience specializing in the design and configuration of SAP BODS/HANA SDI modules.
Experience in gathering business requirements; able to create requirement specifications based on architecture, design, and detailing of processes.
Able to prepare a mapping sheet combining functional and technical expertise.
Data migration experience from different legacy systems to SAP or non-SAP systems.
Data migration experience from SAP ECC to S/4HANA using the Migration Cockpit or other methods.
In addition to data migration experience, experience with or strong knowledge of BOIS (BO Information Steward) for data profiling and data governance.
Preferred technical and professional experience
BODS administration experience or knowledge; working or strong knowledge of SAP Data Hub.
Experience with or strong knowledge of HANA SDI (Smart Data Integration) as an ETL tool, with the ability to develop flowgraphs to validate and transform data.
Ability to develop workflows and data flows based on specifications using various stages in BODS.

Posted 2 weeks ago

Apply

2.0 - 5.0 years

0 - 5 Lacs

Chennai, Tamil Nadu, India

On-site

Foundit logo

Your role and responsibilities
As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include:
Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements
Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours
Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise
Experience in integration efforts between Alation and Manta, ensuring seamless data flow and compatibility.
Collaborate with cross-functional teams to gather requirements and design solutions that leverage both the Alation and Manta platforms effectively.
Develop and maintain data governance processes and standards within Alation, leveraging Manta's data lineage capabilities.
Analyze data lineage and metadata to provide insights into data quality, compliance, and usage patterns.
Preferred technical and professional experience
Lead the evaluation and implementation of new features and updates for both the Alation and Manta platforms, ensuring alignment with organizational goals and objectives.
Drive continuous improvement initiatives to enhance the efficiency and effectiveness of data management processes, leveraging Alation.
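Analyzing data lineage, as this role requires, amounts to walking a graph of dataset dependencies. A generic illustration in Python -- this is not Alation's or Manta's actual API, and the edge structure is a hypothetical example:

```python
def upstream_lineage(edges, target):
    """Walk a lineage graph (dataset -> its direct upstream sources) and
    return every upstream dataset that feeds the target."""
    lineage, stack = set(), [target]
    while stack:
        node = stack.pop()
        for parent in edges.get(node, []):
            if parent not in lineage:
                lineage.add(parent)
                stack.append(parent)
    return lineage

# Hypothetical lineage edges, e.g. as exported from a lineage tool.
edges = {
    "report": ["mart"],
    "mart": ["staging_a", "staging_b"],
    "staging_a": ["raw"],
}
sources = upstream_lineage(edges, "report")
```

Impact analysis runs the same traversal in the opposite direction: invert the edges and walk downstream from a changed source.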

Posted 2 weeks ago

Apply
cta

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies