
2470 Snowflake Jobs - Page 37

Set up a Job Alert
JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

10.0 - 15.0 years

8 - 18 Lacs

Nashik

Work from Office

Proven experience as a Data Architect, preferably in a healthcare setting. Experience with data modeling tools, database management systems (e.g., SQL, NoSQL), and ETL processes. Experience with cloud-based databases, data warehousing, and data lakes. Required candidate profile: in-depth knowledge of healthcare data standards such as HL7, ICD-10, CPT, and SNOMED; developing and maintaining data architecture, ensuring data quality, and supporting data-driven initiatives.

Posted 3 weeks ago

Apply

10.0 - 15.0 years

8 - 18 Lacs

Nagpur

Work from Office

Proven experience as a Data Architect, preferably in a healthcare setting. Experience with data modeling tools, database management systems (e.g., SQL, NoSQL), and ETL processes. Experience with cloud-based databases, data warehousing, and data lakes. Required candidate profile: in-depth knowledge of healthcare data standards such as HL7, ICD-10, CPT, and SNOMED; developing and maintaining data architecture, ensuring data quality, and supporting data-driven initiatives.

Posted 3 weeks ago

Apply

10.0 - 15.0 years

8 - 18 Lacs

Sindhudurg

Work from Office

Proven experience as a Data Architect, preferably in a healthcare setting. Experience with data modeling tools, database management systems (e.g., SQL, NoSQL), and ETL processes. Experience with cloud-based databases, data warehousing, and data lakes. Required candidate profile: in-depth knowledge of healthcare data standards such as HL7, ICD-10, CPT, and SNOMED; developing and maintaining data architecture, ensuring data quality, and supporting data-driven initiatives.

Posted 3 weeks ago

Apply

10.0 - 15.0 years

8 - 18 Lacs

Pune

Work from Office

Proven experience as a Data Architect, preferably in a healthcare setting. Experience with data modeling tools, database management systems (e.g., SQL, NoSQL), and ETL processes. Experience with cloud-based databases, data warehousing, and data lakes. Required candidate profile: in-depth knowledge of healthcare data standards such as HL7, ICD-10, CPT, and SNOMED; developing and maintaining data architecture, ensuring data quality, and supporting data-driven initiatives.

Posted 3 weeks ago

Apply

7.0 - 10.0 years

12 - 22 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

7+ years of experience in ETL testing, Snowflake, and DWH concepts. Strong SQL knowledge and debugging skills are a must. Experience with Azure and Snowflake testing is a plus. Experience with Qlik Replicate and Compose (Change Data Capture) tools is considered a plus. Strong data warehousing concepts and experience with ETL tools like Talend Cloud Data Integration and Pentaho/Kettle. Experience with JIRA and the Xray defect management tool is good to have. Exposure to the financial domain is considered a plus. Testing data readiness (data quality) and addressing code or data issues. Demonstrated ability to rationalize problems and use judgment and innovation to define clear and concise solutions. Demonstrated strong collaborative experience across regions (APAC, EMEA, and NA) to effectively and efficiently identify the root cause of code/data issues and arrive at a permanent solution. Prior experience with State Street and Charles River Development (CRD) is considered a plus. Experience with tools such as PowerPoint, Excel, and SQL. Exposure to third-party data providers such as Bloomberg, Reuters, MSCI, and other rating agencies is a plus.

Posted 3 weeks ago

Apply

3.0 - 6.0 years

5 - 9 Lacs

Bengaluru

Work from Office

About the Opportunity. Application deadline: 31 July 2025.

Strategic Impact: As a Senior Data Engineer, you will directly contribute to our key organizational objectives. Accelerated Innovation: enable rapid development and deployment of data-driven products through scalable, cloud-native architectures; empower analytics and data science teams with self-service, real-time, and high-quality data access; shorten time-to-insight by automating data ingestion, transformation, and delivery pipelines. Cost Optimization: reduce infrastructure costs by leveraging serverless, pay-as-you-go, and managed cloud services (e.g., AWS Glue, Databricks, Snowflake); minimize manual intervention through orchestration, monitoring, and automated recovery of data workflows; optimize storage and compute usage with efficient data partitioning, compression, and lifecycle management. Risk Mitigation: improve data governance, lineage, and compliance through metadata management and automated policy enforcement; increase data quality and reliability with robust validation, monitoring, and alerting frameworks; enhance system resilience and scalability by adopting distributed, fault-tolerant architectures. Business Enablement: foster cross-functional collaboration by building and maintaining well-documented, discoverable data assets (e.g., data lakes, data warehouses, APIs); support advanced analytics, machine learning, and AI initiatives by ensuring timely, trusted, and accessible data; drive business agility by enabling rapid experimentation and iteration on new data products and features.

Key Responsibilities: Design, develop, and maintain scalable data pipelines and architectures to support data ingestion, integration, and analytics. Be accountable for technical delivery and take ownership of solutions. Lead a team of senior and junior developers, providing mentorship and guidance. Collaborate with enterprise architects, business analysts, and stakeholders to understand data requirements, validate designs, and communicate progress. Drive technical innovation within the department to increase code reusability, code quality, and developer productivity. Challenge the status quo by bringing the very latest data engineering practices and techniques.

About you. Core technical skills: Expert in leveraging cloud-based data platform (Snowflake, Databricks) capabilities to create an enterprise lakehouse. Advanced expertise with the AWS ecosystem and experience using a variety of core AWS data services such as Lambda, EMR, MSK, Glue, and S3. Experience designing event-based or streaming data architectures using Kafka. Advanced expertise in Python and SQL; Java/Scala expertise is welcome, but enterprise experience with Python is required. Expert in designing, building, and using CI/CD pipelines to deploy infrastructure (Terraform) and pipelines with test automation. Data security and performance optimization: experience implementing data access controls to meet regulatory requirements. Experience using both RDBMS (Oracle, Postgres, MSSQL) and NoSQL (DynamoDB, OpenSearch, Redis) offerings. Experience implementing CDC ingestion. Experience using orchestration tools (Airflow, Control-M, etc.). Significant experience in software engineering practices using GitHub, code verification, validation, and use of copilots. Bonus technical skills: strong experience in containerisation and deploying applications to Kubernetes; strong experience in API development using Python-based frameworks such as FastAPI. Key soft skills: Problem-solving: leadership experience in problem-solving and technical decision-making. Communication: strong in strategic communication and stakeholder engagement. Project management: experienced in overseeing project lifecycles and working with Project Managers to manage resources.
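
For illustration only, a minimal sketch of the event-based ingestion this role describes, using the kafka-python package; the broker address, topic name, and payload fields are hypothetical:

    import json
    from kafka import KafkaProducer

    # Producer for a hypothetical "trade-events" topic on an assumed local broker.
    producer = KafkaProducer(
        bootstrap_servers="localhost:9092",
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )
    event = {"trade_id": 123, "symbol": "ABC", "qty": 100}  # hypothetical payload
    producer.send("trade-events", value=event)
    producer.flush()  # block until the event is delivered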

Posted 3 weeks ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Kolkata, Hyderabad, Pune

Work from Office

IICS Developer. Job Overview: We are seeking an experienced IICS (Informatica Intelligent Cloud Services) Developer with hands-on experience in the IICS platform. The ideal candidate must have strong knowledge of Snowflake and be proficient in building and managing integrations between different systems and databases. The role will involve working with cloud-based integration solutions, ensuring data flows seamlessly across platforms, and optimizing performance for large-scale data processes. Key Responsibilities: Design, develop, and implement data integration solutions using IICS (Informatica Intelligent Cloud Services). Work with Snowflake data warehouse solutions, including data loading, transformation, and querying. Build, monitor, and maintain efficient data pipelines between cloud-based systems and Snowflake. Troubleshoot and resolve integration issues within the IICS platform and Snowflake. Ensure optimal data processing performance and manage data flow between various cloud applications and databases. Collaborate with data architects, analysts, and stakeholders to gather requirements and design integration solutions. Implement best practices for data governance, security, and data quality within the integration solutions. Perform unit testing and debugging of IICS data integration tasks. Optimize integration workflows to ensure they meet performance and scalability needs. Key Skills: Hands-on experience with IICS (Informatica Intelligent Cloud Services). Strong knowledge and experience working with Snowflake as a cloud data warehouse. Proficient in building ETL/ELT workflows, including integration of various data sources into Snowflake. Experience with SQL and writing complex queries for data transformation and manipulation. Familiarity with data integration techniques and best practices for cloud-based platforms. Experience with cloud integration platforms and working with RESTful APIs and other integration protocols. Ability to troubleshoot, optimize, and maintain data pipelines effectively. Knowledge of data governance, security principles, and data quality standards. Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience). Minimum of 5 years of experience in data integration development. Proficiency in Snowflake and cloud-based data solutions. Strong understanding of ETL/ELT processes and integration design principles. Experience working in Agile or similar development methodologies. Location: Pune, Hyderabad, Kolkata, Jaipur, Chandigarh.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

Hyderabad

Work from Office

Lead Data Engineer - Data Management. Job description. Company Overview: Accordion works at the intersection of sponsors and management teams throughout every stage of the investment lifecycle, providing hands-on, execution-focused support to elevate data and analytics capabilities. So, what does it mean to work at Accordion? It means joining 1,000+ analytics, data science, finance & technology experts in a high-growth, agile, and entrepreneurial environment while transforming how portfolio companies drive value. It also means making your mark on Accordion's future by embracing a culture rooted in collaboration and a firm-wide commitment to building something great, together. Headquartered in New York City with 10 offices worldwide, Accordion invites you to join our journey.

Data & Analytics (Accordion | Data & Analytics): Accordion's Data & Analytics (D&A) team delivers cutting-edge, intelligent solutions to a global clientele, leveraging a blend of domain knowledge, sophisticated technology tools, and deep analytics capabilities to tackle complex business challenges. We partner with Private Equity clients and their Portfolio Companies across diverse sectors, including Retail, CPG, Healthcare, Media & Entertainment, Technology, and Logistics. The D&A team delivers data and analytical solutions designed to streamline reporting capabilities and enhance business insights across vast and complex data sets spanning Sales, Operations, Marketing, Pricing, Customer Strategies, and more.

Location: Hyderabad. Role Overview: Accordion is looking for a Lead Data Engineer, who will be responsible for the design, development, configuration/deployment, and maintenance of the above technology stack. He/she must have an in-depth understanding of the various tools and technologies in this domain to design and implement robust and scalable solutions that address current and future client requirements at optimal cost. The Lead Data Engineer should be able to evaluate existing architectures and recommend ways to upgrade and improve their performance, for both on-premises and cloud-based solutions. A successful Lead Data Engineer should possess strong working business knowledge and familiarity with multiple tools and techniques, along with industry standards and best practices in Business Intelligence and Data Warehousing environments. He/she should have strong organizational, critical thinking, and communication skills.

What you will do: Partner with clients to understand their business and create comprehensive business requirements. Develop an end-to-end Business Intelligence framework based on requirements, including recommending an appropriate architecture (on-premises or cloud), analytics, and reporting. Work closely with the business and technology teams to guide solution development and implementation. Work closely with the business teams to arrive at methodologies to develop KPIs and metrics. Work with the Project Manager in developing and executing project plans within the assigned schedule and timeline. Develop standard reports and functional dashboards based on business requirements. Conduct training programs and knowledge transfer sessions for junior developers when needed. Recommend improvements to provide optimum reporting solutions. Bring curiosity to learn new tools and technologies to provide futuristic solutions for clients.

Ideally, you have: An undergraduate degree (B.E./B.Tech.), preferably from a tier-1/tier-2 college. More than 5 years of experience in a related field. Proven expertise in SSIS, SSAS, and SSRS (MSBI Suite). In-depth knowledge of databases (SQL Server, MySQL, Oracle, etc.) and data warehouses (any one of Azure Synapse, AWS Redshift, Google BigQuery, Snowflake, etc.). In-depth knowledge of business intelligence tools (any one of Power BI, Tableau, Qlik, DOMO, Looker, etc.). Good understanding of Azure or AWS: Azure (Data Factory & Pipelines, SQL Database & Managed Instances, DevOps, Logic Apps, Analysis Services) or AWS (Glue, Aurora Database, DynamoDB, Redshift, QuickSight). Proven ability to take initiative and be innovative. An analytical mind with a problem-solving attitude.

Why explore a career at Accordion: High-growth environment: semi-annual performance management and promotion cycles coupled with a strong meritocratic culture enable a fast track to leadership responsibility. Cross-domain exposure: interesting and challenging work streams across industries and domains that keep you excited, motivated, and on your toes. Entrepreneurial environment: intellectual freedom to make decisions and own them; we expect you to spread your wings and assume larger responsibilities. Fun culture and peer group: a non-bureaucratic and fun working environment with a strong peer group that will challenge you and accelerate your learning curve. Other benefits for full-time employees: health and wellness programs that include employee health insurance covering immediate family members and parents, term life insurance for employees, free health camps for employees, discounted health services (including vision and dental) for employees and family members, free doctor consultations, counsellors, etc. Corporate meal card options for ease of use and tax benefits. Team lunches, company-sponsored team outings, and celebrations. Cab reimbursement for women employees beyond a certain time of day. Robust leave policy to support work-life balance, including a specially designed leave structure to support women employees for maternity and related requests. Reward and recognition platform to celebrate professional and personal milestones. A positive and transparent work environment, including various employee engagement and employee benefit initiatives to support personal and professional learning and development.

Posted 3 weeks ago

Apply

6.0 - 8.0 years

22 - 25 Lacs

Pune

Work from Office

We are looking for an immediate joiner who can join us within 30 days for the position below: Senior Snowflake DBT Developer. Primary role: We are seeking a skilled Senior Snowflake DBT Developer to join our data engineering team. The ideal candidate will have solid experience developing ETL/ELT pipelines on Snowflake and DBT, strong SQL skills, and hands-on expertise working with Snowflake and DBT on the Azure cloud platform. This role involves designing, building, and maintaining scalable data transformation workflows and data models to support analytics and business intelligence. Key Responsibilities: Design, develop, and maintain data transformation pipelines using DBT to build modular, reusable, and scalable data models on Snowflake. Develop and optimize SQL queries and procedures for data loading, transformation, and analysis in Snowflake. Load and manage data efficiently in Snowflake from various sources, ensuring data quality and integrity. Analyze and profile data using SQL to support business requirements and troubleshooting. Collaborate with data engineers, analysts, and business stakeholders to understand data needs and translate them into technical solutions. Implement best practices for DBT project structure, version control (Git), testing, and documentation. Work on the Azure cloud platform, leveraging its integration capabilities with Snowflake. Participate in code reviews, unit testing, and deployment processes to ensure high-quality deliverables. Troubleshoot and optimize data pipelines for performance and cost-effectiveness. Desired Skills and Qualifications: Bachelor's degree in Science, Engineering, or related disciplines. Work experience: 5-7 years of experience in data engineering or development roles, with at least 2 years of hands-on experience in Snowflake and 2 years with DBT. Experience developing ETL/ELT pipelines and working with data warehouse concepts. Strong proficiency in SQL, including complex query writing, data analysis, and performance tuning. Proven experience loading and transforming data in Snowflake. Hands-on experience working on the Azure cloud platform and integrating Snowflake with Azure services. Familiarity with DBT Core features such as models, macros, tests, hooks, and modular project structure. Good understanding of data modeling concepts and dimensional modeling (star/snowflake schemas). Experience with version control systems like Git and CI/CD workflows is a plus. Strong analytical, problem-solving, and communication skills.
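
By way of example, a hedged sketch of the kind of modular Snowflake transformation a DBT model would manage, written here with the Snowflake Python connector; the connection parameters, table, and column names are hypothetical:

    import snowflake.connector

    # Idempotent transformation of raw orders into a daily fact table
    # (dbt would normally express this as a SQL model with tests and documentation).
    conn = snowflake.connector.connect(
        account="my_account", user="my_user", password="***",
        warehouse="TRANSFORM_WH", database="ANALYTICS", schema="MARTS",
    )
    try:
        conn.cursor().execute("""
            CREATE OR REPLACE TABLE fct_daily_orders AS
            SELECT order_date, customer_id, SUM(amount) AS total_amount
            FROM RAW.SALES.ORDERS
            GROUP BY order_date, customer_id
        """)
    finally:
        conn.close()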

Posted 3 weeks ago

Apply

1.0 - 3.0 years

2 - 5 Lacs

Gurugram

Work from Office

Role Objective: We are currently seeking an Associate - BI Analytics to contribute to the Analytics & BI development process. This is a hands-on technical and individual contributor role. The position requires the candidate to have knowledge of Advanced Excel, SQL, and Python, and to support Reporting and Analytics applications. The candidate needs strong technical acumen and must be very structured and analytical in his/her approach. Essential Duties and Responsibilities: Primary responsibility involves advanced SQL and advanced Excel report design and development. Publishing and scheduling SSIS/SSRS reports as per business requirements. Will be responsible for end-to-end BI and visualization solution development and project delivery across multiple clients. Drive the development and analysis of data, reporting automation, dashboarding, and business intelligence programs. Will support consulting engagements and should be able to articulate and architect the solution effectively to bring in the value that data analytics and visualization solutions can deliver. Good understanding of database management systems and ETL (Extract, Transform, Load) frameworks. Connecting to data sources, importing data, and transforming data for Business Intelligence. Experience in using advanced-level calculations on data sets. Responsible for design methodology and project documentation. Able to properly understand business requirements and develop data models accordingly while taking care of the resources. Should have knowledge of and experience in prototyping, designing, and requirement analysis. Qualifications: Graduate (B.E./B.Tech Computer Science, BCA, MCA, or M.Sc. Computers) or an equivalent academic qualification. Good communication skills (both written and verbal). Skill set (good to have): Academic exposure to designing and developing relational database models/schemas, query performance tuning, and writing ad-hoc SQL queries. Academic exposure to Snowflake would be an added advantage. Academic exposure to advanced query design, stored procedures, views, and functions. Ability to communicate with technical and business resources at many levels in a manner that supports progress and success. Understanding of Python and different libraries such as Pandas and NumPy. Exposure to cloud computing such as Microsoft Azure. Good knowledge/expertise of different versions of Microsoft SQL Server (2008, 2012, 2016, 2017, 2020).

Posted 3 weeks ago

Apply

8.0 - 13.0 years

17 - 30 Lacs

Hyderabad

Hybrid

SUMMARY OF RESPONSIBILITIES: Here at MetLife Australia, we are seeking an experienced Data Modeler/Solution Designer to join our team and help build our data platform solution (known as the Data Exchange, or DAX). This role will mainly help design and build data models and oversee implementation of data pipelines on our Snowflake data platform (DAX). You will work with Snowflake, Erwin, and Matillion to develop scalable and efficient data pipelines that support our business needs. A strong understanding of data modeling principles is essential to build robust data pipelines based on our data architecture. Additionally, familiarity with DevOps models, CI/CD workflows, and best practices is important. KEY RESPONSIBILITIES - DATA SOLUTION DESIGN. Core deliverables: Develop and deliver conceptual/logical/physical data models. Develop data mappings and transformation rules documents. Develop business glossaries (metadata) and relevant data model documentation (data dictionaries) for data and analytics solutions. Responsibilities: Work with business SMEs and data stewards to align data models to business processes. Ensure data architecture and data management principles are followed during development of new data models. Build data modelling standards and naming conventions, and follow them during implementation. Work with data stewards and data management specialists to ensure metadata and data quality rules are implemented accurately. Integrate data from various source systems (e.g., databases, APIs, flat files) into Snowflake. Review and support the whole SDLC through go-live and post-production issues; this includes reviewing technical mapping documents, test cases, test plans and their execution, and scheduling seamless deployment to go live. TECHNICAL TROUBLESHOOTING AND OPTIMISATION: Ability to conduct workshops with business SMEs and data stewards to understand the business processes. Data profiling and analysis to establish a strong hold on the data. Conduct root cause analysis and recommend long-term fixes for recurring issues. Conduct impact assessments for any upstream/downstream changes. DevOps and operations: Collaborate with Data Engineers and Tech BAs, ensuring proper version control (Git, Bitbucket) and deployment best practices for data models. Ensure compliance with data governance and security. QUALIFICATIONS: Bachelor's degree in Computer Science or equivalent, with demonstrated experience in delivery of data models and design for data and analytics project delivery. SnowPro Core certification is highly desired. EXPERIENCE AND SKILLS: 8+ years of experience as a Data Engineer/Data Modeler, preferably in a cloud-based environment. Strong experience with data modelling, data warehousing, and data integration. Deep understanding of Snowflake, including performance optimization and best practices. Strong SQL skills are mandatory. Solid understanding of data modeling concepts to build effective pipelines. Familiarity with DevOps workflows and working in environments with CI/CD, version control (Git, Bitbucket), and automated deployments. Strong problem-solving skills and the ability to troubleshoot pipeline issues effectively. KNOWLEDGE: Knowledge of data management/governance principles and their importance to dealing with data risk (for MetLife and its relationship with regulators). Life insurance or banking experience, including knowledge of financial/actuarial valuation methods and processes (preferred).

Posted 3 weeks ago

Apply

5.0 - 9.0 years

1 - 6 Lacs

Bengaluru

Work from Office

Job Title: Data Engineer. Experience: 5-7 years. Location: Bangalore. Job Type: Full-time with NAM. Job Summary: We are seeking an experienced Data Engineer with 5 to 7 years of experience in building and optimizing data pipelines and architectures on modern cloud data platforms. The ideal candidate will have strong expertise across Google Cloud Platform (GCP), DBT, Snowflake, Apache Airflow, and Data Lake architectures. Key Responsibilities: Design, build, and maintain robust, scalable, and efficient ETL/ELT pipelines. Implement data ingestion processes using Fivetran and integrate various structured and unstructured data sources into GCP-based environments. Develop data models and transformation workflows using DBT and manage version-controlled pipelines. Build and manage data storage solutions using Snowflake, optimizing for cost, performance, and scalability. Orchestrate workflows and pipeline dependencies using Apache Airflow. Design and support Data Lake architecture for raw and curated data zones. Collaborate with Data Analysts, Scientists, and Product teams to ensure availability and quality of data. Monitor data pipeline performance, ensure data integrity, and handle error recovery mechanisms. Follow best practices in CI/CD, testing, data governance, and security standards. Required Skills: 5-7 years of professional experience in data engineering roles. Hands-on experience with GCP services: BigQuery, Cloud Storage, Pub/Sub, Dataflow, Composer, etc. Expertise in Fivetran and experience integrating APIs and external sources. Proficient in writing modular SQL transformations and data modeling using DBT. Deep understanding of Snowflake warehousing: performance tuning, cost optimization, security. Experience with Airflow for pipeline orchestration and DAG management. Familiarity with designing and implementing Data Lake solutions. Proficient in Python and/or SQL. Strong understanding of data governance, data quality frameworks, and DevOps practices. Preferred Qualifications: GCP Professional Data Engineer certification is a plus. Experience in agile development environments. Exposure to data catalog tools and data observability platforms. Send profiles to narasimha@nam-it.com. Thanks & regards, Narasimha B, Staffing Executive, NAM Info Pvt Ltd, 29/2B-01, 1st Floor, K.R. Road, Banashankari 2nd Stage, Bangalore - 560070. +91 9182480146 (India).
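
As a rough illustration of the Airflow orchestration mentioned above, a minimal DAG sketch; the DAG id, schedule, and shell commands are hypothetical placeholders rather than this employer's actual pipeline:

    from datetime import datetime
    from airflow import DAG
    from airflow.operators.bash import BashOperator

    # Daily ELT: check ingestion, then run dbt transformations.
    with DAG(
        dag_id="daily_elt_pipeline",
        start_date=datetime(2025, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        ingest = BashOperator(task_id="check_fivetran_sync", bash_command="echo 'verify Fivetran sync'")
        transform = BashOperator(task_id="dbt_run", bash_command="dbt run")
        ingest >> transform  # transformations run only after ingestion completes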

Posted 3 weeks ago

Apply

5.0 - 9.0 years

1 - 6 Lacs

Pune, Chennai, Bengaluru

Work from Office

Job Title: Data Engineer. Experience: 5-7 years. Location: Bangalore. Job Type: Full-time with NAM. Job Summary: We are seeking an experienced Data Engineer with 5 to 7 years of experience in building and optimizing data pipelines and architectures on modern cloud data platforms. The ideal candidate will have strong expertise across Google Cloud Platform (GCP), DBT, Snowflake, Apache Airflow, and Data Lake architectures. Key Responsibilities: Design, build, and maintain robust, scalable, and efficient ETL/ELT pipelines. Implement data ingestion processes using Fivetran and integrate various structured and unstructured data sources into GCP-based environments. Develop data models and transformation workflows using DBT and manage version-controlled pipelines. Build and manage data storage solutions using Snowflake, optimizing for cost, performance, and scalability. Orchestrate workflows and pipeline dependencies using Apache Airflow. Design and support Data Lake architecture for raw and curated data zones. Collaborate with Data Analysts, Scientists, and Product teams to ensure availability and quality of data. Monitor data pipeline performance, ensure data integrity, and handle error recovery mechanisms. Follow best practices in CI/CD, testing, data governance, and security standards. Required Skills: 5-7 years of professional experience in data engineering roles. Hands-on experience with GCP services: BigQuery, Cloud Storage, Pub/Sub, Dataflow, Composer, etc. Expertise in Fivetran and experience integrating APIs and external sources. Proficient in writing modular SQL transformations and data modeling using DBT. Deep understanding of Snowflake warehousing: performance tuning, cost optimization, security. Experience with Airflow for pipeline orchestration and DAG management. Familiarity with designing and implementing Data Lake solutions. Proficient in Python and/or SQL. Strong understanding of data governance, data quality frameworks, and DevOps practices. Preferred Qualifications: GCP Professional Data Engineer certification is a plus. Experience in agile development environments. Exposure to data catalog tools and data observability platforms. Send profiles to narasimha@nam-it.com. Thanks & regards, Narasimha B, Staffing Executive, NAM Info Pvt Ltd, 29/2B-01, 1st Floor, K.R. Road, Banashankari 2nd Stage, Bangalore - 560070. +91 9182480146 (India).

Posted 3 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office

We are seeking a skilled and motivated Data Engineer with hands-on experience in Snowflake, Azure Data Factory (ADF), and Fivetran. The ideal candidate will be responsible for building and optimizing data pipelines, ensuring efficient data integration and transformation to support analytics and business intelligence initiatives. Key Responsibilities: Design, develop, and maintain robust data pipelines using Fivetran, ADF, and other ETL tools. Build and manage scalable data models and data warehouses on Snowflake. Integrate data from various sources into Snowflake using automated workflows. Implement data transformation and cleansing processes to ensure data quality and integrity. Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements. Monitor pipeline performance, troubleshoot issues, and optimize for efficiency. Maintain documentation related to data architecture, processes, and workflows. Ensure data security and compliance with company policies and industry standards. Required Skills & Qualifications: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field. 3+ years of experience in data engineering or a similar role. Proficiency with Snowflake, including architecture, SQL scripting, and performance tuning. Hands-on experience with Azure Data Factory (ADF) for pipeline orchestration and data integration. Experience with Fivetran or similar ELT/ETL automation tools. Strong SQL skills and familiarity with data warehousing best practices. Knowledge of cloud platforms, preferably Microsoft Azure. Familiarity with version control tools (e.g., Git) and CI/CD practices. Excellent communication and problem-solving skills. Preferred Qualifications: Experience with Python, dbt, or other data transformation tools. Understanding of data governance, data quality, and compliance frameworks. Knowledge of additional data tools (e.g., Power BI, Databricks, Kafka) is a plus.

Posted 3 weeks ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Process Delivery Specialist - Talent Development Optimization Processes - Data Analyst. Help with data validations and resolve any data discrepancies. Responsible for creating Mode/Tableau/Power BI dashboards to surface data for the accounting team to help with reconciliation. Work closely with the Revenue teams daily on the following: creating and updating dashboards; using SQL to query data from the Snowflake database to perform reconciliations and data investigation; root cause analysis. Work cross-functionally with Revenue, Billing, Engineering, Tax, and Strategic Finance teams to discuss, investigate, and resolve data discrepancies. Work with large data sets, compare them to the current state, and then investigate the differences. Required education: Bachelor's degree. Preferred education: Master's degree. Required technical and professional expertise: 3+ years working as a Financial Data Analyst supporting accounting teams. Experience working as a data analyst using a variety of BI tools (Mode/Tableau/Power BI), primarily Mode. Strong SQL knowledge. Experience with analyzing financial data. Preferred technical and professional experience: Power Query experience.
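
For context, a hedged sketch of the kind of SQL-based reconciliation this role describes, pulling Snowflake data into pandas; the account details, table, and column names are hypothetical:

    import pandas as pd
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account", user="my_user", password="***",
        warehouse="REPORTING_WH", database="FINANCE", schema="CORE",
    )
    try:
        # Pull billed vs. recognized amounts for reconciliation.
        df = pd.read_sql("SELECT invoice_id, billed_amount, recognized_amount FROM revenue_recon", conn)
    finally:
        conn.close()

    df["difference"] = df["billed_amount"] - df["recognized_amount"]
    discrepancies = df[df["difference"].abs() > 0.01]  # rows needing investigation
    print(discrepancies.head())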

Posted 3 weeks ago

Apply

3.0 - 8.0 years

20 - 35 Lacs

Hyderabad, Pune

Work from Office

Technical Data Analyst - Snowflake, SQL, Python (Finance Data Warehouse). Job Description: We are seeking a highly skilled Technical Data Analyst to join our team and play a key role in building a single source of truth for our high-volume, direct-to-consumer accounting and financial data warehouse. The ideal candidate will have a strong background in data analysis, SQL, and data transformation, with experience in financial data warehousing and reporting. This role will involve working closely with finance and accounting teams to gather requirements, build dashboards, and transform data to support month-end accounting, tax reporting, and financial forecasting. The financial data warehouse is currently built in Snowflake and will be migrated to Databricks. The candidate will be responsible for transitioning reporting and transformation processes to Databricks while ensuring data accuracy and consistency. Key Responsibilities: 1. Data Analysis & Reporting: Build and maintain month-end accounting and tax dashboards using SQL and Snowsight in Snowflake. Transition reporting processes to Databricks, creating dashboards and reports to support finance and accounting teams. Gather requirements from finance and accounting stakeholders to design and deliver actionable insights. 2. Data Transformation & Aggregation: Develop and implement data transformation pipelines in Databricks to aggregate financial data and create balance sheet look-forward views. Ensure data accuracy and consistency during the migration from Snowflake to Databricks. Collaborate with the data engineering team to optimize data ingestion and transformation processes. 3. Data Integration & ERP Collaboration: Support the integration of financial data from the data warehouse into NetSuite ERP by ensuring data is properly transformed and validated. Work with cross-functional teams to ensure seamless data flow between systems. 4. Data Ingestion & Tools: Understand and work with Fivetran for data ingestion (no need to be an expert, but familiarity is required). Troubleshoot and resolve data-related issues in collaboration with the data engineering team. Additional Qualifications: 3+ years of experience as a Data Analyst or similar role, preferably in a financial or accounting context. Strong proficiency in SQL and experience with Snowflake and Databricks. Experience building dashboards and reports for financial data (e.g., month-end close, tax reporting, balance sheets). Familiarity with Fivetran or similar data ingestion tools. Understanding of financial data concepts (e.g., general ledger, journals, balance sheets, income statements). Experience with data transformation and aggregation in a cloud-based environment. Strong communication skills to collaborate with finance and accounting teams. Nice-to-have: Experience with NetSuite ERP or similar financial systems.
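
To illustrate the Databricks transformation work described, a minimal PySpark sketch of an aggregation feeding a balance-sheet look-forward view; the table and column names are hypothetical:

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.getOrCreate()  # provided automatically in Databricks notebooks

    ledger = spark.table("finance.general_ledger")  # assumed source table
    balance = (
        ledger.groupBy("account_id", "period_end")
              .agg(F.sum("amount").alias("period_balance"))
    )
    balance.write.mode("overwrite").saveAsTable("finance.balance_lookforward")  # assumed target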

Posted 3 weeks ago

Apply

4.0 - 5.0 years

6 - 7 Lacs

Hyderabad

Work from Office

Who are we? CDK Global is the largest technical solutions provider for the automotive retail industry, setting the landscape for automotive dealers, original equipment manufacturers (OEMs), and the customers they serve. As a technology company, we have a significant focus on moving our applications to the public cloud and, in the process, are working on multiple transformation/modernization efforts. Be Part of Something Bigger: Each year, more than three percent of the U.S. gross domestic product (GDP) is attributed to the auto industry, which flows through our customer, the auto dealer. It's time you joined an evolving marketplace where research and development investment is measured in the tens of billions. It's time you were a part of something bigger. We're expanding our workforce (engineers, architects, developers, and more), onboarding early adopters who can optimize, pivot, and keep pace with ever-evolving development roadmaps and applications. Join Our Team: Growth potential, flexibility, and material impact on the success and quality of a next-gen, enterprise software product make CDK an excellent choice for those who thrive in challenging, fast-paced engineering environments. The possibilities for impact are endless. We have exceptional opportunities to evolve our industry by driving change through new technology. If you're ready for high impact, you're ready for CDK. Location: Hyderabad, India.

Role: Define, maintain, and implement CDK's public cloud standards, including secrets management, storage, compute, networking, account management, database, and operations. Leverage tools like AWS Trusted Advisor, third-party cloud cost management tools, and scripting to identify and drive cost optimization; this will include working with application owners to achieve the cost savings. Design and implement cloud security controls that create guard rails for application teams to work within, ensuring proper platform security for applications deployed within the CDK cloud environments. Design, develop, and implement cloud solutions: leveraging cloud-native services, wrap the appropriate security, automation, and service levels to support CDK business needs. Examples of solutions this role will be responsible for developing and supporting are business continuity/backup and recovery, identity and access management, data services including long-term archival, DNS, etc. Develop, maintain, and implement cloud platform standards (user access and roles, tagging, security/compliance controls, operations management, performance management, and configuration management). Responsible for writing and eventually automating operational run-books for operations. Assist application teams with automating their production support run-books (automate everywhere). Assist application teams when they have issues using AWS services where they are not fully up to speed in their use. Hands-on development of automation solutions to support application teams. Define and maintain minimum application deployment standards (governance, cost management, and tech debt). Optimize and tune designs based on performance and root cause analysis. Analyze existing solutions for alignment to infrastructure standards and provide feedback to both evolve and mature the product solutions and CDK public cloud standards.

Essential Duties & Skills: This is a hands-on role where the candidate will take on technical tasks requiring in-depth knowledge of usage and public cloud best practices. Some of the areas within AWS where you will be working include: Compute: EC2, EKS, RDS, Lambda. Networking: Load Balancing (ALB/ELB), VPN, Transit Gateways, VPCs, Availability Zones/Regions. Storage: EBS, S3, Archive Services, AWS Backup. Security: AWS Config, CloudWatch, CloudTrail, Route 53, GuardDuty, Detective, Inspector, Security Hub, Secrets Server, KMS, AWS Shield, Security Groups, AWS Identity and Access Management, etc. Cloud Cost Optimization: Cost Optimizer, Trusted Advisor, Cost Explorer, Harness Cloud Cost Management, or equivalent cost management tools. Preferred: Experience with third-party SaaS solutions like Databricks, Snowflake, and Confluent Kafka. Broad understanding/experience across full-stack infrastructure technologies. Site Reliability Engineering practices. GitHub/Artifactory/Bamboo/Terraform. Database solutions (SQL/NoSQL). Containerization solutions (Docker, Kubernetes). DevOps processes and tooling. Message queuing, data streaming, and caching solutions. Networking principles and concepts. Scripting and development; Python and Java languages preferred. Server-based operating systems (Windows/Linux) and web services (IIS, Apache). Experience designing, optimizing, and troubleshooting public cloud platforms associated with large, complex application stacks. Clear and concise communication, and comfort working at all levels in the organization. Capable of managing and prioritizing multiple projects with competing resource requirements and timelines. Years of Experience: 4-5+ years working in the AWS public cloud environment. AWS Solutions Architect Professional certification preferred. Experience with Infrastructure as Code (CloudFormation, Terraform).
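
As a small illustration of the cost-optimization work described, a sketch that pulls one month of AWS spend by service through the Cost Explorer API with boto3; the date range is illustrative and credentials are assumed to come from the environment:

    import boto3

    ce = boto3.client("ce")
    resp = ce.get_cost_and_usage(
        TimePeriod={"Start": "2025-06-01", "End": "2025-07-01"},  # illustrative month
        Granularity="MONTHLY",
        Metrics=["UnblendedCost"],
        GroupBy=[{"Type": "DIMENSION", "Key": "SERVICE"}],
    )
    for group in resp["ResultsByTime"][0]["Groups"]:
        service = group["Keys"][0]
        amount = float(group["Metrics"]["UnblendedCost"]["Amount"])
        print(f"{service}: ${amount:.2f}")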

Posted 3 weeks ago

Apply

9.0 - 14.0 years

25 - 30 Lacs

Gurugram

Work from Office

Reports to: Associate Director - Risk Data Analytics. Level: 5. About your team: The Global Risk team in Fidelity covers the management oversight of Fidelity's risk profile, including key risk frameworks, policies and procedures, and oversight and challenge processes. The team partners with the businesses to ensure Fidelity manages its risk profile within the defined risk appetite. The team comprises risk specialists covering all facets of risk management, including investment, financial, non-financial, and strategic risk. As part of a broader General Counsel team, the Risk team collaborates closely with Compliance, Legal, Tax, and Corporate Sustainability colleagues. Develop efficient data-driven solutions to support SMEs in taking key decisions for oversight and monitoring. Keep up with the pace of change in the field of data analytics using a cloud-driven technology stack. Work on diverse risk subject areas.

About your role: The successful candidate will be responsible for data analysis, visualisation, and reporting for the Global Risk business. This role encompasses the full spectrum of data analysis, data modelling, technical design, and the development of enterprise-level analytics and insights using tools such as Power BI. Additionally, the candidate will provide operational support. Strong relationship management and stakeholder management skills are essential to maintain superior service for our various business contacts and clients. This role is for a Visualization & Reporting expert who can understand various risk domains such as Investment Risk, Non-Financial Risk, Enterprise Risk, and Strategic Risk, as well as complex risk frameworks and business issues. The candidate must comprehend the functional and technical implications associated with delivering analytics capabilities using various data sources and the Power Platform. This role demands strong hands-on skills in data modelling and transformation using SQL queries and Power Query/DAX, along with expert data visualization and reporting abilities. The successful candidate should be able to handle complex project requirements within agreed timelines while maintaining a high level of deliverable quality. Additionally, they will be expected to interact with stakeholders at all levels of the business, seeking approval and sign-off on project deliverables.

Key Responsibilities: Understand the scope of business requirements and translate them into stories; define the data ingestion approach, data transformation strategy, data model, and front-end design (UI/UX) for the required product. Create working prototypes in tools like Excel or Power BI and reach agreement with business stakeholders before commencing development to ensure engagement. Drive the data modelling and data visualization development from start to finish, keeping various stakeholders informed and obtaining approvals/sign-offs on known issues, solution design, and risks. Work closely with Python developers to develop data adaptors for ingesting, transforming, and retaining time series data as required for the front end. Demonstrate a high degree of proficiency in Power Query, Power BI, advanced DAX calculations and modelling techniques, and developing intuitive visualization solutions. Possess strong experience in developing and managing dimensional data models in Power BI or within a data warehouse environment. Show proficiency in data integration and architecture, including dimensional data modelling, database design, data warehousing, ETL development, and query performance tuning. Advanced data modelling and testing skills using various RDBMS (SQL Server 2017+, Oracle 12c+) and the Snowflake data warehouse will be an added advantage. Assess and ensure that the solution being delivered is fit for purpose, efficient, and scalable, refining iteratively if required. Collaborate with global teams and stakeholders to deliver the scope of the project. Obtain agreement on delivered visuals and solutions, ensuring they meet all business requirements. Work collaboratively with the project manager within the team to identify, define, and clarify the scope and terms of complex data visualization requirements. Convert raw data into meaningful insights through interactive and easy-to-understand dashboards and reports. Coordinate across multiple project teams delivering common, reusable functionality using service-oriented patterns. Drive user acceptance testing with the product owner, addressing defects and improving solutions based on observations. Interact and work with third-party vendors and suppliers for vendor products and in cases of market data integration. Build and contribute towards professional data visualization capabilities within risk teams and at the organization level. Stay abreast of key emerging products and industry standards in data visualization and advanced analytics. Co-work with other team members on both relationship management and fund promotion.

About you. Experience: 9+ years of experience in developing and implementing advanced analytics solutions. Competencies: Ability to identify and self-manage analysis work for the allocated workstream with minimal or no assistance. Ability to develop and maintain strong relationships with stakeholders within the project working group, ensuring continual and effective communication. Ability to translate business requirements into technical requirements (internal and external) in support of the project. Excellent interpersonal, communication, documentation, facilitation, and presentation skills. Fair idea of Agile methodology; familiar with the Story requirements artefact used in Agile. Excellent written and verbal communication skills and a strong team player. Good communication, influencing, and negotiation skills. Proven ability to work well under pressure and in a team environment. Self-motivated, flexible, responsible, and a penchant for quality. Experience-based domain knowledge of risk management, regulatory compliance, or operational compliance functions would be an advantage. Basic knowledge and know-how of Data Science and Artificial Intelligence/GenAI. Qualifications: Preferred academic qualification: B.E./B.Tech, MCA, or any graduate degree.

Posted 3 weeks ago

Apply

3.0 - 8.0 years

8 - 18 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Immediate requirement for a Technical Data Analyst; please find the profile details below: Experience: 3+ years. Location: Hyderabad/Pune. Work from office: 5 days. Interview: 1st round virtual, 2nd round face-to-face (Hyderabad only). CTC: best in the market. Notice period: immediate to serving notice; July joiners only. Company: IT services based MNC; full-time. Job Responsibilities: Snowflake, Python, and SQL experience is mandatory, along with experience in a finance/accounting context. Job Description: We are seeking a highly skilled Technical Data Analyst to join our team and play a key role in building a single source of truth for our high-volume, direct-to-consumer accounting and financial data warehouse. The ideal candidate will have a strong background in data analysis, SQL, and data transformation, with experience in financial data warehousing and reporting. This role will involve working closely with finance and accounting teams to gather requirements, build dashboards, and transform data to support month-end accounting, tax reporting, and financial forecasting. The financial data warehouse is currently built in Snowflake and will be migrated to Databricks. The candidate will be responsible for transitioning reporting and transformation processes to Databricks while ensuring data accuracy and consistency. Key Responsibilities: 1. Data Analysis & Reporting: Build and maintain month-end accounting and tax dashboards using SQL and Snowsight in Snowflake. Transition reporting processes to Databricks, creating dashboards and reports to support finance and accounting teams. Gather requirements from finance and accounting stakeholders to design and deliver actionable insights. 2. Data Transformation & Aggregation: Develop and implement data transformation pipelines in Databricks to aggregate financial data and create balance sheet look-forward views. Ensure data accuracy and consistency during the migration from Snowflake to Databricks. Collaborate with the data engineering team to optimize data ingestion and transformation processes. 3. Data Integration & ERP Collaboration: Support the integration of financial data from the data warehouse into NetSuite ERP by ensuring data is properly transformed and validated. Work with cross-functional teams to ensure seamless data flow between systems. 4. Data Ingestion & Tools: Understand and work with Fivetran for data ingestion (no need to be an expert, but familiarity is required). Troubleshoot and resolve data-related issues in collaboration with the data engineering team. Additional Qualifications: 3+ years of experience as a Data Analyst or similar role, preferably in a financial or accounting context. Strong proficiency in SQL and experience with Snowflake and Databricks. Experience building dashboards and reports for financial data (e.g., month-end close, tax reporting, balance sheets). Familiarity with Fivetran or similar data ingestion tools. Understanding of financial data concepts (e.g., general ledger, journals, balance sheets, income statements). Experience with data transformation and aggregation in a cloud-based environment. Strong communication skills to collaborate with finance and accounting teams. Nice-to-have: Experience with NetSuite ERP or similar financial systems. Please share details directly at jyoti.c@globalaaplications.com.

Posted 3 weeks ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Hyderabad, Ahmedabad

Work from Office

Grade Level (for internal use): 10. The Team: We seek a highly motivated, enthusiastic, and skilled engineer for our Industry Data Solutions Team. We strive to deliver sector-specific, data-rich, and hyper-targeted solutions for evolving business needs. You will be expected to participate in the design review process, write high-quality code, and work with a dedicated team of QA Analysts and Infrastructure Teams. The Impact: The Enterprise Data Organization is seeking a Software Developer for the design, development, and maintenance of data processing applications. This person will be part of a development team that manages and supports the internal and external applications supporting the business portfolio. This role expects the candidate to handle data processing and big data application development. We have teams made up of people who learn how to work effectively together while working with the larger group of developers on our platform. What's in it for you: Opportunity to contribute to the development of a world-class Platform Engineering team. Engage in a highly technical, hands-on role designed to elevate team capabilities and foster continuous skill enhancement. Be part of a fast-paced, agile environment that processes massive volumes of data, ideal for advancing your software development and data engineering expertise while working with a modern tech stack. Contribute to the development and support of Tier-1, business-critical applications that are central to operations. Gain exposure to and work with cutting-edge technologies, including AWS Cloud and Databricks. Grow your career within a globally distributed team, with clear opportunities for advancement and skill development. Responsibilities: Design and develop applications, components, and common services based on development models, languages, and tools, including unit testing, performance testing, monitoring, and implementation. Support business and technology teams as necessary during design, development, and delivery to ensure scalable and robust solutions. Build data-intensive applications and services to support and enhance fundamental financials in appropriate technologies (C#, .NET Core, Databricks, Spark, Python, Scala, NiFi, SQL). Build data models, achieve performance tuning, and apply data architecture concepts. Develop applications adhering to secure coding practices and industry-standard coding guidelines, ensuring compliance with security best practices (e.g., OWASP) and internal governance policies. Implement and maintain CI/CD pipelines to streamline build, test, and deployment processes; develop comprehensive unit test cases and ensure code quality. Provide operations support to resolve issues proactively and with utmost urgency. Effectively manage time and multiple tasks. Communicate effectively, especially in writing, with the business and other technical groups. Basic Qualifications: Bachelor's/Master's degree in Computer Science, Information Systems, or equivalent. Minimum 5 to 8 years of strong hands-on development experience in C#, .NET Core, cloud-native, and MS SQL Server backend development. Proficiency with object-oriented programming. Nice to have: knowledge of Grafana, Kibana, big data, Kafka, GitHub, EMR, Terraform, and AI/ML. Advanced SQL programming skills. A skill set in Databricks, Spark, and Scala technologies is highly recommended. Understanding of database performance tuning in large datasets. Ability to manage multiple priorities efficiently and effectively within specific timeframes. Excellent logical, analytical, and communication skills are essential, with strong verbal and writing proficiencies. Knowledge of fundamentals or the financial industry is highly preferred. Experience in conducting application design and code reviews. Proficiency with the following technologies: object-oriented programming; programming languages (C#, .NET Core); cloud computing; database systems (SQL, MS SQL). Nice to have: NoSQL (Databricks, Spark, Scala, Python); scripting (Bash, Scala, Perl, PowerShell). Preferred Qualifications: Hands-on experience with cloud computing platforms including AWS, Azure, or Google Cloud Platform (GCP). Proficient in working with Snowflake and Databricks for cloud-based data analytics and processing.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Bengaluru

Remote

Minimum 5+ years of experience in developing, designing, and implementing data engineering solutions. Collaborate with data engineers and architects to design and optimize data models for the Snowflake Data Warehouse. Optimize query performance and data storage in Snowflake by utilizing clustering, partitioning, and other optimization techniques. Experience working on projects housed within an Amazon Web Services (AWS) cloud environment. Experience working on projects using Tableau and DBT. Work closely with business stakeholders to understand requirements and translate them into technical solutions. Excellent presentation and communication skills, both written and verbal, with the ability to problem-solve and design in an environment with unclear requirements.
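
For illustration, a hedged sketch of the clustering optimization mentioned above, using the Snowflake Python connector; the connection parameters, table, and clustering columns are hypothetical:

    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account", user="my_user", password="***",
        warehouse="ANALYTICS_WH", database="DW", schema="PUBLIC",
    )
    cur = conn.cursor()
    try:
        # Define a clustering key, then inspect clustering quality.
        cur.execute("ALTER TABLE fact_sales CLUSTER BY (sale_date, region)")
        cur.execute("SELECT SYSTEM$CLUSTERING_INFORMATION('fact_sales', '(sale_date, region)')")
        print(cur.fetchone()[0])  # JSON summary of clustering depth and overlap
    finally:
        conn.close()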

Posted 3 weeks ago

Apply

6.0 - 8.0 years

12 - 16 Lacs

Bengaluru

Hybrid

Role: Business Systems Analyst III. Location: Bengaluru (Hybrid). The Opportunity: Our client is seeking a highly skilled and motivated Snowflake FinOps Engineer to play a critical role in managing the spend of our growing Snowflake data platform. You will be responsible for ensuring the efficient and cost-effective operation of our Snowflake environment, combining deep technical expertise in Snowflake administration with a strong focus on financial accountability and resource optimization. This is an exciting opportunity to make a significant impact on our data infrastructure and contribute to a data-driven culture. Responsibilities - Snowflake Cost Optimization (FinOps): Develop and implement a comprehensive Snowflake cost optimization strategy aligned with business objectives. Continuously monitor and analyze Snowflake credit consumption and storage costs, identifying key cost drivers and trends. Proactively identify and implement opportunities for cost reduction through techniques such as virtual warehouse rightsizing, query optimization, data lifecycle management, and feature utilization. Develop and maintain dashboards and reports to track Snowflake spending, identify anomalies, and communicate cost optimization progress to stakeholders. Collaborate with engineering and analytics teams to educate them on cost-aware Snowflake practices and promote a culture of cost efficiency. Implement and manage Snowflake cost controls and alerts to prevent unexpected spending. Evaluate and recommend new Snowflake features and pricing models to optimize cost and performance. Automate cost monitoring, reporting, and optimization tasks using scripting and other tools. Work closely with finance and procurement teams on Snowflake budgeting and forecasting. Establish, document, and enforce a comprehensive tagging standard for Snowflake objects (e.g., virtual warehouses, tables, users) to improve cost tracking, resource allocation, and governance. Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field. Proven experience (5+ years) in administering and managing Snowflake data warehouse environments. Strong understanding of Snowflake architecture, features, and best practices. Demonstrated experience in implementing and driving cost optimization strategies for Snowflake. Proficiency in SQL and experience with data analysis and visualization tools (e.g., Tableau, Looker, Power BI). Experience with scripting languages (e.g., Python) for automation tasks is highly desirable. Familiarity with FinOps principles and practices in a cloud environment is a significant advantage. Excellent analytical and problem-solving skills with strong attention to detail. Strong communication and collaboration skills, with the ability to explain technical concepts to both technical and non-technical audiences. Ability to work independently and manage multiple priorities in a fast-paced environment. Snowflake certifications (e.g., SnowPro Core) are a plus.
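
As an illustration of the cost controls this role covers, a minimal sketch (Snowflake Python connector) that creates a resource monitor, rightsizes a warehouse, and applies a cost-allocation tag; all object names and quotas are hypothetical:

    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account", user="admin_user", password="***", role="ACCOUNTADMIN",
    )
    cur = conn.cursor()
    try:
        # Monthly credit guardrail: notify at 80%, suspend at 100%.
        cur.execute("""
            CREATE OR REPLACE RESOURCE MONITOR monthly_guardrail
            WITH CREDIT_QUOTA = 500 FREQUENCY = MONTHLY START_TIMESTAMP = IMMEDIATELY
            TRIGGERS ON 80 PERCENT DO NOTIFY
                     ON 100 PERCENT DO SUSPEND
        """)
        cur.execute("ALTER WAREHOUSE analytics_wh SET RESOURCE_MONITOR = monthly_guardrail")
        # Rightsize the warehouse and shorten its idle window.
        cur.execute("ALTER WAREHOUSE analytics_wh SET WAREHOUSE_SIZE = 'SMALL' AUTO_SUSPEND = 60")
        # Cost-allocation tag for chargeback reporting.
        cur.execute("CREATE TAG IF NOT EXISTS cost_center")
        cur.execute("ALTER WAREHOUSE analytics_wh SET TAG cost_center = 'finance'")
    finally:
        conn.close()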

Posted 3 weeks ago

Apply

0.0 - 3.0 years

1 - 4 Lacs

Bengaluru

Work from Office

Job Title: Data Engineer Company Name: Kinara Capital Job Description: As a Data Engineer at Kinara Capital, you will play a critical role in building and maintaining the data infrastructure necessary for effective data analysis and decision-making. You will collaborate with data scientists, analysts, and other stakeholders to support data-driven initiatives. Your primary responsibilities will include designing and implementing robust data pipelines, ensuring data quality and integrity, and optimizing data storage and retrieval processes. Key Responsibilities: - Develop, construct, test, and maintain data architectures including databases and large-scale processing systems. - Create and manage data pipelines to ingest, process, and transform data from various sources. - Collaborate with data scientists and analysts to understand data needs and develop solutions to meet those needs. - Monitor data quality and implement data governance best practices. - Optimize SQL queries and improve performance of data-processing systems. - Ensure data privacy and security standards are met and maintained. - Document data processes and pipelines to facilitate knowledge sharing within the team. Skills and Tools Required: - Proficiency in programming languages such as Python, Java, or Scala. - Experience with data warehousing solutions, such as Amazon Redshift, Google BigQuery, or Snowflake. - Strong knowledge of SQL and experience with relational databases like MySQL, PostgreSQL, or Oracle. - Familiarity with big data technologies like Apache Hadoop, Apache Spark, or Apache Kafka. - Understanding of data modeling and ETL (Extract, Transform, Load) processes. - Experience with cloud platforms such as AWS, Azure, or Google Cloud Platform. - Familiarity with data visualization tools (e.g., Tableau, Power BI) is a plus. - Strong analytical and problem-solving skills, with attention to detail. - Excellent communication skills to work collaboratively with cross-functional teams. Join Kinara Capital and leverage your data engineering skills to help drive innovative solutions and empower businesses through data.

Posted 3 weeks ago

Apply

8.0 - 12.0 years

8 - 12 Lacs

Pune, Bengaluru

Hybrid

Role & responsibilities: - Overall 8+ years of prior experience as a Data Engineer / Data Analyst / BI Engineer. - At least 5 years of consulting or client service delivery experience on Amazon Web Services (AWS). - At least 5 years of experience in developing data ingestion, data processing, and analytical pipelines for big data, relational databases, NoSQL, and data warehouse solutions. - Minimum of 5 years of hands-on experience with AWS and big data technologies such as Python, SQL, EC2, S3, Lambda, Spark/SparkSQL, Redshift, Snowflake, and SnapLogic. - Prior experience with SnapLogic, AWS Glue, and Lambda is a must. - 3-5+ years of hands-on experience in programming languages such as Python, PySpark, Spark, and SQL. - 2+ years of experience with DevOps tools such as GitLab, Jenkins, CodeBuild, CodePipeline, CodeDeploy, etc. - Bachelor's or higher degree in Computer Science or a related discipline. - AWS certification such as Solutions Architect Associate, AWS Developer Associate, or AWS Big Data Specialty (nice to have).

Posted 3 weeks ago

Apply

5.0 - 10.0 years

22 - 27 Lacs

Chennai, Mumbai (All Areas)

Work from Office

Build ETL jobs using Fivetran and dbt for our internal projects and for customers that use various platforms such as Azure, Salesforce, and AWS technologies. Build out data lineage artifacts to ensure all current and future systems are properly documented. Required candidate profile: Experience with strong SQL query/development skills. Develop ETL routines that manipulate and transfer large volumes of data and perform quality checks. Experience in the healthcare industry with PHI/PII data.

Posted 3 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.
