
169 Snowflake DB Jobs

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

7.0 - 10.0 years

5 - 8 Lacs

Kolkata

Remote

Employment Type : Contract (Remote). Experience Required : 7+ Years.

Job Summary : We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities :
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and the business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and promote best practices for data modeling and governance.

Must-Have Skills :
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience reverse engineering legacy systems such as Sailfish or DDMS using DBeaver.

Good To Have :
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills :
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven, with the ability to work independently in a remote setup.
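For illustration only (not part of the posting): a minimal sketch of the kind of window-function SQL this role calls for, run through the Snowflake Python connector. The connection parameters, the orders table, and its columns are hypothetical placeholders.

```python
# Hedged sketch: rank each customer's orders by amount and keep the top three,
# using a Snowflake window function with QUALIFY. Table/column names invented.
import snowflake.connector

TOP_ORDERS_SQL = """
SELECT customer_id,
       order_id,
       order_amount,
       ROW_NUMBER() OVER (
           PARTITION BY customer_id
           ORDER BY order_amount DESC
       ) AS amount_rank
FROM orders
QUALIFY amount_rank <= 3;
"""

def fetch_top_orders() -> list:
    # Placeholder credentials; in practice these come from a secrets manager.
    conn = snowflake.connector.connect(
        account="my_account", user="my_user", password="***",
        warehouse="ANALYTICS_WH", database="SALES_DB", schema="PUBLIC",
    )
    try:
        cur = conn.cursor()
        cur.execute(TOP_ORDERS_SQL)
        return cur.fetchall()
    finally:
        conn.close()
```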

Posted 16 hours ago

Apply

7.0 - 10.0 years

5 - 8 Lacs

Ahmedabad

Remote

Employment Type : Contract (Remote). Experience Required : 7+ Years.

Job Summary : We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities :
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and the business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and promote best practices for data modeling and governance.

Must-Have Skills :
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience reverse engineering legacy systems such as Sailfish or DDMS using DBeaver.

Good To Have :
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills :
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven, with the ability to work independently in a remote setup.

Posted 16 hours ago

Apply

7.0 - 10.0 years

5 - 8 Lacs

Lucknow

Remote

Employment Type : Contract (Remote). Experience Required : 7+ Years.

Job Summary : We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities :
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and the business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and promote best practices for data modeling and governance.

Must-Have Skills :
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience reverse engineering legacy systems such as Sailfish or DDMS using DBeaver.

Good To Have :
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills :
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven, with the ability to work independently in a remote setup.

Posted 16 hours ago

Apply

6.0 - 8.0 years

5 - 9 Lacs

Lucknow

Work from Office

About The Opportunity : Operating at the forefront of the consulting and technology services sector, our client delivers transformative data management and business analytics solutions to a global clientele. Specializing in innovative approaches to data architecture and information management, they empower organizations to make data-driven decisions. We are seeking a seasoned professional to drive the evolution of our data infrastructure. This is an exciting chance to join a high-caliber team that values precision, efficiency, and strategic vision in a fully remote setting from anywhere in India.

Role & Responsibilities : As a key member of our team, you will be instrumental in shaping our data landscape. Your responsibilities will include :
- Designing, developing, and maintaining comprehensive logical and physical data models that meet stringent business and regulatory requirements.
- Collaborating with cross-functional teams, including data engineers, analysts, and business stakeholders, to align data architecture with evolving business needs.
- Translating complex business requirements into robust and scalable data models, ensuring optimal performance and data integrity.
- Optimizing and refining existing data structures and frameworks to enhance data access, reporting, and analytics capabilities.
- Driving governance and data quality initiatives by establishing best practices and documentation standards for data modeling.
- Mentoring and guiding junior team members, fostering a culture of continuous improvement and technical excellence.

Skills & Qualifications :

Must-Have :
- Experience : 6+ years of experience in data modeling and related disciplines.
- Data Modeling : Extensive expertise in conceptual, logical, and physical models, including Star/Snowflake schema design and normalization/denormalization techniques.
- Snowflake : Proven experience with Snowflake schema design, performance tuning, Time Travel, Streams & Tasks, and Secure & Materialized Views.
- SQL & Scripting : Advanced proficiency in SQL (including CTEs and window functions), with a strong focus on automation and optimization.

Key Skills :
- Data Modeling (conceptual, logical, physical)
- Schema Design (Star Schema, Snowflake Schema)
- Optimization & Performance Tuning
- Snowflake (schema design, Time Travel, Streams and Tasks, Secure Views, Materialized Views)
- Advanced SQL (CTEs, Window Functions)
- Normalization & Denormalization
- Automation
- NoSQL (preferred, but not mandatory)
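For illustration only (not part of the posting): a hedged sketch of the Snowflake Streams & Tasks and Time Travel features named above, held as SQL strings in Python. All object names (raw_events, events_dim, TRANSFORM_WH, etc.) are invented.

```python
# Streams capture row-level changes; a scheduled Task drains the stream into a
# target table; Time Travel queries a table as of an earlier point in time.
STREAM_AND_TASK_DDL = [
    # Capture inserts/updates/deletes on a source table.
    "CREATE OR REPLACE STREAM raw_events_stream ON TABLE raw_events;",
    # Poll every 5 minutes and load only when the stream has data.
    """
    CREATE OR REPLACE TASK load_events_dim
      WAREHOUSE = TRANSFORM_WH
      SCHEDULE = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('RAW_EVENTS_STREAM')
    AS
      INSERT INTO events_dim (event_id, event_type, loaded_at)
      SELECT event_id, event_type, CURRENT_TIMESTAMP()
      FROM raw_events_stream
      WHERE METADATA$ACTION = 'INSERT';
    """,
    # Tasks are created suspended; resume to start the schedule.
    "ALTER TASK load_events_dim RESUME;",
]

# Time Travel: query the table as it looked one hour (3600 seconds) ago.
TIME_TRAVEL_SQL = "SELECT COUNT(*) FROM events_dim AT(OFFSET => -3600);"
```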

Posted 18 hours ago

Apply

7.0 - 10.0 years

5 - 8 Lacs

Surat

Remote

Job Title : Data Engineer / Data Modeler. Location : Remote (India). Employment Type : Contract (Remote). Experience Required : 7+ Years.

Job Summary : We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities :
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and the business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and promote best practices for data modeling and governance.

Must-Have Skills :
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience reverse engineering legacy systems such as Sailfish or DDMS using DBeaver.

Good To Have :
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills :
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven, with the ability to work independently in a remote setup.

Posted 18 hours ago

Apply

6.0 - 8.0 years

5 - 9 Lacs

Kanpur

Work from Office

About The Opportunity : Operating at the forefront of the consulting and technology services sector, our client delivers transformative data management and business analytics solutions to a global clientele. Specializing in innovative approaches to data architecture and information management, they empower organizations to make data-driven decisions. We are seeking a seasoned professional to drive the evolution of our data infrastructure. This is an exciting chance to join a high-caliber team that values precision, efficiency, and strategic vision in a fully remote setting from anywhere in India.

Role & Responsibilities : As a key member of our team, you will be instrumental in shaping our data landscape. Your responsibilities will include :
- Designing, developing, and maintaining comprehensive logical and physical data models that meet stringent business and regulatory requirements.
- Collaborating with cross-functional teams, including data engineers, analysts, and business stakeholders, to align data architecture with evolving business needs.
- Translating complex business requirements into robust and scalable data models, ensuring optimal performance and data integrity.
- Optimizing and refining existing data structures and frameworks to enhance data access, reporting, and analytics capabilities.
- Driving governance and data quality initiatives by establishing best practices and documentation standards for data modeling.
- Mentoring and guiding junior team members, fostering a culture of continuous improvement and technical excellence.

Skills & Qualifications :

Must-Have :
- Experience : 6+ years of experience in data modeling and related disciplines.
- Data Modeling : Extensive expertise in conceptual, logical, and physical models, including Star/Snowflake schema design and normalization/denormalization techniques.
- Snowflake : Proven experience with Snowflake schema design, performance tuning, Time Travel, Streams & Tasks, and Secure & Materialized Views.
- SQL & Scripting : Advanced proficiency in SQL (including CTEs and window functions), with a strong focus on automation and optimization.

Key Skills :
- Data Modeling (conceptual, logical, physical)
- Schema Design (Star Schema, Snowflake Schema)
- Optimization & Performance Tuning
- Snowflake (schema design, Time Travel, Streams and Tasks, Secure Views, Materialized Views)
- Advanced SQL (CTEs, Window Functions)
- Normalization & Denormalization
- Automation
- NoSQL (preferred, but not mandatory)

Posted 20 hours ago

Apply

7.0 - 10.0 years

5 - 8 Lacs

Jaipur

Remote

Employment Type : Contract (Remote).

Job Summary : We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities :
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and the business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and promote best practices for data modeling and governance.

Must-Have Skills :
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience reverse engineering legacy systems such as Sailfish or DDMS using DBeaver.

Good To Have :
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills :
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven, with the ability to work independently in a remote setup.

Posted 20 hours ago

Apply

5.0 - 8.0 years

9 - 13 Lacs

Kanpur

Remote

About The Opportunity : Join a pioneering consulting firm in the Data Analytics and Cloud Solutions sector, where transformative data architectures empower global enterprises. We specialize in leveraging cutting-edge Snowflake technologies and innovative cloud solutions to drive real-time insights and business intelligence. This remote role, based in India, offers the opportunity to work on high-impact projects while collaborating with a diverse team of experts.

Role & Responsibilities :
- Design, implement, and optimize scalable data warehousing solutions using Snowflake Cortex.
- Develop robust ETL pipelines that ensure data quality, reliability, and efficient integration across platforms.
- Collaborate with cross-functional teams to translate business requirements into innovative cloud data solutions.
- Monitor system performance, analyze query execution, and implement optimization strategies to drive efficiency.
- Troubleshoot and resolve data integration issues, ensuring continuity and minimizing downtime.
- Mentor junior team members on best practices and the latest trends in Snowflake and cloud technologies.

Skills & Qualifications :

Must-Have :
- 5+ years of hands-on experience with Snowflake Cortex and cloud-based data warehousing solutions.
- Proficiency in SQL and extensive experience in building, managing, and optimizing ETL pipelines.
- Proven track record of integrating large-scale data solutions in a dynamic consulting environment.

Preferred :
- Familiarity with major cloud platforms such as AWS, Azure, or Google Cloud.
- Experience working remotely and within agile development frameworks.
- Excellent problem-solving, communication, and team collaboration skills.

Benefits & Culture Highlights :
- Competitive salary paired with performance-based incentives.
- Flexible remote work opportunities fostering a healthy work-life balance.
- A dynamic, inclusive, and collaborative work culture committed to continuous learning and professional growth.
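For illustration only (not part of the posting): Snowflake Cortex exposes LLM functions callable from SQL, such as SNOWFLAKE.CORTEX.SENTIMENT and SNOWFLAKE.CORTEX.COMPLETE. A hedged sketch follows; the support_tickets table and model choice are hypothetical placeholders.

```python
# Score the sentiment of free-text tickets directly in SQL (-1 to 1).
SENTIMENT_SQL = """
SELECT ticket_id,
       SNOWFLAKE.CORTEX.SENTIMENT(ticket_text) AS sentiment_score
FROM support_tickets
LIMIT 100;
"""

# Prompt an LLM from SQL to summarize each ticket in one sentence.
SUMMARIZE_SQL = """
SELECT SNOWFLAKE.CORTEX.COMPLETE(
         'mistral-large',
         'Summarize this ticket in one sentence: ' || ticket_text
       ) AS summary
FROM support_tickets
LIMIT 10;
"""
```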

Posted 20 hours ago

Apply

10.0 - 12.0 years

20 - 25 Lacs

Lucknow

Work from Office

Key Responsibilities : As an Enterprise Data Architect, you will :
- Lead Data Architecture : Design, develop, and implement comprehensive enterprise data architectures, primarily leveraging Azure and Snowflake platforms.
- Data Transformation & ETL : Oversee and guide complex data transformation and ETL processes for large and diverse datasets, ensuring data integrity, quality, and performance.
- Customer-Centric Data Design : Specialize in designing and optimizing customer-centric datasets from sources such as CRM, Call Center, Marketing, Offline, and Point of Sale systems.
- Data Modeling : Drive the creation and maintenance of advanced data models, including Relational, Dimensional, Columnar, and Big Data models, to support analytical and operational needs.
- Query Optimization : Develop, optimize, and troubleshoot complex SQL and NoSQL queries to ensure efficient data retrieval and manipulation.
- Data Warehouse Management : Apply advanced data warehousing concepts to build and manage high-performing, scalable data warehouse solutions.
- Tool Evaluation & Implementation : Evaluate, recommend, and implement industry-leading ETL tools such as Informatica and Unifi, ensuring best practices are followed.
- Business Requirements & Analysis : Lead efforts in business requirements definition and management, structured analysis, process design, and use case documentation to translate business needs into technical specifications.
- Reporting & Analytics Support : Collaborate with reporting teams, providing architectural guidance and support for reporting technologies like Tableau and PowerBI.
- Software Development Practices : Apply professional software development principles and best practices to data solution delivery.
- Stakeholder Collaboration : Interface effectively with sales teams and engage directly with customers to understand their data challenges and lead them to successful outcomes.
- Project Management & Multi-tasking : Demonstrate exceptional organizational skills, with the ability to manage and prioritize multiple simultaneous customer projects effectively.
- Strategic Thinking & Leadership : Act as a self-managed, proactive, and customer-focused leader, driving innovation and continuous improvement in data architecture.

Position Requirements :
- Strong experience with data transformation & ETL on large data sets.
- Experience designing customer-centric datasets (e.g., CRM, Call Center, Marketing, Offline, Point of Sale).
- 5+ years of data modeling experience (Relational, Dimensional, Columnar, Big Data).
- 5+ years of complex SQL or NoSQL experience.
- Extensive experience with advanced data warehouse concepts.
- Proven experience with industry ETL tools (e.g., Informatica, Unifi).
- Solid experience with business requirements definition and management, structured analysis, process design, and use case documentation.
- Experience with reporting technologies (e.g., Tableau, PowerBI).
- Demonstrated experience in professional software development.
- Exceptional organizational skills and the ability to multi-task across simultaneous customer projects.
- Strong verbal & written communication skills to interface with sales teams and lead customers to successful outcomes.
- Must be self-managed, proactive, and customer-focused.

Technical Skills :
- Cloud Platforms : Microsoft Azure
- Data Warehousing : Snowflake
- ETL Methodologies : Extensive experience in ETL processes and tools
- Data Transformation : Large-scale data transformation
- Data Modeling : Relational, Dimensional, Columnar, Big Data
- Query Languages : Complex SQL, NoSQL
- ETL Tools : Informatica, Unifi (or similar enterprise-grade tools)
- Reporting & BI : Tableau, PowerBI
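For illustration only (not part of the posting): a minimal star-schema sketch of the kind of dimensional model this role describes, with one fact table keyed to conformed dimensions. The retail table and column names are invented; the DDL is held as a Python string.

```python
# One fact table (fact_sales) referencing two dimensions. In Snowflake, the
# PRIMARY KEY / REFERENCES constraints are informational (not enforced).
STAR_SCHEMA_DDL = """
CREATE TABLE dim_customer (
    customer_key  INTEGER IDENTITY PRIMARY KEY,
    customer_id   VARCHAR NOT NULL,     -- natural key from the CRM source
    segment       VARCHAR,
    city          VARCHAR
);

CREATE TABLE dim_date (
    date_key      INTEGER PRIMARY KEY,  -- e.g. 20240131
    full_date     DATE NOT NULL,
    month_name    VARCHAR,
    year_num      INTEGER
);

CREATE TABLE fact_sales (
    customer_key  INTEGER REFERENCES dim_customer (customer_key),
    date_key      INTEGER REFERENCES dim_date (date_key),
    channel       VARCHAR,              -- POS, call center, online, ...
    sale_amount   NUMBER(12, 2)
);
"""
```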

Posted 20 hours ago

Apply

10.0 - 12.0 years

20 - 25 Lacs

Jaipur

Work from Office

Key Responsibilities : As an Enterprise Data Architect, you will :
- Lead Data Architecture : Design, develop, and implement comprehensive enterprise data architectures, primarily leveraging Azure and Snowflake platforms.
- Data Transformation & ETL : Oversee and guide complex data transformation and ETL processes for large and diverse datasets, ensuring data integrity, quality, and performance.
- Customer-Centric Data Design : Specialize in designing and optimizing customer-centric datasets from sources such as CRM, Call Center, Marketing, Offline, and Point of Sale systems.
- Data Modeling : Drive the creation and maintenance of advanced data models, including Relational, Dimensional, Columnar, and Big Data models, to support analytical and operational needs.
- Query Optimization : Develop, optimize, and troubleshoot complex SQL and NoSQL queries to ensure efficient data retrieval and manipulation.
- Data Warehouse Management : Apply advanced data warehousing concepts to build and manage high-performing, scalable data warehouse solutions.
- Tool Evaluation & Implementation : Evaluate, recommend, and implement industry-leading ETL tools such as Informatica and Unifi, ensuring best practices are followed.
- Business Requirements & Analysis : Lead efforts in business requirements definition and management, structured analysis, process design, and use case documentation to translate business needs into technical specifications.
- Reporting & Analytics Support : Collaborate with reporting teams, providing architectural guidance and support for reporting technologies like Tableau and PowerBI.
- Software Development Practices : Apply professional software development principles and best practices to data solution delivery.
- Stakeholder Collaboration : Interface effectively with sales teams and engage directly with customers to understand their data challenges and lead them to successful outcomes.
- Project Management & Multi-tasking : Demonstrate exceptional organizational skills, with the ability to manage and prioritize multiple simultaneous customer projects effectively.
- Strategic Thinking & Leadership : Act as a self-managed, proactive, and customer-focused leader, driving innovation and continuous improvement in data architecture.

Position Requirements :
- Strong experience with data transformation & ETL on large data sets.
- Experience designing customer-centric datasets (e.g., CRM, Call Center, Marketing, Offline, Point of Sale).
- 5+ years of data modeling experience (Relational, Dimensional, Columnar, Big Data).
- 5+ years of complex SQL or NoSQL experience.
- Extensive experience with advanced data warehouse concepts.
- Proven experience with industry ETL tools (e.g., Informatica, Unifi).
- Solid experience with business requirements definition and management, structured analysis, process design, and use case documentation.
- Experience with reporting technologies (e.g., Tableau, PowerBI).
- Demonstrated experience in professional software development.
- Exceptional organizational skills and the ability to multi-task across simultaneous customer projects.
- Strong verbal & written communication skills to interface with sales teams and lead customers to successful outcomes.
- Must be self-managed, proactive, and customer-focused.

Technical Skills :
- Cloud Platforms : Microsoft Azure
- Data Warehousing : Snowflake
- ETL Methodologies : Extensive experience in ETL processes and tools
- Data Transformation : Large-scale data transformation
- Data Modeling : Relational, Dimensional, Columnar, Big Data
- Query Languages : Complex SQL, NoSQL
- ETL Tools : Informatica, Unifi (or similar enterprise-grade tools)
- Reporting & BI : Tableau, PowerBI

Posted 21 hours ago

Apply

5.0 - 8.0 years

9 - 13 Lacs

Lucknow

Remote

About The Opportunity : Join a pioneering consulting firm in the Data Analytics and Cloud Solutions sector, where transformative data architectures empower global enterprises. We specialize in leveraging cutting-edge Snowflake technologies and innovative cloud solutions to drive real-time insights and business intelligence. This remote role, based in India, offers the opportunity to work on high-impact projects while collaborating with a diverse team of experts.

Role & Responsibilities :
- Design, implement, and optimize scalable data warehousing solutions using Snowflake Cortex.
- Develop robust ETL pipelines that ensure data quality, reliability, and efficient integration across platforms.
- Collaborate with cross-functional teams to translate business requirements into innovative cloud data solutions.
- Monitor system performance, analyze query execution, and implement optimization strategies to drive efficiency.
- Troubleshoot and resolve data integration issues, ensuring continuity and minimizing downtime.
- Mentor junior team members on best practices and the latest trends in Snowflake and cloud technologies.

Skills & Qualifications :

Must-Have :
- 5+ years of hands-on experience with Snowflake Cortex and cloud-based data warehousing solutions.
- Proficiency in SQL and extensive experience in building, managing, and optimizing ETL pipelines.
- Proven track record of integrating large-scale data solutions in a dynamic consulting environment.

Preferred :
- Familiarity with major cloud platforms such as AWS, Azure, or Google Cloud.
- Experience working remotely and within agile development frameworks.
- Excellent problem-solving, communication, and team collaboration skills.

Benefits & Culture Highlights :
- Competitive salary paired with performance-based incentives.
- Flexible remote work opportunities fostering a healthy work-life balance.
- A dynamic, inclusive, and collaborative work culture committed to continuous learning and professional growth.

Posted 21 hours ago

Apply

10.0 - 12.0 years

20 - 25 Lacs

Kanpur

Work from Office

Key Responsibilities : As an Enterprise Data Architect, you will :
- Lead Data Architecture : Design, develop, and implement comprehensive enterprise data architectures, primarily leveraging Azure and Snowflake platforms.
- Data Transformation & ETL : Oversee and guide complex data transformation and ETL processes for large and diverse datasets, ensuring data integrity, quality, and performance.
- Customer-Centric Data Design : Specialize in designing and optimizing customer-centric datasets from sources such as CRM, Call Center, Marketing, Offline, and Point of Sale systems.
- Data Modeling : Drive the creation and maintenance of advanced data models, including Relational, Dimensional, Columnar, and Big Data models, to support analytical and operational needs.
- Query Optimization : Develop, optimize, and troubleshoot complex SQL and NoSQL queries to ensure efficient data retrieval and manipulation.
- Data Warehouse Management : Apply advanced data warehousing concepts to build and manage high-performing, scalable data warehouse solutions.
- Tool Evaluation & Implementation : Evaluate, recommend, and implement industry-leading ETL tools such as Informatica and Unifi, ensuring best practices are followed.
- Business Requirements & Analysis : Lead efforts in business requirements definition and management, structured analysis, process design, and use case documentation to translate business needs into technical specifications.
- Reporting & Analytics Support : Collaborate with reporting teams, providing architectural guidance and support for reporting technologies like Tableau and PowerBI.
- Software Development Practices : Apply professional software development principles and best practices to data solution delivery.
- Stakeholder Collaboration : Interface effectively with sales teams and engage directly with customers to understand their data challenges and lead them to successful outcomes.
- Project Management & Multi-tasking : Demonstrate exceptional organizational skills, with the ability to manage and prioritize multiple simultaneous customer projects effectively.
- Strategic Thinking & Leadership : Act as a self-managed, proactive, and customer-focused leader, driving innovation and continuous improvement in data architecture.

Position Requirements :
- Strong experience with data transformation & ETL on large data sets.
- Experience designing customer-centric datasets (e.g., CRM, Call Center, Marketing, Offline, Point of Sale).
- 5+ years of data modeling experience (Relational, Dimensional, Columnar, Big Data).
- 5+ years of complex SQL or NoSQL experience.
- Extensive experience with advanced data warehouse concepts.
- Proven experience with industry ETL tools (e.g., Informatica, Unifi).
- Solid experience with business requirements definition and management, structured analysis, process design, and use case documentation.
- Experience with reporting technologies (e.g., Tableau, PowerBI).
- Demonstrated experience in professional software development.
- Exceptional organizational skills and the ability to multi-task across simultaneous customer projects.
- Strong verbal & written communication skills to interface with sales teams and lead customers to successful outcomes.
- Must be self-managed, proactive, and customer-focused.

Technical Skills :
- Cloud Platforms : Microsoft Azure
- Data Warehousing : Snowflake
- ETL Methodologies : Extensive experience in ETL processes and tools
- Data Transformation : Large-scale data transformation
- Data Modeling : Relational, Dimensional, Columnar, Big Data
- Query Languages : Complex SQL, NoSQL
- ETL Tools : Informatica, Unifi (or similar enterprise-grade tools)
- Reporting & BI : Tableau, PowerBI

Posted 22 hours ago

Apply

6.0 - 8.0 years

8 - 13 Lacs

Jaipur

Work from Office

Role & Responsibilities : As a key member of our team, you will be instrumental in shaping our data landscape. Your responsibilities will include :
- Designing, developing, and maintaining comprehensive logical and physical data models that meet stringent business and regulatory requirements.
- Collaborating with cross-functional teams, including data engineers, analysts, and business stakeholders, to align data architecture with evolving business needs.
- Translating complex business requirements into robust and scalable data models, ensuring optimal performance and data integrity.
- Optimizing and refining existing data structures and frameworks to enhance data access, reporting, and analytics capabilities.
- Driving governance and data quality initiatives by establishing best practices and documentation standards for data modeling.
- Mentoring and guiding junior team members, fostering a culture of continuous improvement and technical excellence.

Skills & Qualifications :

Must-Have :
- Experience : 6+ years of experience in data modeling and related disciplines.
- Data Modeling : Extensive expertise in conceptual, logical, and physical models, including Star/Snowflake schema design and normalization/denormalization techniques.
- Snowflake : Proven experience with Snowflake schema design, performance tuning, Time Travel, Streams & Tasks, and Secure & Materialized Views.
- SQL & Scripting : Advanced proficiency in SQL (including CTEs and window functions), with a strong focus on automation and optimization.

Key Skills :
- Data Modeling (conceptual, logical, physical)
- Schema Design (Star Schema, Snowflake Schema)
- Optimization & Performance Tuning
- Snowflake (schema design, Time Travel, Streams and Tasks, Secure Views, Materialized Views)
- Advanced SQL (CTEs, Window Functions)
- Normalization & Denormalization
- Automation
- NoSQL (preferred, but not mandatory)

Posted 22 hours ago

Apply

5.0 - 10.0 years

14 - 18 Lacs

Hyderabad

Work from Office

The Impact you will have in this role : The Development family is responsible for crafting, designing, deploying, and supporting applications, programs, and software solutions. This may include research, new development, prototyping, modification, reuse, re-engineering, maintenance, or any other activities related to software products used internally or externally on product platforms supported by the firm. The software development process requires in-depth domain expertise in existing and emerging development methodologies, tools, and programming languages. Software Developers work closely with business partners and/or external clients to define requirements and implement solutions. The Software Engineering role specializes in planning, documenting technical requirements, crafting, developing, and testing all software systems and applications for the firm. It works closely with architects, product managers, project management, and end-users in the development and improvement of existing software systems and applications, proposing and recommending solutions that solve complex business problems.

Your Primary Responsibilities :
- Act as a technical expert on one or more applications used by DTCC.
- Work with the Business System Analyst to ensure designs satisfy functional requirements.
- Partner with Infrastructure to identify and deploy optimal hosting environments.
- Tune application performance to eliminate and reduce issues.
- Research and evaluate technical solutions consistent with DTCC technology standards.
- Align risk and control processes into day-to-day responsibilities to monitor and mitigate risk; escalate appropriately.
- Apply different software development methodologies depending on project needs.
- Contribute expertise to the design of components or individual programs, and participate in construction and functional testing.
- Support development teams with testing, troubleshooting, and production support.
- Create applications and construct unit test cases that ensure compliance with functional and non-functional requirements.
- Work with peers to mature ways of working, continuous integration, and continuous delivery.

Qualifications :
- Minimum of 8 years of related experience.
- Bachelor's degree preferred, or equivalent experience.

Talents Needed for Success :
- Expertise in Snowflake DB and its architecture principles and capabilities.
- Experience with data warehousing, data architecture, ETL data pipelines, and/or data engineering environments at enterprise scale built on Snowflake.
- Ability to create strong SQL procedures in Snowflake and build data pipelines in a cost-optimized, performance-efficient way.
- Proficient understanding of code versioning tools : Git, Mercurial, SVN.
- Knowledge of SDLC, testing, and CI/CD aspects such as Jenkins, BB, and JIRA.
- Fosters a culture where integrity and transparency are encouraged.
- Stays ahead of changes in their own specialist area and seeks out learning opportunities to keep knowledge up to date.
- Invests effort to individually coach others.
- Builds collaborative teams across the organization.
- Communicates openly, keeping everyone across the organization advised.
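For illustration only (not part of the posting): a hedged sketch of a Snowflake SQL stored procedure (Snowflake Scripting) of the sort this role mentions. Procedure, table, and column names are invented; the DDL is held as a Python string.

```python
# Inside a SQL statement, Snowflake Scripting variables and arguments are
# referenced with a leading colon (:run_date); in expressions they are not.
LOAD_PROC_DDL = """
CREATE OR REPLACE PROCEDURE load_daily_positions(run_date DATE)
RETURNS VARCHAR
LANGUAGE SQL
AS
$$
DECLARE
    rows_loaded INTEGER DEFAULT 0;
BEGIN
    -- Move one day's staged rows into the curated table.
    INSERT INTO positions_curated (account_id, symbol, qty, as_of)
    SELECT account_id, symbol, qty, :run_date
    FROM positions_staging
    WHERE as_of = :run_date;

    -- SQLROWCOUNT holds the row count of the last DML statement.
    rows_loaded := SQLROWCOUNT;
    RETURN 'Loaded ' || rows_loaded || ' rows for ' || run_date;
END;
$$;
"""

# Invoked afterwards with, e.g.: CALL load_daily_positions('2024-01-31');
```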

Posted 1 day ago

Apply

7.0 - 10.0 years

5 - 8 Lacs

Pune

Work from Office

Job Title : Data Engineer / Data Modeler. Location : Remote (India). Employment Type : Contract (Remote). Experience Required : 7+ Years.

Job Summary : We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities :
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and the business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and promote best practices for data modeling and governance.

Must-Have Skills :
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience reverse engineering legacy systems such as Sailfish or DDMS using DBeaver.

Good To Have :
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills :
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven, with the ability to work independently in a remote setup.

Posted 1 day ago

Apply

10.0 - 12.0 years

20 - 25 Lacs

Ahmedabad

Work from Office

Key Responsibilities : As an Enterprise Data Architect, you will :
- Lead Data Architecture : Design, develop, and implement comprehensive enterprise data architectures, primarily leveraging Azure and Snowflake platforms.
- Data Transformation & ETL : Oversee and guide complex data transformation and ETL processes for large and diverse datasets, ensuring data integrity, quality, and performance.
- Customer-Centric Data Design : Specialize in designing and optimizing customer-centric datasets from sources such as CRM, Call Center, Marketing, Offline, and Point of Sale systems.
- Data Modeling : Drive the creation and maintenance of advanced data models, including Relational, Dimensional, Columnar, and Big Data models, to support analytical and operational needs.
- Query Optimization : Develop, optimize, and troubleshoot complex SQL and NoSQL queries to ensure efficient data retrieval and manipulation.
- Data Warehouse Management : Apply advanced data warehousing concepts to build and manage high-performing, scalable data warehouse solutions.
- Tool Evaluation & Implementation : Evaluate, recommend, and implement industry-leading ETL tools such as Informatica and Unifi, ensuring best practices are followed.
- Business Requirements & Analysis : Lead efforts in business requirements definition and management, structured analysis, process design, and use case documentation to translate business needs into technical specifications.
- Reporting & Analytics Support : Collaborate with reporting teams, providing architectural guidance and support for reporting technologies like Tableau and PowerBI.
- Software Development Practices : Apply professional software development principles and best practices to data solution delivery.
- Stakeholder Collaboration : Interface effectively with sales teams and engage directly with customers to understand their data challenges and lead them to successful outcomes.
- Project Management & Multi-tasking : Demonstrate exceptional organizational skills, with the ability to manage and prioritize multiple simultaneous customer projects effectively.
- Strategic Thinking & Leadership : Act as a self-managed, proactive, and customer-focused leader, driving innovation and continuous improvement in data architecture.

Position Requirements :
- Strong experience with data transformation & ETL on large data sets.
- Experience designing customer-centric datasets (e.g., CRM, Call Center, Marketing, Offline, Point of Sale).
- 5+ years of data modeling experience (Relational, Dimensional, Columnar, Big Data).
- 5+ years of complex SQL or NoSQL experience.
- Extensive experience with advanced data warehouse concepts.
- Proven experience with industry ETL tools (e.g., Informatica, Unifi).
- Solid experience with business requirements definition and management, structured analysis, process design, and use case documentation.
- Experience with reporting technologies (e.g., Tableau, PowerBI).
- Demonstrated experience in professional software development.
- Exceptional organizational skills and the ability to multi-task across simultaneous customer projects.
- Strong verbal & written communication skills to interface with sales teams and lead customers to successful outcomes.
- Must be self-managed, proactive, and customer-focused.

Technical Skills :
- Cloud Platforms : Microsoft Azure
- Data Warehousing : Snowflake
- ETL Methodologies : Extensive experience in ETL processes and tools
- Data Transformation : Large-scale data transformation
- Data Modeling : Relational, Dimensional, Columnar, Big Data
- Query Languages : Complex SQL, NoSQL
- ETL Tools : Informatica, Unifi (or similar enterprise-grade tools)
- Reporting & BI : Tableau, PowerBI

Posted 1 day ago

Apply

10.0 - 12.0 years

20 - 25 Lacs

Kolkata

Work from Office

Key Responsibilities : As an Enterprise Data Architect, you will :
- Lead Data Architecture : Design, develop, and implement comprehensive enterprise data architectures, primarily leveraging Azure and Snowflake platforms.
- Data Transformation & ETL : Oversee and guide complex data transformation and ETL processes for large and diverse datasets, ensuring data integrity, quality, and performance.
- Customer-Centric Data Design : Specialize in designing and optimizing customer-centric datasets from sources such as CRM, Call Center, Marketing, Offline, and Point of Sale systems.
- Data Modeling : Drive the creation and maintenance of advanced data models, including Relational, Dimensional, Columnar, and Big Data models, to support analytical and operational needs.
- Query Optimization : Develop, optimize, and troubleshoot complex SQL and NoSQL queries to ensure efficient data retrieval and manipulation.
- Data Warehouse Management : Apply advanced data warehousing concepts to build and manage high-performing, scalable data warehouse solutions.
- Tool Evaluation & Implementation : Evaluate, recommend, and implement industry-leading ETL tools such as Informatica and Unifi, ensuring best practices are followed.
- Business Requirements & Analysis : Lead efforts in business requirements definition and management, structured analysis, process design, and use case documentation to translate business needs into technical specifications.
- Reporting & Analytics Support : Collaborate with reporting teams, providing architectural guidance and support for reporting technologies like Tableau and PowerBI.
- Software Development Practices : Apply professional software development principles and best practices to data solution delivery.
- Stakeholder Collaboration : Interface effectively with sales teams and engage directly with customers to understand their data challenges and lead them to successful outcomes.
- Project Management & Multi-tasking : Demonstrate exceptional organizational skills, with the ability to manage and prioritize multiple simultaneous customer projects effectively.
- Strategic Thinking & Leadership : Act as a self-managed, proactive, and customer-focused leader, driving innovation and continuous improvement in data architecture.

Position Requirements :
- Strong experience with data transformation & ETL on large data sets.
- Experience designing customer-centric datasets (e.g., CRM, Call Center, Marketing, Offline, Point of Sale).
- 5+ years of data modeling experience (Relational, Dimensional, Columnar, Big Data).
- 5+ years of complex SQL or NoSQL experience.
- Extensive experience with advanced data warehouse concepts.
- Proven experience with industry ETL tools (e.g., Informatica, Unifi).
- Solid experience with business requirements definition and management, structured analysis, process design, and use case documentation.
- Experience with reporting technologies (e.g., Tableau, PowerBI).
- Demonstrated experience in professional software development.
- Exceptional organizational skills and the ability to multi-task across simultaneous customer projects.
- Strong verbal & written communication skills to interface with sales teams and lead customers to successful outcomes.
- Must be self-managed, proactive, and customer-focused.

Technical Skills :
- Cloud Platforms : Microsoft Azure
- Data Warehousing : Snowflake
- ETL Methodologies : Extensive experience in ETL processes and tools
- Data Transformation : Large-scale data transformation
- Data Modeling : Relational, Dimensional, Columnar, Big Data
- Query Languages : Complex SQL, NoSQL
- ETL Tools : Informatica, Unifi (or similar enterprise-grade tools)
- Reporting & BI : Tableau, PowerBI

Posted 1 day ago

Apply

5.0 - 8.0 years

9 - 13 Lacs

Surat

Remote

Role & Responsibilities :
- Design, implement, and optimize scalable data warehousing solutions using Snowflake Cortex.
- Develop robust ETL pipelines that ensure data quality, reliability, and efficient integration across platforms.
- Collaborate with cross-functional teams to translate business requirements into innovative cloud data solutions.
- Monitor system performance, analyze query execution, and implement optimization strategies to drive efficiency.
- Troubleshoot and resolve data integration issues, ensuring continuity and minimizing downtime.
- Mentor junior team members on best practices and the latest trends in Snowflake and cloud technologies.

Skills & Qualifications :

Must-Have :
- 5+ years of hands-on experience with Snowflake Cortex and cloud-based data warehousing solutions.
- Proficiency in SQL and extensive experience in building, managing, and optimizing ETL pipelines.
- Proven track record of integrating large-scale data solutions in a dynamic consulting environment.

Preferred :
- Familiarity with major cloud platforms such as AWS, Azure, or Google Cloud.
- Experience working remotely and within agile development frameworks.
- Excellent problem-solving, communication, and team collaboration skills.

Benefits & Culture Highlights :
- Competitive salary paired with performance-based incentives.
- Flexible remote work opportunities fostering a healthy work-life balance.
- A dynamic, inclusive, and collaborative work culture committed to continuous learning and professional growth.

Posted 1 day ago

Apply

6.0 - 8.0 years

8 - 13 Lacs

Ahmedabad

Work from Office

About The Opportunity : Operating at the forefront of the consulting and technology services sector, our client delivers transformative data management and business analytics solutions to a global clientele. Specializing in innovative approaches to data architecture and information management, they empower organizations to make data-driven decisions. We are seeking a seasoned professional to drive the evolution of our data infrastructure. This is an exciting chance to join a high-caliber team that values precision, efficiency, and strategic vision in a fully remote setting from anywhere in India.

Role & Responsibilities : As a key member of our team, you will be instrumental in shaping our data landscape. Your responsibilities will include :
- Designing, developing, and maintaining comprehensive logical and physical data models that meet stringent business and regulatory requirements.
- Collaborating with cross-functional teams, including data engineers, analysts, and business stakeholders, to align data architecture with evolving business needs.
- Translating complex business requirements into robust and scalable data models, ensuring optimal performance and data integrity.
- Optimizing and refining existing data structures and frameworks to enhance data access, reporting, and analytics capabilities.
- Driving governance and data quality initiatives by establishing best practices and documentation standards for data modeling.
- Mentoring and guiding junior team members, fostering a culture of continuous improvement and technical excellence.

Skills & Qualifications :

Must-Have :
- Experience : 6+ years of experience in data modeling and related disciplines.
- Data Modeling : Extensive expertise in conceptual, logical, and physical models, including Star/Snowflake schema design and normalization/denormalization techniques.
- Snowflake : Proven experience with Snowflake schema design, performance tuning, Time Travel, Streams & Tasks, and Secure & Materialized Views.
- SQL & Scripting : Advanced proficiency in SQL (including CTEs and window functions), with a strong focus on automation and optimization.

Key Skills :
- Data Modeling (conceptual, logical, physical)
- Schema Design (Star Schema, Snowflake Schema)
- Optimization & Performance Tuning
- Snowflake (schema design, Time Travel, Streams and Tasks, Secure Views, Materialized Views)
- Advanced SQL (CTEs, Window Functions)
- Normalization & Denormalization
- Automation
- NoSQL (preferred, but not mandatory)

Posted 1 day ago

Apply

5.0 - 8.0 years

9 - 13 Lacs

Pune

Remote

Role & Responsibilities :
- Design, implement, and optimize scalable data warehousing solutions using Snowflake Cortex.
- Develop robust ETL pipelines that ensure data quality, reliability, and efficient integration across platforms.
- Collaborate with cross-functional teams to translate business requirements into innovative cloud data solutions.
- Monitor system performance, analyze query execution, and implement optimization strategies to drive efficiency.
- Troubleshoot and resolve data integration issues, ensuring continuity and minimizing downtime.
- Mentor junior team members on best practices and the latest trends in Snowflake and cloud technologies.

Skills & Qualifications :

Must-Have :
- 5+ years of hands-on experience with Snowflake Cortex and cloud-based data warehousing solutions.
- Proficiency in SQL and extensive experience in building, managing, and optimizing ETL pipelines.
- Proven track record of integrating large-scale data solutions in a dynamic consulting environment.

Preferred :
- Familiarity with major cloud platforms such as AWS, Azure, or Google Cloud.
- Experience working remotely and within agile development frameworks.
- Excellent problem-solving, communication, and team collaboration skills.

Benefits & Culture Highlights :
- Competitive salary paired with performance-based incentives.
- Flexible remote work opportunities fostering a healthy work-life balance.
- A dynamic, inclusive, and collaborative work culture committed to continuous learning and professional growth.

Posted 1 day ago

Apply

10.0 - 12.0 years

20 - 25 Lacs

Pune

Work from Office

Key Responsibilities : As an Enterprise Data Architect, you will : - Lead Data Architecture : Design, develop, and implement comprehensive enterprise data architectures, primarily leveraging Azure and Snowflake platforms. - Data Transformation & ETL : Oversee and guide complex data transformation and ETL processes for large and diverse datasets, ensuring data integrity, quality, and performance. - Customer-Centric Data Design : Specialize in designing and optimizing customer-centric datasets from various sources, including CRM, Call Center, Marketing, Offline, and Point of Sale systems. - Data Modeling : Drive the creation and maintenance of advanced data models, including Relational, Dimensional, Columnar, and Big Data models, to support analytical and operational needs. - Query Optimization : Develop, optimize, and troubleshoot complex SQL and NoSQL queries to ensure efficient data retrieval and manipulation. - Data Warehouse Management : Apply advanced data warehousing concepts to build and manage high-performing, scalable data warehouse solutions. - Tool Evaluation & Implementation : Evaluate, recommend, and implement industry-leading ETL tools such as Informatica and Unifi, ensuring best practices are followed. - Business Requirements & Analysis : Lead efforts in business requirements definition and management, structured analysis, process design, and use case documentation to translate business needs into technical specifications. - Reporting & Analytics Support : Collaborate with reporting teams, providing architectural guidance and support for reporting technologies like Tableau and PowerBI. - Software Development Practices : Apply professional software development principles and best practices to data solution delivery. - Stakeholder Collaboration : Interface effectively with sales teams and directly engage with customers to understand their data challenges and lead them to successful outcomes. - Project Management & Multi-tasking : Demonstrate exceptional organizational skills, with the ability to manage and prioritize multiple simultaneous customer projects effectively. - Strategic Thinking & Leadership : Act as a self-managed, proactive, and customer-focused leader, driving innovation and continuous improvement in data architecture. Position Requirements : of strong experience with data transformation & ETL on large data sets. - Experience with designing customer-centric datasets (i.e., CRM, Call Center, Marketing, Offline, Point of Sale, etc.). - 5+ years of Data Modeling experience (i.e., Relational, Dimensional, Columnar, Big Data). - 5+ years of complex SQL or NoSQL experience. - Extensive experience in advanced Data Warehouse concepts. - Proven experience with industry ETL tools (i.e., Informatica, Unifi). - Solid experience with Business Requirements definition and management, structured analysis, process design, and use case documentation. - Experience with Reporting Technologies (i.e., Tableau, PowerBI). - Demonstrated experience in professional software development. - Exceptional organizational skills and ability to multi-task simultaneous different customer projects. - Strong verbal & written communication skills to interface with sales teams and lead customers to successful outcomes. - Must be self-managed, proactive, and customer-focused. 
Technical Skills :
- Cloud Platforms : Microsoft Azure
- Data Warehousing : Snowflake
- ETL Methodologies : Extensive experience in ETL processes and tools
- Data Transformation : Large-scale data transformation
- Data Modeling : Relational, Dimensional, Columnar, Big Data
- Query Languages : Complex SQL, NoSQL
- ETL Tools : Informatica, Unifi (or similar enterprise-grade tools)
- Reporting & BI : Tableau, PowerBI
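For illustration : the dimensional-modeling and SQL skills this posting names typically come together in star-schema queries. A minimal sketch, assuming hypothetical fact and dimension tables (fact_sales, dim_date, dim_customer) :

    -- Aggregate a fact table across two conformed dimensions.
    SELECT d.calendar_month,
           c.customer_segment,
           SUM(f.sales_amount) AS monthly_sales
    FROM fact_sales f
    JOIN dim_date d     ON f.date_key = d.date_key
    JOIN dim_customer c ON f.customer_key = c.customer_key
    GROUP BY d.calendar_month, c.customer_segment
    ORDER BY d.calendar_month;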

Posted 1 day ago

Apply

6.0 - 8.0 years

8 - 13 Lacs

kolkata

Work from Office

About The Opportunity : Operating at the forefront of the consulting and technology services sector, our client delivers transformative data management and business analytics solutions to a global clientele. Specializing in innovative approaches to data architecture and information management, they empower organizations to make data-driven decisions. We are seeking a seasoned professional to drive the evolution of our data infrastructure. This is an exciting chance to join a high-caliber team that values precision, efficiency, and strategic vision in a fully remote setting from anywhere in India.

Role & Responsibilities : As a key member of our team, you will be instrumental in shaping our data landscape. Your responsibilities will include :
- Designing, developing, and maintaining comprehensive logical and physical data models that meet stringent business and regulatory requirements.
- Collaborating with cross-functional teams, including data engineers, analysts, and business stakeholders, to align data architecture with evolving business needs.
- Translating complex business requirements into robust and scalable data models, ensuring optimal performance and data integrity.
- Optimizing and refining existing data structures and frameworks to enhance data access, reporting, and analytics capabilities.
- Driving governance and data quality initiatives by establishing best practices and documentation standards for data modeling.
- Mentoring and guiding junior team members, fostering a culture of continuous improvement and technical excellence.

Skills & Qualifications :
Must-Have :
- Experience : 6+ years of experience in data modeling and related disciplines.
- Data Modeling : Extensive expertise in conceptual, logical, and physical models, including Star/Snowflake schema design, normalization, and denormalization techniques.
- Snowflake : Proven experience with Snowflake schema design, performance tuning, Time Travel, Streams & Tasks, and Secure & Materialized Views (illustrated in the sketch after this posting).
- SQL & Scripting : Advanced proficiency in SQL (including CTEs and window functions), with a strong focus on automation and optimization.

Key Skills :
- Data Modeling (conceptual, logical, physical)
- Schema Design (Star Schema, Snowflake Schema)
- Optimization & Performance Tuning
- Snowflake (schema design, Time Travel, Streams and Tasks, Secure Views, Materialized Views)
- Advanced SQL (CTEs, Window Functions)
- Normalization & Denormalization
- Automation
- NoSQL (preferred, but not mandatory)
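For illustration : a minimal sketch of the Snowflake features and advanced SQL this posting names. All object names (orders, orders_stream, orders_history, transform_wh) are hypothetical.

    -- Capture change records on a source table with a stream.
    CREATE OR REPLACE STREAM orders_stream ON TABLE orders;

    -- A scheduled task that consumes the stream only when it has data.
    CREATE OR REPLACE TASK load_orders_history
      WAREHOUSE = transform_wh
      SCHEDULE  = '60 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
    AS
      INSERT INTO orders_history (order_id, customer_id, order_total, change_type)
      SELECT order_id, customer_id, order_total, METADATA$ACTION
      FROM orders_stream;

    ALTER TASK load_orders_history RESUME;

    -- Time Travel : query the table as it looked one hour ago.
    SELECT * FROM orders AT (OFFSET => -3600);

    -- Advanced SQL : a CTE plus a window function to rank customers by spend.
    WITH totals AS (
      SELECT customer_id, SUM(order_total) AS total_spend
      FROM orders
      GROUP BY customer_id
    )
    SELECT customer_id,
           total_spend,
           RANK() OVER (ORDER BY total_spend DESC) AS spend_rank
    FROM totals;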

Posted 1 day ago

Apply

7.0 - 10.0 years

5 - 8 Lacs

chennai

Work from Office

Job Title : Data Engineer / Data Modeler.
Location : Remote (India).
Employment Type : Contract (Remote).
Experience Required : 7+ Years.

Job Summary : We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities :
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance.

Must-Have Skills :
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development (see the model sketch after this posting).
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience in reverse engineering legacy systems like Sailfish or DDMS using DBeaver.

Good To Have :
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills :
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven with the ability to work independently in a remote setup.
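For illustration : a minimal DBT model sketch of the kind this posting describes, targeting Snowflake. Source and column names (raw.orders, order_id, _loaded_at) are hypothetical.

    -- models/stg_orders.sql : dbt compiles this SELECT into a Snowflake table,
    -- merging on order_id during incremental runs.
    {{ config(materialized='incremental', unique_key='order_id') }}

    select
        order_id,
        customer_id,
        order_total,
        _loaded_at
    from {{ source('raw', 'orders') }}

    {% if is_incremental() %}
      -- On incremental runs, only pull rows newer than what the target holds.
      where _loaded_at > (select max(_loaded_at) from {{ this }})
    {% endif %}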

Posted 1 day ago

Apply

5.0 - 8.0 years

4 - 8 Lacs

bengaluru

Work from Office

About The Opportunity : Join a pioneering consulting firm in the Data Analytics and Cloud Solutions sector, where transformative data architectures empower global enterprises. We specialize in leveraging cutting-edge Snowflake technologies and innovative cloud solutions to drive real-time insights and business intelligence. This remote role, based in India, offers the opportunity to work on high-impact projects while collaborating with a diverse team of experts.

Role & Responsibilities :
- Design, implement, and optimize scalable data warehousing solutions using Snowflake Cortex (a brief Cortex SQL sketch follows this posting).
- Develop robust ETL pipelines that ensure data quality, reliability, and efficient integration across platforms.
- Collaborate with cross-functional teams to translate business requirements into innovative cloud data solutions.
- Monitor system performance, analyze query execution, and implement optimization strategies to drive efficiency.
- Troubleshoot and resolve data integration issues, ensuring continuity and minimizing downtime.
- Mentor junior team members on best practices and the latest trends in Snowflake and cloud technologies.

Skills & Qualifications :
Must-Have :
- 5+ years of hands-on experience with Snowflake Cortex and cloud-based data warehousing solutions.
- Proficiency in SQL and extensive experience in building, managing, and optimizing ETL pipelines.
- Proven track record of integrating large-scale data solutions in a dynamic consulting environment.
Preferred :
- Familiarity with major cloud platforms such as AWS, Azure, or Google Cloud.
- Experience working remotely and within agile development frameworks.
- Excellent problem-solving, communication, and team collaboration skills.

Benefits & Culture Highlights :
- Competitive salary paired with performance-based incentives.
- Flexible remote work opportunities fostering a healthy work-life balance.
- A dynamic, inclusive, and collaborative work culture committed to continuous learning and professional growth.
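For illustration : Snowflake Cortex exposes LLM functions callable directly from SQL. A minimal sketch, assuming a hypothetical support_tickets table; model availability varies by region.

    -- Score sentiment and summarize free text inside the warehouse.
    SELECT ticket_id,
           SNOWFLAKE.CORTEX.SENTIMENT(body) AS sentiment_score,  -- roughly -1 to 1
           SNOWFLAKE.CORTEX.SUMMARIZE(body) AS summary
    FROM support_tickets;

    -- Prompt a hosted model with COMPLETE for ad hoc classification.
    SELECT SNOWFLAKE.CORTEX.COMPLETE(
             'mistral-large',
             'Classify this ticket as billing, technical, or other: ' || body)
    FROM support_tickets
    LIMIT 10;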

Posted 1 day ago

Apply

7.0 - 10.0 years

5 - 8 Lacs

hyderabad

Remote

Employment Type : Contract (Remote).
Experience Required : 7+ Years.

Job Summary : We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities :
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance.

Must-Have Skills :
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience in reverse engineering legacy systems like Sailfish or DDMS using DBeaver.

Good To Have :
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills :
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven with the ability to work independently in a remote setup.

Posted 1 day ago

Apply