
2244 Snowflake Jobs - Page 42

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 10.0 years

9 - 15 Lacs

Bengaluru

Work from Office

JD: Required skills and responsibilities:
- Administer and manage Snowflake environments: oversee user access, security, and performance tuning.
- Develop and optimize SQL queries: create and refine complex SQL queries for data extraction, transformation, and loading (ETL) processes.
- Implement and maintain data pipelines: build pipelines in Python and integrate them with Snowflake.
- Monitor and troubleshoot: ensure the smooth operation of Snowflake environments, identifying and resolving issues promptly.
- Collaborate with data engineers: work closely with data engineers to provide optimized solutions and best practices.
- Review role hierarchy: provide recommendations for best practices in role hierarchy and security.
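To make the role-hierarchy duty concrete, here is a minimal illustrative sketch of Snowflake role administration using the snowflake-connector-python package; the account, warehouse, database, and role names are hypothetical placeholders, not details from the posting.

```python
# Illustrative Snowflake role-hierarchy administration via
# snowflake-connector-python. All identifiers below (account,
# roles, warehouse, database) are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # hypothetical account identifier
    user="admin_user",
    password="***",
    role="SECURITYADMIN",   # role that can create roles and grant privileges
)
cur = conn.cursor()

# Create an access role and a functional role.
cur.execute("CREATE ROLE IF NOT EXISTS REPORTING_READ")
cur.execute("CREATE ROLE IF NOT EXISTS ANALYST_ROLE")

# Grant object access to the access role...
cur.execute("GRANT USAGE ON WAREHOUSE REPORTING_WH TO ROLE REPORTING_READ")
cur.execute("GRANT USAGE ON DATABASE REPORTING_DB TO ROLE REPORTING_READ")
cur.execute("GRANT USAGE ON SCHEMA REPORTING_DB.PUBLIC TO ROLE REPORTING_READ")
cur.execute(
    "GRANT SELECT ON ALL TABLES IN SCHEMA REPORTING_DB.PUBLIC TO ROLE REPORTING_READ"
)

# ...then roll the access role up into the functional role, so analysts
# inherit read access through the hierarchy rather than via direct grants.
cur.execute("GRANT ROLE REPORTING_READ TO ROLE ANALYST_ROLE")
conn.close()
```

Layering access roles under functional roles this way is a common Snowflake pattern for keeping the role hierarchy auditable.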

Posted 3 weeks ago

Apply

5.0 - 7.0 years

12 - 22 Lacs

Hyderabad

Work from Office

Reporting To: CTO
Experience: 7+ years
Work Location: Hyderabad, India

We are looking for an experienced and dynamic Analytics Team Lead to manage our analytics team. As the Analytics Team Lead, you will be responsible for managing a team of analysts, collaborating with cross-functional departments, and using data to provide insights that advance organizational objectives.

Requirements:
- Bachelor's or Master's degree in Computer Science or a related discipline.
- Minimum of 5-7 years of progressive experience in analytics or data science roles.
- At least 2-3 years of experience in a team lead position within an analytics team.
- Demonstrated success managing and leading a team of 20 analysts to deliver impactful insights.
- Experience working with large datasets and database management systems.
- Strong project management skills, with the ability to manage the planning, execution, and delivery of analytics projects.
- Excellent communication skills to convey complex data insights to both technical and non-technical stakeholders.
- Proficiency in customer management strategies and techniques.
- Expertise in Snowflake and AWS is a plus.
- Familiarity with industry best practices and emerging trends in analytics and data science.

Posted 3 weeks ago

Apply

6.0 - 8.0 years

22 - 25 Lacs

Bengaluru

Work from Office

We are looking for an energetic, self-motivated, and exceptional Data Engineer to work on extraordinary enterprise products based on AI and Big Data engineering, leveraging the AWS/Databricks tech stack. He/she will work with a star team of Architects, Data Scientists/AI Specialists, Data Engineers, and Integration specialists.

Skills and Qualifications:
- 5+ years of experience in the DWH/ETL domain; Databricks/AWS tech stack.
- 2+ years of experience building data pipelines with Databricks/PySpark/SQL.
- Experience writing and interpreting SQL queries, and designing data models and data standards.
- Experience with SQL Server, Oracle, and/or cloud databases.
- Experience in data warehousing and data marts, including Star and Snowflake models.
- Experience loading data into databases from databases and files.
- Experience analyzing and drawing design conclusions from data profiling results.
- Understanding of business processes and the relationships between systems and applications.
- Must be comfortable conversing with end-users.
- Must be able to manage multiple projects/clients simultaneously.
- Excellent analytical, verbal, and communication skills.

Role and Responsibilities:
- Work with business stakeholders to build data solutions that address analytical and reporting requirements.
- Work with application developers and business analysts to implement and optimize Databricks/AWS-based implementations meeting data requirements.
- Design, develop, and optimize data pipelines using Databricks (Delta Lake, Spark SQL, PySpark), AWS Glue, and Apache Airflow (see the PySpark sketch after this list).
- Implement and manage ETL workflows using Databricks notebooks, PySpark, and AWS Glue for efficient data transformation.
- Develop and optimize SQL scripts, queries, views, and stored procedures to enhance data models and improve query performance on managed databases.
- Conduct root cause analysis and resolve production problems and data issues.
- Create and maintain up-to-date documentation of the data model, data flow, and field-level mappings.
- Provide support for production problems and daily batch processing.
- Provide ongoing maintenance and optimization of database schemas, data lake structures (Delta tables, Parquet), and views to ensure data integrity and performance.

Immediate joiners.
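As a rough illustration of the pipeline work described above, here is a minimal PySpark sketch of one Databricks-style step: read raw files, cleanse them, and write a partitioned Delta table. The paths and column names are hypothetical, not taken from the posting.

```python
# Illustrative Databricks/PySpark pipeline step: raw JSON -> Delta.
# Paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Ingest raw JSON files landed in the data lake.
raw = spark.read.json("s3://example-lake/raw/orders/")

# Basic cleansing and enrichment.
orders = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount") > 0)
)

# Write a Delta table, partitioned by date for downstream query performance.
(orders.write.format("delta")
       .mode("overwrite")
       .partitionBy("order_date")
       .save("s3://example-lake/curated/orders/"))
```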

Posted 3 weeks ago

Apply

5.0 - 10.0 years

19 - 25 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Role & responsibilities: Snowflake, DBT, SQL, Python, PySpark, AWS.

Posted 3 weeks ago

Apply

10.0 - 15.0 years

14 - 19 Lacs

Chennai

Work from Office

Job Summary: We are seeking a highly motivated and experienced Delivery Partner to lead and oversee the successful delivery of analytics projects. The ideal candidate will possess a strong background in data analytics, excellent project management skills, and a proven ability to collaborate effectively with cross-functional teams. You will be responsible for ensuring projects are delivered on time, within budget, and to the required quality standards. This role requires a detail-oriented individual with excellent communication and problem-solving skills, and a passion for leveraging data to drive business value.

Responsibilities:
- Project Leadership: Lead and manage the full lifecycle of analytics projects, from initiation to deployment.
- Stakeholder Management: Collaborate with business stakeholders, product owners, and engineering teams to define project scope, requirements, and deliverables.
- Technical Guidance: Provide technical leadership and guidance to the development team, ensuring adherence to architectural standards and best practices.
- Data Architecture & Design: Contribute to data architecture design, data modeling (logical and physical), and the development of data integration strategies.
- Risk Management: Identify, assess, and mitigate project risks and issues, escalating as necessary.
- Resource Management: Manage project resources, including developers, data engineers, and business analysts, ensuring optimal allocation and utilization.
- Quality Assurance: Implement and maintain quality assurance processes to ensure data accuracy, integrity, and reliability.
- Communication & Reporting: Provide regular project status updates to stakeholders, including progress, risks, and issues.
- Process Improvement: Identify and implement process improvements to enhance project delivery efficiency and effectiveness.
- Vendor Management: Manage relationships with external vendors and partners, ensuring deliverables meet expectations and contractual obligations.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 15+ years of experience in data engineering, data warehousing, business intelligence, or a related area.
- Proven experience managing complex data-related projects using Agile methodologies.
- Strong understanding of data warehousing concepts, data modeling techniques, and ETL processes.
- Hands-on experience with cloud platforms such as AWS, Azure, or Snowflake.
- Proficiency in SQL and experience with databases such as Teradata, Oracle, and SQL Server.
- Experience with data visualization tools such as Tableau or Power BI.
- Excellent communication, interpersonal, and leadership skills.
- Ability to work effectively in a matrix environment.
- Strong problem-solving and analytical skills.
- Experience with project management tools such as JIRA, MS Project, and Confluence.

Posted 3 weeks ago

Apply

8.0 - 10.0 years

3 - 6 Lacs

Chennai

Work from Office

About the Role: Senior Business Intelligence Analyst

The Business Intelligence Analyst is responsible for collecting and analyzing data from multiple source systems to help the organization make better business decisions. This role is crucial in maintaining data quality, compliance, and accessibility while driving data-driven decision-making and reporting for Mindsprint clients. The role requires a combination of OLAM business domain expertise, problem-solving skills, and business acumen.

Responsibilities:
- Create, review, validate, and manage data as it is collected; act as custodian of the data being generated.
- Develop policies and procedures for the collection and analysis of data.
- Apply analytical skills to derive meaningful insights from data; generate predictive and insightful reports.
- Build daily reports and schedule internal weekly and monthly meetings, preparing in advance to share relevant and beneficial information.
- Data Ownership: Assume ownership of specific datasets, data dictionaries, metadata, and master data, and ensure data accuracy, completeness, and relevance.
- Data Integration: Collaborate with system owners, data engineers, domain experts, and integration teams to facilitate the smooth integration of financial data from multiple systems/entities into the financial transactional and analytical data marts.
- Data Quality Assurance: Establish and enforce data quality standards and policies within the financial domain. Collaborate with data engineers, analytics teams, data stewards, and data custodians to monitor and improve data quality.
- Data Access Control: Control and manage access to data, ensuring appropriate permissions and security measures are in place. Monitor and audit data access to prevent unauthorized use.
- Data Reporting and Analysis: Collaborate with finance teams to generate accurate and timely financial reports. Perform data analysis to identify trends, anomalies, and insights in financial data, supporting financial modelling, forecasting, and predictive decision-making.
- Collaborate with co-workers and management to implement improvements.

Job Qualifications:
- Master's/Bachelor's degree in finance and accounting or related fields. An advanced degree is a plus.
- Proven experience in financial data management, data governance, and data analysis.
- Demonstrated ability to approach complex problems with analytical and critical thinking skills.
- Excellent written and verbal communication skills.
- Leadership skills and the ability to collaborate effectively with cross-functional teams.
- Ability to influence and interact with senior management.

Preferred Qualifications & Skills:
- Knowledge of Big Data, Data Lake, Azure Data Factory (ADF), Snowflake, Databricks, Synapse, MonteCarlo, Atlin, and DevOps tools like DBT.
- Agile project management skills, with knowledge of JIRA & Confluence.
- Good understanding of financial concepts such as Balance Sheet, P&L, TB, direct costs management, fair value, book value, production/standard costs, stock valuations, ratios, and sustainability finance.
- Experience working with ERP data, especially SAP FI and SAP CO.
- Strategic mindset and the ability to identify opportunities to use data to drive business growth; able to think creatively and identify innovative solutions to complex problems.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Pune

Hybrid

Responsibilities:
- Administer and manage Snowflake environments: oversee user access, security, and performance tuning.
- Develop and optimize SQL queries: create and refine complex SQL queries for data extraction, transformation, and loading (ETL) processes.
- Implement and maintain data pipelines: build pipelines in Python and integrate them with Snowflake.
- Monitor and troubleshoot: ensure the smooth operation of Snowflake environments, identifying and resolving issues promptly (see the monitoring sketch after this list).
- Collaborate with data engineers: work closely with data engineers to provide optimized solutions and best practices.
- Review role hierarchy: provide recommendations for best practices in role hierarchy and security.

Requirements:
- Experience: Minimum of 3 years as a Snowflake Administrator, with a total of 5+ years in database administration or data engineering.
- Technical Skills: Proficiency in SQL and Python, with experience in performance tuning and optimization.
- Cloud Services: Experience with cloud platforms such as Azure.
- Data Warehousing: Strong understanding of data warehousing concepts and ETL processes.
- Problem-Solving: Excellent analytical and problem-solving skills.
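As an illustration of the monitoring duty, here is a minimal sketch that surfaces the slowest queries of the past day from Snowflake's ACCOUNT_USAGE.QUERY_HISTORY view via snowflake-connector-python; the connection parameters are hypothetical placeholders.

```python
# Illustrative Snowflake monitoring: list yesterday's slowest queries
# as tuning candidates. Connection parameters are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="admin_user", password="***",
    role="ACCOUNTADMIN",  # ACCOUNT_USAGE views need elevated access
)
cur = conn.cursor()
cur.execute("""
    SELECT query_id,
           user_name,
           warehouse_name,
           total_elapsed_time / 1000 AS elapsed_s
    FROM snowflake.account_usage.query_history
    WHERE start_time >= DATEADD('day', -1, CURRENT_TIMESTAMP())
    ORDER BY total_elapsed_time DESC
    LIMIT 10
""")
for query_id, user_name, warehouse_name, elapsed_s in cur.fetchall():
    print(f"{query_id} ({user_name} on {warehouse_name}): {elapsed_s:.1f}s")
conn.close()
```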

Posted 3 weeks ago

Apply

2.0 - 3.0 years

10 - 15 Lacs

Bengaluru

Work from Office

Design, develop, and optimize SQL queries for PostgreSQL, MySQL, and Snowflake. Build and maintain ETL pipelines using Python for data processing and migration. Proficient in PostgreSQL and MySQL for database management and optimization.
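For illustration, a minimal sketch of the kind of Python ETL step this posting describes: extract rows from PostgreSQL with psycopg2 and load them into Snowflake with the official connector. Hosts, tables, columns, and credentials are hypothetical placeholders.

```python
# Illustrative Postgres -> Snowflake ETL step. All names below are
# hypothetical placeholders.
import psycopg2
import snowflake.connector

# Extract from PostgreSQL.
pg = psycopg2.connect(host="pg.example.com", dbname="app",
                      user="etl", password="***")
with pg.cursor() as cur:
    cur.execute(
        "SELECT id, email, created_at FROM users WHERE created_at >= %s",
        ("2024-01-01",),
    )
    rows = cur.fetchall()
pg.close()

# Load into Snowflake as one batched insert.
sf = snowflake.connector.connect(
    account="my_account", user="etl", password="***",
    warehouse="ETL_WH", database="RAW", schema="PUBLIC",
)
cur = sf.cursor()
cur.executemany(
    "INSERT INTO users_stg (id, email, created_at) VALUES (%s, %s, %s)",
    rows,
)
sf.close()
```

In production this step would normally be made idempotent (e.g., MERGE into the target or truncate-and-reload a staging table) and scheduled by an orchestrator.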

Posted 3 weeks ago

Apply

10.0 - 20.0 years

20 - 35 Lacs

Bengaluru

Remote

As the data engineering consultant, you should have the common traits and capabilities listed under Essential Requirements and meet many of the capabilities listed under Desirable Requirements.

Essential Requirements and Skills:
- 10+ years working with customers in the Data Analytics, Big Data, and Data Warehousing field.
- 10+ years working with data modeling tools.
- 5+ years building data pipelines for large customers.
- 2+ years of experience working in the field of Artificial Intelligence that leverages Big Data, in a customer-facing services delivery role.
- 3+ years of experience in Big Data database design.
- A good understanding of LLMs, prompt engineering, fine-tuning, and training.
- Strong knowledge of SQL, NoSQL, and vector databases. Experience with popular enterprise databases such as SQL Server, MySQL, Postgres, and Redis is a must. Additionally, experience with popular vector databases such as PGVector, Milvus, and Elasticsearch is a requirement (see the PGVector sketch after this list).
- Experience with major data warehousing providers such as Teradata.
- Experience with data lake tools such as Databricks, Snowflake, and Starburst.
- Proven experience building data pipelines and ETLs for both data transformation and extraction from multiple data sources, and with automating the deployment and execution of those pipelines.
- Experience with tools such as Apache Spark, Apache Hadoop, Informatica, and similar data processing tools.
- Proficient knowledge of Python and SQL is a must.
- Proven experience building test procedures, ensuring the quality, reliability, performance, and scalability of data pipelines.
- Ability to develop applications that expose RESTful APIs for data querying and ingestion.
- Experience preparing training data for Large Language Model ingestion and training (e.g., through vector databases).
- Experience integrating with RAG solutions and leveraging related tools such as Nvidia Guardrails; ability to define and implement metrics for RAG solutions.
- Understanding of the typical AI tooling ecosystem, including knowledge and experience of Kubernetes, MLOps, LLMOps, and AIOps tools.
- Ability to gain customer trust; ability to plan, organize, and drive customer workshops.
- Good communication skills in English are a must.
- The ability to work in a highly efficient team using an Agile methodology such as Scrum or Kanban.
- Ability to hold extended pairing sessions with customers, enabling knowledge transfer in complex domains.
- Ability to influence and interact with confidence and credibility at all levels within the Dell Technologies companies and with our customers, partners, and vendors.
- Experience working on project teams within a defined methodology while adhering to margin, planning, and SOW requirements.
- Ability to be onsite during customer workshops and enablement sessions.

Desirable Requirements and Skills:
- Knowledge of widespread AI studios and AI workbenches is a plus.
- Experience building and using Information Retrieval (IR) frameworks to support LLM inferencing.
- Working knowledge of Linux is a plus.
- Knowledge of MinIO is appreciated.
- Experience using Lean and iterative deployment methodologies.
- Working knowledge of cloud technologies is a plus.
- A university degree aligned to data engineering is a plus.
- Relevant industry certifications, e.g., Databricks Certified Data Engineer, Microsoft certifications, etc.
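Since the posting calls out PGVector among the required vector databases, here is a minimal similarity-search sketch with psycopg2; the table, embedding dimension, and query vector are hypothetical placeholders, and in a real RAG pipeline the query vector would come from an embedding model.

```python
# Illustrative PGVector nearest-neighbour search. Table name,
# embedding size, and the query vector are hypothetical.
import psycopg2

conn = psycopg2.connect(host="pg.example.com", dbname="rag",
                        user="app", password="***")
cur = conn.cursor()

# One-time setup: enable the extension and create a chunk table.
cur.execute("CREATE EXTENSION IF NOT EXISTS vector")
cur.execute("""
    CREATE TABLE IF NOT EXISTS docs (
        id bigserial PRIMARY KEY,
        chunk text,
        embedding vector(384)
    )
""")

# Retrieve the 5 chunks nearest to a query embedding (L2 distance).
query_embedding = [0.01] * 384  # placeholder; normally from an embedding model
cur.execute(
    "SELECT chunk FROM docs ORDER BY embedding <-> %s::vector LIMIT 5",
    (str(query_embedding),),
)
for (chunk,) in cur.fetchall():
    print(chunk)
conn.close()
```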

Posted 3 weeks ago

Apply

5.0 - 8.0 years

20 - 25 Lacs

Pune, Gurugram

Hybrid

Key Responsibilities:
- Design and develop ETL/ELT pipelines using Azure Data Factory, Snowflake, and DBT (see the DBT orchestration sketch after this list).
- Build and maintain data integration workflows from various data sources to Snowflake.
- Write efficient and optimized SQL queries for data extraction and transformation.

Must-Have Skills:
- Strong experience with Azure Cloud Platform services.
- Proven expertise in Azure Data Factory (ADF) for orchestrating and automating data pipelines.
- Proficiency in SQL for data analysis and transformation.
- Hands-on experience with Snowflake and SnowSQL for data warehousing.
- Practical knowledge of DBT (Data Build Tool) for transforming data in the warehouse.
- Experience working in cloud-based data environments with large-scale datasets.

Good-to-Have Skills:
- Experience with Azure Data Lake, Azure Synapse, or Azure Functions.
- Familiarity with Python or PySpark for custom data transformations.
- Knowledge of business intelligence tools (e.g., Power BI, Tableau) is a plus.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
- 5+ years of experience in data engineering roles using Azure and Snowflake.
- Strong problem-solving, communication, and collaboration skills.
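For the DBT side of such a pipeline, here is a minimal sketch of invoking dbt programmatically from an orchestration script using the dbtRunner API that ships with dbt-core 1.5+; the selector and target names are hypothetical placeholders.

```python
# Illustrative programmatic dbt invocation (dbt-core >= 1.5),
# equivalent to `dbt run --select staging --target prod`.
# The selector and target are hypothetical placeholders.
from dbt.cli.main import dbtRunner, dbtRunnerResult

dbt = dbtRunner()
res: dbtRunnerResult = dbt.invoke(
    ["run", "--select", "staging", "--target", "prod"]
)

if not res.success:
    # Surface the failure to the orchestrator (e.g., an ADF activity).
    raise RuntimeError(f"dbt run failed: {res.exception}")
```

In an ADF-centric setup the same command is often triggered from a pipeline activity instead, with DBT handling in-warehouse transformations after ingestion.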

Posted 3 weeks ago

Apply

6.0 - 11.0 years

4 - 7 Lacs

Ghaziabad

Remote

Dear Candidate,

We are looking for a Data Engineer trainer with Databricks & Snowflake on a part-time basis who can provide training to our US students. Please find the job description below for your reference. If it seems a good fit for you, please reply to us with your updated resume.

Job Summary: We are looking for a skilled and experienced Data Engineer Trainer to join our team! In this role, you will deliver training content to our US-based students in Data Engineering with Snowflake & Databricks. You will have an opportunity to combine a passion for teaching with enthusiasm for technology to drive learning and establish positive customer relationships. You should have excellent communication skills and proven technology training experience.

Key job responsibilities: In this role, you will be at the heart of the world-class programs delivered by Synergistic Compusoft Pvt Ltd. Your job responsibilities will include:
1. Training working professionals on in-demand skills like Databricks, Snowflake, and Azure Data Lake.
2. Proficiency in basic and advanced SQL programming concepts (procedures, analytical functions, etc.).
3. Good knowledge and understanding of data warehouse concepts.
4. Experience designing and implementing modern data platforms (Data Fabrics, Data Mesh, Data Hubs, etc.).
5. Experience with the design of data catalogs/dictionaries driven by active metadata.
6. Delivering highly interactive online lectures in line with Synergistic Compusoft's teaching methodology.
7. Developing cutting-edge, innovative content that helps classes be delivered in an interesting way.
8. Strong programming skills in languages such as SQL, Python, or Scala.
9. Knowledge of data integration patterns, data lakes, and data warehouses.
10. Experience with data quality, data governance, and data security best practices.

Note: Trainers must prepare our students to earn a global Azure certification and deliver content accordingly. Excellent communication and collaboration skills expected.

Primary Skills: Databricks, Snowflake. Secondary Skills: ADF, Databricks, Python.

Perks and Benefits:
- Remuneration: best in the industry (55-60k per month).
- 5 days working (Mon-Fri). For part time: 2.5 to 3 hours, remote (night shift, 10:30 PM onwards).
- The curriculum and syllabus should be provided by the trainer and should align with the Azure certification requirements.
- The duration of a single batch depends on the trainer, but it cannot exceed 3 months.

Company website: www.synergisticit.com
Company's LinkedIn profile: https://www.linkedin.com/redir/redirect?url=https%3A%2F%2Fsynergisticit%2Ecom%2F&urlhash=rKyX&trk=about_website

Posted 3 weeks ago

Apply

7.0 - 12.0 years

18 - 27 Lacs

Hyderabad

Work from Office

SnowFlake Data Engineering (SnowFlake, DBT & ADF) Lead Programmer Analyst (Experience: 7 to 12 years)

We are looking for a highly self-motivated SnowFlake Data Engineering (SnowFlake, DBT & ADF) Lead Programmer Analyst with:
- At least 5+ years of experience designing and developing data pipelines and assets.
- At least 5 years of experience with a columnar MPP cloud data warehouse (Snowflake/Azure Synapse/Redshift).
- 4 years of experience with ETL tools such as Azure Data Factory and Fivetran/DBT.
- Experience with Git and Azure DevOps.
- Experience with Agile, Jira, and Confluence.
- A solid understanding of programming SQL objects (procedures, triggers, views, functions) in SQL Server; experience optimizing SQL queries is a plus.
- Working knowledge of Azure architecture and Data Lake.
- Willingness to contribute to documentation (e.g., mapping, defect logs), and the ability to generate functional specs for code migration or ask the right questions thereof.
- Hands-on programming skills with a thorough understanding of performance-tuning techniques, including handling large data volume transformations (on the order of 100 GB monthly).
- Ability to create solutions/data flows to suit requirements and produce timely documentation (e.g., mapping, UTR, defect/KEDB logs).
- A self-starter and learner, able to understand and probe for requirements.

Tech experience expected. Primary: Snowflake, DBT (development & testing). Secondary: Python, ETL or any data processing tool. Nice to have: domain experience in healthcare. Should have good oral and written communication, be a good team player, and be proactive and adaptive.

Posted 3 weeks ago

Apply

12.0 - 22.0 years

20 - 35 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Python, Spark, AWS, RAG, Azure, ETL, Snowflake, data warehousing, Databricks, Ab Initio, Tableau, SQL, and NoSQL. Strong problem-solving skills and a good grasp of overall project architecture; hands-on coding experience is mandatory.

Posted 3 weeks ago

Apply

4.0 - 9.0 years

15 - 30 Lacs

Kochi

Work from Office

About Neudesic: Passion for technology drives us, but it's innovation that defines us. From design to development and support to management, Neudesic offers decades of experience, proven frameworks, and a disciplined approach to quickly deliver reliable, quality solutions that help our customers go to market faster. What sets us apart is an amazing collection of people who live and lead with our core values. We believe that everyone should be Passionate about what they do, Disciplined to the core, Innovative by nature, committed to a Team, and conduct themselves with Integrity. If these attributes mean something to you, we'd like to hear from you.

We are currently looking for Azure Data Engineers to join Neudesic's Data & AI team.

Must-Have Skills:
- Prior experience in ETL, data pipelines, and data flow techniques using Azure Data Services.
- Working experience in Python, Scala, PySpark, Azure Data Factory, Azure Data Lake Gen2, Databricks, Azure Synapse, and file formats like JSON & Parquet.
- Experience creating ADF pipelines to source and process data sets.
- Experience creating Databricks notebooks to cleanse, transform, and enrich data sets.
- Good understanding of SQL, databases, NoSQL DBs, data warehouses, Hadoop, and various data storage options on the cloud.
- Development experience in orchestration of pipelines.
- Experience with deployment and monitoring techniques.
- Working experience with Azure DevOps CI/CD pipelines to deploy Azure resources.
- Experience handling operations/integration with a source repository.
- Good knowledge of data warehouse concepts and data warehouse modelling.

Good-to-Have Skills:
- Familiarity with DevOps, Agile Scrum methodologies, and CI/CD.
- Domain-driven development exposure.
- Analytical/problem-solving skills and strong communication skills.
- Good experience with unit, integration, and UAT support.
- Able to design and code reusable components and functions, and to review designs and code, providing review comments with justification.
- Zeal to learn and adopt new tools/technologies.
- Power BI and Data Catalog experience.

* Be aware of phishing scams involving fraudulent career recruiting and fictitious job postings; visit our Phishing Scams page to learn more.

Neudesic is an Equal Opportunity Employer. Neudesic provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by local laws.

Neudesic has been acquired by IBM and will be integrated into the IBM organization; Neudesic will be the hiring entity. By proceeding with this application, you understand that Neudesic will share your personal information with other IBM companies involved in your recruitment process, wherever these are located. More information on how IBM protects your personal information, including the safeguards in case of cross-border data transfer, is available here: https://www.ibm.com/us-en/privacy?lnk=flg-priv-usen

Posted 3 weeks ago

Apply

5.0 - 7.0 years

15 - 30 Lacs

Bengaluru

Hybrid

Snowflake Database Administrator (DBA)

Summary: We are seeking a highly skilled and experienced Snowflake Database Administrator (DBA) to join our team. The ideal candidate will be responsible for the administration, management, and optimization of our Snowflake data platform. The role requires strong expertise in database design, performance tuning, security, and data governance within the Snowflake environment.

Key Responsibilities:
- Administer and manage Snowflake cloud data warehouse environments, including provisioning, configuration, monitoring, and maintenance.
- Implement security policies, compliance, and access controls.
- Manage Snowflake accounts and databases in a multi-tenant environment.
- Monitor the systems and provide proactive solutions to ensure high availability and reliability.
- Monitor and manage Snowflake costs (see the cost-monitoring sketch after this list).
- Collaborate with developers, support engineers, and business stakeholders to ensure efficient data integration.
- Automate database management tasks and procedures to improve operational efficiency.
- Stay up to date with the latest Snowflake features, best practices, and industry trends to enhance the overall data architecture.
- Develop and maintain documentation, including database configurations, processes, and standard operating procedures.
- Support disaster recovery and business continuity planning for Snowflake environments.

Required Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 5+ years of experience in Snowflake operations and administration.
- Strong knowledge of SQL, query optimization, and performance tuning techniques.
- Experience managing security, access controls, and data governance in Snowflake.
- Familiarity with AWS.
- Proficiency in Python or Bash.
- Experience automating database tasks using Terraform, CloudFormation, or similar tools.
- Understanding of data modeling concepts and experience working with structured and semi-structured data (JSON, Avro, Parquet).
- Strong analytical, problem-solving, and troubleshooting skills.
- Excellent communication and collaboration abilities.

Preferred Qualifications:
- Snowflake certification (e.g., SnowPro Core; SnowPro Advanced: Architect, Administrator).
- Experience with CI/CD pipelines and DevOps practices for database management.
- Knowledge of machine learning and analytics workflows within Snowflake.
- Hands-on experience with data streaming technologies (Kafka, AWS Kinesis, etc.).
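As an illustration of the cost-management duty, here is a minimal sketch summarizing per-warehouse credit consumption over the last 7 days from Snowflake's ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY view; the connection parameters are hypothetical placeholders.

```python
# Illustrative Snowflake cost monitoring: credits used per warehouse
# over the past week. Connection parameters are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="dba_user", password="***",
    role="ACCOUNTADMIN",  # ACCOUNT_USAGE views need elevated access
)
cur = conn.cursor()
cur.execute("""
    SELECT warehouse_name, SUM(credits_used) AS credits_7d
    FROM snowflake.account_usage.warehouse_metering_history
    WHERE start_time >= DATEADD('day', -7, CURRENT_TIMESTAMP())
    GROUP BY warehouse_name
    ORDER BY credits_7d DESC
""")
for name, credits in cur.fetchall():
    # Outliers here are candidates for resizing or tighter auto-suspend.
    print(f"{name}: {credits:.1f} credits")
conn.close()
```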

Posted 3 weeks ago

Apply

10.0 - 18.0 years

25 - 40 Lacs

Noida

Hybrid

Experience Aplenty: 10+ years of hands-on experience in applicable software development environments, showcasing your prowess and ability to excel.
Educational Symphony: A Bachelor's degree is strongly preferred, demonstrating your commitment to continuous learning and growth.
Tech Savvy: Demonstrated experience in cloud environments like AWS, GCP, or Azure. Comparable knowledge of tools like Azure Pipelines, BigQuery, MFT, Vault, SSIS, SSRS, SQL, and Google DataFlow; workflow management and orchestration tools such as Airflow; and object function/object-oriented scripting languages including Java and Python. Working knowledge of Snowflake and Google DataFlow is a definite plus!
Business Acumen: Translate business needs into technical requirements with finesse, showcasing your ability to balance technical excellence with customer satisfaction.
Team Player: Collaborate seamlessly with the team, responding to requests in a timely manner, meeting individual commitments, and contributing to the collective success.
Mentor Extraordinaire: Leverage your coaching and teaching skills to guide and mentor your fellow team members, fostering an environment of continuous improvement.

Please also apply here: https://corelogic.wd5.myworkdayjobs.com/ACQ/job/Noida-Uttar-Pradesh/Lead-Data-Engineer_REQ15827

Posted 4 weeks ago

Apply

3.0 - 8.0 years

10 - 14 Lacs

Pune

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. You will play a crucial role in shaping the data platform components.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute in providing solutions to work related problems.
- Collaborate with cross-functional teams to design and implement data platform solutions.
- Develop and maintain data pipelines for efficient data processing.
- Optimize data storage and retrieval processes.
- Implement data security measures to protect sensitive information.
- Conduct performance tuning and troubleshooting of data platform components.

Professional & Technical Skills:
- Must-have skills: Proficiency in Snowflake Data Warehouse.
- Strong understanding of cloud data platforms like AWS or Azure.
- Experience with SQL and database management systems.
- Hands-on experience with ETL tools for data integration.
- Knowledge of data modeling and schema design.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Snowflake Data Warehouse.
- This position is based at our Pune office.
- A 15 years full time education is required.

Posted 4 weeks ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Data Architecture Principles
Good-to-have skills: Python (Programming Language), Data Building Tool, Snowflake Data Warehouse
Minimum 12 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be involved in designing, building, and configuring applications to meet business process and application requirements. Your typical day will revolve around creating and implementing innovative solutions to enhance business processes and meet application needs.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Lead the team in implementing data architecture principles effectively.
- Develop and maintain data models and databases.
- Ensure data integrity and security measures are in place.

Professional & Technical Skills:
- Must-have skills: Proficiency in Data Architecture Principles.
- Good-to-have skills: Experience with Python (Programming Language), Snowflake Data Warehouse, Data Building Tool.
- Strong understanding of data architecture principles.
- Experience in designing and implementing data solutions.
- Knowledge of data modeling and database design.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Data Architecture Principles.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.

Posted 4 weeks ago

Apply

5.0 - 8.0 years

4 - 7 Lacs

Bengaluru

Work from Office

Skill required: Delivery - Credit Risk Modeling
Designation: I&F Decision Sci Practitioner Sr Analyst
Qualifications: Any Graduation
Years of Experience: 5 to 8 years

About Accenture: Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com

What would you do: Data & AI. Credit risk modelling refers to the use of financial models to estimate losses a firm might suffer in the event of a borrower's default.

What are we looking for:
- Primary Skills: Banking, Financial Services, Credit Risk Model Development, Market Risk Modelling, Financial Regulations, Quantitative Analysis
- Secondary Skills: SQL, SAS, Python, R, Tableau, VBA, Qlik Sense, Snowflake, Matillion
- Soft Skills: Adaptable and flexible; commitment to quality; ability to work well in a team; agility for quick learning; written and verbal communication

Roles and Responsibilities:
- In this role you are required to do analysis and solving of increasingly complex problems.
- Your day-to-day interactions are with peers within Accenture.
- You are likely to have some interaction with clients and/or Accenture management.
- You will be given minimal instruction on daily work/tasks and a moderate level of instruction on new assignments.
- Decisions that are made by you impact your own work and may impact the work of others.
- In this role you would be an individual contributor and/or oversee a small work effort and/or team.

Qualification: Any Graduation

Posted 4 weeks ago

Apply

7.0 - 11.0 years

4 - 7 Lacs

Bengaluru

Work from Office

Skill required: Delivery - Credit Risk Modeling
Designation: I&F Decision Sci Practitioner Specialist
Qualifications: Any Graduation
Years of Experience: 7 to 11 years

About Accenture: Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com

What would you do: Data & AI. Credit risk modelling refers to the use of financial models to estimate losses a firm might suffer in the event of a borrower's default.

What are we looking for:
- Primary Skills: Banking, Financial Services, Credit Risk Model Development, Market Risk Modelling, Financial Regulations, Quantitative Analysis
- Secondary Skills: SQL, SAS, Python, R, Tableau, VBA, Qlik Sense, Snowflake, Matillion
- Soft Skills: Adaptable and flexible; commitment to quality; ability to work well in a team; result oriented; problem-solving skills

Roles and Responsibilities:
- In this role you are required to do analysis and solving of moderately complex problems.
- May create new solutions, leveraging and, where needed, adapting existing methods and procedures.
- The person would require understanding of the strategic direction set by senior management as it relates to team goals.
- Primary upward interaction is with direct supervisor; may interact with peers and/or management levels at a client and/or within Accenture.
- Guidance would be provided when determining methods and procedures on new assignments.
- Decisions made by you will often impact the team in which they reside.
- Individual would manage small teams and/or work efforts (if in an individual contributor role) at a client or within Accenture.

Qualification: Any Graduation

Posted 4 weeks ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: Data Building Tool, Python (Programming Language)
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will be responsible for designing, developing, and maintaining data solutions for data generation, collection, and processing. You will create data pipelines, ensure data quality, and implement ETL processes to migrate and deploy data across systems. Your day will involve working on data solutions and collaborating with teams to optimize data processes.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Develop and maintain data pipelines.
- Ensure data quality and integrity.
- Implement ETL processes.

Professional & Technical Skills:
- Must-have skills: Proficiency in Snowflake Data Warehouse.
- Good-to-have skills: Experience with Data Building Tool.
- Strong understanding of data architecture.
- Proficiency in SQL and database management.
- Experience with cloud data platforms.
- Knowledge of data modeling.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Posted 4 weeks ago

Apply

4.0 - 6.0 years

14 - 16 Lacs

Mumbai, New Delhi, Bengaluru

Work from Office

Senior ETL Developer with IICS, Kafka & Snowflake expertise. A minimum of 4 years of experience working with IICS is needed, and healthcare domain experience is a must. Locations: Mumbai, Delhi NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote.
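As a rough illustration of the Kafka-to-Snowflake side of this role (IICS itself is a proprietary GUI tool), here is a minimal Python micro-batch loader using kafka-python and the Snowflake connector; the topic, table, and connection details are hypothetical placeholders.

```python
# Illustrative Kafka -> Snowflake micro-batch loader. Topic, table,
# and connection details are hypothetical placeholders.
import json
from kafka import KafkaConsumer
import snowflake.connector

consumer = KafkaConsumer(
    "claims-events",
    bootstrap_servers="kafka.example.com:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
)

sf = snowflake.connector.connect(
    account="my_account", user="etl", password="***",
    warehouse="ETL_WH", database="RAW", schema="HEALTHCARE",
)
cur = sf.cursor()

batch = []
for msg in consumer:
    # Store the raw event as JSON text alongside its business key.
    batch.append((msg.value.get("claim_id"), json.dumps(msg.value)))
    if len(batch) >= 500:  # flush in micro-batches
        cur.executemany(
            "INSERT INTO claims_raw (claim_id, payload) VALUES (%s, %s)",
            batch,
        )
        batch.clear()
```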

Posted 4 weeks ago

Apply

5.0 - 10.0 years

10 - 14 Lacs

Kolkata

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will oversee the development process and ensure successful project delivery.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead the application development process.
- Ensure timely project delivery.
- Provide guidance and support to team members.

Professional & Technical Skills:
- Must-have skills: Proficiency in Snowflake Data Warehouse.
- Strong understanding of data warehousing concepts.
- Experience in ETL processes.
- Knowledge of cloud data platforms.
- Hands-on experience in SQL development.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Kolkata office.
- A 15 years full-time education is required.

Posted 4 weeks ago

Apply

6.0 - 11.0 years

16 - 22 Lacs

Bengaluru

Hybrid

1. Data Platform Design & Implementation: Architect and deploy scalable, secure, and high-performing Snowflake environments in line with data segregation policies. Automate infrastructure provisioning, testing, and deployment for seamless operations.
2. Data Integration & Pipeline Development: Develop, optimize, and maintain data pipelines (ETL/ELT) to ensure efficient data ingestion, transformation, and migration. Implement best practices for data consistency, quality, and performance across cloud and on-premises systems.
3. Data Transformation & Modeling: Design and implement data models that enable efficient reporting and analytics. Develop data transformation processes using Snowflake, DBT, and Python to enhance usability and accessibility.
4. Networking, Security & Compliance: Configure and manage secure network connectivity for data ingestion. Ensure compliance with GDPR, CISO policies, and industry security standards.
5. Data Quality & Governance: Ensure the Data Segregation Policy is firmly followed for the enabled data sets. Implement data validation, anomaly detection, and quality assurance frameworks (see the validation sketch below). Collaborate with the Data Governance team to maintain compliance and integrate quality checks into data pipelines.
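For the data-quality frameworks mentioned in item 5, here is a minimal sketch of pre-publication validations on a Snowflake staging table: a row-count check and a null-rate threshold. The table, column, and connection names are hypothetical placeholders.

```python
# Illustrative pipeline data-quality gate for a Snowflake staging
# table. All identifiers are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl", password="***",
    warehouse="ETL_WH", database="ANALYTICS", schema="STAGING",
)
cur = conn.cursor()
cur.execute("""
    SELECT COUNT(*) AS n_rows,
           SUM(IFF(customer_id IS NULL, 1, 0)) AS null_ids
    FROM stg_orders
""")
n_rows, null_ids = cur.fetchone()
conn.close()

# Fail the pipeline run on an empty load or a null-rate spike.
assert n_rows > 0, "stg_orders is empty; upstream ingestion may have failed"
null_rate = null_ids / n_rows
assert null_rate < 0.01, f"customer_id null rate {null_rate:.2%} over threshold"
print(f"Quality checks passed: {n_rows} rows, null rate {null_rate:.2%}")
```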

Posted 4 weeks ago

Apply

3.0 - 5.0 years

5 - 12 Lacs

Pune

Remote

Hiring for Analytics Engineers / Data Modelers with hands-on experience in Honeydew, CubeDev, or DBT Metrics to join our growing data team in Pune. Hands-on experience with at least one of Honeydew, CubeDev, or DBT Metrics, along with strong SQL skills, is required.

Posted 4 weeks ago

Apply