
2658 Snowflake Jobs - Page 22

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 7.0 years

7 - 9 Lacs

Hyderabad

Work from Office

Looking for a Data Engineer with 5+ years of experience in Snowflake and Python. Must have strong BI/data modeling, ETL, cloud (AWS/Azure/GCP), and Snowflake capabilities, plus Python/PySpark skills. Strong communication, leadership, and problem-solving are key.

Posted 1 week ago

Apply

3.0 - 5.0 years

10 - 15 Lacs

Mumbai

Remote

Role & responsibilities Job Title: Senior Analyst - Data Analytics Job location: PAN INDIA Mandatory skills required: Data Analytics and reporting, Databricks, Power BI, Snowflake ******************************************************************************************* IMMEDIATE JOINERS ALERT! We're looking for candidates who can join immediately If you're available, please send your CV via WhatsApp only to: 9076159575 Along with your CV, kindly share a short video profile talking about your experience. Please note: No calls will be entertained. ******************************************************************************************* Skill set required Data Analytics and reporting, Databricks, Power BI, Snowflake We need candidates at Senior Analyst, with previous experience of Snowflake, Data Bricks and PowerBI. The focus is Snowflake and data analytics , as this is the reporting and analytics platform Location is from Pan India. Mode of Work: Permanent Work From Home (WFH). However, the candidate may be required to travel to the Mumbai office once a year based on business requirements, at their own expense if needed. Skillset should be more inclined towards analytics and not engineering. ( No data engineers please )

Posted 1 week ago

Apply

9.0 - 12.0 years

30 - 35 Lacs

Mumbai

Remote

Role & responsibilities J ob Title: Manager - Data Analytics Job location: PAN INDIA Mandatory skills required: Data Analytics and reporting, Databricks, Power BI, Snowflake ******************************************************************************************* IMMEDIATE JOINERS ALERT!We're looking for candidates who can join immediatelyIf you're available, please send your CV via WhatsApp only to: 9076159575Along with your CV, kindly share a short video profile talking about your experience.Please note: No calls will be entertained. ******************************************************************************************* Skill set required Data Analytics and reporting, Databricks, Power BI, Snowflake We need candidates at Manager with previous experience of Snowflake, Data Bricks and PowerBI. The focus is Snowflake and data analytics , as this is the reporting and analytics platform Location is from Pan India. Mode of Work: Permanent Work From Home (WFH). However, the candidate may be required to travel to the Mumbai office once a year based on business requirements, at their own expense if needed. Skillset should be more inclined towards analytics and not engineering. ( No data engineers please )

Posted 1 week ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Who We Are: We are a digitally native company that helps organizations reinvent themselves and unleash their potential. We are the place where innovation, design, and engineering meet scale. Globant is a 20-year-old, NYSE-listed public organization with more than 33,000 employees worldwide, working out of 35 countries. www.globant.com

Job location: Pune/Hyderabad/Bangalore
Work Mode: Hybrid
Experience: 5 to 10 years

Must-have skills:
1) AWS (EC2, EMR, EKS)
2) Redshift
3) Lambda functions
4) Glue
5) Python
6) PySpark
7) SQL
8) CloudWatch
9) NoSQL database - DynamoDB/MongoDB or any

We are seeking a highly skilled and motivated Data Engineer to join our dynamic team. The ideal candidate will have a strong background in designing, developing, and managing data pipelines, working with cloud technologies, and optimizing data workflows. You will play a key role in supporting our data-driven initiatives and ensuring the seamless integration and analysis of large datasets.

Design scalable data models: Develop and maintain conceptual, logical, and physical data models for structured and semi-structured data in AWS environments.
Optimize data pipelines: Work closely with data engineers to align data models with AWS-native data pipeline design and ETL best practices.
AWS cloud data services: Design and implement data solutions leveraging AWS Redshift, Athena, Glue, S3, Lake Formation, and AWS-native ETL workflows.
Design, develop, and maintain scalable data pipelines and ETL processes using AWS services (Glue, Lambda, Redshift).
Write efficient, reusable, and maintainable Python and PySpark scripts for data processing and transformation.
Write complex SQL queries and optimize them for performance and scalability.
Monitor, troubleshoot, and improve data pipelines for reliability and performance.
Focus on ETL automation using Python and PySpark: design, build, and maintain efficient data pipelines, ensuring data quality and integrity for various applications.
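For context on the kind of PySpark transformation script this role describes, here is a minimal, hedged sketch: the S3 paths and column names are hypothetical, and it simply reads raw CSV data, applies a basic transformation, and writes partitioned Parquet.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Illustrative sketch only; bucket names, paths, and columns are hypothetical.
spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: read raw CSV files from S3
raw = spark.read.option("header", True).csv("s3://example-raw-bucket/orders/")

# Transform: type-cast, filter out bad rows, derive a revenue column
orders = (
    raw.withColumn("quantity", F.col("quantity").cast("int"))
       .withColumn("unit_price", F.col("unit_price").cast("double"))
       .filter(F.col("quantity") > 0)
       .withColumn("revenue", F.col("quantity") * F.col("unit_price"))
)

# Load: write partitioned Parquet for downstream consumption (e.g. Athena/Redshift Spectrum)
orders.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-curated-bucket/orders/"
)
```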

Posted 1 week ago

Apply

5.0 - 9.0 years

8 - 18 Lacs

Mumbai

Work from Office

Looking for Tech BA, With Regulatory reporting experience + Snowflake /(SQl/Oracle) & he need to have experience working with Data points to create requirement. Regulatory Reporting :- Asia, DOI (Singapore, Hong Kong), EMIR, SFIR, ASIC, MAS, QFC Job Requirements: Strong understanding of Investment Data and Asset Management industry 5+ years of experience in the financial services industry, preferably buy-side 5+ years of experience as a business analyst working in collaboration with software development Experience with OTC Derivatives, ETD, Equity, and Fixed Income instruments Experience with compliance systems or regulatory reporting systems is a plus Experience with EMIR, MAS, ASIC, MiFID or CFTC reporting is a plus Ability to work in a team environment Capable of managing multiple tasks with tight time deadlines Strong problem solving and practical decision-making skills Ability to communicate appropriately and effectively with stakeholders, colleagues, and vendors in both formal and informal contexts in a fashion tailored for the audience Strong SQL skills Strong Excel skills Strong analytical skills and attention to detail Tools used: Microsoft Office, Microsoft Visio, SQL

Posted 1 week ago

Apply

12.0 - 20.0 years

25 - 40 Lacs

Hyderabad

Work from Office

Minimum 10 years in IT project/program management, with hands-on experience in tools like JIRA, Excel, MS Project, and Planisware. Strong in data platform implementation (Snowflake/Redshift), ETL/ELT, scalable architecture, and business-aligned solutions.

Posted 1 week ago

Apply

8.0 - 12.0 years

0 - 0 Lacs

Bengaluru

On-site

Role: Data Engineer
Experience: 8-12 years
Location: Bangalore

Design, develop, and maintain robust and scalable data pipelines that ingest, transform, and load data from various sources into the data warehouse. Collaborate with business stakeholders to understand data requirements and translate them into technical solutions. Implement data quality checks and monitoring to ensure data accuracy and integrity. Optimize data pipelines for performance and efficiency. Troubleshoot and resolve data pipeline issues. Stay up to date with emerging technologies and trends in data engineering.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 7+ years of experience in data engineering or a similar role.
- Strong proficiency in SQL and at least one programming language (e.g., Python, Java).
- Experience with data pipeline tools and frameworks.
- Experience with cloud-based data warehousing solutions (Snowflake).
- Experience with AWS Kinesis, SNS, SQS.
- Excellent problem-solving and analytical skills.
- Strong communication and interpersonal skills.
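As a rough illustration of the AWS messaging integration this role mentions, the sketch below polls an SQS queue with boto3 and hands each message to a placeholder processing function; the queue URL, region, message shape, and processing logic are assumptions, not details from the posting.

```python
import json
import boto3

# Illustrative only: queue URL and message shape are hypothetical.
QUEUE_URL = "https://sqs.ap-south-1.amazonaws.com/123456789012/example-ingest-queue"

sqs = boto3.client("sqs", region_name="ap-south-1")

def process_record(record: dict) -> None:
    # Placeholder for the real transformation/load step (e.g. staging into Snowflake).
    print("processing", record.get("event_id"))

def poll_once() -> None:
    resp = sqs.receive_message(
        QueueUrl=QUEUE_URL,
        MaxNumberOfMessages=10,
        WaitTimeSeconds=10,  # long polling
    )
    for msg in resp.get("Messages", []):
        process_record(json.loads(msg["Body"]))
        # Delete only after successful processing
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])

if __name__ == "__main__":
    poll_once()
```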

Posted 1 week ago

Apply

3.0 - 5.0 years

7 - 11 Lacs

Gurugram

Work from Office

Job Summary
Synechron is seeking a detail-oriented Data Analyst to leverage advanced data analysis, visualization, and insights to support our business objectives. The ideal candidate will have a strong background in creating interactive dashboards, performing complex data manipulations using SQL and Python, and automating workflows to drive efficiency. Familiarity with cloud platforms such as AWS is a plus, enabling optimization of data storage and processing solutions. This role will enable data-driven decision-making across teams, contributing to strategic growth and operational excellence.

Software Requirements
Required:
- Power BI (or equivalent visualization tools like Streamlit, Dash)
- SQL (for data extraction, manipulation, and querying)
- Python (for scripting, automation, and advanced analysis)
- Data management tools compatible with cloud platforms (e.g., AWS S3, Redshift, or similar)
Preferred:
- Cloud platform familiarity, especially AWS services related to data storage and processing
- Knowledge of other visualization platforms (Tableau, Looker)
- Familiarity with source control systems (e.g., Git)

Overall Responsibilities
- Develop, redesign, and maintain interactive dashboards and visualization tools to provide actionable insights.
- Perform complex data analysis, transformations, and validation using SQL and Python.
- Automate data workflows, reporting, and visualizations to streamline processes.
- Collaborate with business teams to understand data needs and translate them into effective visual and analytical solutions.
- Support data extraction, cleaning, and validation from various sources, ensuring data accuracy.
- Maintain and enhance understanding of cloud environments, especially AWS, to optimize data storage, processing pipelines, and scalability.
- Document technical procedures and contribute to best practices for data management and reporting.

Performance Outcomes:
- Timely, accurate, and insightful dashboards and reports.
- Increased automation reducing manual effort.
- Clear communication of insights and data-driven recommendations to stakeholders.

Technical Skills (By Category)
Programming Languages:
- Essential: SQL, Python
- Preferred: R, additional scripting languages
Databases/Data Management:
- Essential: Relational databases (SQL Server, MySQL, Oracle)
- Preferred: NoSQL databases like MongoDB; cloud data warehouses (AWS Redshift, Snowflake)
Cloud Technologies:
- Essential: Basic understanding of AWS cloud services (S3, EC2, RDS)
- Preferred: Experience with cloud-native data solutions and deployment
Frameworks and Libraries:
- Python: Pandas, NumPy, Matplotlib, Seaborn, Plotly, Streamlit, Dash
- Visualization: Power BI, Tableau (preferred)
Development Tools and Methodologies:
- Version control: Git
- Automation tools for workflows and reporting
- Familiarity with Agile methodologies
Security Protocols:
- Awareness of data security best practices and compliance standards in cloud environments

Experience Requirements
- 3-5 years of experience in data analysis, visualization, or related data roles.
- Proven ability to deliver insightful dashboards, reports, and analysis.
- Experience working across teams and communicating complex insights clearly.
- Knowledge of cloud environments like AWS or other cloud providers is desirable.
- Experience in a business environment, not necessarily as a full-time developer, but as an analytical influencer.

Day-to-Day Activities
- Collaborate with stakeholders to gather requirements and define data visualization strategies.
- Design and maintain dashboards using Power BI, Streamlit, Dash, or similar tools.
- Extract, transform, and analyze data using SQL and Python scripts.
- Automate recurring workflows and report generation to improve operational efficiency.
- Troubleshoot data issues and derive insights to support decision-making.
- Monitor and optimize cloud data storage and processing pipelines.
- Present findings to business units, translating technical outputs into actionable recommendations.

Qualifications
- Bachelor's degree in Computer Science, Data Science, Statistics, or a related field; a Master's degree is a plus.
- Relevant certifications (e.g., Power BI, AWS Data Analytics) are advantageous.
- Demonstrated experience with data visualization and scripting tools.
- Continuous learning mindset to stay updated on new data analysis trends and cloud innovations.

Professional Competencies
- Strong analytical and problem-solving skills.
- Effective communication, with the ability to explain complex insights clearly.
- Collaborative team player with stakeholder management skills.
- Adaptability to rapidly changing data or project environments.
- Innovative mindset to suggest and implement data-driven solutions.
- Organized, self-motivated, and capable of managing multiple priorities efficiently.

SYNECHRON'S DIVERSITY & INCLUSION STATEMENT
Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative "Same Difference" is committed to fostering an inclusive culture promoting equality, diversity and an environment that is respectful to all. We strongly believe that a diverse workforce helps build stronger, successful businesses as a global company. We encourage applicants from across diverse backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, or disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more.
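As a rough sketch of the dashboarding workflow this role describes (not Synechron's actual stack; the file name and columns are hypothetical), the following loads a CSV with pandas and renders a simple interactive Streamlit view:

```python
import pandas as pd
import streamlit as st

# Illustrative only: the file name and columns are hypothetical.
df = pd.read_csv("sales.csv", parse_dates=["order_date"])

st.title("Sales Overview")

# Simple interactive filter
region = st.selectbox("Region", sorted(df["region"].unique()))
filtered = df[df["region"] == region]

# Aggregate revenue by month and plot it
monthly = (
    filtered.set_index("order_date")["revenue"]
    .resample("MS")
    .sum()
)
st.line_chart(monthly)
st.dataframe(filtered.head(50))
```

Such a script would typically be launched with `streamlit run app.py` and iterated on with stakeholders.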

Posted 1 week ago

Apply

5.0 - 7.0 years

3 - 7 Lacs

Gurugram

Work from Office

About the Opportunity
Job type: Application - 29 July 2025
Title: Senior Analyst Programmer
Department: FIL India Technology - GPS
Location: Gurugram
Level: Software Engineer 3

Fidelity International offers investment solutions and services and retirement expertise to more than 2.52 million customers globally. As a privately-held, purpose-driven company with a 50-year heritage, we think generationally and invest for the long term. Operating in more than 25 locations and with $750.2 billion in total assets, our clients range from central banks, sovereign wealth funds, large corporates, financial institutions, insurers and wealth managers, to private individuals. We're proud to have been helping our clients build better financial futures for over 50 years. How have we achieved this? By working together - and supporting each other - all over the world. So, join our GPS Data Platform team and feel like you're part of something bigger.

About your team
GPS Lakehouse & Reporting is a team of around 100 people whose role is to develop and maintain the data warehouse and reporting platforms that we use to administer the pensions and investments of our workplace and retail customers across the world. In doing this we are critical to the delivery of our core product and value proposition to these clients, today and in the future.

About your role
The Technology function provides IT services to the Fidelity International business, globally. These include the development and support of business applications that underpin our revenue, operational, compliance, finance, legal, customer service and marketing functions. The broader technology organisation incorporates infrastructure services that the firm relies on to operate on a day-to-day basis, including data centre, networks, proximity services, security, voice, incident management and remediation.

Key responsibilities:
- Work with Delivery Managers, System/Business Analysts and other subject matter experts to understand the requirements.
- Implement Informatica mappings between inbound and target data models.
- Produce technical specifications and unit test cases for the interfaces under development.
- Provide support through all phases of implementation.
- Adhere to the source code control policies of the project.
- Implement and use appropriate Change Management processes.
- Develop capability to implement Business Intelligence tools.

About you
Must-have technical skills:
- Strong understanding of the standard ETL tool Informatica PowerCenter, with a minimum of 3 years of experience.
- Strong Oracle SQL/PLSQL and stored procedure experience.
- Knowledge of DevOps and configuration management tools like SVN, and CI tools.
- Experience of using job scheduling tools (Control-M preferred).
- Experience in UNIX or Python scripting.
Good-to-have technical skills:
- Familiarity with data warehouse, data mart and ODS concepts.
- Exposure to Agile (Scrum) development practices.
- Knowledge of data normalisation and Oracle performance optimisation techniques.
- Cloud technologies like AWS and Snowflake.

Feel rewarded
For starters, we'll offer you a comprehensive benefits package. We'll value your wellbeing and support your development. And we'll be as flexible as we can about where and when you work, finding a balance that works for all of us. It's all part of our commitment to making you feel motivated by the work you do and happy to be part of our team. For more about our work, our approach to dynamic working and how you could build your future here, visit careers.fidelityinternational.com.

Posted 1 week ago

Apply

5.0 - 7.0 years

30 - 35 Lacs

Bengaluru

Work from Office

About the Opportunity
Job type: Application - 31 July 2025
Title: Investment Management and Risk Data Product Owner - ISS Data (Associate Director)
Department: Technology
Location: Bangalore (hybrid / flexible working permitted)
Reports to: Data Analysis Chapter Lead
Level: Associate Director

About your team
The Technology function provides IT services that are integral to running an efficient run-the-business operating model and providing change-driven solutions to meet outcomes that deliver on our business strategy. These include the development and support of business applications that underpin our revenue, operational, compliance, finance, legal, marketing and customer service functions. The broader organisation incorporates infrastructure services that the firm relies on to operate on a day-to-day basis, including data centre, networks, proximity services, security, voice, incident management and remediation.

The ISS Technology group is responsible for providing technology solutions to the Investment Solutions & Services (ISS) business, which covers the Investment Management, Asset Management Operations & Distribution business units globally. The ISS Technology team supports and enhances existing applications as well as designs, builds and procures new solutions to meet requirements and enable the evolving business strategy. As part of this group, a dedicated ISS Data Programme team has been mobilised as a key foundational programme to support the execution of the overarching ISS strategy.

About your role
The Investment and Risk & Attribution Data Product Owner role is instrumental in the creation and execution of a future-state design for investment and risk data across Fidelity's key business areas. The successful candidate will have in-depth knowledge of all data domains that service investment management, risk and attribution capabilities within the asset management industry. The role will sit within the ISS Delivery Data Analysis chapter, fully aligned to deliver Fidelity's cross-functional ISS Data Programme in Technology, and the candidate will leverage their extensive industry knowledge to build a future-state platform in collaboration with Business Architecture, Data Architecture, and business stakeholders. The role is to maintain strong relationships with the various business contacts to ensure a superior service to our clients.

Key Responsibilities
Leadership and management:
- Lead the investment and risk data outcomes and capabilities for the ISS Data Programme.
- Realign existing resources and provide coaching and line management for junior data analysts within the chapter; influence and motivate them for high performance.
- Define the data product vision and strategy with end-to-end thought leadership.
- Lead data product documentation, enable peer reviews, get analysis effort estimation, maintain the backlog, and support end-to-end planning.
- Be a catalyst of change for improving efficiencies and innovation.
Data quality and integrity:
- Define data quality use cases for all the required data sets and contribute to the technical frameworks of data quality.
- Align the functional solution with best-practice data architecture and engineering.
Coordination and communication:
- Senior-management-level communication to influence senior tech and business stakeholders globally and get alignment on the roadmaps; an advocate for the ISS Data Programme.
- Coordinate with internal and external teams to communicate with those impacted by data flows.
- Collaborate closely with Data Governance, Business Architecture, and data owners.
- Conduct workshops within the scrum teams and across business teams, effectively document the minutes and drive the actions.

About you
- Strong leadership and senior-management-level communication, internal and external client management and influencing skills.
- At least 15 years of proven experience as a senior business/technical/data analyst within technology and/or business change, delivering data-led business outcomes within the financial services/asset management industry.
- 5-10 years as a data product owner adhering to agile methodology, delivering data solutions using industry-leading data platforms such as Snowflake, State Street Alpha Data, Refinitiv Eikon, SimCorp Dimension, BlackRock Aladdin, FactSet etc.
- In-depth knowledge of how data vendor solutions such as Rimes, Bloomberg, MSCI, FactSet support investment, risk, performance and attribution business needs.
- Outstanding knowledge of the data life cycle that drives investment management, such as research, order management, trading, risk and attribution.
- In-depth expertise in data and calculations across the investment industry covering the below:
  - Financial data: information on asset prices, market trends, economic indicators, interest rates, and other financial metrics that help in evaluating asset performance and making investment decisions.
  - Asset-specific data: data related to financial instruments and reference data like asset specifications, maintenance records, usage history, and depreciation schedules.
  - Market data: data like security prices, exchange rates, index constituents and licensing restrictions on them.
  - Risk data: data related to risk factors such as market risk, credit risk, operational risk, and compliance risk.
  - Performance & attribution data: data on fund performance returns and attribution using various methodologies like time-weighted returns and transaction-based performance attribution.
- Problem solving, attention to detail and critical thinking.
- Technical skills: hands-on SQL, advanced Excel, Python, ML (optional) and knowledge of end-to-end tech solutions involving data platforms.
- Knowledge of data management, data governance and data engineering practices.
- Hands-on experience with data modelling techniques like dimensional, data vault etc.
- Willingness to own and drive things; collaboration across business and tech stakeholders.

Feel rewarded
For starters, we'll offer you a comprehensive benefits package. We'll value your wellbeing and support your development. And we'll be as flexible as we can about where and when you work, finding a balance that works for all of us. It's all part of our commitment to making you feel motivated by the work you do and happy to be part of our team. For more about our work, our approach to dynamic working and how you could build your future here, visit careers.fidelityinternational.com.

Posted 1 week ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Mumbai

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: Data Engineering, Databricks Unified Data Analytics Platform
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and contribute to the overall data strategy of the organization, ensuring that data solutions are efficient, scalable, and aligned with business objectives. You will also monitor and optimize existing data processes to enhance performance and reliability, while staying updated with the latest industry trends and technologies to continuously improve data management practices.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with stakeholders to gather and analyze data requirements.
- Design and implement data models that support business needs.

Professional & Technical Skills:
- Must-have skills: proficiency in Snowflake Data Warehouse.
- Good-to-have skills: experience with Data Engineering, Databricks Unified Data Analytics Platform.
- Strong understanding of ETL processes and data integration techniques.
- Experience with data quality assurance and data governance practices.
- Familiarity with cloud-based data solutions and architecture.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Snowflake Data Warehouse.
- This position is based at our Mumbai office.
- A 15 years full time education is required.

Qualification: 15 years full time education
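To illustrate the kind of Snowflake ETL step such a role typically automates (a sketch only; the database, stage, and table names are hypothetical and credentials would come from a secrets manager), here is a minimal Python example using snowflake-connector-python to load staged files and run a simple transformation:

```python
import snowflake.connector

# Connection parameters are placeholders, not real values.
conn = snowflake.connector.connect(
    account="xy12345.ap-south-1",
    user="ETL_USER",
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # Load raw CSV files from a named stage into a staging table
    cur.execute("""
        COPY INTO STAGING.ORDERS_RAW
        FROM @RAW_STAGE/orders/
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    """)
    # Simple transform: materialize a curated table for reporting
    cur.execute("""
        CREATE OR REPLACE TABLE CURATED.ORDERS AS
        SELECT order_id, customer_id, order_date, quantity * unit_price AS revenue
        FROM STAGING.ORDERS_RAW
        WHERE quantity > 0
    """)
finally:
    conn.close()
```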

Posted 1 week ago

Apply

2.0 - 3.0 years

5 - 9 Lacs

Kochi

Work from Office

Job Title: Data Engineer Sr. Analyst, ACS Song
Management Level: Level 10 - Sr. Analyst
Location: Kochi, Coimbatore, Trivandrum
Must-have skills: Python/Scala, PySpark/PyTorch
Good-to-have skills: Redshift

Job Summary
You'll capture user requirements and translate them into business and digitally enabled solutions across a range of industries.

Roles and Responsibilities
- Designing, developing, optimizing, and maintaining data pipelines that adhere to ETL principles and business goals.
- Solving complex data problems to deliver insights that help our business achieve its goals.
- Sourcing data (structured and unstructured) from various touchpoints, and formatting and organizing it into an analyzable format.
- Creating data products for analytics team members to improve productivity.
- Calling AI services like vision, translation etc. to generate outcomes that can be used in further steps along the pipeline.
- Fostering a culture of sharing, re-use, design and operational efficiency of data and analytical solutions.
- Preparing data to create a unified database and building tracking solutions ensuring data quality.
- Creating production-grade analytical assets deployed using the guiding principles of CI/CD.

Professional and Technical Skills
- Expert in Python, Scala, PySpark, PyTorch, JavaScript (at least 2 of these).
- Extensive experience in data analysis (big data - Apache Spark environments), data libraries (e.g. Pandas, SciPy, TensorFlow, Keras etc.), and SQL; 2-3 years of hands-on experience working on these technologies.
- Experience in one of the many BI tools such as Tableau, Power BI, Looker.
- Good working knowledge of key concepts in data analytics, such as dimensional modeling, ETL, reporting/dashboarding, data governance, dealing with structured and unstructured data, and corresponding infrastructure needs.
- Worked extensively in Microsoft Azure (ADF, Function Apps, ADLS, Azure SQL), AWS (Lambda, Glue, S3), Databricks analytical platforms/tools, Snowflake Cloud Data Warehouse.

Additional Information
- Experience working in cloud data warehouses like Redshift or Synapse.
- Certification in any one of the following or equivalent: AWS Certified Data Analytics - Specialty; Microsoft Certified Azure Data Scientist Associate; SnowPro Core - Data Engineer; Databricks Data Engineering.

About Our Company | Accenture

Qualification
Experience: 3.5-5 years of experience is required
Educational Qualification: Graduation

Posted 1 week ago

Apply

5.0 - 10.0 years

10 - 14 Lacs

Gurugram

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

This demand is for Ataccama. JD as under:
- Data engineering skills: knowledge of data integration, data warehousing, and data lake technologies.
- Data quality and governance skills: experience with data quality tools, data governance frameworks, and data profiling techniques.
- Programming skills: proficiency in languages like Java, Python, or SQL, depending on the specific role.
- Cloud computing skills: experience with cloud platforms like AWS, Azure, or Google Cloud Platform.
- Problem-solving skills: ability to troubleshoot data issues and identify solutions.

As a Data Governance Practitioner, you will establish and enforce data governance policies to ensure the accuracy, integrity, and security of organizational data. Collaborate with key stakeholders to define data standards; facilitate effective data collection, storage, access, and usage; and drive data stewardship initiatives for comprehensive and effective data governance.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead data governance strategy implementation.
- Develop and maintain data governance frameworks.
- Conduct data quality assessments.

Professional & Technical Skills:
- Strong understanding of data governance principles.
- Experience in implementing data governance solutions.
- Knowledge of data privacy regulations.
- Familiarity with data quality management practices.

Additional Information:
- The candidate should have a minimum of 5+ years of experience in Ataccama data governance.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.

Qualification: 15 years full time education

Posted 1 week ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, contribute to key decisions, and manage the development process to deliver high-quality applications that enhance operational efficiency.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must-have skills: proficiency in Snowflake Data Warehouse.
- Strong understanding of data modeling and ETL processes.
- Experience with SQL and data querying techniques.
- Familiarity with cloud-based data solutions and architecture.
- Ability to troubleshoot and optimize application performance.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Snowflake Data Warehouse.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.

Qualification: 15 years full time education

Posted 1 week ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, manage project timelines, and contribute to the overall success of application development initiatives.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with business goals.

Professional & Technical Skills:
- Must-have skills: proficiency in Snowflake Data Warehouse.
- Strong understanding of data modeling and ETL processes.
- Experience with SQL and database management.
- Familiarity with cloud computing concepts and services.
- Ability to troubleshoot and optimize application performance.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based in Hyderabad.
- A 15 years full time education is required.

Qualification: 15 years full time education

Posted 1 week ago

Apply

5.0 - 7.0 years

7 - 15 Lacs

Bengaluru

Work from Office

Immediate joiners only.
• Design, develop, and maintain ETL processes using tools such as Talend, Informatica, SSIS, or similar.
• Extract data from various sources, including databases, APIs, and flat files, transforming it to meet business requirements.
• Load transformed data into target systems while ensuring data integrity and accuracy.
• Collaborate with data analysts and business stakeholders to understand data needs and requirements.
• Optimize ETL processes for enhanced performance and efficiency.
• Debug and troubleshoot ETL jobs, providing effective solutions to data-related issues.
• Document ETL processes, data models, and workflows for future reference and team collaboration.
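The posting names GUI ETL tools (Talend, Informatica, SSIS), so the snippet below is only a bare-bones Python sketch of the same extract-transform-load pattern; the file name, columns, data-quality rule, and target table are all hypothetical.

```python
import csv
import sqlite3

def extract(path: str) -> list[dict]:
    # Extract: read rows from a flat file
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows: list[dict]) -> list[tuple]:
    # Transform: enforce a basic data-quality rule and derive revenue
    out = []
    for r in rows:
        qty = int(r["quantity"])
        if qty <= 0:
            continue
        out.append((r["order_id"], qty, qty * float(r["unit_price"])))
    return out

def load(records: list[tuple]) -> None:
    # Load: write into a target table (SQLite standing in for the real target system)
    con = sqlite3.connect("warehouse.db")
    con.execute("CREATE TABLE IF NOT EXISTS orders (order_id TEXT, quantity INT, revenue REAL)")
    con.executemany("INSERT INTO orders VALUES (?, ?, ?)", records)
    con.commit()
    con.close()

if __name__ == "__main__":
    load(transform(extract("orders.csv")))
```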

Posted 1 week ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: SAS Platform
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will be responsible for ensuring the successful delivery of high-quality software solutions.

Roles & Responsibilities:
- Experience in SAS deployments for SAS Grid multi-machine environments and SAS multi-machine clustered installations.
- Experience in configuration of the EMI framework for SAS environment extended monitoring.
- Experience in creation of metadata connections to third-party databases like MS SQL, Teradata, Redshift, Snowflake, SQL Anywhere, ODBC connections etc.
- Strong experience of SAS processes for plan file generation, depot maintenance, hotfix installation and license renewal.
- Hands-on experience using DI/BI tools like SAS Data Integration Studio, SAS Enterprise Guide and SAS Forecast Studio would be an added advantage.
- Proficient in SAS admin activities like creating access to SAS and managing SAS groups.
- Experience of setting up the SAS security model using ACTs.
- Experience in maintaining the security bridge on the Unix server through ACL settings at the folder level as well as the user level.
- Experience in developing shell scripts to check system resources and send email notifications.
- Experience in administration and maintenance of SAS Grid environments and environments configured based on Levs.
- Experience in adding users, groups and authentication domains, setting up ACTs, scheduling jobs, taking backups, configuring Base SAS and third-party libraries, and registering tables from SAS Management Console.
- Installation of SAS client applications and troubleshooting technical problems.
- Tracking applicable SAS hotfixes and creating implementation plans for respective SAS modules and applications.
- Monitoring SAS server resources and reporting usage.

Professional & Technical Skills:
- Identifying performance issues and recommending/implementing tuning of active SAS environment(s), projecting capacity shortfalls.
- Designing, implementing, and maintaining security on SAS Metadata and Linux/Unix for users.
- Designing, implementing, and executing change control and promotion tasks in SAS.
- Providing guidance and assistance to SAS developers on operational and technical issues, interfacing with the SAS Institute for support on administrative and system issues.
- Monitoring and logging SAS servers and optimizing memory usage or tuning servers for optimal performance.
- Focusing on areas of capability, interoperability, scalability and enterprise-class issues (e.g., failover).
- Developing applications using SAS Base & Macros.
- Testing and debugging applications to ensure that they meet quality standards.
- Providing technical guidance and support to junior team members.
- Contributing to team discussions and actively participating in providing solutions to work-related problems.
- Strong communication skills to present technical information to business stakeholders.
- Experienced in troubleshooting, documentation and backtracking critical-path flow processes.
- Proven experience in administering both production and lower environments (development, testing, and staging) for SAS Grid environments, including managing configurations and deployments and ensuring seamless integration and operation across different environments.
- Demonstrated ability to handle operational tasks and maintenance activities for SAS Grid environments, including performance tuning, troubleshooting issues, and implementing updates and patches in both production and lower environments. The role requires ensuring high availability and reliability across all environments.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SAS administration.
- 15 years of education is required.

Qualification: 15 years full time education

Posted 1 week ago

Apply

6.0 - 11.0 years

35 - 50 Lacs

Bengaluru

Remote

This is an URGENT requirement. We are hiring for a UK-based fintech company (name kept confidential). The company is seeking a talented Senior Analytics Engineer to join the team and help build analytics pipelines, working closely with senior stakeholders. As an Analytics Engineer specializing in the payments space, you'll be at the forefront of analysing payment transaction data, uncovering trends, and optimising card issuance operations. Your work will directly shape strategic initiatives and improve business outcomes.

Please note that advanced experience with Data Build Tool (dbt) is a MUST for this role. You should NOT apply for this role if you don't have experience with dbt.

Key Responsibilities
- Analyze large datasets related to payment processing and customer transactions to uncover trends and actionable insights.
- Develop dashboards and reports to track KPIs and support decision-making.
- Work with stakeholders to understand data needs and provide insights through presentations and reports.
- Deliver data-driven recommendations to support business objectives.
- Build and optimize data pipelines using dbt, ensuring clean and accessible data.
- Monitor data quality and implement validation processes in collaboration with data engineers.
- Create scalable data models in Snowflake using dbt and identify opportunities for efficiency gains.
- Optimize workflows and monitor system performance for continuous improvements.
- Ensure data practices meet regulatory standards and assist in compliance reporting.
- Stay updated on industry trends and contribute to process enhancements.

Qualifications
- Bachelor's degree in Data Science, Computer Science, Information Systems, Finance, or a related field.
- Proven experience in a Data Analyst/Analytics Engineer role, preferably in the payments industry with issuer processors.
- Proven experience in SQL, dbt and Snowflake.
- Proficiency in building and managing data transformations with dbt, with experience in optimizing complex transformations and documentation.
- Hands-on experience with Snowflake as a primary data warehouse, including knowledge of performance optimization, data modeling, and query tuning.
- Strong proficiency in data analysis tools and languages (e.g., SQL, Python).
- Strong understanding of data modeling principles and experience applying modeling techniques.
- Proficiency with data visualization tools such as Tableau, Power BI, or similar.
- Knowledge of payment processing systems, card issuance, and related services.
- Experience with cloud-based data solutions (e.g., AWS, Azure, Google Cloud).
- Familiarity with modern data architectures such as the data lakehouse.
- Strong analytical, problem-solving, and communication skills.
- Attention to detail and a commitment to data quality and integrity.
- Familiarity with regulatory requirements and security standards in the financial industry.
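For orientation only, not the client's actual project: a typical workflow for this role pairs dbt-managed Snowflake models with Python-driven orchestration and checks. The sketch below assumes a dbt project and profile are already configured, and the model, table, and connection names are hypothetical.

```python
import subprocess
import snowflake.connector

def run_dbt_model(selector: str) -> None:
    # Build the model and run its tests via the dbt CLI
    subprocess.run(["dbt", "run", "--select", selector], check=True)
    subprocess.run(["dbt", "test", "--select", selector], check=True)

def spot_check_row_count() -> int:
    # Quick post-run sanity check against the materialized table
    conn = snowflake.connector.connect(
        account="xy12345.eu-west-1", user="ANALYTICS_USER", password="***",
        warehouse="TRANSFORM_WH", database="ANALYTICS", schema="MARTS",
    )
    try:
        cur = conn.cursor()
        cur.execute("SELECT COUNT(*) FROM MARTS.FCT_CARD_TRANSACTIONS")
        return cur.fetchone()[0]
    finally:
        conn.close()

if __name__ == "__main__":
    run_dbt_model("fct_card_transactions")
    print("rows in fct_card_transactions:", spot_check_row_count())
```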

Posted 1 week ago

Apply

4.0 - 9.0 years

0 - 2 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

- A minimum of 4-10 years of experience in data integration/orchestration services, service architecture and providing data-driven solutions for client requirements.
- Experience with Microsoft Azure cloud and Snowflake SQL, and database query/performance tuning.
- Experience with Qlik Replicate and Compose (Change Data Capture) tools is considered a plus.
- Strong data warehousing concepts; experience with ETL tools such as Talend Cloud Data Integration is a must.
- Exposure to the financial domain is considered a plus.
- Cloud managed services such as source control (GitHub) and MS Azure DevOps are considered a plus.
- Prior experience with State Street and Charles River Development (CRD) is considered a plus.
- Experience with tools such as Visio, PowerPoint, Excel.
- Exposure to third-party data providers such as Bloomberg, Reuters, MSCI and other rating agencies is a plus.
- Strong SQL knowledge and debugging skills are a must.

Posted 1 week ago

Apply

0.0 years

2 - 6 Lacs

Mumbai

Work from Office

Skill required: Data Scientist - Data Science
Designation: I&F Decision Science Practitioner Specialist
Qualifications: Any Graduation
Years of Experience: Experienced

About Accenture
Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song - all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com

What would you do?
You will be aligned with our Insights & Intelligence vertical and help us generate insights by leveraging the latest Artificial Intelligence (AI) and analytics techniques to deliver value to our clients. You will also help us apply your expertise in building world-class solutions, conquering business problems, and addressing technical challenges using AI platforms and technologies. You will be required to utilize the existing frameworks, standards and patterns to create the architectural foundation and services necessary for AI applications that scale from multi-user to enterprise-class, and demonstrate yourself as an expert by actively blogging, publishing research papers, and creating awareness in this emerging area.

You will be working as part of the Marketing & Customer Analytics team, which provides a set of processes that measure, manage and analyze marketing activities in order to provide actionable insights and recommendations to marketing organizations in terms of optimizing ROI and performance efficiency in operations. Customer analytics is a process by which data from customer behavior is used to help make key business decisions via market segmentation and predictive analytics. This information is used by businesses for direct marketing, site selection, and customer relationship management. You should have exposure to digital marketing, A/B testing, MVT, Google Analytics/Site Catalyst.

You will be a core member of Accenture Operations' global Applied Intelligence group, an energetic, strategic, high-visibility and high-impact team, to innovate and transform the Accenture Operations business using machine learning and advanced analytics to support data-driven decisioning. The objectives of the team include, but are not limited to: leading teams of data scientists to build and deploy data science models to uncover deeper insights, predict future outcomes, and optimize business processes for clients; refining and improving data science models based on feedback, new data, and evolving business needs; and analyzing available data to identify opportunities for enhancing brand equity, improving retail margins, achieving profitable growth, and expanding market share for clients.

What are we looking for?
- Extensive experience in leading data science and advanced analytics delivery teams.
- Strong statistical programming experience - Python, R, SAS, S-plus, MATLAB, STATA or SPSS.
- Experience working with large data sets and big data tools like Snowflake, AWS, Spark, etc.
- Solid knowledge in at least one of the following: supervised and unsupervised learning, classification, regression, clustering, neural networks, ensemble modelling (random forest, boosted trees, etc.), multivariate statistics, non-parametric methods, reliability models, Markov models, stochastic models, Bayesian models.
- Experience in at least one of these business domains: CPG, Retail, Marketing Analytics, Customer Analytics, Digital Marketing, eCommerce, Health, Supply Chain.
- Extensive experience in client engagement and business development.
- Ability to work in a global, collaborative team environment.

Roles and Responsibilities:
In this role you are required to analyze and solve moderately complex problems. You will typically create new solutions, leveraging and, where needed, adapting existing methods and procedures. The role requires understanding of the strategic direction set by senior management as it relates to team goals. Primary upward interaction is with the direct supervisor or team leads. You will generally interact with peers and/or management levels at a client and/or within Accenture, and should require minimal guidance when determining methods and procedures on new assignments. Decisions often impact the team in which you reside and occasionally impact other teams. The individual would manage medium-to-small sized teams and/or work efforts (if in an individual contributor role) at a client or within Accenture. Please note that this role may require you to work in rotational shifts.

Qualification: Any Graduation

Posted 1 week ago

Apply

6.0 - 11.0 years

15 - 25 Lacs

Pune

Hybrid

The job description is as below:
- Minimum 6 years of experience in MS SQL database administration.
- At least 2 years of experience in Snowflake is mandatory.
- Experience in replication, performance tuning, etc. is mandatory.
- The candidate should be flexible to work in 24x7 shifts.
- Experience in Azure SQL is preferred.

Posted 1 week ago

Apply

9.0 - 14.0 years

0 - 3 Lacs

Pune, Chennai, Bengaluru

Hybrid

Skill: Snowflake with ADF
Experience: 9-15 years
Location: Pune / Mumbai / Chennai / Bangalore

• Snowflake ELT/ETL mechanisms and ADF experience; stored procedures to ingest structured and semi-structured datasets using files and RDBMS (SQL Server).
• Proficient in data analysis using SQL and other data analysis techniques.
• Expertise with data operations, sustaining data pipelines and production system processes.
• Experience in applying data management patterns to measure and monitor data quality; applying data governance standards and creating a technical data dictionary.
• Experience in SDLC processes, CI/CD and Agile methodologies.
• Ability to work with Jira, SharePoint, Confluence, Git.
• Develops and delivers data & analytics solutions and projects to meet organizational priorities and timelines.
• Analyses enterprise and external data sources, and creates optimized and resilient data pipelines to ingest and enrich them.
• Designs analytical, consumption-ready data models by applying modern data warehouse techniques.
• Creates and enriches curated data models while ensuring conformed, transformed, enriched and connected data sets.
• Enables dashboards/reports in Tableau; applies an analytical mindset in deriving data-driven insights, and designs and creates intuitive dashboards and scorecards.

Apply directly through https://lnkd.in/gE-H4-Xa or kindly share your updated resume to AISHWARYAG5@hexaware.com
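As a hedged sketch of the semi-structured ingestion this role mentions (all object names are hypothetical, and credentials would normally come from a key vault), the example below lands JSON files into a VARIANT column in Snowflake and flattens them into a consumption-ready table from Python:

```python
import snowflake.connector

# Sketch only: stage, tables, and connection details are hypothetical.
conn = snowflake.connector.connect(
    account="xy12345.central-india.azure", user="ELT_USER", password="***",
    warehouse="ELT_WH", database="RAW", schema="LANDING",
)
try:
    cur = conn.cursor()
    # Land JSON files from a stage into a single VARIANT column
    cur.execute("CREATE TABLE IF NOT EXISTS LANDING.EVENTS_JSON (payload VARIANT)")
    cur.execute("""
        COPY INTO LANDING.EVENTS_JSON
        FROM @ADF_STAGE/events/
        FILE_FORMAT = (TYPE = JSON)
    """)
    # Flatten the nested payload into a relational, consumption-ready shape
    cur.execute("""
        CREATE OR REPLACE TABLE CURATED.EVENTS AS
        SELECT
            payload:event_id::STRING    AS event_id,
            payload:event_ts::TIMESTAMP AS event_ts,
            f.value:sku::STRING         AS sku
        FROM LANDING.EVENTS_JSON,
             LATERAL FLATTEN(input => payload:items) f
    """)
finally:
    conn.close()
```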

Posted 1 week ago

Apply

3.0 - 6.0 years

3 - 5 Lacs

Noida, Gurugram, Delhi / NCR

Work from Office

Job Title: Data Engineer - Snowflake & ETL Specialist
Experience: 3-6 years
Employment Type: Full-time
Joining: Immediate
Location: Gurgaon
Department: Data Engineering / Analytics

Job Summary: We are seeking a skilled Data Engineer with strong hands-on experience in Snowflake, ETL development, and AWS Glue. The ideal candidate will be responsible for designing, building, and optimizing scalable data pipelines and data warehouse solutions that support enterprise-level analytics and reporting needs.

Key Responsibilities:
• Develop, optimize, and manage ETL pipelines using AWS Glue, Python, and Snowflake.
• Design and implement data warehouse solutions and data models based on business requirements.
• Work closely with data analysts, BI developers, and stakeholders to ensure clean, consistent, and reliable data delivery.
• Monitor and troubleshoot performance issues related to data pipelines and queries in Snowflake.
• Participate in code reviews, documentation, and knowledge-sharing activities.
• Ensure data security, governance, and compliance with organizational standards.
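For context on the AWS Glue jobs this role maintains, here is a minimal Glue PySpark job skeleton. It is a sketch only: the awsglue imports are available inside the Glue runtime (not via pip locally), and the catalog database, table, and S3 path are hypothetical; a Snowflake sink could replace the final write step.

```python
import sys
from pyspark.context import SparkContext
from pyspark.sql import functions as F
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from awsglue.dynamicframe import DynamicFrame

# Job parameters, catalog names, and S3 paths below are hypothetical.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
sc = SparkContext()
glue_context = GlueContext(sc)
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Extract: read a raw table registered in the Glue Data Catalog
dyf = glue_context.create_dynamic_frame.from_catalog(database="raw_db", table_name="orders")

# Transform: drop bad rows and derive a revenue column using Spark
df = dyf.toDF().filter(F.col("quantity") > 0) \
               .withColumn("revenue", F.col("quantity") * F.col("unit_price"))

# Load: write curated Parquet to S3
glue_context.write_dynamic_frame.from_options(
    frame=DynamicFrame.fromDF(df, glue_context, "orders_curated"),
    connection_type="s3",
    connection_options={"path": "s3://example-curated-bucket/orders/"},
    format="parquet",
)
job.commit()
```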

Posted 1 week ago

Apply

7.0 - 10.0 years

20 - 25 Lacs

Bengaluru

Hybrid

Job Description: We are seeking two Senior Data Consultants with expertise in Python, Splunk, Power BI, and Snowflake to support the management, analysis, and long-term retention of large-scale network data. The role involves designing and implementing data pipelines, creating advanced visualizations, and integrating data with Snowflake for AI-driven actions and reporting. Candidates will work closely with US-based teams, requiring occasional overlap with US business hours.

Key Responsibilities:
- Design, develop, and maintain data pipelines using Python to process and analyze large-scale network data.
- Utilize Splunk for log management, monitoring, and advanced data analytics to derive actionable insights.
- Create interactive dashboards and reports in Power BI to visualize complex network data for stakeholders.
- Integrate and manage data workflows in Snowflake to support long-term retention and AI-related initiatives.
- Collaborate with cross-functional teams, including US-based teams, to align on project goals and deliverables.
- Ensure data quality, security, and compliance with organizational standards.
- Provide technical expertise and mentorship to junior team members.
- Participate in occasional meetings during US business hours to ensure seamless collaboration.

Technical Skills:
- Advanced proficiency in Python for data processing, scripting, and automation.
- Hands-on experience with Splunk for log analysis, search, and dashboard creation.
- Expertise in Power BI for developing interactive reports and dashboards.
- Strong experience with Snowflake for data warehousing, data modeling, and AI-driven workflows.

Domain Knowledge:
- Experience working with large-scale network data or similar high-volume datasets.

Soft Skills:
- Excellent problem-solving, communication, and collaboration skills.

Availability:
- Ability to work onsite in Bangalore 3 days a week and flexibility for occasional US-hour meetings.
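A loose sketch of the Python-to-Snowflake retention step such a role describes: it assumes a hypothetical exported network-traffic CSV and Snowflake objects, aggregates with pandas, and loads via write_pandas (part of snowflake-connector-python's pandas extras); credentials would come from a vault in practice.

```python
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

# Sketch only: the log file, columns, and Snowflake objects are hypothetical.
df = pd.read_csv("netflow_export.csv", parse_dates=["event_time"])

# Light aggregation before long-term retention in Snowflake
daily = (
    df.assign(day=df["event_time"].dt.date)
      .groupby(["day", "src_ip"], as_index=False)["bytes"].sum()
)

conn = snowflake.connector.connect(
    account="xy12345.us-east-1", user="NETWORK_ETL", password="***",
    warehouse="LOAD_WH", database="NETWORK", schema="RETENTION",
)
try:
    # auto_create_table is available in recent connector versions
    write_pandas(conn, daily, table_name="DAILY_TRAFFIC", auto_create_table=True)
finally:
    conn.close()
```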

Posted 1 week ago

Apply

4.0 - 9.0 years

0 - 0 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Role & responsibilities Experience - 4+ yrs Location- Pan India Notice period- immediate to 30 Days Mandatory skills- Python, snowflake along with LLM experience Develop backend services and data-driven applications using Python and Streamlit. Integrate with Snowflake to query governed metrics via dbt Clouds Semantic Layer. Implement smart query routing and optimize Snowflake performance for predictive and descriptive analytics. Support integration with LLM-powered interfaces and natural language query workflows. Good to have: Java script or similar.

Posted 1 week ago

Apply