5.0 - 9.0 years
1 - 6 Lacs
Pune, Chennai, Bengaluru
Work from Office
Job Title: Data Engineer
Experience: 5-7 Years
Location: Bangalore
Job Type: Full-Time with NAM

Job Summary
We are seeking an experienced Data Engineer with 5 to 7 years of experience in building and optimizing data pipelines and architectures on modern cloud data platforms. The ideal candidate will have strong expertise across Google Cloud Platform (GCP), DBT, Snowflake, Apache Airflow, and Data Lake architectures.

Key Responsibilities
- Design, build, and maintain robust, scalable, and efficient ETL/ELT pipelines.
- Implement data ingestion processes using Fivetran and integrate various structured and unstructured data sources into GCP-based environments.
- Develop data models and transformation workflows using DBT and manage version-controlled pipelines.
- Build and manage data storage solutions using Snowflake, optimizing for cost, performance, and scalability.
- Orchestrate workflows and pipeline dependencies using Apache Airflow.
- Design and support Data Lake architecture for raw and curated data zones.
- Collaborate with Data Analysts, Scientists, and Product teams to ensure availability and quality of data.
- Monitor data pipeline performance, ensure data integrity, and handle error recovery mechanisms.
- Follow best practices in CI/CD, testing, data governance, and security standards.

Required Skills
- 5-7 years of professional experience in data engineering roles.
- Hands-on experience with GCP services: BigQuery, Cloud Storage, Pub/Sub, Dataflow, Composer, etc.
- Expertise in Fivetran and experience integrating APIs and external sources.
- Proficient in writing modular SQL transformations and data modeling using DBT.
- Deep understanding of Snowflake warehousing: performance tuning, cost optimization, security.
- Experience with Airflow for pipeline orchestration and DAG management.
- Familiarity with designing and implementing Data Lake solutions.
- Proficient in Python and/or SQL.
- Strong understanding of data governance, data quality frameworks, and DevOps practices.

Preferred Qualifications
- GCP Professional Data Engineer certification is a plus.
- Experience in agile development environments.
- Exposure to data catalog tools and data observability platforms.

Send profiles to narasimha@nam-it.com

Thanks & regards,
Narasimha.B
Staffing Executive
NAM Info Pvt Ltd, 29/2B-01, 1st Floor, K.R. Road, Banashankari 2nd Stage, Bangalore - 560070.
+91 9182480146 (India)
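As a purely illustrative aside on the Airflow plus DBT stack this role calls for, here is a minimal DAG sketch that chains a scheduled dbt run with its tests; the project path, schedule, and task names are placeholder assumptions, not part of the posting.

```python
# Illustrative Airflow DAG: run version-controlled dbt models, then dbt tests.
# All names (paths, schedule, owner) are hypothetical placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "owner": "data-engineering",
    "retries": 2,                          # simple error-recovery policy
    "retry_delay": timedelta(minutes=10),
}

with DAG(
    dag_id="elt_daily_dbt",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 2 * * *",         # run daily after ingestion completes
    catchup=False,
    default_args=default_args,
) as dag:
    # Run the dbt project against the warehouse
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/project && dbt run --target prod",
    )
    # Execute dbt tests so data-quality failures surface in the DAG
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt/project && dbt test --target prod",
    )

    dbt_run >> dbt_test
```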
Posted 3 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
We are seeking a skilled and motivated Data Engineer with hands-on experience in Snowflake, Azure Data Factory (ADF), and Fivetran. The ideal candidate will be responsible for building and optimizing data pipelines, ensuring efficient data integration and transformation to support analytics and business intelligence initiatives.

Key Responsibilities:
- Design, develop, and maintain robust data pipelines using Fivetran, ADF, and other ETL tools.
- Build and manage scalable data models and data warehouses on Snowflake.
- Integrate data from various sources into Snowflake using automated workflows.
- Implement data transformation and cleansing processes to ensure data quality and integrity.
- Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements.
- Monitor pipeline performance, troubleshoot issues, and optimize for efficiency.
- Maintain documentation related to data architecture, processes, and workflows.
- Ensure data security and compliance with company policies and industry standards.

Required Skills & Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 3+ years of experience in data engineering or a similar role.
- Proficiency with Snowflake, including architecture, SQL scripting, and performance tuning.
- Hands-on experience with Azure Data Factory (ADF) for pipeline orchestration and data integration.
- Experience with Fivetran or similar ELT/ETL automation tools.
- Strong SQL skills and familiarity with data warehousing best practices.
- Knowledge of cloud platforms, preferably Microsoft Azure.
- Familiarity with version control tools (e.g., Git) and CI/CD practices.
- Excellent communication and problem-solving skills.

Preferred Qualifications:
- Experience with Python, dbt, or other data transformation tools.
- Understanding of data governance, data quality, and compliance frameworks.
- Knowledge of additional data tools (e.g., Power BI, Databricks, Kafka) is a plus.
Posted 3 weeks ago
3.0 - 8.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Process Delivery Specialist - Talent Development Optimization Processes - Data Analyst

- Help with the data validations and resolve any data discrepancies
- Responsible for creating Mode/Tableau/PowerBI dashboards to surface data for the accounting team to help with reconciliation
- Work closely with the Revenue teams daily doing the following:
  - Creating and updating dashboards
  - Use SQL to query data from the Snowflake database to perform reconciliations and data investigation
  - Root cause analysis
- Work cross-functionally with Revenue, Billing, Engineering, Tax, and Strategic Finance teams to discuss, investigate, and resolve data discrepancies
- Work with large data sets, compare to current state, and then investigate the differences

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- 3+ years working as a Financial Data Analyst supporting accounting teams
- Experience working as a data analyst using a variety of BI tools (Mode/Tableau/PowerBI); primarily using Mode
- Strong SQL knowledge
- Experience with analyzing financial data

Preferred technical and professional experience
- PowerQuery experience
Posted 3 weeks ago
3.0 - 8.0 years
20 - 35 Lacs
Hyderabad, Pune
Work from Office
Technical Data Analyst - Snowflake, SQL, Python (Finance Data Warehouse)

Job Description
We are seeking a highly skilled Technical Data Analyst to join our team and play a key role in building a single source of truth for our high-volume, direct-to-consumer accounting and financial data warehouse. The ideal candidate will have a strong background in data analysis, SQL, and data transformation, with experience in financial data warehousing and reporting. This role will involve working closely with finance and accounting teams to gather requirements, build dashboards, and transform data to support month-end accounting, tax reporting, and financial forecasting.

The financial data warehouse is currently built in Snowflake and will be migrated to Databricks. The candidate will be responsible for transitioning reporting and transformation processes to Databricks while ensuring data accuracy and consistency.

Key Responsibilities:
1. Data Analysis & Reporting:
   - Build and maintain month-end accounting and tax dashboards using SQL and Snowsight in Snowflake.
   - Transition reporting processes to Databricks, creating dashboards and reports to support finance and accounting teams.
   - Gather requirements from finance and accounting stakeholders to design and deliver actionable insights.
2. Data Transformation & Aggregation:
   - Develop and implement data transformation pipelines in Databricks to aggregate financial data and create balance sheet look-forward views.
   - Ensure data accuracy and consistency during the migration from Snowflake to Databricks.
   - Collaborate with the data engineering team to optimize data ingestion and transformation processes.
3. Data Integration & ERP Collaboration:
   - Support the integration of financial data from the data warehouse into NetSuite ERP by ensuring data is properly transformed and validated.
   - Work with cross-functional teams to ensure seamless data flow between systems.
4. Data Ingestion & Tools:
   - Understand and work with Fivetran for data ingestion (no need to be an expert, but familiarity is required).
   - Troubleshoot and resolve data-related issues in collaboration with the data engineering team.

Additional Qualifications:
- 3+ years of experience as a Data Analyst or similar role, preferably in a financial or accounting context.
- Strong proficiency in SQL and experience with Snowflake and Databricks.
- Experience building dashboards and reports for financial data (e.g., month-end close, tax reporting, balance sheets).
- Familiarity with Fivetran or similar data ingestion tools.
- Understanding of financial data concepts (e.g., general ledger, journals, balance sheets, income statements).
- Experience with data transformation and aggregation in a cloud-based environment.
- Strong communication skills to collaborate with finance and accounting teams.
- Nice-to-have: Experience with NetSuite ERP or similar financial systems.
Posted 3 weeks ago
4.0 - 5.0 years
6 - 7 Lacs
Hyderabad
Work from Office
Who are we?
CDK Global is the largest technical solutions provider for the automotive retail industry, setting the landscape for automotive dealers, original equipment manufacturers (OEMs) and the customers they serve. As a technology company, we have a significant focus on moving our applications to the public cloud and, in the process, are working on multiple transformation/modernization initiatives.

Be Part of Something Bigger
Each year, more than three percent of the U.S. gross domestic product (GDP) is attributed to the auto industry, which flows through our customer, the auto dealer. It's time you joined an evolving marketplace where research and development investment is measured in the tens of billions. It's time you were a part of something bigger. We're expanding our workforce - engineers, architects, developers and more - onboarding early adopters who can optimize, pivot and keep pace with ever-evolving development roadmaps and applications.

Join Our Team
Growth potential, flexibility and material impact on the success and quality of a next-gen, enterprise software product make CDK an excellent choice for those who thrive in challenging, fast-paced engineering environments. The possibilities for impact are endless. We have exceptional opportunities to evolve our industry by driving change through new technology. If you're ready for high impact, you're ready for CDK.

Location: Hyderabad, India

Role:
- Define/maintain/implement CDK's public cloud standards, including secrets management, storage, compute, networking, account management, database and operations.
- Leverage tools like AWS Trusted Advisor, 3rd-party cloud cost management tools and scripting to identify and drive cost optimization. This will include working with application owners to achieve the cost savings.
- Design and implement cloud security controls that create guard rails for application teams to work within, ensuring proper platform security for applications deployed within the CDK cloud environments.
- Design/develop/implement cloud solutions. Leveraging cloud-native services, wrap the appropriate security, automation and service levels to support CDK business needs. Examples of solutions this role will be responsible for developing and supporting are Business Continuity/Backup and Recovery, Identity and Access Management, data services including long-term archival, DNS, etc.
- Develop/maintain/implement cloud platform standards (user access & roles, tagging, security/compliance controls, operations management, performance management and configuration management).
- Responsible for writing and eventual automation of operational run-books. Assist application teams with automating their production support run-books (automate everywhere).
- Assist application teams when they have issues using AWS services where they are not fully up to speed in their use. Hands-on development of automation solutions to support application teams.
- Define and maintain minimum application deployment standards (governance, cost management and tech debt).
- Optimize and tune designs based on performance and root cause analysis.
- Analyze existing solutions' alignment to infrastructure standards and provide feedback to evolve and mature both the product solutions and CDK public cloud standards.

Essential Duties & Skills:
This is a hands-on role where the candidate will take on technical tasks requiring in-depth knowledge of usage and public cloud best practices. Some of the areas within AWS where you will be working include:
- Compute: EC2, EKS, RDS, Lambda
- Networking: Load Balancing (ALB/ELB), VPN, Transit Gateways, VPCs, Availability Zones/Regions
- Storage: EBS, S3, Archive Services, AWS Backup
- Security: AWS Config, CloudWatch, CloudTrail, Route53, GuardDuty, Detective, Inspector, Security Hub, Secrets Server, KMS, AWS Shield, Security Groups, AWS Identity and Access Management, etc.
- Cloud Cost Optimization: Cost Optimizer, Trusted Advisor, Cost Explorer, Harness Cloud Cost Management or equivalent cost management tools.

Preferred:
- Experience with 3rd-party SaaS solutions like Databricks, Snowflake, Confluent Kafka
- Broad understanding/experience across full-stack infrastructure technologies
- Site Reliability Engineering practices
- GitHub/Artifactory/Bamboo/Terraform
- Database solutions (SQL/NoSQL)
- Containerization solutions (Docker, Kubernetes)
- DevOps processes and tooling
- Message queuing, data streaming and caching solutions
- Networking principles and concepts
- Scripting and development; Python & Java languages preferred
- Server-based operating systems (Windows/Linux) and web services (IIS, Apache)
- Experience designing, optimizing and troubleshooting public cloud platforms associated with large, complex application stacks
- Clear and concise communication; comfortable working at all levels in the organization
- Capable of managing and prioritizing multiple projects with competing resource requirements and timelines

Years of Experience:
- 4-5+ years working in the AWS public cloud environment
- AWS Solution Architect Professional certification preferred
- Experience with Infrastructure as Code (CloudFormation, Terraform)
Posted 3 weeks ago
9.0 - 14.0 years
25 - 30 Lacs
Gurugram
Work from Office
Reports To: Associate Director - Risk Data Analytics
Level: Level 5

About your team
The Global Risk team in Fidelity covers the management oversight of Fidelity's risk profile, including key risk frameworks, policies and procedures and oversight and challenge processes. The team partners with the businesses to ensure Fidelity manages its risk profile within defined risk appetite. The team comprises risk specialists covering all facets of risk management, including investment, financial, non-financial and strategic risk. As part of a broader General Counsel team, the Risk team collaborates closely with Compliance, Legal, Tax and Corporate Sustainability colleagues.
- Develop efficient data-driven solutions to support SMEs in taking key decisions for oversight & monitoring.
- Keep up with the pace of change in the field of Data Analytics using a cloud-driven technology stack.
- Work on diverse risk subject areas.

About your role
The successful candidate will be responsible for data analysis, visualisation, and reporting for the Global Risk business. This role encompasses the full spectrum of data analysis, data modelling, technical design, and the development of enterprise-level analytics and insights using tools such as Power BI. Additionally, the candidate will provide operational support. Strong relationship management and stakeholder management skills are essential to maintain superior service for our various business contacts and clients.

This role is for a Visualization & Reporting expert who can understand various risk domains such as Investment Risk, Non-Financial Risk, Enterprise Risk, and Strategic Risk, as well as complex risk frameworks and business issues. The candidate must comprehend the functional and technical implications associated with delivering analytics capabilities using various data sources and the Power Platform. This role demands strong hands-on skills in data modelling and transformation using SQL queries and Power Query/DAX, along with expert data visualization and reporting abilities. The successful candidate should be able to handle complex project requirements within agreed timelines while maintaining a high level of deliverable quality. Additionally, they will be expected to interact with stakeholders at all levels of the business, seeking approval and signoffs on project deliverables.

Key Responsibilities
- Understand the scope of business requirements and translate them into stories; define the data ingestion approach, data transformation strategy, data model, and front-end design (UI/UX) for the required product.
- Create working prototypes in tools like Excel or Power BI and reach agreement with business stakeholders before commencing development to ensure engagement.
- Drive the data modelling and data visualization development from start to finish, keeping various stakeholders informed and obtaining approvals/signoffs on known issues, solution design, and risks.
- Work closely with Python developers to develop data adaptors for ingesting, transforming and retaining time-series data as required for the frontend.
- Demonstrate a high degree of proficiency in Power Query, Power BI, advanced DAX calculations and modelling techniques, and developing intuitive visualization solutions.
- Possess strong experience in developing and managing dimensional data models in Power BI or within a data warehouse environment.
- Show proficiency in data integration and architecture, including dimensional data modelling, database design, data warehousing, ETL development, and query performance tuning.
- Advanced data modelling and testing skills using various RDBMS (SQL Server 2017+, Oracle 12c+) and the Snowflake data warehouse will be an added advantage.
- Assess and ensure that the solution being delivered is fit for purpose, efficient, and scalable, refining iteratively if required.
- Collaborate with global teams and stakeholders to deliver the scope of the project.
- Obtain agreement on delivered visuals and solutions, ensuring they meet all business requirements.
- Work collaboratively with the project manager within the team to identify, define, and clarify the scope and terms of complex data visualization requirements.
- Convert raw data into meaningful insights through interactive and easy-to-understand dashboards and reports.
- Coordinate across multiple project teams delivering common, reusable functionality using service-oriented patterns.
- Drive user acceptance testing with the product owner, addressing defects and improving solutions based on observations.
- Interact and work with third-party vendors and suppliers for vendor products and in cases of market data integration.
- Build and contribute towards professional data visualization capabilities within risk teams and at the organization level.
- Stay abreast of key emerging products and industry standards in data visualization and advanced analytics.
- Co-work with other team members for both relationship management and fund promotion.

About you
Experience
- 9+ years of experience in developing and implementing advanced analytics solutions.

Competencies
- Ability to identify & self-manage analysis work for the allocated workstream with minimal or no assistance.
- Ability to develop and maintain strong relationships with stakeholders within the project working group, ensuring continual and effective communication.
- Ability to translate business requirements to technical requirements (internal and external) in supporting the project.
- Excellent interpersonal, communication, documentation, facilitation & presentation skills.
- Fair idea of Agile methodology; familiar with the Stories requirements artefact used in Agile.
- Excellent written and verbal communication skills and a strong team player.
- Good communication, influencing and negotiation skills.
- Proven ability to work well under pressure and in a team environment.
- Self-motivated, flexible, responsible, and a penchant for quality.
- Experience-based domain knowledge of risk management, regulatory compliance or operational compliance functions would be an advantage.
- Basic knowledge and know-how of Data Science and Artificial Intelligence/GenAI.

Qualifications
- Preferred academic qualification: BE / B-Tech / MCA / Any Graduate
Posted 3 weeks ago
3.0 - 8.0 years
8 - 18 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Immediate requirement for a Technical Data Analyst. Please find profile details below:

Experience: 3+ years
Location: Hyderabad / Pune
WFO: 5 days
Interview: 1st round virtual, 2nd round face-to-face (only for Hyderabad)
CTC: Best in the market
Notice Period: Immediate to serving notice (July joiners only)
Company: IT services-based MNC - full time

Job Responsibilities:
Snowflake + Python + SQL experience is mandatory, along with finance/accounting context experience.

Job Description
We are seeking a highly skilled Technical Data Analyst to join our team and play a key role in building a single source of truth for our high-volume, direct-to-consumer accounting and financial data warehouse. The ideal candidate will have a strong background in data analysis, SQL, and data transformation, with experience in financial data warehousing and reporting. This role will involve working closely with finance and accounting teams to gather requirements, build dashboards, and transform data to support month-end accounting, tax reporting, and financial forecasting.

The financial data warehouse is currently built in Snowflake and will be migrated to Databricks. The candidate will be responsible for transitioning reporting and transformation processes to Databricks while ensuring data accuracy and consistency.

Key Responsibilities:
1. Data Analysis & Reporting:
   - Build and maintain month-end accounting and tax dashboards using SQL and Snowsight in Snowflake.
   - Transition reporting processes to Databricks, creating dashboards and reports to support finance and accounting teams.
   - Gather requirements from finance and accounting stakeholders to design and deliver actionable insights.
2. Data Transformation & Aggregation:
   - Develop and implement data transformation pipelines in Databricks to aggregate financial data and create balance sheet look-forward views.
   - Ensure data accuracy and consistency during the migration from Snowflake to Databricks.
   - Collaborate with the data engineering team to optimize data ingestion and transformation processes.
3. Data Integration & ERP Collaboration:
   - Support the integration of financial data from the data warehouse into NetSuite ERP by ensuring data is properly transformed and validated.
   - Work with cross-functional teams to ensure seamless data flow between systems.
4. Data Ingestion & Tools:
   - Understand and work with Fivetran for data ingestion (no need to be an expert, but familiarity is required).
   - Troubleshoot and resolve data-related issues in collaboration with the data engineering team.

Additional Qualifications:
- 3+ years of experience as a Data Analyst or similar role, preferably in a financial or accounting context.
- Strong proficiency in SQL and experience with Snowflake and Databricks.
- Experience building dashboards and reports for financial data (e.g., month-end close, tax reporting, balance sheets).
- Familiarity with Fivetran or similar data ingestion tools.
- Understanding of financial data concepts (e.g., general ledger, journals, balance sheets, income statements).
- Experience with data transformation and aggregation in a cloud-based environment.
- Strong communication skills to collaborate with finance and accounting teams.
- Nice-to-have: Experience with NetSuite ERP or similar financial systems.

Please share details directly at jyoti.c@globalaaplications.com
Posted 3 weeks ago
5.0 - 8.0 years
7 - 10 Lacs
Hyderabad, Ahmedabad
Work from Office
Grade Level (for internal use): 10

The Team: We seek a highly motivated, enthusiastic, and skilled engineer for our Industry Data Solutions Team. We strive to deliver sector-specific, data-rich, and hyper-targeted solutions for evolving business needs. You will be expected to participate in the design review process, write high-quality code, and work with a dedicated team of QA Analysts and Infrastructure Teams.

The Impact: Enterprise Data Organization is seeking a Software Developer to create software design, development, and maintenance for data processing applications. This person would be part of a development team that manages and supports the internal & external applications that support the business portfolio. This role expects the candidate to handle data processing and big data application development. We have teams made up of people that learn how to work effectively together while working with the larger group of developers on our platform.

What's in it for you:
- Opportunity to contribute to the development of a world-class Platform Engineering team.
- Engage in a highly technical, hands-on role designed to elevate team capabilities and foster continuous skill enhancement.
- Be part of a fast-paced, agile environment that processes massive volumes of data, ideal for advancing your software development and data engineering expertise while working with a modern tech stack.
- Contribute to the development and support of Tier-1, business-critical applications that are central to operations.
- Gain exposure to and work with cutting-edge technologies, including AWS Cloud and Databricks.
- Grow your career within a globally distributed team, with clear opportunities for advancement and skill development.

Responsibilities:
- Design and develop applications, components, and common services based on development models, languages, and tools, including unit testing, performance testing, monitoring, and implementation.
- Support business and technology teams as necessary during design, development, and delivery to ensure scalable and robust solutions.
- Build data-intensive applications and services to support and enhance fundamental financials in appropriate technologies (C#, .Net Core, Databricks, Spark, Python, Scala, NiFi, SQL).
- Build data models, achieve performance tuning and apply data architecture concepts.
- Develop applications adhering to secure coding practices and industry-standard coding guidelines, ensuring compliance with security best practices (e.g., OWASP) and internal governance policies.
- Implement and maintain CI/CD pipelines to streamline build, test, and deployment processes; develop comprehensive unit test cases and ensure code quality.
- Provide operations support to resolve issues proactively and with utmost urgency.
- Effectively manage time and multiple tasks.
- Communicate effectively, especially in writing, with the business and other technical groups.

Basic Qualifications:
- Bachelor's/Master's Degree in Computer Science, Information Systems or equivalent.
- Minimum 5 to 8 years of strong hands-on development experience in C#, .Net Core, cloud-native, and MS SQL Server backend development.
- Proficiency with Object Oriented Programming.
- Nice to have knowledge of Grafana, Kibana, Big Data, Kafka, GitHub, EMR, Terraform, AI/ML.
- Advanced SQL programming skills.
- Highly recommended skillset in Databricks, Spark, and Scala technologies.
- Understanding of database performance tuning in large datasets.
- Ability to manage multiple priorities efficiently and effectively within specific timeframes.
- Excellent logical, analytical and communication skills are essential, with strong verbal and writing proficiencies.
- Knowledge of Fundamentals, or the financial industry, highly preferred.
- Experience in conducting application design and code reviews.
- Proficiency with the following technologies: object-oriented programming, programming languages (C#, .Net Core), cloud computing, database systems (SQL, MS SQL). Nice to have: NoSQL (Databricks, Spark, Scala, Python), scripting (Bash, Scala, Perl, PowerShell).

Preferred Qualifications:
- Hands-on experience with cloud computing platforms including AWS, Azure, or Google Cloud Platform (GCP).
- Proficient in working with Snowflake and Databricks for cloud-based data analytics and processing.
Posted 3 weeks ago
5.0 - 10.0 years
10 - 20 Lacs
Bengaluru
Remote
- Minimum 5+ years of experience developing, designing, and implementing data engineering solutions.
- Collaborate with data engineers and architects to design and optimize data models for the Snowflake Data Warehouse.
- Optimize query performance and data storage in Snowflake by utilizing clustering, partitioning, and other optimization techniques.
- Experience working on projects housed within an Amazon Web Services (AWS) cloud environment.
- Experience working on projects that use Tableau and DBT.
- Work closely with business stakeholders to understand requirements and translate them into technical solutions.
- Excellent presentation and communication skills, both written and verbal; ability to problem-solve and design in an environment with unclear requirements.
Posted 3 weeks ago
6.0 - 8.0 years
12 - 16 Lacs
Bengaluru
Hybrid
Role: Business Systems Analyst III
Location: Bengaluru (Hybrid)

The Opportunity:
Our client is seeking a highly skilled and motivated Snowflake FinOps Engineer to play a critical role in managing the spend of our growing Snowflake data platform. You will be responsible for ensuring the efficient and cost-effective operation of our Snowflake environment, combining deep technical expertise in Snowflake administration with a strong focus on financial accountability and resource optimization. This is an exciting opportunity to make a significant impact on our data infrastructure and contribute to a data-driven culture.

Responsibilities - Snowflake Cost Optimization (FinOps):
- Develop and implement a comprehensive Snowflake cost optimization strategy aligned with business objectives.
- Continuously monitor and analyze Snowflake credit consumption and storage costs, identifying key cost drivers and trends.
- Proactively identify and implement opportunities for cost reduction through techniques such as virtual warehouse rightsizing, query optimization, data lifecycle management, and feature utilization.
- Develop and maintain dashboards and reports to track Snowflake spending, identify anomalies, and communicate cost optimization progress to stakeholders.
- Collaborate with engineering and analytics teams to educate them on cost-aware Snowflake practices and promote a culture of cost efficiency.
- Implement and manage Snowflake cost controls and alerts to prevent unexpected spending.
- Evaluate and recommend new Snowflake features and pricing models to optimize cost and performance.
- Automate cost monitoring, reporting, and optimization tasks using scripting and other tools.
- Work closely with finance and procurement teams on Snowflake budgeting and forecasting.
- Establish, document, and enforce a comprehensive tagging standard for Snowflake objects (e.g., virtual warehouses, tables, users) to improve cost tracking, resource allocation, and governance.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience (5+ years) administering and managing Snowflake data warehouse environments.
- Strong understanding of Snowflake architecture, features, and best practices.
- Demonstrated experience in implementing and driving cost optimization strategies for Snowflake.
- Proficiency in SQL and experience with data analysis and visualization tools (e.g., Tableau, Looker, Power BI).
- Experience with scripting languages (e.g., Python) for automation tasks is highly desirable.
- Familiarity with FinOps principles and practices in a cloud environment is a significant advantage.
- Excellent analytical and problem-solving skills with a strong attention to detail.
- Strong communication and collaboration skills, with the ability to explain technical concepts to both technical and non-technical audiences.
- Ability to work independently and manage multiple priorities in a fast-paced environment.
- Snowflake certifications (e.g., SnowPro Core) are a plus.
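For context on the credit-consumption monitoring described above, here is a small, hedged sketch of how per-warehouse Snowflake spend could be pulled from the Account Usage views with Python; the connection details and the 30-day window are assumptions for illustration, not requirements from the listing.

```python
# Illustrative warehouse credit monitoring via ACCOUNT_USAGE, assuming the
# snowflake-connector-python package and a role granted ACCOUNT_USAGE access;
# connection parameters come from environment variables in this sketch.
import os
import snowflake.connector

QUERY = """
    SELECT warehouse_name,
           ROUND(SUM(credits_used), 2) AS credits_last_30d
    FROM snowflake.account_usage.warehouse_metering_history
    WHERE start_time >= DATEADD('day', -30, CURRENT_TIMESTAMP())
    GROUP BY warehouse_name
    ORDER BY credits_last_30d DESC
"""

def report_warehouse_spend() -> None:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        role="ACCOUNTADMIN",  # or any role with ACCOUNT_USAGE privileges
    )
    try:
        for warehouse, credits in conn.cursor().execute(QUERY):
            print(f"{warehouse}: {credits} credits in the last 30 days")
    finally:
        conn.close()

if __name__ == "__main__":
    report_warehouse_spend()
```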
Posted 3 weeks ago
0.0 - 3.0 years
1 - 4 Lacs
Bengaluru
Work from Office
Job Title: Data Engineer
Company Name: Kinara Capital

Job Description:
As a Data Engineer at Kinara Capital, you will play a critical role in building and maintaining the data infrastructure necessary for effective data analysis and decision-making. You will collaborate with data scientists, analysts, and other stakeholders to support data-driven initiatives. Your primary responsibilities will include designing and implementing robust data pipelines, ensuring data quality and integrity, and optimizing data storage and retrieval processes.

Key Responsibilities:
- Develop, construct, test, and maintain data architectures including databases and large-scale processing systems.
- Create and manage data pipelines to ingest, process, and transform data from various sources.
- Collaborate with data scientists and analysts to understand data needs and develop solutions to meet those needs.
- Monitor data quality and implement data governance best practices.
- Optimize SQL queries and improve performance of data-processing systems.
- Ensure data privacy and security standards are met and maintained.
- Document data processes and pipelines to facilitate knowledge sharing within the team.

Skills and Tools Required:
- Proficiency in programming languages such as Python, Java, or Scala.
- Experience with data warehousing solutions, such as Amazon Redshift, Google BigQuery, or Snowflake.
- Strong knowledge of SQL and experience with relational databases like MySQL, PostgreSQL, or Oracle.
- Familiarity with big data technologies like Apache Hadoop, Apache Spark, or Apache Kafka.
- Understanding of data modeling and ETL (Extract, Transform, Load) processes.
- Experience with cloud platforms such as AWS, Azure, or Google Cloud Platform.
- Familiarity with data visualization tools (e.g., Tableau, Power BI) is a plus.
- Strong analytical and problem-solving skills, with attention to detail.
- Excellent communication skills to work collaboratively with cross-functional teams.

Join Kinara Capital and leverage your data engineering skills to help drive innovative solutions and empower businesses through data.
Posted 3 weeks ago
8.0 - 12.0 years
8 - 12 Lacs
Pune, Bengaluru
Hybrid
Role & Responsibilities
- Overall 8+ years of prior experience as a Data Engineer / Data Analyst / BI Engineer.
- At least 5 years of consulting or client service delivery experience on Amazon Web Services (AWS).
- At least 5 years of experience in developing data ingestion, data processing and analytical pipelines for big data, relational databases, NoSQL and data warehouse solutions.
- Minimum of 5 years of hands-on experience in AWS and Big Data technologies such as Python, SQL, EC2, S3, Lambda, Spark/SparkSQL, Redshift, Snowflake, SnapLogic.
- Prior experience with SnapLogic, AWS Glue, and Lambda is a must-have.
- 3-5+ years of hands-on experience in programming languages such as Python, PySpark, Spark, and SQL.
- 2+ years of experience with DevOps tools such as GitLab, Jenkins, CodeBuild, CodePipeline, CodeDeploy, etc.
- Bachelor's or higher degree in Computer Science or a related discipline.
- AWS certification such as Solutions Architect Associate, AWS Developer Associate, or AWS Big Data Specialty (nice to have).
Posted 3 weeks ago
5.0 - 10.0 years
22 - 27 Lacs
Chennai, Mumbai (All Areas)
Work from Office
- Build ETL jobs using Fivetran and dbt for our internal projects and for customers that use various platforms like Azure, Salesforce and AWS technologies.
- Build out data lineage artifacts to ensure all current and future systems are properly documented.

Required Candidate Profile
- Experience with strong proficiency in SQL query/development skills.
- Develop ETL routines that manipulate & transfer large volumes of data and perform quality checks.
- Experience in the healthcare industry with PHI/PII.
Posted 3 weeks ago
6.0 - 11.0 years
10 - 18 Lacs
Hyderabad
Work from Office
TCS Walk-in - Hyderabad - Snowflake Developer

Role: Snowflake Developer
Experience: 7-15 years
Walk-in Date: 5th July 2025
Location: Deccan Park (2S2 Zone), Plot No. 1, Hitech City Main Rd, Software Units Layout, HUDA Techno Enclave, Madhapur, Hyderabad, Telangana 500081

Desired Competencies (Technical/Behavioral Competency):
- Proficient in SQL programming (stored procedures, user-defined functions, CTEs, window functions).
- Design and implement Snowflake data warehousing solutions, including data modelling and schema design.
- Able to source data from APIs, data lakes, and on-premise systems to Snowflake.
- Process semi-structured data using Snowflake-specific features like VARIANT and LATERAL FLATTEN.
- Experience in using Snowpipe to load micro-batch data.
- Good knowledge of caching layers, micro-partitions, clustering keys, clustering depth, materialized views, and scale in/out vs scale up/down of warehouses.
- Ability to implement data pipelines to handle data retention and data redaction use cases.
- Proficient in designing and implementing complex data models, ETL processes, and data governance frameworks.
- Strong hands-on experience in migration projects to Snowflake.
- Deep understanding of cloud-based data platforms and data integration techniques.
- Skilled in writing efficient SQL queries and optimizing database performance.
- Ability to develop and implement a real-time data streaming solution using Snowflake.
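As a brief illustration of the semi-structured data handling this walk-in role mentions, the sketch below shows one possible way to explode a VARIANT array with LATERAL FLATTEN from Snowpark Python; the table, column, and field names are hypothetical and not from the posting.

```python
# Minimal sketch, assuming snowflake-snowpark-python is installed and that a
# hypothetical RAW_EVENTS table holds a VARIANT column named RAW.
from snowflake.snowpark import Session

connection_parameters = {
    "account": "<account>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}

session = Session.builder.configs(connection_parameters).create()

# Explode an array held inside the VARIANT column with LATERAL FLATTEN,
# then project typed fields out of the semi-structured payload.
rows = session.sql("""
    SELECT e.raw:order_id::STRING        AS order_id,
           item.value:sku::STRING        AS sku,
           item.value:quantity::NUMBER   AS quantity
    FROM raw_events e,
         LATERAL FLATTEN(input => e.raw:line_items) item
""").collect()

for r in rows:
    print(r["ORDER_ID"], r["SKU"], r["QUANTITY"])

session.close()
```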
Posted 3 weeks ago
3.0 - 5.0 years
5 - 8 Lacs
Noida
Work from Office
Must have:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related discipline.
- 3-5+ years of experience in SQL Development and Data Engineering.
- Strong hands-on skills in T-SQL, including complex joins, indexing strategies, and query optimization.
- Proven experience in Power BI development, including building dashboards, writing DAX expressions, and using Power Query.

Should have:
- At least 1+ year of hands-on experience with one or more components of the Azure Data Platform:
  - Azure Data Factory (ADF)
  - Azure Databricks
  - Azure SQL Database
  - Azure Synapse Analytics
- Solid understanding of data warehouse architecture, including star and snowflake schemas, and data lake design principles.
- Familiarity with:
  - Data Lake and Delta Lake concepts
  - Lakehouse architecture
  - Data governance, data lineage, and security controls within Azure
Posted 3 weeks ago
3.0 - 7.0 years
9 - 13 Lacs
Jaipur
Work from Office
Job Summary
Auriga IT is seeking a proactive, problem-solving Data Analyst with 3-5 years of experience owning end-to-end data pipelines. You'll partner with stakeholders across engineering, product, marketing, and finance to turn raw data into actionable insights that drive business decisions. You must be fluent in the core libraries, tools, and cloud services listed below.

Your Responsibilities:
- Pipeline Management: Design, build, and maintain ETL/ELT workflows using orchestration frameworks (e.g., Airflow, dbt).
- Exploratory Data Analysis & Visualization: Perform EDA and statistical analysis using Python or R. Prototype and deliver interactive charts and dashboards.
- BI & Reporting: Develop dashboards and scheduled reports to surface KPIs and trends. Configure real-time alerts for data anomalies or thresholds.
- Insights Delivery & Storytelling: Translate complex analyses (A/B tests, forecasting, cohort analysis) into clear recommendations. Present findings to both technical and non-technical audiences.
- Collaboration & Governance: Work cross-functionally to define data requirements, ensure quality, and maintain governance. Mentor junior team members on best practices in code, version control, and documentation.

Key Skills (you must know at least one technology from each category below):
- Data Manipulation & Analysis: Python (pandas, NumPy) or R (tidyverse: dplyr, tidyr)
- Visualization & Dashboarding: Python (matplotlib, seaborn, Plotly) or R (ggplot2, Shiny)
- BI Platforms: commercial or open-source (e.g., Tableau, Power BI, Apache Superset, Grafana)
- ETL/ELT Orchestration: Apache Airflow, dbt, or equivalent
- Cloud Data Services: AWS (Redshift, Athena, QuickSight), GCP (BigQuery, Data Studio), or Azure (Synapse, Data Explorer)
- Databases & Querying: strong SQL skills with an RDBMS (PostgreSQL, MySQL, Snowflake); decent knowledge of NoSQL databases

Additionally:
- Bachelor's or Master's in a quantitative field (Statistics, CS, Economics, etc.).
- 3-5 years in a data analyst (or similar) role with end-to-end pipeline ownership.
- Strong problem-solving mindset and excellent communication skills.
- Power BI or Tableau certification is a plus.

Desired Skills & Attributes
- Familiarity with version control (Git) and CI/CD for analytics code.
- Exposure to basic machine-learning workflows (scikit-learn, caret).
- Comfortable working in Agile/Scrum environments and collaborating across domains.
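As a small illustration of the pandas-based EDA and KPI reporting this role lists, the following sketch profiles a dataset and plots a monthly KPI; the file name and columns (order_date, region, revenue) are invented for the example.

```python
# Brief EDA sketch with pandas and matplotlib; all data assumptions are hypothetical.
import pandas as pd
import matplotlib.pyplot as plt

orders = pd.read_csv("orders.csv", parse_dates=["order_date"])

# Profile the data: schema, summary statistics, and missing-value rates.
print(orders.info())
print(orders.describe(include="all"))
print(orders.isna().mean().sort_values(ascending=False).head())

# Aggregate a KPI (monthly revenue by region) and plot the trend.
monthly = (
    orders
    .assign(month=orders["order_date"].dt.to_period("M").dt.to_timestamp())
    .groupby(["month", "region"], as_index=False)["revenue"].sum()
)
monthly.pivot(index="month", columns="region", values="revenue").plot(marker="o")
plt.title("Monthly revenue by region")
plt.ylabel("Revenue")
plt.tight_layout()
plt.show()
```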
Posted 3 weeks ago
5.0 - 10.0 years
20 - 35 Lacs
Kolkata, Pune, Chennai
Hybrid
Roles and Responsibilities
- Design, develop, test, deploy, and maintain large-scale data warehousing solutions using Snowflake.
- Collaborate with cross-functional teams to gather requirements and deliver high-quality solutions on time.
- Develop complex SQL queries to optimize database performance and troubleshoot issues.
- Implement automation scripts using Python to streamline tasks and improve efficiency.
- Participate in code reviews to ensure adherence to coding standards and best practices.

Mandatory Skills: Snowflake, SQL, Python, DBT
Posted 3 weeks ago
2.0 - 7.0 years
0 - 1 Lacs
Mumbai
Remote
Data Engineer
Company Name: Fluid AI

Role Overview:
As a Data Engineer, you will be responsible for designing and maintaining the data frameworks that power our Gen-AI products. You'll work closely with engineering, product, and AI research teams to ensure our data models are scalable, secure, and optimized for real-world performance across diverse use cases. This is a hands-on and strategic role, ideal for someone who thrives in fast-paced, innovative environments.

Key Responsibilities:
- Design, implement, and optimize data architectures to support large-scale AI and machine learning systems
- Collaborate with cross-functional teams to define data models, APIs, and integration flows
- Architect secure, scalable data pipelines for structured and unstructured data
- Oversee data governance, access controls, and compliance (GDPR, SOC2, etc.)
- Select appropriate data storage technologies (SQL/NoSQL/data lakes) for various workloads
- Work with MLOps and DevOps teams to enable real-time data availability and model serving
- Evaluate and integrate third-party APIs, datasets, and connectors
- Contribute to system documentation and data architecture diagrams
- Support AI researchers with high-quality, well-structured data pipelines

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field
- 5+ years of experience as a Data Architect, Data Engineer, or in a similar role
- Expertise in designing cloud-based data architectures (AWS, Azure, GCP)
- Strong knowledge of SQL, NoSQL, and distributed databases (PostgreSQL, MongoDB, Cassandra, etc.)
- Experience with big data tools like Spark, Kafka, Airflow, or similar
- Familiarity with data warehousing tools (Redshift, BigQuery, Snowflake)
- Solid understanding of data privacy, compliance, and governance best practices

Preferred Qualifications:
- Experience working on AI/ML or Gen AI-related products
- Proficiency in Python or another scripting language used for data processing
- Exposure to building APIs for data ingestion and consumption
- Prior experience supporting enterprise-level SaaS products
- Strong analytical and communication skills

Travel & Documentation Requirement:
- Candidate must hold a valid passport
- Willingness to travel overseas for 1 week (as part of client collaboration)
- Having a valid US visa (e.g., B1/B2, H1B, Green Card, etc.) is a strong advantage

Why Join Us:
- Work on high-impact, cutting-edge Generative AI products
- Collaborate with some of the best minds in AI, engineering, and product
- Flexible work culture with global exposure
- Opportunity to work on deeply technical challenges with real-world impact
Posted 4 weeks ago
4.0 - 9.0 years
6 - 16 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Role & Responsibilities - Technical Skills:
- Proficiency in Snowflake scripting: stored procedures (SQL), user-defined functions, common table expressions, window functions.
- Experience in creating Snowpark Python procedures and user-defined functions.
- Knowledge of Snowpark architecture, creation of Snowpark DataFrames, and data transformations in Snowpark.
- Process Parquet and JSON semi-structured data in Snowflake using PARSE_JSON and LATERAL FLATTEN.
- DBT (data build tool): hands-on experience with DBT Cloud and DBT Core, creating models as per requirements.
- DBT experience in creating and using macros, Jinja scripting, hooks, automated tests, snapshots, and DBT packages.
- Experience implementing data sharing, replication, and dynamic data masking using masking policies, secure views, and row access policies.
- Usage of tags, streams, tasks, external tables, time travel, clone, storage integration, stages, file formats, clustering of larger tables on clustering keys, and role-based access control.
- Identify and fix performance issues in Snowflake and DBT.
- Experience working with the Visual Studio Code IDE.
- Familiarity with Git concepts like creating branches, cloning repos, and creating pull requests.
- Knowledge of Apache Airflow to schedule pipelines.
- Ability to create DB models to load dimensional models consisting of facts and dimensions.
- Knowledge of Azure DevOps pipelines, agents, and repos.
- Understanding of AWS services: S3, PrivateLink, IAM roles, security groups, VPC endpoints.
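To illustrate the Snowpark Python capabilities listed above, here is a hedged sketch that registers a scalar Python UDF and applies it in a DataFrame transformation; the connection parameters, table, and column names are placeholders, not taken from the posting.

```python
# Hedged sketch of a Snowpark Python UDF plus a DataFrame transformation,
# assuming snowflake-snowpark-python and a placeholder ORDERS table.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, call_udf
from snowflake.snowpark.types import FloatType

session = Session.builder.configs({
    "account": "<account>", "user": "<user>", "password": "<password>",
    "warehouse": "<warehouse>", "database": "<database>", "schema": "<schema>",
}).create()

# Register a scalar Python UDF that Snowflake executes server-side.
def net_amount(gross: float, tax_rate: float) -> float:
    return round(gross / (1.0 + tax_rate), 2)

session.udf.register(
    func=net_amount,
    name="net_amount",
    return_type=FloatType(),
    input_types=[FloatType(), FloatType()],
    replace=True,
)

# Use the UDF in a Snowpark DataFrame transformation and persist the result.
orders = session.table("orders")
enriched = orders.with_column(
    "net_amount", call_udf("net_amount", col("gross_amount"), col("tax_rate"))
)
enriched.write.save_as_table("orders_enriched", mode="overwrite")
```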
Posted 4 weeks ago
3.0 - 8.0 years
10 - 20 Lacs
Hyderabad, Pune
Hybrid
Snowflake L2/L3 Senior Engineer (Managed Services) - 3 Positions
Location: Hybrid
Employment Type: Full-Time
Experience Level: 5+ years (with 3+ in Snowflake)
Shift: Rotational Shifts (24/7)

Role Overview:
We are looking for an experienced Snowflake Senior Engineer to oversee both development and production support operations within our managed services model. This is a dual-role leadership position requiring strong technical capabilities in Snowflake as well as expertise in managing ongoing production operations, enhancements, and client coordination.

Key Responsibilities:
- Lead a team of Snowflake developers and support engineers providing enhancements & feature development and L2/L3 production support (incident management, monitoring, RCA).
- Manage and prioritize the support backlog and enhancement pipeline.
- Serve as technical SME for Snowflake development and troubleshooting.
- Ensure high platform availability and performance tuning.
- Conduct performance analysis and enforce Snowflake best practices.
- Coordinate with client stakeholders, DevOps, data engineers, and QA teams.
- Own support SLAs, incident resolution timelines, and change management.
- Prepare regular service reports and participate in governance calls.

Required Skills & Experience:
- 3+ years of hands-on Snowflake development and administration.
- 6+ years of experience in data engineering or BI/DW support.
- Experience leading teams in a managed services or enterprise support model.
- Strong SQL, performance tuning, and debugging skills.
- Knowledge of CI/CD, Python, ADF or similar orchestration tools.
- Familiarity with monitoring tools and Snowflake Account Usage views.
- Experience with Azure Data Factory.

Preferred:
- SnowPro Certification (Core/Advanced)
- Experience with ServiceNow or Jira
- Experience managing global support teams

Snowflake L2/L3 Engineer - 2 Openings
Location: Hybrid
Employment Type: Full-Time
Experience Level: 2-4 years
Shift: Rotational Shifts (24/7)

Role Overview:
Join our Snowflake Managed Services team as a Software Engineer to work on data platform development, enhancements, and production support. You will support Snowflake environments across multiple clients, ensuring stability, performance, and continuous improvement.

Key Responsibilities:
- Design and develop Snowflake pipelines, data models, and transformations.
- Provide L2/L3 production support for Snowflake jobs, queries, and integrations.
- Troubleshoot failed jobs, resolve incidents, and conduct RCA.
- Tune queries, monitor warehouses, and help optimize Snowflake usage and cost.
- Handle service requests like user provisioning, access changes, and role management.
- Document issues, enhancements, and standard procedures (runbooks).

Required Skills & Experience:
- 2+ years of hands-on experience in Snowflake development and support.
- Strong SQL, data modeling, and performance tuning experience.
- Exposure to CI/CD pipelines and scripting languages (e.g., Python, Shell).
- Experience with data pipelines and orchestration tools (ADF).

Preferred:
- SnowPro Core Certification
- Experience with ticketing systems (ServiceNow, Jira)
- Cloud experience with Azure
- Basic understanding of ITIL processes
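As an illustrative aid for the L2/L3 monitoring duties both roles describe, the sketch below lists recent failed queries from the Snowflake Account Usage QUERY_HISTORY view; the credentials and 24-hour lookback are assumptions made only for the example.

```python
# Illustrative L2 triage sketch: surface recent failed queries for incident review.
# Assumes snowflake-connector-python; credentials are placeholders.
import os
import snowflake.connector

FAILED_QUERIES = """
    SELECT query_id,
           user_name,
           warehouse_name,
           error_code,
           error_message,
           start_time
    FROM snowflake.account_usage.query_history
    WHERE execution_status = 'FAIL'
      AND start_time >= DATEADD('hour', -24, CURRENT_TIMESTAMP())
    ORDER BY start_time DESC
    LIMIT 50
"""

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
)
try:
    # DictCursor returns each row as a dict keyed by upper-case column name.
    cursor = conn.cursor(snowflake.connector.DictCursor)
    for row in cursor.execute(FAILED_QUERIES):
        print(row["START_TIME"], row["WAREHOUSE_NAME"],
              row["ERROR_CODE"], row["ERROR_MESSAGE"])
finally:
    conn.close()
```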
Posted 4 weeks ago
6.0 - 10.0 years
20 - 25 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Snowflake Developer - Reputed US-based IT MNC

If you are a Snowflake/Matillion Developer, email your CV to jagannaath@kamms.net

Experience: 5 years+ (must be 100% real-time experience to apply)
Role: Snowflake Developer
Preferred: Snowflake certifications (SnowPro Core/Advanced)
Position Type: Full time / Permanent
Location: Hyderabad, Bengaluru and Chennai (Hybrid - local candidates)
Notice Period: Immediate to 15 Days
Salary: As per your experience

Responsibilities:
- 5+ years of experience in data engineering, ETL, and Snowflake development.
- Strong expertise in Snowflake SQL scripting, performance tuning, and data warehousing concepts.
- Strong knowledge of cloud platforms (AWS/Azure/GCP) and cloud-based data architecture.
- Proficiency in SQL, Python, or scripting languages for automation & transformation.
- Experience with API integrations & data ingestion frameworks.
- Understanding of data governance, security policies, and access control in Snowflake.
- Excellent communication skills; ability to interact with business and technical stakeholders.
- Self-starter who can work independently and drive projects to completion.
Posted 4 weeks ago
10.0 - 15.0 years
14 - 18 Lacs
Bengaluru
Work from Office
About the Role:
We're looking for an experienced Engineering Manager to lead the development of highly scalable, reliable, and secure platform services and database connectors that power mission-critical data pipelines for thousands of enterprise customers. These pipelines connect to modern data warehouses such as Snowflake, BigQuery, and Databricks, as well as data lakes like Apache. This is a rare opportunity to own and build foundational systems, solve complex engineering challenges, and lead a high-impact team delivering best-in-class performance at scale. You will play a central role in shaping our platform vision, driving high accountability, and fostering a culture of technical excellence and high performance while working closely with cross-functional stakeholders across product, program, support, and business teams.

What You'll Do:
- Lead, mentor and inspire a team of software engineers who take pride in ownership and delivering impact.
- Ensure operational excellence through proactive monitoring, automated processes, and a culture of continuous improvement with strong accountability.
- Drive a strong quality-first mindset, embedding it into the development lifecycle from design to deployment.
- Drive technical leadership through architecture reviews, code guidance, and solving critical platform challenges.
- Build and operate multi-tenant, distributed backend systems at scale.
- Act as a technical leader; you've operated at least at Tech Lead, Staff Engineer, or Principal Engineer level in your career.
- Champion a culture of high accountability, clear ownership, and high visibility across engineering and cross-functional stakeholders.
- Collaborate deeply with Product, Program, Support, and Business functions to drive alignment and execution.
- Embed principles of observability, reliability, security, and auditability into all aspects of the platform.
- Inspire the team to pursue engineering excellence, driving best-in-class implementations and visible results.
- Define and track data-driven KPIs to ensure operational efficiency, performance, and team effectiveness.
- Take end-to-end ownership of product lines, ensuring on-time delivery and customer success.
- Contribute to team growth, hiring, and building an inclusive, learning-focused engineering environment.

What We're Looking For:
- 10+ years of experience in backend or systems software development.
- 2+ years in a formal or informal Engineering Manager, Sr. Engineering Manager, or Tech Lead role in a fast-paced engineering environment.
- Progression through senior IC roles like Tech Lead, Staff, or Principal Engineer.
- Strong experience with distributed systems, cloud-native architectures, and multi-tenant platforms.
- Proven ability to drive cross-team collaboration with product, support, business, and program teams.
- Demonstrated ability to drive accountability, set clear goals, and raise the performance bar for the team.
- Expertise in system design, scalability, performance optimization, and cost control.
- Proven track record of mentoring engineers, guiding architecture, and leading impactful initiatives.
- Clear communicator, adept at both strategy and execution.

Bonus Points:
- Experience with data engineering platforms, ETL systems, or database internals.
- Exposure to product-driven companies, especially in infrastructure, SaaS, or backup/data systems.
- Demonstrated history of fast-tracked growth or high-visibility impact.
- Led or contributed to re-architecture or greenfield systems at scale.
Posted 4 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
Job Title: Technical Data Analyst
Work Mode: Remote
Contract Duration: 6 Months to 1 Year
Location: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote (open to candidates across India)
Experience: 5+ Years

Job Overview:
We are seeking a highly skilled Technical Data Analyst for a remote contract position (6 to 12 months) to help build a single source of truth for our high-volume direct-to-consumer accounting and financial data warehouse. You will work closely with Finance & Accounting teams and play a pivotal role in dashboard creation, data transformation, and migration from Snowflake to Databricks.

Key Responsibilities:
1. Data Analysis & Reporting
   - Develop month-end accounting and tax dashboards using SQL in Snowflake (Snowsight)
   - Migrate and transition reports/dashboards to Databricks
   - Gather, analyze, and transform business requirements from finance/accounting stakeholders into data products
2. Data Transformation & Aggregation
   - Build transformation pipelines in Databricks to support balance sheet look-forward views
   - Maintain data accuracy and consistency throughout the Snowflake-to-Databricks migration
   - Partner with Data Engineering to optimize pipeline performance
3. ERP & Data Integration
   - Support integration of financial data with NetSuite ERP
   - Validate transformed data to ensure correct ingestion and mapping into ERP systems
4. Ingestion & Data Ops
   - Work with Fivetran for ingestion and resolve any pipeline or data accuracy issues
   - Monitor data workflows and collaborate with engineering teams on troubleshooting

Required Skills & Qualifications:
- 5+ years of experience as a Data Analyst (preferably in the Finance/Accounting domain)
- Strong in SQL, with proven experience in Snowflake and Databricks
- Experience in building financial dashboards (month-end close, tax reporting, balance sheets)
- Understanding of financial/accounting data: GL, journal entries, balance sheet, income statements
- Familiarity with Fivetran or similar data ingestion tools
- Experience with data transformation in a cloud environment
- Strong communication and stakeholder management skills
- Nice to have: Experience working with NetSuite ERP

Apply Now: Please share your updated resume with the following details:
- Full Name
- Total Experience
- Relevant Experience in SQL, Snowflake, Databricks
- Experience in Finance or Accounting domain
- Current Location
- Availability (Notice Period)
- Current and Expected Rate
Posted 4 weeks ago
5.0 - 9.0 years
7 - 11 Lacs
Pune
Work from Office
We are hiring a Data Operations Engineer for a 6-month contractual role based in Pune. The ideal candidate should have 5-9 years of experience in Data Operations, Technical Support, or Reporting QA. You will be responsible for monitoring data health, validating config payloads, troubleshooting Airflow DAGs, documenting best practices, and supporting Ad Tech partner integrations. Proficiency in Snowflake, Airflow, Python scripting, and SQL is mandatory. Excellent communication, problem-solving skills, and a proactive attitude are essential for success in this role.
Posted 4 weeks ago
10.0 - 15.0 years
35 - 40 Lacs
Gurugram, Bengaluru
Work from Office
Department: Technology
Reports To: Middle and Back Office Data Product Owner

About your team
The Technology function provides IT services that are integral to running an efficient run-the-business operating model and providing change-driven solutions to meet outcomes that deliver on our business strategy. These include the development and support of business applications that underpin our revenue, operational, compliance, finance, legal, marketing and customer service functions. The broader organisation incorporates Infrastructure services that the firm relies on to operate on a day-to-day basis, including data centre, networks, proximity services, security, voice, incident management and remediation.

The ISS Technology group is responsible for providing Technology solutions to the Investment Solutions & Services (ISS) business (which covers Investment Management, Asset Management Operations & Distribution business units globally). The ISS Technology team supports and enhances existing applications as well as designs, builds and procures new solutions to meet requirements and enable the evolving business strategy. As part of this group, a dedicated ISS Data Programme team has been mobilised as a key foundational programme to support the execution of the overarching ISS strategy.

About your role
The Middle and Back Office Data Analyst role is instrumental in the creation and execution of a future state design for Fund Servicing & Oversight data across Fidelity's key business areas. The successful candidate will have an in-depth knowledge of data domains that represent Middle and Back-office operations and technology. The role will sit within the ISS Delivery Data Analysis chapter, fully aligned to deliver Fidelity's cross-functional ISS Data Programme in Technology, and the candidate will leverage their extensive industry knowledge to build a future state platform in collaboration with Business Architecture, Data Architecture, and business stakeholders. The role is to maintain strong relationships with the various business contacts to ensure a superior service to our clients.

Data Product - Requirements Definition and Delivery of Data Outcomes
- Analyse data product requirements to enable business outcomes, contributing to the data product roadmap.
- Capture both functional and non-functional data requirements considering the data product and consumers' perspectives.
- Conduct workshops with both the business and tech stakeholders for requirements gathering, elicitation and walkthroughs.
- Responsible for the definition of data requirements, epics and stories within the product backlog and providing analysis support throughout the SDLC.
- Responsible for supporting the UAT cycles, attaining business sign-off on outcomes being delivered.

Data Quality and Integrity
- Define data quality use cases for all the required data sets and contribute to the technical frameworks of data quality.
- Align the functional solution with best-practice data architecture & engineering principles.

Coordination and Communication
- Excellent communication skills to influence technology and business stakeholders globally, attaining alignment and sign-off on the requirements.
- Coordinate with internal and external stakeholders to communicate data product deliveries and the change impact to the operating model.
- An advocate for the ISS Data Programme.
- Collaborate closely with Data Governance, Business Architecture, and Data Owners etc.
- Conduct workshops within the scrum teams and across business teams, effectively document the minutes and drive the actions.

About you
- At least 10 years of proven experience as a business/technical/data analyst within technology and/or business change within the financial services/asset management industry.
- Minimum 5 years as a senior business/technical/data analyst adhering to agile methodology, delivering data solutions using industry-leading data platforms such as Snowflake, State Street Alpha Data, Refinitiv Eikon, SimCorp Dimension, BlackRock Aladdin, FactSet etc.
- Proven experience of delivering data-driven business outcomes using industry-leading data platforms such as Snowflake.
- Excellent knowledge of the data life cycle that drives Middle and Back Office capabilities such as trade execution, matching, confirmation, trade settlement, record keeping, accounting, fund & cash positions, custody, collaterals/margin movements, corporate actions, and derivations and calculations such as holiday handling, portfolio turnover rates, and funds-of-funds look-through.
- In-depth expertise in data and calculations across the investment industry covering the below:
  - Asset-specific data: data related to financial instruments and reference data like asset specifications, maintenance records, usage history, and depreciation schedules.
  - Market data: data like security prices, exchange rates, index constituents and licensing restrictions on them.
  - ABOR & IBOR data: calculation engines covering input data sets, calculations and treatment of various instruments for ABOR and IBOR data, leveraging platforms such as SimCorp, NeoXam, Invest1, Charles River, Aladdin etc.
- Knowledge of TPAs and how data can be structured in a unified way from heterogeneous structures.
- Should possess problem solving, attention to detail and critical thinking.

Technical Skills
- Excellent hands-on SQL, advanced Excel, Python, ML (optional) and proven experience and knowledge of data solutions.
- Knowledge of data management, data governance, and data engineering practices.
- Hands-on experience with data modelling techniques such as dimensional, data vault etc.
- Willingness to own and drive things; collaboration across business and tech stakeholders.
Posted 4 weeks ago