6.0 - 11.0 years
10 - 18 Lacs
Hyderabad
Work from Office
TCS Walk-in, Hyderabad: Snowflake Developer
Role: Snowflake Developer
Experience: 7-15 years
Walk-in Date: 5th July 2025
Location: Deccan Park (2S2 Zone), Plot No. 1, Hitech City Main Rd, Software Units Layout, HUDA Techno Enclave, Madhapur, Hyderabad, Telangana 500081
Desired Competencies (Technical/Behavioral):
- Proficient in SQL programming (stored procedures, user-defined functions, CTEs, window functions).
- Design and implement Snowflake data warehousing solutions, including data modelling and schema design.
- Able to source data from APIs, data lakes, and on-premise systems into Snowflake.
- Process semi-structured data using Snowflake-specific features such as the VARIANT type and LATERAL FLATTEN.
- Experience using Snowpipe to load micro-batch data.
- Good knowledge of caching layers, micro-partitions, clustering keys, clustering depth, materialized views, and scale in/out vs. scale up/down of warehouses.
- Ability to implement data pipelines that handle data retention and data redaction use cases.
- Proficient in designing and implementing complex data models, ETL processes, and data governance frameworks.
- Strong hands-on experience in migration projects to Snowflake.
- Deep understanding of cloud-based data platforms and data integration techniques.
- Skilled in writing efficient SQL queries and optimizing database performance.
- Ability to design and implement real-time data streaming solutions using Snowflake.
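The VARIANT/LATERAL FLATTEN requirement above is easiest to picture with a concrete query. A minimal sketch using the Snowflake Python connector, assuming a hypothetical raw_events table with a VARIANT column named payload and locally supplied credentials:

```python
import snowflake.connector  # pip install snowflake-connector-python

# Hypothetical connection parameters; replace with your own account details.
conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="my_password",
    warehouse="ANALYTICS_WH", database="RAW", schema="EVENTS",
)

# Explode a JSON array held in a VARIANT column into one row per element.
query = """
    SELECT e.event_id,
           f.value:sku::string AS sku,
           f.value:qty::number AS quantity
    FROM raw_events e,
         LATERAL FLATTEN(input => e.payload:items) f
"""
for row in conn.cursor().execute(query):
    print(row)
conn.close()
```

The same FLATTEN pattern applies to Parquet or Avro data once it has been loaded into a VARIANT column.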
Posted 3 weeks ago
3.0 - 5.0 years
5 - 8 Lacs
Noida
Work from Office
Must have:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related discipline.
- 3-5 years of experience in SQL development and data engineering.
- Strong hands-on skills in T-SQL, including complex joins, indexing strategies, and query optimization.
- Proven experience in Power BI development, including building dashboards, writing DAX expressions, and using Power Query.
Should have:
- At least 1 year of hands-on experience with one or more components of the Azure Data Platform: Azure Data Factory (ADF), Azure Databricks, Azure SQL Database, Azure Synapse Analytics.
- Solid understanding of data warehouse architecture, including star and snowflake schemas, and data lake design principles.
- Familiarity with Data Lake and Delta Lake concepts, Lakehouse architecture, and data governance, data lineage, and security controls within Azure.
Posted 3 weeks ago
3.0 - 7.0 years
9 - 13 Lacs
Jaipur
Work from Office
Job Summary: Auriga IT is seeking a proactive, problem-solving Data Analyst with 3-5 years of experience owning end-to-end data pipelines. You'll partner with stakeholders across engineering, product, marketing, and finance to turn raw data into actionable insights that drive business decisions. You must be fluent in the core libraries, tools, and cloud services listed below.
Your Responsibilities:
- Pipeline Management: Design, build, and maintain ETL/ELT workflows using orchestration frameworks (e.g., Airflow, dbt).
- Exploratory Data Analysis & Visualization: Perform EDA and statistical analysis using Python or R. Prototype and deliver interactive charts and dashboards.
- BI & Reporting: Develop dashboards and scheduled reports to surface KPIs and trends. Configure real-time alerts for data anomalies or thresholds.
- Insights Delivery & Storytelling: Translate complex analyses (A/B tests, forecasting, cohort analysis) into clear recommendations. Present findings to both technical and non-technical audiences.
- Collaboration & Governance: Work cross-functionally to define data requirements, ensure quality, and maintain governance. Mentor junior team members on best practices in code, version control, and documentation.
Key Skills (you must know at least one technology from each category below):
- Data Manipulation & Analysis: Python (pandas, NumPy) or R (tidyverse: dplyr, tidyr)
- Visualization & Dashboarding: Python (matplotlib, seaborn, Plotly) or R (ggplot2, Shiny)
- BI Platforms: commercial or open source (e.g., Tableau, Power BI, Apache Superset, Grafana)
- ETL/ELT Orchestration: Apache Airflow, dbt, or equivalent
- Cloud Data Services: AWS (Redshift, Athena, QuickSight), GCP (BigQuery, Data Studio), or Azure (Synapse, Data Explorer)
- Databases & Querying: strong SQL skills with an RDBMS (PostgreSQL, MySQL, Snowflake) and working knowledge of NoSQL databases
Additionally:
- Bachelor's or Master's in a quantitative field (Statistics, CS, Economics, etc.).
- 3-5 years in a data analyst (or similar) role with end-to-end pipeline ownership.
- Strong problem-solving mindset and excellent communication skills.
- Certification in Power BI or Tableau is a plus.
Desired Skills & Attributes:
- Familiarity with version control (Git) and CI/CD for analytics code.
- Exposure to basic machine-learning workflows (scikit-learn, caret).
- Comfortable working in Agile/Scrum environments and collaborating across domains.
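As an illustration of the ETL/ELT orchestration mentioned above, here is a minimal Airflow DAG sketch; the DAG name, schedule, and extract/load functions are hypothetical placeholders, not part of the posting:

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Placeholder: pull raw data from a source system or API.
    print("extracting raw data")

def load():
    # Placeholder: load transformed data into the warehouse.
    print("loading data into the warehouse")

with DAG(
    dag_id="daily_sales_pipeline",          # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task               # run extract before load
```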
Posted 3 weeks ago
5.0 - 10.0 years
20 - 35 Lacs
Kolkata, Pune, Chennai
Hybrid
Roles and Responsibilities:
- Design, develop, test, deploy, and maintain large-scale data warehousing solutions using Snowflake.
- Collaborate with cross-functional teams to gather requirements and deliver high-quality solutions on time.
- Develop complex SQL queries to optimize database performance and troubleshoot issues.
- Implement automation scripts using Python to streamline tasks and improve efficiency.
- Participate in code reviews to ensure adherence to coding standards and best practices.
Mandatory Skills: Snowflake, SQL, Python, DBT
Posted 3 weeks ago
2.0 - 7.0 years
0 - 1 Lacs
Mumbai
Remote
Data Engineer
Company Name: Fluid AI
Role Overview: As a Data Engineer, you will be responsible for designing and maintaining the data frameworks that power our Gen-AI products. You'll work closely with engineering, product, and AI research teams to ensure our data models are scalable, secure, and optimized for real-world performance across diverse use cases. This is a hands-on and strategic role, ideal for someone who thrives in fast-paced, innovative environments.
Key Responsibilities:
- Design, implement, and optimize data architectures to support large-scale AI and machine learning systems
- Collaborate with cross-functional teams to define data models, APIs, and integration flows
- Architect secure, scalable data pipelines for structured and unstructured data
- Oversee data governance, access controls, and compliance (GDPR, SOC 2, etc.)
- Select appropriate data storage technologies (SQL/NoSQL/data lakes) for various workloads
- Work with MLOps and DevOps teams to enable real-time data availability and model serving
- Evaluate and integrate third-party APIs, datasets, and connectors
- Contribute to system documentation and data architecture diagrams
- Support AI researchers with high-quality, well-structured data pipelines
Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field
- 5+ years of experience as a Data Architect, Data Engineer, or in a similar role
- Expertise in designing cloud-based data architectures (AWS, Azure, GCP)
- Strong knowledge of SQL, NoSQL, and distributed databases (PostgreSQL, MongoDB, Cassandra, etc.)
- Experience with big data tools like Spark, Kafka, Airflow, or similar
- Familiarity with data warehousing tools (Redshift, BigQuery, Snowflake)
- Solid understanding of data privacy, compliance, and governance best practices
Preferred Qualifications:
- Experience working on AI/ML or Gen-AI-related products
- Proficiency in Python or another scripting language used for data processing
- Exposure to building APIs for data ingestion and consumption
- Prior experience supporting enterprise-level SaaS products
- Strong analytical and communication skills
Travel & Documentation Requirements:
- Candidate must hold a valid passport
- Willingness to travel overseas for 1 week (as part of client collaboration)
- A valid US visa (e.g., B1/B2, H1B, Green Card) is a strong advantage
Why Join Us:
- Work on high-impact, cutting-edge Generative AI products
- Collaborate with some of the best minds in AI, engineering, and product
- Flexible work culture with global exposure
- Opportunity to work on deeply technical challenges with real-world impact
Posted 3 weeks ago
4.0 - 9.0 years
6 - 16 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Role & Responsibilities: Technical Skills
- Proficiency in Snowflake scripting: stored procedures (SQL), user-defined functions, Common Table Expressions, window functions.
- Experience in creating Snowpark Python procedures and user-defined functions.
- Knowledge of Snowpark architecture, creation of Snowpark dataframes, and data transformations in Snowpark.
- Process Parquet and JSON semi-structured data in Snowflake using PARSE_JSON and LATERAL FLATTEN.
- DBT (data build tool): hands-on experience with DBT Cloud and DBT Core, creating models as per requirements.
- DBT experience in creating and using macros, Jinja scripting, hooks, automated tests, snapshots, and DBT packages.
- Experience implementing data sharing, replication, and dynamic data masking using masking policies, secure views, and row access policies.
- Usage of tags, streams, tasks, external tables, time travel, clone, storage integrations, stages, file formats, clustering of larger tables on clustering keys, and role-based access control.
- Identify and fix performance issues in Snowflake and DBT.
- Experience working with the Visual Studio Code IDE.
- Familiarity with Git concepts such as creating branches, cloning repositories, and creating pull requests.
- Knowledge of Apache Airflow to schedule pipelines.
- Ability to create database models to load dimensional models consisting of facts and dimensions.
- Knowledge of Azure DevOps pipelines, agents, and repos.
- Understanding of AWS services: S3, PrivateLink, IAM roles, security groups, VPC endpoints.
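A minimal Snowpark (Python) sketch of the kind of dataframe transformation described above; the connection parameters and the raw_orders table are hypothetical, not part of the posting:

```python
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, parse_json

# Hypothetical connection parameters; supply your own account details.
session = Session.builder.configs({
    "account": "my_account", "user": "my_user", "password": "my_password",
    "warehouse": "TRANSFORM_WH", "database": "ANALYTICS", "schema": "STAGING",
}).create()

# Read a staging table, parse a JSON string column, filter, and persist the result.
orders = session.table("raw_orders")
transformed = (
    orders
    .with_column("payload", parse_json(col("payload_str")))
    .filter(col("order_status") == "SHIPPED")
    .select("order_id", "customer_id", col("payload")["total"].alias("order_total"))
)
transformed.write.mode("overwrite").save_as_table("shipped_orders")
session.close()
```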
Posted 3 weeks ago
3.0 - 8.0 years
10 - 20 Lacs
Hyderabad, Pune
Hybrid
Snowflake L2/L3 Senior Engineer (Managed Services) - 3 Positions
Location: Hybrid | Employment Type: Full-Time | Experience Level: 5+ years (with 3+ in Snowflake) | Shift: Rotational Shifts (24/7)
Role Overview: We are looking for an experienced Snowflake Senior Engineer to oversee both development and production support operations within our managed services model. This is a dual-role leadership position requiring strong technical capabilities in Snowflake as well as expertise in managing ongoing production operations, enhancements, and client coordination.
Key Responsibilities:
- Lead a team of Snowflake developers and support engineers providing enhancements & feature development and L2/L3 production support (incident management, monitoring, RCA)
- Manage and prioritize the support backlog and enhancement pipeline
- Serve as technical SME for Snowflake development and troubleshooting
- Ensure high platform availability and performance
- Conduct performance analysis and enforce Snowflake best practices
- Coordinate with client stakeholders, DevOps, data engineers, and QA teams
- Own support SLAs, incident resolution timelines, and change management
- Prepare regular service reports and participate in governance calls
Required Skills & Experience:
- 3+ years of hands-on Snowflake development and administration
- 6+ years of experience in data engineering or BI/DW support
- Experience leading teams in a managed services or enterprise support model
- Strong SQL, performance tuning, and debugging skills
- Knowledge of CI/CD, Python, and ADF or similar orchestration tools
- Familiarity with monitoring tools and Snowflake Account Usage views
- Experience with Azure Data Factory
Preferred:
- SnowPro Certification (Core/Advanced)
- Experience with ServiceNow or Jira
- Experience managing global support teams

Snowflake L2/L3 Engineer - 2 Openings
Location: Hybrid | Employment Type: Full-Time | Experience Level: 2-4 years | Shift: Rotational Shifts (24/7)
Role Overview: Join our Snowflake Managed Services team as a Software Engineer to work on data platform development, enhancements, and production support. You will support Snowflake environments across multiple clients, ensuring stability, performance, and continuous improvement.
Key Responsibilities:
- Design and develop Snowflake pipelines, data models, and transformations
- Provide L2/L3 production support for Snowflake jobs, queries, and integrations
- Troubleshoot failed jobs, resolve incidents, and conduct RCA
- Tune queries, monitor warehouses, and help optimize Snowflake usage and cost
- Handle service requests such as user provisioning, access changes, and role management
- Document issues, enhancements, and standard procedures (runbooks)
Required Skills & Experience:
- 2+ years of hands-on experience in Snowflake development and support
- Strong SQL, data modeling, and performance tuning experience
- Exposure to CI/CD pipelines and scripting languages (e.g., Python, Shell)
- Experience with data pipelines and orchestration tools (ADF)
Preferred:
- SnowPro Core Certification
- Experience with ticketing systems (ServiceNow, Jira)
- Cloud experience with Azure
- Basic understanding of ITIL processes
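The monitoring side of the support roles above typically leans on Snowflake's ACCOUNT_USAGE views. A small sketch, assuming connector credentials are configured; the 10-minute threshold is an arbitrary example, not a value from the posting:

```python
import snowflake.connector  # pip install snowflake-connector-python

conn = snowflake.connector.connect(
    account="my_account", user="support_user", password="...",
    warehouse="OPS_WH",
)

# Surface yesterday's slowest queries from the shared ACCOUNT_USAGE schema
# (TOTAL_ELAPSED_TIME is reported in milliseconds).
cur = conn.cursor().execute("""
    SELECT query_id, user_name, warehouse_name, total_elapsed_time / 1000 AS seconds
    FROM snowflake.account_usage.query_history
    WHERE start_time >= DATEADD(day, -1, CURRENT_TIMESTAMP())
      AND total_elapsed_time > 10 * 60 * 1000
    ORDER BY total_elapsed_time DESC
    LIMIT 20
""")
for query_id, user, warehouse, seconds in cur:
    print(f"{query_id}: {seconds:.0f}s on {warehouse} by {user}")
conn.close()
```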
Posted 3 weeks ago
6.0 - 10.0 years
20 - 25 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Snowflake Developer - Reputed US-based IT MNC
If you are a Snowflake/Matillion Developer, email your CV to jagannaath@kamms.net
Experience: 5+ years (only candidates with 100% real-time experience should apply)
Role: Snowflake Developer
Preferred: Snowflake certifications (SnowPro Core/Advanced)
Position Type: Full time/Permanent
Location: Hyderabad, Bengaluru and Chennai (Hybrid - local candidates)
Notice Period: Immediate to 15 days
Salary: As per your experience
Responsibilities:
- 5+ years of experience in data engineering, ETL, and Snowflake development.
- Strong expertise in Snowflake SQL scripting, performance tuning, and data warehousing concepts.
- Strong knowledge of cloud platforms (AWS/Azure/GCP) and cloud-based data architecture.
- Proficiency in SQL, Python, or scripting languages for automation & transformation.
- Experience with API integrations & data ingestion frameworks.
- Understanding of data governance, security policies, and access control in Snowflake.
- Excellent communication skills with the ability to interact with business and technical stakeholders.
- Self-starter who can work independently and drive projects to completion.
Posted 3 weeks ago
10.0 - 15.0 years
14 - 18 Lacs
Bengaluru
Work from Office
About the Role: We're looking for an experienced Engineering Manager to lead the development of highly scalable, reliable, and secure platform services and database connectors that power mission-critical data pipelines for thousands of enterprise customers. These pipelines connect to modern data warehouses such as Snowflake, BigQuery, and Databricks, as well as Apache-based data lakes. This is a rare opportunity to own and build foundational systems, solve complex engineering challenges, and lead a high-impact team delivering best-in-class performance at scale. You will play a central role in shaping our platform vision, driving high accountability, and fostering a culture of technical excellence and high performance while working closely with cross-functional stakeholders across product, program, support, and business teams.
What You'll Do:
- Lead, mentor, and inspire a team of software engineers who take pride in ownership and delivering impact.
- Ensure operational excellence through proactive monitoring, automated processes, and a culture of continuous improvement with strong accountability.
- Drive a strong quality-first mindset, embedding it into the development lifecycle from design to deployment.
- Drive technical leadership through architecture reviews, code guidance, and solving critical platform challenges.
- Build and operate multi-tenant, distributed backend systems at scale.
- Act as a technical leader: you've operated at least at Tech Lead, Staff Engineer, or Principal Engineer level in your career.
- Champion a culture of high accountability, clear ownership, and high visibility across engineering and cross-functional stakeholders.
- Collaborate deeply with Product, Program, Support, and Business functions to drive alignment and execution.
- Embed principles of observability, reliability, security, and auditability into all aspects of the platform.
- Inspire the team to pursue engineering excellence, driving best-in-class implementations and visible results.
- Define and track data-driven KPIs to ensure operational efficiency, performance, and team effectiveness.
- Take end-to-end ownership of product lines, ensuring on-time delivery and customer success.
- Contribute to team growth, hiring, and building an inclusive, learning-focused engineering environment.
What We're Looking For:
- 10+ years of experience in backend or systems software development.
- 2+ years in a formal or informal Engineering Manager, Sr. Engineering Manager, or Tech Lead role in a fast-paced engineering environment.
- Progression through senior IC roles such as Tech Lead, Staff, or Principal Engineer.
- Strong experience with distributed systems, cloud-native architectures, and multi-tenant platforms.
- Proven ability to drive cross-team collaboration with product, support, business, and program teams.
- Demonstrated ability to drive accountability, set clear goals, and raise the performance bar for the team.
- Expertise in system design, scalability, performance optimization, and cost control.
- Proven track record of mentoring engineers, guiding architecture, and leading impactful initiatives.
- Clear communicator, adept at both strategy and execution.
Bonus Points:
- Experience with data engineering platforms, ETL systems, or database internals.
- Exposure to product-driven companies, especially in infrastructure, SaaS, or backup/data systems.
- Demonstrated history of fast-tracked growth or high-visibility impact.
- Led or contributed to re-architecture or greenfield systems at scale.
Posted 3 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
Job Title: Technical Data Analyst
Work Mode: Remote
Contract Duration: 6 Months to 1 Year
Location: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote (open to candidates across India)
Experience: 5+ Years
Job Overview: We are seeking a highly skilled Technical Data Analyst for a remote contract position (6 to 12 months) to help build a single source of truth for our high-volume direct-to-consumer accounting and financial data warehouse. You will work closely with Finance & Accounting teams and play a pivotal role in dashboard creation, data transformation, and migration from Snowflake to Databricks.
Key Responsibilities:
1. Data Analysis & Reporting: Develop month-end accounting and tax dashboards using SQL in Snowflake (Snowsight). Migrate and transition reports/dashboards to Databricks. Gather, analyze, and transform business requirements from finance/accounting stakeholders into data products.
2. Data Transformation & Aggregation: Build transformation pipelines in Databricks to support balance sheet look-forward views. Maintain data accuracy and consistency throughout the Snowflake-to-Databricks migration. Partner with Data Engineering to optimize pipeline performance.
3. ERP & Data Integration: Support integration of financial data with NetSuite ERP. Validate transformed data to ensure correct ingestion and mapping into ERP systems.
4. Ingestion & Data Ops: Work with Fivetran for ingestion and resolve any pipeline or data accuracy issues. Monitor data workflows and collaborate with engineering teams on troubleshooting.
Required Skills & Qualifications:
- 5+ years of experience as a Data Analyst (preferably in the Finance/Accounting domain)
- Strong in SQL, with proven experience in Snowflake and Databricks
- Experience in building financial dashboards (month-end close, tax reporting, balance sheets)
- Understanding of financial/accounting data: GL, journal entries, balance sheet, income statements
- Familiarity with Fivetran or similar data ingestion tools
- Experience with data transformation in a cloud environment
- Strong communication and stakeholder management skills
Nice to have: Experience working with NetSuite ERP
Apply Now: Please share your updated resume with the following details: Full Name, Total Experience, Relevant Experience in SQL/Snowflake/Databricks, Experience in the Finance or Accounting domain, Current Location, Availability (Notice Period), Current and Expected Rate.
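For the month-end reporting work described above, a typical starting point is pulling general-ledger rows from Snowflake into pandas and summarizing balances. A minimal sketch; the journal_entries table, date range, and credentials are hypothetical:

```python
import snowflake.connector  # pip install "snowflake-connector-python[pandas]"

conn = snowflake.connector.connect(
    account="my_account", user="analyst", password="...",
    warehouse="REPORTING_WH", database="FINANCE", schema="GL",
)

# Pull posted journal entries for the month being closed.
df = conn.cursor().execute("""
    SELECT account_code, posting_date, amount
    FROM journal_entries
    WHERE posting_date BETWEEN '2024-06-01' AND '2024-06-30'
""").fetch_pandas_all()

# Month-end balance per account: simple sum of signed amounts.
month_end = (
    df.groupby("ACCOUNT_CODE", as_index=False)["AMOUNT"].sum()
      .rename(columns={"AMOUNT": "MONTH_END_BALANCE"})
)
print(month_end.head())
conn.close()
```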
Posted 3 weeks ago
5.0 - 9.0 years
7 - 11 Lacs
Pune
Work from Office
We are hiring a Data Operations Engineer for a 6-month contractual role based in Pune. The ideal candidate should have 5-9 years of experience in data operations, technical support, or reporting QA. You will be responsible for monitoring data health, validating config payloads, troubleshooting Airflow DAGs, documenting best practices, and supporting Ad Tech partner integrations. Proficiency in Snowflake, Airflow, Python scripting, and SQL is mandatory. Excellent communication, problem-solving skills, and a proactive attitude are essential for success in this role.
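A minimal sketch of the kind of data-health check such a role automates; the table name, loaded_at column, and freshness threshold are hypothetical assumptions, not taken from the posting:

```python
from datetime import datetime, timedelta
import snowflake.connector  # pip install snowflake-connector-python

def check_freshness(conn, table: str, max_lag_hours: int = 6) -> bool:
    """Return True if the table received rows within the allowed lag window."""
    # Assumes a loaded_at TIMESTAMP_NTZ column stored in UTC.
    latest = conn.cursor().execute(f"SELECT MAX(loaded_at) FROM {table}").fetchone()[0]
    if latest is None:
        print(f"ALERT: {table} is empty")
        return False
    lag = datetime.utcnow() - latest
    if lag > timedelta(hours=max_lag_hours):
        print(f"ALERT: {table} is stale by {lag}")
        return False
    print(f"OK: {table} last loaded {lag} ago")
    return True

if __name__ == "__main__":
    conn = snowflake.connector.connect(account="my_account", user="ops_user", password="...")
    check_freshness(conn, "ad_partner_deliveries")  # hypothetical Ad Tech feed table
    conn.close()
```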
Posted 3 weeks ago
10.0 - 15.0 years
35 - 40 Lacs
Gurugram, Bengaluru
Work from Office
Department: Technology
Reports To: Middle and Back Office Data Product Owner
About your team: The Technology function provides IT services that are integral to running an efficient run-the-business operating model and providing change-driven solutions to meet outcomes that deliver on our business strategy. These include the development and support of business applications that underpin our revenue, operational, compliance, finance, legal, marketing and customer service functions. The broader organisation incorporates Infrastructure services that the firm relies on to operate on a day-to-day basis, including data centre, networks, proximity services, security, voice, incident management and remediation. The ISS Technology group is responsible for providing technology solutions to the Investment Solutions & Services (ISS) business (which covers the Investment Management, Asset Management Operations & Distribution business units globally). The ISS Technology team supports and enhances existing applications as well as designs, builds and procures new solutions to meet requirements and enable the evolving business strategy. As part of this group, a dedicated ISS Data Programme team has been mobilised as a key foundational programme to support the execution of the overarching ISS strategy.
About your role: The Middle and Back Office Data Analyst role is instrumental in the creation and execution of a future state design for Fund Servicing & Oversight data across Fidelity's key business areas. The successful candidate will have an in-depth knowledge of data domains that represent Middle and Back Office operations and technology. The role will sit within the ISS Delivery Data Analysis chapter, fully aligned to deliver Fidelity's cross-functional ISS Data Programme in Technology, and the candidate will leverage their extensive industry knowledge to build a future state platform in collaboration with Business Architecture, Data Architecture, and business stakeholders. The role is to maintain strong relationships with the various business contacts to ensure a superior service to our clients.
Data Product - Requirements Definition and Delivery of Data Outcomes:
- Analysis of data product requirements to enable business outcomes, contributing to the data product roadmap.
- Capture both functional and non-functional data requirements, considering the data product and consumers' perspectives.
- Conduct workshops with both business and tech stakeholders for requirements gathering, elicitation, and walk-throughs.
- Responsible for the definition of data requirements, epics, and stories within the product backlog and providing analysis support throughout the SDLC.
- Responsible for supporting the UAT cycles, attaining business sign-off on outcomes being delivered.
Data Quality and Integrity:
- Define data quality use cases for all the required data sets and contribute to the technical frameworks of data quality.
- Align the functional solution with best-practice data architecture & engineering principles.
Coordination and Communication:
- Excellent communication skills to influence technology and business stakeholders globally, attaining alignment and sign-off on the requirements.
- Coordinate with internal and external stakeholders to communicate data product deliveries and the change impact to the operating model.
- An advocate for the ISS Data Programme.
- Collaborate closely with Data Governance, Business Architecture, and Data Owners.
- Conduct workshops within the scrum teams and across business teams; effectively document the minutes and drive the actions.
About you:
- At least 10 years of proven experience as a business/technical/data analyst within technology and/or business change within the financial services/asset management industry.
- Minimum 5 years as a senior business/technical/data analyst adhering to agile methodology, delivering data solutions using industry-leading data platforms such as Snowflake, State Street Alpha Data, Refinitiv Eikon, SimCorp Dimension, BlackRock Aladdin, FactSet, etc.
- Proven experience of delivering data-driven business outcomes using industry-leading data platforms such as Snowflake.
- Excellent knowledge of the data life cycle that drives Middle and Back Office capabilities such as trade execution, matching, confirmation, trade settlement, record keeping, accounting, fund & cash positions, custody, collateral/margin movements, corporate actions, and derivations and calculations such as holiday handling, portfolio turnover rates, and funds-of-funds look-through.
- In-depth expertise in data and calculations across the investment industry covering the below:
  - Asset-specific data: data related to financial instruments, including reference data such as asset specifications, maintenance records, usage history, and depreciation schedules.
  - Market data: data such as security prices, exchange rates, index constituents, and licensing restrictions on them.
  - ABOR & IBOR data: calculation engines covering input data sets, calculations, and treatment of various instruments for ABOR and IBOR data, leveraging platforms such as SimCorp, NeoXam, Invest1, Charles River, Aladdin, etc.
  - Knowledge of TPAs and how data can be structured in a unified way from heterogeneous structures.
- Should possess problem-solving skills, attention to detail, and critical thinking.
Technical Skills:
- Excellent hands-on SQL, advanced Excel, Python, ML (optional), and proven experience and knowledge of data solutions.
- Knowledge of data management, data governance, and data engineering practices.
- Hands-on experience with data modelling techniques such as dimensional, data vault, etc.
- Willingness to own and drive things; collaboration across business and tech stakeholders.
Posted 3 weeks ago
6.0 - 11.0 years
10 - 20 Lacs
Hyderabad, Bangalore Rural, Bengaluru
Work from Office
We are seeking a highly skilled Snowflake Developer to join our team in Bangalore. The ideal candidate will have extensive experience in designing, implementing, and managing Snowflake-based data solutions. This role involves developing data architectures and ensuring the effective use of Snowflake to drive business insights and innovation.
Key Responsibilities:
- Design and implement scalable, efficient, and secure Snowflake solutions to meet business requirements.
- Develop data architecture frameworks, standards, and principles, including modeling, metadata, security, and reference data.
- Implement Snowflake-based data warehouses, data lakes, and data integration solutions.
- Manage data ingestion, transformation, and loading processes to ensure data quality and performance.
- Collaborate with business stakeholders and IT teams to develop data strategies and ensure alignment with business goals.
- Drive continuous improvement by leveraging the latest Snowflake features and industry trends.
Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, Data Science, or a related field.
- 8+ years of experience in data architecture, data engineering, or a related field.
- Extensive experience with Snowflake, including designing and implementing Snowflake-based solutions.
- Must have exposure to working with Airflow.
- Proven track record of contributing to data projects and working in complex environments.
- Familiarity with cloud platforms (e.g., AWS, GCP) and their data services.
- Snowflake certification (e.g., SnowPro Core, SnowPro Advanced) is a plus.
Posted 3 weeks ago
3.0 - 8.0 years
6 - 16 Lacs
Hyderabad, Chennai, Delhi / NCR
Hybrid
Hiring for a Snowflake Developer with 2+ years of experience. Mandatory Skills: Snowflake. Education: BE/B.Tech/MCA/M.Tech/MSc/MS. Location: PAN India.
Posted 3 weeks ago
2.0 - 7.0 years
5 - 8 Lacs
Pune, Chennai, Mumbai (All Areas)
Hybrid
Looking for a Snowflake Developer to design and implement data solutions using the Snowflake cloud data platform. Location: PAN India. Experience: 2+ years.
Posted 3 weeks ago
5.0 - 10.0 years
5 - 15 Lacs
Hyderabad, Pune
Hybrid
Job Overview: We are seeking an experienced IICS (Informatica Intelligent Cloud Services) Developer with hands-on experience on the IICS platform. The ideal candidate must have strong knowledge of Snowflake and be proficient in building and managing integrations between different systems and databases. The role will involve working with cloud-based integration solutions, ensuring data flows seamlessly across platforms, and optimizing performance for large-scale data processes.
Key Responsibilities:
- Design, develop, and implement data integration solutions using IICS (Informatica Intelligent Cloud Services).
- Work with Snowflake data warehouse solutions, including data loading, transformation, and querying.
- Build, monitor, and maintain efficient data pipelines between cloud-based systems and Snowflake.
- Troubleshoot and resolve integration issues within the IICS platform and Snowflake.
- Ensure optimal data processing performance and manage data flow between various cloud applications and databases.
- Collaborate with data architects, analysts, and stakeholders to gather requirements and design integration solutions.
- Implement best practices for data governance, security, and data quality within the integration solutions.
- Perform unit testing and debugging of IICS data integration tasks.
- Optimize integration workflows to ensure they meet performance and scalability needs.
Key Skills:
- Hands-on experience with IICS (Informatica Intelligent Cloud Services).
- Strong knowledge and experience working with Snowflake as a cloud data warehouse.
- Proficient in building ETL/ELT workflows, including integration of various data sources into Snowflake.
- Experience with SQL and writing complex queries for data transformation and manipulation.
- Familiarity with data integration techniques and best practices for cloud-based platforms.
- Experience with cloud integration platforms and working with RESTful APIs and other integration protocols.
- Ability to troubleshoot, optimize, and maintain data pipelines effectively.
- Knowledge of data governance, security principles, and data quality standards.
Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience).
- Minimum of 5 years of experience in data integration development.
- Proficiency in Snowflake and cloud-based data solutions.
- Strong understanding of ETL/ELT processes and integration design principles.
- Experience working in Agile or similar development methodologies.
Posted 3 weeks ago
4.0 - 9.0 years
14 - 18 Lacs
Bengaluru
Work from Office
About the Role: Platform Product Owner - Data Pipelines
We're looking for a product-driven, data-savvy Platform Product Owner to lead the evolution of Hevo's Data Pipelines Platform. This role blends strategic product thinking with operational excellence and offers full ownership, from defining product outcomes to driving delivery health and platform reliability. You'll work closely with Engineering, Architecture, and cross-functional teams to shape the platform roadmap, define user value, and ensure successful outcomes through measurable impact. If you're passionate about building scalable, high-impact data products and excel at balancing strategy with execution, this role is for you.
Key Responsibilities:
Product Ownership & Strategy
- Define and evolve the product vision and roadmap in collaboration with Product Leadership.
- Translate vision into a value-driven, structured product backlog focused on scalability, reliability, and user outcomes.
- Craft clear user stories with well-defined acceptance criteria and success metrics.
- Partner with Engineering and Architecture to design and iterate on platform capabilities aligned with long-term strategy.
- Analyze competitive products to identify experience gaps, technical differentiators, and new opportunities.
- Ensure platform capabilities deliver consistent value to internal teams and end users.
Product Operations & Delivery Insights
- Define and track key product health metrics (e.g., uptime, throughput, SLA adherence, adoption).
- Foster a metrics-first culture in product delivery, ensuring every backlog item ties to measurable outcomes.
- Triage bugs and feature requests, assess impact, and feed insights into prioritization and planning.
- Define post-release success metrics and establish feedback loops to evaluate feature adoption and performance.
- Build dashboards and reporting frameworks to increase visibility into product readiness, velocity, and operations.
- Improve practices around backlog hygiene, estimation accuracy, and story lifecycle management.
- Ensure excellence in release planning and launch execution to meet quality and scalability benchmarks.
Collaboration & Communication
- Champion the product vision and user needs across all stages of development.
- Collaborate with Support, Customer Success, and Product Marketing to ensure customer insights inform product direction.
- Develop enablement materials (e.g., internal walkthroughs, release notes) to support go-to-market and support teams.
- Drive alignment and accountability throughout the product lifecycle, from planning to post-release evaluation.
Qualifications:
Required
- Bachelor's degree in Computer Science or a related engineering field.
- 5+ years of experience as a Product Manager/Product Owner, with time spent on platform/infrastructure products at B2B startups.
- Hands-on experience with ETL tools or modern data platforms (e.g., Talend, Informatica, AWS Glue, Snowflake, BigQuery, Redshift, Databricks).
- Strong understanding of the product lifecycle with an operations-focused mindset.
- Proven ability to collaborate with engineering teams to build scalable, reliable features.
- Familiarity with data integration, APIs, connectors, and streaming/real-time data pipelines.
- Analytical mindset with experience tracking KPIs and making data-informed decisions.
- Excellent communication and cross-functional collaboration skills.
- Proficiency with agile product development tools (e.g., Jira, Aha!, Linear).
Preferred
- Experience in a data-intensive environment.
- Engineer-turned-Product Manager with a hands-on technical background.
- MBA from a Tier-1 institute.
Posted 3 weeks ago
3.0 - 6.0 years
8 - 12 Lacs
Noida, Hyderabad, Gurugram
Work from Office
Mandatory Skills:
- Power BI Desktop & Service (report development, publishing, workspace management)
- Advanced DAX (complex calculations and measures)
- Power Query (M language) for data transformation and cleansing
- Data Modeling (star schema, snowflake schema, relationships)
- Row-Level Security (RLS) and role-based access control
- Integration with multiple data sources (SQL, Azure Data Lake, Excel, APIs)
- Performance Optimization (query folding, refresh schedules, rendering efficiency)
- ETL Collaboration (working with data engineering teams and pipelines)
- Governance & Lifecycle Management (version control, audit trails)
- Migration of legacy reports to Power BI
- Requirement Gathering & Stakeholder Collaboration
- Data Quality & Governance Alignment
Detailed JD:
- Lead the design, development, and implementation of Power BI dashboards and reports to provide actionable insights for business stakeholders.
- Collaborate with business users, data analysts, and other stakeholders to gather reporting requirements and translate them into comprehensive data visualization solutions.
- Develop and maintain complex DAX calculations, measures, and Power BI data models to support efficient data analysis and reporting.
- Ensure data accuracy and consistency by designing robust data models and integrating multiple data sources such as SQL databases, Azure Data Lake, Excel, and other APIs.
- Optimize Power BI performance, including query efficiency, data refresh schedules, and report rendering, ensuring minimal delays in accessing real-time data.
- Implement Power BI security roles, row-level security (RLS), and access controls to ensure secure and role-based data access for users.
- Guide the deployment of Power BI reports and dashboards using Power BI Service, integrating with data pipelines and automating workflows as necessary.
- Provide leadership and mentorship to junior Power BI developers, offering technical guidance and promoting best practices in report design, data modeling, and performance optimization.
- Work closely with data engineering teams to integrate Power BI with ETL processes, data warehouses, and cloud environments such as Azure or AWS.
- Implement Power BI governance, including version control, report lifecycle management, and audit trails, ensuring consistency and compliance across reporting environments.
- Lead the migration of legacy reporting solutions to Power BI, ensuring a smooth transition with minimal business disruption.
- Collaborate with data governance teams to ensure that Power BI reports align with data quality standards, business glossaries, and metadata management practices.
- Utilize Power Query to transform and clean data before loading it into Power BI, ensuring readiness for analysis and visualization.
- Lead data storytelling efforts by creating compelling visuals, including interactive dashboards, KPIs, and drill-down capabilities that help stakeholders make informed decisions.
- Stay updated on the latest Power BI features, updates, and best practices, incorporating new functionalities into existing solutions to enhance reporting capabilities.
- Provide end-user training and support to ensure stakeholders can effectively use Power BI reports and dashboards for self-service analytics.
- Oversee the integration of Power BI with other tools like Power Automate and Power Apps to create a seamless data and reporting ecosystem.
Drop your resume at Aarushi.Shukla@coforge.com
Posted 3 weeks ago
7.0 - 12.0 years
6 - 11 Lacs
Bengaluru
Work from Office
Your Job: As a Data Engineer you will be part of a team that designs, develops, and delivers data pipelines and data analytics solutions for Koch Industries. Koch Industries is a privately held global organization with over 120,000 employees around the world, with subsidiaries involved in manufacturing, trading, and investments. Koch Global Solution India (KGSI) is being developed in India to extend its IT operations, as well as act as a hub for innovation in the IT function. As KGSI rapidly scales up its operations in India, its employees will get opportunities to carve out a career path for themselves within the organization. This role will have the opportunity to join on the ground floor and will play a critical part in helping build out the Koch Global Solution (KGS) over the next several years. Working closely with global colleagues would provide significant international exposure to the employees.
Our Team: The Enterprise Data and Analytics team at Georgia-Pacific is focused on creating an enterprise capability around data engineering solutions for operational and commercial data, as well as helping businesses develop, deploy, manage, and monitor data pipelines and analytics solutions for manufacturing, operations, supply chain, and other key areas.
What You Will Do:
- ETL Solutions: Design, implement, and manage large-scale ETL solutions using the AWS technology stack, including EventBridge, Lambda, Glue, Step Functions, Redshift, and CloudWatch (a minimal PySpark sketch follows this posting).
- Data Pipeline Management: Design, develop, enhance, and debug existing data pipelines to ensure seamless operations.
- Data Modelling: Proven experience in designing and developing data models.
- Best Practices Implementation: Develop and implement best practices to ensure high data availability, computational efficiency, cost-effectiveness, and data quality within Snowflake and AWS environments.
- Enhancement: Build and enhance data products, processes, functionalities, and tools to streamline all stages of data lake implementation and analytics solution development, including proof of concepts, prototypes, and production systems.
- Production Support: Provide ongoing support for production data pipelines, ensuring high availability and performance.
- Issue Resolution: Monitor, troubleshoot, and resolve issues within data pipelines and ETL processes promptly.
- Automation: Develop and implement scripts and tools to automate routine tasks and enhance system efficiency.
Who You Are (Basic Qualifications):
- Bachelor's degree in Computer Science, Engineering, or a related IT field, with at least 7+ years of experience in software development.
- 5+ years of hands-on experience designing, implementing, and managing large-scale ETL solutions using the AWS technology stack, including EventBridge, Lambda, Glue, Step Functions, Redshift, and CloudWatch.
- Primary skill set: SQL, S3, AWS Glue, PySpark, Python, Lambda, columnar DB (Redshift), AWS IAM, Step Functions, Git, Terraform, CI/CD.
- Good to have: experience with the MSBI stack, including SSIS, SSAS, and SSRS.
What Will Put You Ahead:
- In-depth knowledge of the entire suite of services in the AWS data services platform.
- Strong coding experience using Python and PySpark.
- Experience designing and implementing data models.
- Cloud Data Analytics/Engineering certification.
Who We Are: At Koch, employees are empowered to do what they do best to make life better. Learn how we help employees unleash their potential while creating value for themselves and the company. Additionally, everyone has individual work and personal needs.
We seek to enable the best work environment that helps you and the business work together to produce superior results.
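The AWS Glue/PySpark responsibilities in the posting above can be pictured with a minimal job sketch: read raw Parquet from S3, filter and derive a partition column, and write curated output back to S3. The bucket names, paths, and status column are hypothetical placeholders, not part of the posting:

```python
import sys
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from pyspark.sql import functions as F

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read raw order data from a hypothetical landing bucket.
orders = spark.read.parquet("s3://example-raw-bucket/orders/")

# Keep completed orders and derive a date partition column.
cleaned = (
    orders
    .filter(F.col("status") == "COMPLETE")
    .withColumn("order_date", F.to_date("order_ts"))
)

# Write curated, partitioned output to a hypothetical curated bucket.
cleaned.write.mode("overwrite").partitionBy("order_date") \
       .parquet("s3://example-curated-bucket/orders/")

job.commit()
```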
Posted 3 weeks ago
7.0 - 9.0 years
9 - 11 Lacs
Indore, Hyderabad, Ahmedabad
Work from Office
We're Hiring: Data Governance Lead
Locations: Offices in Austin (USA), Singapore, Hyderabad, Indore, Ahmedabad (India)
Primary Job Location: Mumbai / Hyderabad / Indore / Ahmedabad (Work from Office)
Compensation Range: Competitive | Based on experience and expertise
To Apply, Share Your Resume With: Current CTC, Expected CTC, Notice Period, Preferred Location
What You Will Do (Key Responsibilities):
1. Governance Strategy & Stakeholder Enablement
- Define and drive enterprise-level data governance frameworks and policies.
- Align governance objectives with compliance, analytics, and business priorities.
- Work with IT, Legal, Compliance, and Business teams to drive adoption.
- Conduct training, workshops, and change management programs.
2. Microsoft Purview Implementation & Administration
- Administer Microsoft Purview: accounts, collections, RBAC, and scanning policies.
- Design scalable governance architecture for large-scale data environments (>50 TB).
- Integrate with Azure Data Lake, Synapse, Azure SQL DB, Power BI, and Snowflake.
3. Metadata & Data Lineage Management
- Design metadata repositories and workflows.
- Ingest technical/business metadata via ADF, REST APIs, PowerShell, and Logic Apps.
- Validate end-to-end lineage (ADF to Synapse to Power BI), impact analysis, and remediation.
4. Data Classification & Security
- Implement and govern sensitivity labels (PII, PCI, PHI) and classification policies.
- Integrate with Microsoft Information Protection (MIP), DLP, Insider Risk, and Compliance Manager.
- Enforce lifecycle policies, records management, and information barriers.
Also required: working knowledge of GDPR, HIPAA, SOX, and CCPA, plus strong communication and leadership to bridge technical and business governance.
Posted 3 weeks ago
5.0 - 10.0 years
18 - 27 Lacs
Pune, Chennai, Bengaluru
Work from Office
• ETL (extract, transform, and load) using Azure, SQL, Snowflake, and Data Factory.
Posted 3 weeks ago
8.0 - 10.0 years
5 - 9 Lacs
Noida
Work from Office
We are looking for a skilled Power BI Dashboarding and Visualization Developer with 8 to 10 years of experience. The ideal candidate will have a strong background in designing and developing interactive dashboards and visualizations using Power BI, as well as integrating and optimizing Power BI solutions within cloud environments.
Roles and Responsibilities:
- Design and develop interactive dashboards and visualizations using Power BI.
- Integrate and optimize Power BI solutions within AWS and Azure environments.
- Collaborate with business users to gather requirements and deliver insights.
- Ensure data accuracy, security, and performance.
- Develop and maintain complex data models and reports using Power BI.
- Troubleshoot and resolve issues related to Power BI dashboard development.
Job Requirements:
- Strong experience with Power BI, including DAX, Power Query, and data modeling.
- Proficiency in SQL for querying and data manipulation.
- Familiarity with data warehouses such as Redshift, Snowflake, or Synapse.
- Knowledge of Azure Data Factory and AWS Glue for data integration.
- Understanding of REST APIs and integrating external data sources.
- Experience with Git for version control and CI/CD pipelines.
- Excellent communication and problem-solving skills.
Posted 3 weeks ago
6.0 - 11.0 years
3 - 6 Lacs
Noida
Work from Office
We are looking for a skilled Snowflake Ingress/Egress Specialist with 6 to 12 years of experience to manage and optimize data flow into and out of our Snowflake data platform. This role involves implementing secure, scalable, and high-performance data pipelines, ensuring seamless integration with upstream and downstream systems, and maintaining compliance with data governance policies.
Roles and Responsibilities:
- Design, implement, and monitor data ingress and egress pipelines into and out of Snowflake.
- Develop and maintain ETL/ELT processes using tools like Snowpipe, Streams, Tasks, and external stages (S3, Azure Blob, GCS).
- Optimize data load and unload processes for performance, cost, and reliability.
- Coordinate with data engineering and business teams to support data movement for analytics, reporting, and external integrations.
- Ensure data security and compliance by managing encryption, masking, and access controls during data transfers.
- Monitor data movement activities using Snowflake Resource Monitors and Query History.
Job Requirements:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 6-12 years of experience in data engineering, cloud architecture, or Snowflake administration.
- Hands-on experience with Snowflake features such as Snowpipe, Streams, Tasks, External Tables, and Secure Data Sharing.
- Proficiency in SQL, Python, and data movement tools (e.g., AWS CLI, Azure Data Factory, Google Cloud Storage Transfer).
- Experience with data pipeline orchestration tools such as Apache Airflow, dbt, or Informatica.
- Strong understanding of cloud storage services (S3, Azure Blob, GCS) and working with external stages.
- Familiarity with network security, encryption, and data compliance best practices.
- Snowflake certification (SnowPro Core or Advanced) is preferred.
- Experience with real-time streaming data (Kafka, Kinesis) is desirable.
- Knowledge of DevOps tools (Terraform, CI/CD pipelines) is a plus.
- Strong communication and documentation skills are essential.
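A sketch of the external-stage and Snowpipe ingress pattern referenced above, executed through the Python connector. The stage, storage integration, pipe, and target table names are hypothetical, and raw_events is assumed to have a single VARIANT column:

```python
import snowflake.connector  # pip install snowflake-connector-python

conn = snowflake.connector.connect(
    account="my_account", user="loader", password="...",
    warehouse="LOAD_WH", database="RAW", schema="INGEST",
)
cur = conn.cursor()

# External stage over an S3 prefix (assumes a pre-created storage integration).
cur.execute("""
    CREATE STAGE IF NOT EXISTS s3_events_stage
      URL = 's3://example-bucket/events/'
      STORAGE_INTEGRATION = s3_int
      FILE_FORMAT = (TYPE = JSON)
""")

# Snowpipe to auto-ingest micro-batches as files land in the stage.
cur.execute("""
    CREATE PIPE IF NOT EXISTS events_pipe AUTO_INGEST = TRUE AS
    COPY INTO raw_events FROM (SELECT $1 FROM @s3_events_stage)
""")

# One-off backfill of files already sitting in the stage.
cur.execute("COPY INTO raw_events FROM (SELECT $1 FROM @s3_events_stage)")
conn.close()
```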
Posted 3 weeks ago
3.0 - 6.0 years
7 - 11 Lacs
Bengaluru
Work from Office
We are looking for a skilled Data Engineer with 3 to 6 years of experience building data pipelines using Databricks, PySpark, and SQL on cloud distributions like AWS. The ideal candidate should have hands-on experience with Databricks, Spark, SQL, and the AWS Cloud platform, especially S3, EMR, Databricks, Cloudera, etc.
Roles and Responsibilities:
- Design and develop large-scale data pipelines using Databricks, Spark, and SQL.
- Optimize data operations using Databricks and Python.
- Develop solutions to meet business needs, reflecting a clear understanding of the objectives, practices, and procedures of the corporation, department, and business unit.
- Evaluate alternative risks and solutions before taking action.
- Utilize all available resources efficiently.
- Collaborate with cross-functional teams to achieve business goals.
Job Requirements:
- Experience working on projects involving data engineering and processing.
- Proficiency in large-scale data operations using Databricks and overall comfort with Python.
- Familiarity with AWS compute, storage, and IAM concepts.
- Experience with an S3 data lake as the storage tier.
- ETL background with Talend or AWS Glue is a plus.
- Cloud warehouse experience with Snowflake is a huge plus.
- Strong analytical and problem-solving skills.
- Relevant experience with ETL methods and retrieving data from dimensional data models and data warehouses.
- Strong experience with relational databases and data access methods, especially SQL.
- Excellent collaboration and cross-functional leadership skills.
- Excellent communication skills, both written and verbal.
- Ability to manage multiple initiatives and priorities in a fast-paced, collaborative environment.
- Ability to leverage data assets to respond to complex questions that require timely answers.
- Working knowledge of migrating relational and dimensional databases on the AWS Cloud platform.
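A minimal Databricks-style PySpark sketch of the S3-to-lakehouse pipeline work described above; the S3 path, column names, and target table are hypothetical placeholders, and the analytics schema is assumed to exist:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("s3-to-delta-example").getOrCreate()

# Read raw JSON events from a hypothetical S3 data lake path.
events = spark.read.json("s3://example-datalake/raw/events/")

# Aggregate to daily counts per event type.
daily_counts = (
    events
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("event_date", "event_type")
    .count()
)

# On Databricks the Delta format is available out of the box.
daily_counts.write.format("delta").mode("overwrite").saveAsTable("analytics.daily_event_counts")
```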
Posted 3 weeks ago
5.0 - 10.0 years
4 - 8 Lacs
Noida
Work from Office
We are looking for a skilled Database Engineer with 5 to 10 years of experience to design, develop, and maintain our database infrastructure. This position is based remotely.
Roles and Responsibilities:
- Design, develop, and maintain the database infrastructure to store and manage company data efficiently and securely.
- Work with databases of varying scales, from small-scale systems to big data processing.
- Implement data security measures to protect sensitive information and comply with relevant regulations.
- Optimize database performance by analyzing query execution plans, implementing indexing strategies, and improving data retrieval and storage mechanisms.
- Collaborate with cross-functional teams to understand data requirements and support the design of the database architecture.
- Migrate data from spreadsheets or other sources to relational database systems or cloud-based solutions like Google BigQuery and AWS (see the sketch after this posting).
- Develop import workflows and scripts to automate data import processes.
- Ensure data integrity and enforce data quality standards, including data validation rules, constraints, and referential integrity.
- Monitor database health and resolve issues, while collaborating with the full-stack web developer to implement efficient data access and retrieval mechanisms.
- Demonstrate creativity in problem-solving and contribute ideas for improving data engineering processes and workflows, exploring third-party technologies as alternatives to legacy approaches for efficient data pipelines.
- Embrace a learning mindset, staying updated with emerging database technologies, tools, and best practices, and use Python for tasks such as data manipulation, automation, and scripting.
- Collaborate with the Data Research Engineer to estimate development efforts and meet project deadlines, taking accountability for achieving development milestones.
- Prioritize tasks to ensure timely delivery in a fast-paced environment with rapidly changing priorities, while also collaborating with fellow members of the Data Research Engineering team as required.
- Perform tasks with precision and build reliable systems, leveraging online resources effectively (StackOverflow, ChatGPT, Bard, etc.) while considering their capabilities and limitations.
Job Requirements:
- Proficiency in SQL and relational database management systems like PostgreSQL or MySQL, along with database design principles.
- Strong familiarity with Python for scripting and data manipulation tasks, with additional knowledge of Python OOP being advantageous.
- Demonstrated problem-solving skills with a focus on optimizing database performance and automating data import processes.
- Knowledge of cloud-based databases like AWS RDS and Google BigQuery.
- Experience with data warehousing concepts and tools (e.g., Snowflake, Redshift) to support advanced analytics and reporting.
- Skills in working with APIs for data ingestion or connecting third-party systems, which could streamline data acquisition processes.
- Proficiency with tools like Prometheus, Grafana, or the ELK Stack for real-time database monitoring and health checks beyond basic troubleshooting.
- Familiarity with continuous integration/continuous deployment (CI/CD) tools (e.g., Jenkins, GitHub Actions).
- Deeper expertise in cloud platforms (e.g., AWS Lambda, GCP Dataflow) for serverless data processing or orchestration.
- Knowledge of database development and administration concepts, especially with relational databases like PostgreSQL and MySQL.
- Knowledge of Python programming, including data manipulation, automation, and object-oriented programming (OOP), with experience in modules such as Pandas, SQLAlchemy, gspread, PyDrive, and PySpark.
- Knowledge of SQL and understanding of database design principles, normalization, and indexing.
- Knowledge of data migration, ETL (Extract, Transform, Load) processes, and integrating data from various sources.
- Knowledge of data security best practices, including access controls, encryption, and compliance standards.
- Strong problem-solving and analytical skills with attention to detail.
- Creative and critical thinking.
- Strong willingness to learn and expand knowledge in data engineering.
- Familiarity with Agile development methodologies is a plus.
- Experience with version control systems, such as Git, for collaborative development.
- Ability to thrive in a fast-paced environment with rapidly changing priorities.
- Ability to work collaboratively in a team environment.
- Good and effective communication skills.
- Comfortable with autonomy and the ability to work independently.
About the Company: Marketplace is an experienced team of industry experts dedicated to helping readers make informed decisions and choose the right products with ease. We arm people with trusted advice and guidance, so they can make confident decisions and get back to doing the things they care about most.
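A small sketch of the spreadsheet-to-PostgreSQL migration workflow described above, using pandas and SQLAlchemy; the CSV file, table name, and connection string are hypothetical placeholders:

```python
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical connection string; point it at your own PostgreSQL instance.
engine = create_engine("postgresql+psycopg2://app_user:secret@localhost:5432/analytics")

# Load a spreadsheet export, normalise column names, and drop obviously bad rows.
df = pd.read_csv("customer_export.csv")
df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
df = df.dropna(subset=["customer_id"]).drop_duplicates(subset=["customer_id"])

# Bulk-load into a relational table; append keeps any previously migrated batches.
df.to_sql("customers", engine, if_exists="append", index=False, chunksize=1000)
print(f"Loaded {len(df)} rows into customers")
```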
Posted 3 weeks ago