
2244 Snowflake Jobs - Page 17


2.0 - 5.0 years

4 - 7 Lacs

Bengaluru

Work from Office

Your Impact: As a developer, you will work as part of a highly skilled team of professionals responsible for the architecture, design, and development of cost-effective and sustainable solutions for the Security Product business of OpenText. Strong organizational skills, technical expertise and attention to detail are key in this customer-focused role.

What the role offers:
- Translate business requirements using complex methods/models to determine appropriate system solutions.
- Work within a cross-functional team to provide technical expertise in the design and planning of system solutions.
- Research, identify, test, certify, and select technology required for solution delivery.
- Maximize the performance, uptime, and supportability of the product.
- Develop a highly scalable security product using technologies such as Java, J2EE, REST, Azure, AWS, GCP and Snowflake.
- Work with the team to design solutions to security problems; monitor and analyze security vulnerabilities reported in bundled third-party products.
- Design and implement new interface components in collaboration with the product owner and other OpenText development teams.
- Collaborate with engineering and development partners to develop reliable, cost-effective, and high-quality software solutions.
- Maintain existing components and resolve problems reported by customers.
- Enhance existing components with new capabilities while maintaining compatibility.
- Provide feedback on test plans, test cases, and test methodologies.
- Research new technologies for product improvements and the future roadmap.
- Communicate with stakeholders, report project progress, and highlight any risks along with a mitigation plan.

What you need to succeed:
- Bachelor's or master's degree in Computer Science, Information Systems, or equivalent.
- 2-5 years of software development experience building large-scale, highly distributed applications.
- Experience developing a highly scalable security product using technologies such as Java, J2EE, REST/SOAP, AWS, GCP, Snowflake and Azure.
- Demonstrated ability to complete multiple, complex technical projects.
- Strong programming skills in Java and J2EE.
- Experience in a cloud platform (AWS, GCP or Azure) is a must.
- Experience working in a DevOps, continuous-integration environment.
- Excellent communication skills and the ability to interact effectively with both technical and non-technical staff.
- In-depth technical experience in the IT infrastructure area and an understanding of the operational challenges involved in managing complex systems.
- Previous experience as part of complex integration projects.
- Technical execution of project activities and responsibility for on-time delivery and results.
- Interfacing with customer-facing functions to gather project requirements and performing due diligence as required.
- Providing technical guidance for troubleshooting and issue resolution when needed.
- Familiarity with Agile software development (preferably Scrum).
- Unit testing and mock frameworks such as Mockito.

Desired skills:
- Understanding of the security domain.
- Experience in Azure, AWS, GCP and Hadoop.
- Working knowledge of Linux.
- Cloud technologies and cloud application development.
- Good knowledge of security threat models and various security encryption techniques.
- Knowledge of different types of security vulnerabilities, attack vectors and common types of cyberattacks.

Posted 1 week ago

Apply

4.0 - 8.0 years

10 - 20 Lacs

Hyderabad, Ahmedabad

Hybrid

Snowflake Data Engineer
Experience: 5-10 years
Location: Hyderabad

Role & responsibilities:
- Design, develop, and maintain scalable ETL/ELT pipelines in Snowflake to support data migration from legacy systems.
- Leverage Python for data transformation, automation, and orchestration of migration workflows.
- Optimize and refactor complex SQL queries to ensure efficient data processing and reporting in Snowflake.
- Collaborate on data modeling and schema design to align with Snowflake architecture and performance best practices.
- Monitor and troubleshoot data pipeline performance during and after migration phases.
- Work closely with data analysts, data scientists, and business stakeholders to ensure accurate and timely data delivery.
- Implement and enforce data governance, security policies, and access controls within Snowflake.
- Collaborate with DevOps teams to integrate data engineering workflows into broader CI/CD frameworks.

Required skills:
- 4-5 years of experience in data engineering, with proven expertise in Snowflake and Python.
- Strong command of Snowflake features such as scripting, Time Travel, virtual warehouses, and query optimization.
- Hands-on experience with ETL tools, data integration strategies, and migration methodologies.
- Solid understanding of data warehousing principles, normalization techniques, and performance optimization.
- Familiarity with cloud platforms (AWS, Azure, or GCP) and orchestration tools.
- Excellent problem-solving skills and the ability to work independently in a dynamic, fast-paced environment.
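Pipeline-monitoring duties like those in this listing are commonly backed by reconciliation checks between the legacy source and the Snowflake target. The fragment below is a minimal, hypothetical sketch in plain Python (not any employer's actual tooling; the table shape and column names are invented): it compares row counts and an order-independent checksum between two extracts, the kind of integrity check run during and after each migration phase.

```python
import hashlib

def table_fingerprint(rows, key_columns):
    """Order-independent fingerprint of a table: row count plus an XOR
    of per-row MD5 digests computed over the listed columns."""
    digest = 0
    for row in rows:
        payload = "|".join(str(row[c]) for c in key_columns)
        digest ^= int(hashlib.md5(payload.encode()).hexdigest(), 16)
    return len(rows), digest

def reconcile(source_rows, target_rows, key_columns):
    """Return a dict describing whether the target load matches the source."""
    src_count, src_sum = table_fingerprint(source_rows, key_columns)
    tgt_count, tgt_sum = table_fingerprint(target_rows, key_columns)
    return {
        "row_count_match": src_count == tgt_count,
        "checksum_match": src_sum == tgt_sum,
        "source_rows": src_count,
        "target_rows": tgt_count,
    }

# Hypothetical legacy extract vs. Snowflake load of the same orders table;
# row order differs, which the XOR-based checksum tolerates.
legacy = [{"order_id": 1, "amount": 100}, {"order_id": 2, "amount": 250}]
loaded = [{"order_id": 2, "amount": 250}, {"order_id": 1, "amount": 100}]

report = reconcile(legacy, loaded, key_columns=["order_id", "amount"])
print(report["row_count_match"], report["checksum_match"])  # True True
```

In practice the two row sets would come from the legacy database and a Snowflake query rather than literals; the comparison logic is the same.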

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

Punjab

On-site

About Us
We are a global climate technologies company engineered for sustainability. We create sustainable and efficient residential, commercial and industrial spaces through HVACR technologies. We protect temperature-sensitive goods throughout the cold chain. And we bring comfort to people globally. Best-in-class engineering, design and manufacturing combined with category-leading brands in compression, controls, software and monitoring solutions result in next-generation climate technology that is built for the needs of the world ahead. Whether you are a professional looking for a career change, an undergraduate student exploring your first opportunity, or a recent graduate with an advanced degree, we have opportunities that will allow you to innovate, be challenged and make an impact. Join our team and start your journey today!

Job description / responsibilities:
We are looking for Data Engineers. Candidates must have a minimum of ten (10) years of experience in a Data Engineer role, including the following tools/technologies:
- Experience with relational (SQL) databases.
- Experience with data warehouses like Oracle, SQL and Snowflake.
- Technical expertise in data modeling, data mining and segmentation techniques.
- Experience building new and troubleshooting existing data pipelines using tools like Pentaho Data Integration (PDI), ADF (Azure Data Factory), Snowpipe, Fivetran and DBT.
- Experience with batch and real-time data ingestion and processing frameworks.
- Experience with languages like Python, Java, etc.
- Knowledge of additional cloud-based analytics solutions, along with Kafka, Spark and Scala, is a plus.
- Develops code and solutions that transfer/transform data across various systems.
- Maintains deep technical knowledge of various tools in the data warehouse, data hub, and analytical tools.
- Ensures data is transformed and stored in efficient methods for retrieval and use.
- Maintains data systems to ensure optimal performance.
- Develops a deep understanding of underlying business systems involved with analytical systems.
- Follows standard software development lifecycle, code control, code standards and process standards.
- Maintains and develops technical knowledge by self-training on current toolsets and computing environments, participates in educational opportunities, maintains professional networks, and participates in professional organizations related to their tech skills.

Systems Analysis
- Works with key stakeholders to understand business needs and capture functional and technical requirements.
- Offers ideas that simplify the design and complexity of solutions delivered.
- Effectively communicates any expectations required of stakeholders or other resources during solution delivery.
- Develops and executes test plans to ensure successful rollout of solutions, including accuracy and quality of data.

Service Management
- Effectively communicates to leaders and stakeholders any obstacles that occur during solution delivery.
- Defines and manages promised delivery dates.
- Proactively researches, analyzes, and predicts operational issues, informing leadership where appropriate.
- Offers viable options to solve unexpected/unknown issues that occur during solution development and delivery.

Education / job-related technical skills:
- Bachelor's degree in Computer Science/Information Technology or equivalent.
- Ability to effectively communicate with others at all levels of the company, both verbally and in writing. Demonstrates a courteous, tactful, and professional approach with employees and others.
- Ability to work in a large, global corporate structure.

Our Commitment to Our People
Across the globe, we are united by a singular purpose: sustainability is no small ambition. That's why everything we do is geared toward a sustainable future, for our generation and all those to come. Through groundbreaking innovations, HVACR technology and cold chain solutions, we are reducing carbon emissions and improving energy efficiency in spaces of all sizes, from residential to commercial to industrial. Our employees are our greatest strength. We believe that our culture of passion, openness, and collaboration empowers us to work toward the same goal: to make the world a better place. We invest in the end-to-end development of our people, beginning at onboarding and through senior leadership, so they can thrive personally and professionally. Flexible and competitive benefits plans offer the right options to meet your individual/family needs. We provide employees with flexible time-off plans, including paid parental leave (maternal and paternal), vacation and holiday leave. Together, we have the opportunity and the power to continue to revolutionize the technology behind air conditioning, heating and refrigeration, and cultivate a better future. Learn more about us and how you can join our team!

Our Commitment to Diversity, Equity & Inclusion
At Copeland, we believe having a diverse, equitable and inclusive environment is critical to our success. We are committed to creating a culture where every employee feels welcomed, heard, respected, and valued for their experiences, ideas, perspectives and expertise. Ultimately, our diverse and inclusive culture is the key to driving industry-leading innovation, better serving our customers and making a positive impact in the communities where we live.

Equal Opportunity Employer

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose, the relentless pursuit of a world that works better for people, we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Principal Consultant, Power BI Developer. In this role, you will be responsible for coding, testing, and delivering high-quality deliverables, and should be willing to learn new technologies.

Responsibilities:
- Understand the fundamentals of data preparation/data modelling necessary for visualization purposes.
- Develop reports and visualizations using Power BI.
- Work experience with stored procedures and SQL.
- Strong knowledge of database concepts and internals.
- Work experience in Snowflake and MS SQL.
- Strong understanding of Power BI user and group security configuration.

Qualifications we seek in you!
Minimum qualifications:
- BE/B.Tech/MCA.
- Excellent written and verbal communication skills.

Preferred qualifications / skills:
- Good conceptual knowledge of and experience in other BI tools is an added advantage.
- Experienced Power BI developer, including experience in an Azure environment.
- Experience implementing Power BI reports reading data from different data sources, including on-premise data servers, cloud services and several file formats.
- Experience creating reports, dashboards and visualizations in Power BI.
- Competent with Power BI Desktop, Gateway, Service and Report Server.
- Good experience with paginated reports using Report Builder.
- Experience migrating SSRS reports/tabular reports to Power BI paginated reports.
- Good understanding of Power Query M and hands-on experience building sophisticated DAX queries.
- Experience implementing static and dynamic row-level security; extensive experience with dataset design, data cleansing and data aggregation.
- Understanding of relational database structures, theories, principles, and practices.
- Solid SQL reporting skills, with the ability to create SQL views and write SQL queries to build custom datasets for reporting or analysis.
- Able to nurture robust working relationships with the team, peers and clients; scheduling flexibility required.
- Overall, the candidate should have problem-solving skills, a macro-level research and analytic approach, and be good with numbers.

Job: Principal Consultant
Primary Location: India-Bangalore
Schedule: Full-time
Education Level: Bachelor's / Graduation / Equivalent
Job Posting: Mar 19, 2025, 10:19:53 PM
Unposting Date: Apr 18, 2025, 11:59:00 PM
Master Skills List: Consulting
Job Category: Full Time

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

Position Type: Full time
Type of Hire: Experienced (relevant combination of work and education)
Education Desired: Bachelor of Computer Science
Travel Percentage: 0%

Principal/Sr. Lead Engineer, Automation

Are you curious, motivated, and forward-thinking? At FIS you'll have the opportunity to work on some of the most challenging and relevant issues in financial services and technology. Our talented people empower us, and we believe in being part of a team that is open, collaborative, entrepreneurial, passionate and, above all, fun.

About the team
FIS Protegent helps compliance officers cope with increasing data volumes, understand more sophisticated product risks, stay ahead of evolving regulations and rapidly respond to ongoing demands from auditors and regulators. The FIS Compliance Suite of products helps firms meet their compliance and regulatory obligations, from providing comprehensive surveillance for insider trading and market manipulation to assisting in supervisory controls and supporting management reporting.

What you will be doing
You will be part of the FIS Compliance Suite (formerly Protegent) next-generation product development initiative, joining a team of highly motivated and focused developers building our next-generation compliance product. As a Senior Automation Engineer at FIS Global, you will play a key role in automation for cloud-based applications, understanding data flow and configurations for multiple environments to expedite release/build verification and improve quality and stability. Leveraging your expertise in scripting, AWS, Jenkins and Snowflake, you will collaborate with cross-functional teams to solve challenging problems and drive strategic automation initiatives. You will be responsible for devising and utilizing an automation framework and DevOps best practices to set up an end-to-end automation pipeline that improves build and release quality; reviewing and validating data for uniformity and accuracy; analysing results for failures; and interpreting them with clear objectives in mind. You will get the opportunity to work with multiple products and businesses and develop a good understanding of the interesting world of trading and compliance.

What you bring (knowledge / experience):
- Minimum of five years of experience in AWS, Snowflake and DevOps automation work, with a proven track record of delivering impactful solutions.
- Strong SQL skills.
- Proficiency in programming languages such as Python and Unix scripting, as well as experience with Jenkins build pipelines and release deployment automation.
- Strong analytical and problem-solving skills, with the ability to translate business requirements into analytical solutions.
- Excellent communication and presentation skills, with the ability to convey complex technical concepts.
- Experience working with big data technologies (e.g., Hadoop, Spark) and cloud platforms (e.g., AWS, Azure, GCP) is a plus.
- Demonstrated ability to work effectively in a collaborative team environment and manage multiple priorities in a fast-paced, dynamic setting.
- Proven ability to automate data creation work to reduce manual effort.

Responsibilities:
- Define an automation plan and own it end to end for release verification and CI/CD pipeline setup.
- Understand product architecture and workflow to build an optimized automation pipeline for continuous delivery.
- Work closely with product and solution management teams to understand business use cases and convert them into efficient automation setup and execution to reduce time to market.
- Stay abreast of the latest advancements in DevOps and AWS automation to leverage the latest concepts and methodologies.
- Contribute to the development and implementation of best practices for DevOps, automation, and release deployment/verification.
- Set up new and upgrade existing environments for the automation pipeline, and monitor it for failure analysis during daily sanity and regression verification.

Qualifications
Bachelor's or master's degree in computer science or a related field.

Competencies
- Fluent in English.
- Excellent communicator: able to discuss automation initiatives and provide optimized solutions.
- Attention to detail and quality focus.
- Organized approach: manages and adapts priorities according to client and internal requirements.
- Self-starter with a team mindset: works autonomously and as part of a global team.

What we offer you
- A multifaceted job with a high degree of responsibility, visibility, and ownership.
- An opportunity to work with one of the fastest-growing segments and products in the capital markets space, with great opportunity for growth.
- Strong exposure to the exciting trading and compliance space, with excellent learning opportunities.
- A broad range of professional education and personal development possibilities. FIS is your final career step!
- A competitive salary and benefits.
- A variety of career development tools, resources and opportunities.

With a 50-year history rooted in the financial services industry, FIS is the world's largest global provider dedicated to financial technology solutions. We champion clients from banking to capital markets, retail to corporate and everything touched by financial services. Headquartered in Jacksonville, Florida, our 53,000 worldwide employees help serve more than 20,000 clients in over 130 countries. Our technology powers billions of transactions annually that move over $9 trillion around the globe. FIS is a Fortune 500 company and a member of the Standard & Poor's 500 Index.

Privacy Statement
FIS is committed to protecting the privacy and security of all personal information that we process in order to provide services to our clients. For specific information on how FIS protects personal information online, please see the FIS Online Privacy Notice.

Sourcing Model
Recruitment at FIS works primarily on a direct sourcing model; a relatively small portion of our hiring is through recruitment agencies. FIS does not accept resumes from recruitment agencies which are not on the preferred supplier list and is not responsible for any related fees for resumes submitted to job postings, our employees, or any other part of our company.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

The position at Iris Software in Noida, UP, India requires a candidate with 3-4 years of working experience. The ideal candidate must have a strong background in Python and Django, and should also possess expertise in SQL, Snowflake, and DBT. As part of the Iris Software team, you will work on complex, mission-critical applications using cutting-edge technologies such as Python, Django, SQL, Snowflake, and more.

The company's vision is to be the most trusted technology partner for its clients and to create an environment where professionals can realize their full potential. Iris Software values its employees and offers a supportive work culture where individuals are encouraged to grow both professionally and personally. The company's Employee Value Proposition focuses on enabling employees to excel in their careers, be challenged by inspiring work, and be part of a culture that recognizes and nurtures talent.

Joining Iris Software comes with a range of benefits aimed at supporting the financial, health, and well-being needs of its employees. From competitive salaries to comprehensive health insurance and flexible work arrangements, the company is committed to providing a rewarding work environment that fosters personal and professional growth. If you are looking to be part of one of India's Top 25 Best Workplaces in the IT industry and want to work with a rapidly growing IT services company, Iris Software could be the place for you to do your best work and thrive in an award-winning work culture.

Posted 1 week ago

Apply

4.0 - 9.0 years

20 - 35 Lacs

Bengaluru

Work from Office

Looking for a Data Engineer with 4+ years of experience. Skills: Azure functionality, AWS Lambda, serverless, Python, APIs, Snowflake. Work from office: Bangalore (Yeshvanthpur), India.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

R1 India is proud to be recognized amongst the Top 25 Best Companies to Work For 2024 by the Great Place to Work Institute. This is our second consecutive recognition on this prestigious Best Workplaces list, building on the Top 50 recognition we achieved in 2023. Our focus on employee wellbeing, inclusion and diversity is demonstrated through prestigious recognitions, with R1 India ranked amongst Best in Healthcare, Top 100 Best Companies for Women by Avtar & Seramount, and amongst the Top 10 Best Workplaces in Health & Wellness. We are committed to transforming the healthcare industry with our innovative revenue cycle management services. Our goal is to make healthcare work better for all by enabling efficiency for healthcare systems, hospitals, and physician practices. With over 30,000 employees globally, we are about 16,000+ strong in India, with a presence in Delhi NCR, Hyderabad, Bangalore, and Chennai. Our inclusive culture ensures that every employee feels valued, respected, and appreciated, with a robust set of employee benefits and engagement activities.

Position Title: Senior Specialist
Reports to: Program Manager, Analytics BI

Position Summary
A Specialist shall work with the development team and be responsible for development tasks as an individual contributor. He/she should be able to mentor the team and help resolve issues, should be technically sound, and should be able to communicate with the client effectively.

Key Duties & Responsibilities
- Work as Lead Developer on a data engineering project for E2E analytics.
- Ensure project delivery on time.
- Mentor other teammates and guide them.
- Take requirements from the client and communicate with them as well.
- Ensure timely creation of documents for the knowledge base, user guides, and other communications systems.
- Ensure delivery against business needs, team goals and objectives, i.e., meeting commitments and coordinating the overall schedule.
- Work with large datasets in various formats, integrity/QA checks, and reconciliation for accounting systems.
- Lead efforts to troubleshoot and solve process- or system-related issues.
- Understand, support, enforce and comply with company policies, procedures and Standards of Business Ethics and Conduct.
- Experience working with Agile methodology.

Experience, Skills and Knowledge
- Bachelor's degree in Computer Science or equivalent experience is required; B.Tech/MCA preferable.
- Minimum 5-7 years of experience.
- Excellent communication skills and a strong commitment to delivering the highest level of service.

Technical Skills
- Expert knowledge and experience working with Spark and Scala.
- Experience in Azure Data Factory, Azure Databricks, and Data Lake.
- Experience working with SQL and Snowflake.
- Experience with data integration tools such as SSIS and ADF.
- Experience with programming languages such as Python, Spark, and Scala.
- Expert in Astronomer Airflow.
- Experience with or exposure to Microsoft Azure Data Fundamentals.

Key Competency Profile
- Own your development by implementing and sharing your learnings.
- Motivate each other to perform at our highest level.
- Work the right way by acting with integrity and living our values every day.
- Succeed by proactively identifying problems and solutions for yourself and others.
- Communicate effectively if there is any challenge.
- Demonstrate accountability and responsibility.

Working in an evolving healthcare setting, we use our shared expertise to deliver innovative solutions. Our fast-growing team has opportunities to learn and grow through rewarding interactions, collaboration and the freedom to explore professional interests. Our associates are given valuable opportunities to contribute, to innovate and to create meaningful work that makes an impact in the communities we serve around the world. We also offer a culture of excellence that drives customer success and improves patient care. We believe in giving back to the community and offer a competitive benefits package. To learn more, visit: r1rcm.com

Visit us on Facebook

Posted 1 week ago

Apply

6.0 - 10.0 years

15 - 30 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Design and implement Snowflake Cortex solutions for advanced analytics and AI use cases. Optimize data pipelines, ensure security compliance, and support scalable deployments across cloud environments.

Posted 1 week ago

Apply

3.0 - 10.0 years

18 - 22 Lacs

Hyderabad

Work from Office

WHAT YOU'LL DO
- Lead the development of scalable data infrastructure solutions.
- Leverage your data engineering expertise to support data stakeholders and mentor less experienced data engineers.
- Design and optimize new and existing data pipelines.
- Collaborate with cross-functional product engineering teams and data stakeholders to deliver on Codecademy's data needs.

WHAT YOU'LL NEED
- 8 to 10 years of hands-on experience building and maintaining large-scale ETL systems.
- Deep understanding of database design and data structures: SQL and NoSQL.
- Fluency in Python.
- Experience working with cloud-based data platforms (we use AWS).
- SQL and data warehousing skills: able to write clean and efficient queries.
- Ability to make pragmatic engineering decisions in a short amount of time.
- Strong project management skills; a proven ability to gather and translate requirements from stakeholders across functions and teams into tangible results.

WHAT WILL MAKE YOU STAND OUT
- Experience with tools in our current data stack: Apache Airflow, Snowflake, dbt, FastAPI, S3, and Looker.
- Experience with Kafka, Kafka Connect, and Spark or other data streaming technologies.
- Familiarity with the database technologies we use in production: Snowflake, Postgres, and MongoDB.
- Comfort with containerization technologies: Docker, Kubernetes, etc.
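The stack named in this listing centres on Apache Airflow, whose core job is executing pipeline tasks in dependency order. As a toy illustration of that idea in plain Python (this is not Airflow's real API, and the task names are invented), the standard-library `graphlib` module can compute a valid run order for a task graph:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: an extract feeds the Snowflake load, which feeds
# dbt staging and mart models, which feed the Looker-facing refresh.
# The mapping is task -> set of tasks it depends on.
pipeline = {
    "extract_events": set(),
    "load_to_snowflake": {"extract_events"},
    "dbt_staging": {"load_to_snowflake"},
    "dbt_marts": {"dbt_staging"},
    "refresh_looker": {"dbt_marts"},
}

def run_order(dag):
    """Return an execution order in which every task's dependencies
    are scheduled before the task itself."""
    return list(TopologicalSorter(dag).static_order())

order = run_order(pipeline)
print(order[0], "->", order[-1])  # extract_events -> refresh_looker
```

A real scheduler adds retries, backfills, and parallelism on independent branches, but dependency ordering is the part every orchestrator shares.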

Posted 1 week ago

Apply

5.0 - 10.0 years

10 - 12 Lacs

Navi Mumbai

Work from Office

Hello candidates, we are hiring!

Job Position: Data Engineer
Experience: 5+ years
Location: Navi Mumbai (Juinagar)
Work Mode: WFO

Job Description
We are looking for an experienced and results-driven Senior Data Engineer to join our Data Engineering team. In this role, you will design, develop, and maintain robust data pipelines and infrastructure that enable efficient data flow across our systems. As a senior contributor, you will also help define best practices, mentor junior team members, and contribute to the long-term vision of our data platform. You will work closely with cross-functional teams to deliver reliable, scalable, and high-performance data systems that support critical business intelligence and analytics initiatives.

Responsibilities
- Design, build, and maintain scalable ETL/ELT pipelines to support analytics, the data warehouse, and business operations.
- Collaborate with cross-functional teams to gather requirements and deliver high-quality data solutions.
- Develop and manage data models, data lakes, and data warehouse solutions in cloud environments (e.g., AWS, Azure, GCP).
- Monitor and optimize the performance of data pipelines and storage systems.
- Ensure data quality, integrity, and security across all platforms.
- Optimize and tune SQL queries and ETL jobs for performance and scalability.
- Collaborate with business analysts, data scientists, and stakeholders to understand requirements and deliver data solutions.
- Contribute to architectural decisions and development standards across the data engineering team.
- Participate in code reviews and provide guidance to junior developers.
- Leverage tools such as Airflow, Spark, Kafka, dbt, or Snowflake to build modern data infrastructure.
- Implement best practices in data governance, security, and compliance (e.g., GDPR, HIPAA).
- Create and maintain detailed technical documentation.

Required Qualifications
- Bachelor's degree in Computer Science, Information Systems, or a related field; a Master's degree is a plus.
- 5+ years of experience in data warehousing, ETL development, and data modeling.
- Strong hands-on experience with one or more databases: Snowflake, Redshift, SQL Server, Oracle, Postgres, Teradata, BigQuery.
- Proficiency in SQL and scripting languages (e.g., Python, Shell).
- Deep knowledge of data modeling techniques and ETL frameworks.
- Excellent communication, analytical thinking, and troubleshooting skills.

Preferred Qualifications
- Experience with modern data stack tools like dbt, Fivetran, Stitch, Looker, Tableau, or Power BI.
- Knowledge of data lakes, lakehouses, and real-time data streaming (e.g., Kafka).
- Agile/Scrum project experience and version control using Git.

NOTE: Candidates can share their resume at shruti.a@talentsketchers.com
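A recurring duty in listings like this one is ensuring data quality and integrity before a load is published. The following is a minimal sketch in plain Python, assuming a batch of dict rows with an invented `id`/`email` shape (not any particular team's framework): it flags missing required values and duplicate primary keys, returning an empty list when the batch passes.

```python
def quality_checks(rows, key, required):
    """Run basic completeness and integrity checks on a batch of rows.
    Returns a list of human-readable failures; an empty list means pass."""
    failures = []
    seen = set()
    for i, row in enumerate(rows):
        # Completeness: every required column must be present and non-empty.
        for col in required:
            if row.get(col) in (None, ""):
                failures.append(f"row {i}: missing required column '{col}'")
        # Integrity: the business key must be unique within the batch.
        k = row.get(key)
        if k in seen:
            failures.append(f"row {i}: duplicate key {k!r}")
        seen.add(k)
    return failures

# Hypothetical batch with one empty email and one duplicated id.
batch = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": ""},
    {"id": 1, "email": "c@example.com"},
]
problems = quality_checks(batch, key="id", required=["id", "email"])
print(len(problems))  # 2
```

In a production pipeline the same gate would typically run as a pipeline step (or via a framework such as dbt tests) and fail the load rather than print a count.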

Posted 1 week ago

Apply

8.0 - 10.0 years

36 - 60 Lacs

Pune

Work from Office

We are seeking a data modelling professional with strong, hands-on experience in designing scalable, efficient data solutions and integrating them within enterprise systems.

Benefits: Provident fund, health insurance

Posted 1 week ago

Apply

2.0 - 7.0 years

12 - 16 Lacs

Hyderabad

Work from Office

WHAT YOU'LL DO
- Build scalable data infrastructure solutions.
- Design and optimize new and existing data pipelines.
- Integrate new data sources into our existing data architecture.
- Collaborate with cross-functional product engineering teams and data stakeholders to deliver on Codecademy's data needs.

WHAT YOU'LL NEED
- 3 to 5 years of hands-on experience building and maintaining large-scale ETL systems.
- Deep understanding of database design and data structures: SQL and NoSQL.
- Fluency in Python.
- Experience working with cloud-based data platforms (we use AWS).
- SQL and data warehousing skills: able to write clean and efficient queries.
- Ability to make pragmatic engineering decisions in a short amount of time.
- Strong project management skills; a proven ability to gather and translate requirements from stakeholders across functions and teams into tangible results.

WHAT WILL MAKE YOU STAND OUT
- Experience with tools in our current data stack: Apache Airflow, Snowflake, dbt, FastAPI, S3, and Looker.
- Experience with Kafka, Kafka Connect, and Spark or other data streaming technologies.
- Familiarity with the database technologies we use in production: Snowflake, Postgres, and MongoDB.
- Comfort with containerization technologies: Docker, Kubernetes, etc.

Posted 1 week ago

Apply

6.0 - 10.0 years

11 - 15 Lacs

Hyderabad

Work from Office

Responsibilities:
- Identify and understand business requirements for Reporting; design and implement future-proof solutions, optimizing data delivery.
- Apply solution-based thinking in DWH design and in the integration/optimization of existing DWH designs.
- Together with business end users and the contact persons of a specific domain/capability, be responsible for building a standard, sustainable DWH solution.
- Contribute to the end-to-end DWH design (everything needed for it) to help build reports and dashboards for our partners in an insightful and efficient way.
- Support the ETL engineers in translating functional requirements into technical designs.
- Be responsible for the delivery and follow-up of solutions within lifecycle management.
- Coach/train team members on various processes - BI analysis.

Profile:
- Should have at least 6+ years of experience in business requirements analysis (Reporting) for BI projects.
- Should be able to understand and convert functional business requirements into logical and physical DWH models (Star, Snowflake) using the Kimball methodology.
- Should be proficient in understanding DWH databases, with expertise in analysing and writing complex SQL efficiently.
- Strong working experience with data modelling tools - erwin Data Modeler or SAP PowerDesigner is a must.
- Assemble large, complex data sets that meet functional/non-functional business requirements.
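The Kimball-style star schema mentioned above pairs a central fact table with surrogate-keyed dimension tables; a report query resolves keys against the dimensions and aggregates the facts. A tiny pure-Python sketch (table contents and column names are invented for illustration):

```python
# Star-schema sketch: one fact table joined to two conformed dimensions
# on surrogate keys. All data here is made up for illustration.

dim_customer = {
    1: {"name": "Asha", "segment": "Retail"},
    2: {"name": "Ravi", "segment": "Corporate"},
}
dim_date = {
    20240101: {"year": 2024, "quarter": "Q1"},
}
fact_sales = [
    {"customer_key": 1, "date_key": 20240101, "amount": 120.0},
    {"customer_key": 2, "date_key": 20240101, "amount": 300.0},
    {"customer_key": 1, "date_key": 20240101, "amount": 80.0},
]

# Typical report: revenue by customer segment for Q1, i.e. the
# fact-to-dimension join a BI tool would issue as SQL.
revenue_by_segment = {}
for row in fact_sales:
    segment = dim_customer[row["customer_key"]]["segment"]
    if dim_date[row["date_key"]]["quarter"] == "Q1":
        revenue_by_segment[segment] = revenue_by_segment.get(segment, 0.0) + row["amount"]

print(revenue_by_segment)  # {'Retail': 200.0, 'Corporate': 300.0}
```

A snowflake schema would further normalize the dimensions (e.g., splitting segment into its own table) at the cost of extra joins.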

Posted 1 week ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Bengaluru

Work from Office

About The Role
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: Python (Programming Language), Apache Spark, Databricks Unified Data Analytics Platform
Minimum experience: 3 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with team members to identify data needs and optimize data workflows, contributing to the overall efficiency and effectiveness of data management within the organization.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the design and implementation of data architecture to support business needs.
- Monitor and optimize data pipelines for performance and reliability.

Professional & Technical Skills:
- Must-have: Proficiency in Snowflake Data Warehouse.
- Good to have: Experience with Databricks Unified Data Analytics Platform, Apache Spark, Python.
- Strong understanding of data modeling and database design principles.
- Experience with ETL tools and data integration techniques.
- Familiarity with cloud data warehousing solutions and big data technologies.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 1 week ago

Apply

15.0 - 20.0 years

2 - 5 Lacs

Bengaluru

Work from Office

About The Role
Project Role: Quality Engineer (Tester)
Project Role Description: Enables full stack solutions through multi-disciplinary team planning and ecosystem integration to accelerate delivery and drive quality across the application lifecycle. Performs continuous testing for security, API, and regression suites. Creates automation strategy and automated scripts, and supports data and environment configuration. Participates in code reviews, monitors, and reports defects to support continuous improvement activities for the end-to-end testing process.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As a Quality Engineer, you will enable full stack solutions through multi-disciplinary team planning and ecosystem integration to accelerate delivery and drive quality across the application lifecycle. Your typical day will involve performing continuous testing for security, API, and regression suites, creating automation strategies, and supporting data and environment configurations. You will also participate in code reviews and monitor and report defects to support continuous improvement activities for the end-to-end testing process, ensuring that the highest quality standards are met throughout the project lifecycle.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Develop and implement testing strategies that align with project goals and timelines.
- Facilitate knowledge-sharing sessions to enhance team capabilities and foster a culture of continuous learning.

Professional & Technical Skills:
- Must-have: Proficiency in Snowflake Data Warehouse.
- Experience with data integration and ETL processes.
- Strong understanding of database management and SQL.
- Familiarity with automated testing tools and frameworks.
- Ability to analyze and interpret complex data sets.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
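One common automated check in a data-warehouse regression suite is source-to-target reconciliation: comparing row counts and a column checksum between an extract and what was loaded. A minimal sketch, with made-up table contents and a hypothetical `reconcile` helper:

```python
# Reconciliation-check sketch for a DWH test suite. The function and
# sample rows are illustrative, not from any specific framework.

def reconcile(source_rows, target_rows, amount_col="amount"):
    checks = {
        "row_count_match": len(source_rows) == len(target_rows),
        "sum_match": round(sum(r[amount_col] for r in source_rows), 2)
                     == round(sum(r[amount_col] for r in target_rows), 2),
    }
    checks["passed"] = all(checks.values())
    return checks

source = [{"id": 1, "amount": 10.5}, {"id": 2, "amount": 20.0}]
target = [{"id": 1, "amount": 10.5}, {"id": 2, "amount": 20.0}]
result = reconcile(source, target)
print(result["passed"])  # prints True
```

In practice the two row sets would come from `SELECT COUNT(*), SUM(amount)` queries against the source system and the Snowflake target, scheduled after each load.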

Posted 1 week ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Bengaluru

Work from Office

About The Role
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are optimized for performance and usability. You will also engage in problem-solving activities, providing support and guidance to your team members while continuously seeking opportunities for improvement and innovation in application development.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with business goals.

Professional & Technical Skills:
- Must-have: Proficiency in Snowflake Data Warehouse.
- Strong understanding of data modeling and ETL processes.
- Experience with SQL and data querying techniques.
- Familiarity with cloud-based data solutions and architecture.
- Ability to troubleshoot and optimize data workflows.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 1 week ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Bengaluru

Work from Office

About The Role
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team in implementing effective solutions. You will also engage in strategic planning sessions to align project goals with organizational objectives, ensuring that all stakeholders are informed and involved in the decision-making process. Your role will be pivotal in fostering a collaborative environment that encourages innovation and efficiency in application development.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with timelines and objectives.

Professional & Technical Skills:
- Must-have: Proficiency in Snowflake Data Warehouse.
- Strong understanding of data warehousing concepts and architecture.
- Experience with ETL processes and data integration techniques.
- Familiarity with SQL and database management.
- Ability to analyze and optimize data workflows for performance.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 1 week ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

About The Role
Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: Shell Scripting
Minimum experience: 7.5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Designer, you will assist in defining requirements and designing applications to meet business process and application requirements. A typical day involves collaborating with cross-functional teams to gather insights, analyzing user needs, and translating them into functional specifications. You will engage in discussions to refine application designs and ensure alignment with business objectives, while also participating in testing and validation processes to guarantee that the applications meet the established requirements. Your role will be pivotal in driving the development of innovative solutions that enhance operational efficiency and user experience.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with stakeholders to gather and analyze requirements for application design.
- Participate in the testing and validation of applications to ensure they meet business needs.

Professional & Technical Skills:
- Must-have: Proficiency in Snowflake Data Warehouse.
- Good to have: Experience with Shell Scripting.
- Strong understanding of data modeling and ETL processes.
- Experience with SQL and database management.
- Familiarity with cloud-based data solutions and architecture.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 1 week ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Gurugram

Work from Office

About The Role
Project Role: Technology Support Engineer
Project Role Description: Resolve incidents and problems across multiple business system components and ensure operational stability. Create and implement Requests for Change (RFC) and update knowledge base articles to support effective troubleshooting. Collaborate with vendors and help service management teams with issue analysis and resolution.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: Microsoft Azure DevOps, GitHub Actions, Python on Azure
Minimum experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As a Technology Support Engineer, you will engage in a dynamic environment where you will resolve incidents and problems across various business system components. Your typical day will involve ensuring operational stability, creating and implementing Requests for Change, and updating knowledge base articles to facilitate effective troubleshooting. You will also collaborate with vendors and assist service management teams in analyzing and resolving issues, contributing to a seamless operational flow and enhanced service delivery.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate training sessions for junior team members to enhance their skills and knowledge.
- Monitor and evaluate team performance to ensure alignment with operational goals.

Professional & Technical Skills:
- Must-have: Proficiency in Snowflake Data Warehouse.
- Good to have: Experience with Python on Azure, GitHub Actions, Microsoft Azure DevOps.
- Strong understanding of data warehousing concepts and best practices.
- Experience with incident management and problem resolution processes.
- Familiarity with service management tools and methodologies.
- Ability to analyze system performance and implement improvements.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Gurugram office.
- 15 years of full-time education is required.

Posted 1 week ago

Apply

5.0 - 10.0 years

30 - 35 Lacs

Bengaluru

Remote

Position Responsibilities:
- As part of the Data Warehouse team, implement technology improvements to the Enterprise Data Warehouse environment.
- Be a key contributor to the implementation of Snowflake with dbt as part of our migration from Oracle to a cloud-based data warehouse.
- Collaborate closely with cross-functional teams to design, develop, and optimize ELT processes within a cloud-based Data Warehouse environment.
- Develop and maintain Fivetran data pipelines to ensure smooth data extraction and loading from various source systems into Snowflake.
- Implement and enhance ETL programs using Informatica PowerCenter against the Data Warehouse and Adobe Campaign (Neolane) databases.
- Contribute to technical architecture planning, digital data modeling, process flow documentation, and the design and development of innovative digital business solutions.
- Create technical designs and mapping specifications.
- Work with both technical staff and business constituents to translate digital business requirements into technical solutions.
- Estimate workload and participate in an Agile project team approach.
- Proven individual contributor and team player with strong communication skills.
- Ability to lead, manage, and validate workload for up to 2 offshore developers.
- Provide on-call support for the Data Warehouse's nightly processing.
- Be an active participant in both technology and business initiatives.

Position Requirements & Qualifications:
- At least 8 years of experience supporting Data Warehouse and data-related environments.
- Conduct efficient data integration between third-party tools and Snowflake.
- Hands-on experience with Snowflake development.
- Familiarity with cloud-based data warehousing solutions, particularly Snowflake.
- Advanced experience in Informatica PowerCenter (5+ years) is required.
- Ability to code in Python and JavaScript.
- Knowledge of data governance practices and data security considerations in a cloud environment.
- Experience working with web services via Informatica for external vendor data integration.
- Experience working with a number of XML data sources and API calls.
- Solid experience in performance tuning of ETL jobs and database queries.
- Advanced Oracle and Snowflake database skills, including packages, procedures, indexing, and query tuning (5+ years).
- Solid understanding of Data Warehouse design theory, including dimensional data modeling.
- Working experience with cloud computing architecture.
- Experience working with Azure DevOps, Jira, TFS (Team Foundation Server), or a similar Agile project management tool.
- Ability to thrive in change, with a fast, flexible, cooperative work style and the ability to reprioritize at a moment's notice.
- Bachelor's degree required.
- Notice Period: 0-15 days.
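A recurring step in Oracle-to-Snowflake migrations like this one is the incremental load: landing changed rows in a staging table, then upserting them into the target with a MERGE. A hedged sketch that merely generates the SQL text; the `dw.customer`/`stg.customer` tables and column names are hypothetical:

```python
# Sketch: generate a Snowflake MERGE statement for an incremental
# staging-to-target upsert. Table and column names are invented.

def build_merge(target, staging, key_cols, update_cols):
    on = " AND ".join(f"t.{c} = s.{c}" for c in key_cols)
    sets = ", ".join(f"t.{c} = s.{c}" for c in update_cols)
    cols = ", ".join(key_cols + update_cols)
    vals = ", ".join(f"s.{c}" for c in key_cols + update_cols)
    return (
        f"MERGE INTO {target} t USING {staging} s ON {on} "
        f"WHEN MATCHED THEN UPDATE SET {sets} "
        f"WHEN NOT MATCHED THEN INSERT ({cols}) VALUES ({vals})"
    )

sql = build_merge("dw.customer", "stg.customer",
                  key_cols=["customer_id"], update_cols=["name", "email"])
print(sql)
```

In a dbt-based setup the same pattern is usually expressed declaratively as an incremental model rather than hand-built SQL, but the generated statement is equivalent.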

Posted 1 week ago

Apply

6.0 - 10.0 years

10 - 19 Lacs

Chennai

Work from Office

Job Title: Senior Data Analyst
Location: Chennai
Reports To: CEO

Job Summary: We are seeking a highly skilled and experienced data scientist with 6+ years of hands-on experience in data science, analytics, and stakeholder engagement. The ideal candidate should have strong expertise in Python, Tableau, Snowflake, machine learning, and statistical testing, and should be comfortable driving business insights through storytelling and daily interactions with stakeholders.

Key Responsibilities:
- Design, build, and deploy scalable machine learning models to solve complex business problems
- Write and optimize complex SQL queries, particularly on the Snowflake platform
- Develop insightful dashboards and visualizations using Tableau
- Conduct data exploration, cleaning, and transformation using Python and R and relevant libraries (pandas, NumPy, scikit-learn, etc.)
- Perform A/B testing and other statistical techniques
- Translate analytical insights into clear, compelling stories and recommendations for stakeholders
- Collaborate cross-functionally with product, engineering, and business teams to understand data needs and deliver solutions
- Regularly present findings and recommendations to both technical and non-technical stakeholders

Requirements:
- 6+ years of professional experience in data science or advanced analytics
- Strong proficiency in Python for data manipulation, analysis, and modelling
- Proven experience building dashboards and reports using Tableau
- Expertise in writing complex SQL queries, especially on Snowflake
- Solid understanding of machine learning techniques, statistical tests, and model deployment best practices
- Excellent communication and storytelling skills to convey data-driven insights
- Comfortable working closely with stakeholders daily to gather requirements and present findings
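The A/B-testing skill this role asks for typically reduces to a two-proportion z-test on conversion rates. A minimal pure-Python sketch using the normal approximation; the sample sizes and conversion counts below are made up for illustration:

```python
import math

# Two-proportion z-test sketch for an A/B conversion experiment.
# Counts are illustrative; real inputs come from experiment logs.

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal tail, via erfc.
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Variant B converts 5.2% vs. A's 4.0% on 5,000 users each.
z, p = two_proportion_z(conv_a=200, n_a=5000, conv_b=260, n_b=5000)
print(round(z, 2), round(p, 4))
```

The normal approximation is reasonable at these sample sizes; for small samples, an exact test (e.g., Fisher's) would be the safer choice.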

Posted 1 week ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Gurugram

Work from Office

About The Role
Project Role: Technology Support Engineer
Project Role Description: Resolve incidents and problems across multiple business system components and ensure operational stability. Create and implement Requests for Change (RFC) and update knowledge base articles to support effective troubleshooting. Collaborate with vendors and help service management teams with issue analysis and resolution.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: Google Cloud Platform Administration
Minimum experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As a Technology Support Engineer, you will engage in a dynamic environment where you will resolve incidents and problems across various business system components, ensuring operational stability. Your typical day will involve collaborating with different teams, implementing Requests for Change, and updating knowledge base articles to enhance troubleshooting effectiveness. You will also work closely with vendors and service management teams to analyze and resolve issues, contributing to a seamless operational flow and improved service delivery.

The Data Platform Engineering team needs to deploy a new Snowflake account in Google Cloud Platform (GCP) as part of the Cloud 3.0 program. To ensure the successful end-to-end implementation of Snowflake in GCP, dedicated engineers are essential. Currently, we lack a dedicated engineer focused on GCP Snowflake.

Snowflake Implementation Responsibilities:
- End-to-End Implementation: The candidate will be responsible for the comprehensive implementation of Snowflake on GCP, ensuring all components are configured correctly and efficiently.
- Logging and Monitoring: The role includes establishing robust logging and monitoring mechanisms essential for tracking performance and system health.
- Disaster Recovery (DR): The engineer will implement DR strategies to ensure data integrity and availability in case of failures.

Candidate Profile:
- Technical Expertise: The ideal candidate will be a hands-on senior cloud solution engineer with extensive experience in Snowflake and data engineering. Proficiency in programming languages such as Java or Python and familiarity with Terraform are mandatory.
- Cloud Proficiency: Expertise in Google Cloud Platform is essential for this role, particularly in services related to data storage, processing, and analytics.
- Operational Support: The candidate should be able to support modern data platforms with hands-on programming to ensure seamless operations and maintenance.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate training sessions to enhance team knowledge and skills.
- Monitor system performance and proactively identify areas for improvement.

Professional & Technical Skills:
- Must-have: Proficiency in Snowflake Data Warehouse.
- Good to have: Experience with Google Cloud Platform Administration.
- Strong understanding of data warehousing concepts and best practices.
- Experience in troubleshooting and resolving technical issues in a timely manner.
- Familiarity with incident management and change management processes.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Gurugram office.
- 15 years of full-time education is required.

Posted 1 week ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

About The Role
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum experience: 3 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and debugging processes to ensure the applications function as intended, while continuously seeking ways to enhance application efficiency and user experience.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application specifications and user guides.
- Engage in code reviews to ensure quality and adherence to best practices.

Professional & Technical Skills:
- Must-have: Proficiency in Snowflake data migration concepts using external and internal stages, and proficiency in SQL stored procedures.
- Good to have: Experience with data integration tools and dbt.
- Strong understanding of SQL and database management.
- Experience creating complex SQL stored procedures and using SQL and its built-in functions effectively.
- Strong SQL performance tuning and optimization techniques.
- Familiarity with cloud computing concepts and services.
- Good grasp of Python concepts.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
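The stage-based loading this role describes works by uploading files to a named internal (or external) stage and then issuing a COPY INTO against the target table. A hedged sketch that only composes the statement text; the stage, table, and file-format names are hypothetical:

```python
# Sketch: compose a Snowflake COPY INTO statement for loading files
# from a named internal stage. All object names here are invented.

def build_copy_into(table, stage, file_format, pattern=None):
    sql = (f"COPY INTO {table} FROM @{stage} "
           f"FILE_FORMAT = (FORMAT_NAME = '{file_format}')")
    if pattern:
        # Restrict the load to files matching a regex, e.g. only CSVs.
        sql += f" PATTERN = '{pattern}'"
    return sql

sql = build_copy_into("raw.orders", "my_int_stage/orders",
                      "csv_fmt", pattern=".*[.]csv")
print(sql)
```

In practice this string would be executed through the Snowflake Python connector (or wrapped in a stored procedure), after the files are staged with PUT for an internal stage.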

Posted 1 week ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Hyderabad

Work from Office

About The Role
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: Snowflake Schema
Minimum experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application requirements are met, overseeing the development process, and providing guidance to team members. You will also engage in problem-solving activities, ensuring that the applications are functioning optimally and meeting the needs of stakeholders. Your role will require you to stay updated on industry trends and best practices to enhance application performance and user experience.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must-have: Proficiency in Snowflake Data Warehouse.
- Good to have: Experience with Snowflake Schema.
- Strong understanding of data warehousing concepts and best practices.
- Experience with SQL and data modeling techniques.
- Familiarity with cloud-based data solutions and integration methods.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
