
1589 Snowflake Jobs - Page 5

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

5.0 - 10.0 years

12 - 18 Lacs

Hyderabad, Bengaluru

Hybrid

Work Location: Hyderabad & Bangalore. Work Timings: 1 PM IST to 10:30 PM IST. Experience: 5 to 8 years. Key Skills: expertise building solutions (as a developer, not an operations/BPO user) on any of the enterprise-level reconciliation platforms (e.g., TLM, Duco, IntelliMatch), ETL development, and strong working knowledge of SQL, preferably SQL Server and Postgres.

Posted 4 days ago

Apply

5.0 - 10.0 years

18 - 27 Lacs

Noida

Hybrid

Develop interactive dashboards and reports in Tableau to visualize key business metrics. Write, optimize, and troubleshoot SQL queries to extract, clean, and analyze data. Work with Snowflake for data management, transformation, and analytics.
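A query of the kind this role describes (extract, clean, aggregate for a dashboard) might look like the following sketch. It uses Python's built-in sqlite3 as a stand-in for Snowflake; the table and column names are invented for illustration:

```python
import sqlite3

# In-memory database standing in for a warehouse (illustrative only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL, status TEXT)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [("APAC", 120.0, "complete"), ("APAC", 80.0, "cancelled"),
     ("EMEA", 200.0, "complete"), ("EMEA", 50.0, "complete")],
)

# The kind of extract-and-clean query a dashboard sits on:
# filter out cancelled orders, then aggregate revenue per region.
rows = conn.execute(
    """
    SELECT region, SUM(amount) AS revenue, COUNT(*) AS n_orders
    FROM orders
    WHERE status = 'complete'
    GROUP BY region
    ORDER BY region
    """
).fetchall()

print(rows)  # [('APAC', 120.0, 1), ('EMEA', 250.0, 2)]
```

In Snowflake itself the same SELECT would run unchanged; the work the listing describes is largely about keeping such queries correct and fast as data volumes grow.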

Posted 4 days ago

Apply

2.0 - 4.0 years

6 - 10 Lacs

Pune

Work from Office

The headlines
Job Title: Data Consultant (Managed Services & Support)
Location: Hybrid; 2 days a week on-site in our office in Creaticity Mall, Shashtrinagar, Yerawada
Salary: ₹700,000 - ₹2,100,000/annum

A bit about the role
We're looking for passionate Data Consultants who thrive in a fast-paced, problem-solving environment to join our global Managed Services & Support team spanning India and the UK. In this role, you'll help keep our live cloud data solutions operating as they should be, ensuring data pipelines run smoothly and reporting layers stay up to date. You'll take a proactive approach, helping identify and resolve issues before they arise while optimising technical debt for long-term stability. This is perfect for someone who enjoys client interaction and is passionate about ensuring cloud data platforms perform at their best.

What you'll be doing
- Monitoring and troubleshooting live data pipelines, ensuring smooth operations and up-to-date reporting layers
- Managing a support queue, diagnosing and resolving issues related to ETL processes, Snowflake, Matillion, and data pipelines
- Proactively optimising existing solutions, identifying areas for improvement, and reducing technical debt
- Collaborating with senior consultants and engineers to escalate and resolve complex technical challenges
- Engaging with clients, ensuring clear communication, managing expectations, and providing best-practice recommendations
- Documenting and sharing knowledge, contributing to internal training and process improvements

What you'll need to succeed
- SQL knowledge and experience working with cloud data platforms (Snowflake, Matillion, or similar ETL tools)
- Strong problem-solving skills with the ability to troubleshoot pipeline failures and connectivity issues
- Excellent communication skills, able to engage with both technical teams and business stakeholders
- Experience with support queue management systems (e.g., JIRA, ServiceNow, FreshService) is a plus
- A proactive mindset, comfortable working under pressure in a fast-paced, client-focused environment

So, what's in it for you
- The chance to work with cutting-edge cloud data technologies, solving real-world business challenges
- Fast-track your career in cloud data support and analytics with training and development opportunities
- An opportunity to be part of a collaborative, international team, working across India and the UK
- A competitive salary, exciting career progression, and a chance to make a real impact

About Snap Analytics
We're a high-growth data analytics consultancy on a mission to help enterprise businesses unlock the full potential of their data. With offices in the UK, India, and South Africa, we specialise in cutting-edge cloud analytics solutions, transforming complex data challenges into actionable business insights. We partner with some of the biggest brands worldwide to modernise their data platforms, enabling smarter decision-making through Snowflake, Databricks, Matillion, and other cloud technologies. Our approach is customer-first, innovation-driven, and results-focused, delivering impactful solutions with speed and precision. At Snap, we're not just consultants, we're problem-solvers, engineers, and strategists who thrive on tackling complex data challenges. Our culture is built on collaboration, continuous learning, and pushing boundaries, ensuring our people grow just as fast as our business. Join us and be part of a team that's shaping the future of data analytics!
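Support work like this often comes down to distinguishing transient failures (connectivity blips) from real defects. A minimal, hypothetical sketch of the retry logic a pipeline monitor might apply before escalating a ticket (the function and error names are invented):

```python
def run_with_retries(task, max_attempts=3):
    """Run task(); re-run on failure, escalating only after max_attempts."""
    for attempt in range(1, max_attempts + 1):
        try:
            return task()
        except Exception:
            if attempt == max_attempts:
                raise  # genuine defect: escalate to the support queue

# Simulate a load job that fails twice with a transient error, then succeeds.
calls = {"n": 0}

def flaky_load():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient connectivity error")
    return "load complete"

result = run_with_retries(flaky_load)
print(result, "after", calls["n"], "attempts")  # load complete after 3 attempts
```

Real orchestrators (Matillion, Airflow, and the like) build this in; the sketch only shows the decision the ad's "identify and resolve issues before they arise" boils down to.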

Posted 4 days ago

Apply

3.0 - 5.0 years

14 - 18 Lacs

Gurugram

Work from Office

Responsibilities
- Lead the development of a modern, modular, and flexible restaurant technology platform
- Lead the development and co-manage the roadmap for our HutBot platform, our in-restaurant management app
- Assess, build and support restaurant ordering platforms, integrating POS with third-party apps and aggregators
- Oversee the integration of kiosks, mobile tablets, smart kitchen, delivery management systems, and BOH applications such as inventory, labor, learning management, and other employee-facing apps
- Develop and maintain enterprise architecture by building integrations between different platforms and apps

Minimum Requirements
- 10+ years of development experience managing large projects and teams with progressive career growth
- Development experience in TypeScript/Node.js with the React framework preferred; we may consider strong candidates with proven experience in related technologies, e.g. Python, C#
- Familiarity with cloud technologies, with experience in AWS being a bonus, along with proficiency in infrastructure-as-code tools like Terraform
- Strong understanding of modern database systems, including RDS (Postgres), NoSQL (DynamoDB, DocumentDB), and analytics tools like Snowflake, Domo (GDH), and Google Analytics
- Experience in building and supporting restaurant ordering platforms; integration of POS with third-party apps and aggregators; kiosks, mobile tablets, smart kitchen, delivery management systems; and BOH applications such as inventory, labor, learning management, and other employee-facing apps
- Experience in managing and building enterprise architecture by building integrations between different platforms and apps while managing long-term strategic focus and roadmaps
- Experience in managing large teams across multiple time zones

The Yum! Brands story is simple. We have four distinctive, relevant and easy global brands, KFC, Pizza Hut, Taco Bell and The Habit Burger Grill, born from the hopes and dreams, ambitions and grit of passionate entrepreneurs. And we want more of this to create our future! As the world's largest restaurant company, we have a clear and compelling mission: to build the world's most loved, trusted and fastest-growing restaurant brands. The key and not-so-secret ingredient in our recipe for growth is our unrivaled talent and culture, which fuels our results. We're looking for talented, motivated, visionary and team-oriented leaders to join us as we elevate and personalize the customer experience across our 48,000 restaurants, operating in 145 countries and territories around the world! We put pizza, chicken and tacos in the hands of customers through customized ordering, unique delivery approaches, app experiences, click-and-collect services and consumer data analytics, creating unique customer dining experiences, and we are only getting started. Employees may work for a single brand and potentially grow to support all company-owned brands depending on their role. Regardless of where they work, as a company opening an average of 8 restaurants a day worldwide, the growth opportunities are endless. Taco Bell has been named one of the 10 Most Innovative Companies in the World by Fast Company; Pizza Hut delivers more pizzas than any other pizza company in the world; and KFC still uses its 75-year-old finger lickin' good recipe, including secret herbs and spices, to hand-bread its chicken every day. Yum! and its brands have offices in Chicago, IL, Louisville, KY, Irvine, CA, Plano, TX and other markets around the world. We don't just say we are a great place to work; our commitments to the world and our employees show it. Yum! has been named to the Dow Jones Sustainability North America Index and ranked among the top 100 Best Corporate Citizens by Corporate Responsibility Magazine, in addition to being named to the Bloomberg Gender-Equality Index. Our employees work in an environment where the value of "believe in all people" is lived every day, enjoying benefits including but not limited to: 4 weeks' vacation plus holidays, sick leave and 2 paid days to volunteer at the cause of their choice, a dollar-for-dollar matching gift program, generous parental leave, and competitive benefits including medical, dental, vision and life insurance as well as a 6% 401(k) match, all encompassed in Yum!'s world-famous recognition culture.

Posted 4 days ago

Apply

2.0 - 6.0 years

5 - 9 Lacs

Pune

Work from Office

Join us as an MI Reporting Engineer at Barclays, where you'll spearhead the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences. As part of a team of developers, you will deliver the technology stack, using strong analytical and problem-solving skills to understand the business requirements and deliver quality solutions.

To be successful as an MI Reporting Engineer you should have experience with:
- Hands-on development of complex, medium and simple reports in Tableau, QlikView and SAP BO
- Extracting, transforming and loading data from multiple sources such as Teradata and Hive into BI tools
- Experience in Snowflake / AWS QuickSight preferable
- Creating performance-efficient data models and dashboards
- Solid working knowledge of writing SQL queries in Teradata and Hive/Impala
- Writing PySpark queries, with exposure to AWS Athena
- Attention to detail, with strong analytical and problem-solving skills
- Exceptional communication and interpersonal skills
- Comfort working in a corporate environment; someone who has business acumen and an innovative mindset

Some other highly valued skills include:
- High-level understanding of ETL processes
- Banking domain experience
- A quantitative mindset, with a desire to work in a data-intensive environment
- Familiarity with Agile delivery methodologies and project management techniques

You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based out of Pune.

Purpose of the role: To design and develop compelling visualizations that effectively communicate data insights to stakeholders across the bank, influencing decision-making and improving business outcomes.

Accountabilities:
- Performing exploratory data analysis and data cleansing to prepare data for visualization
- Translating complex data into clear, concise, and visually appealing charts, graphs, maps, and other data storytelling formats
- Utilising best practices in data visualization principles and design aesthetics to ensure clarity, accuracy, and accessibility
- Documenting visualization methodologies and findings in clear and concise reports
- Presenting data insights and visualizations to stakeholders at all levels, including executives, business users, and data analysts

Analyst expectations: To perform prescribed activities in a timely manner and to a high standard, consistently driving continuous improvement. The role requires in-depth technical knowledge and experience in the assigned area of expertise, and a thorough understanding of the underlying principles and concepts within that area. They lead and supervise a team, guiding and supporting professional development, allocating work requirements and coordinating team resources. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: Listen and be authentic, Energise and inspire, Align across the enterprise, and Develop others. An individual contributor instead develops technical expertise in their work area, acting as an advisor where appropriate. They will have an impact on the work of related teams within the area, partner with other functions and business areas, take responsibility for the end results of a team's operational processing and activities, escalate breaches of policies/procedures appropriately, and take responsibility for embedding new policies/procedures adopted due to risk mitigation. They advise and influence decision making within their own area of expertise, take ownership for managing risk and strengthening controls in relation to the work they own or contribute to, and deliver their work and areas of responsibility in line with relevant rules, regulations and codes of conduct. They maintain and continually build an understanding of how their own sub-function integrates with the function, alongside knowledge of the organisation's products, services and processes within the function, and demonstrate an understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation's sub-function. They make evaluative judgements based on the analysis of factual information, paying attention to detail, and resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents. They guide and persuade team members, communicate complex or sensitive information, and act as a contact point for stakeholders outside the immediate function, while building a network of contacts outside the team and external to the organisation. All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship, our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset, to Empower, Challenge and Drive, the operating manual for how we behave.
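Building performance-efficient data models for BI tools frequently starts with reducing raw event history to a latest-snapshot-per-entity table before it ever reaches Tableau or QlikView. A dependency-free sketch of that step (the record shape and field names are invented):

```python
def latest_per_key(rows, key, ts):
    """Keep only the most recent record per key, sorted by key."""
    best = {}
    for r in rows:
        k = r[key]
        if k not in best or r[ts] > best[k][ts]:
            best[k] = r
    return sorted(best.values(), key=lambda r: r[key])

# Raw history: two snapshots for account A1, one for A2.
records = [
    {"account": "A1", "ts": 1, "balance": 100},
    {"account": "A1", "ts": 3, "balance": 120},
    {"account": "A2", "ts": 2, "balance": 50},
]

snapshot = latest_per_key(records, "account", "ts")
print(snapshot)
# [{'account': 'A1', 'ts': 3, 'balance': 120},
#  {'account': 'A2', 'ts': 2, 'balance': 50}]
```

In SQL this is the familiar ROW_NUMBER()-over-partition pattern; materialising it once upstream keeps every downstream dashboard query cheap.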

Posted 4 days ago

Apply

8.0 - 13.0 years

2 - 30 Lacs

Pune

Work from Office

Step into the role of a Senior Data Engineer. At Barclays, innovation isn't encouraged, it's expected. As a Senior Data Engineer you will build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses and data lakes, to ensure that all data is accurate, accessible, and secure.

To be a successful Senior Data Engineer, you should have experience with:
- Hands-on work with large-scale data platforms and development of cloud solutions on the AWS data platform, with a proven track record of driving business success
- A strong understanding of AWS and distributed computing paradigms, with the ability to design and develop data ingestion programs to process large data sets in batch mode using Glue, Lambda, S3, Redshift, Snowflake and Databricks
- The ability to develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming and related technologies
- Hands-on programming experience in Python and PySpark
- An understanding of DevOps pipelines using Jenkins and GitLab; strength in data modelling and data architecture concepts; and familiarity with project management tools and Agile methodology
- Sound knowledge of data governance principles and tools (Alation/Glue data quality, mesh), and the ability to suggest solution architectures for diverse technology applications

Additional relevant skills given below are highly valued:
- Experience working in the financial services industry and in various Settlements and Sub-ledger functions like PNS, Stock Record and Settlements, and PNL
- Knowledge of the BPS, IMPACT and Gloss products from Broadridge, and of creating ML models using Python, Spark and Java

You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based in Pune.

Purpose of the role: To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses and data lakes, to ensure that all data is accurate, accessible, and secure.

Accountabilities:
- Building and maintaining data architecture pipelines that enable the transfer and processing of durable, complete and consistent data
- Designing and implementing data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures
- Developing processing and analysis algorithms fit for the intended data complexity and volumes
- Collaborating with data scientists to build and deploy machine learning models

Vice President expectations: To contribute to or set strategy, drive requirements and make recommendations for change. Plan resources, budgets, and policies; manage and maintain policies/processes; deliver continuous improvements and escalate breaches of policies/procedures. If managing a team, they define jobs and responsibilities, plan for the department's future needs and operations, counsel employees on performance and contribute to employee pay decisions/changes. They may also lead a number of specialists to influence the operations of a department, in alignment with strategic as well as tactical priorities, while balancing short- and long-term goals and ensuring that budgets and schedules meet corporate requirements. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: Listen and be authentic, Energise and inspire, Align across the enterprise, and Develop others. An individual contributor will instead be a subject matter expert within their own discipline and will guide technical direction. They will lead collaborative, multi-year assignments and guide team members through structured assignments, identifying the need to include other areas of specialisation to complete assignments. They will train, guide and coach less experienced specialists and provide information affecting long-term profits, organisational risks and strategic decisions. They advise key stakeholders, including functional leadership teams and senior management, on functional and cross-functional areas of impact and alignment; manage and mitigate risks through assessment, in support of the control and governance agenda; and demonstrate leadership and accountability for managing risk and strengthening controls in relation to the work their team does. They demonstrate a comprehensive understanding of the organisation's functions to contribute to achieving the goals of the business, and collaborate with other areas of work and business-aligned support areas to keep up to speed with business activity and business strategies. They create solutions based on sophisticated analytical thought, comparing and selecting complex alternatives; in-depth analysis with interpretative thinking is required to define problems and develop innovative solutions, adopting and including the outcomes of extensive research in problem-solving processes. They seek out, build and maintain trusting relationships and partnerships with internal and external stakeholders in order to accomplish key business objectives, using influencing and negotiating skills to achieve outcomes. All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship, our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset, to Empower, Challenge and Drive, the operating manual for how we behave.
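The real-time ingestion this role describes (Kafka feeding Spark Streaming) generally reduces to windowed aggregation over timestamped events. A toy, dependency-free sketch of a tumbling-window count; the event shape is invented, and a real job would use Spark's own windowing rather than this loop:

```python
def tumbling_window_counts(events, window_s):
    """Count (timestamp, payload) events into fixed, non-overlapping windows."""
    counts = {}
    for ts, _payload in events:
        bucket = (ts // window_s) * window_s  # window start this event falls in
        counts[bucket] = counts.get(bucket, 0) + 1
    return dict(sorted(counts.items()))

# Four events across three 5-second windows.
events = [(0, "a"), (3, "b"), (5, "c"), (11, "d")]
windowed = tumbling_window_counts(events, 5)
print(windowed)  # {0: 2, 5: 1, 10: 1}
```

The production concerns the listing hints at (late-arriving data, exactly-once delivery, checkpointing) are exactly what Spark Streaming layers on top of this basic idea.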

Posted 4 days ago

Apply

8.0 - 13.0 years

2 - 30 Lacs

Hyderabad

Work from Office

About Sanofi: We are an innovative global healthcare company, driven by one purpose: we chase the miracles of science to improve people's lives. Our team, across some 100 countries, is dedicated to transforming the practice of medicine by working to turn the impossible into the possible. We provide potentially life-changing treatment options and life-saving vaccine protection to millions of people globally, while putting sustainability and social responsibility at the center of our ambitions. Sanofi has recently embarked on a vast and ambitious digital transformation program. A cornerstone of this roadmap is the acceleration of its data transformation and of the adoption of artificial intelligence (AI) and machine learning (ML) solutions that will accelerate Manufacturing & Supply performance and help bring drugs and vaccines to patients faster, to improve health and save lives.

Who you are: You are a dynamic Data Engineer interested in challenging the status quo to design and develop globally scalable solutions that are needed by Sanofi's advanced analytics, AI and ML initiatives for the betterment of our global patients and customers. You are a valued influencer and leader who has contributed to making key datasets available to data scientists, analysts, and consumers throughout the enterprise to meet vital business needs. You have a keen eye for improvement opportunities while continuing to fully comply with all data quality, security, and governance standards.

Our vision for digital, data analytics and AI: Join us on our journey in enabling Sanofi's digital transformation through becoming an AI-first organization. This means:
- AI Factory: versatile teams operating in cross-functional pods, utilizing digital and data resources to develop AI products, bringing data management, AI and product development skills to products, programs and projects to create an agile, fulfilling and meaningful work environment
- Leading-edge tech stack: experience building products that will be deployed globally on a leading-edge tech stack
- World-class mentorship and training: working with renowned leaders and academics in machine learning to further develop your skillset

There are multiple vacancies across our Digital profiles and the NA region. Further assessments will be completed to determine the specific function and level of hired candidates.

Job highlights:
- Propose and establish technical designs to meet business and technical requirements
- Develop and maintain data engineering solutions based on requirements and design specifications using appropriate tools and technologies
- Create data pipelines / ETL pipelines and optimize performance
- Test and validate developed solutions to ensure they meet requirements
- Create design and development documentation based on standards for knowledge transfer, training, and maintenance
- Work with business and product teams to understand requirements and translate them into technical needs
- Adhere to and promote best practices and standards for code management, automated testing, and deployments
- Leverage existing or create new standard data pipelines within Sanofi to bring value through business use cases
- Develop automated tests for CI/CD pipelines
- Gather and organize large, complex data assets, and perform relevant analysis
- Conduct peer reviews for quality, consistency, and rigor of production-level solutions
- Actively contribute to the Data Engineering community and help define leading practices and frameworks
- Communicate results and findings in a clear, structured manner to stakeholders
- Remain up to date on the company's standards, industry practices and emerging technologies

Key functional requirements & qualifications:
- Experience working with cross-functional teams to solve complex data architecture and engineering problems
- Demonstrated ability to learn new data and software engineering technologies in a short amount of time
- Good understanding of agile/scrum development processes and concepts
- Able to work in a fast-paced, constantly evolving environment and manage multiple priorities
- Strong technical analysis and problem-solving skills related to data and technology solutions
- Excellent written, verbal, and interpersonal skills, with the ability to communicate ideas, concepts and solutions to peers and leaders
- Pragmatic and capable of solving complex issues, with technical intuition and attention to detail
- Service-oriented, flexible, and approachable team player
- Fluent in English (other languages a plus)

Key technical requirements & qualifications:
- Bachelor's degree or equivalent in Computer Science, Engineering, or a relevant field
- 4 to 5+ years of experience in data engineering, integration, data warehousing, business intelligence, business analytics, or a comparable role with relevant technologies and tools, such as Spark/Scala or Informatica/IICS/dbt
- Understanding of data structures and algorithms
- Working knowledge of scripting languages (Python, shell scripting)
- Experience in cloud-based data platforms (Snowflake is a plus)
- Experience with job scheduling and orchestration (Airflow is a plus)
- Good knowledge of SQL and relational database technologies/concepts
- Experience working with data models and query tuning

Nice to haves:
- Experience working in the life sciences/pharmaceutical industry is a plus
- Familiarity with data ingestion through batch, near real-time, and streaming environments
- Familiarity with data warehouse concepts and architectures (data mesh a plus)
- Familiarity with source code management tools (GitHub a plus)

Pursue progress. Discover extraordinary. Better is out there. Better medications, better outcomes, better science. But progress doesn't happen without people, people from different backgrounds, in different locations, doing different roles, all united by one thing: a desire to make miracles happen. So, let's be those people. Watch our ALL IN video and check out our Diversity, Equity and Inclusion actions at sanofi.com!
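"Develop automated tests for CI/CD pipelines" in a data context usually means data-quality assertions that run before a deployment or load is allowed to proceed. A minimal sketch of such a check; the column names and row shape are invented for illustration:

```python
def validate_rows(rows, required, non_null):
    """Return (row_index, column, problem) tuples for schema violations."""
    errors = []
    for i, row in enumerate(rows):
        for col in required:
            if col not in row:
                errors.append((i, col, "missing"))
            elif col in non_null and row[col] is None:
                errors.append((i, col, "null"))
    return errors

rows = [
    {"id": 1, "dose_mg": 5.0},
    {"id": 2, "dose_mg": None},  # null where a value is required
    {"id": 3},                   # column missing entirely
]
errors = validate_rows(rows, required=["id", "dose_mg"], non_null=["dose_mg"])
print(errors)  # [(1, 'dose_mg', 'null'), (2, 'dose_mg', 'missing')]
```

In a CI pipeline this would be a test that fails the build when `errors` is non-empty; tools like dbt express the same checks declaratively as schema tests.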

Posted 4 days ago

Apply

2.0 - 4.0 years

6 - 10 Lacs

Pune

Work from Office

The headlines
Job Title: Data Consultant (Delivery)
Start Date: Mid-July 2025
Location: Hybrid; 2 days a week on-site in our office in Creaticity Mall, Shashtrinagar, Yerawada
Salary: ₹700,000 - ₹2,100,000/annum

A bit about the role
We're looking for passionate Data Consultants to join our Delivery team: a thriving and fast-growing community of some of the industry's best cloud data engineers at all levels, ranging from interns and graduates up to seasoned experts. In this role, you'll combine technical expertise with strong commercial and client-facing skills. You'll get the unique opportunity to work with advanced tools and methodologies, develop innovative solutions, and play an integral part in delivering value to our clients. In a culture that values growth, mentorship, and technical excellence, this is the perfect opportunity for a data engineer looking to make a real impact within an international, industry-leading consultancy.

What you'll be doing
- Delivering high-quality data solutions by successfully managing development tasks with minimal guidance
- Working with industry-leading technologies such as Snowflake, Matillion, Power BI, and Databricks, with a focus on mastering at least one toolset while expanding your expertise in others
- Building trusted relationships with clients, managing expectations, and finding opportunities to add value beyond project scope
- Contributing to internal knowledge-sharing, delivering presentations, training sessions, and thought leadership content
- Driving business impact by engaging with stakeholders, understanding business challenges, and translating them into data-driven solutions
- Leading investigations, client workshops, and demonstrations to showcase technical expertise and problem-solving skills
- Balancing multiple priorities effectively, knowing when to escalate issues and when to push forward with solutions independently
- Helping shape the Snap Analytics team by mentoring junior consultants and sharing your expertise with others

What you'll need to succeed
- Technical expertise: strong experience in SQL, data modelling, and ETL processes; exposure to tools like Snowflake, Matillion, Databricks, or Power BI is highly desirable
- A problem-solving mindset: the ability to identify multiple solutions, analyse trade-offs, and confidently propose the best approach
- Client engagement skills: strong communication and stakeholder management abilities, ensuring seamless collaboration with clients at all levels
- Analytical thinking: the capability to evaluate data solutions critically and proactively identify opportunities for optimisation
- Ownership & initiative: self-motivation and accountability, with a proactive approach to learning and personal development
- A 'team player' mentality: willingness to contribute to internal initiatives, support colleagues, and help grow Snap Analytics as a company

So, what's in it for you
- A chance to work with the latest cloud data platforms, shaping enterprise-scale data solutions
- We'll support your journey towards technical certifications and leadership roles
- A collaborative and supportive culture in which we believe in knowledge-sharing, teamwork, and helping each other succeed
- The opportunity to write blogs, contribute to industry discussions, and become a recognised expert in your field
- A rewarding compensation package with opportunities for progression

About Snap Analytics
We're a high-growth data analytics consultancy on a mission to help enterprise businesses unlock the full potential of their data. With offices in the UK, India, and South Africa, we specialise in cutting-edge cloud analytics solutions, transforming complex data challenges into actionable business insights. We partner with some of the biggest brands worldwide to modernise their data platforms, enabling smarter decision-making through Snowflake, Matillion, Databricks, and other cloud technologies. Our approach is customer-first, innovation-driven, and results-focused, delivering impactful solutions with speed and precision. At Snap, we're not just consultants, we're problem-solvers, engineers, and strategists who thrive on tackling complex data challenges. Our culture is built on collaboration, continuous learning, and pushing boundaries, ensuring our people grow just as fast as our business. Join us and be part of a team that's shaping the future of data analytics!

Posted 4 days ago

Apply

2.0 - 7.0 years

3 - 7 Lacs

Hyderabad, Pune, Mumbai (All Areas)

Hybrid

Work Experience: 2+ years
Job Title: Snowflake Developer

Responsibilities: A day in the life of an Infoscion: as part of the Infosys delivery team, your primary role will be to ensure effective design, development, validation and support activities, so that our clients are satisfied with the high levels of service in the technology domain. You will gather the requirements and specifications, understand the client requirements in detail, and translate them into system requirements. You will play a key role in the overall estimation of work requirements, providing the right information on project estimations to Technology Leads and Project Managers. You will be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and professional requirements: Primary skills: Technology->Data on Cloud-DataStore->Snowflake. Preferred skills: Technology->Data on Cloud-DataStore->Snowflake.

Additional responsibilities: knowledge of design principles and fundamentals of architecture; understanding of performance engineering; knowledge of quality processes and estimation techniques; basic understanding of the project domain; ability to translate functional/non-functional requirements into system requirements; ability to design and code complex programs; ability to write test cases and scenarios based on the specifications; good understanding of SDLC and agile methodologies; awareness of the latest technologies and trends; logical thinking and problem-solving skills along with an ability to collaborate.

Educational requirements: MCA, MSc, MTech, Bachelor of Engineering, BCA, BSc, BTech.
Service Line: Data & Analytics Unit
Location: PAN India

Posted 4 days ago

Apply

6.0 - 11.0 years

35 - 50 Lacs

Pune, Gurugram, Delhi / NCR

Hybrid

Role: Snowflake Data Engineer
Mandatory Skills: #Snowflake, #Azure, #DataFactory, SQL, Python, #DBT/#Databricks
Location (Hybrid): Bangalore, Hyderabad, Chennai, Pune, Gurugram & Noida
Budget: Up to 50 LPA
Notice: Immediate to 30 days serving notice
Experience: 6-11 years

Key Responsibilities: Design and develop ETL/ELT pipelines using Azure Data Factory, Snowflake, and DBT. Build and maintain data integration workflows from various data sources to Snowflake. Write efficient and optimized SQL queries for data extraction and transformation. Work with stakeholders to understand business requirements and translate them into technical solutions. Monitor, troubleshoot, and optimize data pipelines for performance and reliability. Maintain and enforce data quality, governance, and documentation standards. Collaborate with data analysts, architects, and DevOps teams in a cloud-native environment.

Must-Have Skills: Strong experience with Azure Cloud Platform services. Proven expertise in Azure Data Factory (ADF) for orchestrating and automating data pipelines. Proficiency in SQL for data analysis and transformation. Hands-on experience with Snowflake and SnowSQL for data warehousing. Practical knowledge of DBT (Data Build Tool) for transforming data in the warehouse. Experience working in cloud-based data environments with large-scale datasets.

Good-to-Have Skills: Experience with Azure Data Lake, Azure Synapse, or Azure Functions. Familiarity with Python or PySpark for custom data transformations. Understanding of CI/CD pipelines and DevOps for data workflows. Exposure to data governance, metadata management, or data catalog tools. Knowledge of business intelligence tools (e.g., Power BI, Tableau) is a plus.

Qualifications: Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field. 5+ years of experience in data engineering roles using Azure and Snowflake. Strong problem-solving, communication, and collaboration skills.

Posted 4 days ago

Apply

5.0 - 7.0 years

7 - 9 Lacs

Hyderabad

Work from Office

We are looking for a skilled Snowflake Developer with 5-7 years of experience to join our team at IDESLABS PRIVATE LIMITED. The ideal candidate will have expertise in designing, developing, and implementing data warehousing solutions using Snowflake.

Roles and Responsibilities: Design and develop scalable data warehousing solutions using Snowflake. Collaborate with cross-functional teams to identify business requirements and design data models. Develop and maintain complex SQL queries for data extraction and manipulation. Implement data validation and quality checks to ensure accuracy and integrity. Optimize database performance and troubleshoot issues. Work closely with stakeholders to understand business needs and provide technical guidance.

Job Requirements: Strong understanding of data modeling and data warehousing concepts. Proficiency in writing complex SQL queries and stored procedures. Experience with Snowflake development tools and technologies. Excellent problem-solving skills and attention to detail. Ability to work collaboratively in a team environment. Strong communication and interpersonal skills.
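Many of the listings on this page pair Snowflake development with "data validation and quality checks". As a rough, hedged illustration of that kind of task (plain Python, hypothetical field names, no Snowflake connection; in practice such checks are often pushed down into SQL or DBT tests), a minimal batch-level quality report might look like:

```python
# Illustrative only: a minimal data-quality check of the kind these roles
# describe. The field names ("id", "amount") are made up for the example.

def quality_report(rows, required=("id", "amount")):
    """Count rows failing two simple checks: missing required fields
    and duplicate primary keys."""
    missing = sum(1 for r in rows if any(r.get(c) is None for c in required))
    seen, dupes = set(), 0
    for r in rows:
        key = r.get("id")
        if key in seen:
            dupes += 1
        seen.add(key)
    return {"rows": len(rows), "missing_required": missing, "duplicate_ids": dupes}

batch = [
    {"id": 1, "amount": 10.0},
    {"id": 2, "amount": None},   # fails the required-field check
    {"id": 1, "amount": 7.5},    # duplicate id
]
print(quality_report(batch))    # {'rows': 3, 'missing_required': 1, 'duplicate_ids': 1}
```

A report like this would typically gate a load step: if the counts exceed a tolerance, the batch is quarantined rather than merged into the warehouse.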

Posted 4 days ago

Apply

3.0 - 8.0 years

7 - 11 Lacs

Pune, Chennai, Bengaluru

Hybrid

Educational Qualification: Bachelor of Engineering, Bachelor of Technology, Bachelor of Science, Bachelor of Comp. Applications, Master of Technology, Master of Engineering, Master of Comp. Applications, Master of Science
Service Line: Engineering Services

Responsibilities: Master's degree in Computer Science, Statistics, Mathematics, or a related field. 3+ years of experience in data science and machine learning with a strong focus on model development and deployment. Expert-level knowledge of statistics, including probability theory, hypothesis testing, and statistical inference. In-depth knowledge of machine learning algorithms, including linear regression, logistic regression, and decision trees.

Additional Responsibilities: Good knowledge of software configuration management systems. Strong business acumen, strategy and cross-industry thought leadership. Awareness of the latest technologies and industry trends. Logical thinking and problem-solving skills, along with an ability to collaborate. Knowledge of two or three industry domains. Understanding of the financial processes for various types of projects and the various pricing models available. Client interfacing skills. Knowledge of SDLC and agile methodologies. Project and team management.

Technical and Professional Requirements: Experience with Natural Language Processing (NLP) and Computer Vision (CV) techniques. Knowledge of DevOps methodologies and practices for continuous integration/continuous delivery (CI/CD). Experience with data warehousing and data lake solutions like BigQuery or Snowflake. Familiarity with real-time data processing and streaming analytics. Passion for learning and staying at the forefront of data science and machine learning advancements.

Preferred Skills: Technology - Analytics Techniques - Cluster Analysis; Decision Trees; Linear Regression. Technology - Machine Learning - Python.

Posted 4 days ago

Apply

6.0 - 11.0 years

35 - 50 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Role: Snowflake Data Engineer
Mandatory Skills: #Snowflake, #Azure, #DataFactory, SQL, Python, #DBT/#Databricks
Location (Hybrid): Bangalore, Hyderabad, Chennai, Pune, Gurugram & Noida
Budget: Up to 50 LPA
Notice: Immediate to 30 days serving notice
Experience: 6-11 years

Key Responsibilities: Design and develop ETL/ELT pipelines using Azure Data Factory, Snowflake, and DBT. Build and maintain data integration workflows from various data sources to Snowflake. Write efficient and optimized SQL queries for data extraction and transformation. Work with stakeholders to understand business requirements and translate them into technical solutions. Monitor, troubleshoot, and optimize data pipelines for performance and reliability. Maintain and enforce data quality, governance, and documentation standards. Collaborate with data analysts, architects, and DevOps teams in a cloud-native environment.

Must-Have Skills: Strong experience with Azure Cloud Platform services. Proven expertise in Azure Data Factory (ADF) for orchestrating and automating data pipelines. Proficiency in SQL for data analysis and transformation. Hands-on experience with Snowflake and SnowSQL for data warehousing. Practical knowledge of DBT (Data Build Tool) for transforming data in the warehouse. Experience working in cloud-based data environments with large-scale datasets.

Good-to-Have Skills: Experience with Azure Data Lake, Azure Synapse, or Azure Functions. Familiarity with Python or PySpark for custom data transformations. Understanding of CI/CD pipelines and DevOps for data workflows. Exposure to data governance, metadata management, or data catalog tools. Knowledge of business intelligence tools (e.g., Power BI, Tableau) is a plus.

Qualifications: Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field. 5+ years of experience in data engineering roles using Azure and Snowflake. Strong problem-solving, communication, and collaboration skills.

Posted 4 days ago

Apply

4.0 - 9.0 years

12 - 22 Lacs

Pune, Bengaluru, Delhi / NCR

Work from Office

We urgently need Snowflake Developer profiles with hands-on experience in Snowflake coding, advanced SQL, and Python.

Posted 4 days ago

Apply

5.0 - 8.0 years

6 - 10 Lacs

Hyderabad

Work from Office

We are looking for a skilled Senior Data Engineer with 5-8 years of experience to join our team at IDESLABS PRIVATE LIMITED. The ideal candidate will have a strong background in data engineering and excellent problem-solving skills.

Roles and Responsibilities: Design, develop, and implement large-scale data pipelines and architectures. Collaborate with cross-functional teams to identify and prioritize project requirements. Develop and maintain complex data systems and databases. Ensure data quality, integrity, and security. Optimize data processing workflows for improved performance and efficiency. Troubleshoot and resolve technical issues related to data engineering.

Job Requirements: Strong knowledge of data engineering principles and practices. Experience with data modeling, database design, and data warehousing. Proficiency in programming languages such as Python, Java, or C++. Excellent problem-solving skills and attention to detail. Ability to work collaboratively in a team environment. Strong communication and interpersonal skills.

Posted 4 days ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Bengaluru

Work from Office

Explore an Exciting Career at Accenture. Do you believe in creating an impact? Are you a problem solver who enjoys working on transformative strategies for global clients? Are you passionate about being part of an inclusive, diverse and collaborative culture? Then this is the right place for you! Welcome to a host of exciting global opportunities in Accenture Technology Strategy & Advisory.

The Practice - A Brief Sketch: The Technology Strategy & Advisory Practice focuses on the clients' most strategic priorities. We help clients achieve growth and efficiency through innovative R&D transformation, aimed at redefining business models using agile methodologies. As part of this high-performing team, you will work on scaling Data & Analytics, and the data that fuels it all, to power every single person and every single process. You will be part of our global team of experts who work on scalable solutions and services that help clients achieve their business objectives faster.

Business Transformation: Assessment of Data & Analytics potential and development of use cases that can transform business.
Transforming Businesses: Envisioning and designing customized, next-generation data and analytics products and services that help clients shift to new business models designed for today's connected landscape of disruptive technologies.
Formulation of Guiding Principles and Components: Assessing impact on clients' technology landscape/architecture and ensuring formulation of relevant guiding principles and platform components.
Products and Frameworks: Evaluate existing data and analytics products and frameworks and develop options for proposed solutions.

Bring your best skills forward to excel in the role: Leverage your knowledge of technology trends across Data & Analytics and how they can be applied to address real-world problems and opportunities. Interact with client stakeholders to understand their Data & Analytics problems and priority use-cases, define a problem statement, understand the scope of the engagement, and drive projects to deliver value to the client. Design and guide development of an enterprise-wide Data & Analytics strategy for our clients, including Data & Analytics architecture, data on cloud, data quality, and metadata and master data strategy. Establish a framework for effective data governance across multi-speed implementations; define data ownership, standards, policies and associated processes. Define a Data & Analytics operating model to manage data across the organization; establish processes around effective data management, ensuring data quality and governance standards as well as roles for data stewards. Benchmark against global research benchmarks and leading industry peers to understand the current state and recommend Data & Analytics solutions. Conduct discovery workshops and design sessions to elicit Data & Analytics opportunities and client pain areas. Develop and drive Data Capability Maturity Assessment, Data & Analytics operating model and data governance exercises for clients. A fair understanding of data platform strategy for data-on-cloud migrations, big data technologies, and large-scale data lake and DW-on-cloud solutions. Utilize strong expertise and certification in any of the Data & Analytics cloud platforms: Google, Azure or AWS. Collaborate with business experts for business understanding, with other consultants and platform engineers for solutions, and with technology teams for prototyping and client implementations. Create expert content and use advanced presentation, public speaking, content creation and communication skills for C-level discussions. Demonstrate strong understanding of a specific industry, client or technology and function as an expert to advise leadership. Manage budgeting and forecasting activities and build financial proposals.

Qualification - Your experience counts! MBA from a tier-1 institute. 5-7 years of strategy consulting experience at a consulting firm. 3+ years of experience on projects showcasing skills across these capabilities: Data Capability Maturity Assessment, Data & Analytics Strategy, Data Operating Model & Governance, Data on Cloud Strategy, Data Architecture Strategy. At least 2 years of experience architecting or designing solutions for any two of these domains: data quality, master data (MDM), metadata, data lineage, data catalog. Experience in one or more technologies in the data governance space: Collibra, Talend, Informatica, SAP MDG, Stibo, Alteryx, Alation, etc. 3+ years of experience designing end-to-end enterprise Data & Analytics strategic solutions leveraging cloud and non-cloud platforms such as AWS, Azure, GCP, AliCloud, Snowflake, Hadoop, Cloudera, Informatica, Palantir. Deep understanding of the data supply chain and building value-realization frameworks for data transformations. 3+ years of experience leading or managing teams effectively, including planning/structuring analytical work, facilitating team workshops, and developing Data & Analytics strategy recommendations as well as developing POCs. Foundational understanding of data privacy is desired. Mandatory knowledge of IT and enterprise architecture concepts through practical experience, and knowledge of technology trends, e.g. mobility, cloud, digital, collaboration. A strong understanding of any of the following industries is preferred: Financial Services, Retail, Consumer Goods, Telecommunications, Life Sciences, Transportation, Hospitality, Automotive/Industrial, Mining and Resources, or equivalent domains. CDMP certification from DAMA preferred. Cloud Data & AI practitioner certifications (Azure, AWS, Google) desirable but not essential.

Posted 4 days ago

Apply

7.0 - 12.0 years

30 - 45 Lacs

Noida, Pune, Gurugram

Hybrid

Role: Lead Data Engineer
Experience: 7-12 years

Must-Have: 7+ years of relevant experience in data engineering and delivery. 7+ years of relevant work experience in big data concepts. Worked on cloud implementations. Experience in Snowflake, SQL, and AWS (Glue, EMR, S3, Aurora, RDS, AWS architecture). Good experience with AWS cloud and microservices: AWS Glue, S3, Python, and PySpark. Good aptitude, strong problem-solving abilities, analytical skills, and ability to take ownership as appropriate. Should be able to do coding, debugging, performance tuning, and deploying apps to the production environment. Experience working in Agile methodology. Ability to learn, and help the team learn, new technologies quickly. Excellent communication and coordination skills.

Good to have: Experience in DevOps tools (Jenkins, Git, etc.) and practices, continuous integration, and delivery (CI/CD) pipelines. Spark, Python, SQL (exposure to Snowflake), big data concepts, AWS Glue. Worked on cloud implementations (migration, development, etc.).

Role & Responsibilities: Be accountable for the delivery of the project within the defined timelines with good quality. Work with the clients and offshore leads to understand requirements, come up with high-level designs, and complete development and unit-testing activities. Keep all stakeholders updated about task and project status, risks, and issues, if there are any. Work closely with management wherever and whenever required, to ensure smooth execution and delivery of the project. Guide the team technically and give the team direction on how to plan, design, implement, and deliver the projects.

Education: BE/B.Tech from a reputed institute.

Posted 4 days ago

Apply

5.0 - 8.0 years

7 - 11 Lacs

Hyderabad

Work from Office

We are looking for a skilled Big Data professional with 4 to 9 years of experience to join our team at IDESLABS PRIVATE LIMITED. The ideal candidate will have a strong background in big data and excellent analytical skills.

Roles and Responsibilities: Design, develop, and implement big data solutions using various technologies. Collaborate with cross-functional teams to identify business problems and develop solutions. Develop and maintain large-scale data systems and architectures. Analyze complex data sets to extract insights and trends. Implement data quality checks and ensure data integrity. Stay updated with industry trends and emerging technologies in big data.

Job Requirements: Strong understanding of big data concepts and technologies. Excellent analytical and problem-solving skills. Ability to work in a fast-paced environment and meet deadlines. Strong communication and collaboration skills. Experience with big data tools and frameworks such as Hadoop, Spark, and NoSQL databases. Strong attention to detail and ability to deliver high-quality results.

Posted 4 days ago

Apply

5.0 - 8.0 years

20 - 35 Lacs

Bengaluru

Work from Office

Key Responsibilities:
• Design, develop, and optimize data models within the Celonis Execution Management System (EMS).
• Extract, transform, and load (ETL) data from flat files and UDP into Celonis.
• Work closely with business stakeholders and data analysts to understand data requirements and ensure accurate representation of business processes.
• Develop and optimize PQL (Process Query Language) queries for process mining.
• Collaborate with group data engineers, architects, and analysts to ensure high-quality data pipelines and scalable solutions.
• Perform data validation, cleansing, and transformation to enhance data quality.
• Monitor and troubleshoot data integration pipelines, ensuring performance and reliability.
• Provide guidance and best practices for data modeling in Celonis.

Qualifications & Skills:
• 5+ years of experience in data engineering, data modeling, or related roles.
• Proficiency in SQL, ETL processes, and database management (e.g., PostgreSQL, Snowflake, BigQuery, or similar).
• Experience working with large-scale datasets and optimizing data models for performance.
• Data management experience that spans the data lifecycle and critical functions (e.g., data profiling, data modeling, data engineering, data consumption products and services).
• Strong problem-solving skills and ability to work in an agile, fast-paced environment.
• Excellent communication skills and demonstrated hands-on experience communicating technical topics with non-technical audiences.
• Ability to effectively collaborate and manage the timely completion of assigned activities while working in a highly virtual team environment.
• Excellent collaboration skills to work with cross-functional teams.
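The "data validation, cleansing, and transformation" work this listing describes often reduces to small, composable steps applied record by record. As a hedged plain-Python sketch (the field names are hypothetical, and a real Celonis load would go through its own ETL interfaces rather than code like this), one such cleansing step might look like:

```python
# Illustrative cleansing step: trim whitespace, normalize empty strings to
# None, and coerce a numeric field. Field names are made up for the example.

def clean_record(rec):
    out = {}
    for key, value in rec.items():
        if isinstance(value, str):
            value = value.strip() or None   # "" and "   " become None
        out[key] = value
    # Coerce the (hypothetical) duration field to float where possible.
    if out.get("duration_sec") is not None:
        try:
            out["duration_sec"] = float(out["duration_sec"])
        except ValueError:
            out["duration_sec"] = None      # unparseable values become None
    return out

raw = {"case_id": " C-17 ", "activity": "Approve Invoice", "duration_sec": "42.5"}
print(clean_record(raw))
# {'case_id': 'C-17', 'activity': 'Approve Invoice', 'duration_sec': 42.5}
```

Keeping each rule in a function like this makes the pipeline easy to test and to extend when new source files arrive with new quirks.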

Posted 4 days ago

Apply

4.0 - 8.0 years

20 - 25 Lacs

Hyderabad, Bengaluru, Mumbai (All Areas)

Work from Office

Build reliable backend services and cloud-native applications. Run Kubernetes with best practices in availability, monitoring, and cost-efficiency. Implement and manage CI/CD pipelines and infrastructure automation. Collaborate with frontend. Required candidate profile: Kubernetes, a cloud platform (GCP, AWS, Azure, or OCI), backend programming (Python, Java, or Kotlin). Strong hands-on experience with Kubernetes, with at least 2 years in production environments.

Posted 4 days ago

Apply

4.0 - 9.0 years

3 - 8 Lacs

Bengaluru

Work from Office

We are organizing a direct walk-in drive at our Bengaluru location. Please find below the details and skills for which we have a walk-in at TCS Bengaluru on 2nd July 2025. Experience: 4-10 years. Skills: (1) Snowflake Engineers (2) PySpark Engineers (3) Python Engineers (4) Data Solution Designers (5) Ataccama Specialists (6) DBT Engineers (7) Glue Engineers (8) Senior Data Modellers.

Posted 4 days ago

Apply

4.0 - 7.0 years

7 - 11 Lacs

Indore, Pune

Work from Office

What will your role look like: Assist business analysts in developing, maintaining and supporting operational/live reports, dashboards, and scorecards using Microsoft Power BI. Implement row-level security by defining various constraints for each defined role.

Why you will love this role: Besides a competitive package, an open workspace full of smart and pragmatic team members, with ever-growing opportunities for professional and personal growth. Be a part of a learning culture where teamwork and collaboration are encouraged, diversity is valued, and excellence, compassion, openness and ownership are rewarded.

We would like you to bring along: Power BI desktop, mobile and service development. MSBI (SSIS, Tabular SSAS, SSRS) with DAX. SQL Server 2012/2014/2016 T-SQL development knowledge. Snowflake. MicroStrategy. Informatica PowerCenter. ADF and other Azure BI technologies. Experience in creating dashboards, volume reports, operating summaries, presentations and graphs. SSRS integration with Power BI. Knowledge of SSAS. Experience with Data Gateway for data refreshing and the Content Pack Library. Experience managing embed codes; knowledge of Power BI Mobile. Knowledge of SQL Server 2012 or later versions; SQL Server 2016 is most preferable. Proficient at data visualization using Power BI, with strong application development skills. Knowledge of Azure is preferable. Experience creating calculated measures and columns with DAX in MS Power BI Desktop. Experience with custom visuals and groups usage. Expert at publishing reports to app.powerbi.com and setting up the necessary connection details and scheduling. Expert knowledge of connecting Microsoft Power BI Desktop to various data sources, including SQL Server and SSAS. Expert knowledge of advanced calculations using MS Power BI Desktop (aggregate, date, logical, string, table). Expert at creating different visualizations using slicers, lines, pies, histograms, maps, scatter plots, bullets, heat maps, tree maps, etc.

Posted 4 days ago

Apply

3.0 - 4.0 years

8 - 13 Lacs

Noida, Gurugram

Work from Office

R1 India is proud to be recognized amongst the Top 25 Best Companies to Work For 2024 by the Great Place to Work Institute. This is our second consecutive recognition on this prestigious Best Workplaces list, building on the Top 50 recognition we achieved in 2023. Our focus on employee wellbeing, inclusion and diversity is demonstrated through prestigious recognitions, with R1 India being ranked amongst the Best in Healthcare, the Top 100 Best Companies for Women by Avtar & Seramount, and the Top 10 Best Workplaces in Health & Wellness. We are committed to transforming the healthcare industry with our innovative revenue cycle management services. Our goal is to make healthcare work better for all by enabling efficiency for healthcare systems, hospitals, and physician practices. With over 30,000 employees globally, we are about 16,000+ strong in India, with a presence in Delhi NCR, Hyderabad, Bangalore, and Chennai. Our inclusive culture ensures that every employee feels valued, respected, and appreciated, with a robust set of employee benefits and engagement activities.

Position Title: Specialist
Reports to: Program Manager - Analytics BI
Location: Noida

Position summary: The Specialist will work with the development team and be responsible for development tasks as an individual contributor. He/she should be technically sound and able to communicate with clients effectively.

Key duties & responsibilities: Work as a Specialist on a data engineering project for end-to-end analytics. Ensure project delivery on time. Mentor other teammates and guide them. Take requirements from the client and communicate with them. Ensure timely creation of documents for the knowledge base, user guides, and various other communications systems. Ensure delivery against business needs, team goals and objectives, i.e., meeting commitments and coordinating the overall schedule. Work with large datasets in various formats, integrity/QA checks, and reconciliation for accounting systems. Lead efforts to troubleshoot and solve process- or system-related issues. Understand, support, enforce and comply with company policies, procedures and Standards of Business Ethics and Conduct. Experience working with Agile methodology.

Experience, Skills and Knowledge: Bachelor's degree in computer science or equivalent experience is required; B.Tech/MCA preferable. Minimum 3-4 years' experience. Excellent communication skills and a strong commitment to delivering the highest level of service.

Technical Skills: Expert knowledge and experience working with Spark and Scala. Experience in Azure Data Factory, Azure Databricks, and Data Lake. Experience working with SQL and Snowflake. Experience with data integration tools such as SSIS and ADF. Expertise in Astronomer Airflow. Experience with programming languages such as Python, Spark, and Scala. Experience or exposure to Microsoft Azure Data Fundamentals.

Key competency profile: Own your development by implementing and sharing your learnings. Motivate each other to perform at our highest level. Work the right way by acting with integrity and living our values every day. Succeed by proactively identifying problems and solutions for yourself and others. Communicate effectively if there is any challenge. Accountability and responsibility are expected.

Working in an evolving healthcare setting, we use our shared expertise to deliver innovative solutions. Our fast-growing team has opportunities to learn and grow through rewarding interactions, collaboration and the freedom to explore professional interests. Our associates are given valuable opportunities to contribute, to innovate and create meaningful work that makes an impact in the communities we serve around the world. We also offer a culture of excellence that drives customer success and improves patient care. We believe in giving back to the community and offer a competitive benefits package. To learn more, visit r1rcm.com or visit us on Facebook.

Posted 4 days ago

Apply

4.0 - 9.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Project Role: Software Development Engineer
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must-have skills: IBM InfoSphere DataStage
Good-to-have skills: Snowflake Data Warehouse
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As a Software Development Engineer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code for multiple clients. Your typical day will involve collaborating with team members to ensure the successful execution of projects, performing maintenance and enhancements, and contributing to the development of innovative solutions that meet client needs. You will be responsible for delivering high-quality code while adhering to best practices and standards in software development.

Roles & Responsibilities: Expected to be an SME; collaborate with and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute to key decisions. Provide solutions to problems for the immediate team and across multiple teams. Facilitate knowledge-sharing sessions to enhance team capabilities. Mentor junior professionals to foster their growth and development.

Professional & Technical Skills: Must-have: proficiency in IBM InfoSphere DataStage; experience with Snowflake Data Warehouse. Strong understanding of ETL processes and data integration techniques. Minimum 4 years of experience with database management and SQL. Familiarity with data warehousing concepts and best practices.

Additional Information: The candidate should have a minimum of 7 years of experience in IBM InfoSphere DataStage & Snowflake. This position is based at our Bengaluru office. 15 years of full-time education is required.

Posted 4 days ago

Apply

5.0 - 7.0 years

9 - 14 Lacs

Noida, Gurugram

Work from Office

R1 India is proud to be recognized amongst Top 25 Best Companies to Work For 2024, by the Great Place to Work Institute. This is our second consecutive recognition on this prestigious Best Workplaces list, building on the Top 50 recognition we achieved in 2023. Our focus on employee wellbeing and inclusion and diversity is demonstrated through prestigious recognitions with R1 India being ranked amongst Best in Healthcare, Top 100 Best Companies for Women by Avtar & Seramount, and amongst Top 10 Best Workplaces in Health & Wellness. We are committed to transform the healthcare industry with our innovative revenue cycle management services. Our goal is to make healthcare work better for all by enabling efficiency for healthcare systems, hospitals, and physician practices. With over 30,000 employees globally, we are about 16,000+ strong in India with presence in Delhi NCR, Hyderabad, Bangalore, and Chennai. Our inclusive culture ensures that every employee feels valued, respected, and appreciated with a robust set of employee benefits and engagement activities. Position Title Senior Specialist Reports to Program Manager- Analytics BI Position summary: A Specialist shall work with the development team and responsible for development task as individual contribution. He/she should be able to mentor team and able to help in resolving issues. He/she should be technical sound and able to communicate with client perfectly. Key duties & responsibilities Work as Lead Developer Data engineering project for E2E Analytics. Ensure Project delivery on time. Mentor other team mates and guide them. Will take the requirement from client and communicate as well. Ensure Timely documents creation for knowledge base, user guides, and other various communications systems. Ensures delivery against Business needs, team goals and objectives, i.e., meeting commitments and coordinating overall schedule. 
Work with large datasets in various formats, performing integrity/QA checks and reconciliation for accounting systems
Lead efforts to troubleshoot and resolve process- or system-related issues
Understand, support, enforce, and comply with company policies, procedures, and the Standards of Business Ethics and Conduct

Experience, skills and knowledge
Bachelor's degree in Computer Science or equivalent experience is required; B.Tech/MCA preferred
Minimum 5-7 years of experience
Excellent communication skills and a strong commitment to delivering the highest level of service
Experience working with Agile methodology

Technical skills
Expert knowledge of and experience working with Spark and Scala
Experience with Azure Data Factory, Azure Databricks, and Data Lake
Experience working with SQL and Snowflake
Experience with data integration tools such as SSIS and ADF
Experience with programming languages such as Python
Expertise in Astronomer Airflow
Experience with or exposure to Microsoft Azure Data Fundamentals

Key competency profile
Own your development by implementing and sharing your learnings
Motivate each other to perform at our highest level
Work the right way by acting with integrity and living our values every day
Succeed by proactively identifying problems and solutions for yourself and others
Communicate effectively when there is a challenge
Demonstrate accountability and responsibility

Working in an evolving healthcare setting, we use our shared expertise to deliver innovative solutions. Our fast-growing team has opportunities to learn and grow through rewarding interactions, collaboration, and the freedom to explore professional interests. Our associates are given valuable opportunities to contribute, to innovate, and to create meaningful work that makes an impact in the communities we serve around the world. We also offer a culture of excellence that drives customer success and improves patient care.
We believe in giving back to the community and offer a competitive benefits package. To learn more, visit r1rcm.com or find us on Facebook.

Posted 4 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies