
1503 Snowflake Jobs - Page 3

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office


Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Snowflake Data Warehouse, Functional Testing
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives.

Key Responsibilities:
- Overall 12+ years of data experience, including 5+ years on Snowflake and 3+ years on DBT (Core and Cloud)
- Play a key role in DBT-related discussions with teams and clients to understand business problems and solutioning requirements
- As a DBT SME, liaise with clients on business/technology/data transformation programs; orchestrate the implementation of planned initiatives and the realization of business outcomes
- Spearhead the team in translating business goals and challenges into practical data transformation and technology roadmaps and data architecture designs
- Strong experience in designing, architecting, and administering Snowflake solutions and deploying data analytics solutions in Snowflake
- Strong inclination for practice building, including spearheading thought leadership discussions and managing team activities

Technical Experience:
- Strong experience working as a Snowflake-on-cloud DBT Data Architect with thorough knowledge of the different services
- Ability to architect solutions from on-premises to cloud and create end-to-end data pipelines using DBT
- Excellent process knowledge in one or more of the following areas: Finance, Healthcare, Customer Experience
- Experience working on client proposals (RFPs), estimation, and POCs/POVs on new Snowflake features
- DBT (Core and Cloud) end-to-end migration experience, including refactoring SQL for modularity, DBT modeling (.sql or .py files), and DBT job scheduling, on at least 2 projects
- Knowledge of the Jinja template language (macros) is an added advantage
- Knowledge of special features such as DBT documentation, semantic layer creation, webhooks, etc.
- DBT and cloud certification is important
- Develop, fine-tune, and integrate LLM models (OpenAI, Anthropic, Mistral, etc.) into enterprise workflows via Cortex AI
- Deploy AI agents capable of reasoning, tool use, chaining, and task orchestration for knowledge retrieval and decision support
- Guide the creation and management of GenAI assets such as prompts, embeddings, semantic indexes, agents, and custom bots
- Collaborate with data engineers, ML engineers, and the leadership team to translate business use cases into GenAI-driven solutions
- Provide mentorship and technical leadership to a small team of engineers working on GenAI initiatives
- Stay current with advancements in Snowflake, LLMs, and generative AI frameworks to continuously enhance solution capabilities
- Good understanding of SQL and Python; the architectural concepts of Snowflake should also be clear

Professional Attributes:
- Client management, stakeholder management, collaboration, and interpersonal and relationship-building skills
- Ability to create innovative solutions for key business challenges
- Eagerness to learn and develop oneself on an ongoing basis
- Structured communication: written, verbal, and presentational

Educational Qualification: MBA (Technology/Data-related specializations), MCA, or advanced degrees in STEM; 15 years full time education
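Since this listing leans heavily on DBT modeling in .sql or .py files on Snowflake, here is a minimal sketch of what a dbt Python model looks like, assuming a dbt project targeting Snowflake (dbt runs Python models via Snowpark); the model, upstream table, and column names are hypothetical.

```python
# models/completed_orders.py -- a minimal dbt Python model sketch for Snowflake.
# dbt injects the `dbt` object and a Snowpark `session`; names are hypothetical.

def model(dbt, session):
    dbt.config(materialized="table")   # materialize the result as a table

    orders = dbt.ref("raw_orders")     # Snowpark DataFrame for an upstream model

    # Keep completed orders only; Snowflake folds identifiers to upper case
    return orders.filter(orders["STATUS"] == "COMPLETED")
```

In a dbt project this file sits under models/ next to the .sql models, and `dbt run` builds it in dependency order with the rest of the DAG.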

Posted 1 day ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office


Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Snowflake Data Warehouse, Manual Testing
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives.

Key Responsibilities:
- Overall 12+ years of data experience, including 5+ years on Snowflake and 3+ years on DBT (Core and Cloud)
- Play a key role in DBT-related discussions with teams and clients to understand business problems and solutioning requirements
- As a DBT SME, liaise with clients on business/technology/data transformation programs; orchestrate the implementation of planned initiatives and the realization of business outcomes
- Spearhead the team in translating business goals and challenges into practical data transformation and technology roadmaps and data architecture designs
- Strong experience in designing, architecting, and administering Snowflake solutions and deploying data analytics solutions in Snowflake
- Strong inclination for practice building, including spearheading thought leadership discussions and managing team activities

Technical Experience:
- Strong experience working as a Snowflake-on-cloud DBT Data Architect with thorough knowledge of the different services
- Ability to architect solutions from on-premises to cloud and create end-to-end data pipelines using DBT
- Excellent process knowledge in one or more of the following areas: Finance, Healthcare, Customer Experience
- Experience working on client proposals (RFPs), estimation, and POCs/POVs on new Snowflake features
- DBT (Core and Cloud) end-to-end migration experience, including refactoring SQL for modularity, DBT modeling (.sql or .py files), and DBT job scheduling, on at least 2 projects
- Knowledge of the Jinja template language (macros) is an added advantage
- Knowledge of special features such as DBT documentation, semantic layer creation, webhooks, etc.
- DBT and cloud certification is important
- Develop, fine-tune, and integrate LLM models (OpenAI, Anthropic, Mistral, etc.) into enterprise workflows via Cortex AI
- Deploy AI agents capable of reasoning, tool use, chaining, and task orchestration for knowledge retrieval and decision support
- Guide the creation and management of GenAI assets such as prompts, embeddings, semantic indexes, agents, and custom bots
- Collaborate with data engineers, ML engineers, and the leadership team to translate business use cases into GenAI-driven solutions
- Provide mentorship and technical leadership to a small team of engineers working on GenAI initiatives
- Stay current with advancements in Snowflake, LLMs, and generative AI frameworks to continuously enhance solution capabilities
- Good understanding of SQL and Python; the architectural concepts of Snowflake should also be clear

Professional Attributes:
- Client management, stakeholder management, collaboration, and interpersonal and relationship-building skills
- Ability to create innovative solutions for key business challenges
- Eagerness to learn and develop oneself on an ongoing basis
- Structured communication: written, verbal, and presentational

Educational Qualification: MBA (Technology/Data-related specializations), MCA, or advanced degrees in STEM; 15 years full time education

Posted 1 day ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office


Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: Data Engineering
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives.

Key Responsibilities:
- Overall 12+ years of data experience, including 5+ years on Snowflake and 3+ years on DBT (Core and Cloud)
- Play a key role in DBT-related discussions with teams and clients to understand business problems and solutioning requirements
- As a DBT SME, liaise with clients on business/technology/data transformation programs; orchestrate the implementation of planned initiatives and the realization of business outcomes
- Spearhead the team in translating business goals and challenges into practical data transformation and technology roadmaps and data architecture designs
- Strong experience in designing, architecting, and administering Snowflake solutions and deploying data analytics solutions in Snowflake
- Strong inclination for practice building, including spearheading thought leadership discussions and managing team activities

Technical Experience:
- Strong experience working as a Snowflake-on-cloud DBT Data Architect with thorough knowledge of the different services
- Ability to architect solutions from on-premises to cloud and create end-to-end data pipelines using DBT
- Excellent process knowledge in one or more of the following areas: Finance, Healthcare, Customer Experience
- Experience working on client proposals (RFPs), estimation, and POCs/POVs on new Snowflake features
- DBT (Core and Cloud) end-to-end migration experience, including refactoring SQL for modularity, DBT modeling (.sql or .py files), and DBT job scheduling, on at least 2 projects
- Knowledge of the Jinja template language (macros) is an added advantage
- Knowledge of special features such as DBT documentation, semantic layer creation, webhooks, etc.
- DBT and cloud certification is important
- Develop, fine-tune, and integrate LLM models (OpenAI, Anthropic, Mistral, etc.) into enterprise workflows via Cortex AI
- Deploy AI agents capable of reasoning, tool use, chaining, and task orchestration for knowledge retrieval and decision support
- Guide the creation and management of GenAI assets such as prompts, embeddings, semantic indexes, agents, and custom bots
- Collaborate with data engineers, ML engineers, and the leadership team to translate business use cases into GenAI-driven solutions
- Provide mentorship and technical leadership to a small team of engineers working on GenAI initiatives
- Stay current with advancements in Snowflake, LLMs, and generative AI frameworks to continuously enhance solution capabilities
- Good understanding of SQL and Python; the architectural concepts of Snowflake should also be clear

Professional Attributes:
- Client management, stakeholder management, collaboration, and interpersonal and relationship-building skills
- Ability to create innovative solutions for key business challenges
- Eagerness to learn and develop oneself on an ongoing basis

Educational Qualification: MBA (Technology/Data-related specializations), MCA, or advanced degrees in STEM; 15 years full time education

Posted 1 day ago

Apply

2.0 - 6.0 years

4 - 8 Lacs

Kolkata, Hyderabad, Pune

Work from Office


Data Analyst

- Very good knowledge of Snowflake data warehouse, data infrastructure, data platforms, ETL implementation, and data modelling and design
- Ability to gather, view, and analyze data; apply statistical and data analysis techniques to identify patterns, trends, correlations, and anomalies in large datasets
- Utilize advanced analytical tools and programming languages (e.g., Python, R, SQL) to conduct exploratory data analysis
- Develop and implement data models to support predictive and prescriptive analytics
- Connect to data sources, import data, and transform data for Business Intelligence
- Strong hands-on experience in writing SQL, both basic and complex
- Good knowledge of the Snowflake cloud data platform
- Create clear and informative visualizations (charts, graphs, dashboards) to present insights to non-technical stakeholders
- Strong exposure to visualization, transformation, data analysis, and formatting skills
- Develop interactive dashboards and reports using data visualization tools (e.g., Snowsight, Tableau, Power BI) to facilitate data-driven decision-making
- Good-to-have knowledge of the Finance and Accounting domain
- Familiarity with cloud ecosystems
- Able to backtrack, perform deep analysis of issues, and provide root cause analysis (RCA)
- Good testing and documentation skills
- Able to work in a fast-paced, dynamic, client-facing role where delivering solid work products to exceed high expectations is a measure of success
- Ability to be creative and analytical in a problem-solving environment
- Effective verbal and written communication skills
- Adaptable to new environments, people, technologies, and processes
- Ability to manage ambiguity and solve undefined problems
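As a small illustration of the SQL-plus-Python exploratory analysis this posting describes, here is a minimal sketch using the Snowflake Python connector and pandas; the account credentials, table, and column names are all placeholders.

```python
# A minimal EDA sketch: pull an aggregate from Snowflake into pandas and flag
# outliers. Assumes snowflake-connector-python (with pyarrow for pandas fetch);
# credentials and table/column names below are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",        # placeholder credentials
    user="analyst",
    password="***",
    warehouse="ANALYTICS_WH",
    database="SALES_DB",
    schema="PUBLIC",
)

cur = conn.cursor()
cur.execute("SELECT region, SUM(amount) AS revenue FROM orders GROUP BY region")
df = cur.fetch_pandas_all()      # returns a pandas DataFrame (columns uppercase)

print(df.describe())             # quick summary statistics

# Simple anomaly flag: regions more than 2 standard deviations from the mean
df["ZSCORE"] = (df["REVENUE"] - df["REVENUE"].mean()) / df["REVENUE"].std()
print(df[df["ZSCORE"].abs() > 2])
conn.close()
```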

Posted 1 day ago

Apply

3.0 - 7.0 years

5 - 9 Lacs

Pune

Work from Office


Snowflake Data Engineer

We're looking for a candidate with a strong background in data technologies such as SQL Server, Snowflake, and similar platforms. In addition, they should bring experience in at least one other programming language, with proficiency in Python being a key requirement. The ideal candidate should also have:
- Exposure to DevOps pipelines within a data engineering context
- At least a high-level understanding of AWS services and how they fit into modern data architectures
- A proactive mindset: someone who is motivated to take initiative and contribute beyond assigned tasks

Posted 1 day ago

Apply

3.0 - 7.0 years

5 - 9 Lacs

Hyderabad, Pune

Work from Office


Snowflake Data Engineer

For this role, we're looking for a candidate with a strong background in data technologies such as SQL Server, Snowflake, and similar platforms. In addition, they should bring experience in at least one other programming language, with proficiency in Python being a key requirement. The ideal candidate should also have:
- Exposure to DevOps pipelines within a data engineering context
- At least a high-level understanding of AWS services and how they fit into modern data architectures
- A proactive mindset: someone who is motivated to take initiative and contribute beyond assigned tasks

Posted 1 day ago

Apply

8.0 - 13.0 years

13 - 17 Lacs

Noida

Work from Office


Join Us in Transforming Healthcare with the Power of Data & AI

At Innovaccer, we're on a mission to build the most advanced Healthcare Intelligence Platform ever created. Grounded in an AI-first design philosophy, our platform turns complex health data into real-time intelligence, empowering healthcare systems to make faster, smarter decisions. We are building a unified, end-to-end data platform that spans Data Acquisition & Integration, Master Data Management, Data Classification & Governance, Advanced Analytics & AI Studio, App Marketplace, AI-as-BI capabilities, and more. All of this is powered by an Agent-first approach, enabling customers to build solutions dynamically and at scale. You'll have the opportunity to define and develop platform capabilities that help healthcare organizations tackle some of the industry's most pressing challenges, such as kidney disease management, clinical trials optimization for pharmaceutical companies, supply chain intelligence for pharmacies, and many more real-world applications. We're looking for talented engineers and platform thinkers who thrive on solving large-scale, complex, and meaningful problems. If you're excited about working at the intersection of healthcare, AI, and cutting-edge platform engineering, we'd love to hear from you.

About the Role
We are looking for a Staff Engineer to design and develop highly scalable, low-latency data platforms and processing engines. This role is ideal for engineers who enjoy building core systems and infrastructure that enable mission-critical analytics at scale. You'll work on solving some of the toughest data engineering challenges in healthcare.

A Day in the Life
- Architect, design, and build scalable data tools and frameworks.
- Collaborate with cross-functional teams to ensure data compliance, security, and usability.
- Lead initiatives around metadata management, data lineage, and data cataloging.
- Define and evangelize standards and best practices across data engineering teams.
- Own the end-to-end lifecycle of tooling, from prototyping to production deployment.
- Mentor and guide junior engineers and contribute to technical leadership across the organization.
- Drive innovation in privacy-by-design, regulatory compliance (e.g., HIPAA), and data observability solutions.

What You Need
- 8+ years of experience in software engineering, with strong experience building distributed systems.
- Proficiency in backend development (Python, Java, Scala, or Go) and familiarity with RESTful API design.
- Expertise in modern data stacks: Kafka, Spark, Airflow, Snowflake, etc.
- Experience with open-source data governance frameworks like Apache Atlas, Amundsen, or DataHub is a big plus.
- Familiarity with cloud platforms (AWS, Azure, GCP) and their native data governance offerings.
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
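To make the orchestration side of the stack named above (Airflow alongside Kafka, Spark, and Snowflake) concrete, here is a minimal Airflow DAG sketch; the task logic, names, and daily schedule are assumptions, and Airflow 2.4+ is assumed for the `schedule` argument.

```python
# A minimal Airflow DAG sketch: a two-step extract-then-load pipeline of the
# kind this role orchestrates. Task bodies are stand-ins, names hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull batch from source")   # stand-in for real ingestion logic

def load():
    print("load into Snowflake")      # stand-in for a COPY INTO / connector call

with DAG(
    dag_id="example_ingest",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",                # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task         # load runs only after extract succeeds
```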

Posted 1 day ago

Apply

8.0 - 13.0 years

16 - 20 Lacs

Noida

Work from Office


Who We Are
Build a brighter future while learning and growing with a Siemens company at the intersection of technology, community, and sustainability. Our global team of innovators is always looking to create meaningful solutions to some of the toughest challenges facing our world. Find out how far your passion can take you.

About the Job
At Brightly, our dedication to innovation drives our product management team to create products that address our customers' evolving needs. As a Senior Product Manager for Data, you will work in a collaborative, energetic, dynamic, and creative environment to drive the product strategy of market-leading data products for our emerging and existing vertical-focused products and services. Reporting to the Director of Product Management, you will play a crucial role in shaping a forward-thinking Data & AI strategy aligned with market needs.

What you'll be doing
Your key responsibility is to develop and execute a comprehensive product strategy aligned with market demands and business goals through:
- Drive monetization: Build new high-value offers on our Snowflake Data Cloud. Build data-as-a-product to drive revenue. Enable the Reporting, Analytics, and AI roadmap.
- Data-driven decision making: Utilize analytics and data to drive product decisions, measure success, and iterate on features.
- Market analysis: Stay up to date with industry trends, competitor products, and emerging technologies to ensure our data products remain competitive.
- Stakeholder management: Collaborate with stakeholders across the organization to align on product goals, priorities, financials, and timelines. Exposure to working with Legal to ensure compliance as well as data governance, data integrity, and data retention.
- Customer focus: Deep empathy for users and a passion for creating delightful product experiences.
- User research and insights: Conduct user research and gather insights to inform product decisions and ensure the data products meet user needs and expectations.
- Resource management: Identify value-driven opportunities (including establishing TAM, SAM, and SOM); manage, understand, and share the financial outlook; align resources effectively to drive success for your domain.

What you'll need
- Education: Bachelor's degree and an advanced degree in a technical field, or an MBA from a top business school (IIM, XLRI, ISB), or equivalent experience.
- Experience: Overall 8+ years of experience, with at least 2 years in a business-facing role, preferably in SaaS/PaaS.
- Adaptability: Comfortable navigating and prioritizing in situations of ambiguity, especially in the early stages of discovery and product development. Motivated self-starter with the ability to learn and adapt.
- Communication skills: Strong communication and social skills; ability to work across teams with geographically remote team members, including the ability to frame complex concepts for a non-technical audience.
- Influence: Demonstrated ability to influence without authority and communicate to multi-level audiences, including growing and mentoring more junior product peers.

Who we are
Brightly, the global leader in intelligent asset management solutions, enables organizations to transform the performance of their assets. Brightly's sophisticated cloud-based platform leverages more than 20 years of data to deliver predictive insights that help users through the key phases of the entire asset lifecycle. More than 12,000 clients of every size worldwide depend on Brightly's complete suite of intuitive software, including CMMS, EAM, Strategic Asset Management, IoT Remote Monitoring, Sustainability, and Community Engagement. Paired with award-winning training, support, and consulting services, Brightly helps light the way to a bright future with smarter assets and sustainable communities.

The Brightly culture
Service. Ingenuity. Integrity. Together. These values are core to who we are and help us make the best decisions, manage change, and provide the foundations for our future. These guiding principles help us innovate, flourish, and make a real impact in the businesses and communities we help to thrive. We are committed to the great experiences that nurture our employees and the people we serve while protecting the environments in which we live. Together we are Brightly.

Posted 1 day ago

Apply

7.0 - 12.0 years

14 - 18 Lacs

Noida

Work from Office


Who We Are
Build a brighter future while learning and growing with a Siemens company at the intersection of technology, community, and sustainability. Our global team of innovators is always looking to create meaningful solutions to some of the toughest challenges facing our world. Find out how far your passion can take you.

What you need
* BS in an Engineering or Science discipline, or equivalent experience
* 7+ years of software/data engineering experience using Java, Scala, and/or Python, with at least 5 years' experience in a data-focused role
* Experience in data integration (ETL/ELT) development using multiple languages (e.g., Java, Scala, Python, PySpark, SparkSQL)
* Experience building and maintaining data pipelines supporting a variety of integration patterns (batch, replication/CDC, event streaming) and data lake/warehouse in production environments
* Experience with AWS-based data services technologies (e.g., Kinesis, Glue, RDS, Athena, etc.) and Snowflake CDW
* Experience working on larger initiatives building and rationalizing large-scale data environments with a wide variety of data pipelines, possibly with internal and external partner integrations, would be a plus
* Willingness to experiment and learn new approaches and technology applications
* Knowledge of and experience with various relational databases, and demonstrable proficiency in SQL and supporting analytics uses and users
* Knowledge of software engineering and agile development best practices
* Excellent written and verbal communication skills

The Brightly culture
We're guided by a vision of community that serves the ambitions and wellbeing of all people, and our professional communities are no exception. We model that ideal every day by being supportive, collaborative partners to one another, conscientiously making space for our colleagues to grow and thrive. Our passionate team is driven to create a future where smarter infrastructure protects the environments that shape and connect us all. That brighter future starts with us.

Posted 1 day ago

Apply

8.0 - 13.0 years

8 - 12 Lacs

Bengaluru

Work from Office


Hello Talented Techie!

We provide support in Project Services and Transformation, Digital Solutions and Delivery Management. We offer joint operations and digitalization services for Global Business Services and work closely alongside the entire Shared Services organization. We make optimal use of the possibilities of new technologies such as Business Process Management (BPM) and Robotics as enablers for efficient and effective processes.

We are looking for a Sr. AWS Cloud Architect.
- Architect and Design: Develop scalable and efficient data solutions using AWS services such as AWS Glue, Amazon Redshift, S3, Kinesis (Apache Kafka), DynamoDB, Lambda, AWS Glue streaming ETL, and EMR.
- Integration: Integrate real-time data from various Siemens organizations into our data lake, ensuring seamless data flow and processing.
- Data Lake Management: Design and manage a large-scale data lake using AWS services like S3, Glue, and Lake Formation.
- Data Transformation: Apply various data transformations to prepare data for analysis and reporting, ensuring data quality and consistency.
- Snowflake Integration: Implement and manage data pipelines to load data into Snowflake, utilizing Iceberg tables for optimal performance and flexibility.
- Performance Optimization: Optimize data processing pipelines for performance, scalability, and cost-efficiency.
- Security and Compliance: Ensure that all solutions adhere to security best practices and compliance requirements.
- Collaboration: Work closely with cross-functional teams, including data engineers, data scientists, and application developers, to deliver end-to-end solutions.
- Monitoring and Troubleshooting: Implement monitoring solutions to ensure the reliability and performance of data pipelines; troubleshoot and resolve any issues that arise.

You'd describe yourself as:
- Experience: 8+ years of experience in data engineering or cloud solutioning, with a focus on AWS services.
- Technical skills: Proficiency in AWS services such as AWS API, AWS Glue, Amazon Redshift, S3, Apache Kafka, and Lake Formation; experience with real-time data processing and streaming architectures.
- Big data querying tools: Strong knowledge of big data querying tools (e.g., Hive, PySpark).
- Programming: Strong programming skills in languages such as Python, Java, or Scala for building and maintaining scalable systems.
- Problem-solving: Excellent problem-solving skills and the ability to troubleshoot complex issues.
- Communication: Strong communication skills, with the ability to work effectively with both technical and non-technical stakeholders.
- Certifications: AWS certifications are a plus.

Create a better #TomorrowWithUs! This role, based in Bangalore, is an individual contributor position. You may be required to visit other locations within India and internationally. In return, you'll have the opportunity to work with teams shaping the future. At Siemens, we are a collection of over 312,000 minds building the future, one day at a time, worldwide. Find out more about Siemens careers at
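To make the event-driven ingestion pattern above concrete, here is a hedged sketch of one common wiring: an AWS Lambda handler (Python, boto3) that starts a Glue ETL job when a new object lands in S3. The Glue job name and argument key are hypothetical, not part of the posting.

```python
# A sketch of an event-driven step: S3 put event -> Lambda -> Glue job run.
# The job name "load_to_lake" and the "--input_key" argument are assumptions.
import boto3

glue = boto3.client("glue")

def lambda_handler(event, context):
    # S3 put notification: extract the object key that triggered this invocation
    record = event["Records"][0]["s3"]
    key = record["object"]["key"]

    # Start the Glue ETL job, passing the newly arrived file as a job argument
    response = glue.start_job_run(
        JobName="load_to_lake",
        Arguments={"--input_key": key},
    )
    return {"JobRunId": response["JobRunId"]}
```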

Posted 1 day ago

Apply

3.0 - 5.0 years

15 - 20 Lacs

Pune

Work from Office


Hello eager tech expert!

To create a better future, you need to think outside the box. That's why we at Siemens need innovators who aren't afraid to push boundaries to join our diverse team of tech gurus. Got what it takes? Then help us create lasting, positive impact!

Working for Siemens Financial Services Information Technology (SFS IT), you will work on the continuous enhancement of our Siemens Credit Warehouse solution by translating business requirements into IT solutions and working hand in hand on the implementation of these with our interdisciplinary and international team of IT experts. The Siemens Credit Warehouse is a business-critical IT application that provides credit rating information and credit limits of our customers to all Siemens entities worldwide.

We are looking for an experienced Release Manager to become part of our SFS IT Data Management team. You will have a pivotal role in the moderation of all aspects related to release management of our Data Platform, liaising between the different stakeholders that range from senior management to our citizen developer community. Through your strong communication and presentation skills, coupled with your solid technical background and critical thinking, you're able to connect technical topics to a non-technical/management quorum, leading the topics under your responsibility towards a positive outcome based on your naturally constructive approach.

You'll break new ground by:
- Leading topics across multiple stakeholders from different units in our organization (IT and Business).
- Actively listening to issues and problems faced by technical and non-technical members.
- Producing outstanding technical articles for documentation purposes.

You're excited to build on your existing expertise, including:
- A university degree in computer science, business information systems, or a similar area of knowledge.
- At least 3 to 5 years' experience in a release manager role.
- A strong technical background with a proven track record in:
  - Data engineering and data warehousing, especially with Snowflake and dbt (open source, ideally dbt Cloud), allowing you to champion CI/CD processes.
  - End-to-end setup and development of release management processes (CI/CD) and concepts.
  - Azure DevOps (especially CI/CD and project setup optimization), GitHub and GitLab, including Git Bash.
  - Reading YAML code for Azure DevOps pipelines and error handling.
  - Very good programming skills in SQL (especially DDL and DML statements).
  - A generally good understanding of the Azure cloud tech stack (Azure Portal, Logic Apps, Synapse, Blob Containers, Kafka, Clusters, and Streaming).
- A proven track record on AWS is a big plus.
- Experience in Terraform is a big plus.

Create a better #TomorrowWithUs! Protecting the environment, conserving our natural resources, and fostering the health and performance of our people as well as safeguarding their working conditions are core to our social and business commitment at Siemens. This role is based in Pune/Mumbai. You'll also get to visit other locations in India and beyond, so you'll need to go where this journey takes you. In return, you'll get the chance to work with an international team on global topics.

Posted 1 day ago

Apply

3.0 - 5.0 years

12 - 16 Lacs

Pune

Work from Office


Siemens Digital Industries Software is seeking a detail-oriented Data Analyst to join our Strategy team, where your analytical prowess will directly shape the future of industrial innovation. This pivotal role is crafted for an analytical thinker who thrives on solving complex problems and is eager to grow into a Data Science professional within our world-renowned technology ecosystem. As a Siemens Data Analyst, you'll have an outstanding opportunity to work at the intersection of AI-driven solutions and global industrial digitalization, with clear pathways for professional development and mentorship from world-class technology leaders. Your work will not just analyze data but will contribute to transformative technologies that solve real-world challenges for the Strategy organization.

Responsibilities:
- Develop and maintain robust data pipelines using tools like Snowflake and ELT/ETL processes.
- Effectively communicate complex data findings and recommendations to both technical and non-technical stakeholders.
- Gather, clean, and analyze large datasets to extract meaningful insights.
- Develop and maintain dashboards, reports, and visualizations to track key business metrics.
- Collaborate with partners to define data-driven solutions and support strategic decision-making.
- Identify trends, patterns, and correlations in data to improve business processes.
- Apply SQL, Python, or R to perform data manipulation and analysis.
- Ensure data accuracy, integrity, and consistency across reporting platforms.
- Work closely with data engineers to optimize data pipelines and database structures.
- Experiment with innovative AI technologies (e.g., LangChain for generative AI) to explore new marketing possibilities and develop data-driven recommendations.
- Explore opportunities to integrate machine learning and AI techniques into analysis over time.

Qualifications:
- Bachelor's degree in Data Science, Computer Science, Statistics, Mathematics, or a related field.
- 3-5 years of experience in data analysis, business intelligence, data science, or a related field.
- Proficiency in SQL for querying and handling data.
- Experience with data visualization tools (e.g., Tableau, Power BI, Looker, Matplotlib, Seaborn).
- Familiarity with Python or R for data analysis and automation, and with Git version control.
- A solid understanding of regression analysis and other statistical methods.
- Experience with AI/machine learning frameworks (e.g., LangChain) is a strong plus.
- Experience with Snowflake data warehousing is a plus.
- Adaptable and eager to learn new tools and technologies.
- A results-oriented approach with a good work ethic and a passion for innovation.
- Strong analytical and problem-solving skills with attention to detail.
- Excellent communication skills to collaborate with technical and non-technical teams.
- Willingness to expand into machine learning and data science methodologies.
- Comfortable working with a global team and able to build positive relationships with a broad set of stakeholders that have diverse interests.

We are a collection of over 377,000 minds building the future, one day at a time, in over 200 countries. We're dedicated to equality, and we welcome applications that reflect the diversity of the communities we work in. All employment decisions at Siemens are based on qualifications, merit, and business need. Bring your curiosity and creativity and help us shape tomorrow! We offer a comprehensive reward package which includes a competitive basic salary, bonus scheme, generous holiday allowance, pension, and private healthcare.

Siemens Software. Transform the everyday with us. #li-plm #LI-Hybrid #SWSaaS

Posted 1 day ago

Apply

15.0 - 20.0 years

17 - 22 Lacs

Pune

Work from Office


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are optimized for performance and usability. You will also engage in problem-solving activities, providing support and guidance to your team members while continuously seeking opportunities for improvement and innovation in application development.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with business goals.

Professional & Technical Skills:
- Must-have: Proficiency in Snowflake Data Warehouse.
- Strong understanding of data modeling and ETL processes.
- Experience with SQL and database management.
- Familiarity with cloud computing concepts and services.
- Ability to troubleshoot and optimize application performance.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Snowflake Data Warehouse.
- This position is based in Pune.
- A 15 years full time education is required.

Posted 1 day ago

Apply

8.0 - 11.0 years

12 - 16 Lacs

Hyderabad

Work from Office


Educational Qualification: Bachelor of Engineering
Service Line: Data & Analytics Unit

Responsibilities:
A day in the life of an Infoscion: As part of the Infosys delivery team, your primary role would be to provide best-fit architectural solutions for one or more projects. You would also provide technology consultation and assist in defining the scope and sizing of work. You would implement solutions, create technology differentiation, and leverage partner technologies. Additionally, you would participate in competency development with the objective of ensuring the best-fit and high-quality technical solutions. You would be a key contributor in creating thought leadership within the area of technology specialization and in compliance with guidelines, policies, and norms of Infosys. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Knowledge of architectural design patterns, performance tuning, database and functional designs
- Hands-on experience in Service-Oriented Architecture
- Ability to lead solution development and delivery for the design solutions
- Experience in designing high-level and low-level documents is a plus
- Good understanding of SDLC is a prerequisite
- Awareness of latest technologies and trends
- Logical thinking and problem-solving skills along with an ability to collaborate

Technical and Professional Skills:
- Primary skills: Technology - Data on Cloud - DataStore - Snowflake

Preferred Skills:
- Technology - Data on Cloud - DataStore - Snowflake

Posted 1 day ago

Apply

9.0 - 11.0 years

7 - 11 Lacs

Bengaluru

Work from Office


Educational Qualification: Bachelor of Engineering, Bachelor of Technology, Bachelor of Science, Bachelor of Comp. Applications, Master of Technology, Master of Engineering, Master of Comp. Applications, Master of Science
Service Line: Engineering Services

Responsibilities:
- Master's degree in Computer Science, Statistics, Mathematics, or a related field.
- 7+ years of experience in data science and machine learning with a strong focus on model development and deployment.
- Expert-level knowledge of statistics, including probability theory, hypothesis testing, and statistical inference.
- In-depth knowledge of machine learning algorithms, including linear regression, logistic regression, and decision trees.

Additional Responsibilities:
- Good knowledge of software configuration management systems
- Strong business acumen, strategy, and cross-industry thought leadership
- Awareness of latest technologies and industry trends
- Logical thinking and problem-solving skills along with an ability to collaborate
- Knowledge of two or three industry domains
- Understanding of the financial processes for various types of projects and the various pricing models available
- Client interfacing skills
- Knowledge of SDLC and agile methodologies
- Project and team management

Technical and Professional Skills:
- Experience with Natural Language Processing (NLP) and Computer Vision (CV) techniques.
- Knowledge of DevOps methodologies and practices for continuous integration/continuous delivery (CI/CD).
- Experience with data warehousing and data lake solutions like BigQuery or Snowflake.
- Familiarity with real-time data processing and streaming analytics.
- Passion for learning and staying at the forefront of data science and machine learning advancements.

Preferred Skills:
- Technology - Analytics Techniques - Cluster Analysis
- Technology - Analytics Techniques - Decision Trees
- Technology - Analytics Techniques - Linear Regression
- Technology - Machine Learning - Python

Posted 1 day ago

Apply

3.0 - 5.0 years

4 - 8 Lacs

Bengaluru

Work from Office


Educational Qualification: Bachelor of Engineering
Service Line: Data & Analytics Unit

Responsibilities:
- Base SAS Certified professional.
- Develop, implement, and optimize analytical models using SAS and SQL.
- Strong knowledge of SAS DI, SAS EG, and SAS BI tools.
- Analyze large datasets to derive actionable insights and support business decision-making.
- Design, implement, and maintain ETL workflows to extract, transform, and load data efficiently.
- Develop advanced SAS programs using SAS macros for automation and data processing.
- Troubleshoot and optimize SAS code for performance improvements.
- Work on data warehousing projects to enable efficient data storage and retrieval.
- Basic knowledge of Unix scripts.

JL5B - 8+ years of experience:
- Advanced SAS Certified with 5+ years; worked on execution of a minimum of 2 SAS migration projects.
- Strong knowledge of SAS DI, SAS EG, and SAS BI tools.
- SAS migration projects: migrating SAS scripts to Python, PySpark, Databricks, ADF, Snowflake, etc.
- Good knowledge of the SAS Viya 3.4 and 4.0 platforms.
- Good knowledge of SAS SMC, LSF, and other schedulers; basic knowledge of SAS administration and basic SAS Grid-level knowledge.
- Good knowledge of Unix commands and scripts.
- Handle case studies or complex data scenarios, ensuring data quality and integrity.
- Develop advanced SAS programs using SAS macros for automation and data processing.
- Troubleshoot and optimize SAS code for performance improvements.
- Collaborate with the data engineering team to build and manage robust data pipelines.
- Work on data warehousing projects to enable efficient data storage and retrieval.
- Present findings and insights clearly to both technical and non-technical stakeholders.
- Work closely with teams across departments to gather requirements and deliver solutions.

Technical and Professional Skills:
- Minimum 5+ years in the analytics domain, with a strong portfolio of relevant projects.
- Proficiency in SAS, SAS macros, and SQL.
- Hands-on experience in ETL processes and tools.
- Knowledge of data engineering concepts and data warehousing best practices.

Preferred Skills:
- Technology - Reporting Analytics & Visualization - SAS Enterprise Guide
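Since this listing centers on migrating SAS scripts to Python/PySpark, here is a minimal sketch of what one such translation can look like: the PySpark equivalent of a simple SAS PROC MEANS by-group summary. The file path and column names are hypothetical.

```python
# A SAS-to-PySpark migration sketch: the PySpark equivalent of
#   PROC MEANS DATA=claims MEAN SUM; CLASS region; VAR amount; RUN;
# File path and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("sas_migration_example").getOrCreate()

# Read the source data (a CSV stand-in for the original SAS dataset)
claims = spark.read.csv("claims.csv", header=True, inferSchema=True)

# By-group mean and sum, as PROC MEANS with a CLASS statement would produce
summary = (
    claims.groupBy("region")
          .agg(F.mean("amount").alias("mean_amount"),
               F.sum("amount").alias("total_amount"))
)
summary.show()
```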

Posted 1 day ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Bengaluru

Work from Office


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are optimized for performance and usability. You will also engage in problem-solving activities, providing support and enhancements to existing applications while staying updated on industry trends and best practices.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must-have: Proficiency in Snowflake Data Warehouse.
- Good to have: AWS, Python, and Data Vault skills.
- Strong understanding of data modeling and ETL processes.
- Experience with SQL and database management.
- Familiarity with cloud computing concepts and services.
- Ability to troubleshoot and optimize application performance.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Posted 1 day ago

Apply

9.0 - 11.0 years

12 - 17 Lacs

Bengaluru

Work from Office


Educational Qualification: Bachelor of Engineering, Bachelor of Technology, Bachelor of Science, Bachelor of Comp. Applications, Master of Technology, Master of Engineering, Master of Comp. Applications, Master of Science
Service Line: Engineering Services

Responsibilities:
- Evaluate the performance of machine learning models and refine them to improve accuracy and generalizability.
- Communicate data insights to stakeholders in a clear and concise manner, using data visualization techniques and storytelling.
- Collaborate with data engineers, software developers, and business stakeholders to integrate data science solutions into products and services.
- Stay up to date with the latest trends and developments in data science, machine learning, and artificial intelligence.

Additional Responsibilities:
- Experience with Natural Language Processing (NLP) and Computer Vision (CV) techniques.
- Knowledge of DevOps methodologies and practices for continuous integration/continuous delivery (CI/CD).
- Experience with data warehousing and data lake solutions like BigQuery or Snowflake.
- Familiarity with real-time data processing and streaming analytics.
- Passion for learning and staying at the forefront of data science and machine learning advancements.

Technical and Professional Skills:
- Master's degree in Computer Science, Statistics, Mathematics, or a related field.
- 7+ years of experience in data science and machine learning with a strong focus on model development and deployment.
- Expert-level knowledge of statistics, including probability theory, hypothesis testing, and statistical inference.
- In-depth knowledge of machine learning algorithms, including linear regression, logistic regression, decision trees, random forests, XGBoost, and ensemble learning.
- Strong programming skills in Python and proficiency in data science libraries like pandas, scikit-learn, NumPy, PyTorch/Keras, and TensorFlow.
- Experience with cloud computing platforms, particularly Google Cloud Platform (GCP).
- Excellent data visualization skills using tools like matplotlib, seaborn, or Tableau.
- Strong communication and presentation skills, both written and verbal.

Preferred Skills:
- Technology - Analytics Techniques - Cluster Analysis
- Technology - Analytics Techniques - Decision Trees
- Technology - Analytics Techniques - Linear Regression
- Technology - Machine Learning - Python

Posted 1 day ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Pune

Work from Office


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, manage project timelines, and contribute to the overall success of application development initiatives.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with business goals.

Professional & Technical Skills:
- Must-have: Proficiency in Snowflake Data Warehouse.
- Good to have: AWS, Python, and Data Vault skills.
- Strong understanding of data modeling and ETL processes.
- Experience with SQL and data querying techniques.
- Familiarity with cloud computing concepts and services.
- Ability to troubleshoot and optimize data warehouse performance.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Pune office.
- A 15 years full time education is required.
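One small example of the data warehouse performance troubleshooting this role calls for: querying Snowflake's standard ACCOUNT_USAGE.QUERY_HISTORY view for the slowest recent queries. This is plain Snowflake SQL wrapped in Python; the connection credentials are placeholders.

```python
# A sketch of a Snowflake performance check using the standard
# SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY view; credentials are placeholders.
import snowflake.connector

SLOWEST_QUERIES = """
SELECT query_text,
       total_elapsed_time / 1000 AS elapsed_seconds,   -- stored in milliseconds
       warehouse_name
FROM snowflake.account_usage.query_history
WHERE start_time > DATEADD('day', -7, CURRENT_TIMESTAMP())
ORDER BY total_elapsed_time DESC
LIMIT 10
"""

conn = snowflake.connector.connect(account="my_account", user="dev", password="***")
for query_text, seconds, warehouse in conn.cursor().execute(SLOWEST_QUERIES):
    print(f"{seconds:>8.1f}s  {warehouse}  {query_text[:80]}")
conn.close()
```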

Posted 1 day ago

Apply

3.0 - 5.0 years

3 - 7 Lacs

Bengaluru

Work from Office


Educational Qualification: Bachelor of Engineering, BTech, BSc, BCA, MCA, MTech, MSc
Service Line: Data & Analytics Unit

Responsibilities:
A day in the life of an Infoscion: As part of the Infosys delivery team, your primary role would be to ensure effective design, development, validation, and support activities, to assure that our clients are satisfied with the high levels of service in the technology domain. You will gather the requirements and specifications to understand the client requirements in a detailed manner and translate them into system requirements. You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers. You would be a key contributor to building efficient programs and systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Knowledge of design principles and fundamentals of architecture
- Understanding of performance engineering
- Knowledge of quality processes and estimation techniques
- Basic understanding of the project domain
- Ability to translate functional/non-functional requirements into system requirements
- Ability to design and code complex programs
- Ability to write test cases and scenarios based on the specifications
- Good understanding of SDLC and agile methodologies
- Awareness of latest technologies and trends
- Logical thinking and problem-solving skills along with an ability to collaborate

Technical and Professional Skills:
- Primary skills: Technology - Data on Cloud - DataStore - Snowflake

Preferred Skills:
- Technology - Data on Cloud - DataStore - Snowflake

Posted 1 day ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Pune

Work from Office


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving discussions and contribute to the overall success of the projects you are involved in, while also focusing on continuous improvement and efficiency in application development.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application processes and workflows.
- Engage in code reviews to ensure quality and adherence to best practices.

Professional & Technical Skills:
- Must-have: Proficiency in Snowflake Data Warehouse.
- Strong understanding of data modeling and ETL processes.
- Experience with SQL and database management.
- Familiarity with cloud computing concepts and services.
- Ability to troubleshoot and optimize application performance.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Snowflake Data Warehouse.
- This position is based at our Pune office.
- A 15 years full time education is required.

Posted 1 day ago

Apply

3.0 - 5.0 years

5 - 9 Lacs

Navi Mumbai

Work from Office


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Collibra Data Quality & Observability
Good-to-have skills: Collibra Data Governance
Minimum 7.5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are functioning optimally. You will also engage in problem-solving activities, providing support and enhancements to existing applications while maintaining a focus on quality and efficiency.

Key Responsibilities:
- Configure and implement Collibra Data Quality (CDQ) rules, workflows, dashboards, and data quality scoring metrics.
- Collaborate with data stewards, data owners, and business analysts to define data quality KPIs and thresholds.
- Develop data profiling and rule-based monitoring using CDQ's native rule engine or integrations (e.g., with Informatica, Talend, or BigQuery).
- Build and maintain data quality dashboards and issue management workflows within Collibra.
- Integrate CDQ with Collibra Data Intelligence Cloud for end-to-end governance visibility.
- Drive root cause analysis and remediation plans for data quality issues.
- Support metadata and lineage enrichment to improve data traceability.
- Document standards, rule logic, and DQ policies in the Collibra Catalog.
- Conduct user training and promote data quality best practices across teams.

Required Skills and Experience:
- 3+ years of experience in data quality, metadata management, or data governance.
- Hands-on experience with the Collibra Data Quality & Observability (CDQ) platform.
- Knowledge of Collibra Data Intelligence Cloud, including Catalog, Glossary, and Workflow Designer.
- Proficiency in SQL and understanding of data profiling techniques.
- Experience integrating CDQ with enterprise data sources (Snowflake, BigQuery, Databricks, etc.).
- Familiarity with data governance frameworks and data quality dimensions (accuracy, completeness, consistency, etc.).
- Excellent analytical, problem-solving, and communication skills.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Collibra Data Quality & Observability.
- This position is based in Mumbai.
- A 15 years full time education is required.
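Collibra DQ's own rule engine is configured inside the platform, so as a generic illustration only, here is the underlying idea of rule-based monitoring: data quality rules expressed as SQL checks with pass/fail scoring. The table names, rules, and credentials are hypothetical.

```python
# A generic rule-based data quality sketch (not the Collibra API): run SQL
# checks against Snowflake and score each rule. All names are placeholders.
import snowflake.connector

RULES = {
    "null_customer_id": "SELECT COUNT(*) FROM orders WHERE customer_id IS NULL",
    "duplicate_order_id": """
        SELECT COUNT(*) FROM (
            SELECT order_id FROM orders GROUP BY order_id HAVING COUNT(*) > 1
        )
    """,
}
THRESHOLD = 0  # fail a rule if any offending rows are found

conn = snowflake.connector.connect(account="my_account", user="dq", password="***")
cur = conn.cursor()
for name, sql in RULES.items():
    violations = cur.execute(sql).fetchone()[0]
    status = "PASS" if violations <= THRESHOLD else "FAIL"
    print(f"{name}: {violations} violations -> {status}")
conn.close()
```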

Posted 1 day ago

Apply

3.0 - 5.0 years

5 - 8 Lacs

Pune

Work from Office


Educational Qualification: Bachelor of Engineering, Bachelor of Technology
Service Line: Enterprise Package Application Services

Responsibilities:
A day in the life of an Infoscion: As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions, and facilitate deployment resulting in client delight. You will develop a proposal by owning parts of the proposal document and by giving inputs on solution design based on your areas of expertise. You will plan the activities of configuration, configure the product as per the design, conduct conference room pilots, and assist in resolving any queries related to requirements and solution design. You will conduct solution/product demonstrations and POC/Proof of Technology workshops, and prepare effort estimates which suit the customer's budgetary requirements and are in line with the organization's financial guidelines. You will actively lead small projects and contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Ability to develop value-creating strategies and models that enable clients to innovate, drive growth, and increase their business profitability
- Good knowledge of software configuration management systems
- Awareness of latest technologies and industry trends
- Logical thinking and problem-solving skills along with an ability to collaborate
- Understanding of the financial processes for various types of projects and the various pricing models available
- Ability to assess current processes, identify improvement areas, and suggest technology solutions
- Knowledge of one or two industry domains
- Client interfacing skills
- Project and team management

Location of posting: Infosys Ltd. is committed to ensuring you have the best experience throughout your journey with us. We currently have open positions in a number of locations across India - Bangalore, Pune, Hyderabad, Chennai, Chandigarh, Mysore, Kolkata, Trivandrum, Indore, Nagpur, Mangalore, Noida, Bhubaneswar, Coimbatore, Mumbai, Jaipur, Hubli, Vizag. While we work in accordance with business requirements, we shall strive to offer you the location of your choice, where possible.

Technical and Professional Skills:
As a Snowflake Data Vault developer, the individual is responsible for designing, implementing, and managing Data Vault 2.0 models on the Snowflake platform. The candidate should have at least one end-to-end Data Vault implementation experience. Detailed skill requirements:
- Designing and building flexible and highly scalable Data Vault 1.0 and 2.0 models.
- Suggesting optimization techniques in existing Data Vault models using ghost entries, bridge tables, PIT tables, reference tables, satellite splits/merges, identification of the correct business key, etc.
- Designing and administering repeating design patterns for quick turnaround.
- Engaging and collaborating with customers effectively to understand the Data Vault use cases, and briefing the technical team with technical specifications.
- Working knowledge of Snowflake is desirable.
- Working knowledge of DBT is desirable.

Preferred Skills:
- Technology - Data on Cloud - DataStore - Snowflake
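For a sense of the repeating design patterns this role mentions, here is a minimal Data Vault 2.0 sketch: loading a hub table from a staging view with a hashed business key. All names are hypothetical, and in practice this pattern is often generated by dbt packages such as AutomateDV rather than handwritten.

```python
# A Data Vault 2.0 hub-load sketch in Snowflake SQL (table and column names
# are placeholders); execute via your Snowflake session of choice.
HUB_CUSTOMER_LOAD = """
INSERT INTO hub_customer (hub_customer_hk, customer_bk, load_date, record_source)
SELECT DISTINCT
    MD5(UPPER(TRIM(stg.customer_id))),   -- hash key derived from the business key
    stg.customer_id,                     -- business key
    CURRENT_TIMESTAMP,
    'CRM'                                -- record source
FROM stg_customer stg
LEFT JOIN hub_customer h
  ON h.hub_customer_hk = MD5(UPPER(TRIM(stg.customer_id)))
WHERE h.hub_customer_hk IS NULL;         -- insert only new business keys
"""
print(HUB_CUSTOMER_LOAD)
```

The same insert-if-new pattern repeats across hubs, links, and satellites, which is why the posting stresses administering repeating design patterns for quick turnaround.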

Posted 1 day ago

Apply

4.0 - 9.0 years

6 - 10 Lacs

Hyderabad

Work from Office


Responsibilities:

Data Quality Implementation & Monitoring (Acceldata & DemandTools):
- Design, develop, and implement data quality rules and checks using Acceldata to monitor data accuracy, completeness, consistency, and timeliness.
- Configure and utilize Acceldata to profile data, identify anomalies, and establish data quality thresholds.
- Investigate and resolve data quality issues identified by Acceldata, working with relevant teams on remediation.
- Leverage DemandTools within our Salesforce environment to identify, merge, and prevent duplicate records across Leads, Contacts, and Accounts (a dedupe sketch follows this listing).
- Implement data standardization and cleansing processes within Salesforce using DemandTools.
- Develop and maintain data quality dashboards and reports using Acceldata to provide visibility into data health.

Data Onboarding & Integration Quality:
- Collaborate with engineers and platform teams to understand data sources and pipelines built using Fivetran or a similar ingestion tool.
- Validate data transformations within Fivetran to maintain data integrity and quality.
- Develop and execute test plans and test cases to validate the successful and accurate onboarding of data into our Snowflake environment.

Metadata Management & Data Governance:
- Work with the Atlan platform to understand and contribute to the establishment of a comprehensive data catalog.
- Assist in defining and implementing data governance policies and standards within Atlan.
- Validate the accuracy and completeness of metadata within Atlan to ensure data discoverability and understanding.
- Collaborate on data lineage tracking and impact analysis using Atlan.

Collaboration & Communication:
- Work closely with data engineers, the platform team, data analysts, business stakeholders, and Salesforce administrators.
- Clearly communicate data quality findings, risks, and remediation steps.
- Participate in data governance meetings and contribute to the development of data quality best practices.
- Document data quality rules, processes, and monitoring procedures.

Required Skills & Experience:
- Proven experience (e.g., 3+ years) as a Data Quality Engineer or in a similar role.
- Hands-on experience with Fivetran or a comparable data ingestion application, including an understanding of its data transformation capabilities.
- Familiarity with Atlan or other modern data catalog and metadata management tools.
- Strong practical experience with Acceldata or similar data quality monitoring and observability platforms.
- Familiarity with DemandTools for data quality management within Salesforce.
- Solid understanding of data quality principles, methodologies, and best practices.
- Strong SQL skills for data querying and analysis.
- Experience with data profiling and data analysis techniques.
- Excellent analytical, problem-solving, and troubleshooting skills.
- Strong communication and collaboration skills.
- Ability to work independently and manage tasks effectively in a remote environment.

Preferred Skills & Experience:
- Experience with other data quality tools or frameworks.
- Knowledge of data warehousing concepts and technologies (e.g., Snowflake, BigQuery).
- Experience with scripting languages like Python for data manipulation and automation.
- Familiarity with the Salesforce data model and administration.
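As a hedged illustration of the duplicate-record problem DemandTools addresses in Salesforce, the sketch below groups contacts by a normalised email key. It is not DemandTools logic or the Salesforce API; the field names and the single-field matching rule are illustrative assumptions (real matching is typically fuzzier and multi-field).

```python
# Sketch: flag likely duplicate contact records by a normalised match key.
# Field names and the matching rule are illustrative only.
from collections import defaultdict

contacts = [
    {"Id": "003A", "Name": "Priya Shah", "Email": "Priya.Shah@Example.com"},
    {"Id": "003B", "Name": "P. Shah", "Email": "priya.shah@example.com "},
    {"Id": "003C", "Name": "Rahul Mehta", "Email": "rahul@example.com"},
]

def match_key(record: dict) -> str:
    """Normalise the email so formatting differences do not hide duplicates."""
    return record["Email"].strip().lower()

groups = defaultdict(list)
for record in contacts:
    groups[match_key(record)].append(record["Id"])

# Any key with more than one Id is a candidate merge group.
for key, ids in groups.items():
    if len(ids) > 1:
        print(f"Possible duplicates for {key}: {ids}")
```

Dedicated tools add survivorship rules (which record wins, how field values merge) on top of this grouping step; the sketch shows only the matching half.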

Posted 1 day ago

Apply

10.0 - 13.0 years

12 - 15 Lacs

Bengaluru

Work from Office


About the Opportunity
Job Type: Application, 31 July 2025
Title: Principal Data Engineer (Associate Director)
Department: ISS
Location: Bangalore
Reports To: Head of Data Platform - ISS
Grade: 7

Department Description: The ISS Data Engineering Chapter is an engineering group comprised of three sub-chapters (Data Engineers, Data Platform, and Data Visualisation) that supports the ISS Department. Fidelity is embarking on several strategic programmes of work that will create a data platform to support the next evolutionary stage of our investment process. These programmes span asset classes and include Portfolio and Risk Management, Fundamental and Quantitative Research, and Trading.

Purpose of your role: This role sits within the ISS Data Platform Team. The Data Platform team is responsible for building and maintaining the platform that enables the ISS business to operate. This role is appropriate for a Lead Data Engineer capable of taking ownership of, and delivering, a subsection of the wider data platform.

Key Responsibilities:
- Design, develop, and maintain scalable data pipelines and architectures to support data ingestion, integration, and analytics.
- Be accountable for technical delivery and take ownership of solutions.
- Lead a team of senior and junior developers, providing mentorship and guidance.
- Collaborate with enterprise architects, business analysts, and stakeholders to understand data requirements, validate designs, and communicate progress.
- Drive technical innovation within the department to increase code reusability, code quality, and developer productivity.
- Challenge the status quo by bringing the very latest data engineering practices and techniques.

Essential Skills and Experience

Core Technical Skills:
- Expert in leveraging cloud-based data platform capabilities (Snowflake, Databricks) to create an enterprise lakehouse.
- Advanced expertise with the AWS ecosystem and experience using a variety of core AWS data services such as Lambda, EMR, MSK, Glue, and S3.
- Experience designing event-based or streaming data architectures using Kafka.
- Advanced expertise in Python and SQL; open to expertise in Java/Scala, but enterprise experience with Python is required.
- Expert in designing, building, and using CI/CD pipelines to deploy infrastructure (Terraform) and pipelines with test automation.
- Data security and performance optimization: experience implementing data access controls to meet regulatory requirements.
- Experience using both RDBMS (Oracle, Postgres, MSSQL) and NoSQL (Dynamo, OpenSearch, Redis) offerings.
- Experience implementing CDC ingestion.
- Experience using orchestration tools such as Airflow or Control-M (a minimal DAG sketch follows this listing).

Bonus Technical Skills:
- Strong experience in containerisation and deploying applications to Kubernetes.
- Strong experience in API development using Python-based frameworks like FastAPI.

Key Soft Skills:
- Problem-solving: leadership experience in problem-solving and technical decision-making.
- Communication: strong in strategic communication and stakeholder engagement.
- Project management: experienced in overseeing project lifecycles, working with project managers to manage resources.

Feel rewarded: For starters, we'll offer you a comprehensive benefits package. We'll value your wellbeing and support your development. And we'll be as flexible as we can about where and when you work, finding a balance that works for all of us. It's all part of our commitment to making you feel motivated by the work you do and happy to be part of our team.
For more about our work, our approach to dynamic working and how you could build your future here, visit careers.fidelityinternational.com.
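To ground the orchestration requirement, here is a minimal sketch of an Airflow 2.x-style DAG wiring two dependent Python tasks on a daily schedule. The dag_id, schedule, and task bodies are illustrative assumptions, not this team's actual pipelines.

```python
# Minimal Airflow 2.x-style DAG sketch: two dependent Python tasks on a
# daily schedule. dag_id, schedule, and task bodies are illustrative only.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract() -> None:
    # Placeholder: pull a batch from the source system (e.g. via CDC).
    print("extracting...")

def load() -> None:
    # Placeholder: load the extracted batch into the warehouse.
    print("loading...")

with DAG(
    dag_id="example_daily_ingest",  # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",  # Airflow >= 2.4; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task  # load runs only after extract succeeds
```

The `>>` operator declares the dependency between tasks; Control-M expresses the same idea through job conditions.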

Posted 1 day ago

Apply

