3.0 - 7.0 years
0 Lacs
Hyderabad, Telangana
On-site
You are familiar with AWS and Azure Cloud. You have extensive knowledge of Snowflake; SnowPro Core certification is a must-have. In at least one project, you have used DBT to deploy models to production. You also have experience configuring and deploying Airflow and integrating its various operators, especially for DBT and Snowflake. Your capabilities include designing build and release pipelines, and you have a solid understanding of the Azure DevOps ecosystem. Proficiency in Python, particularly PySpark, allows you to write metadata-driven programs. You are well-versed in Data Vault (Raw, Business) and concepts such as Point In Time and the Semantic Layer. In ambiguous situations, you demonstrate resilience and can articulate problems clearly in a business-friendly manner. Documenting processes, managing artifacts, and evolving them over time are practices you adhere to diligently. Required Skills: Data Vault, DBT, Python, PySpark, Snowflake, AWS, Azure Cloud, Azure DevOps, Airflow.
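The "metadata-driven programs" this posting asks for typically generate load logic from table metadata rather than writing it per table. A minimal, pure-Python sketch of that pattern follows; the table names, paths, and statement shapes are illustrative assumptions, not taken from the posting.

```python
# Sketch of a metadata-driven loader: each table's load is described by
# config, and the load statement is generated, not hand-written per table.
# All table names, paths, and columns below are hypothetical examples.
from dataclasses import dataclass
from typing import Optional

@dataclass
class TableConfig:
    name: str
    source_path: str
    load_type: str                        # "full" or "incremental"
    watermark_column: Optional[str] = None

CONFIGS = [
    TableConfig("customers", "s3://raw/customers/", "full"),
    TableConfig("orders", "s3://raw/orders/", "incremental", "updated_at"),
]

def build_load_sql(cfg: TableConfig) -> str:
    """Generate a COPY- or MERGE-style statement from metadata alone."""
    if cfg.load_type == "full":
        return f"COPY INTO {cfg.name} FROM '{cfg.source_path}'"
    return (
        f"MERGE INTO {cfg.name} t USING staging_{cfg.name} s ON t.id = s.id "
        f"WHEN MATCHED AND s.{cfg.watermark_column} > t.{cfg.watermark_column} "
        f"THEN UPDATE SET t.payload = s.payload"
    )

for cfg in CONFIGS:
    print(cfg.name, "->", build_load_sql(cfg).split()[0])
```

Adding a new table then means adding one `TableConfig` entry, which is the point of the pattern.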
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
Haryana
On-site
You will play a crucial role in enhancing the Analytics capabilities for our businesses. Your responsibilities will include engaging with key stakeholders to understand Fidelity's sales, marketing, client services, and propositions context. You will collaborate with internal teams such as the data support team and technology team to develop new tools, capabilities, and solutions, and work closely with IS Operations to expedite the development and sharing of customized data sets.

Maximizing the adoption of Cloud-Based Data Management Services will be a significant part of your role. This involves setting up sandbox analytics environments using platforms like Snowflake, AWS, Adobe, and Salesforce. You will also support data visualization and data science applications to enhance business operations.

In terms of stakeholder management, you will work with key stakeholders to understand business problems and translate them into suitable analytics solutions, and you will facilitate smooth execution, delivery, and implementation of these solutions through effective engagement. Your role will also involve collaborating with the team to share knowledge and best practices, including coaching on deep learning and machine learning methodologies. Taking independent ownership of projects and initiatives within the team is crucial, demonstrating leadership and accountability.

Furthermore, you will be responsible for developing and evaluating tools, methodologies, and infrastructure to address long-term business challenges. This may involve enhancing modelling software, methodologies, data requirements, and optimization environments to elevate the team's capabilities.

To excel in this role, you should possess 5 to 8 years of overall experience in Analytics, with at least 4 years of experience in SQL, Python, open-source machine learning libraries, and deep learning. Experience working in an AWS environment, preferably with Snowflake, is preferred. Proficiency in analytics applications such as Python, SAS, and SQL, and in interpreting statistical results, is necessary. Knowledge of Spark, Hadoop, and Big Data platforms will be advantageous.
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
Tirupati, Andhra Pradesh
On-site
You are an experienced Snowflake Data Engineer with expertise in Python and SQL, holding a Snowflake certification and at least 4 years of hands-on experience with Snowflake. Your primary responsibility will be to design, develop, and maintain robust data pipelines in a cloud environment, ensuring efficient data integration, transformation, and storage within the Snowflake data platform.

Your key responsibilities will include designing and developing data pipelines to handle large volumes of structured and unstructured data using Snowflake and SQL. You will develop and maintain efficient ETL/ELT processes to integrate data from various sources into Snowflake, ensuring data quality and availability. You will write Python scripts to automate data workflows, implement data transformation logic, and integrate with external APIs for data ingestion, and you will create and optimize complex SQL queries for data extraction, transformation, and reporting. You will also develop and maintain data models to support business intelligence and analytics, leveraging Snowflake best practices. Ensuring proper data governance, security, and compliance within the Snowflake environment, by implementing access controls, encryption, and monitoring, will also be among your responsibilities. Collaboration is key: you will work closely with data scientists, analysts, and business stakeholders to understand data requirements and deliver scalable solutions.

As a qualified candidate, you must hold an active Snowflake certification and have 4+ years of experience with Snowflake. You should possess strong experience with Python for data processing, automation, and API integration, expertise in writing and optimizing complex SQL queries, and experience with data warehousing and database management. Hands-on experience designing and implementing ETL/ELT pipelines using Snowflake is also required. Familiarity with cloud environments such as AWS, GCP, or Azure, especially in relation to data storage and processing, is necessary, as is experience implementing data governance frameworks and security protocols in a cloud data platform.

Preferred skills include experience with CI/CD pipelines for data projects, knowledge of Apache Airflow or other orchestration tools, and familiarity with big data technologies and distributed systems. Your educational background should include a Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field. Strong problem-solving and analytical skills, excellent communication skills for interacting with both technical and non-technical stakeholders, and the ability to work in a fast-paced, agile environment are essential soft skills for this role.
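The ETL/ELT work described here usually reduces to idempotent upsert steps, such as Snowflake's MERGE. A small sketch of that step follows, using Python's built-in sqlite3 as a stand-in engine so it is runnable anywhere; the table, columns, and data are illustrative, and real pipelines would issue a MERGE through the Snowflake connector instead.

```python
# Idempotent "upsert" step of an ELT pipeline, sketched against sqlite3
# as a stand-in for a warehouse MERGE. Table and data are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_customer (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO dim_customer VALUES (1, 'Asha')")

# Incoming batch: one changed row (id=1) and one new row (id=2).
incoming = [(1, "Asha K."), (2, "Ravi")]
conn.executemany(
    "INSERT INTO dim_customer (id, name) VALUES (?, ?) "
    "ON CONFLICT(id) DO UPDATE SET name = excluded.name",
    incoming,
)

rows = conn.execute("SELECT id, name FROM dim_customer ORDER BY id").fetchall()
print(rows)  # [(1, 'Asha K.'), (2, 'Ravi')]
```

Because the statement updates on conflict instead of failing or duplicating, re-running the same batch leaves the table unchanged, which is what makes the load safe to retry.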
Posted 2 weeks ago
2.0 - 6.0 years
0 Lacs
Pune, Maharashtra
On-site
About Mindstix Software Labs:
Mindstix accelerates digital transformation for the world's leading brands. We are a team of passionate innovators specializing in Cloud Engineering, DevOps, Data Science, and Digital Experiences. Our UX studio and modern-stack engineers deliver world-class products for our global customers, which include Fortune 500 enterprises and Silicon Valley startups. Our work impacts a diverse set of industries: eCommerce, Luxury Retail, ISV and SaaS, Consumer Tech, and Hospitality. A fast-moving open culture powered by curiosity and craftsmanship, and a team committed to bold thinking and innovation at the intersection of business, technology, and design. That's our DNA.

Roles and Responsibilities:
Mindstix is looking for a proficient Data Engineer. You are a collaborative person who takes pleasure in finding solutions to issues that add to the bottom line. You appreciate hands-on technical work and feel a sense of ownership. You need a keen eye for detail, work experience as a data analyst, and in-depth knowledge of widely used databases and technologies for data analysis. Your responsibilities include:
- Building outstanding domain-focused data solutions with internal teams, business analysts, and stakeholders.
- Applying data engineering practices and standards to develop robust and maintainable solutions.
- Being motivated by a fast-paced, service-oriented environment and interacting directly with clients on new features for future product releases.
- Being a natural problem-solver, intellectually curious across a breadth of industries and topics.
- Being acquainted with different aspects of Data Management: Data Strategy, Architecture, Governance, Data Quality, Integrity & Data Integration.
- Being extremely well-versed in designing incremental and full data load techniques.

Qualifications and Skills:
- Bachelor's or Master's degree in Computer Science, Information Technology, or allied streams.
- 2+ years of hands-on experience in the data engineering domain with DWH development.
- Must have experience with end-to-end data warehouse implementation on Azure or GCP.
- Must have SQL and PL/SQL skills, implementing complex queries and stored procedures.
- Solid understanding of DWH concepts such as OLAP, ETL/ELT, RBAC, Data Modelling, Data-Driven Pipelines, Virtual Warehousing, and MPP.
- Expertise in Databricks: Structured Streaming, Lakehouse Architecture, DLT, Data Modeling, Vacuum, Time Travel, Security, Monitoring, Dashboards, DBSQL, and Unit Testing.
- Expertise in Snowflake: Monitoring, RBACs, Virtual Warehousing, Query Performance Tuning, and Time Travel.
- Understanding of Apache Spark, Airflow, Hudi, Iceberg, Nessie, NiFi, Luigi, and Arrow (good to have).
- Strong foundations in computer science, data structures, algorithms, and programming logic.
- Excellent logical reasoning and data interpretation capability.
- Ability to interpret business requirements accurately.
- Exposure to working with multicultural international customers.
- Experience in the Retail, Supply Chain, CPG, eComm, or Health industry is a plus.

Who Fits Best:
- You are a data enthusiast and problem solver.
- You are a self-motivated, fast learner with a strong sense of ownership and drive.
- You enjoy working in a fast-paced creative environment.
- You appreciate great design, have a strong sense of aesthetics, and have a keen eye for detail.
- You thrive in a customer-centric environment, with the ability to actively listen, empathize, and collaborate with globally distributed teams.
- You are a team player who desires to mentor and inspire others to do their best.
- You love expressing ideas and articulating well, with strong written and verbal English communication and presentation skills.
- You are detail-oriented with an appreciation for craftsmanship.

Benefits:
- Flexible working environment.
- Competitive compensation and perks.
- Health insurance coverage.
- Accelerated career paths.
- Rewards and recognition.
- Sponsored certifications.
- Global customers.
- Mentorship by industry leaders.

Location: This position is based at our Pune (India) headquarters, and all potential hires are required to work from this location. A modern workplace is deeply collaborative by nature while also demanding a touch of flexibility. We embrace deep collaboration at our offices, with reasonable flexi-timing and hybrid options for our seasoned team members.

Equal Opportunity Employer.
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
Karnataka
On-site
You are an experienced Senior QA Specialist sought to join a dynamic team for a critical AWS-to-GCP migration project. Your primary responsibility will be the rigorous testing of data pipelines and data integrity in GCP to ensure seamless reporting and analytics capabilities.

Your key responsibilities will include designing and executing test plans to validate data pipelines re-engineered from AWS to GCP, ensuring data integrity and accuracy. You will work closely with data engineering teams to understand AVRO, ORC, and Parquet file structures in AWS S3, and analyze the data in the external tables created in Athena that are used for reporting. It will be essential to ensure that the schema and data in BigQuery match Athena, to support reporting in Power BI. Additionally, you will test and validate Spark pipelines and other big data workflows in GCP. Documenting all test results and collaborating with development teams to resolve discrepancies will also be part of your responsibilities, as will supporting business users during UAT testing.

To excel in this role, you should possess proven experience in QA testing within a big data DW/BI ecosystem. Strong familiarity with cloud platforms such as AWS, GCP, or Azure, with hands-on experience in at least one, is necessary. Deep knowledge of data warehousing solutions like BigQuery, Redshift, Synapse, or Snowflake is essential, as is expertise in testing data pipelines and understanding file formats like Avro and Parquet. Experience with reporting tools such as Power BI or similar is preferred. Excellent problem-solving skills and the ability to work independently will be valuable, along with strong communication skills and the ability to collaborate effectively across teams.
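One concrete form of the "schema in BigQuery must match Athena" check described above is a column-by-column comparison of the two catalogs. A hedged sketch follows; the schema dictionaries and the engine type mapping are illustrative, and in practice they would be fetched from the Glue/Athena and BigQuery metadata APIs.

```python
# Migration QA sketch: verify a BigQuery table's schema against its Athena
# source. Schemas and the (partial) type mapping below are hypothetical.
athena_schema = {"order_id": "bigint", "amount": "double", "ts": "timestamp"}
bigquery_schema = {"order_id": "INT64", "amount": "FLOAT64", "ts": "TIMESTAMP"}

# How Athena/Presto type names are expected to land in BigQuery (illustrative).
TYPE_MAP = {"bigint": "INT64", "double": "FLOAT64", "timestamp": "TIMESTAMP"}

def schema_mismatches(src: dict, dst: dict) -> list:
    """Return a list of human-readable discrepancies between two schemas."""
    problems = []
    for col, src_type in src.items():
        if col not in dst:
            problems.append(f"missing column: {col}")
        elif dst[col] != TYPE_MAP.get(src_type, src_type):
            problems.append(f"type drift on {col}: {src_type} -> {dst[col]}")
    problems += [f"unexpected column: {c}" for c in dst if c not in src]
    return problems

print(schema_mismatches(athena_schema, bigquery_schema))  # []
```

An empty result means the migrated table passes this check; anything else becomes a documented discrepancy for the data engineering team.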
Posted 2 weeks ago
10.0 - 14.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Senior Cloud Data Integration Consultant, you will lead a complex data integration project involving API frameworks, a data lakehouse architecture, and middleware solutions. The project focuses on technologies such as AWS, Snowflake, Oracle ERP, and Salesforce, with a high-transaction-volume POS system. Your role will involve building reusable and scalable API frameworks, optimizing middleware, and ensuring security and compliance in a multi-cloud environment.

Your expertise in API development and integration will be crucial to this project. You should have deep experience managing APIs across multiple systems, building reusable components, and ensuring bidirectional data flow for real-time data synchronization. Your skills in middleware solutions and custom API adapters will be essential for integrating the various systems seamlessly.

In terms of cloud infrastructure and data processing, strong experience with AWS services like S3, Lambda, Fargate, and Glue is required for data processing, storage, and integration. You should also have hands-on experience optimizing Snowflake for querying and reporting, as well as knowledge of Terraform for automating the provisioning and management of AWS resources.

Security and compliance are critical aspects of the project, and your deep understanding of cloud security protocols, API security, and compliance enforcement will be invaluable. You should be able to set up audit logs, ensure traceability, and enforce compliance across cloud services. Handling high-volume transaction systems and real-time data processing requirements will be part of your responsibilities: you should be familiar with optimizing AWS Lambda and Fargate for efficient data processing, and skilled in operational monitoring and error-handling mechanisms.

Collaboration and support are essential to the success of the project. You will provide post-go-live support, collaborate with internal teams and external stakeholders, and ensure seamless integration between systems.

To qualify for this role, you should have at least 10 years of experience in enterprise API integration, cloud architecture, and data management. Deep expertise in AWS services, Snowflake, Oracle ERP, and Salesforce integrations is required, along with a proven track record of delivering scalable API frameworks and handling complex middleware systems. Strong problem-solving skills, familiarity with containerization technologies, and experience in the retail or e-commerce industries are also desirable.

Your key responsibilities will include leading the design and implementation of reusable API frameworks, optimizing data flow through middleware systems, building robust security frameworks, and collaborating with the in-house team for seamless integration between systems. Ongoing support, monitoring, and optimization post-go-live will also be part of your role.
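Reusable API frameworks of the kind this role describes almost always centralize a retry policy around outbound calls, since real-time synchronization must survive transient failures. A minimal, self-contained sketch of that one building block follows; the backoff parameters and the flaky stub are illustrative, not part of the posting.

```python
# One reusable building block of an API integration framework: retry an
# outbound call with exponential backoff. Parameters are illustrative.
import time

def with_retries(fn, attempts=3, base_delay=0.01):
    """Call fn(); on exception, retry up to `attempts` times with backoff."""
    for i in range(attempts):
        try:
            return fn()
        except Exception:
            if i == attempts - 1:
                raise  # out of attempts: surface the error to the caller
            time.sleep(base_delay * (2 ** i))

# Demo with a stub that fails twice before succeeding (stands in for a
# transient network error against a remote API).
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient")
    return "ok"

print(with_retries(flaky))  # ok
```

Wrapping every adapter's outbound call in one policy like this keeps retry behavior consistent across the Salesforce, Oracle ERP, and POS integrations instead of being reimplemented per connector.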
Posted 2 weeks ago
10.0 - 14.0 years
0 Lacs
Coimbatore, Tamil Nadu
On-site
As a Data Engineering Lead/Architect with 10+ years of experience, you will play a crucial role in architecting and designing data solutions that meet business requirements effectively. You will collaborate with cross-functional teams to design scalable and efficient data architectures, models, and integration strategies, and provide technical leadership in implementing data pipelines, ETL processes, and data warehousing solutions.

Your expertise in Snowflake will be key to building and optimizing data warehouses. You will develop and maintain Snowflake data models and schemas, following best practices for cost analysis, resource allocation, and security configuration. Additionally, you will leverage Azure cloud services and the Databricks platform to manage and process large datasets efficiently, building and maintaining data pipelines on Azure services.

Implementing best practices for data warehousing, and ensuring data quality, consistency, and reliability, will be part of your responsibilities. You will create and manage data integration processes, including real-time and batch data movement between systems. Your mastery of complex SQL and PL/SQL will enable you to extract, transform, and load data effectively, and to optimize SQL queries and database performance for high-volume data processing.

Continuous monitoring and enhancement of data pipelines and data storage systems will be crucial for performance tuning and optimization. You will troubleshoot and resolve data-related issues to minimize downtime while documenting data engineering processes, data flows, and architectural decisions. Collaboration with data scientists, analysts, and stakeholders is essential to ensure data availability and usability, and you will implement data security measures and adhere to compliance standards to protect sensitive data.

In addition to your technical skills, you will be expected to drive data engineering strategy, engage in sales and proposal activities, develop customer relationships, lead a technical team, and mentor other team members. You should be able to clarify and translate customer requirements into Epics/Stories, removing ambiguity and aligning others to your ideas and solutions.

To qualify for this role, you should have a Bachelor's or Master's degree in Computer Science, Information Technology, or a related field, along with over 10 years of experience in Data Engineering with a strong focus on architecture. Proven expertise in Snowflake, Azure, and Databricks, comprehensive knowledge of data warehousing concepts, ETL processes, and data integration techniques, and exceptional SQL and PL/SQL skills are essential. Certifications in relevant technologies such as Snowflake and Azure are a plus. Strong problem-solving skills, the ability to work in a fast-paced, collaborative environment, and excellent communication skills to convey technical concepts to non-technical stakeholders are also required.
Posted 2 weeks ago
5.0 - 8.0 years
10 - 20 Lacs
Hyderabad
Work from Office
7+ years of experience as a Data Engineer or Snowflake Developer. Expert-level knowledge of SQL (joins, subqueries, CTEs). Experience with ETL tools (e.g., Informatica, Talend, Matillion). Experience with cloud platforms like AWS, Azure, or GCP.
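This listing's "expert-level SQL (joins, subqueries, CTEs)" is easiest to illustrate with a runnable CTE. The sketch below uses Python's built-in sqlite3 purely as a stand-in engine so the query can execute anywhere; the table and data are invented for the example, and the same WITH-clause shape applies in Snowflake.

```python
# Minimal runnable CTE (common table expression) example, executed with
# sqlite3 as a stand-in warehouse. Table and rows are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "south", 100.0), (2, "south", 50.0), (3, "north", 75.0)],
)

query = """
WITH region_totals AS (          -- CTE: name an intermediate result set
    SELECT region, SUM(amount) AS total
    FROM orders
    GROUP BY region
)
SELECT region, total
FROM region_totals               -- then query it like a table
ORDER BY total DESC
"""
print(conn.execute(query).fetchall())  # [('south', 150.0), ('north', 75.0)]
```

The CTE gives the aggregation a name that the outer query (or further CTEs) can reuse, which is what keeps complex warehouse queries readable compared with deeply nested subqueries.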
Posted 2 weeks ago
8.0 - 10.0 years
10 - 12 Lacs
Bengaluru
Work from Office
Role Purpose: The purpose of the role is to create exceptional architectural solution design and thought leadership, and to enable delivery teams to provide exceptional client engagement and satisfaction.

Do:
1. Develop architectural solutions for new deals / major change requests in existing deals:
- Create an enterprise-wide architecture that ensures systems are scalable, reliable, and manageable.
- Provide solutioning for RFPs received from clients and ensure overall design assurance.
- Develop a direction to manage the portfolio of to-be solutions, including systems, shared infrastructure services, and applications, to better match business outcome objectives.
- Analyse the technology environment, enterprise specifics, and client requirements to set a collaborative solution design framework/architecture.
- Provide technical leadership for the design, development, and implementation of custom solutions through thoughtful use of modern technology.
- Define and understand current-state solutions and identify improvements, options, and trade-offs to define target-state solutions.
- Clearly articulate, document, and sell architectural targets, recommendations, and reusable patterns, and accordingly propose investment roadmaps.
- Evaluate and recommend solutions to integrate with the overall technology ecosystem.
- Work closely with various IT groups to transition tasks, ensure performance, and manage issues through to resolution.
- Perform detailed documentation (app view, multiple sections and views) of the architectural design and solution, covering all artefacts in detail.
- Validate the solution/prototype from the technology, cost-structure, and customer-differentiation points of view.
- Identify problem areas, perform root cause analysis of architectural design and solutions, and provide relevant solutions.
- Collaborate with sales, program/project, and consulting teams to reconcile solutions to architecture.
- Track industry and application trends and relate these to planning current and future IT needs.
- Provide technical and strategic input during the project planning phase in the form of technical architectural designs and recommendations.
- Collaborate with all relevant parties to review the objectives and constraints of solutions and determine conformance with the Enterprise Architecture.
- Identify implementation risks and potential impacts.

2. Enable delivery teams by providing optimal delivery solutions/frameworks:
- Build and maintain relationships with executives, technical leaders, product owners, peer architects, and other stakeholders to become a trusted advisor.
- Develop and establish relevant technical, business process, and overall support metrics (KPI/SLA) to drive results.
- Manage multiple projects and accurately report the status of all major assignments while adhering to all project management standards.
- Identify technical, process, and structural risks and prepare a risk mitigation plan for all projects.
- Ensure quality assurance of all architecture or design decisions and provide technical mitigation support to the delivery teams.
- Recommend tools for reuse and automation for improved productivity and reduced cycle times.
- Lead the development and maintenance of the enterprise framework and related artefacts.
- Develop trust and build effective working relationships through respectful, collaborative engagement across individual product teams.
- Ensure architecture principles and standards are consistently applied to all projects.
- Ensure optimal client engagement: support the pre-sales team in presenting the solution design and its principles to the client; negotiate, manage, and coordinate with client teams to ensure all requirements are met and the proposed solution creates an impact; demonstrate thought leadership with strong technical capability in front of the client to win confidence and act as a trusted advisor.

3. Competency building and branding:
- Ensure completion of necessary trainings and certifications.
- Develop Proofs of Concept (POCs), case studies, demos, etc. for new growth areas based on market and customer research.
- Develop and present Wipro's point of view on solution design and architecture by writing white papers, blogs, etc.
- Attain market referenceability and recognition through the highest analyst rankings, client testimonials, and partner credits.
- Be the voice of Wipro's thought leadership by speaking in forums (internal and external).
- Mentor developers, designers, and junior architects on the project for their further career development and enhancement.
- Contribute to the architecture practice by conducting selection interviews, etc.

4. Team management:
- Resourcing: anticipate new talent requirements as per market/industry trends or client requirements; hire adequate and right resources for the team.
- Talent management: ensure adequate onboarding and training for team members to enhance capability and effectiveness; build an internal talent pool and ensure their career progression within the organization; manage team attrition; drive diversity in leadership positions.
- Performance management: set goals for the team, conduct timely performance reviews, and provide constructive feedback to direct reports; ensure that Performance Nxt is followed for the entire team.
- Employee satisfaction and engagement: lead and drive engagement initiatives for the team; track team satisfaction scores and identify initiatives to build engagement within the team.

Mandatory Skills: Adobe Launch and Analytics. Experience: 8-10 Years.
Posted 2 weeks ago
8.0 - 10.0 years
10 - 12 Lacs
Pune
Work from Office
Role Purpose: The purpose of the role is to create exceptional architectural solution design and thought leadership, and to enable delivery teams to provide exceptional client engagement and satisfaction.

Do:
1. Develop architectural solutions for new deals / major change requests in existing deals:
- Create an enterprise-wide architecture that ensures systems are scalable, reliable, and manageable.
- Provide solutioning for RFPs received from clients and ensure overall design assurance.
- Develop a direction to manage the portfolio of to-be solutions, including systems, shared infrastructure services, and applications, to better match business outcome objectives.
- Analyse the technology environment, enterprise specifics, and client requirements to set a collaborative solution design framework/architecture.
- Provide technical leadership for the design, development, and implementation of custom solutions through thoughtful use of modern technology.
- Define and understand current-state solutions and identify improvements, options, and trade-offs to define target-state solutions.
- Clearly articulate, document, and sell architectural targets, recommendations, and reusable patterns, and accordingly propose investment roadmaps.
- Evaluate and recommend solutions to integrate with the overall technology ecosystem.
- Work closely with various IT groups to transition tasks, ensure performance, and manage issues through to resolution.
- Perform detailed documentation (app view, multiple sections and views) of the architectural design and solution, covering all artefacts in detail.
- Validate the solution/prototype from the technology, cost-structure, and customer-differentiation points of view.
- Identify problem areas, perform root cause analysis of architectural design and solutions, and provide relevant solutions.
- Collaborate with sales, program/project, and consulting teams to reconcile solutions to architecture.
- Track industry and application trends and relate these to planning current and future IT needs.
- Provide technical and strategic input during the project planning phase in the form of technical architectural designs and recommendations.
- Collaborate with all relevant parties to review the objectives and constraints of solutions and determine conformance with the Enterprise Architecture.
- Identify implementation risks and potential impacts.

2. Enable delivery teams by providing optimal delivery solutions/frameworks:
- Build and maintain relationships with executives, technical leaders, product owners, peer architects, and other stakeholders to become a trusted advisor.
- Develop and establish relevant technical, business process, and overall support metrics (KPI/SLA) to drive results.
- Manage multiple projects and accurately report the status of all major assignments while adhering to all project management standards.
- Identify technical, process, and structural risks and prepare a risk mitigation plan for all projects.
- Ensure quality assurance of all architecture or design decisions and provide technical mitigation support to the delivery teams.
- Recommend tools for reuse and automation for improved productivity and reduced cycle times.
- Lead the development and maintenance of the enterprise framework and related artefacts.
- Develop trust and build effective working relationships through respectful, collaborative engagement across individual product teams.
- Ensure architecture principles and standards are consistently applied to all projects.
- Ensure optimal client engagement: support the pre-sales team in presenting the solution design and its principles to the client; negotiate, manage, and coordinate with client teams to ensure all requirements are met and the proposed solution creates an impact; demonstrate thought leadership with strong technical capability in front of the client to win confidence and act as a trusted advisor.

3. Competency building and branding:
- Ensure completion of necessary trainings and certifications.
- Develop Proofs of Concept (POCs), case studies, demos, etc. for new growth areas based on market and customer research.
- Develop and present Wipro's point of view on solution design and architecture by writing white papers, blogs, etc.
- Attain market referenceability and recognition through the highest analyst rankings, client testimonials, and partner credits.
- Be the voice of Wipro's thought leadership by speaking in forums (internal and external).
- Mentor developers, designers, and junior architects on the project for their further career development and enhancement.
- Contribute to the architecture practice by conducting selection interviews, etc.

4. Team management:
- Resourcing: anticipate new talent requirements as per market/industry trends or client requirements; hire adequate and right resources for the team.
- Talent management: ensure adequate onboarding and training for team members to enhance capability and effectiveness; build an internal talent pool and ensure their career progression within the organization; manage team attrition; drive diversity in leadership positions.
- Performance management: set goals for the team, conduct timely performance reviews, and provide constructive feedback to direct reports; ensure that Performance Nxt is followed for the entire team.
- Employee satisfaction and engagement: lead and drive engagement initiatives for the team; track team satisfaction scores and identify initiatives to build engagement within the team.

Mandatory Skills: Telco Solution Architecture. Experience: 8-10 Years.
Posted 2 weeks ago
10.0 - 12.0 years
12 - 14 Lacs
Chennai
Work from Office
Role Purpose: The purpose of the role is to create exceptional architectural solution design and thought leadership, and to enable delivery teams to provide exceptional client engagement and satisfaction.

Do
- Develop architectural solutions for new deals and major change requests in existing deals
- Create an enterprise-wide architecture that ensures systems are scalable, reliable, and manageable
- Provide solutioning for RFPs received from clients and ensure overall design assurance
- Develop a direction to manage the portfolio of to-be solutions, including systems, shared infrastructure services, and applications, in order to better match business outcome objectives
- Analyse the technology environment, enterprise specifics, and client requirements to set a collaboration solution design framework/architecture
- Provide technical leadership for the design, development, and implementation of custom solutions through thoughtful use of modern technology
- Define and understand current-state solutions and identify improvements, options, and tradeoffs to define target-state solutions
- Clearly articulate, document, and sell architectural targets, recommendations, and reusable patterns, and accordingly propose investment roadmaps
- Evaluate and recommend solutions to integrate with the overall technology ecosystem
- Work closely with various IT groups to transition tasks, ensure performance, and manage issues through to resolution
- Perform detailed documentation (app view, multiple sections and views) of the architectural design and solution, detailing all artefacts
- Validate the solution/prototype from a technology, cost structure, and customer differentiation point of view
- Identify problem areas, perform root cause analysis of architectural design and solutions, and provide relevant solutions
- Collaborate with sales, program/project, and consulting teams to reconcile solutions to the architecture
- Track industry and application trends and relate these to planning current and future IT needs
- Provide technical and strategic input during the project planning phase in the form of technical architectural designs and recommendations
- Collaborate with all relevant parties to review the objectives and constraints of solutions and determine conformance with the Enterprise Architecture
- Identify implementation risks and potential impacts

Enable Delivery Teams by providing optimal delivery solutions/frameworks
- Build and maintain relationships with executives, technical leaders, product owners, peer architects, and other stakeholders to become a trusted advisor
- Develop and establish relevant technical, business process, and overall support metrics (KPI/SLA) to drive results
- Manage multiple projects and accurately report the status of all major assignments while adhering to all project management standards
- Identify technical, process, and structural risks and prepare a risk mitigation plan for all projects
- Ensure quality assurance of all architecture and design decisions and provide technical mitigation support to the delivery teams
- Recommend tools for reuse and automation for improved productivity and reduced cycle times
- Lead the development and maintenance of enterprise frameworks and related artefacts
- Develop trust and build effective working relationships through respectful, collaborative engagement across individual product teams
- Ensure architecture principles and standards are consistently applied to all projects

Ensure optimal Client Engagement
- Support the pre-sales team while presenting the entire solution design and its principles to the client
- Negotiate, manage, and coordinate with client teams to ensure all requirements are met and create an impact with the proposed solution
- Demonstrate thought leadership with strong technical capability in front of the client to win confidence and act as a trusted advisor

Competency Building and Branding
- Ensure completion of necessary trainings and certifications
- Develop Proof of Concepts (POCs), case studies, demos, etc. for new growth areas based on market and customer research
- Develop and present Wipro's point of view on solution design and architecture by writing white papers, blogs, etc.
- Attain market referenceability and recognition through highest analyst rankings, client testimonials, and partner credits
- Be the voice of Wipro's thought leadership by speaking in forums (internal and external)
- Mentor developers, designers, and junior architects in the project for their further career development and enhancement
- Contribute to the architecture practice by conducting selection interviews, etc.

Team Management
- Resourcing: Anticipate new talent requirements as per market/industry trends or client requirements; hire adequate and right resources for the team
- Talent Management: Ensure adequate onboarding and training for team members to enhance capability and effectiveness; build an internal talent pool and ensure their career progression within the organization; manage team attrition; drive diversity in leadership positions
- Performance Management: Set goals for the team, conduct timely performance reviews, and provide constructive feedback to direct reports; ensure that Performance Nxt is followed for the entire team
- Employee Satisfaction and Engagement: Lead and drive engagement initiatives for the team; track team satisfaction scores and identify initiatives to build engagement within the team

Mandatory Skills: Mainframe. Experience: >10 years.
Posted 2 weeks ago
6.0 - 9.0 years
0 - 2 Lacs
Bengaluru
Work from Office
Manager - Data Engineer: Elevate Your Impact Through Innovation and Learning

Evalueserve is a global leader in delivering innovative and sustainable solutions to a diverse range of clients, including over 30% of Fortune 500 companies. With a presence in more than 45 countries across five continents, we excel in leveraging state-of-the-art technology, artificial intelligence, and unparalleled subject matter expertise to elevate our clients' business impact and strategic decision-making. Our team of over 4,500 talented professionals operates in countries such as India, China, Chile, Romania, the US, and Canada. Our global network also extends to emerging markets such as Colombia, the Middle East, and the rest of Asia-Pacific. Recognized by Great Place to Work in India, Chile, Romania, the US, and the UK in 2022, we offer a dynamic, growth-oriented, meritocracy-based culture that prioritizes continuous learning, skill development, and work-life balance.

About Corporate and Investment Banking & Investment Research (CIB & IR)
As a global leader in knowledge processes, research, and analytics, you'll be working with a team that specializes in global market research, working with the top-rated investment research organizations, bulge-bracket investment banks, and leading asset managers. We cater to 8 of the top 10 global banks, working alongside their product and sector teams, supporting them on deal origination, execution, valuation, and transaction advisory related projects.

What you will be doing at Evalueserve
- Construct analytical dashboards from alternative data use cases, such as sector or thematic and financial KPI dashboards.
- Load and import data into internal warehouses through Azure Blob Storage and/or S3 deliveries, SFTP, and other ingestion mechanisms.
- Design and implement ETL workflows for preprocessing of transactional and aggregated datasets, including complex joins, window functions, aggregations, bins, and partitions.
- Manipulate and enhance time-series datasets into relational data stores.
- Implement and refine panels in transactional datasets and relevant panel normalization.
- Conduct web scraping, extraction, and post-processing of numerical data from web-based datasets.

What we're looking for
- Previous experience working within fundamental equity investment workflows, such as exposure to financial modeling.
- High proficiency in SQL and the Python data stack (pandas, numpy, sklearn).
- Experience working with scheduling and execution platforms, such as Airflow, Prefect, or similar scheduled DAG frameworks.
- Understanding of efficient query management in Snowflake, Databricks, or equivalent platforms.
- Optional familiarity with automation of workflows that produce Excel outputs, such as through openpyxl.
- Optional familiarity with integrations and import/exports to REST/gRPC/GraphQL APIs.

Security: This role is performed in a dedicated, secure workspace.
Travel: Annual travel to the U.S. for onsite collaboration is expected.

Follow us on https://www.linkedin.com/company/evalueserve/
Click here to learn more about what our leaders say about achievements such as our AI-powered supply chain optimization solution built on Google Cloud, how Evalueserve is leveraging NVIDIA NIM to enhance our AI and digital transformation solutions and accelerate AI capabilities, and how Evalueserve has climbed 16 places on the 50 Best Firms for Data Scientists in 2024!
Want to learn more about our culture and what it's like to work with us? Write to us at: careers@evalueserve.com

Disclaimer: The following job description serves as an informative reference for the tasks you may be required to perform. However, it does not constitute an integral component of your employment agreement and is subject to periodic modifications to align with evolving circumstances.
Please Note: We appreciate the accuracy and authenticity of the information you provide, as it plays a key role in your candidacy. As part of the Background Verification Process, we verify your employment, education, and personal details. Please ensure all information is factual and submitted on time. For any assistance, your TA SPOC is available to support you.
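The ETL preprocessing duties this role describes (complex joins, window functions, aggregations, partitions) can be sketched with the stdlib `sqlite3` module; the `transactions` table and its columns are invented for illustration and are not part of the posting.

```python
import sqlite3

# Build a small in-memory transactional dataset (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE transactions (txn_id INTEGER, account TEXT, amount REAL, txn_date TEXT);
    INSERT INTO transactions VALUES
        (1, 'A', 100.0, '2024-01-01'),
        (2, 'A', 250.0, '2024-01-02'),
        (3, 'B',  75.0, '2024-01-01');
""")

# Window function: running total per account, ordered by date.
rows = conn.execute("""
    SELECT account,
           txn_date,
           amount,
           SUM(amount) OVER (PARTITION BY account ORDER BY txn_date) AS running_total
    FROM transactions
    ORDER BY account, txn_date
""").fetchall()

# Plain aggregation as the rollup step.
totals = dict(conn.execute(
    "SELECT account, SUM(amount) FROM transactions GROUP BY account"
).fetchall())

print(rows[1])   # ('A', '2024-01-02', 250.0, 350.0)
print(totals)    # {'A': 350.0, 'B': 75.0}
```

In the actual role the same shape of query would run against Snowflake or Databricks rather than SQLite, but `PARTITION BY` / `GROUP BY` semantics carry over.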
Posted 2 weeks ago
8.0 - 13.0 years
27 - 32 Lacs
Bengaluru
Work from Office
Job Title: Senior Data & Analytics Engineer | 8+ years | (Python / R / Java, SQL / Oracle / MySQL / PostgreSQL, Tableau / Power BI, Git / SVN, Snowflake / Hadoop / Spark)

Meet the team: Provider Mobility is the engineering team behind Cisco's Control Center Industrial IoT (Internet of Things) product. Control Center has 37 service providers, 31 thousand enterprise customers, and over 275 million end users. The Data and Product Analytics team provides data-driven insights for senior leaders and SLA reporting for Cisco's service providers. As a Senior Data & Analytics Engineer you will design, build, and maintain scalable data pipelines and robust analytics frameworks to support Provider Mobility's business objectives. You will work with a team of data engineers and collaborate with cross-functional teams to ensure seamless integration of data-driven insights into decision-making processes and customer SLA readouts.

Your Impact
- Be responsible for the design, construction, and maintenance of scalable data pipelines and data storage architecture to support the generation of business metrics.
- Collaborate with cross-functional teams to understand metric requirements, identify new data sources, and deliver actionable insights.
- Ensure data architecture performance, data security, and compliance with relevant regulations.
- Be responsible for dashboards, SLA reports, and visualizations to communicate metrics to executives and customers.
- Mentor and guide data engineers and team members, leading technical discussions and contributing to strategic planning within the data analytics team.
- Stay up to date with industry trends and emerging technologies in data analytics and data engineering.

Minimum Qualifications:
- Bachelor's degree in Computer Science, Data Science, Statistics, or a related field.
- 8+ years of experience in data analysis, data pipeline engineering, or a related role.
- Strong programming skills in languages such as Python, R, or Java.
- Proficiency in SQL and experience with relational databases (e.g., Oracle, MySQL, PostgreSQL).
- Experience with data visualization tools (e.g., Tableau, Power BI).
- Experience with agile software development and version control systems (e.g., Git).

Preferred Qualifications:
- Familiarity with big data technologies (e.g., Snowflake, Hadoop, Spark) is a plus.
- Excellent problem-solving skills and attention to detail.
- Strong communication skills and the ability to work collaboratively in a team environment.
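The SLA reporting this role owns can be illustrated with a minimal attainment calculation; the 99.95% target, the provider names, and the record layout below are hypothetical, invented purely for the sketch.

```python
from collections import defaultdict

# Hypothetical monthly uptime observations per service provider.
events = [
    {"provider": "SP1", "month": "2024-01", "uptime_pct": 99.99},
    {"provider": "SP1", "month": "2024-02", "uptime_pct": 99.90},
    {"provider": "SP2", "month": "2024-01", "uptime_pct": 99.95},
]
SLA_TARGET = 99.95  # assumed contractual threshold, not from the posting

# Collect per-provider pass/fail flags against the target.
met = defaultdict(list)
for e in events:
    met[e["provider"]].append(e["uptime_pct"] >= SLA_TARGET)

# Attainment = fraction of reported months meeting the SLA.
attainment = {p: sum(v) / len(v) for p, v in met.items()}
print(attainment)  # {'SP1': 0.5, 'SP2': 1.0}
```

A production version of this metric would read from the warehouse and feed a Tableau or Power BI dashboard rather than printing a dict.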
Posted 2 weeks ago
3.0 - 8.0 years
4 - 9 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Experience in Snowflake support; strong SQL; experience working with SAP OTC/Finance/WM. Familiarity with cloud platforms and data warehousing principles. Familiarity with data integration and ETL processes.
Posted 2 weeks ago
5.0 - 10.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Years of experience: 5+ years relevant. Detailed job description / skill set: attached.
Mandatory skills: Design, build, and testing of a new Contractor Management Dashboard portal for the program and assets, utilizing current MyPass and other Snowflake tables.
Good-to-have skills: Snowflake table review/optimization for MyPass data; user training, feedback, and refinement.
Posted 2 weeks ago
4.0 - 8.0 years
13 - 17 Lacs
Bengaluru
Work from Office
Project description We are more than a diversified global financial markets infrastructure and data business. We are dedicated, open-access partners with a commitment to excellence in delivering the services our customers expect from us. With extensive experience, deep knowledge and worldwide presence across financial markets, we enable businesses and economies around the world to fund innovation, manage risk and create jobs. It's how we've contributed to supporting the financial stability and growth of communities and economies globally for more than 300 years. Through a comprehensive suite of trusted financial market infrastructure services - and our open-access model - we provide the flexibility, stability and trust that enable our customers to pursue their ambitions with confidence and clarity. We are headquartered in the United Kingdom, with significant operations in 70 countries across EMEA, North America, Latin America and Asia Pacific. We employ 25,000 people globally, more than half located in Asia Pacific. Responsibilities As a Senior Quality Assurance Engineer, you will be responsible for ensuring the quality and reliability of complex data-driven systems, with a focus on financial services applications. You will work closely with Data Engineers, Business Analysts, and Developers across global teams to validate functionality, accuracy, and performance of software solutions, particularly around data migration from on-premises to cloud platforms. 
Key responsibilities include:
- Leading and executing end-to-end test plans, including functional, unit, regression, and back-to-back testing
- Designing test strategies for data migration projects, with a strong focus on Oracle-to-cloud transitions
- Verifying data accuracy and transformation logic across multiple environments
- Writing Python-based automated test scripts and utilities for validation
- Participating in Agile ceremonies, collaborating closely with cross-functional teams
- Proactively identifying and documenting defects, inconsistencies, and process improvements
- Contributing to continuous testing and integration practices
- Ensuring traceability between requirements, test cases, and delivered code

Skills

Must have
The ideal candidate must demonstrate strong experience (minimum 7 years) and hands-on expertise in the following areas:
- Data Testing (Oracle to Cloud Migration): Deep understanding of testing strategies related to large-scale data movement and transformation validation between legacy on-premise systems and modern cloud platforms.
- Python Scripting: Proficient in using Python for writing automated test scripts and tools to streamline testing processes.
- Regression Testing: Proven ability to develop and manage comprehensive regression test suites ensuring consistent software performance over releases.
- Back-to-Back Testing: Experience in comparing results between old and new systems or components to validate data integrity post-migration.
- Functional Testing: Skilled in verifying system behavior against functional requirements in a business-critical environment.
- Unit Testing: Capable of writing and executing unit tests for small code components to ensure correctness at the foundational level.

Nice to have
While not required, the following skills would be a strong plus and would enhance your effectiveness in the role:
- Advanced Python Development: Experience in building complex QA tools or contributing to CI/CD pipelines using Python.
- DBT (Data Build Tool): Familiarity with DBT for transformation testing and documentation in data engineering workflows.
- Snowflake: Exposure to the Snowflake cloud data warehouse and understanding of its testing and validation mechanisms.
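The back-to-back testing this role centers on can be sketched as a row-hash comparison between a legacy (Oracle-side) extract and its migrated cloud copy; the data, key layout, and hash recipe below are invented for illustration.

```python
import hashlib

def row_digest(row):
    """Stable content hash for one row; '|' join is an assumed convention."""
    return hashlib.sha256("|".join(map(str, row)).encode()).hexdigest()

# Hypothetical extracts keyed by the first column (primary key).
legacy   = [(1, "alice", 100.0), (2, "bob", 250.0), (3, "carol", 75.0)]
migrated = [(1, "alice", 100.0), (2, "bob", 999.0), (3, "carol", 75.0)]

legacy_hashes   = {r[0]: row_digest(r) for r in legacy}
migrated_hashes = {r[0]: row_digest(r) for r in migrated}

# Any key whose hash differs (or is missing) diverged during migration.
mismatches = [k for k in legacy_hashes
              if migrated_hashes.get(k) != legacy_hashes[k]]
print(mismatches)  # [2] -- row 2 diverged post-migration
```

Real suites would stream hashes from both databases instead of in-memory lists, and would also check row counts and missing/extra keys in each direction.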
Posted 2 weeks ago
7.0 - 12.0 years
13 - 17 Lacs
Bengaluru
Work from Office
Project description
Luxoft has been asked to contract a developer in support of a number of customer initiatives. The primary objective is to develop based on client requirements in the telecom/network work environment.

Responsibilities
A Data Engineer with experience in the following technologies:
- Databricks and Azure: Apache Spark-based, hands-on Python, SQL, Apache Airflow
- Databricks clusters for ETL processes; integration with ADLS and Blob Storage
- Efficiently ingest data from various sources, including on-premises databases, cloud storage, APIs, and streaming data
- Use Azure Key Vault for managing secrets
- Hands-on experience working with APIs
- Hands-on experience with Kafka/Azure Event Hub streaming
- Hands-on experience with Databricks Delta APIs and Unity Catalog
- Hands-on experience working with version control tools (GitHub)
- Data analytics: support for various ML frameworks; integration with Databricks for model training
- On-prem exposure: Linux-based systems, Unix scripting

Skills

Must have
Python, Apache Airflow, Microsoft Azure and Databricks, SQL, Databricks clusters for ETL, ADLS, Blob Storage, ingestion from various sources including databases and cloud storage, APIs and streaming data, Kafka/Azure Event Hub, Databricks Delta APIs and Unity Catalog.
- Education: Typically, a Bachelor's degree in Computer Science (preferably an M.Sc. in Computer Science), Software Engineering, or a related field is required.
- Experience: 7+ years of experience in development or related fields.
- Problem-Solving Skills: Ability to troubleshoot and resolve issues related to application development and deployment.
- Communication Skills: Ability to effectively communicate technical concepts to team members and stakeholders, in both written and verbal form.
- Teamwork: Ability to work effectively in teams with diverse individuals and skill sets.
- Continuous Learning: Given the rapidly evolving nature of web technologies, a commitment to learning and adapting to new technologies and methodologies is crucial.

Nice to have
- Snowflake, PostgreSQL, Redis exposure
- GenAI exposure
- Good understanding of RBAC
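The ingestion work described above commonly reduces to watermark-based incremental loading, sketched here framework-agnostically; in a real Databricks/Airflow pipeline the watermark would live in a metadata table and the source would be ADLS, Blob Storage, or an Event Hub rather than an in-memory list. All names below are invented.

```python
from datetime import datetime

# Hypothetical change-tracked source records.
source = [
    {"id": 1, "updated": "2024-01-01T00:00:00"},
    {"id": 2, "updated": "2024-01-02T00:00:00"},
    {"id": 3, "updated": "2024-01-03T00:00:00"},
]

def incremental_load(records, watermark):
    """Return records strictly newer than the watermark, plus the new watermark."""
    new = [r for r in records if datetime.fromisoformat(r["updated"]) > watermark]
    new_watermark = max(
        (datetime.fromisoformat(r["updated"]) for r in new), default=watermark
    )
    return new, new_watermark

# Only records after the last successful run are picked up.
batch, wm = incremental_load(source, datetime(2024, 1, 1))
print([r["id"] for r in batch])  # [2, 3]
```

The same pattern is what Delta Lake's merge/append workflows implement at scale: read past the last committed offset or timestamp, process, then advance the marker atomically.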
Posted 2 weeks ago
5.0 - 10.0 years
14 - 19 Lacs
Bengaluru
Work from Office
Project description
We are seeking a highly skilled and motivated Data Scientist with 5+ years of experience to join our team. The ideal candidate will bring strong data science, programming, and data engineering expertise, along with hands-on experience in generative AI, large language models, and modern LLM application frameworks. This role also demands excellent communication and stakeholder management skills to collaborate effectively across business units.

Skills

Must have
- Experience: 5+ years of industry experience as a Data Scientist, with a proven track record of delivering impactful, data-driven solutions.
- Programming Skills: Advanced proficiency in Python, with extensive experience writing clean, efficient, and maintainable code. Proficiency with version control tools such as Git.
- Data Engineering: Strong working proficiency with SQL and distributed computing with Apache Spark.
- Cloud Platforms: Experience building and deploying apps on Azure Cloud.
- Generative AI & LLMs: Practical experience with large language models (e.g., OpenAI, Anthropic, HuggingFace). Knowledge of Retrieval-Augmented Generation (RAG) techniques and prompt engineering is expected.
- Machine Learning & Modeling: Strong grasp of statistical modeling, machine learning algorithms, and tools like scikit-learn, XGBoost, etc.
- Stakeholder Engagement: Excellent communication skills with a demonstrated ability to interact with business stakeholders, understand their needs, present technical insights clearly, and drive alignment across teams.
- Tools and Libraries: Proficiency with libraries like Pandas and NumPy, and ML lifecycle tools such as MLflow.
- Team Collaboration: Proven experience contributing to agile teams and working cross-functionally in fast-paced environments.

Nice to have
- Hands-on experience with Databricks and Snowflake.
- Hands-on experience building LLM-based applications using agentic frameworks like LangChain, LangGraph, and AutoGen.
- Familiarity with data visualization platforms such as Power BI, Tableau, or Plotly.
- Front-end/full-stack development experience.
- Exposure to MLOps practices and model deployment pipelines in production.
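The Retrieval-Augmented Generation (RAG) technique the posting expects can be illustrated with a toy retrieval step: score documents against the query, then prepend the best match to the prompt. Production systems use embeddings and a vector store; the documents and the term-overlap scoring here are simplified stand-ins.

```python
# Tiny hypothetical document corpus.
docs = [
    "Snowflake is a cloud data warehouse.",
    "Spark enables distributed computing over large datasets.",
    "MLflow tracks experiments across the ML lifecycle.",
]

def retrieve(query, documents):
    """Pick the document sharing the most terms with the query (toy scorer)."""
    q = set(query.lower().split())
    return max(documents,
               key=lambda d: len(q & set(d.lower().rstrip(".").split())))

# Retrieval step: ground the prompt in the best-matching document.
question = "which tool tracks ML experiments"
context = retrieve(question, docs)
prompt = f"Answer using this context: {context}\n\nQuestion: {question}"
print(context)  # MLflow tracks experiments across the ML lifecycle.
```

In a real pipeline `retrieve` would be a vector-similarity search, and `prompt` would be sent to an LLM; the structure (retrieve, augment, generate) is the same.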
Posted 2 weeks ago
3.0 - 8.0 years
11 - 15 Lacs
Bengaluru
Work from Office
Project description
We are DXC Luxoft Financial Services, an award-winning provider of technology solutions dedicated to the financial services sector. Join our international team and become a member of our open-minded, progressive, and professional team of financial services consultants. In this role you will be working on projects for the biggest investment banks across the globe. You will have a chance to grow your technical and soft skills and build thorough expertise in the capital markets industry. On top of an attractive salary and benefits package, we will invest in your professional training and allow you to grow your professional career.

Responsibilities
We are seeking a Data Engineer who can design and build solutions for our team with minimum supervision. The person needs to be knowledgeable about modern software development practices, including Scrum, be up to date with the latest database tools and frameworks, and be able to work with the existing team to deliver solutions. The ideal candidate will be diligent, tenacious, and delivery-focused, with a track record of building high-quality solutions that are resilient and maintainable. Use of cloud technology is increasing, so experience working with AWS/Snowflake would be beneficial.

Skills

Must have
- Proven experience as a Senior Software Engineer with extensive experience in software development (at least 7 years' total experience)
- Proven experience with DBT (3+ years)
- Proven experience working with Oracle PL/SQL, XML, and JSON
- Strong understanding of ETL and core DWH principles
- Strong understanding of sharding and clustering
- Optimization of dynamic SQL (4+ years)
- Good knowledge of T-SQL (4+ years)

Other
- Strong analytical skills
- BS/MS degree in Computer Science, Engineering, or a related field, or equivalent work experience
- Good communication skills
- Able to work in a challenging, fast-paced environment

Nice to have
- Snowflake experience
- Financial domain knowledge
- Experience in a market-data environment is a plus
- Proven experience working with Java and Snowflake
Posted 2 weeks ago
5.0 - 10.0 years
12 - 16 Lacs
Bengaluru
Work from Office
Project description
The Institutional Banking Data Platform (IDP) is a state-of-the-art cloud platform engineered to streamline the data ingestion, transformation, and data distribution workflows that underpin Regulatory Reporting, Market Risk, Credit Risk, Quants, and Trader Surveillance. In your role as Software Engineer, you will be responsible for ensuring the stability of the platform, performing maintenance and support activities, and driving innovative process improvements that add significant business value.

Responsibilities
- Problem solving: advanced analytical and problem-solving skills to analyse complex information for key insights and present it as meaningful information to senior management
- Communication: excellent verbal and written communication skills, with the ability to lead discussions with varied stakeholders across levels
- Risk mindset: you are expected to proactively identify, understand, openly discuss, and act on current and future risks

Skills

Must have
- Bachelor's degree in Computer Science, Engineering, or a related field/experience.
- 5+ years of proven experience as a Software Engineer or in a similar role, with a strong track record of successfully maintaining and supporting complex applications.
- Strong hands-on experience with Ab Initio GDE, including Express>It, Control Center, and Continuous>Flow.
- Should have handled and worked with XML, JSON, and Web APIs.
- Strong hands-on experience in SQL.
- Hands-on experience in a shell scripting language.
- Experience with batch- and streaming-based integrations.

Nice to have
- Knowledge of CI/CD tools such as TeamCity, Artifactory, Octopus, Jenkins, SonarQube, etc.
- Knowledge of AWS services including EC2, S3, CloudFormation, CloudWatch, RDS, and others.
- Knowledge of Snowflake and Apache Kafka is highly desirable.
- Experience with configuration management and infrastructure-as-code tools such as Ansible, Packer, and Terraform.
- Experience with monitoring and observability tools like Prometheus/Grafana.
Posted 2 weeks ago
6.0 - 11.0 years
11 - 16 Lacs
Gurugram
Work from Office
Project description
We are looking for a star Python Developer who is not afraid of work and challenges! Having gladly become a partner of a famous financial institution, we are gathering a team of professionals with a wide range of skills to successfully deliver business value to the client.

Responsibilities
- Analyse existing SAS DI pipelines and SQL-based transformations.
- Translate and optimize SAS SQL logic into Python code using frameworks such as PySpark.
- Develop and maintain scalable ETL pipelines using Python on AWS EMR.
- Implement data transformation, cleansing, and aggregation logic to support business requirements.
- Design modular and reusable code for distributed data processing tasks on EMR clusters.
- Integrate EMR jobs with upstream and downstream systems, including AWS S3, Snowflake, and Tableau.
- Develop Tableau reports for business reporting.

Skills

Must have
- 6+ years of experience in ETL development, with at least 5 years working with AWS EMR.
- Bachelor's degree in Computer Science, Data Science, Statistics, or a related field.
- Proficiency in Python for data processing and scripting.
- Proficiency in SQL and experience with one or more ETL tools (e.g., SAS DI, Informatica).
- Hands-on experience with AWS services: EMR, S3, IAM, VPC, and Glue.
- Familiarity with data storage systems such as Snowflake or RDS.
- Excellent communication skills and ability to work collaboratively in a team environment.
- Strong problem-solving skills and ability to work independently.

Nice to have
N/A
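The SAS-to-Python translation this role performs can be sketched for a simple PROC SQL-style aggregation; the actual project would use PySpark on EMR, and the `orders` data and column names below are invented for the example.

```python
from collections import defaultdict

# Plain-Python equivalent of the SAS statement:
#   PROC SQL; SELECT region, SUM(sales) FROM orders GROUP BY region; QUIT;
orders = [
    {"region": "east", "sales": 120.0},
    {"region": "west", "sales": 80.0},
    {"region": "east", "sales": 30.0},
]

totals = defaultdict(float)
for o in orders:
    totals[o["region"]] += o["sales"]

print(dict(totals))  # {'east': 150.0, 'west': 80.0}
```

In PySpark the same logic becomes `df.groupBy("region").agg(F.sum("sales"))`, which distributes the aggregation across the EMR cluster instead of a single process.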
Posted 2 weeks ago
8.0 - 13.0 years
22 - 27 Lacs
Bengaluru
Work from Office
MEET THE TEAM
We are a high-performing analytics team on a mission to embed AI into every layer of our decision-making. By applying machine learning, large language models (LLMs), and automation, we're redefining how data powers innovation and efficiency at scale.

YOUR IMPACT
As the AI Lead, you'll architect AI-powered BI systems integrating Snowflake, Tableau, and natural language interfaces. You'll transform ETL workflows with ML automation and build multi-agent AI systems to supervise KPIs, generate dashboards, and drive insights in real time.

MINIMUM QUALIFICATIONS
- Proven expertise in building AI apps (LLMs, RAG, NLP, deep learning).
- Advanced Python, SQL (Snowflake), and Tableau dashboarding.
- Experience with ML pipelines and GenAI frameworks (LangChain, Hugging Face).
- Ability to lead workshops and PoCs, and to mentor teams.
- Close collaboration with AI/engineering/product teams.
- Continuous learning mindset with industry awareness.

BASIC QUALIFICATIONS
- Bachelor's/Master's in CS, AI, or a related field.
- 10+ years in BI/Analytics; 3+ in AI/ML roles.
- Expertise in LLMs (OpenAI, Claude, Llama), RAG, and agent-based systems.
- Strong MLOps skills: model deployment, lifecycle management.
- Leadership in cross-functional teams and innovation delivery.

PREFERRED QUALIFICATIONS
- Experience with multi-agent AI in BI, Snowflake Cortex, and MLOps tools like Kubeflow.
- Proven track record integrating AI with enterprise BI platforms for business impact.
Posted 2 weeks ago
7.0 - 10.0 years
18 - 30 Lacs
Hyderabad
Work from Office
Power BI Reporting Analyst with hands-on experience in ETL tools, DWH, SQL, and Snowflake. Excellent SQL skills. Good ETL knowledge: hands-on experience in an ETL tool such as SSIS/Informatica/Talend. Banking knowledge (understanding of deposits and loans).
Posted 2 weeks ago
5.0 - 7.0 years
5 - 9 Lacs
India, Bengaluru
Work from Office
Hello Visionary! We empower our people to stay resilient and relevant in a constantly evolving world. We're looking for people who are always searching for creative ways to grow and learn, people who want to make a real impact, now and in the future. Does that sound like you? Then it seems like you'd make a great addition to our vibrant team. We are looking for a Snowflake Engineer.

Before our software developers write even a single line of code, they have to understand what drives our customers. What is the environment? What is the user story based on? Implementation means trying, testing, and improving outcomes until a final solution emerges. Knowledge means exchange: discussions with colleagues from all over the world. Join our Digitalization Technology and Services (DTS) team based in Bangalore.

You'll make a difference by:
- Developing and delivering parts of a product in accordance with the customers' requirements and organizational quality norms.
- Communicating within the team as well as with all stakeholders.
- Maintaining a strong customer focus; being a good learner, highly proactive, and a team player.
- Implementing features and/or bug fixes and delivering solutions in accordance with coding guidelines, on time and with high quality.
- Identifying and implementing a test strategy to ensure the solution addresses customer requirements and that the quality and security requirements of the product are met.

Job Requirements/Skills:
- 5-7 years' work experience in software engineering, especially in professional software product development.
- Strong knowledge of Snowflake, databases, and tools.
- Strong knowledge of data warehousing, data visualization, BI, ETL, and analytics.
- Strong knowledge of RDBMS, stored procedures, and triggers.
- Strong knowledge of DBT.
- Basic knowledge of AWS services.
- Knowledge of a programming language like Python or Java.
- Basic experience with Agile/Lean and SAFe practices is preferred.

Create a better #TomorrowWithUs!

This role is based in Bangalore, where you'll get the chance to work with teams impacting entire cities, countries, and the craft of things to come. We're Siemens: a collection of over 312,000 minds building the future, one day at a time, in over 200 countries. All employment decisions at Siemens are based on qualifications, merit, and business need. Bring your curiosity and creativity and help us craft tomorrow. At Siemens, we are always challenging ourselves to build a better future. We need the most innovative and diverse digital minds to develop tomorrow's reality. Find out more about the digital world of Siemens here: /digitalminds
Posted 2 weeks ago