5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
VAM Systems is a Business Consulting, IT Solutions, and Services company currently seeking a Data Engineering Analyst for our Bahrain operations. As a Data Engineering Analyst, you will support the finance team on data and analytics activities and the Data Warehouse (DWH), drawing on your deep knowledge of banking, financial reporting, and data engineering.

To qualify for this role, you must have a Bachelor's Degree in Engineering (B.E.) or an MCA, along with a certification in SQL/SAS, and 5-8 years of experience in the field. Your key objectives will include understanding finance and risk reporting systems and workflows, participating in system implementation, and applying hands-on experience with MS Excel. Project management and stakeholder management skills are desirable for this position.

Your responsibilities will involve coordinating with the finance business partner to support daily finance data analysis, hierarchical mappings, and root-cause analysis of identified data issues. You will also ensure accurate and reconciled reporting, conduct data quality reviews, and support the finance team with ad-hoc requests and organizing data for financial and regulatory reports. You will play a crucial role in maintaining the consistency of the bank's data architecture, data flows, and business logic, working closely with Finance and Data Engineering teams to identify issues and develop sustainable data-driven solutions. Your expertise in writing and documenting complex SQL queries, procedures, and functions will be essential in automating important financial interactions and data controls. Experience in handling SAS ETL jobs, data transformation, validation, analysis, and performance tuning is required for this role.

Strong experience in SAS Management Console, SAS DI, SAS Enterprise Guide, Base SAS, SAS Web Report Studio, SAS Delivery Portal, SAS OLAP Cube Studio, SAS Information Maps, SAS BI, SAS Stored Processes, and SAS Datasets & Libraries is also expected. If you are interested in this opportunity, please send your latest resume to ashiq.salahudeen@vamsystems.com. The joining time frame is 15-30 days; selected candidates will join VAM Systems Bahrain and be deputed to one of the leading banks in Bahrain.
Posted 1 month ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Contact Centre Data and AI Architect, you will leverage your deep knowledge of Genesys (including InfoMart, UCS, GAAP, SpeechMiner, WFM, and Routing) and the Five9 data ecosystem. Your expertise will be crucial in supporting the migration of contact centres to CCaaS and developing data solutions around this migration.

Your role will involve defining a data strategy to migrate historical and real-time data to the cloud while ensuring minimal disruption to downstream systems. You will need to understand the data models for Genesys (InfoMart, SpeechMiner) and create mappings to their cloud equivalents. Collaboration with data architects will be essential to ensure reporting continuity and regulatory compliance. Furthermore, you will evaluate and recommend data architectures that minimize the impact on downstream systems during the migration to CCaaS.

Your expertise will be pivotal in identifying opportunities to enhance customer experience through AI technologies such as intent detection, virtual assistants, predictive routing, and sentiment analysis. Collaboration with internal AI/ML teams is crucial to define data-driven automation use cases and implementation strategies. Working closely with IT, customer service operations, business teams, and external CCaaS vendors will be part of your daily responsibilities. Additionally, you will participate in vendor evaluations and selection processes for CCaaS and AI solutions. Your role will also involve interviewing various business stakeholders to understand their contact centre needs and assess the impact on downstream applications.

This position requires a high level of expertise in contact centre data and AI technologies, strong collaboration skills, and the ability to drive successful migration strategies.
Posted 1 month ago
7.0 - 11.0 years
0 Lacs
hyderabad, telangana
On-site
The role of Data Lead at LERA Technologies involves owning the data strategy, architecture, and engineering roadmap for key client engagements. As a Data Lead, you will lead the design and development of scalable, secure, and high-performance data pipelines, marts, and warehouses. Additionally, you will mentor a team of data engineers and collaborate with BI/reporting teams and solution architects. Your responsibilities will include overseeing data ingestion, transformation, consolidation, and validation across cloud and hybrid environments. It is essential to champion best practices for data quality, data lineage, and metadata management. You will also be expected to evaluate emerging tools, technologies, and frameworks to enhance platform capabilities, and to engage with business and technical stakeholders to translate analytics needs into scalable data solutions. Monitoring performance and optimizing storage and processing layers for efficiency and scalability are key aspects of this role.

The ideal candidate should have at least 7 years of experience in Data Engineering, including proficiency in SQL/PLSQL/TSQL, ETL development, and data pipeline architecture. A strong command of ETL tools such as SSIS (or equivalent) and of Data Warehousing concepts is required. Expertise in data modeling, architecture, and integration frameworks is essential, along with experience leading data teams and managing end-to-end data delivery across projects. Hands-on knowledge of BI tools such as Power BI, Tableau, SAP BO, or OBIEE and their backend integration is a must. Proficiency in big data technologies and cloud platforms such as Azure, AWS, or GCP is also necessary, as are programming experience in Python, Java, or equivalent languages and proven experience in performance tuning and optimization of large datasets.

A strong understanding of data governance, data security, and compliance best practices is required, along with excellent communication, stakeholder management, and team mentoring abilities. Desirable skills include leadership experience in building and managing high-performing data teams; exposure to data mesh, data lakehouse architectures, or modern data platforms; experience defining and enforcing data quality and lifecycle management practices; and familiarity with CI/CD for data pipelines and infrastructure-as-code.

At LERA Technologies, you will have the opportunity to embrace innovation, creativity, and experimentation while significantly impacting our clients' success across various industries. You will thrive in a workplace that values diversity and inclusive excellence, benefit from extensive opportunities for career advancement, and lead cutting-edge projects with an agile and visionary team. If you are ready to lead data-driven transformation and shape the future of enterprise data, apply now to join LERA Technologies as a Data Lead.
Posted 1 month ago
7.0 - 12.0 years
0 Lacs
haryana
On-site
As a Senior Enterprise Architect with 7-12 years of experience in ML & AI, you will be based in Gurgaon with frequent travel to Dusseldorf, Germany. Reporting to the Head of Operations & Digital, you will be instrumental in driving the digital transformation of a global sourcing business. Your role will involve designing and implementing scalable, data-powered technology solutions to optimize core business operations, particularly in sourcing, merchandising, and supply chain.

Your responsibilities will include defining and executing a digital transformation roadmap aligned with business objectives. You will design enterprise-grade, scalable technology solutions focusing on predictive analytics, supply chain agility, and operational optimization. Collaboration with department heads and executive stakeholders to identify operational challenges and co-create technology-driven solutions will be essential. You will architect modern data ecosystems and AI/ML pipelines to enable real-time insights into inventory management, demand forecasting, and trend analysis. Additionally, you will act as a strategic bridge between the technology team and core business functions, overseeing the development, deployment, and continuous improvement of digital platforms and business intelligence tools. Monitoring industry trends, evaluating emerging technologies, and implementing automation and AI-enabled capabilities for continuous improvement will also be part of your responsibilities.

Your technical expertise should include proficiency in cloud platforms (AWS, Azure, GCP), enterprise integration, and modern architecture frameworks; data and analytics tools such as SQL, Python, R, Power BI, Tableau, or Looker; and a deep understanding of automation, predictive analytics, and secure system design.

The ideal candidate will have a Bachelor's degree in Computer Science, Engineering, Data Science, or a related field, with a preference for a Master's degree or certifications in enterprise architecture or cloud platforms (e.g., TOGAF, AWS, Azure). You should have 7-12 years of experience in enterprise architecture, digital transformation, and large-scale technology program delivery. Strong hands-on expertise in cloud technologies, data architecture, and AI/ML-based systems is required. Leadership capabilities, excellent communication and interpersonal skills, and the ability to influence global teams and drive enterprise-wide change are essential, as is a willingness to travel frequently to the organization's European headquarters and other locations as needed.

The hiring company is part of a globally influential platform for design-led consumer goods sourcing, operating across a network of over 75 offices and more than 50 factories. With a strategic focus on digitalization and operational excellence, the organization is building future-ready sourcing ecosystems that combine deep industry expertise with cutting-edge technology.
Posted 1 month ago
5.0 - 14.0 years
0 Lacs
karnataka
On-site
As a Sr. Technical Architect at our organization, you will play a crucial role in designing and delivering complex enterprise-scale Salesforce solutions for our valued customers. You will collaborate with various teams and stakeholders to bring a vision to life. Your primary focus will be on becoming a deep Health Cloud domain and product expert, working closely with delivery teams to ensure the success of our customers.

Your responsibilities will include supporting the full implementation lifecycle, from scoping to deployment, in a dynamic ecosystem involving clients and partners. You will be tasked with solutioning the enterprise application end-to-end, designing and constructing industry-specific Salesforce solutions for the Health sector. Additionally, you will engage with potential customers during the presales process, understanding their business needs and customizing Salesforce solutions that align with their objectives.

Key Responsibilities:
- Lead a team of architects to drive optimized solutions for our Health industry clients utilizing Salesforce Health Cloud, Service Cloud, Sales Cloud, and Experience Cloud.
- Develop integration, data migration, and implementation strategies to enhance standard product capabilities.
- Serve as a trusted advisor to clients and lead the technical architecture team for enterprise-level customer engagements.
- Participate in pre-sales activities and technical deep-dive sessions, and collaborate with Salesforce Product teams.
- Conduct functional and technical workshops, demonstrating leadership in designing, testing, and deploying solutions.
- Collaborate cross-functionally, exhibit strong communication skills, and foster creative thinking.
- Provide expertise in user journey preparation, data modeling, Apex design patterns, LWC, and other modern UI techniques.
- Guide customers, partners, and implementation teams on digital transformation with the Salesforce platform.
- Establish trust with customers' leadership, implement standard processes, and proactively manage risk areas in solutions.

Qualifications:
- Salesforce Certified Technical Architect (CTA) credential is a must.
- 14+ years of experience in developing technology solutions.
- 5+ years of experience in client-facing projects with increasing responsibilities.
- Proficiency in Salesforce Health Cloud, Sales/Service/Experience Cloud, and Vlocity OmniStudio is mandatory.
- Expertise in integration architecture, REST & SOAP APIs, and governor limits.
- Experience in healthcare transformation projects as a Technical/Enterprise Architect is preferred.
- Working knowledge of continuous integration, repositories, large-scale applications, and solution architecting.
- Strong skills in Apex design patterns, platform security, identity and access management, and data architecture.
- Familiarity with Salesforce/Apex, Triggers, Lightning Flows, LWC, and web and mobile technologies.
- Excellent presentation and communication skills, with the ability to tailor content for diverse audiences.
- Salesforce certifications in Admin, Developer, Sales and Service Clouds, Application Architect, and OmniStudio Developer/Consultant are preferred, as are Application & System Architect certifications.

Join us in our journey to deliver innovative Salesforce solutions and drive digital transformation for our clients in the Health industry.
Posted 1 month ago
6.0 - 10.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Cloud Data Architect specializing in BigQuery and CloudSQL at our Chennai office, you will lead the design and implementation of scalable, secure, and high-performing data architectures using Google Cloud technologies. Your expertise will be essential in shaping architectural direction and ensuring that data solutions meet enterprise-grade standards.

Your responsibilities will include designing data architectures that align with performance, cost-efficiency, and scalability needs; implementing data models, security controls, and access policies across GCP platforms; leading cloud database selection, schema design, and tuning for analytical and transactional workloads; collaborating with DevOps and DataOps teams to deploy and manage data environments; ensuring best practices for data governance, cataloging, and versioning; and enabling real-time and batch integrations using GCP-native tools.

To excel in this role, you must possess deep knowledge of BigQuery, CloudSQL, and the GCP data ecosystem, along with strong experience in schema design, partitioning, clustering, and materialized views. Hands-on experience implementing data encryption, IAM policies, and VPC configurations is crucial, as is an understanding of hybrid and multi-cloud data architecture strategies and data lifecycle management. Proficiency in GCP cost optimization is also required.

Preferred skills include experience with AlloyDB, Firebase, or Spanner; familiarity with LookML, dbt, or DAG-based orchestration tools; and exposure to the BFSI domain or financial services architecture. In addition to technical skills, soft skills such as visionary thinking with practical implementation ability, strong communication, and cross-functional leadership are highly valued. Previous experience guiding data strategy in enterprise settings will be advantageous.

Joining our team will give you the opportunity to own data architecture initiatives in a cloud-native ecosystem, drive innovation through scalable and secure GCP designs, and collaborate with forward-thinking data and engineering teams. Skills required for this role include IAM policies, Spanner, schema design, data architecture, the GCP data ecosystem, dbt, GCP cost optimization, AlloyDB, data encryption, data lifecycle management, BigQuery, LookML, VPC configurations, partitioning, clustering, materialized views, DAG-based orchestration tools, Firebase, and CloudSQL.
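As a flavour of the partitioning, clustering, and materialized-view design work this role calls for, here is a minimal BigQuery sketch; the dataset, table, and column names are hypothetical:

```sql
-- Hypothetical events table, partitioned by day and clustered for
-- customer-level lookups; partition pruning keeps scan costs down.
CREATE TABLE analytics.events (
  event_ts    TIMESTAMP,
  customer_id STRING,
  amount      NUMERIC
)
PARTITION BY DATE(event_ts)
CLUSTER BY customer_id;

-- A materialized view precomputing a daily aggregate over the table,
-- so eligible queries can be served without rescanning raw events.
CREATE MATERIALIZED VIEW analytics.daily_revenue AS
SELECT
  DATE(event_ts) AS event_date,
  customer_id,
  SUM(amount)    AS total_amount
FROM analytics.events
GROUP BY DATE(event_ts), customer_id;
```

Queries filtering on `DATE(event_ts)` prune to the matching partitions, and clustering on `customer_id` reduces the bytes scanned within each partition.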
Posted 1 month ago
0.0 - 4.0 years
0 Lacs
hyderabad, telangana
On-site
Calling bright minds from top-tier engineering and tech institutes! Are you ready for a world-class, hands-on internship in real-world product development? Join our AI-driven CRM & ERP + Sales Automation platform team, a greenfield project where your DSA, backend, and database skills truly come to life.

You will be responsible for building and optimizing core systems using Node.js, RDBMS, MongoDB, and LLM/RAG databases. Additionally, you will design and tune APIs, handle routing, and enhance performance. Your role will involve developing intelligent automation for sales workflows, ranging from lead generation, outreach, and forecasting to personalized communication, all driven by AI models. You will need to apply critical thinking to solve real-time product challenges in a product-first, no-fluff coding environment.

As part of the perks, you will receive a fully paid 3-month internship with a stipend (finalized post-interview), and there is high potential to transition into a permanent role based on strong performance. The on-site location for this opportunity is Cyber Towers, Hyderabad.

We are looking for individuals who are physically present in Hyderabad and ready to contribute as team players with strong critical-thinking skills. You should be willing to go the extra mile, dive deep, and have a passion for backend/data architecture and building AI-enabled systems. This role is focused on pure coding, devoid of drama, and promises an experience that you will be immensely proud of. If you are interested in this exciting opportunity, please drop your CV or DM today to get started on building something exceptional.
Posted 1 month ago
12.0 - 21.0 years
25 - 40 Lacs
Chennai
Work from Office
Database Architect

Position Overview
We are seeking an experienced Database Architect to design and develop complex database projects while leading a critical migration from SQL Server to PostgreSQL. This role requires expertise in designing enterprise-scale database solutions, strong development skills across multiple database platforms, and proven experience in project ownership and client management.

Key Responsibilities
- Design and execute a comprehensive SQL Server to PostgreSQL migration strategy
- Quickly initiate and design database models for both RDBMS and NoSQL projects
- Provide swift support to development teams with database-related requirements and troubleshooting
- Deliver expert-level development of stored procedures, functions, triggers, and complex database objects across multiple platforms
- Performance tuning and optimization: query optimization, indexing strategies, and database performance enhancement
- Migrate stored procedures, functions, and jobs from T-SQL to PL/pgSQL
- Architect scalable, high-performance database solutions for PostgreSQL and other platforms
- Take full project ownership, including timeline, deliverables, and client communication
- Develop migration tools and data validation frameworks

Required Qualifications
- 10+ years of database architecture and development experience
- Expert-level PostgreSQL skills: administration, performance tuning, advanced features
- Strong expertise in multiple database platforms: SQL Server (T-SQL, SSIS), MySQL, PostgreSQL
- NoSQL database experience: MongoDB and other NoSQL technologies
- Proven database migration experience, preferably SQL Server to PostgreSQL
- Development skills in .NET or Java
- Client-facing experience with strong communication skills
- Project ownership experience: managing timelines, stakeholders, and deliverables

Preferred Qualifications
- PostgreSQL certifications
- Experience with migration tools (AWS DMS, Azure DMS)
- ETL tools and data integration experience
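To give a flavour of the T-SQL to PL/pgSQL migration work described above, here is a minimal sketch; the procedure, table, and column names are hypothetical:

```sql
-- T-SQL (SQL Server): a simple stored procedure returning an order total.
CREATE PROCEDURE dbo.GetOrderTotal @OrderId INT
AS
BEGIN
    SELECT SUM(Quantity * UnitPrice) AS OrderTotal
    FROM dbo.OrderLines
    WHERE OrderId = @OrderId;
END;

-- PL/pgSQL (PostgreSQL): the same logic rewritten as a function,
-- since PostgreSQL functions return values directly.
CREATE FUNCTION get_order_total(p_order_id INT)
RETURNS NUMERIC
LANGUAGE plpgsql
AS $$
BEGIN
    RETURN (SELECT SUM(quantity * unit_price)
            FROM order_lines
            WHERE order_id = p_order_id);
END;
$$;
```

Real migrations also have to bridge differences such as TRY/CATCH versus EXCEPTION blocks, temporary-table semantics, and SQL Server Agent jobs, which have no direct PostgreSQL equivalent.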
Posted 1 month ago
2.0 - 5.0 years
4 - 7 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Machine Learning Engineer

Must have:
- Strong programming skills in languages such as Python and Java
- Hands-on experience with one cloud platform (GCP preferred)
- Experience working with Docker
- Environment management (e.g., venv, pip, poetry)
- Experience with orchestrators such as Vertex AI Pipelines, Airflow, etc.
- Understanding of the full end-to-end ML cycle, mainly for NLP projects
- Data engineering and feature engineering techniques
- Experience with ML modelling and evaluation metrics
- Experience with TensorFlow, PyTorch, or another framework
- Experience with model monitoring
- Advanced SQL knowledge
- API development, e.g., FastAPI
- Awareness of streaming concepts such as windowing, late arrival, and triggers

Good to have:
- Hyperparameter tuning experience
- Proficiency in Apache Spark, Apache Beam, or Apache Flink
- Hands-on experience with distributed computing
- Working experience in data architecture design
Posted 1 month ago
3.0 - 5.0 years
5 - 7 Lacs
Mumbai, Navi Mumbai
Work from Office
Data Engineer II

As a Data Engineer II supporting the Infrastructure team, you will focus on building and enhancing the enterprise data environment. This infrastructure will support the delivery of analytics and reporting (descriptive, diagnostic, predictive, and prescriptive) across Finicity/Mastercard.

Responsibilities
- Collaborate with internal business stakeholders to understand critical data initiatives and reporting requirements.
- Design and implement scalable, well-structured data pipelines to support analytics and business intelligence.
- Develop and maintain robust data models and data warehouse structures to ensure data consistency and performance.
- Build and optimize ELT workflows using dbt and Snowflake.
- Ensure data quality, integrity, and governance across all data assets.
- Support the infrastructure for enterprise reporting and analytics platforms.

Requirements
- 3-5 years of experience in data modeling, data warehousing, and ETL/ELT development.
- 2+ years of experience with SQL and relational databases.
- Hands-on experience with dbt and Snowflake.
- Strong understanding of data architecture and pipeline design principles.
- Excellent problem-solving and communication skills.
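As a sketch of the dbt-on-Snowflake ELT work this role describes, here is a minimal incremental model; the model, source, and column names are hypothetical:

```sql
-- models/fct_daily_logins.sql (hypothetical dbt model)
-- On the first run dbt builds the full table; on later runs the
-- is_incremental() block restricts the scan to newly arrived days.
{{ config(materialized='incremental', unique_key='login_date') }}

SELECT
    login_date,
    COUNT(*) AS login_count
FROM {{ ref('stg_logins') }}
{% if is_incremental() %}
WHERE login_date > (SELECT MAX(login_date) FROM {{ this }})
{% endif %}
GROUP BY login_date
```

The `unique_key` lets dbt merge re-processed days instead of duplicating them, which keeps the warehouse table consistent across reruns.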
Posted 1 month ago
3.0 - 7.0 years
5 - 9 Lacs
Chennai
Work from Office
Job Title: Data Analyst Expert

Your role:
- Analyzes complex and multi-dimensional datasets across various business units to uncover deep insights that inform high-level strategic decisions and drive business performance.
- Creates sophisticated data models and predictive analytics using advanced statistical techniques and machine learning algorithms to forecast trends and optimize business outcomes, working under general supervision.
- Implements scalable ETL (Extract, Transform, Load) pipelines and robust data integration solutions, ensuring data consistency, accuracy, and reliability across the organization.
- Provides inputs on data architecture and infrastructure improvements, advising on best practices for data storage, retrieval, and management to enhance system performance and scalability.
- Ensures compliance with global data privacy regulations (such as GDPR and HIPAA) by designing and enforcing comprehensive data governance frameworks and security protocols.
- Conducts advanced statistical analyses, including hypothesis testing and multivariate analysis, to validate business strategies and provide evidence-based recommendations for process enhancements.
- Evaluates the effectiveness of implemented data solutions using performance metrics and feedback loops, continuously refining analytical approaches to improve outcomes.
- Benchmarks existing data analytics tools and technologies, leading the selection and integration of state-of-the-art solutions to maintain a competitive edge and drive innovation.
- Supports the execution of enterprise-wide data strategies, providing thought leadership and expert guidance in data analytics, business intelligence, and data science methodologies.
- Identifies opportunities for automation and process optimization, leveraging artificial intelligence, machine learning, and advanced analytics to enhance data processing efficiency and decision-making accuracy.

You're the right fit if:
1. Experience: 5+ years of industry experience in data analysis, SQL, and ETL.
2. Skills: data analysis and interpretation; data harmonization and processing; statistical methods; statistical programming software; business intelligence tools; data mining; machine learning; engineering fundamentals; research and analysis; Structured Query Language (SQL); regulatory compliance.
3. Education: Bachelor's degree in any engineering discipline.
4. Anything else: must have strong communication skills.

How we work together
We believe that we are better together than apart. For our office-based teams, this means working in person at least 3 days per week. Onsite roles require full-time presence in the company's facilities. Field roles are most effectively done outside of the company's main facilities, generally at customer or supplier locations.

If you're interested in this role and have many, but not all, of the experiences needed, we encourage you to apply. You may still be the right candidate for this or other opportunities at Philips. Learn more about our culture of impact with care.
Posted 1 month ago
16.0 - 18.0 years
50 - 60 Lacs
Bengaluru
Work from Office
Join us as a Data Engineer

You'll be the voice of our customers, using data to tell their stories and put them at the heart of all decision-making. We'll look to you to drive the build of effortless, digital-first customer experiences. If you're ready for a new challenge and want to make a far-reaching impact through your work, this could be the opportunity you're looking for. We are offering this role at vice president level.

What you'll do
As a Data Engineer, you'll be looking to simplify our organisation by developing innovative data-driven solutions through data pipelines, modelling, and ETL design, aspiring to be commercially successful while keeping our customers, and the bank's data, safe and secure. You'll drive customer value by understanding complex business problems and requirements to correctly apply the most appropriate and reusable tools to gather and build data solutions. You'll support our strategic direction by engaging with the data engineering community to deliver opportunities, along with carrying out complex data engineering tasks to build a scalable data architecture.

Your responsibilities will also include:
- Building advanced automation of data engineering pipelines through the removal of manual stages
- Embedding new data techniques into our business through role modelling, training, and experiment design oversight
- Delivering a clear understanding of data platform costs to meet your department's cost-saving and income targets
- Sourcing new data using the most appropriate tooling for the situation
- Developing solutions for streaming data ingestion and transformations in line with our streaming strategy

The skills you'll need
To thrive in this role, you'll need a strong understanding of data usage and dependencies and experience of extracting value and features from large-scale data. You'll also bring practical experience of programming languages alongside knowledge of data and software engineering fundamentals.

Additionally, you'll need:
- Expertise in data engineering toolsets such as Airflow, RDBMS (PGSQL/Oracle/DB2), Snowflake, S3, EMR/Databricks, and data pipelines
- Proven proficiency in Python, PySpark, SQL, CI/CD pipelines, and Git version control
- Experience working with reporting tools such as QuickSight (an added advantage)
- Experience of ETL technical design; data quality testing, cleansing, and monitoring; data sourcing; and exploration and analysis
- Data warehousing and data modelling capabilities
- A good understanding of modern code development practices
- Experience of working in a governed, regulatory environment
- Strong communication skills with the ability to proactively engage and manage a wide range of stakeholders

Hours: 45
Job Posting Closing Date: 28/07/2025
Posted 1 month ago
3.0 - 8.0 years
10 - 14 Lacs
Chennai
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various stakeholders to gather requirements, overseeing the development process, and ensuring that the applications meet the specified needs. You will also engage in problem-solving discussions with your team, providing guidance and support to ensure successful project outcomes. Additionally, you will monitor project progress and make necessary adjustments to keep everything on track, fostering a collaborative environment that encourages innovation and efficiency.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Mentor junior team members to support their professional growth.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration and ETL processes.
- Experience with cloud computing platforms and services.
- Familiarity with data governance and compliance standards.
- Ability to work with large datasets and perform data analysis.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.
Posted 1 month ago
3.0 - 8.0 years
9 - 13 Lacs
Hyderabad
Work from Office
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: Microsoft Power Business Intelligence (BI), Microsoft Azure Databricks
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the data architecture. You will be involved in analyzing data requirements and translating them into effective solutions that align with the overall data strategy of the organization. Your role will require you to stay updated with the latest trends in data engineering and contribute to the continuous improvement of data processes and systems.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Engage in the design and implementation of data pipelines to support data integration and analytics.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Good To Have Skills: Experience with Microsoft Azure Databricks, Microsoft Power Business Intelligence (BI).
- Strong understanding of data modeling concepts and best practices.
- Experience with ETL processes and data warehousing solutions.
- Familiarity with cloud-based data solutions and architectures.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Hyderabad office.
- A 15 years full-time education is required.
Posted 1 month ago
3.0 - 8.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-Have Skills: Databricks Unified Data Analytics Platform
Good-to-Have Skills: Microsoft Power Business Intelligence (BI), Microsoft Azure Databricks
Minimum Experience: 3 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the data architecture. You will be responsible for analyzing data requirements and translating them into effective solutions that support the organization's data strategy. Additionally, you will participate in team meetings to share insights and contribute to the overall success of the data platform initiatives.

Roles & Responsibilities:
- Perform independently and become an SME.
- Participate actively in team discussions.
- Contribute solutions to work-related problems.
- Engage in continuous learning to stay updated with the latest trends and technologies in data platforms.
- Assist in documenting data architecture and design processes to ensure clarity and consistency across the team.

Professional & Technical Skills:
- Must-Have: Proficiency in Databricks Unified Data Analytics Platform.
- Good-to-Have: Experience with Microsoft Azure Databricks and Microsoft Power BI.
- Strong understanding of data integration techniques and best practices.
- Experience with data modeling and database design principles.
- Familiarity with cloud-based data solutions and architectures.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Posted 1 month ago
15.0 - 20.0 years
5 - 9 Lacs
Mumbai
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-Have Skills: Informatica PowerCenter
Good-to-Have Skills: NA
Minimum Experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities and contribute to the overall success of projects by leveraging your expertise in application development.

Roles & Responsibilities:
- Act as an SME; collaborate with and manage the team to perform.
- Take responsibility for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must-Have: Proficiency in Informatica PowerCenter.
- Strong understanding of ETL processes and data integration techniques.
- Experience with database management systems and SQL.
- Familiarity with application development methodologies and best practices.
- Ability to troubleshoot and resolve application issues efficiently.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Informatica PowerCenter.
- This position is based at our Mumbai office.
- 15 years of full-time education is required.
Posted 1 month ago
3.0 - 8.0 years
10 - 14 Lacs
Pune
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-Have Skills: Databricks Unified Data Analytics Platform
Good-to-Have Skills: NA
Minimum Experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various stakeholders to gather requirements, overseeing the development process, and ensuring that the applications meet the specified needs. You will also engage in problem-solving discussions with your team, providing guidance and support to ensure successful project outcomes. Your role will require you to stay updated with the latest technologies and methodologies to enhance application performance and user experience.

Roles & Responsibilities:
- Perform independently and become an SME.
- Participate actively in team discussions.
- Contribute solutions to work-related problems.
- Facilitate communication between technical teams and stakeholders to ensure alignment on project goals.
- Mentor junior team members, providing guidance and support in their professional development.

Professional & Technical Skills:
- Must-Have: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration and ETL processes.
- Experience with cloud computing platforms and services.
- Familiarity with data governance and compliance standards.
- Ability to analyze and optimize application performance.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Pune office.
- 15 years of full-time education is required.
Posted 1 month ago
3.0 - 5.0 years
30 - 35 Lacs
Mumbai, Pune
Work from Office
Required:
- Experience in big data applications and environments/SQL.
- Extensive experience with the Azure stack: ADLS, Azure SQL DB, Azure Data Factory, Azure Synapse, Analytics Services, Event Hub, etc.
- Experience working in an engineering capacity (planning, design, implementation, configuration, upgrades, migrations, troubleshooting, and support) on applications using the Azure stack.
- Design and build production data pipelines from ingestion to consumption within a big data architecture, using Java, Python, or Scala.
- Experience coding complex T-SQL and Spark (Python).
- Experience working in an engineering capacity (planning, design, implementation, configuration, troubleshooting, and support) on ETL.
- Experience designing and developing in Python using Azure Synapse.
- Good experience in requirements analysis and solution architecture design, data modelling, ETL, data integration, and data migration design.
- Experience with PL/SQL.
- Collaborate with cross-functional teams to define project requirements and technical specifications.
- Conduct code reviews to ensure adherence to code quality, consistency, and best practices.
- Accomplished, highly motivated, and results-driven; able to work independently with minimal supervision.
- Ability to think strategically and effectively communicate solutions to various levels of management.

Preferred:
- Experience with the big data ecosystem: Azure Data Platform, Azure Synapse, and related data integration technologies.
- Experience with T-SQL.

Knowledge and skills (general and technical):
- Azure stack: Azure Data Lake, Azure Synapse, Azure Data Factory.
- T-SQL, Spark (Python).
- One or more scripting languages (PowerShell, Python, etc.); PL/SQL, MSSQL, other databases.
- Excellent communication skills.
- Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field (B.E./B.Tech, MCA/M.Sc, or equivalent).
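The ingestion-to-consumption pipeline work described above can be sketched in miniature. This is an illustrative sketch only: SQLite and an in-memory CSV string stand in for Azure SQL DB and a file landed in ADLS, and the `orders` table, column names, and sample data are hypothetical, not from the posting.

```python
import csv
import io
import sqlite3

# Hypothetical landed file; in the real role this would arrive in ADLS.
RAW_CSV = """order_id,amount,region
1,120.50,south
2,80.00,north
2,80.00,north
3,45.25,west
"""

def extract(text: str) -> list[dict]:
    """Parse the landed CSV into dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows: list[dict]) -> list[tuple]:
    """Deduplicate on order_id and cast types for loading."""
    seen = {}
    for row in rows:
        seen[row["order_id"]] = (
            int(row["order_id"]),
            float(row["amount"]),
            row["region"].upper(),
        )
    return sorted(seen.values())

def load(conn: sqlite3.Connection, rows: list[tuple]) -> int:
    """Load into the target table (SQLite standing in for Azure SQL DB)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders "
        "(order_id INTEGER PRIMARY KEY, amount REAL, region TEXT)"
    )
    conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?, ?)", rows)
    return conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]

conn = sqlite3.connect(":memory:")
loaded = load(conn, transform(extract(RAW_CSV)))
```

Keeping extract, transform, and load as separate functions mirrors how the same stages would be split across ADF activities or Synapse notebooks at production scale.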
Posted 1 month ago
6.0 - 11.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Req ID: 332631

NTT DATA strives to hire exceptional, innovative, and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Senior Java Developer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Senior Application Developer - Java

Who We Are: NTT DATA America strives to hire exceptional, innovative, and passionate individuals who want to grow with us. Launch by NTT DATA is the culmination of the company's strategy to acquire and integrate the skills, experience, and technology of leading digital companies, backed by NTT DATA's core capabilities, global reach, and depth.

How You'll Help Us: Our clients need digital solutions that will transform their business so they can succeed in today's hypercompetitive marketplace. As a team member you will routinely deliver elite solutions to clients that will impact their products, customers, and services. Using your development, design, and leadership skills and experience, you will design and implement solutions based on client needs, and you will collaborate with customers on future system enhancements, resulting in continued engagements.

How We Will Help You: Joining our Java practice is not only a job, but a chance to grow your career. We will make sure to equip you with the skills you need to produce robust applications that you can be proud of. Whether it is providing you with training on a new programming language or helping you get certified in a new technology, we will help you grow your skills so you can continue to deliver increasingly valuable work.

Once You Are Here, You Will: Provide input and support for, and perform, full systems life cycle management activities (e.g., analyses, technical requirements, design, coding, testing, and implementation of systems and applications software). You will participate in component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements, provide input to applications development project plans and integrations, and collaborate with teams and support emerging technologies to ensure effective communication and achievement of objectives. The Senior Applications Developer provides knowledge and support for applications development, integration, and maintenance, and provides input to department and project teams on decisions supporting projects.

Competencies:
- Apply Disaster Recovery Knowledge
- Apply Information Analysis and Solution Generation Knowledge
- Apply Information Systems Knowledge
- Apply Internal Systems Knowledge
- IT - Design/Develop Application Solutions
- IT - Knowledge of Emerging Technology
- IT - Problem Management/Planning
- Technical Problem Solving and Analytical Processes
- Technical Writing

Job Requirements:
- Contribute to IS projects; conduct systems and requirements analyses to identify project action items.
- Perform analysis and design; participate in defining and developing technical specifications to meet systems requirements.
- Design and develop moderate to highly complex applications; analyze, design, code, test, correct, and document moderate to highly complex programs to ensure optimal performance and compliance.
- Develop and maintain application and system documentation to ensure accuracy and consistency.
- Define and produce integration builds to create applications.
- Perform maintenance and support; define and administer procedures to monitor systems performance and integrity.
- Support emerging technologies and products; monitor the industry to gain knowledge and understanding of emerging technologies.

Required Skills:
- Must have GCP and BigQuery experience.
- Should have experience with Power BI, microservice architecture, SQL Server, DB2, Spring Boot, JSON, Java, C#, AMQP, Azure AD, HTTP, and README documentation.
- Should be proficient in Git, Scrum, and Azure DevOps.

Basic Qualifications:
- 6+ years of experience with Java, including building complex, scalable applications.
- 6+ years of experience with Spring Boot, including designing and implementing advanced microservices architectures.
- 4+ years of GCP and BigQuery experience.

Ideal Mindset:
- Lifelong Learner: You are always seeking to improve your technical and nontechnical skills.
- Team Player: You want to see everyone on the team succeed and are willing to go the extra mile to help a teammate in need.
- Communicator: You know how to communicate your design ideas to both technical and nontechnical stakeholders, prioritizing critical information and leaving out extraneous details.

Please note the shift timing requirement: 1:30 pm IST - 10:30 pm IST.

#Launchjobs #LaunchEngineering
Posted 1 month ago
6.0 - 11.0 years
9 - 10 Lacs
Bengaluru
Work from Office
Req ID: 322582

NTT DATA strives to hire exceptional, innovative, and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Senior .NET Developer - Remote to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Senior .NET Developer - Remote

Who We Are: NTT DATA America strives to hire exceptional, innovative, and passionate individuals who want to grow with us. Launch by NTT DATA is the culmination of the company's strategy to acquire and integrate the skills, experience, and technology of leading digital companies, backed by NTT DATA's core capabilities, global reach, and depth.

How You'll Help Us: A Senior Application Developer is first and foremost a software developer who specializes in .NET C# development. You'll be part of a team focused on delivering quality software for our clients.

How We Will Help You: Joining our Microsoft practice is not only a job, but a chance to grow your career. We will make sure to equip you with the skills you need to produce robust applications that you can be proud of. Whether it is providing you with training on a new programming language or helping you get certified in a new technology, we will help you grow your skills so you can continue to deliver increasingly valuable work.

Once You Are Here, You Will: Provide input and support for, and perform, full systems life cycle management activities (e.g., analyses, technical requirements, design, coding, testing, and implementation of systems and applications software). You will participate in component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements, and provide input to applications development project plans and integrations. Additionally, you will collaborate with teams and support emerging technologies to ensure effective communication and achievement of objectives. The Senior Applications Developer provides knowledge and support for applications development, integration, and maintenance, as well as input to department and project teams on decisions supporting projects.

Basic Qualifications:
- 6+ years developing in .NET/.NET Core
- 3+ years of experience with object-oriented programming and SOLID principles
- 3+ years of REST API development
- 2+ years of experience working with databases and writing stored procedures
- 2+ years of unit and service testing with frameworks such as xUnit, NUnit, etc.
- 1+ year of cloud platform experience in AWS, Azure, or GCP

Preferred:
- Experience with CI/CD tooling, e.g., Jenkins, Azure DevOps
- Experience with containerization technologies, e.g., Docker, Kubernetes
- GCP experience

Ideal Mindset:
- Lifelong Learner: You are always seeking to improve your technical and nontechnical skills.
- Team Player: You want to see everyone on the team succeed and are willing to go the extra mile to help a teammate in need.
- Communicator: You know how to communicate your design ideas to both technical and nontechnical stakeholders, prioritizing critical information and leaving out extraneous details.

Please note the shift timing requirement: 1:30 pm IST - 10:30 pm IST.

#Launchjobs #LaunchEngineering
Posted 1 month ago
8.0 - 10.0 years
30 - 37 Lacs
Chennai
Work from Office
We are seeking a highly experienced and strategic Director of Enterprise Architecture to lead our Enterprise Architecture function based in India. This senior leadership role is critical in shaping the technology future of Ford Credit's global operations. You will be responsible for leading and mentoring a talented team of architects in India, while collaborating closely with Enterprise Architecture Directors and teams located in North America and Europe.

The successful candidate will possess a strong blend of deep technical expertise, strategic vision, and exceptional relationship management skills. You will drive the development and evolution of architectural strategies, standards, and roadmaps that support our global business needs across a wide range of domains, including digital platforms (web, mobile), enterprise integrations, data management, risk systems, AI/ML capabilities, lending/banking platforms, and customer service solutions. This role requires someone who can not only define the "what" and "how" from an architectural perspective but also effectively communicate the "why" to stakeholders at various levels across the organization.

Required:
- Bachelor's degree in Computer Science, Engineering, Information Technology, or a related technical field, or equivalent practical experience.
- Significant experience (typically 10+ years) in Enterprise Architecture or senior-level Solution Architecture roles within large, complex organizations.
- Proven experience leading and managing technical teams, preferably architecture teams (typically 5+ years of management experience).
- Demonstrated ability to define and implement enterprise-level architectural strategies, standards, and roadmaps.
- Deep understanding of architectural patterns (e.g., microservices, event-driven architecture, service-oriented architecture) and design principles.
- Experience designing and overseeing the implementation of large-scale, distributed, global enterprise systems.
- Strong working knowledge across several relevant technology domains, such as cloud platforms (GCP, Azure, AWS), data architecture (data lakes, data warehousing, data governance), integration patterns (APIs, messaging, ETL), digital platforms (web, mobile), AI/ML architecture, and security architecture principles.
- Excellent verbal and written communication, presentation, and interpersonal skills, with the ability to influence and build consensus among diverse stakeholders.
- Ability to operate effectively in a global, matrixed organization.

Preferred:
- Master's degree in a relevant field.
- Experience in the financial services or automotive finance industry.
- Experience working with teams and stakeholders located in different geographic regions (e.g., North America, Europe).
- Familiarity with architectural frameworks (e.g., TOGAF, Zachman).
- Experience with agile development methodologies.

Skills:
- Exceptional strategic thinking and the ability to translate business strategy into technical architecture.
- Strong leadership and team-building capabilities.
- Superior stakeholder management, negotiation, and influencing skills.
- Deep technical acumen across a broad range of technologies and architectural domains.
- Excellent analytical and problem-solving skills.
- Ability to manage multiple priorities and navigate ambiguity in a dynamic environment.
- A passion for technology, innovation, and continuous improvement.

Architectural Leadership & Strategy:
- Lead the development, communication, and governance of enterprise architectural strategies, principles, standards, and roadmaps for Ford Credit globally, with a focus on alignment across regions (India, North America, Europe).
- Provide strategic guidance and oversight for the architectural design and implementation of complex, large-scale solutions supporting global business needs across diverse domains (digital, data, AI, lending, banking, risk, customer service, integrations).
- Ensure architectural decisions support business objectives, foster innovation, improve efficiency, and manage technical debt.
- Champion architectural best practices, patterns, and methodologies within the team and across the broader IT organization.

Team Leadership & Development:
- Lead, mentor, and develop a high-performing team of Enterprise Architects in India.
- Foster a collaborative and innovative team environment, promoting continuous learning and growth.
- Manage team priorities, resources, and performance to deliver high-quality architectural outcomes.

Global Collaboration & Stakeholder Management:
- Collaborate closely with Enterprise Architecture Directors and teams in North America and Europe to ensure global consistency, leverage shared capabilities, and contribute to a unified global EA function.
- Build and maintain strong relationships with senior business leaders, IT executives, product managers, engineering teams, and other key stakeholders globally.
- Effectively communicate complex architectural concepts and strategies to both technical and non-technical audiences.
- Influence decision-making and drive consensus on architectural direction across organizational boundaries.

Architectural Governance & Quality:
- Establish and refine architectural governance processes to ensure solutions adhere to defined standards and strategies.
- Provide architectural reviews and guidance for major projects and initiatives.
- Identify and mitigate architectural risks.

Technology & Market Awareness:
- Stay abreast of industry trends, emerging technologies, and competitive landscapes relevant to financial services, automotive finance, and the specific technology domains (cloud, AI, data, digital, etc.).
- Evaluate new technologies and assess their potential impact and applicability to Ford Credit's global architecture.
Posted 1 month ago
5.0 - 11.0 years
50 - 100 Lacs
Bengaluru
Work from Office
Roles and Responsibilities

Spark/Scala Job Description: As a Software Development Engineer 2, you will be responsible for expanding and optimising our data and data pipeline architecture, as well as optimising data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimising data systems and building them from the ground up. The Data Engineer will lead our software developers on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products. The right candidate will be excited by the prospect of optimising or even re-designing our company's data architecture to support our next generation of products and data initiatives.

Responsibilities:
- Create and maintain optimal data pipeline architecture.
- Assemble large, complex data sets that meet functional and non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimising data delivery, and coordinating the re-design of infrastructure for greater scalability.
- Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs.
- Keep our data separated and secure.
- Work with data and analytics experts to strive for greater functionality in our data systems.
- Support production systems.

Qualifications:
- Must have about 5-11 years of experience overall, with at least 3 years of relevant experience in big data.
- Must have experience building highly scalable business applications that implement large, complex business flows and deal with huge amounts of data.
- Must have experience in Hadoop, Hive, and Spark with Scala, with good experience in performance tuning and debugging issues.
- Good to have experience with stream processing (Spark/Java with Kafka).
- Must have experience in the design and development of big data projects.
- Good knowledge of functional programming and OOP concepts, SOLID principles, and design patterns for developing scalable applications.
- Familiarity with build tools like Maven.
- Must have experience with an RDBMS and at least one NoSQL database, preferably PostgreSQL.
- Must have experience writing unit and integration tests using ScalaTest.
- Must have experience using a version control system (Git).
- Experience with a CI/CD pipeline (Jenkins) is a plus.
- Basic hands-on experience with a cloud provider (AWS/Azure) is a plus.
- Databricks Spark certification is a plus.
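The unit-testing requirement above usually comes down to one design habit: keep transformation logic in pure functions so tests need no cluster. A minimal sketch of that pattern (in Python for brevity; the role itself uses Spark/Scala with ScalaTest, and the `Event` type and aggregation are hypothetical):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Event:
    """Hypothetical input record, standing in for a row in a Spark Dataset."""
    user_id: str
    amount: float

def total_by_user(events: list[Event]) -> dict[str, float]:
    """Pure aggregation: the same shape as a Spark groupBy/sum,
    but runnable and assertable entirely in memory."""
    totals: dict[str, float] = {}
    for e in events:
        totals[e.user_id] = totals.get(e.user_id, 0.0) + e.amount
    return totals

sample = [Event("u1", 10.0), Event("u2", 5.0), Event("u1", 2.5)]
result = total_by_user(sample)
```

Because the function takes plain values and returns plain values, a unit test can pin down edge cases (empty input, one user, many users) without spinning up any distributed runtime; only the thin integration layer needs Spark.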
Posted 1 month ago
8.0 - 13.0 years
14 - 18 Lacs
Hyderabad
Work from Office
We are seeking a strategic and technically strong Enterprise Data Architect to design and lead the implementation of scalable, secure, and high-performing data architecture solutions across the organization. The ideal candidate will have deep experience with modern data platforms, including Snowflake, DBT, SnapLogic, and cloud-native technologies. This role requires a balance of technical expertise, architectural vision, and business acumen to align data solutions with enterprise goals.

Key Responsibilities:
- Define and maintain the organization's enterprise data architecture strategy, including data modeling, governance, and integration standards.
- Lead the design and architecture of enterprise-grade data platforms using Snowflake, DBT, SnapLogic, and Azure Data Factory.
- Oversee the development of robust, scalable, and secure data pipelines across a hybrid cloud environment.
- Architect and optimize SQL Server and PostgreSQL environments to ensure availability, performance, and scalability.
- Define and enforce integration patterns to ensure data consistency, accuracy, and reliability across systems.
- Guide the design of efficient ETL/ELT frameworks to ensure alignment with data warehousing and business intelligence requirements.
- Partner with business and technical teams, including data engineers, analysts, and stakeholders, to define and enforce data governance and metadata management practices.
- Review and guide SQL query performance tuning, indexing strategies, and system monitoring.
- Provide direction on the use of Python for data automation, orchestration, and advanced transformations.
- Establish and maintain enterprise-wide documentation for data flows, data dictionaries, and architectural decisions.

Technical Skills & Experience:
- 8+ years of progressive experience in data engineering or architecture roles, with 2-3 years in a lead or architect capacity.
- Proven experience designing and implementing data architectures using Snowflake, DBT, SnapLogic, and Azure Data Factory.
- Strong proficiency in SQL and performance tuning across large-scale environments.
- Deep experience with SQL Server and PostgreSQL administration and architecture.
- Experience with Python for scripting, data processing, and orchestration tasks.
- Solid understanding of data governance, security, compliance, and data lifecycle management.
- Experience leading data modernization initiatives in cloud/hybrid environments.
- Understanding of metadata management, master data management, and data lineage tools is a plus.

Soft Skills:
- Strategic mindset with excellent analytical and problem-solving skills.
- Strong leadership and communication abilities, capable of influencing stakeholders across business and technical domains.
- Ability to translate business requirements into scalable and sustainable technical solutions.
- Team-oriented with a collaborative approach to cross-functional projects.

Preferred Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- Relevant certifications (e.g., Snowflake Architect, Azure Solutions Architect, DBT Certification) are highly desirable.
Posted 1 month ago
5.0 - 10.0 years
3 - 7 Lacs
Hyderabad
Work from Office
We are seeking an experienced Data Modeler with a strong background in real estate, investment management, and master data management. The ideal candidate will be responsible for designing, implementing, and maintaining data models that support our business objectives. This role requires a deep understanding of data architecture, data integration, and database optimization.

Key Responsibilities:
- Design and Develop Data Models: Create conceptual, logical, and physical data models to support business requirements in the real estate and investment management domains.
- Master Data Management (MDM): Develop and manage master data solutions to ensure data consistency, accuracy, and reliability across the organization.
- Data Integration: Integrate data from various sources, ensuring consistency and accuracy across systems.
- Data Mapping: Map data elements to business requirements and create detailed data mapping documents.
- Collaboration: Work closely with data analysts, database administrators, and business stakeholders to understand data needs and deliver solutions.
- Documentation: Maintain comprehensive documentation of data models, data flows, and data dictionaries.
- Data Governance: Ensure data models comply with data governance and security policies.

Qualifications:
- Experience: 12+ years overall, with a minimum of 5 years in data modeling, focused on real estate, investment management, and master data management.
- Technical Skills: Proficiency in SQL, data modeling tools (e.g., ER/Studio, ERwin), and database management systems (e.g., Oracle, SQL Server).
- Domain Expertise: In-depth knowledge of real estate and investment management processes and data requirements.
- MDM Expertise: Strong experience in master data management, including data governance, data quality, and data stewardship.
- Analytical Skills: Strong analytical and problem-solving skills.
- Communication: Excellent verbal and written communication skills.

Preferred Skills:
- Experience with data warehousing and business intelligence tools.
- Familiarity with cloud-based data solutions (e.g., AWS, Azure).
- Knowledge of data governance frameworks and best practices.
Posted 1 month ago
2.0 - 7.0 years
4 - 7 Lacs
Hyderabad
Work from Office
Job_Description":" Data Engineer Position Overview Role Summary We are searching for a talented and motivated Data Engineerto join our team. The ideal candidate will have expertise in data modeling,analytical thinking, and developing ETL processes using Python. In this role,you will be pivotal in transforming raw data from landing tables into reliable,curated master tables, ensuring accuracy, accessibility, and integrity withinour Snowflake data platform. Main Responsibilities Design, Develop, and Maintain ETL Processes: Build and maintain scalable ETL pipelines inPython to extract, transform, and load data into Snowflake master tables.Automate data mastering, manage incremental updates, and ensure consistencybetween landing and master tables. Data Modeling: Create and optimize logical and physical datamodels in Snowflake for efficient querying and reporting. Translate businessneeds into well-structured data models, defining tables, keys, relationships,and constraints. Analytical Thinking and Problem Solving: Analyze complex datasets, identify trends, andwork with analysts and stakeholders to resolve data challenges. Investigatedata quality issues and design robust solutions aligned with business goals. Data Quality and Governance: Implement routines for data validation,cleansing, and error handling to ensure accuracy and reliability in Snowflake.Support the creation and application of data governance standards. Automation and Optimization: Seek automation opportunities for dataengineering tasks, enhance ETL processes for performance, and scale systems asdata volumes grow within Snowflake. Documentation and Communication: Maintain thorough documentation of data flows,models, transformation logic, and pipeline configurations. Clearly communicatetechnical concepts to all stakeholders. 
- Collaboration: Work closely with data scientists, analysts, and engineers to deliver integrated data solutions, contributing to cross-functional projects with your data engineering expertise.

Required Qualifications
- Bachelor's or Master's degree in Computer Science, IT, Engineering, Mathematics, or a related field
- At least 2 years of experience as a Data Engineer or in a similar role
- Strong Python skills, including experience developing ETL pipelines and automation scripts
- Solid understanding of relational and dimensional data modeling
- Experience with Snowflake for SQL, schema design, and managing pipelines
- Proficient in SQL for querying and data analysis in Snowflake
- Strong analytical and problem-solving skills
- Familiarity with data warehousing and best practices
- Knowledge of data quality, cleansing, and validation techniques
- Experience with version control systems like Git and collaborative workflows
- Excellent communication, both verbal and written

Preferred Qualifications
- In-depth knowledge of Snowflake features like Snowpipe, Streams, Tasks, and Time Travel
- Experience with cloud platforms such as AWS, Azure, or Google Cloud
- Familiarity with workflow orchestration tools like Apache Airflow or Luigi
- Understanding of big data tools like Spark, Hadoop, or distributed databases
- Experience with CI/CD pipelines in data engineering
- Background in streaming data and real-time processing
- Experience deploying data pipelines in production

Sample Responsibilities in Practice
- Develop automated ETL pipelines in Python to ingest daily CSVs into a Snowflake landing table, validate data, and merge clean records into a master table, handling duplicates and change tracking.
- Design scalable data models in Snowflake to support business intelligence reporting, ensuring both integrity and query performance.
- Collaborate with business analysts to adapt data models and pipelines to evolving needs.
- Monitor pipeline performance and troubleshoot inconsistencies, documenting causes and solutions.
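The landing-to-master merge in the first sample responsibility can be sketched with pandas. This is a simplified upsert under assumed column names (customer_id as the business key, last_updated for change tracking); a production pipeline would more likely issue a Snowflake MERGE statement, but the duplicate-handling logic is the same:

```python
import pandas as pd


def merge_into_master(master: pd.DataFrame, landing: pd.DataFrame,
                      key: str = "customer_id") -> pd.DataFrame:
    """Upsert validated landing rows into the master table.

    On a key collision the landing row wins (newest data replaces old),
    and last_updated records when each row last changed. The key and
    column names are illustrative assumptions.
    """
    landing = landing.copy()
    landing["last_updated"] = pd.Timestamp.now(tz="UTC")
    combined = pd.concat([master, landing], ignore_index=True)
    # keep="last" ensures the landing version of a duplicated key survives.
    return combined.drop_duplicates(subset=key, keep="last").reset_index(drop=True)
```

Stamping last_updated on incoming rows gives a minimal form of change tracking; fuller history would call for Snowflake Streams or a slowly-changing-dimension pattern.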
Key Skills and Competencies
- Technical Skills: Python (including pandas, SQLAlchemy); Snowflake SQL and management; schema design; ETL process development
- Analytical Thinking: Ability to translate business requirements into technical solutions; strong troubleshooting skills
- Collaboration and Communication: Effective team player; clear technical documentation
- Adaptability: Willingness to adopt new technologies and proactively improve processes

Our Data Environment
Our organization manages diverse data sources, including transactional systems, third-party APIs, and unstructured data. We are dedicated to building a top-tier Snowflake data infrastructure for analytics, reporting, and machine learning. In this role, you will influence our data architecture, implement modern data engineering practices, and contribute to a culture driven by data.
Posted 1 month ago