8.0 - 12.0 years
0 Lacs
hyderabad, telangana
On-site
As a Data Engineer at our organization, you will be responsible for designing, implementing, and maintaining data pipelines and data integration solutions using Azure Synapse. Your role will involve developing and optimizing data models and data storage solutions on Azure. You will collaborate closely with data scientists and analysts to implement data processing and transformation tasks. Ensuring data quality and integrity through data validation and cleansing methodologies will be a key aspect of your responsibilities.

Your duties will also include monitoring and troubleshooting data pipelines to identify and resolve performance issues promptly. Collaboration with cross-functional teams to understand and prioritize data requirements will be essential, and you are expected to stay up to date with the latest trends and technologies in data engineering and Azure services so you can contribute effectively to the team.

To be successful in this role, you need a Bachelor's degree in IT, computer science, computer engineering, or a related field, along with a minimum of 8 years of experience in data engineering. Proficiency in Microsoft Azure Synapse Analytics is crucial, including experience with Azure Data Factory, Dedicated SQL Pool, Lake Database, and Azure Storage. Hands-on experience with Spark notebooks (Python or Scala) is mandatory for this position.

Your expertise should also cover end-to-end data warehouse work, including ingestion, ETL, big data pipelines, data architecture, message queuing, BI/reporting, and data security. Advanced SQL and relational database knowledge, as well as demonstrated experience designing and delivering data platforms for Business Intelligence and Data Warehousing, are required. Strong analytical abilities to handle and analyze complex, high-volume data with attention to detail are essential. Familiarity with data modeling and data warehousing concepts such as Data Vault or 3NF, along with experience in data governance (quality, lineage, data dictionary, and security), is preferred. Knowledge of Agile methodology is beneficial, and you should be able to work independently with Product Owners, Business Analysts, and Architects.

Join us at NTT DATA Business Solutions, where we empower you to transform SAP solutions into value. If you have any questions regarding this job opportunity, please reach out to our Recruiter, Pragya Kalra, at Pragya.Kalra@nttdata.com.
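For illustration only, here is a minimal PySpark sketch of the kind of notebook-based ingestion, validation, and cleansing work this role describes. The storage account, container paths, column names, and the 5% rejection threshold are all hypothetical assumptions, not part of the posting.

```python
# Illustrative Spark notebook cell: ingest raw parquet from ADLS,
# apply basic validation/cleansing, and persist a curated copy.
# Storage account, paths, and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

raw_path = "abfss://raw@examplelake.dfs.core.windows.net/sales/orders/"
curated_path = "abfss://curated@examplelake.dfs.core.windows.net/sales/orders/"

orders = spark.read.parquet(raw_path)

# Basic cleansing: drop exact duplicates, reject rows missing the key,
# and normalise a string column.
cleaned = (
    orders.dropDuplicates(["order_id"])
          .filter(F.col("order_id").isNotNull())
          .withColumn("country", F.upper(F.trim(F.col("country"))))
)

# Simple data-quality gate: fail fast if too many rows were rejected.
total = orders.count()
rejected = total - cleaned.count()
if rejected > 0.05 * max(total, 1):
    raise ValueError(f"Rejected {rejected} rows - investigate upstream feed")

cleaned.write.mode("overwrite").parquet(curated_path)
```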
Posted 2 weeks ago
6.0 - 8.0 years
20 - 35 Lacs
Bengaluru
Work from Office
Candidates should have 6+ years of experience with Azure Cloud, including data engineering and architecture experience, and hands-on work with Azure services such as Azure Data Factory, Azure Functions, Azure SQL, Azure Databricks, Azure Data Lake, Synapse Analytics, etc.
Posted 2 weeks ago
3.0 - 8.0 years
10 - 15 Lacs
Mumbai
Hybrid
1. Job Purpose
The Senior Data Analytics Analyst will play a key role in delivering hands-on analytics solutions across the UK and Ireland. This role focuses on bespoke report building, dashboard development, and advanced data analysis using technologies such as Power BI, Microsoft Fabric, Snowflake, Azure, AWS, Python, SQL, and R. Experience in data science is highly advantageous, including the application of machine learning and predictive modelling to solve business problems.

2. Accountabilities & Activities
• Design and build interactive dashboards and reports using Power BI and Microsoft Fabric.
• Perform advanced data analysis and visualisation to support business decision-making.
• Develop and maintain data pipelines and queries using SQL and Python.
• Apply data science techniques such as predictive modelling, classification, clustering, and regression to solve business problems and uncover actionable insights.
• Perform feature engineering and data preprocessing to prepare datasets for machine learning workflows.
• Build, validate, and tune machine learning models using tools such as scikit-learn, TensorFlow, or similar frameworks (see the sketch after this listing).
• Deploy models into production environments and monitor their performance over time, ensuring they deliver consistent value.
• Collaborate with stakeholders to translate business questions into data science problems and communicate findings in a clear, actionable manner.
• Use statistical techniques and hypothesis testing to validate assumptions and support decision-making.
• Document data science workflows and maintain reproducibility of experiments and models.
• Support the Data Analytics Manager in delivering analytics projects and mentoring junior analysts.

3. Qualifications, Knowledge & Experience
• Professional certifications (preferred or in progress):
- Microsoft Certified: Power BI Data Analyst Associate (PL-300)
- SnowPro Core Certification (Snowflake)
- Microsoft Certified: Azure Data Engineer Associate
- AWS Certified: Data Analytics Specialty
• Strong technical expertise in Power BI, Microsoft Fabric, Snowflake, SQL, Python, and R.
• Experience with Azure Data Factory, Databricks, Synapse Analytics, and AWS Glue.
• Hands-on experience in building and deploying machine learning models.
• Ability to translate complex data into actionable insights.
• Excellent problem-solving and communication skills.

4. Judgement Skills
Balances team development with delivery priorities and business needs. Makes informed decisions on technical design, resource allocation, and delivery timelines. Evaluates project outcomes and identifies opportunities for team and process improvement. Encourages experimentation while maintaining delivery discipline and governance.

5. Freedom Of Action
Acts as a key liaison between technical teams and business stakeholders.

6. Dimensions
Financial: Supports budgeting and cost optimisation for analytics projects. Contributes to revenue growth through data-driven client solutions.
Non-Financial: Drives adoption of modern analytics tools and practices. Builds strong relationships across business units and with external clients.

7. Environment
Build strong relationships and influence key decision makers. Able to work under pressure and adapt to change.
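To make the model-building duties above concrete, here is a minimal, illustrative scikit-learn workflow combining preprocessing, training, and light tuning. The dataset file, column names, and parameter grid are hypothetical assumptions, not from the posting.

```python
# Illustrative scikit-learn sketch: preprocessing + model + validation,
# matching the feature-engineering and model-tuning duties described.
# The CSV source and column names are hypothetical.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

df = pd.read_csv("customer_churn.csv")
X, y = df.drop(columns=["churned"]), df["churned"]

# Feature engineering / preprocessing: scale numerics, encode categoricals.
pre = ColumnTransformer([
    ("num", StandardScaler(), ["tenure_months", "monthly_spend"]),
    ("cat", OneHotEncoder(handle_unknown="ignore"), ["region", "plan"]),
])

pipe = Pipeline([("pre", pre), ("model", RandomForestClassifier(random_state=42))])

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

# Light hyperparameter tuning with cross-validation.
search = GridSearchCV(pipe, {"model__n_estimators": [100, 300]},
                      cv=5, scoring="roc_auc")
search.fit(X_tr, y_tr)
print("hold-out AUC:", search.score(X_te, y_te))
```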
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
As a Cloud Architect with expertise in Azure and Snowflake, you will be responsible for designing and implementing secure, scalable, and highly available cloud-based solutions on AWS and Azure. Your role will draw on your experience with Azure Databricks, ADF, Azure Synapse, PySpark, and Snowflake services. Additionally, you will participate in pre-sales activities, including RFP and proposal writing.

Your experience integrating various data sources with data warehouses and data lakes will be crucial for this role. You will also be expected to create data warehouses and data lakes for reporting, AI, and machine learning purposes, while having a solid understanding of data modelling and data architecture concepts.

Collaboration with clients to understand their business requirements and translate them into technical solutions that leverage Snowflake and Azure cloud platforms will be a key aspect of your responsibilities. You will be required to clearly articulate the advantages and disadvantages of different technologies and platforms, and to participate in proposal and capability presentations. Defining and implementing cloud governance and best practices, identifying and implementing automation opportunities for increased operational efficiency, and conducting knowledge sharing and training sessions to educate clients and internal teams on cloud technologies are additional duties associated with this role. Your expertise will play a vital role in ensuring the success of cloud projects and the satisfaction of clients.
Posted 2 weeks ago
7.0 - 11.0 years
0 Lacs
karnataka
On-site
You should have a minimum of 7 years of experience in data warehouse / lakehouse programming and should have successfully implemented at least two end-to-end data warehouse / data lake projects. Additionally, you should have experience implementing at least one Azure data warehouse / lakehouse project end-to-end, converting business requirements into concept / technical specifications, and collaborating with source system experts to finalize ETL and analytics design. You will also be responsible for supporting data modeler developers in the design and development of ETLs and creating activity plans based on agreed concepts with timelines.

Your technical expertise should include a strong background with Microsoft Azure components such as Azure Data Factory, Azure Synapse, Azure SQL Database, Azure Key Vault, MS Fabric, Azure DevOps (ADO), and Virtual Networks (VNets). You should also have expertise in Medallion Architecture for lakehouses and data modeling in the Gold layer, along with a solid understanding of data warehouse design principles such as star schema, snowflake schema, and data partitioning. Proficiency in MS SQL database packages, stored procedures, functions, and triggers, and in data transformation activities using SQL, is required, as well as knowledge of SQL*Loader, Data Pump, and Import/Export utilities.

Experience with data visualization or BI tools such as Tableau and Power BI, capacity planning, environment management, performance tuning, and familiarity with cloud cloning/copying processes within Azure will be essential for this role. Knowledge of green computing principles and optimizing cloud resources for cost and environmental efficiency is also desired.

You should possess excellent interpersonal and communication skills to collaborate effectively with technical and non-technical teams, communicate complex concepts, and influence key stakeholders. Analyzing demands and contributing to cost/benefit analysis and estimation are also part of the responsibilities. Preferred qualifications include certifications such as Azure Solutions Architect Expert or Azure Data Engineer Associate.

Skills required for this role include database management, Tableau, Power BI, ETL processes, Azure SQL Database, Medallion Architecture, Azure services, data visualization, data warehouse design, and Microsoft Azure technologies.
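To make the star-schema / Gold-layer requirement concrete, here is a minimal, illustrative PySpark sketch of a fact-table load that resolves natural keys against a dimension's surrogate keys. The table and column names, and the -1 "unknown member" convention, are hypothetical assumptions.

```python
# Illustrative star-schema load step: join staged facts to a dimension
# to pick up surrogate keys before loading the Gold layer.
# Table and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

dim_customer = spark.table("gold.dim_customer")   # customer_sk, customer_id
stg_sales = spark.table("silver.stg_sales")       # customer_id, amount, sale_date

fact_sales = (
    stg_sales.join(dim_customer, "customer_id", "left")
             # Common pattern: an integer yyyyMMdd key into the date dimension.
             .withColumn("date_sk", F.date_format("sale_date", "yyyyMMdd").cast("int"))
             .select("customer_sk", "date_sk", "amount")
)

# Late-arriving dimension members fall back to a default 'unknown' key (-1).
fact_sales = fact_sales.fillna({"customer_sk": -1})
fact_sales.write.mode("append").saveAsTable("gold.fact_sales")
```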
Posted 2 weeks ago
5.0 - 6.0 years
20 - 25 Lacs
Hyderabad
Work from Office
Responsibilities
• Analyze and implement user requirements/business needs as new and/or enhanced product functionality.
• Design, code, test, and document software code.
• Engage in the full software development lifecycle, from design and implementation to testing and deployment.
• Assist in the packaging and delivery of finished software products to clients.
• Communicate with technical and business leaders on business requirements, system-related capabilities, programming progress, and enhancement status.
• Assist in supporting the client by providing technical product support.
• Assist in the maintenance of the hosted products and environments.
• Learn new technologies and help the team implement them in our products.
• Gain a deep understanding of distributed architecture and contribute to the scalability and efficiency of our blockchain projects.
• Oversee junior developers to enhance their skills and contribute to the success of the team.
• Utilize general developer tools such as GitHub, VS Code, and other industry-standard platforms.

Skills required
• .NET, Azure, Angular, unit testing, blockchain (intermediate to advanced knowledge), Docker, Linux, DevOps.
• Comfortable with Linux and command-line interfaces.
• Passion for technology, identifying issues, and problem solving.
• Process-oriented mindset, with the ability to create, document, perform, and continually improve how team goals are achieved.
• Excellent verbal and written communication skills in English.
• Ability to work cross-functionally with team members of varied backgrounds (e.g., business, product, development, testing).
• Ability to build professional relationships, demonstrate a spirit of collaboration, and provide a flexible approach to work.
Posted 2 weeks ago
6.0 - 11.0 years
14 - 19 Lacs
Bengaluru
Remote
Role: Azure Specialist - CDM Smith
Location: Bangalore
Mode: Remote

Education and Work Experience Requirements:

Key Responsibilities:
• Databricks Platform: Act as a subject matter expert for the Databricks platform within the Digital Capital team; provide technical guidance, best practices, and innovative solutions.
• Databricks Workflows and Orchestration: Design and implement complex data pipelines using Azure Data Factory or Qlik Replicate.
• End-to-End Data Pipeline Development: Design, develop, and implement highly scalable and efficient ETL/ELT processes using Databricks notebooks (Python/Spark or SQL) and other Databricks-native tools.
• Delta Lake Expertise: Utilize Delta Lake for building reliable data lake architecture, implementing ACID transactions, schema enforcement, time travel, and optimizing data storage for performance (see the sketch after this listing).
• Spark Optimization: Optimize Spark jobs and queries for performance and cost efficiency within the Databricks environment. Demonstrate a deep understanding of Spark architecture, partitioning, caching, and shuffle operations.
• Data Governance and Security: Implement and enforce data governance policies, access controls, and security measures within the Databricks environment using Unity Catalog and other Databricks security features.
• Collaborative Development: Work closely with data scientists, data analysts, and business stakeholders to understand data requirements and translate them into Databricks-based data solutions.
• Monitoring and Troubleshooting: Establish and maintain monitoring, alerting, and logging for Databricks jobs and clusters, proactively identifying and resolving data pipeline issues.
• Code Quality and Best Practices: Champion best practices for Databricks development, including version control (Git), code reviews, testing frameworks, and documentation.
• Performance Tuning: Continuously identify and implement performance improvements for existing Databricks data pipelines and data models.
• Cloud Integration: Experience integrating Databricks with other cloud services (e.g., Azure Data Lake Storage Gen2, Azure Synapse Analytics, Azure Key Vault) for a seamless data ecosystem.
• Traditional Data Warehousing & SQL: Design, develop, and maintain schemas and ETL processes for traditional enterprise data warehouses. Demonstrate expert-level proficiency in SQL for complex data manipulation, querying, and optimization within relational database systems.

Mandatory Skills:
• Experience in Databricks and Databricks Workflows and Orchestration.
• Python: Hands-on experience in automation and scripting.
• Azure: Strong knowledge of Data Lakes, Data Warehouses, and cloud architecture.
• Solution Architecture: Experience in designing web applications and data engineering solutions.
• DevOps Basics: Familiarity with Jenkins and CI/CD pipelines.
• Communication: Excellent verbal and written communication skills.
• Fast Learner: Ability to quickly grasp new technologies and adapt to changing requirements.
• Extensive experience with Spark (PySpark, Spark SQL) for large-scale data processing.

Additional Information:
Qualifications: BE, MS, M.Tech, or MCA.
Certifications: Databricks Certified Associate.
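As a rough illustration of the Delta Lake duties above (ACID upserts and time travel), here is a minimal PySpark sketch. The catalog/table names and landing path are hypothetical assumptions; real Databricks pipelines would differ.

```python
# Illustrative Delta Lake sketch: an ACID MERGE upsert into a Delta table,
# followed by a time-travel read. Names and paths are hypothetical.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

updates = spark.read.parquet(
    "abfss://landing@examplelake.dfs.core.windows.net/customers/"
)

target = DeltaTable.forName(spark, "main.curated.customers")

# ACID upsert: update matched rows, insert new ones, in one transaction.
(target.alias("t")
       .merge(updates.alias("s"), "t.customer_id = s.customer_id")
       .whenMatchedUpdateAll()
       .whenNotMatchedInsertAll()
       .execute())

# Time travel: read the table as of an earlier version for audit/debugging.
previous = spark.read.option("versionAsOf", 1).table("main.curated.customers")
```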
Posted 2 weeks ago
5.0 - 10.0 years
14 - 19 Lacs
Bengaluru
Remote
Role: Azure Specialist - DBT
Location: Bangalore
Mode: Remote

Education and Work Experience Requirements:
• Overall, 5 to 9 years of experience in the IT industry, with a minimum of 6 years working in data engineering.
• Translate complex business requirements into analytical SQL views using DBT.
• Support data ingestion pipelines using Airflow and Data Factory.
• Develop DBT macros to enable scalable and reusable code automation.

Mandatory Skills:
• Strong experience with DBT (Data Build Tool), or strong SQL / relational DWH knowledge - must have.
• Proficiency in SQL and a strong understanding of relational data warehouse concepts.
• Hands-on experience with Databricks (primarily Databricks SQL) - good to have.
• Familiarity with Apache Airflow and Azure Data Factory - nice to have.
• Experience working with Snowflake - nice to have.

Additional Information:
Qualifications: BE, MS, M.Tech, or MCA.
Certifications: Azure Big Data, Databricks Certified Associate.
Posted 2 weeks ago
3.0 - 5.0 years
5 - 8 Lacs
Bengaluru
Work from Office
Vacancy Name: Power BI Developer | Software Engineer
Location Country: India
Location City: Koramangala - Bangalore

Description:
Maintain static set-ups and rate maintenance to facilitate reconciliation of invoices. Meet Service Level Agreement targets. Work on SmartStream's Transactions, Fees Cost and Invoice Management solution, ensuring all service level agreements are met.

Key Responsibilities:
• Design and develop interactive dashboards and reports using Power BI.
• Connect to various data sources (SQL Server, Excel, SharePoint, etc.) and transform data using Power Query and DAX.
• Collaborate with business stakeholders to gather requirements and translate them into technical specifications.
• Optimize data models for performance and scalability.
• Implement row-level security and data governance best practices.
• Maintain and troubleshoot existing Power BI reports and dashboards.
• Integrate Power BI reports into other applications using embedded analytics.
• Stay updated with the latest Power BI features and best practices.

Key Skills:
• Bachelor's degree in Computer Science, Information Systems, or a related field.
• 3 to 5 years of hands-on experience with Power BI.
• Strong proficiency in DAX, Power Query (M), and data modelling.
• Experience with SQL and relational databases.
• Familiarity with data warehousing concepts and ETL processes.
• Understanding of business processes and KPIs.
• Excellent analytical and problem-solving skills.
• Strong communication and interpersonal skills.

Qualifications:
• Microsoft Certified: Data Analyst Associate (Power BI).
• Experience with Azure Data Services (e.g., Azure Data Factory, Synapse).
• Knowledge of Python or R for data analysis.
• Experience with Agile/Scrum methodologies.

Employment Type: Permanent

Equality Statement: SmartStream is an equal opportunities employer. We are committed to promoting equality of opportunity and following practices which are free from unfair and unlawful discrimination.
Posted 2 weeks ago
5.0 - 9.0 years
15 - 27 Lacs
Amritsar
Remote
Job Title: Senior Azure Data Engineer
Location: Remote
Experience Required: 5+ years

About the Role:
We are seeking a highly skilled Senior Azure Data Engineer to design and develop robust, scalable, and high-performance data pipelines using Azure technologies. The ideal candidate will have strong experience with modern data platforms and tools, including Azure Data Factory, Synapse, Databricks, and Data Lake, as well as expertise in SQL, Python, and CI/CD workflows.

Key Responsibilities:
• Design and implement end-to-end data pipelines using Azure Data Factory, Azure Synapse Analytics, Azure Databricks, and Azure Data Lake Storage Gen2.
• Ingest and integrate data from various sources such as SQL Server, APIs, blob storage, and on-premise systems, ensuring security and performance.
• Develop and manage ETL/ELT workflows and orchestrations in a scalable, optimized manner.
• Build and maintain data models, data marts, and data warehouse structures for analytics and reporting.
• Write and optimize complex SQL queries, stored procedures, and Python scripts.
• Ensure data quality, consistency, and integrity through validation frameworks and best practices (see the sketch after this listing).
• Support and enhance CI/CD pipelines using Azure DevOps, Git, and ARM/Bicep templates.
• Collaborate with data scientists, analysts, and business stakeholders to understand requirements and deliver impactful solutions.
• Enforce data governance, security, and compliance policies, including use of Azure Key Vault and access controls.
• Mentor junior data engineers, lead design discussions, and conduct code reviews.
• Monitor and troubleshoot issues related to performance, cost, and scalability across data systems.

Required Skills & Experience:
• 6+ years of experience in data engineering or related fields.
• 3+ years of hands-on experience with Azure cloud services, specifically:
- Azure Data Factory (ADF)
- Azure Synapse Analytics (Dedicated and Serverless SQL Pools)
- Azure Databricks (Spark preferred)
- Azure Data Lake Storage Gen2 (ADLS)
- Azure SQL / Managed Instance / Cosmos DB
• Strong proficiency in SQL, PySpark, and Python.
• Solid experience with CI/CD tools: Azure DevOps, Git, ARM/Bicep templates.
• Experience with data warehousing, dimensional modeling, and medallion/lakehouse architecture.
• In-depth knowledge of data security best practices, including encryption, identity management, and network configurations in Azure.
• Expertise in performance tuning, data partitioning, and cost optimization.
• Excellent communication, problem-solving, and stakeholder management skills.
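As one illustrative take on the "validation frameworks" responsibility above, here is a minimal PySpark sketch of declarative quality checks run before publishing a table. The check names, columns, thresholds, and table names are hypothetical assumptions.

```python
# Illustrative validation-framework sketch: declarative checks applied to a
# DataFrame before it is published downstream. All names are hypothetical.
from pyspark.sql import DataFrame, SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

CHECKS = {
    "no_null_keys": lambda df: df.filter(F.col("order_id").isNull()).count() == 0,
    "positive_amounts": lambda df: df.filter(F.col("amount") <= 0).count() == 0,
    "row_count_floor": lambda df: df.count() > 1000,
}

def validate(df: DataFrame) -> None:
    """Run every registered check; raise if any fails."""
    failures = [name for name, check in CHECKS.items() if not check(df)]
    if failures:
        raise ValueError(f"Data quality checks failed: {failures}")

orders = spark.table("silver.orders")
validate(orders)  # stop bad data before it reaches consumers
orders.write.mode("overwrite").saveAsTable("gold.orders")
```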
Posted 3 weeks ago
5.0 - 9.0 years
0 Lacs
ahmedabad, gujarat
On-site
The Senior Data Engineer role requires 5 to 8 years of experience and expertise in Azure Synapse and deep Azure data engineering. You will be responsible for designing and implementing data technology and modern data platform solutions within the Azure environment. Your key responsibilities will include collaborating with Data Architects, Presales Architects, and Cloud Engineers to deliver high-quality solutions, mentoring junior team members, and conducting research to stay updated with the latest industry trends. You will also be expected to develop and enforce best practices in data engineering and platform development.

We are looking for candidates with substantial experience in data engineering and Azure data services, strong analytical and problem-solving skills, proven experience working with diverse customers, and expertise in developing data pipelines, APIs, file formats, and databases. Familiarity with technologies such as Synapse, ADLS2, Databricks, Azure Data Factory, Azure SQL, Key Vault, and Azure Security is essential. Experience with CI/CD practices, specifically within Azure DevOps, and agile delivery methods is preferred.

This is a full-time position based in Ahmedabad, India, with a hybrid work mode. The work schedule is Monday to Friday, day shifts. As a Senior Data Engineer, you will have the opportunity to contribute to the development of cutting-edge data solutions, support various teams within the organization, and play a key role in mentoring and guiding junior team members.

To apply for this position, please provide information on your notice period, current annual salary, expected annual salary, and current city of residence. The ideal candidate will have a minimum of 6 years of experience with Azure data services, Azure Synapse, Databricks, and Azure Data Factory. If you have a passion for data engineering, a drive for continuous learning, and a desire to work with innovative technologies, we encourage you to apply for this exciting opportunity.
Posted 3 weeks ago
5.0 - 8.0 years
25 - 35 Lacs
Ahmedabad
Hybrid
Must Have: Azure Synapse and deep Azure data engineering experience; solid understanding of data platform, infrastructure, and security.

Job Description
As a Senior Data Engineer, you will provide expertise in building data technology and modern data platform solutions in Azure. You will play a crucial role in developing complex data solutions, supporting our Data Architects, Presales Architects, and Cloud Engineers, and mentoring junior team members.

Key Responsibilities
• Design and build complex data solutions leveraging Azure data services.
• Support and collaborate with Data Architects, Presales Architects, and Cloud Engineers to deliver top-notch solutions.
• Mentor and guide junior team members, fostering a culture of continuous learning and improvement.
• Conduct R&D to stay ahead of industry trends and integrate new technologies.
• Develop and enforce best practices in data engineering and platform development.

What We Need From You
• Substantial experience in data engineering and Azure data services.
• Strong analytical and problem-solving skills.
• Proven experience working with a variety of customers.
• In-depth knowledge of engineering practices and processes.
• Expertise in developing data pipelines, working with APIs, multiple file formats, and databases.

Specific Technologies and Disciplines
• Fabric (nice to have, not essential)
• Synapse
• ADLS2
• Databricks
• Azure Data Factory (including metadata-driven pipelines; see the sketch after this listing)
• Azure SQL
• Key Vault
• Azure Security (e.g., use of private endpoints)
• CI/CD, especially within Azure DevOps
• Agile delivery
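"Metadata-driven pipelines" typically means a control table describes each source and one generic loop ingests them all, rather than hand-building a pipeline per source. A minimal sketch of that pattern follows; the control table meta.ingestion_control and its column names are hypothetical assumptions.

```python
# Illustrative metadata-driven ingestion loop: a control table drives one
# generic load routine. The control-table schema is an assumption.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Each control row: source format, source path, target table, load mode.
control = spark.table("meta.ingestion_control").collect()

for row in control:
    df = (spark.read.format(row["source_format"])   # e.g. "parquet", "csv"
                    .option("header", "true")
                    .load(row["source_path"]))
    (df.write.mode(row["load_mode"])                # "append" or "overwrite"
             .saveAsTable(row["target_table"]))
```

Adding a new source then becomes a row in the control table instead of new pipeline code, which is the main appeal of the pattern.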
Posted 3 weeks ago
2.0 - 7.0 years
7 - 12 Lacs
Chennai
Work from Office
Company Overview
Incedo is a US-based consulting, data science and technology services firm with over 3000 people helping clients from our six offices across the US, Mexico, and India. We help our clients achieve competitive advantage through end-to-end digital transformation. Our uniqueness lies in bringing together strong engineering, data science, and design capabilities coupled with deep domain understanding. We combine services and products to maximize business impact for our clients in the telecom, banking, wealth management, product engineering, and life science & healthcare industries.

Working at Incedo will provide you an opportunity to work with industry-leading client organizations, deep technology and domain experts, and global teams. Incedo University, our learning platform, provides ample learning opportunities, starting with a structured onboarding program and carrying through various stages of your career. A variety of fun activities is also an integral part of our friendly work environment. Our flexible career paths allow you to grow into a program manager, a technical architect, or a domain expert based on your skills and interests.

Our mission is to enable our clients to maximize business impact from technology by harnessing the transformational impact of emerging technologies and bridging the gap between business and technology.

Role Description
• Collaborate with business stakeholders and other technical team members to acquire data sources that are most relevant to business needs and goals.
• Demonstrate deep technical and domain knowledge of relational and non-relational databases, data warehouses, and data lakes, among other structured and unstructured storage options.
• Determine solutions that are best suited to develop a pipeline for a particular data source.
• Develop data flow pipelines to extract, transform, and load data from various data sources in various forms, including custom ETL pipelines that enable model and product development.
• Write custom scripts to extract data from unstructured/semi-structured sources (see the sketch after this listing).
• Provide clear documentation for delivered solutions and processes, integrating documentation with the appropriate corporate stakeholders.
• Identify and implement internal process improvements for data management (automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability).
• Stay current with and adopt new tools and applications to ensure high-quality and efficient solutions.
• Build cross-platform data strategy to aggregate multiple sources and process development datasets.

Technical Skills (nice to have)
• 2+ years of experience with Big Data Management (BDM) for relational and non-relational data (formats like JSON, XML, Avro, Parquet, copybook, etc.).
• Knowledge of DevOps processes (CI/CD) and infrastructure as code.
• Knowledge of Master Data Management (MDM) and Data Quality tools.
• Experience developing REST APIs.
• Knowledge of key machine learning concepts and MLOps.

Qualifications
• Bachelor's degree in computer engineering.
• 3+ years of ETL design, development, and performance tuning using ETL tools such as SSIS/ADF in a multi-dimensional data warehousing environment.
• 3+ years of experience setting up and operating data pipelines using Python or SQL.
• 3+ years of advanced SQL programming: PL/SQL, T-SQL.
• 3+ years of strong, extensive hands-on experience in Azure, preferably on data-heavy/analytics applications leveraging relational and NoSQL databases, Data Warehouse, and Big Data.
• 3+ years of experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Databricks/Spark, Azure SQL DW/Synapse, and Azure Functions.
• 2+ years of experience defining and enabling data quality standards for auditing and monitoring.
• Strong analytical abilities and a strong intellectual curiosity.
• In-depth knowledge of relational database design, data warehousing, and dimensional data modeling concepts.
• Deep understanding of REST and good API design.
• Strong collaboration and teamwork skills, and excellent written and verbal communication skills.
• Self-starter and motivated, with the ability to work in a fast-paced development environment.
• Agile experience highly desirable.
• Proficiency in the development environment, including IDE, database server, Git, continuous integration, unit-testing tools, and defect management tools.
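As one illustration of the "custom scripts for semi-structured sources" item above, here is a minimal Python sketch that flattens nested JSON from an API into a tabular file for the warehouse. The endpoint, payload shape, and output path are hypothetical assumptions.

```python
# Illustrative custom extraction script: pull nested JSON from an API and
# flatten it into a tabular frame. Endpoint and field names are hypothetical.
import pandas as pd
import requests

resp = requests.get("https://api.example.com/v1/orders", timeout=30)
resp.raise_for_status()
payload = resp.json()

# json_normalize flattens nested records into columns like customer_id.
orders = pd.json_normalize(payload["orders"], sep="_")
orders.to_parquet("landing/orders.parquet", index=False)  # needs pyarrow
```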
Posted 3 weeks ago
6.0 - 8.0 years
7 - 11 Lacs
Kolkata, Chennai, Bengaluru
Work from Office
Job Title: Azure Data Engineer
Location State: Karnataka, Telangana, West Bengal
Location City: Bangalore, Hyderabad, Kolkata
Experience Required: 6 to 8 Year(s)
CTC Range: 7 to 11 LPA
Shift: Day Shift
Work Mode: Onsite
Position Type: C2H
Openings: 3
Company Name: VARITE INDIA PRIVATE LIMITED

About The Client:
The client is an Indian multinational technology company specializing in information technology services and consulting. Headquartered in Mumbai, it is part of the Tata Group and operates in 150 locations across 46 countries.

About The Job: Azure Data Engineer
Essential Job Functions: Azure Data Engineer

Qualifications:
Minimum 5-6 years of hands-on experience building ETL pipelines using Azure Data Factory / Azure Synapse.

How to Apply:
Interested candidates are invited to submit their resume using the apply online button on this job post.

About VARITE:
VARITE is a global staffing and IT consulting company providing technical consulting and team augmentation services to Fortune 500 companies in the USA, UK, Canada, and India. VARITE is currently a primary and direct vendor to leading corporations in the verticals of networking, cloud infrastructure, hardware and software, digital marketing and media solutions, clinical diagnostics, utilities, gaming and entertainment, and financial services.

Equal Opportunity Employer:
VARITE is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. We do not discriminate on the basis of race, color, religion, sex, sexual orientation, gender identity or expression, national origin, age, marital status, veteran status, or disability status.

Unlock Rewards: Refer Candidates and Earn
If you're not available or interested in this opportunity, please pass this along to anyone in your network who might be a good fit and interested in our open positions. VARITE offers a Candidate Referral program, where you'll receive a one-time referral bonus based on the following scale if the referred candidate completes a three-month assignment with VARITE:
• 0 - 2 Yrs.: INR 5,000
• 2 - 6 Yrs.: INR 7,500
• 6+ Yrs.: INR 10,000
Posted 3 weeks ago
4.0 - 6.0 years
8 - 12 Lacs
Ahmedabad
Work from Office
SUMMARY
We are looking for a highly motivated Azure Data Engineer to join our dynamic development team. This is an exciting opportunity for an individual who is passionate about working on trending technology and wants to contribute to the growth of our company.

KEY RESPONSIBILITIES
• Design, develop, and maintain scalable data pipelines using Azure Data Factory and Azure Stream Analytics.
• Collaborate with data scientists and analysts to understand data requirements and implement solutions that support analytics and machine learning initiatives.
• Optimize data storage and retrieval mechanisms to ensure performance, reliability, and cost-effectiveness.
• Implement data governance and security best practices to ensure compliance and data integrity.
• Troubleshoot and debug data pipeline issues, providing timely resolution and proactive monitoring.

REQUIREMENTS
• Proven experience as a Data Engineer or in a similar role.
• Experience in designing and hands-on development of cloud-based (Azure) analytics solutions.
• Expert-level understanding of Azure Data Factory, Azure Synapse, Azure SQL, Azure Data Lake, Azure App Service, Azure Databricks, Azure IoT, Azure HDInsight + Spark, and Azure Stream Analytics.
• Knowledge of DevOps processes (including CI/CD) and infrastructure as code is essential.
• Experience with SQL Server and stored procedures.
• Thorough understanding of Azure infrastructure offerings.
• Strong experience in common data warehouse modelling principles, including Kimball and Inmon.
• Experience with additional modern database terminologies.
• Working knowledge of Python, Java, or Scala is desirable.
• Strong knowledge of data modeling, ETL processes, and database technologies.
• Solid understanding of data governance, data security, and data quality best practices.
• Strong analytical and problem-solving skills, with attention to detail.

MANDATORY SKILLS
Azure Data Factory, Azure Synapse, Azure SQL, Azure Data Lake, Azure App Service, Azure Databricks, Azure IoT, Azure HDInsight + Spark, Azure Stream Analytics
Posted 3 weeks ago
5.0 - 8.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Role Purpose
The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Responsibilities
• Oversee and support the process by reviewing daily transactions on performance parameters.
• Review the performance dashboard and the scores for the team.
• Support the team in improving performance parameters by providing technical support and process guidance.
• Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions.
• Ensure standard processes and procedures are followed to resolve all client queries, and resolve them as per the SLAs defined in the contract.
• Develop an understanding of the process/product for the team members to facilitate better client interaction and troubleshooting.
• Document and analyze call logs to spot the most frequently occurring trends and prevent future problems.
• Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution.
• Ensure all product information and disclosures are given to clients before and after call/email requests, and avoid legal challenges by monitoring compliance with service agreements.
• Handle technical escalations through effective diagnosis and troubleshooting of client queries, and manage technical roadblocks/escalations as per SLA and quality requirements; if unable to resolve an issue, escalate it to TA & SES in a timely manner.
• Provide product support and resolution to clients by performing question diagnosis while guiding users through step-by-step solutions, troubleshooting all queries in a user-friendly, courteous, and professional manner.
• Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business.
• Organize ideas and effectively communicate oral messages appropriate to listeners and situations.
• Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs.
• Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client; mentor and guide Production Specialists on improving technical knowledge.
• Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with Production Specialists; develop and conduct trainings (triages) within products as per target, and inform the client about the triages being conducted.
• Undertake product trainings to stay current with product features, changes, and updates; enroll in product-specific and other trainings per client requirements/recommendations.
• Identify and document the most common problems and recommend appropriate resolutions to the team.
• Update job knowledge by participating in self-learning opportunities and maintaining personal networks.

Mandatory Skills: Azure Data Factory
Experience: 5-8 years
Posted 3 weeks ago
9.0 - 14.0 years
18 - 27 Lacs
Pune
Hybrid
Application of Architectural Principles and Methods | Responsible for the overall architecture of the data & analytics platform. Develops architectural guidelines for data management and implementation concepts for the data platform.
Required Candidate Profile: Experience with the Azure platform, including Azure Data Factory, Azure SQL Database, Azure Synapse, Azure Databricks, and Power BI. Extensive knowledge of data modeling, ETL processes, and data warehousing.
Posted 3 weeks ago
5.0 - 10.0 years
22 - 25 Lacs
Hyderabad
Work from Office
Overview
Seeking an Associate Manager, Data Operations, to support our growing data organization. In this role, you will assist in maintaining data pipelines and corresponding platforms (on-prem and cloud) while working closely with global teams on DataOps initiatives.
• Support the day-to-day operations of data pipelines, ensuring data governance, reliability, and performance optimization on Microsoft Azure. Hands-on experience with Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, and real-time streaming architectures is preferred.
• Assist in ensuring the availability, scalability, automation, and governance of enterprise data pipelines supporting analytics, AI/ML, and business intelligence.
• Contribute to DataOps programs, aligning with business objectives, data governance standards, and enterprise data strategy.
• Help implement real-time data observability, monitoring, and automation frameworks to improve data reliability, quality, and operational efficiency.
• Support the development of governance models and execution roadmaps to enhance efficiency across Azure, AWS, GCP, and on-prem environments.
• Work on CI/CD integration, data pipeline automation, and self-healing capabilities to improve enterprise-wide DataOps processes (see the sketch after this listing).
• Collaborate with cross-functional teams to support and maintain next-generation Data & Analytics platforms while promoting an agile and high-performing DataOps culture.
• Assist in the adoption of Data & Analytics technology transformations, ensuring automation for proactive issue identification and resolution.
• Partner with cross-functional teams to support process improvements, best practices, and operational efficiencies within DataOps.

Responsibilities
• Assist in the implementation and optimization of enterprise-scale data pipelines using Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, and Azure Stream Analytics.
• Support data ingestion, transformation, orchestration, and storage workflows, ensuring data reliability, integrity, and availability.
• Help ensure seamless batch, real-time, and streaming data processing, focusing on high availability and fault tolerance.
• Contribute to DataOps automation efforts, including CI/CD for data pipelines, automated testing, and version control using Azure DevOps and Terraform.
• Collaborate with Data Engineering, Analytics, AI/ML, CloudOps, and Business Intelligence teams to support data-driven decision-making.
• Assist in aligning DataOps practices with regulatory and security requirements by working with IT, data stewards, and compliance teams.
• Support data operations and sustainment activities, including testing and monitoring processes for global products and projects.
• Participate in data capture, storage, integration, governance, and analytics efforts, working alongside cross-functional teams.
• Assist in managing day-to-day DataOps activities, ensuring adherence to service-level agreements (SLAs) and business requirements.
• Engage with SMEs and business stakeholders to ensure data platform capabilities align with business needs.
• Contribute to Agile work intake and execution processes, helping to maintain efficiency in data platform teams.
• Help troubleshoot and resolve issues related to cloud infrastructure and data services in collaboration with technical teams.
• Support the development and automation of operational policies and procedures, improving efficiency and resilience.
• Assist in incident response and root cause analysis, contributing to self-healing mechanisms and mitigation strategies.
• Foster a customer-centric approach, advocating for operational excellence and continuous improvement in service delivery.
• Help build a collaborative, high-performing team culture, promoting automation and efficiency within DataOps.
• Adapt to shifting priorities and support cross-functional teams in maintaining productivity and achieving business goals.
• Utilize technical expertise in cloud and data operations to support service reliability and scalability.

Qualifications
• 5+ years of technology work experience in a large-scale global organization, with CPG industry experience preferred.
• 5+ years of experience in Data & Analytics roles, with hands-on expertise in data operations and governance.
• 2+ years of experience working within a cross-functional IT organization, collaborating with multiple teams.
• Experience in a lead or senior support role, with a focus on DataOps execution and delivery.
• Strong communication skills, with the ability to collaborate with stakeholders and articulate technical concepts to non-technical audiences.
• Analytical and problem-solving abilities, with a focus on prioritizing customer needs and operational improvements.
• Customer-focused mindset, ensuring high-quality service delivery and operational efficiency.
• Growth mindset, with a willingness to learn and adapt to new technologies and methodologies in a fast-paced environment.
• Experience supporting data operations in a Microsoft Azure environment, including data pipeline automation.
• Familiarity with Site Reliability Engineering (SRE) principles, such as monitoring, automated issue remediation, and scalability improvements.
• Understanding of operational excellence in complex, high-availability data environments.
• Ability to collaborate across teams, building strong relationships with business and IT stakeholders.
• Basic understanding of data management concepts, including master data management, data governance, and analytics.
• Knowledge of data acquisition, data catalogs, data standards, and data management tools.
• Strong execution and organizational skills, with the ability to follow through on operational plans and drive measurable results.
• Adaptability in a dynamic, fast-paced environment, with the ability to shift priorities while maintaining productivity.
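As a rough illustration of the "self-healing" and automated-remediation themes above, here is a minimal Python sketch of a retry-with-backoff wrapper around a pipeline run. The run_pipeline and notify callables are hypothetical stand-ins, not real library APIs.

```python
# Illustrative self-healing sketch: retry a transient pipeline step with
# backoff, then alert on persistent failure. Callables are hypothetical.
import logging
import time

log = logging.getLogger("dataops")

def run_with_retry(run_pipeline, notify, attempts: int = 3, backoff_s: int = 60):
    for attempt in range(1, attempts + 1):
        try:
            return run_pipeline()
        except Exception as exc:  # broad on purpose: this is an ops wrapper
            log.warning("attempt %d/%d failed: %s", attempt, attempts, exc)
            if attempt == attempts:
                notify(f"Pipeline failed after {attempts} attempts: {exc}")
                raise
            time.sleep(backoff_s * attempt)  # linear backoff between retries
```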
Posted 3 weeks ago
3.0 - 7.0 years
12 - 15 Lacs
Hyderabad, Bengaluru, Delhi / NCR
Work from Office
We are looking for an experienced Data Engineer/BI Developer with strong hands-on expertise in Microsoft Fabric technologies, including OneLake, Lakehouse, Data Lake, Warehouse, and Real-Time Analytics, along with proven skills in Power BI, Azure Synapse Analytics, and Azure Data Factory (ADF). The ideal candidate should also possess working knowledge of DevOps practices for data engineering and deployment automation.

Key Responsibilities:
• Design and implement scalable data solutions using Microsoft Fabric components: OneLake, Data Lake, Lakehouse, Warehouse, and Real-Time Analytics.
• Build and manage end-to-end data pipelines integrating structured and unstructured data from multiple sources.
• Integrate Microsoft Fabric with Power BI, Synapse Analytics, and Azure Data Factory to enable modern data analytics solutions.
• Develop and maintain Power BI datasets, dashboards, and reports using data from Fabric Lakehouses or Warehouses.
• Implement data governance, security, and compliance policies within the Microsoft Fabric ecosystem.
• Collaborate with stakeholders for requirements gathering, data modeling, and performance tuning.
• Leverage Azure DevOps / Git for version control, CI/CD pipelines, and deployment automation of data artifacts.
• Monitor, troubleshoot, and optimize data flows and transformations for performance and reliability.

Required Skills:
• 3-8 years of experience in data engineering, BI development, or similar roles.
• Strong hands-on experience with the Microsoft Fabric ecosystem: OneLake, Data Lake, Lakehouse, Warehouse, Real-Time Analytics.
• Proficient in Power BI for interactive reporting and visualization.
• Experience with Azure Synapse Analytics, ADF (Azure Data Factory), and related Azure services.
• Good understanding of data modeling, SQL, T-SQL, and Spark/Delta Lake concepts.
• Working knowledge of DevOps tools and CI/CD processes for data deployment (Azure DevOps preferred).
• Familiarity with DataOps and version control practices for data solutions.

Preferred Qualifications:
• Microsoft certifications (e.g., DP-203, PL-300, or Microsoft Fabric certifications) are a plus.
• Experience with Python, Notebooks, or KQL for Real-Time Analytics is advantageous.
• Knowledge of data governance tools (e.g., Microsoft Purview) is a plus.

Location: Remote - Bengaluru, Hyderabad, Delhi / NCR, Chennai, Pune, Kolkata, Ahmedabad, Mumbai
Posted 3 weeks ago
6.0 - 10.0 years
20 - 25 Lacs
Bengaluru
Work from Office
Database administration skills: database software installation, patch management, and maintenance; data ETL and ELT jobs on Microsoft SQL Server and Oracle databases; Azure Data Factory and Synapse; data warehousing; data mining; database backups and recovery.
Required Candidate Profile: 6-9 years of experience as a database administrator; data warehouse and data lake experience; SQL development skills covering tables, views, schemas, procedures, functions, triggers, CTEs, cursors, security, logging, data structures, and data integration.
Posted 3 weeks ago
6.0 - 8.0 years
1 - 6 Lacs
Noida
Work from Office
Urgent hiring: Microsoft Fabric Cloud Architect, 6-8 years, Noida, immediate to 30 days notice.
Skills: Azure Cloud, Microsoft Fabric, PySpark, DAX, Python, Azure Synapse, ADF, Databricks, ETL pipelines.
Posted 3 weeks ago
2.0 - 4.0 years
7 - 11 Lacs
Hyderabad
Work from Office
Overview
Seeking an Associate Manager, Data Operations, to support our growing data organization. In this role, you will assist in maintaining data pipelines and corresponding platforms (on-prem and cloud) while working closely with global teams on DataOps initiatives.
• Support the day-to-day operations of data pipelines, ensuring data governance, reliability, and performance optimization on Microsoft Azure. Hands-on experience with Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, and real-time streaming architectures is preferred.
• Assist in ensuring the availability, scalability, automation, and governance of enterprise data pipelines supporting analytics, AI/ML, and business intelligence.
• Contribute to DataOps programs, aligning with business objectives, data governance standards, and enterprise data strategy.
• Help implement real-time data observability, monitoring, and automation frameworks to improve data reliability, quality, and operational efficiency.
• Support the development of governance models and execution roadmaps to enhance efficiency across Azure, AWS, GCP, and on-prem environments.
• Work on CI/CD integration, data pipeline automation, and self-healing capabilities to improve enterprise-wide DataOps processes.
• Collaborate with cross-functional teams to support and maintain next-generation Data & Analytics platforms while promoting an agile and high-performing DataOps culture.
• Assist in the adoption of Data & Analytics technology transformations, ensuring automation for proactive issue identification and resolution.
• Partner with cross-functional teams to support process improvements, best practices, and operational efficiencies within DataOps.

Responsibilities
• Assist in the implementation and optimization of enterprise-scale data pipelines using Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, and Azure Stream Analytics.
• Support data ingestion, transformation, orchestration, and storage workflows, ensuring data reliability, integrity, and availability.
• Help ensure seamless batch, real-time, and streaming data processing, focusing on high availability and fault tolerance.
• Contribute to DataOps automation efforts, including CI/CD for data pipelines, automated testing, and version control using Azure DevOps and Terraform.
• Collaborate with Data Engineering, Analytics, AI/ML, CloudOps, and Business Intelligence teams to support data-driven decision-making.
• Assist in aligning DataOps practices with regulatory and security requirements by working with IT, data stewards, and compliance teams.
• Support data operations and sustainment activities, including testing and monitoring processes for global products and projects.
• Participate in data capture, storage, integration, governance, and analytics efforts, working alongside cross-functional teams.
• Assist in managing day-to-day DataOps activities, ensuring adherence to service-level agreements (SLAs) and business requirements.
• Engage with SMEs and business stakeholders to ensure data platform capabilities align with business needs.
• Contribute to Agile work intake and execution processes, helping to maintain efficiency in data platform teams.
• Help troubleshoot and resolve issues related to cloud infrastructure and data services in collaboration with technical teams.
• Support the development and automation of operational policies and procedures, improving efficiency and resilience.
• Assist in incident response and root cause analysis, contributing to self-healing mechanisms and mitigation strategies.
• Foster a customer-centric approach, advocating for operational excellence and continuous improvement in service delivery.
• Help build a collaborative, high-performing team culture, promoting automation and efficiency within DataOps.
• Adapt to shifting priorities and support cross-functional teams in maintaining productivity and achieving business goals.
• Utilize technical expertise in cloud and data operations to support service reliability and scalability.

Qualifications
• 5+ years of technology work experience in a large-scale global organization, with CPG industry experience preferred.
• 5+ years of experience in Data & Analytics roles, with hands-on expertise in data operations and governance.
• 2+ years of experience working within a cross-functional IT organization, collaborating with multiple teams.
• Experience in a lead or senior support role, with a focus on DataOps execution and delivery.
• Strong communication skills, with the ability to collaborate with stakeholders and articulate technical concepts to non-technical audiences.
• Analytical and problem-solving abilities, with a focus on prioritizing customer needs and operational improvements.
• Customer-focused mindset, ensuring high-quality service delivery and operational efficiency.
• Growth mindset, with a willingness to learn and adapt to new technologies and methodologies in a fast-paced environment.
• Experience supporting data operations in a Microsoft Azure environment, including data pipeline automation.
• Familiarity with Site Reliability Engineering (SRE) principles, such as monitoring, automated issue remediation, and scalability improvements.
• Understanding of operational excellence in complex, high-availability data environments.
• Ability to collaborate across teams, building strong relationships with business and IT stakeholders.
• Basic understanding of data management concepts, including master data management, data governance, and analytics.
• Knowledge of data acquisition, data catalogs, data standards, and data management tools.
• Strong execution and organizational skills, with the ability to follow through on operational plans and drive measurable results.
• Adaptability in a dynamic, fast-paced environment, with the ability to shift priorities while maintaining productivity.
Posted 3 weeks ago
9.0 - 14.0 years
25 - 40 Lacs
Noida, Bengaluru
Hybrid
Role & Responsibilities
We are seeking an experienced and visionary Technical Expert (Architect) with deep expertise in Microsoft technologies and a strong focus on Microsoft Analytics solutions. The ideal candidate will design, implement, and optimize end-to-end analytics architectures, enabling organizations to derive actionable insights from their data. This role requires a blend of technical prowess, strategic thinking, and leadership capabilities to guide teams and stakeholders toward innovative solutions.

Key Responsibilities
Architectural Design:
• Lead the design and development of scalable and secure data analytics architectures using Microsoft technologies (e.g., Power BI, Azure Synapse Analytics, SQL Server).
• Define the data architecture, integration strategies, and frameworks to meet organizational goals.
Technical Leadership:
• Serve as the technical authority on Microsoft Analytics solutions, ensuring best practices in performance, scalability, and reliability.
• Guide cross-functional teams in implementing analytics platforms and solutions.
Solution Development:
• Oversee the development of data models, dashboards, and reports using Power BI and Azure Data Services.
• Implement data pipelines leveraging Azure Data Factory, Data Lake, and other Microsoft technologies.
Stakeholder Engagement:
• Collaborate with business leaders to understand requirements and translate them into robust technical solutions.
• Present architectural designs, roadmaps, and innovations to technical and non-technical audiences.
Continuous Optimization:
• Monitor and optimize analytics solutions for performance and cost-effectiveness.
• Stay updated on the latest Microsoft technologies and analytics trends to ensure the organization remains competitive.
Mentorship and Training:
• Mentor junior team members and provide technical guidance on analytics projects.
• Conduct training sessions to enhance the technical capabilities of internal teams.

Required Skills and Qualifications
Experience:
• 9+ years of experience working with Microsoft analytics and related technologies.
• Proven track record of designing and implementing analytics architectures.
Technical Expertise:
• Deep knowledge of Power BI, Azure Synapse Analytics, Azure Data Factory, SQL Server, Azure Data Lake, and Fabric.
• Proficiency in data modeling, ETL processes, and performance tuning.
Soft Skills:
• Strong problem-solving and analytical abilities.
• Excellent communication and interpersonal skills for stakeholder management.
Certifications (Preferred):
• Microsoft Certified: Azure Solutions Architect Expert
• Microsoft Certified: Data Analyst Associate
• Microsoft Certified: Azure Data Engineer Associate
Posted 3 weeks ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
As a Microsoft Fabric Professional at YASH Technologies, you will work with cutting-edge technologies to bring about real positive change in an increasingly virtual world. You will have the opportunity to contribute to business transformation by leveraging your experience in Azure Fabric, Azure Data Factory, Azure Databricks, Azure Synapse, Azure Storage Services, Azure SQL, ETL, Azure Cosmos DB, Event Hub, Azure Data Catalog, Azure Functions, and Azure Purview.

With 5-8 years of experience in Microsoft Cloud solutions, you will be involved in creating pipelines, datasets, dataflows, and integration runtimes, and in monitoring pipelines. Your role will also entail extracting, transforming, and loading data from source systems using Azure Databricks, as well as preparing DB design documents based on client requirements. Collaborating with the development team, you will create database structures, queries, and triggers, and work on SQL scripts and Synapse pipelines for data migration to Azure SQL.

Your responsibilities will include building data migration pipelines to the Azure cloud, migrating databases from on-prem SQL Server to the Azure dev environment, and implementing data governance in Azure. You will also use the Azure Data Catalog and apply experience in big data batch processing, interactive processing, and real-time processing solutions. Mandatory certifications are required for this role.

At YASH Technologies, you will have the opportunity to create a career path tailored to your aspirations within an inclusive team environment. Our Hyperlearning workplace is built on principles of flexible work arrangements, free spirit, emotional positivity, agile self-determination, trust, transparency, open collaboration, support for business goals realization, stable employment, and an ethical corporate culture. Join us to embark on a journey of continuous learning, unlearning, and relearning in a dynamic and evolving technology landscape.
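For illustration of the on-prem-to-Azure-SQL migration work described above, here is a minimal Python sketch of a one-off table copy using SQLAlchemy and pandas. The connection strings, schema, and table names are hypothetical; at scale this would typically be done with ADF or Synapse pipelines rather than a script.

```python
# Illustrative one-off table migration: on-prem SQL Server -> Azure SQL.
# Connection strings, schema, and table names are hypothetical.
import pandas as pd
import sqlalchemy as sa

src = sa.create_engine(
    "mssql+pyodbc://user:pass@onprem-sql/SalesDB"
    "?driver=ODBC+Driver+17+for+SQL+Server"
)
dst = sa.create_engine(
    "mssql+pyodbc://user:pass@example.database.windows.net/SalesDB"
    "?driver=ODBC+Driver+17+for+SQL+Server"
)

# Stream in chunks to keep memory bounded on large tables.
for chunk in pd.read_sql_table("orders", src, schema="dbo", chunksize=50_000):
    chunk.to_sql("orders", dst, if_exists="append", index=False)
```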
Posted 3 weeks ago