
1544 ADF Jobs - Page 3

Set up a Job Alert
JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

2.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Job Description:

Business Title: QA Manager
Years of Experience: 10+

Job Description: The purpose of this role is to ensure the developed software meets the client requirements and the business' quality standards within the project release cycle and established processes, and to lead QA technical initiatives that optimize the test approach and tools.

Must-Have Skills: At least 2 years in a lead role. Experience with Azure cloud. Testing file-based data lake or big data solutions. Worked on migration or implementation of Azure Data Factory projects. Strong experience in ETL/data pipeline testing, preferably with Azure Data Factory. Proficiency in SQL for data validation and test automation. Familiarity with Azure services: Data Lake, Synapse Analytics, Azure SQL, Key Vault, and Logic Apps. Experience with test management tools (e.g., Azure DevOps, JIRA, TestRail). Understanding of CI/CD pipelines and integration of QA in DevOps workflows. Experience with data quality frameworks (e.g., Great Expectations, Deequ). Knowledge of Python or PySpark for data testing automation. Exposure to Power BI or other BI tools for test result visualization. Azure Data Factory. Exposure to Azure Databricks. SQL/stored procedures on SQL Server. ADLS Gen2. Exposure to Python/shell scripting.

Good-to-Have Skills: Exposure to any ETL tool. Other cloud experience (AWS/GCP). Exposure to Spark architecture, including Spark Core, Spark SQL, DataFrame, Spark Streaming, and fault tolerance mechanisms. ISTQB or equivalent QA certification. Working experience with JIRA and Agile. Experience testing SOAP/API projects. Stakeholder communication. Microsoft Office.

Key Responsibilities: Lead the QA strategy, planning, and execution for ADF-based data pipelines and workflows. Design and implement test plans, test cases, and test automation for data ingestion, transformation, and loading processes. Validate data accuracy, completeness, and integrity across source systems, staging, and target data stores (e.g., Azure SQL, Synapse, Data Lake). Collaborate with data engineers, architects, and business analysts to understand data flows and ensure test coverage. Develop and maintain automated data validation scripts using tools like PySpark, SQL, PowerShell, or Azure Data Factory Data Flows. Monitor and report on data quality metrics, defects, and test coverage. Ensure compliance with data governance, security, and privacy standards. Mentor junior QA team members and coordinate testing efforts across sprints.

Education Qualification: Minimum Bachelor's degree in computer science, information systems, or a related field.
Certification (if any): Any basic-level certification in AWS/Azure/GCP; Snowflake Associate/Core.
Shift Timing: 12 PM to 9 PM and/or 2 PM to 11 PM (IST).
Location: DGS India - Mumbai - Goregaon Prism Tower
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
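The validation duties above (checking accuracy, completeness, and integrity between staging and target stores) typically reduce to reconciliation checks. Below is a minimal PySpark sketch of that idea; the Delta paths, column names, and threshold are illustrative assumptions, not details from the posting.

```python
# Hedged sketch: reconciling an ADF pipeline's staging input against its
# curated output with PySpark. Paths, columns, and the Delta format are
# illustrative assumptions, not details from the posting.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("adf-output-validation").getOrCreate()

source = spark.read.format("delta").load("/mnt/raw/orders")      # staging copy
target = spark.read.format("delta").load("/mnt/curated/orders")  # pipeline output

# Completeness: every source row should land in the target.
assert source.count() == target.count(), "row count mismatch"

# Integrity: business keys must stay unique after transformation.
dupes = target.groupBy("order_id").count().filter(F.col("count") > 1)
assert dupes.count() == 0, "duplicate business keys in target"

# Accuracy: spot-check an aggregate the transformation must preserve.
src_total = source.agg(F.sum("amount")).first()[0]
tgt_total = target.agg(F.sum("amount")).first()[0]
assert abs(src_total - tgt_total) < 0.01, "amount totals diverge"
```

In practice, checks like these are often codified in a framework such as Great Expectations (named in the posting) and wired into CI/CD so every pipeline run is gated on them.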

Posted 3 days ago

Apply

7.0 years

0 Lacs

Pune, Maharashtra, India

On-site

A.P. Moller – Maersk is the global leader in container shipping services. The business operates in 130 countries and employs 80,000 staff. An integrated container logistics company, Maersk aims to connect and simplify its customers' supply chains. Today, we have more than 180 nationalities represented in our workforce across 131 countries, which means we have an elevated level of responsibility to continue to build an inclusive workforce that is truly representative of our customers, their customers, and our vendor partners. We are responsible for moving 20% of global trade and are on a mission to become the Global Integrator of Container Logistics. To achieve this, we are transforming into an industrial digital giant by combining our assets across air, land, ocean, and ports with our growing portfolio of digital assets to connect and simplify our customers' supply chains through global end-to-end solutions, all the while rethinking the way we engage with customers and partners.

Key Responsibilities: Partner with business, product, and engineering teams to define problem statements, evaluate feasibility, and design AI/ML-driven solutions that deliver measurable business value. Lead and execute end-to-end AI/ML projects, from data exploration and model development to validation, deployment, and monitoring in production. Drive solution architecture using techniques in data engineering, programming, machine learning, NLP, and Generative AI. Champion the scalability, reproducibility, and sustainability of AI solutions by establishing best practices in model development, CI/CD, and performance tracking. Guide junior and associate AI/ML engineers through technical mentoring, code reviews, and solution reviews. Identify and evangelize the adoption of emerging tools, technologies, and methodologies across teams. Translate technical outputs into actionable insights for business stakeholders through storytelling, data visualizations, and stakeholder engagement.

We are looking for: A seasoned AI/ML engineer with 7+ years of hands-on experience delivering enterprise-grade AI/ML solutions. Advanced proficiency in Python, SQL, and PySpark, and experience working with cloud platforms (Azure preferred) and tools such as Databricks, Synapse, ADF, and Web Apps. Strong expertise in applied text analytics, NLP, and Generative AI, with real-world deployment exposure. Solid understanding of model evaluation, optimization, bias mitigation, and monitoring in production. A problem solver with scientific rigor, strong business acumen, and the ability to bridge the gap between data and decisions. Prior experience leading cross-functional AI initiatives or collaborating with engineering teams to deploy ML pipelines. A Bachelor's or Master's degree in computer science, engineering, statistics, or a related quantitative field; a PhD is a plus. Prior understanding of the shipping and logistics business domain is an advantage.

Maersk is committed to a diverse and inclusive workplace, and we embrace different styles of thinking. Maersk is an equal opportunities employer and welcomes applicants without regard to race, colour, gender, sex, age, religion, creed, national origin, ancestry, citizenship, marital status, sexual orientation, physical or mental disability, medical condition, pregnancy or parental leave, veteran status, gender identity, genetic information, or any other characteristic protected by applicable law.
We will consider qualified applicants with criminal histories in a manner consistent with all legal requirements. We are happy to support your need for any adjustments during the application and hiring process. If you need special assistance or an accommodation to use our website, apply for a position, or to perform a job, please contact us by emailing accommodationrequests@maersk.com.

Posted 3 days ago

Apply

4.0 - 7.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Key responsibilities: Collaborate with business, platform, and technology stakeholders to understand the scope of projects. Perform comprehensive exploratory data analysis at various levels of granularity to derive inferences for further solutioning, experimentation, and evaluation. Design, develop, and deploy robust enterprise AI solutions using Generative AI, NLP, machine learning, etc. Continuously focus on providing business value while ensuring technical sustainability. Promote and drive adoption of cutting-edge data science and AI practices within the team. Continuously stay up to date on relevant technologies and use this knowledge to push the team forward.

We are looking for: A team player with 4-7 years of experience in the field of data science and AI. Proficiency with programming/querying languages like Python, SQL, and PySpark, along with Azure cloud platform tools like Databricks, ADF, Synapse, Web Apps, etc. An individual with strong work experience in text analytics, NLP, and Generative AI. A person with a scientific and analytical mindset, comfortable with brainstorming and ideation. A doer with a deep interest in driving business outcomes through AI/ML. A candidate with a Bachelor's or Master's degree in engineering or computer science, with or without a specialization in AI/ML. A candidate with strong business acumen and a desire to collaborate with business teams and help them solve business problems. Prior understanding of the shipping and logistics business domain is an advantage.

Maersk is committed to a diverse and inclusive workplace, and we embrace different styles of thinking. Maersk is an equal opportunities employer and welcomes applicants without regard to race, colour, gender, sex, age, religion, creed, national origin, ancestry, citizenship, marital status, sexual orientation, physical or mental disability, medical condition, pregnancy or parental leave, veteran status, gender identity, genetic information, or any other characteristic protected by applicable law. We will consider qualified applicants with criminal histories in a manner consistent with all legal requirements. We are happy to support your need for any adjustments during the application and hiring process. If you need special assistance or an accommodation to use our website, apply for a position, or to perform a job, please contact us by emailing accommodationrequests@maersk.com.

Posted 3 days ago

Apply

8.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site

Job Description: Candidates with 8+ years of experience in the IT industry and strong .NET/.NET Core/Azure Cloud Services/Azure DevOps skills. This is a client-facing role and hence requires strong communication skills. This is for a US client, and the resource should be hands-on, with experience in coding and Azure Cloud.

Responsibilities include:
Design, develop, enhance, document, and maintain robust applications using .NET Core 6/8+, C#, REST APIs, T-SQL, and modern JavaScript/jQuery
Integrate and support third-party APIs and external services
Collaborate across cross-functional teams to deliver scalable solutions across the full technology stack
Identify, prioritize, and execute tasks throughout the Software Development Life Cycle (SDLC)
Participate in Agile/Scrum ceremonies and manage tasks using Jira
Understand technical priorities, architectural dependencies, risks, and implementation challenges
Troubleshoot, debug, and optimize existing solutions with a strong focus on performance and reliability

Primary Skills: 8+ years of hands-on development experience with:
C#, .NET Core 6/8+, Entity Framework / EF Core
JavaScript, jQuery, REST APIs
Expertise in MS SQL Server, including complex SQL queries, stored procedures, views, functions, packages, cursors, tables, and object types
Skilled in unit testing with XUnit and MSTest
Strong in software design patterns, system architecture, and scalable solution design
Ability to lead and inspire teams through clear communication, technical mentorship, and ownership
Strong problem-solving and debugging capabilities
Ability to write reusable, testable, and efficient code
Develop and maintain frameworks and shared libraries to support large-scale applications
Excellent technical documentation, communication, and leadership skills
Microservices and Service-Oriented Architecture (SOA)
Experience in API integrations

2+ years of hands-on experience with Azure Cloud Services, including:
Azure Functions and Azure Durable Functions
Azure Service Bus, Event Grid, Storage Queues
Blob Storage, Azure Key Vault, SQL Azure
Application Insights, Azure Monitoring

Secondary Skills:
Familiarity with AngularJS, ReactJS, and other front-end frameworks
Experience with Azure API Management (APIM)
Knowledge of Azure containerization and orchestration (e.g., AKS/Kubernetes)
Experience with Azure Data Factory (ADF) and Logic Apps
Exposure to application support and operational monitoring
Azure DevOps CI/CD pipelines (Classic / YAML)

Qualification: Any UG / PG Degree / Engineering Graduates
Experience: Minimum 8+ years
Gender: Male / Female
Job Location: Trivandrum / Kochi (Kerala)
Job Type: Full Time | Mid Shift | Sat & Sun Week Off
Working Time: 12:01 PM to 9:00 PM
Project: European client | Shift: Mid Shift (12:01 PM to 9:00 PM) | WFO
Salary: Rs. 18,00,000 to 30,00,000 per annum
Apply to hr@trueledge.com or info@trueledge.com

Posted 3 days ago

Apply

7.0 years

8 - 9 Lacs

Thiruvananthapuram

On-site

7 - 9 Years | 4 Openings | Trivandrum

Role description

Role Proficiency: This role requires proficiency in developing data pipelines, including coding and testing pipelines that ingest, wrangle, transform, and join data from various sources. The ideal candidate should be adept with ETL tools such as Informatica, Glue, Databricks, and DataProc, with strong coding skills in Python, PySpark, and SQL. This position demands independence and proficiency across various data domains. Expertise in data warehousing solutions such as Snowflake, BigQuery, Lakehouse, and Delta Lake is essential, including the ability to calculate processing costs and address performance issues. A solid understanding of DevOps and infrastructure needs is also required.

Outcomes: Act creatively to develop pipelines/applications by selecting appropriate technical options, optimizing application development, maintenance, and performance through design patterns and reuse of proven solutions. Support the Project Manager in day-to-day project execution and account for the developmental activities of others. Interpret requirements and create optimal architecture and design solutions in accordance with specifications. Document and communicate milestones/stages for end-to-end delivery. Code using best standards; debug and test solutions to ensure best-in-class quality. Tune code performance and align it with the appropriate infrastructure, understanding cost implications of licenses and infrastructure. Create data schemas and models effectively. Develop and manage data storage solutions, including relational databases, NoSQL databases, Delta Lakes, and data lakes. Validate results with user representatives, integrating the overall solution. Influence and enhance customer satisfaction and employee engagement within project teams.

Measures of Outcomes: Adherence to engineering processes and standards. Adherence to schedule/timelines. Adherence to SLAs where applicable. Number of defects post delivery. Number of non-compliance issues. Reduction in recurrence of known defects. Quick turnaround of production bugs. Completion of applicable technical/domain certifications. Completion of all mandatory training requirements. Efficiency improvements in data pipelines (e.g., reduced resource consumption, faster run times). Average time to detect, respond to, and resolve pipeline failures or data issues. Number of data security incidents or compliance breaches.

Outputs Expected: Code: Develop data processing code with guidance, ensuring performance and scalability requirements are met. Define coding standards, templates, and checklists. Review code for team and peers. Documentation: Create/review templates, checklists, guidelines, and standards for design/process/development. Create/review deliverable documents, including design documents, architecture documents, infra costing, business requirements, source-target mappings, and test cases and results. Configure: Define and govern the configuration management plan. Ensure compliance from the team. Test: Review/create unit test cases, scenarios, and execution. Review test plans and strategies created by the testing team. Provide clarifications to the testing team. Domain Relevance: Advise data engineers on the design and development of features and components, leveraging a deeper understanding of business needs. Learn more about the customer domain and identify opportunities to add value. Complete relevant domain certifications.

Manage Project: Support the Project Manager with project inputs. Provide inputs on project plans or sprints as needed. Manage the delivery of modules. Manage Defects: Perform defect root cause analysis (RCA) and mitigation. Identify defect trends and implement proactive measures to improve quality. Estimate: Create and provide input for effort and size estimation and plan resources for projects. Manage Knowledge: Consume and contribute to project-related documents, SharePoint libraries, and client universities. Review reusable documents created by the team. Release: Execute and monitor the release process. Design: Contribute to the creation of design (HLD, LLD, SAD)/architecture for applications, business components, and data models. Interface with Customer: Clarify requirements and provide guidance to the development team. Present design options to customers. Conduct product demos. Collaborate closely with customer architects to finalize designs. Manage Team: Set FAST goals and provide feedback. Understand team members' aspirations and provide guidance and opportunities. Ensure team members are upskilled. Engage the team in projects. Proactively identify attrition risks and collaborate with BSE on retention measures. Certifications: Obtain relevant domain and technology certifications.

Skill Examples: Proficiency in SQL, Python, or other programming languages used for data manipulation. Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF. Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g., AWS Glue, BigQuery). Conduct tests on data pipelines and evaluate results against data quality and performance specifications. Experience in performance tuning. Experience in data warehouse design and cost improvements. Apply and optimize data models for efficient storage, retrieval, and processing of large datasets. Communicate and explain design/development aspects to customers. Estimate time and resource requirements for developing/debugging features/components. Participate in RFP responses and solutioning. Mentor team members and guide them in relevant upskilling and certification.

Knowledge Examples: Knowledge of various ETL services used by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/Dataflow, Azure ADF, and ADLF. Proficiency in SQL for analytics and windowing functions. Understanding of data schemas and models. Familiarity with domain-related data. Knowledge of data warehouse optimization techniques. Understanding of data security concepts. Awareness of patterns, frameworks, and automation practices.

Additional Comments: We are seeking a highly experienced Senior Data Engineer to design, develop, and optimize scalable data pipelines in a cloud-based environment. The ideal candidate will have deep expertise in PySpark, SQL, and Azure Databricks, and experience with either AWS or GCP. A strong foundation in data warehousing, ELT/ETL processes, and dimensional modeling (Kimball/star schema) is essential for this role.

Must-Have Skills: 8+ years of hands-on experience in data engineering or big data development. Strong proficiency in PySpark and SQL for data transformation and pipeline development. Experience working in Azure Databricks or equivalent Spark-based cloud platforms. Practical knowledge of cloud data environments: Azure, AWS, or GCP. Solid understanding of data warehousing concepts, including Kimball methodology and star/snowflake schema design. Proven experience designing and maintaining ETL/ELT pipelines in production. Familiarity with version control (e.g., Git), CI/CD practices, and data pipeline orchestration tools (e.g., Airflow, Azure Data Factory).

Skills: Azure Data Factory, Azure Databricks, PySpark, SQL

About UST: UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world's best companies to make a real impact through transformation. Powered by technology, inspired by people, and led by purpose, UST partners with its clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into its clients' organizations. With over 30,000 employees in 30 countries, UST builds for boundless impact, touching billions of lives in the process.
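One concrete example of the pipeline and windowing-function work this role describes is deduplicating change feeds so only the latest version of each record survives. The sketch below is a hedged illustration; the paths and column names are assumptions, not details from the posting.

```python
# Illustrative sketch of a routine pipeline step: keep only the latest
# version of each record using a window function. Paths and column
# names are assumptions, not details from the posting.
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dedupe-latest").getOrCreate()
updates = spark.read.parquet("/mnt/landing/customers")

# Rank rows per business key, newest first, then keep rank 1.
w = Window.partitionBy("customer_id").orderBy(F.col("updated_at").desc())
latest = (updates
          .withColumn("rn", F.row_number().over(w))
          .filter(F.col("rn") == 1)
          .drop("rn"))

latest.write.mode("overwrite").parquet("/mnt/curated/customers")
```

The same logic can be expressed in SQL with ROW_NUMBER() OVER (PARTITION BY ... ORDER BY ...), the windowing proficiency the posting calls out.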

Posted 3 days ago

Apply

2.0 - 5.0 years

3 - 9 Lacs

Gurgaon

On-site

Work Flexibility: Hybrid

Senior Analyst, Analytics

What will you do: Improve and maintain Azure-based data warehouse solutions. Implement, monitor, and optimize workflows using Azure Synapse, ADF, and Databricks. Manage relationships with IT vendors to ensure optimal service delivery and performance. Offer best practices, advice, and recommendations to the Managed Services team around the overall architecture and strategy of Azure-based solutions. Act as the liaison between technical teams and business stakeholders to ensure effective service delivery. Collaborate with cloud architects and engineers to optimize cost, performance, and security. Assist with onboarding new Azure services and integrating them into existing operations. Investigate and resolve complex technical issues and bugs, ensuring the stability and reliability of the applications and data warehouse solutions.

Operations: Work closely with the IT Service Delivery Lead and support teams to manage daily support and maintenance of application instances, and conduct long-term improvement operations to ensure compatibility with evolving mission requirements.

What you need: Bachelor's degree required; Master's degree in computer science or business administration preferred. 2 to 5 years of experience on the Azure platform (Synapse, ADF, Databricks, Power BI). Microsoft Azure Fundamentals or higher-level Azure certifications (e.g., AZ-104, AZ-305). Strong understanding of Azure services, including Azure Virtual Machines, Azure Active Directory, Azure Monitor, and Azure Resource Manager. Experience in IT Service Management (ITSM), data analysis, and business process automation. Ability to develop good working relationships with technical and business teams, using strong communication and team-building skills. Ability to analyze numbers, trends, and data to reach new conclusions based on findings. Ability to work effectively in a matrix organization structure, focusing on collaboration and influence rather than command and control.

Travel Percentage: 10%

Posted 3 days ago

Apply

4.0 - 8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Software: fuel for mobility

We bring bold digital visions to life, so we're on the lookout for more curious and creative engineers who want to create change – one line of high-quality code at a time. Our transformation isn't for everyone, but if you're excited about solving the leading-edge technological challenges facing the auto industry, then let's talk about your next move.

Let's introduce ourselves: At Volvo Cars, curiosity, collaboration, and continuous learning define our culture. Join our mission to create sustainable transportation solutions that protect what matters most – people, communities, and the planet. As a Data Engineer, you will drive digital innovation, leading critical technology initiatives with global teams. You'll design and implement solutions impacting millions worldwide, supporting Volvo's vision for autonomous, electric, and connected vehicles.

What You'll Do

Technical Leadership & Development: Lead development and implementation using Airflow, Amazon Web Services (AWS), Azure, Azure Data Factory (ADF), Big Data and Analytics, Core Data, Data Analysis, ETL/ELT, Power BI, SQL/SQL Script, and Snowflake. Design, build, and maintain scalable solutions supporting global operations. Collaborate closely with USA stakeholders across product management and engineering. Promote technical excellence through code reviews, architecture decisions, and best practices.

Cross-Functional Collaboration: Partner internationally using Microsoft Teams, Slack, SharePoint, and Azure DevOps. Participate in Agile processes and sprint planning. Share knowledge and maintain technical documentation across regions. Support 24/7 operations through on-call rotations and incident management.

Innovation & Continuous Improvement: Research emerging technologies to enhance platform capabilities. Contribute to roadmap planning and architecture decisions. Mentor junior team members and encourage knowledge sharing.

What You'll Bring

Professional Experience: 4-8 years of hands-on experience in software development, system administration, or related fields. Deep expertise in Airflow, AWS, Azure, ADF, Big Data, Core Data, Data Analysis, ETL/ELT, Power BI, SQL, and Snowflake, with proven implementation success. Experience collaborating with global teams across time zones. Preferred industry knowledge in automotive, manufacturing, or enterprise software.

Technical Proficiency: Advanced skills in the core technologies listed above. Strong grasp of cloud platforms, DevOps, and CI/CD pipelines. Experience with enterprise integration and microservices architecture. Skilled in database design and optimization with SQL and NoSQL.

Essential Soft Skills: Analytical thinking, collaboration, communication skills, critical thinking, documentation best practices, problem solving, and written communication. Excellent communication, able to explain complex technical topics. Adaptable in multicultural, globally distributed teams. Strong problem-solving abilities.

Additional Qualifications: Business-level English fluency. Flexibility to collaborate across USA time zones.

Volvo Cars – driving change together: Volvo Cars' success is the result of a collaborative, diverse and inclusive working environment. Today, we're one of the most well-known and respected car brands, with around 43,000 employees across the globe. At Volvo Cars, your career is designed around your skills and aspirations, so you can reach your fullest potential.
And it’s so exciting – we’re well on our way on our journey towards full electrification. We have five fully electric cars already on the market, and five more on the way. Our fully-electric and plug-in hybrid cars combined make up almost 50 per cent of our sales. So come and join us in shaping the future of mobility. There’s never been a more rewarding time to play your part in our inspiring and creative teams!

Posted 3 days ago

Apply

5.0 - 7.0 years

0 Lacs

Noida

On-site

5 - 7 Years | 2 Openings | Noida

Role description

Role Proficiency: This role requires proficiency in data pipeline development, including coding and testing pipelines that ingest, wrangle, transform, and join data from various sources. Must be skilled in ETL tools such as Informatica, Glue, Databricks, and DataProc, with coding expertise in Python, PySpark, and SQL. Works independently and has a deep understanding of data warehousing solutions, including Snowflake, BigQuery, Lakehouse, and Delta Lake. Capable of calculating costs and understanding performance issues related to data solutions.

Outcomes: Act creatively to develop pipelines and applications by selecting appropriate technical options, optimizing application development, maintenance, and performance using design patterns and reuse of proven solutions. Interpret requirements to create optimal architecture and design, developing solutions in accordance with specifications. Document and communicate milestones/stages for end-to-end delivery. Code adhering to best coding standards; debug and test solutions to deliver best-in-class quality. Perform performance tuning of code and align it with the appropriate infrastructure to optimize efficiency. Validate results with user representatives, integrating the overall solution seamlessly. Develop and manage data storage solutions, including relational databases, NoSQL databases, and data lakes. Stay updated on the latest trends and best practices in data engineering, cloud technologies, and big data tools. Influence and improve customer satisfaction through effective data solutions.

Measures of Outcomes: Adherence to engineering processes and standards. Adherence to schedule/timelines. Adherence to SLAs where applicable. Number of defects post delivery. Number of non-compliance issues. Reduction in recurrence of known defects. Quick turnaround of production bugs. Completion of applicable technical/domain certifications. Completion of all mandatory training requirements. Efficiency improvements in data pipelines (e.g., reduced resource consumption, faster run times). Average time to detect, respond to, and resolve pipeline failures or data issues. Number of data security incidents or compliance breaches.

Outputs Expected: Code Development: Develop data processing code independently, ensuring it meets performance and scalability requirements. Define coding standards, templates, and checklists. Review code for team members and peers. Documentation: Create and review templates, checklists, guidelines, and standards for design processes and development. Create and review deliverable documents, including design documents, architecture documents, infrastructure costing, business requirements, source-target mappings, and test cases and results. Configuration: Define and govern the configuration management plan. Ensure compliance within the team. Testing: Review and create unit test cases, scenarios, and execution plans. Review the test plan and test strategy developed by the testing team. Provide clarifications and support to the testing team as needed. Domain Relevance: Advise data engineers on the design and development of features and components, demonstrating a deeper understanding of business needs. Learn about customer domains to identify opportunities for value addition. Complete relevant domain certifications to enhance expertise. Project Management: Manage the delivery of modules effectively. Defect Management: Perform root cause analysis (RCA) and mitigation of defects. Identify defect trends and take proactive measures to improve quality.

Estimation: Create and provide input for effort and size estimation for projects. Knowledge Management: Consume and contribute to project-related documents, SharePoint libraries, and client universities. Review reusable documents created by the team. Release Management: Execute and monitor the release process to ensure smooth transitions. Design Contribution: Contribute to the creation of high-level design (HLD), low-level design (LLD), and system architecture for applications, business components, and data models. Customer Interface: Clarify requirements and provide guidance to the development team. Present design options to customers and conduct product demonstrations. Team Management: Set FAST goals and provide constructive feedback. Understand team members' aspirations and provide guidance and opportunities for growth. Ensure team engagement in projects and initiatives. Certifications: Obtain relevant domain and technology certifications to stay competitive and informed.

Skill Examples: Proficiency in SQL, Python, or other programming languages used for data manipulation. Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF. Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g., AWS Glue, BigQuery). Conduct tests on data pipelines and evaluate results against data quality and performance specifications. Experience in performance tuning of data processes. Expertise in designing and optimizing data warehouses for cost efficiency. Ability to apply and optimize data models for efficient storage, retrieval, and processing of large datasets. Capacity to clearly explain and communicate design and development aspects to customers. Ability to estimate time and resource requirements for developing and debugging features or components.

Knowledge Examples: Knowledge of various ETL services offered by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/DataFlow, Azure ADF, and ADLF. Proficiency in SQL for analytics, including windowing functions. Understanding of data schemas and models relevant to various business contexts. Familiarity with domain-related data and its implications. Expertise in data warehousing optimization techniques. Knowledge of data security concepts and best practices. Familiarity with design patterns and frameworks in data engineering.

Additional Comments: Required skills include cloud platforms (AWS, MS Azure, GCP, etc.); containerization and orchestration (Docker, Kubernetes, etc.); API development; data pipeline construction using languages like Python, PySpark, and SQL; data streaming (Kafka, Azure Event Hub, etc.); data parsing (Akka, MinIO, etc.); database management (SQL and NoSQL, including ClickHouse, PostgreSQL, etc.); Agile tooling (Git, Jenkins, or Azure DevOps, etc.); JavaScript connectors/frameworks for frontend/backend work; and strong collaboration and communication skills.

Skills: AWS Cloud, Azure Cloud, Docker, Kubernetes

About UST: UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world's best companies to make a real impact through transformation. Powered by technology, inspired by people, and led by purpose, UST partners with its clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into its clients' organizations. With over 30,000 employees in 30 countries, UST builds for boundless impact, touching billions of lives in the process.

Posted 3 days ago

Apply

0 years

2 - 9 Lacs

Noida

On-site

The Senior Technical Lead in CRM / D365 CE OOTB, Configuration, Cust will be responsible for overseeing and leading technical teams to deliver high-quality CRM / D365 CE solutions. The main objective is to ensure the successful implementation, customization, and configuration of CRM / D365 CE out-of-the-box functionalities to meet the specific needs of the organization.

Key Responsibilities: 1. Lead and manage technical teams in the design, development, and implementation of CRM / D365 CE solutions. 2. Define and implement best practices for CRM / D365 CE out-of-the-box configurations and customizations. 3. Collaborate with stakeholders to gather requirements and provide technical expertise in CRM / D365 CE solution design. 4. Perform system analysis, troubleshooting, and debugging to ensure smooth operation of CRM / D365 CE systems. 5. Provide guidance and mentorship to junior team members to enhance their technical skills and capabilities. 6. Stay updated on the latest CRM / D365 CE trends, updates, and features to propose innovative solutions.

Skill Requirements: 1. Strong proficiency in CRM / D365 CE out-of-the-box functionalities, configurations, and customizations. 2. Extensive experience in leading technical teams and managing CRM / D365 CE implementation projects. 3. In-depth knowledge of CRM / D365 CE architecture, data models, and integration capabilities. 4. Excellent problem-solving skills and ability to analyze complex CRM / D365 CE issues. 5. Strong communication skills to effectively collaborate with cross-functional teams and stakeholders. 6. Ability to prioritize tasks, meet deadlines, and deliver high-quality CRM / D365 CE solutions.

Certifications: Microsoft Certified: Dynamics 365 Customer Service Functional Consultant Associate or similar certifications preferred.

No. of Positions: 1
Skill (Primary): Microsoft Dynamics (APPS)-Customer Engagement-Technical-MsD-Microsoft Dynamics 365
Auto req ID: 1589518BR
Skill Level 3 (Secondary Skill 1): Data Fabric-Azure-Azure Data Factory (ADF)
Skill Level 3 (Secondary Skill 2): Microsoft Dynamics (APPS)-MsD-General-Tools and Standards-SSIS/KingswaySoft
Skill Level 3 (Secondary Skill 3): Technical Skills (APPS)-Datawarehouse-Extract Transform Load (ETL) Automation
Skill Level 3 (Secondary Skill 4): Technical Skills (APPS)-Databases-RDBMS-Microsoft SQL Server

Posted 3 days ago

Apply

8.0 years

18 - 30 Lacs

Thiruvananthapuram, Kerala

On-site

Designation: Senior Dot Net Developer
Qualification: Any UG / PG Degree / Engineering Graduates
Experience: Minimum 8+ years
Gender: Male / Female
Job Location: Trivandrum / Kochi (Kerala)
Job Type: Full Time | Day Shift | Sat & Sun Week Off
Working Time: 12:01 PM to 9:00 PM
Project: European client | Shift: Mid Shift (12:01 PM to 9:00 PM) | WFO

Job Description: Candidates with 8+ years of experience in the IT industry and strong .NET/.NET Core/Azure Cloud Services/Azure DevOps skills. This is a client-facing role and hence requires strong communication skills. This is for a US client, and the resource should be hands-on, with experience in coding and Azure Cloud.

Responsibilities include: Design, develop, enhance, document, and maintain robust applications using .NET Core 6/8+, C#, REST APIs, T-SQL, and modern JavaScript/jQuery. Integrate and support third-party APIs and external services. Collaborate across cross-functional teams to deliver scalable solutions across the full technology stack. Identify, prioritize, and execute tasks throughout the Software Development Life Cycle (SDLC). Participate in Agile/Scrum ceremonies and manage tasks using Jira. Understand technical priorities, architectural dependencies, risks, and implementation challenges. Troubleshoot, debug, and optimize existing solutions with a strong focus on performance and reliability.

Primary Skills: 8+ years of hands-on development experience with C#, .NET Core 6/8+, Entity Framework / EF Core, JavaScript, jQuery, and REST APIs. Expertise in MS SQL Server, including complex SQL queries, stored procedures, views, functions, packages, cursors, tables, and object types. Skilled in unit testing with XUnit and MSTest. Strong in software design patterns, system architecture, and scalable solution design. Ability to lead and inspire teams through clear communication, technical mentorship, and ownership. Strong problem-solving and debugging capabilities. Ability to write reusable, testable, and efficient code. Develop and maintain frameworks and shared libraries to support large-scale applications. Excellent technical documentation, communication, and leadership skills. Microservices and Service-Oriented Architecture (SOA). Experience in API integrations. 2+ years of hands-on experience with Azure Cloud Services, including Azure Functions, Azure Durable Functions, Azure Service Bus, Event Grid, Storage Queues, Blob Storage, Azure Key Vault, SQL Azure, Application Insights, and Azure Monitoring.

Secondary Skills: Familiarity with AngularJS, ReactJS, and other front-end frameworks. Experience with Azure API Management (APIM). Knowledge of Azure containerization and orchestration (e.g., AKS/Kubernetes). Experience with Azure Data Factory (ADF) and Logic Apps. Exposure to application support and operational monitoring. Azure DevOps CI/CD pipelines (Classic / YAML).

Job Types: Full-time, Permanent
Pay: ₹1,800,000.00 - ₹3,000,000.00 per year
Benefits: Cell phone reimbursement, food provided, health insurance, internet reimbursement, paid sick time, paid time off, Provident Fund
Location Type: In-person
Schedule: Day shift, Monday to Friday
Work Location: In person
Speak with the employer: +91 9489357211

Posted 3 days ago

Apply

5.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Role: Senior Data Engineer. Location: Kochi / Trivandrum / Bangalore. Experience: 5+ years (Total: 5 yrs; Relevant: 5 yrs). Mandatory skills: Strong in MS SQL and SSIS, Data Lake, Azure SQL, ADF; lead experience is an added advantage. Interested candidates, please send your resume to gigin.raj@greenbayit.com or call 8943011666.

Posted 3 days ago

Apply

2.0 - 5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Summary: The Metrics Insights and Analytics team is responsible for building dashboards and analytical solutions using AI/ML based on requirements from the business, providing predictive and prescriptive analytics over various delivery execution parameters, giving actionable insights to users, and automating processes using modern machine learning algorithms.

Key Roles and Responsibilities: Conceptualize, maintain, and automate dashboards as per requirements. Automate existing processes to improve productivity and time to market. Enable decision making and action plan identification through metrics analytics. Conduct training and presentations. Connect with various stakeholders to understand business problems and provide solutions. Bring new-age solutions and techniques into the way of working.

Skills: Minimum 2-5 years of work experience with Power BI dashboards/Tableau and Python. Minimum 2-5 years of work experience in AI/ML development. Strong analytical skills, adept in solutioning and problem solving, with an inclination towards numbers. Experience working on text analytics and NLP. Experienced in data cleansing, data pre-processing, and exploratory data analysis. Knowledge of Azure ADF, Excel macros, and RPA is an advantage. Able to perform feature engineering, normalize data, and build correlation maps. Proficient in SQL. Hands-on experience in model operationalization and pipeline management. Capable of working with global teams. Good presentation and training skills.

LTIMindtree (https://www.ltimindtree.com/) is a global technology consulting and digital solutions company that enables enterprises across industries to reimagine business models, accelerate innovation, and maximize growth by harnessing digital technologies. As a digital transformation partner to more than 750 clients, LTIMindtree brings extensive domain and technology expertise to help drive superior competitive differentiation, customer experiences, and business outcomes in a converging world. Powered by nearly 90,000 talented and entrepreneurial professionals across more than 30 countries, LTIMindtree — a Larsen & Toubro Group company — combines the industry-acclaimed strengths of erstwhile Larsen & Toubro Infotech and Mindtree in solving the most complex business challenges and delivering transformation at scale. For more information, please visit www.ltimindtree.com.

DEI Statement: LTIMindtree is proud to be an equal opportunity employer. We are committed to equal employment opportunity regardless of race, ethnicity, nationality, gender, gender identity, gender expression, language, age, sexual orientation, religion, marital status, veteran status, socio-economic status, disability, or any other characteristic protected by applicable law.
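For flavor, the data cleansing, normalization, and correlation-map tasks listed above might look like the short pandas sketch below; the file name and columns are hypothetical.

```python
# Minimal sketch of the EDA tasks named above: cleansing, normalization,
# and a correlation map with pandas. The file and columns are hypothetical.
import pandas as pd

df = pd.read_csv("delivery_metrics.csv")

# Cleansing: drop exact duplicates, fill numeric gaps with column medians.
df = df.drop_duplicates()
num_cols = df.select_dtypes("number").columns
df[num_cols] = df[num_cols].fillna(df[num_cols].median())

# Feature engineering: min-max normalize each numeric column to [0, 1].
df[num_cols] = (df[num_cols] - df[num_cols].min()) / (
    df[num_cols].max() - df[num_cols].min()
)

# Correlation map: pairwise correlations, ready to plot as a heatmap
# or push into a Power BI/Tableau visual.
print(df[num_cols].corr().round(2))
```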

Posted 3 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Eviden, part of the Atos Group, with an annual revenue of circa €5 billion, is a global leader in data-driven, trusted, and sustainable digital transformation. As a next-generation digital business with worldwide leading positions in digital, cloud, data, advanced computing, and security, it brings deep expertise for all industries in more than 47 countries. By uniting unique high-end technologies across the full digital continuum with 47,000 world-class talents, Eviden expands the possibilities of data and technology, now and for generations to come.

Roles & Responsibilities: Design end-to-end data code development using PySpark, Python, SQL, and Kafka, leveraging Microsoft Fabric's capabilities.

Requirements: Hands-on experience with Microsoft Fabric, including Lakehouse, Data Factory, and Synapse. Strong expertise in PySpark and Python for large-scale data processing and transformation. Deep knowledge of Azure data services (ADLS Gen2, Azure Databricks, Synapse, ADF, Azure SQL, etc.). Experience in designing, implementing, and optimizing end-to-end data pipelines on Azure. Understanding of Azure infrastructure setup (networking, security, and access management) is good to have. Healthcare domain knowledge is a plus but not mandatory.

Our Offering: Global cutting-edge IT projects that shape the future of digital and have a positive impact on the environment. Wellbeing programs and work-life balance, with integration and passion-sharing events. Attractive salary and company initiative benefits. Courses and conferences. Hybrid work culture. Let's grow together.
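As a rough illustration of the PySpark-plus-Kafka work described above, the sketch below reads a Kafka topic with Spark Structured Streaming and lands raw events for downstream transformation. The broker, topic, and lakehouse paths are assumptions, and running it requires the Spark Kafka connector package on the cluster.

```python
# Rough sketch, assuming Spark Structured Streaming with the Kafka
# connector: read a topic and land raw events for later transformation.
# Broker, topic, and lakehouse paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("events-ingest").getOrCreate()

events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "device-events")
          .load()
          .select(F.col("value").cast("string").alias("payload"),
                  F.col("timestamp")))

# Append raw events to a lakehouse table; the checkpoint makes the
# stream restartable without duplicating data.
query = (events.writeStream
         .format("delta")
         .option("checkpointLocation", "/lakehouse/checkpoints/events")
         .outputMode("append")
         .start("/lakehouse/tables/raw_events"))
query.awaitTermination()
```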

Posted 3 days ago

Apply

8.0 - 14.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

We are hiring for a Data Engineer role. Experience: 8-14 years. Locations: Pune, Chennai. Notice Period: immediate joiners. Mandatory Skills: Python, PySpark, Databricks, Unity Catalog, DLT (Delta Live Tables), Databricks Workflows, Azure/AWS cloud, ADF/Orchestrator, CI/CD. Qualifications: B.Tech, M.Tech, B.E., B.Com, B.Sc, B.A, MBA

Posted 3 days ago

Apply

5.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Role: Senior Data Engineer
Location: Kochi / Trivandrum / Bangalore
Experience: 5+ years
Mandatory skills: Strong in MS SQL and SSIS, Data Lake, Azure SQL, ADF; lead experience
Start Date: Aug 6, 2025
Salary: 18 to 23 LPA

Job Purpose (both Onsite / Offshore): Responsible for delivering senior-level, innovative, compelling, coherent software solutions for our consumer, internal operations, and value chain constituents across a wide variety of enterprise applications through the creation of discrete business services and their supporting components.

Job Specification / Skills and Competencies: Designs, develops, and delivers solutions that meet business line and enterprise requirements. Participates in rapid prototyping and POC development efforts. Advances overall enterprise technical architecture and implementation best practices. Assists in efforts to develop and refine functional and non-functional requirements. Participates in iteration and release planning. Informs efforts to develop and refine functional and non-functional requirements. Demonstrates knowledge of, adherence to, monitoring of, and responsibility for compliance with state and federal regulations and laws as they pertain to this position. Strong ability to produce high-quality, properly functioning deliverables the first time. Delivers work product according to established deadlines. Estimates tasks with a level of granularity and accuracy commensurate with the information provided. Works collaboratively in a small team. Excels in a rapid iteration environment with short turnaround times. Deals positively with high levels of uncertainty, ambiguity, and shifting priorities. Accepts a wide variety of tasks and pitches in wherever needed. Constructively presents, discusses, and debates alternatives. Takes shared ownership of the product. Communicates effectively both verbally and in writing. Takes direction from team leads and upper management. Ability to work with little to no supervision while performing duties. Proficient in SSIS and ADF. Strong in MS SQL. Hands-on experience with Data Lake. Hands-on experience in data marts and data warehousing, including variant schemas (Star, Snowflake). 5+ years of experience with advanced queries, stored procedures, views, triggers, etc. 5+ years of experience in performance tuning queries. 5+ years of experience with both DDL and DML. 5+ years of experience designing enterprise database systems using Microsoft SQL Server/Azure SQL preferred. Experience in Lakehouse architecture preferred. Experience with cloud technologies (AWS, Snowflake) preferred. Deep understanding of one or more source/version control systems; develops branching and merging strategies. Working understanding of Web API, REST, and JSON. Working understanding of unit test creation. Bachelor's degree is required, and/or a minimum of four (4)+ years of related work experience. Adheres to the Information Security Management policies and procedures.
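A small, hedged example of the stored-procedure work this role emphasizes: invoking a SQL Server procedure from Python via pyodbc. The connection string, procedure name, and parameter are illustrative assumptions only.

```python
# Hedged sketch: invoking a SQL Server stored procedure from Python via
# pyodbc. The connection string, procedure name, and parameter are
# illustrative assumptions only.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=example.database.windows.net;DATABASE=dw;"
    "UID=etl_user;PWD=<secret>;Encrypt=yes"
)
cursor = conn.cursor()

# Parameterized EXEC avoids SQL injection and reuses the cached plan.
cursor.execute("EXEC dbo.usp_load_sales_mart @run_date = ?", "2025-08-06")

# If the procedure returns a result set, read it back.
for row in cursor.fetchall():
    print(row)

conn.commit()
conn.close()
```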

Posted 3 days ago

Apply

5.0 years

0 Lacs

Kochi, Kerala, India

On-site

Job Description

Role: Senior Data Engineer
Location: Kochi / Trivandrum / Bangalore
Experience: 5+ years
Mandatory skills: Strong in MS SQL and SSIS, Data Lake, Azure SQL, ADF; lead experience
Start Date: Aug 6, 2025
Salary: 18 to 23 LPA

Job Purpose (both Onsite / Offshore): Responsible for delivering senior-level, innovative, compelling, coherent software solutions for our consumer, internal operations, and value chain constituents across a wide variety of enterprise applications through the creation of discrete business services and their supporting components.

Job Specification / Skills and Competencies: 1. Designs, develops, and delivers solutions that meet business line and enterprise requirements. 2. Participates in rapid prototyping and POC development efforts. 3. Advances overall enterprise technical architecture and implementation best practices. 4. Assists in efforts to develop and refine functional and non-functional requirements. 5. Participates in iteration and release planning. 6. Informs efforts to develop and refine functional and non-functional requirements. 7. Demonstrates knowledge of, adherence to, monitoring of, and responsibility for compliance with state and federal regulations and laws as they pertain to this position. 8. Strong ability to produce high-quality, properly functioning deliverables the first time. 9. Delivers work product according to established deadlines. 10. Estimates tasks with a level of granularity and accuracy commensurate with the information provided. 11. Works collaboratively in a small team. 12. Excels in a rapid iteration environment with short turnaround times. 13. Deals positively with high levels of uncertainty, ambiguity, and shifting priorities. 14. Accepts a wide variety of tasks and pitches in wherever needed. 15. Constructively presents, discusses, and debates alternatives. 16. Takes shared ownership of the product. 17. Communicates effectively both verbally and in writing. 18. Takes direction from team leads and upper management. 19. Ability to work with little to no supervision while performing duties. 20. Proficient in SSIS and ADF. 21. Strong in MS SQL. 22. Hands-on experience with Data Lake. 23. Hands-on experience in data marts and data warehousing, including variant schemas (Star, Snowflake). 24. 5+ years of experience with advanced queries, stored procedures, views, triggers, etc. 25. 5+ years of experience in performance tuning queries. 26. 5+ years of experience with both DDL and DML. 27. 5+ years of experience designing enterprise database systems using Microsoft SQL Server/Azure SQL preferred. 28. Experience in Lakehouse architecture preferred. 29. Experience with cloud technologies (AWS, Snowflake) preferred. 30. Deep understanding of one or more source/version control systems; develops branching and merging strategies. 31. Working understanding of Web API, REST, and JSON. 32. Working understanding of unit test creation. 33. Bachelor's degree is required, and/or a minimum of four (4)+ years of related work experience. 34. Adheres to the Information Security Management policies and procedures.

Posted 3 days ago

Apply

10.0 years

0 Lacs

Lucknow, Uttar Pradesh, India

On-site

HCLTech is looking for a passionate and experienced Azure Data Engineer to join our growing team. If you have strong hands-on experience with Azure Data Factory, Azure Databricks, and Oracle, and are excited to work on impactful data projects, we want to hear from you! 🔹 We're Hiring: Azure Data Engineer 📍 Location: Lucknow 🏢 Company: HCLTech 🕒 Shift: Rotational 💼 Project Type: Support / Development 📅 Experience: 5–10 Years 🎯 Customer Interview: Not Required Key Responsibilities: Design, develop, and maintain data pipelines using Azure Data Factory and Azure Databricks. Work with Oracle databases for data extraction, transformation, and loading (ETL). Collaborate with cross-functional teams to support and enhance data solutions. Optimize and troubleshoot data workflows and performance issues. Participate in support and development activities across multiple projects. Why Join Us? Work on cutting-edge Azure technologies. Flexible work location with physical presence in Lucknow. Collaborative and growth-oriented environment. No customer interviews – quick onboarding process. 📩 Apply Now: Ready to take the next step in your ADF data engineering career? 📧 Drop your resume at sushma-bisht@hcltech.com

Posted 3 days ago

Apply

4.0 - 8.0 years

0 Lacs

Pune, Maharashtra

On-site

As a seasoned Senior ETL/DB Tester, you will design, develop, and execute comprehensive test plans to validate ETL and database processes. Your expertise in SQL and experience with tools like Talend, ADF, Snowflake, and Power BI will be crucial in ensuring data integrity and accuracy across modern data platforms, and your analytical skills, attention to detail, and ability to collaborate with cross-functional teams will be key in a fast-paced data engineering environment. Your main responsibilities will include validating data transformations and integrity, performing manual testing and defect tracking using tools like Zephyr or Tosca, analyzing business and data requirements for test coverage, and writing complex SQL queries for data reconciliation. You will also identify data-related issues, conduct root cause analysis in collaboration with developers, and track bugs and enhancements using appropriate tools. In addition, you will optimize testing strategies for performance, scalability, and accuracy in ETL processes. Skills in ETL tools like Talend and ADF, data platforms like Snowflake, and reporting/analytics tools such as Power BI and VPI are essential for success in this role, and expertise in API testing and advanced Power BI features such as dashboards, DAX, and data modelling will further strengthen your testing capabilities.
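The "complex SQL queries for data reconciliation" duty above often boils down to comparing aggregates between source and target platforms. A minimal sketch, assuming SQLAlchemy with placeholder connection URLs and table names:

```python
# Minimal sketch, assuming SQLAlchemy with placeholder connection URLs:
# reconcile row counts and a key aggregate between source and target.
from sqlalchemy import create_engine, text

src = create_engine("snowflake://<account>/<db>")  # placeholder URL
tgt = create_engine("mssql+pyodbc://<dsn>")        # placeholder URL

CHECK = "SELECT COUNT(*) AS n, SUM(amount) AS total FROM {}"

with src.connect() as s, tgt.connect() as t:
    src_row = s.execute(text(CHECK.format("raw.orders"))).one()
    tgt_row = t.execute(text(CHECK.format("dw.fact_orders"))).one()

assert src_row.n == tgt_row.n, f"row counts differ: {src_row.n} vs {tgt_row.n}"
assert src_row.total == tgt_row.total, "amount totals differ"
print("reconciliation passed")
```

In a real suite these assertions would live in a test framework (e.g., pytest) and be tracked against the defect tool the posting mentions.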

Posted 4 days ago

Apply

4.0 - 8.0 years

0 Lacs

Indore, Madhya Pradesh

On-site

As a Power BI Developer at InfoBeans, you will play a crucial role in assisting business analysts in developing, maintaining, and supporting operational/live reports, dashboards, and scorecards using Microsoft Power BI. Your responsibilities will include implementing Row-Level Security by defining constraints for each defined role. You will work in a dynamic environment alongside smart and pragmatic team members, within a learning culture that values teamwork, collaboration, and diversity and rewards excellence, compassion, openness, and ownership, with ever-growing opportunities for professional and personal growth. To excel in this role, we expect expertise in Power BI Desktop, mobile, and service development, along with proficiency in MSBI (SSIS, Tabular SSAS, SSRS) with DAX. Your knowledge should also cover SQL Server 2012/2014/2016 T-SQL development, Snowflake, MicroStrategy, Informatica PowerCenter, ADF, and other Azure BI technologies. Experience in creating dashboards, volume reports, operating summaries, presentations, and graphs will be highly beneficial. You should be skilled in SSRS integration with Power BI, SSAS, Data Gateway for data refresh, the content pack library, managing embed codes, and Power BI Mobile. As a proficient data visualization expert, you should have strong application development skills and knowledge of Azure. You should be expert in creating calculated measures and columns with DAX in Power BI Desktop, custom visuals, using groups, publishing reports to app.powerbi.com, and setting up the necessary connection details, as well as in connecting Power BI Desktop to various data sources, using advanced calculations, and creating varied visualizations with tools such as slicers, lines, pies, histograms, maps, scatter plots, bullets, heat maps, and tree maps. Your proficiency in these areas will be key to your success in this role.

Posted 4 days ago

Apply

10.0 - 14.0 years

0 Lacs

Thiruvananthapuram, Kerala

On-site

Are you a creative engineer who loves a challenge Solve the complex puzzles you've been dreaming of as our Customer Success Engineer. If you have a passion for innovation in tech, we want you on our team! Oracle is a technology leader that's changing how the world does business, and our Customer Success Services (CSS) team supports over 6,000 companies around the world. We're looking for a talented and self-motivated engineer to work on-site in our Oracle ME offices. Join the team of highly skilled technical experts who build and maintain our clients" technical landscapes through tailored support services. We are looking for a Principal Fusion HCM Techno-Functional Consultant who will be responsible for providing consultancy, working with customers, translating ideas and concepts into implementable, supportable designs, and have experience in providing technical solutions aligned with Oracle standards. You will also have experience in maintaining and supporting customers" eBusiness Suite applications and Fusion SAAS, either on-site or remotely. Plays a direct role in building, maintenance, technical support, documentation, and administration of Oracle Cloud applications. **What You Will Do** As a Principal Fusion HCM Techno-Functional Consultant in Oracle CSS, you will: - Ability to be a technical team leader and coach team members in relevant skills. Finding ways to recognize the contributions of others in the team. - Assess and analyze customers" business needs to make sure that Oracle solutions meet the customer's objectives. - Assist customers in their overall Journey to Cloud. - Ensure Oracle cloud technologies are leveraged appropriately using best practices. - Be the Oracle Solution Delivery authority to ensure that customers make informed decisions regarding scope to achieve beneficial solutions cost-effectiveness, quality, and reusability. - Providing technical guidance on Oracle cloud and/or on-premise solutions to customers and other Oracle team members to underpin successful delivery. - Support solutions around multi-cloud and hybrid cloud setups. - Ensure successful handover from Implementation toward operations making sure the implemented solution will fit the customer requirements. - Maintain the Oracle Solution to make sure the customer demands needs will be met. Platforms for Oracle solutions are on-premise, cloud, or hybrid running various workloads (application, middleware, database, and infrastructure). - Working closely with the Technical Account Manager to ensure that the individual work streams are technically well managed. - Be the main contact for new business opportunities by supporting our presales team. Identifies and promotes opportunities for sales of Oracle products and services to support business growth. - Actively lead and contribute to strategic programs and initiatives. - To summarize - helping to use and take the best advantage of all the value our company offers to our customers. **What We Are Looking For** - 10+ years of relevant professional experience. Bachelor's degree in computer science, information systems, software engineering, or related field preferred. - Strong experience in implementing Fusion Applications, at least 4 full cycles of successful implementations. - Demonstrate a good understanding of the Fusion quarterly update process and best practices according to new feature adoption, testing, and change management. - Strong knowledge in roles and security. 
- Proven experience with Oracle Transactional Business Intelligence (OTBI), dashboards, all types of data loaders, extracts, fast formulas, error handling, SOAP services, BPM, personalization, sandboxes, page composer, etc.
- Experience designing and developing customizations using Visual Builder, ADF, and Process Builder in OIC against Oracle ERP Cloud is a plus.

For this position, we are looking for a creative, innovative, and motivated professional with an open and flexible mindset who will work closely with the customer to ensure alignment between business change, IT architecture, technical solutions, business resources, and processes. As an integral part of a global organization, the Principal HCM Engineer will work within an international environment with colleagues around the globe and contribute to global technology-driven initiatives and innovation programs for continuous service improvements.
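For illustration, many of the integrations this role touches (data loaders, extracts, BPM callbacks) reduce to invoking a Fusion SOAP service. Below is a minimal sketch in Python; the endpoint URL, payload, and credentials are hypothetical placeholders, not real Oracle values:

```python
# Sketch: posting a SOAP envelope to a Fusion HCM web service with the
# standard `requests` library. Endpoint, payload, and credentials are
# hypothetical placeholders.
import requests

ENDPOINT = "https://example-pod.oraclecloud.com/hcmService/WorkerService"  # placeholder
ENVELOPE = """<?xml version="1.0" encoding="UTF-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <!-- operation payload as defined in the service WSDL goes here -->
  </soap:Body>
</soap:Envelope>"""

response = requests.post(
    ENDPOINT,
    data=ENVELOPE.encode("utf-8"),
    headers={"Content-Type": "text/xml; charset=utf-8"},
    auth=("integration.user", "********"),  # placeholder credentials
    timeout=30,
)
response.raise_for_status()
print(response.text)  # raw SOAP response; parse with xml.etree.ElementTree
```

In practice the envelope body would carry the specific operation defined in the service's WSDL, and the response would be parsed rather than printed.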

Posted 4 days ago

Apply

8.0 - 12.0 years

0 Lacs

Hyderabad, Telangana

On-site

Dear candidates,

ValueLabs is currently looking for a BI Lead with a strong background in Power BI and SQL to join our team at the earliest. The ideal candidate should possess 8-12 years of experience and expertise in Power BI, SQL queries, and Azure Data Factory (ADF).

As a Technical Lead, your key responsibilities will include creating engaging and interactive reports and dashboards using Power BI Desktop. You will also be tasked with designing and implementing Power BI data models that seamlessly integrate with various data sources. Additionally, you will be expected to automate report delivery and scheduling using tools like Power Automate.

The role also requires experience in team management, excellent communication skills, and the ability to collaborate with business stakeholders to understand their reporting requirements and translate them into actionable insights. You will be responsible for developing and maintaining ETL processes using Azure Data Factory and data warehouses using Azure Synapse Analytics.

As part of the role, you will oversee and manage a team of data engineers, ensuring they meet project deadlines and deliver high-quality work. You will also be responsible for developing and implementing team guidelines, policies, and procedures to enhance productivity and performance. Mentoring and coaching team members to improve their skills and career development will be crucial, along with conducting regular one-on-one meetings to discuss progress, address concerns, and set goals.

If you are interested in this position, please submit your resume to deepika.malisetti@valuelabs.com. We encourage you to share this job opportunity with anyone who might benefit from it, and references are highly appreciated.

Best regards,
ValueLabs Team
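As a hedged illustration of the ETL checks such a lead typically automates, the sketch below reconciles row counts between a source table and its ADF-loaded target. The connection strings, server names, and table names are all hypothetical placeholders:

```python
# Sketch: reconcile row counts between an ADF source and target table.
# Connection strings and table names are hypothetical placeholders.
import pyodbc

SOURCE_DSN = ("DRIVER={ODBC Driver 18 for SQL Server};"
              "SERVER=src.example.com;DATABASE=Sales;Trusted_Connection=yes")
TARGET_DSN = ("DRIVER={ODBC Driver 18 for SQL Server};"
              "SERVER=dw.example.com;DATABASE=SalesDW;Trusted_Connection=yes")

def row_count(dsn: str, table: str) -> int:
    """Return COUNT(*) for a table; table names here are internal, trusted values."""
    with pyodbc.connect(dsn) as conn:
        return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

src = row_count(SOURCE_DSN, "dbo.Orders")
tgt = row_count(TARGET_DSN, "dbo.FactOrders")
if src != tgt:
    raise SystemExit(f"Row-count mismatch: source={src}, target={tgt}")
print(f"Counts match: {src} rows")
```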

Posted 4 days ago

Apply

2.0 - 5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

What will you do:
Improve and maintain Azure-based data warehouse solutions.
Implement, monitor, and optimize workflows using Azure Synapse, ADF, and Databricks.
Manage relationships with IT vendors to ensure optimal service delivery and performance.
Offer best practices, advice, and recommendations to the Managed Services team around the overall architecture and strategy of Azure-based solutions.
Act as the liaison between technical teams and business stakeholders to ensure effective service delivery.
Collaborate with cloud architects and engineers to optimize cost, performance, and security.
Assist with onboarding new Azure services and integrating them into existing operations.
Investigate and resolve complex technical issues and bugs, ensuring the stability and reliability of the applications and data warehouse solutions.
Operations: Work closely with the IT Service Delivery Lead and support teams to manage daily support and maintenance of application instances, and conduct long-term improvement operations to ensure compatibility with evolving mission requirements.

What you need:
Bachelor's degree required; Master's degree in computer science or business administration preferred.
2 to 5 years of experience on the Azure platform (Synapse, ADF, Databricks, Power BI).
Microsoft Azure Fundamentals or higher-level Azure certifications (e.g., AZ-104, AZ-305).
Strong understanding of Azure services including Azure Virtual Machines, Azure Active Directory, Azure Monitor, and Azure Resource Manager.
Experience in IT Service Management (ITSM), data analysis, and business process automation.
Ability to develop good working relationships with technical and business teams, using strong communication and team-building skills.
Ability to analyze numbers, trends, and data to reach new conclusions based on findings.
Ability to work effectively in a matrix organization structure, focusing on collaboration and influence rather than command and control.

Stryker is a global leader in medical technologies and, together with its customers, is driven to make healthcare better. The company offers innovative products and services in MedSurg, Neurotechnology, Orthopaedics and Spine that help improve patient and healthcare outcomes. Alongside its customers around the world, Stryker impacts more than 150 million patients annually.
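For a flavor of the monitoring work described above, the sketch below queries the last day's ADF pipeline runs with the azure-mgmt-datafactory SDK; the subscription ID, resource group, and factory name are placeholders, and the exact setup would depend on the environment:

```python
# Sketch: list the last 24 hours of pipeline runs in an Azure Data Factory.
# Subscription, resource group, and factory name are placeholders.
from datetime import datetime, timedelta, timezone

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import RunFilterParameters

client = DataFactoryManagementClient(
    credential=DefaultAzureCredential(),
    subscription_id="00000000-0000-0000-0000-000000000000",  # placeholder
)

now = datetime.now(timezone.utc)
runs = client.pipeline_runs.query_by_factory(
    resource_group_name="rg-dataplatform",  # placeholder
    factory_name="adf-dataplatform",        # placeholder
    filter_parameters=RunFilterParameters(
        last_updated_after=now - timedelta(days=1),
        last_updated_before=now,
    ),
)
for run in runs.value:
    print(run.pipeline_name, run.status, run.duration_in_ms)
```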

Posted 4 days ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Role Description

Role Proficiency: This role requires proficiency in data pipeline development, including coding and testing data pipelines for ingesting, wrangling, transforming, and joining data from various sources. Must be skilled in ETL tools such as Informatica, Glue, Databricks, and DataProc, with coding expertise in Python, PySpark, and SQL. Works independently and has a deep understanding of data warehousing solutions, including Snowflake, BigQuery, Lakehouse, and Delta Lake. Capable of calculating costs and understanding performance issues related to data solutions.

Outcomes

Act creatively to develop pipelines and applications by selecting appropriate technical options, optimizing application development, maintenance, and performance using design patterns and reusing proven solutions.
Interpret requirements to create optimal architecture and design, developing solutions in accordance with specifications.
Document and communicate milestones/stages for end-to-end delivery.
Code adhering to best coding standards; debug and test solutions to deliver best-in-class quality.
Perform performance tuning of code and align it with the appropriate infrastructure to optimize efficiency.
Validate results with user representatives, integrating the overall solution seamlessly.
Develop and manage data storage solutions, including relational databases, NoSQL databases, and data lakes.
Stay updated on the latest trends and best practices in data engineering, cloud technologies, and big data tools.
Influence and improve customer satisfaction through effective data solutions.

Measures Of Outcomes

Adherence to engineering processes and standards
Adherence to schedule/timelines
Adherence to SLAs where applicable
Number of defects post delivery
Number of non-compliance issues
Reduction of recurrence of known defects
Quick turnaround of production bugs
Completion of applicable technical/domain certifications
Completion of all mandatory training requirements
Efficiency improvements in data pipelines (e.g., reduced resource consumption, faster run times)
Average time to detect, respond to, and resolve pipeline failures or data issues
Number of data security incidents or compliance breaches

Outputs Expected

Code Development: Develop data processing code independently, ensuring it meets performance and scalability requirements. Define coding standards, templates, and checklists. Review code for team members and peers.

Documentation: Create and review templates, checklists, guidelines, and standards for design processes and development. Create and review deliverable documents, including design documents, architecture documents, infrastructure costing, business requirements, source-target mappings, and test cases and results.

Configuration: Define and govern the configuration management plan. Ensure compliance within the team.

Testing: Review and create unit test cases, scenarios, and execution plans. Review the test plan and test strategy developed by the testing team. Provide clarifications and support to the testing team as needed.

Domain Relevance: Advise data engineers on the design and development of features and components, demonstrating a deeper understanding of business needs. Learn about customer domains to identify opportunities for value addition. Complete relevant domain certifications to enhance expertise.

Project Management: Manage the delivery of modules effectively.

Defect Management: Perform root cause analysis (RCA) and mitigation of defects. Identify defect trends and take proactive measures to improve quality.
Estimation: Create and provide input for effort and size estimation for projects.

Knowledge Management: Consume and contribute to project-related documents, SharePoint libraries, and client universities. Review reusable documents created by the team.

Release Management: Execute and monitor the release process to ensure smooth transitions.

Design Contribution: Contribute to the creation of high-level design (HLD), low-level design (LLD), and system architecture for applications, business components, and data models.

Customer Interface: Clarify requirements and provide guidance to the development team. Present design options to customers and conduct product demonstrations.

Team Management: Set FAST goals and provide constructive feedback. Understand team members' aspirations and provide guidance and opportunities for growth. Ensure team engagement in projects and initiatives.

Certifications: Obtain relevant domain and technology certifications to stay competitive and informed.

Skill Examples

Proficiency in SQL, Python, or other programming languages used for data manipulation.
Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF.
Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g., AWS Glue, BigQuery).
Conduct tests on data pipelines and evaluate results against data quality and performance specifications.
Experience in performance tuning of data processes.
Expertise in designing and optimizing data warehouses for cost efficiency.
Ability to apply and optimize data models for efficient storage, retrieval, and processing of large datasets.
Capacity to clearly explain and communicate design and development aspects to customers.
Ability to estimate time and resource requirements for developing and debugging features or components.

Knowledge Examples

Knowledge of various ETL services offered by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/DataFlow, Azure ADF, and ADLS.
Proficiency in SQL for analytics, including windowing functions.
Understanding of data schemas and models relevant to various business contexts.
Familiarity with domain-related data and its implications.
Expertise in data warehousing optimization techniques.
Knowledge of data security concepts and best practices.
Familiarity with design patterns and frameworks in data engineering.

Additional Comments

Skills:
Cloud platforms (AWS, MS Azure, GCP, etc.)
Containerization and orchestration (Docker, Kubernetes, etc.)
API development
Data pipeline construction using languages like Python, PySpark, and SQL
Data streaming (Kafka, Azure Event Hub, etc.)
Data parsing (Akka, MinIO, etc.)
Database management (SQL and NoSQL, including ClickHouse, PostgreSQL, etc.)
Agile methodology (Git, Jenkins, or Azure DevOps, etc.)
JS-style connectors/frameworks for frontend/backend
Collaboration and communication skills
AWS Cloud, Azure Cloud, Docker, Kubernetes
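As an illustration of the ingest/wrangle/transform/join pattern this role centers on, here is a minimal PySpark sketch; the file paths, schema, and column names are hypothetical:

```python
# Sketch of an ingest/transform/join pipeline in PySpark.
# File paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("pipeline-sketch").getOrCreate()

# Ingest: read raw CSV and Parquet sources.
orders = spark.read.option("header", True).csv("/raw/orders.csv")
customers = spark.read.parquet("/raw/customers.parquet")

# Wrangle/transform: type the amount column and drop malformed rows.
orders = (
    orders.withColumn("amount", F.col("amount").cast("double"))
          .filter(F.col("amount").isNotNull())
)

# Join and aggregate, then write to a Delta (Lakehouse) table.
daily = (
    orders.join(customers, "customer_id")
          .groupBy("region", "order_date")
          .agg(F.sum("amount").alias("total_amount"))
)
daily.write.format("delta").mode("overwrite").save("/curated/daily_sales")
```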

Posted 4 days ago

Apply

10.0 - 12.0 years

14 - 20 Lacs

Noida

Work from Office

Role & responsibilities: Candidate with database management system knowledge, data modelling in ETL/Snowflake, solid SQL scripting experience, ADF, and data masking.

Experience: Total 10+ Yrs; Relevant: 5+ Yrs
Job Location: Noida
Mode: Work From Office

Technical Skills:
10+ years of experience with Database Management Systems; configures database parameters and prototypes designs against logical data models.
Defines data repository requirements, data dictionaries, and warehousing requirements.
Optimizes database access and allocates/re-allocates database resources for optimum configuration, database performance, and cost.
ADF build knowledge and data masking.
Experience/knowledge with Microsoft SQL Replication and CDC technology is a must.
Experience setting up and configuring HA (High Availability)/Replication/AlwaysOn; preferably having researched and fine-tuned a setup, so that the person understands the what and why, can understand our needs, and can design/implement/configure a setup that meets client needs.
Solid SQL scripting experience in general.
The typical DBA skills/experience, including DB maintenance, table/index maintenance, backups, monitoring, security, data dictionary, integrity checks, configuration, patching, statistics, etc.
Experience with SQL Server 2008/2012/2014/2016, preferably with multiple editions (Standard, Enterprise, etc.), and experience having installed and configured SQL Server instances in a similar capacity.
Thorough understanding of performance tuning as both a System DBA and an Application DBA (the difference being that an Application DBA focuses on query/application performance tuning, while a System DBA tunes the database itself and how it is configured).
Strong SQL skills.
Strong data modeling skills, both logical and physical; database modeling knowledge.
Strong analytical and problem-solving skills.
Excellent verbal and written communication skills.
Strong experience with Microsoft tools and software (Visual Studio 2012/2015, SQL Management Studio, Microsoft Office, etc.).
Experience with data warehousing and OLTP database management.
Measure, track, and meet SLA metrics (analytics cycle time, schedule, accuracy, rework, etc.).
Assist in database performance tuning and troubleshooting database issues in both OLTP and EDW environments.
Install, configure, and maintain critical SQL Server databases, both dimensional and relational, supporting internal and customer-facing applications.
Assist in the full range of SQL Server maintenance activities, for example: backups, restores, recovery models, database shrink operations, DBCC commands, and replication; table, index, etc. design, creation, and maintenance; ongoing maintenance activities, working to plan and automate as much as possible, increasing availability and performance while reducing manual support time.
Assist as part of the team designing, building, and maintaining the future state of Merchants' database platforms: SQL Server versions, editions, and components; database server configuration; database and data model standards; business continuity and high availability strategy; overall data architecture; upgrade and patching strategy.
Work on automation using PowerShell and T-SQL scripts.
Knowledge of Python, ARM, and Bicep is preferred.
Knowledge of Snowflake is preferred.
Knowledge of the ETL integration layer, SSIS, and SSRS.
Troubleshoot and fix issues in JAMS jobs running on SSIS, PowerShell, T-SQL, and batch files.
Knowledge of DevOps and the Azure environment.
Migration of SQL Server, working with application support to support all CDC and ETL integration layers.
Installation and configuration of SSRS, SSIS, and SSAS.
Good to have: deep Azure experience.

Preferred candidate profile

Process Skills:
General SDLC processes.
Understanding of Agile and Scrum software development methodologies.
Skill in gathering and documenting user requirements and writing technical specifications.

Behavioral Skills:
Good attitude and a quick learner.
Well-developed analytical and problem-solving skills.
Strong oral and written communication skills.
Excellent team player, able to work with virtual teams.
Excellent leadership skills, with the ability to lead, guide, and groom the team.
Self-motivated and capable of working independently with minimal management supervision.
Able to talk to the client directly and report.
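As one hedged example of the PowerShell/T-SQL-style maintenance automation mentioned above, expressed here in Python via pyodbc: the sketch finds fragmented indexes with sys.dm_db_index_physical_stats and rebuilds them. The DSN and the 30% threshold are illustrative choices, not prescribed values:

```python
# Sketch: rebuild indexes whose fragmentation exceeds a threshold.
# DSN is a placeholder; the 30% threshold is an illustrative choice.
import pyodbc

DSN = ("DRIVER={ODBC Driver 18 for SQL Server};"
       "SERVER=db.example.com;DATABASE=AppDB;Trusted_Connection=yes")

FRAGMENTED = """
SELECT s.name AS schema_name, o.name AS table_name, i.name AS index_name,
       ps.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') ps
JOIN sys.indexes i ON i.object_id = ps.object_id AND i.index_id = ps.index_id
JOIN sys.objects o ON o.object_id = ps.object_id
JOIN sys.schemas s ON s.schema_id = o.schema_id
WHERE ps.avg_fragmentation_in_percent > 30 AND i.name IS NOT NULL;
"""

with pyodbc.connect(DSN, autocommit=True) as conn:
    # fetchall() first so the cursor is drained before issuing DDL.
    for schema, table, index, frag in conn.execute(FRAGMENTED).fetchall():
        print(f"Rebuilding {schema}.{table}.{index} ({frag:.1f}% fragmented)")
        conn.execute(f"ALTER INDEX [{index}] ON [{schema}].[{table}] REBUILD;")
```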

Posted 4 days ago

Apply

5.0 years

0 Lacs

Kochi, Kerala, India

On-site

Job Title: Senior Data Engineer
Location: Kochi / Trivandrum / Bangalore
Experience: 5+ years
Salary: ₹18 to ₹23 LPA
Notice period: Immediate Joiners

Job Summary: We are looking for an experienced Senior Data Engineer to join our team and lead the design, development, and deployment of enterprise data solutions. The ideal candidate is proficient in MS SQL, SSIS, Azure Data Factory, Azure SQL, and Data Lakes, and has a strong background in data warehousing, performance tuning, and cloud technologies.

Key Responsibilities:
Design and implement scalable data solutions using SQL Server, SSIS, ADF, and Azure
Lead development of data pipelines, data marts, and Lakehouse architectures
Collaborate on rapid prototyping, POCs, and cross-functional project planning
Ensure delivery of high-quality, optimized queries, views, and stored procedures
Provide mentoring and technical leadership to junior engineers
Work in Agile teams to deliver features within deadlines

Required Skills:
Strong hands-on experience in MS SQL, SSIS, ADF, Azure Data Lake, and Azure SQL
Expertise in Data Mart/Data Warehouse modeling (Star, Snowflake)
5+ years in writing complex queries, stored procedures, and performance tuning
Experience with cloud platforms (Azure mandatory; AWS/Snowflake preferred)
Familiarity with source control, REST APIs, JSON, and unit testing

Education: Bachelor's Degree in Computer Science or related field, or equivalent practical experience

Apply Now if you're ready to take the next step in your data engineering career and lead critical data initiatives in a dynamic, tech-driven environment.
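For illustration of the Star-schema modeling this role calls for, a typical data-mart query joins a fact table to its dimensions; the sketch below uses hypothetical table and column names:

```python
# Sketch: a typical star-schema query joining a fact table to two
# dimensions. Table and column names are hypothetical placeholders.
import pyodbc

DSN = ("DRIVER={ODBC Driver 18 for SQL Server};"
       "SERVER=dw.example.com;DATABASE=SalesDW;Trusted_Connection=yes")

QUERY = """
SELECT d.calendar_year,
       p.product_category,
       SUM(f.sales_amount) AS total_sales
FROM dbo.FactSales AS f
JOIN dbo.DimDate    AS d ON f.date_key = d.date_key
JOIN dbo.DimProduct AS p ON f.product_key = p.product_key
GROUP BY d.calendar_year, p.product_category
ORDER BY d.calendar_year, total_sales DESC;
"""

with pyodbc.connect(DSN) as conn:
    for year, category, total in conn.execute(QUERY):
        print(year, category, total)
```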

Posted 4 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
