4.0 - 6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Sr. Software Engineer - Microsoft Fabric
Date: Jun 9, 2025 | Requisition ID: 61229
Locations: Hyderabad (TG), Indore (MP), Pune (MH), India
YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we're a cluster of the brightest stars working with cutting-edge technologies.
Our purpose is anchored in a single truth: bringing real positive change in an increasingly virtual world, and it drives us beyond generational gaps and disruptions of the future. We are looking to hire Microsoft Fabric professionals in the following areas:

Experience: 4-6 years

Job Description:
Experience in Azure Fabric, Azure Data Factory, Azure Databricks, Azure Synapse, Azure SQL, and ETL.
Create pipelines, datasets, dataflows, and integration runtimes; monitor pipelines and trigger runs.
Extract, transform, and load data from source systems and process it in Azure Databricks.
Create SQL scripts to perform complex queries.
Create Synapse pipelines to migrate data from Gen2 to Azure SQL.
Build data migration pipelines to the Azure cloud (Azure SQL).
Migrate databases from on-prem SQL Server to an Azure dev environment using Azure DMS and Data Migration Assistant.
Experience using Azure Data Catalog.
Experience with big data batch, interactive, and real-time processing solutions.

Certifications: Good to have.

At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided with technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; stable employment with a great atmosphere and an ethical corporate culture.
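As an illustration of the Gen2-to-Azure-SQL migration work described above, here is a minimal, hypothetical sketch of generating the kind of T-SQL MERGE statement an incremental load might issue. The table and column names are invented for the example; a real pipeline would take them from pipeline parameters or a metadata store.

```python
def build_merge_sql(target: str, source: str,
                    key_cols: list, update_cols: list) -> str:
    """Build a T-SQL MERGE statement for an incremental load into Azure SQL.

    All identifiers here are placeholders supplied by the caller; nothing
    is taken from a real schema.
    """
    on_clause = " AND ".join(f"t.{c} = s.{c}" for c in key_cols)
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in update_cols)
    insert_cols = ", ".join(key_cols + update_cols)
    insert_vals = ", ".join(f"s.{c}" for c in key_cols + update_cols)
    return (
        f"MERGE {target} AS t USING {source} AS s ON {on_clause} "
        f"WHEN MATCHED THEN UPDATE SET {set_clause} "
        f"WHEN NOT MATCHED THEN INSERT ({insert_cols}) "
        f"VALUES ({insert_vals});"
    )
```

In an Azure Data Factory or Synapse pipeline, a statement like this would typically run inside a Script or Stored Procedure activity after the staging copy completes.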
Posted 2 weeks ago
12.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Sr. Tech Lead - Microsoft Fabric
Date: Jun 9, 2025 | Requisition ID: 61208
Locations: Hyderabad (TG), Pune (MH), Indore, India
YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we're a cluster of the brightest stars working with cutting-edge technologies.
Our purpose is anchored in a single truth: bringing real positive change in an increasingly virtual world, and it drives us beyond generational gaps and disruptions of the future. We are looking to hire Microsoft Fabric professionals in the following areas:

Experience: 12-14 years

Job Description:
12+ years of experience in Microsoft Azure data engineering for analytical projects.
Proven expertise in designing, developing, and deploying high-volume, end-to-end ETL pipelines for complex models, including batch and real-time data integration frameworks using Azure, Microsoft Fabric, and Databricks.
Extensive hands-on experience with Azure Data Factory, Databricks (with Unity Catalog), Azure Functions, Synapse Analytics, Data Lake, Delta Lake, and Azure SQL Database for managing and processing large-scale data integrations.
Experience in Databricks cluster optimization and workflow management to ensure cost-effective, high-performance processing.
Sound knowledge of data modelling, data governance, data quality management, and data modernization processes.
Develop architecture blueprints and technical design documentation for Azure-based data solutions.
Provide technical leadership and guidance on cloud architecture best practices, ensuring scalable and secure solutions.
Keep abreast of emerging Azure technologies and recommend enhancements to existing systems.
Lead proofs of concept (PoCs) and adopt agile delivery methodologies for solution development and delivery.

Certifications: Mandatory.

At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided with technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; stable employment with a great atmosphere and an ethical corporate culture.
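The cluster optimization responsibility above can be made concrete with a small sizing sketch. The heuristic and the 32 GB-per-worker throughput figure below are illustrative assumptions, not Databricks defaults; in practice such numbers are calibrated from job run history.

```python
import math

def recommend_workers(input_gb: float, gb_per_worker: float = 32.0,
                      min_workers: int = 2, max_workers: int = 20) -> int:
    """Pick an autoscaling upper bound for a job cluster from input volume.

    gb_per_worker is a hypothetical per-worker processing capacity; the
    result is clamped to the cluster's configured autoscaling range.
    """
    needed = math.ceil(input_gb / gb_per_worker)
    return max(min_workers, min(max_workers, needed))
```

A workflow could call this before submitting a run, trading a slightly stale estimate of input size for a cluster that is neither over-provisioned (cost) nor under-provisioned (runtime).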
Posted 2 weeks ago
3.0 - 5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Software Engineer - Microsoft Fabric 1
Date: Jun 9, 2025 | Requisition ID: 61230
Locations: Hyderabad (TG), Indore (MP), Pune (MH), India
YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we're a cluster of the brightest stars working with cutting-edge technologies.
Our purpose is anchored in a single truth: bringing real positive change in an increasingly virtual world, and it drives us beyond generational gaps and disruptions of the future. We are looking to hire Microsoft Fabric professionals in the following areas:

Experience: 3-5 years

Job Description:
Experience in Azure Fabric, Azure Data Factory, Azure Databricks, Azure Synapse, Azure SQL, and ETL.
Create pipelines, datasets, dataflows, and integration runtimes; monitor pipelines and trigger runs.
Extract, transform, and load data from source systems and process it in Azure Databricks.
Create SQL scripts to perform complex queries.
Create Synapse pipelines to migrate data from Gen2 to Azure SQL.
Build data migration pipelines to the Azure cloud (Azure SQL).
Migrate databases from on-prem SQL Server to an Azure dev environment using Azure DMS and Data Migration Assistant.
Experience using Azure Data Catalog.
Experience with big data batch, interactive, and real-time processing solutions.

Certifications: Good to have.

At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided with technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; stable employment with a great atmosphere and an ethical corporate culture.
Posted 2 weeks ago
5.0 years
0 Lacs
Greater Kolkata Area
On-site
Role: Azure Data Engineer
Experience Range: 5 to 8 years
Location: Kolkata

Job Description:
Build and maintain the technology platform for both on-premise and cloud-based solutions.
Engineer a new cloud data platform, analyzing emerging technology solutions ranging from Microsoft Synapse, Azure Data Factory, Azure Data Lake, Databricks, Azure Blob Storage, HDInsight, Power BI, and SQL DWH, among others.
2+ years of experience with Azure cloud data engineering tools (Azure Data Factory, Data Lake Storage, Databricks, Data Explorer, Azure Synapse, Azure Blob Storage).
Capable of communicating clearly with business users and having detailed discussions with the technology team.
Goal-oriented team player committed to quality and detail.
Innovative thinker who is positive, proactive, and readily embraces change.
Experience in delivering modern data engineering solutions.
Good knowledge of Azure PaaS and IaaS.
Knowledge of data modeling, integration, and design techniques is a plus.
Posted 2 weeks ago
15.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About Sigmoid: Sigmoid empowers enterprises to make smarter, data-driven decisions by blending advanced data engineering with AI consulting. We collaborate with some of the world's leading data-rich organizations across sectors such as CPG-retail, BFSI, life sciences, manufacturing, and more to solve complex business challenges. Our global team specializes in cloud data modernization, predictive analytics, generative AI, and DataOps, supported by 10+ delivery centers and innovation hubs, including a major global presence in Bengaluru and operations across the USA, Canada, UK, Netherlands, Poland, Singapore, and India. Recognized as a leader in the data and analytics space, Sigmoid is backed by Peak XV Partners and has consistently received accolades for innovation and rapid growth. Highlights include being named a 'Leader' in ISG's Specialty Analytics Services for Supply Chain (2024), a two-time 'India Future Unicorn' by Hurun India, and a four-time honoree on both the Inc. 500 and Deloitte Technology Fast 500 lists.

Director - Data Analytics: This role is a leadership position in the data science group at Sigmoid. The ideal candidate will come from a services-industry background with a good mix of experience in solving complex business intelligence and data analytics problems, team management, delivery management, and customer handling. This position offers an immense opportunity to work on challenging business problems faced by Fortune 500 companies across the globe. The role is part of the leadership team and includes accountability for part of her/his team and customers. The person is expected to contribute to developing the practice with relevant domain experience, nurturing talent in the team, and working with customers to grow accounts.

Responsibilities Include:
Build trust with senior stakeholders through strategic insight and delivery credibility.
Translate ambiguous client business problems into BI solutions and implement them.
Oversee multi-client BI and analytics programs with competing priorities and timelines, while collaborating with Data Engineering and other functions toward a common goal.
Ensure scalable, high-quality deliverables aligned with business impact.
Help recruit and onboard team members; directly manage 15-20 team members.
Own customer deliverables and ensure, along with project managers, that project schedules are in line with the expectations set with customers.

Experience and Qualifications:
15+ years of overall experience with a minimum of 10+ years in data analytics execution.
Strong organizational and multitasking skills with the ability to balance multiple priorities.
Highly analytical, with the ability to collate, analyze, and present data and drive clear insights that lead to decisions improving KPIs.
Ability to effectively communicate and manage relationships with senior management, other departments, and partners.
Mastery of BI tools (Power BI, Tableau, Qlik), backend systems (SQL, ETL frameworks), and data modeling.
Experience with cloud-native platforms (Snowflake, Databricks, Azure, AWS) and data lakes.
Expertise in managing compliance, access controls, and data quality frameworks is a plus.
Experience working in CPG, supply chain, manufacturing, and marketing domains is a plus.
Strong problem-solving skills and ability to prioritize conflicting requirements.
Excellent written and verbal communication skills, with the ability to succinctly summarize key findings.
Posted 2 weeks ago
6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Summary:
We are seeking a Senior Data Engineer with hands-on experience building scalable data pipelines using Microsoft Fabric. The role focuses on delivering ingestion, transformation, and enrichment workflows across a medallion architecture.

Key Responsibilities:
Develop and maintain data pipelines using Microsoft Fabric Data Factory and OneLake.
Design and build ingestion and transformation pipelines for structured and unstructured data.
Implement frameworks for metadata tagging, version control, and batch tracking.
Ensure security, quality, and compliance of data pipelines.
Contribute to CI/CD integration, observability, and documentation.
Collaborate with data architects and analysts to meet business requirements.

Qualifications:
6+ years of experience in data engineering, including 2+ years working on Microsoft Fabric or Azure data services.
Hands-on with tools like Azure Data Factory, Fabric, Databricks, or Synapse.
Strong SQL and data processing skills (e.g., PySpark, Python).
Experience with data cataloging, lineage, and governance frameworks.
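The metadata tagging and batch tracking mentioned above might, at its simplest, look like the following sketch. The bronze/silver/gold layer names follow the medallion convention; everything else (the `BatchRecord` type, the `promote` helper, the field names) is invented for illustration, and a real framework would persist these records to a control table in the lakehouse.

```python
import uuid
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class BatchRecord:
    """Minimal batch-tracking entry for one run of a medallion pipeline."""
    layer: str            # "bronze", "silver", or "gold"
    source: str           # logical name of the upstream system
    row_count: int = 0
    batch_id: str = field(default_factory=lambda: uuid.uuid4().hex)
    started_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

def promote(record: BatchRecord) -> BatchRecord:
    """Create a fresh tracking entry for the next medallion layer."""
    order = ["bronze", "silver", "gold"]
    idx = order.index(record.layer)
    if idx == len(order) - 1:
        raise ValueError("gold is the final layer")
    return BatchRecord(layer=order[idx + 1], source=record.source)
```

Each promotion gets its own batch_id, which is what makes reruns and lineage queries against the control table unambiguous.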
Posted 2 weeks ago
0 years
5 - 9 Lacs
Hyderabad
On-site
Are you looking to take your career to the next level? We're looking for a Data Engineer to join our Data & Analytics Core Data Lake Platform engineering team. We are searching for self-motivated candidates who will leverage modern Agile and DevOps practices to design, develop, test, and deploy IT systems and applications, delivering global projects in multinational teams. The P&G Core Data Lake (CDL) Platform is a central component of P&G's data and analytics ecosystem, used to deliver a broad scope of digital products and frameworks for data engineers and business analysts. In this role you will have an opportunity to leverage your data engineering skillset to deliver solutions enriching data cataloging and data discoverability for our users. With our approach to building solutions that fit the scale at which P&G operates, we combine data engineering best practices (Databricks) with modern software engineering standards (Azure, DevOps, SRE) to deliver value for P&G.

RESPONSIBILITIES:
Writing and testing code for Data & Analytics applications and building end-to-end cloud-native (Azure) solutions.
Engineering applications throughout their entire lifecycle: development, deployment, upgrade, and replacement/termination.
Ensuring that development and architecture conform to established standards, including modern software engineering practices (CI/CD, Agile, DevOps).
Collaborating with internal technical specialists and vendors to develop final products that improve overall performance and efficiency and/or enable adoption of new business processes.
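To make the data cataloging and discoverability work concrete, here is a small sketch of rule-based tag enrichment for a catalog entry. The tag rules and entry shape shown are invented for illustration; real classifications would come from a governance team or the catalog's own scanning service, not a hand-written dictionary.

```python
def enrich_catalog_entry(entry: dict, tag_rules: dict) -> dict:
    """Attach discoverability tags to a catalog entry by matching column names.

    tag_rules maps a tag to column-name substrings that imply it, e.g.
    {"pii": ["email", "phone"]}. Returns a copy of the entry with a sorted
    "tags" list added.
    """
    cols = [c.lower() for c in entry.get("columns", [])]
    tags = sorted({
        tag
        for tag, needles in tag_rules.items()
        if any(n in c for n in needles for c in cols)
    })
    return {**entry, "tags": tags}
```

Even a heuristic pass like this makes datasets searchable by concern ("show me everything tagged pii") before a full classification service is in place.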
Posted 2 weeks ago
0 years
8 - 10 Lacs
Hyderabad
On-site
We are seeking a motivated Software/Platform Engineer with experience in Databricks observability to join our dynamic team in D&A Platforms SRE. The ideal candidate will help maintain the reliability, availability, and performance of our data infrastructure and applications, leveraging Databricks to ensure smooth operations and efficient performance. You will collaborate closely with development, operations, and data teams to implement best practices in observability and monitoring, enabling a proactive approach to incident management and system optimization.

Key Responsibilities:
Reliability and Performance: Design, implement, and maintain scalable and reliable systems and services. Monitor system performance, availability, and reliability, proactively identifying and resolving issues.
Observability Implementation: Apply Databricks observability tools to develop and maintain dashboards, alerts, and reporting mechanisms that provide insights into system performance and usage. Establish and improve observability frameworks to monitor key performance indicators (KPIs) and service-level objectives (SLOs).
Incident Management: Respond to and resolve production incidents, performing root cause analysis and implementing corrective actions to prevent future occurrences. Collaborate with cross-functional teams to ensure effective incident response processes and documentation.
Automation and Efficiency: Develop automation scripts and tools to streamline operational tasks, improve deployment processes, and enhance system reliability. Contribute to the continuous improvement of deployment pipelines and infrastructure-as-code (IaC) practices.
Collaboration and Documentation: Work closely with development teams to understand application architectures and contribute to system design discussions. Document processes, best practices, and system architecture to facilitate knowledge sharing and onboarding.
Performance Optimization: Analyze system performance and application usage patterns to recommend and implement optimizations that improve efficiency and reduce costs.
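SLO monitoring of the kind described above often reduces to tracking an error budget. A minimal sketch for a request-based SLO follows; the function name and signature are this example's own, not a Databricks API.

```python
def error_budget_remaining(slo: float, total: int, failed: int) -> float:
    """Fraction of the error budget left for a request-based SLO.

    slo is the target success ratio (e.g. 0.999 for "three nines").
    Returns 1.0 when no budget is spent and goes negative once the SLO
    is breached over the measured window.
    """
    if not 0 < slo < 1:
        raise ValueError("slo must be strictly between 0 and 1")
    budget = (1 - slo) * total        # allowed failures in this window
    return (budget - failed) / budget if budget else 0.0
```

A dashboard panel showing this value per service, refreshed from request logs, is often the first observability artifact an SRE team builds.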
Posted 2 weeks ago
0 years
7 - 8 Lacs
Hyderabad
On-site
Are you looking to take your career to the next level? We're looking for a Software Engineer to join our Data & Analytics Core Data Lake Platform engineering team. We are searching for self-motivated candidates who will use modern Agile and DevOps practices to craft, develop, test, and deploy IT systems and applications, delivering global projects in multinational teams. The P&G Core Data Lake (CDL) Platform is a central component of P&G's data and analytics ecosystem, used to deliver a broad scope of digital products and frameworks for data engineers and business analysts. In this role you will have an opportunity to demonstrate data engineering abilities to deliver solutions enriching data cataloging and data discoverability for our users. With our approach to building solutions that fit the scale at which P&G operates, we combine data engineering standard methodologies (Databricks) with modern software engineering standards (Azure, DevOps, SRE) to deliver value for P&G.

RESPONSIBILITIES:
Writing and testing code for Data & Analytics applications and building end-to-end cloud-native (Azure) solutions.
Engineering applications throughout their entire lifecycle: development, deployment, upgrade, and replacement/termination.
Ensuring that development and architecture conform to established standards, including modern software engineering practices (CI/CD, Agile, DevOps).
Collaborating with internal technical specialists and vendors to develop final products that improve overall performance and efficiency and/or enable adoption of new business processes.
Posted 2 weeks ago
7.0 years
3 - 5 Lacs
Chennai
On-site
SRE Tool Evaluation & Deployment (7+ Years)

Job Description:
Join AWAC's engineering team to support the transition from Datadog and LogicMonitor to next-generation SRE tools. You'll contribute to tool evaluation, POCs, and deployment across AWS, Azure, and Databricks environments. This role requires hands-on experience with observability platforms and a strong understanding of cloud-native monitoring practices.

Key Responsibilities:
Assist in evaluating and testing SRE tool alternatives.
Support implementation and configuration of selected tools.
Integrate monitoring with cloud and data platforms.
Develop dashboards and alerting mechanisms.

Key Skills: SRE tools (Prometheus, Grafana, etc.), AWS, Azure, Databricks, SQL Server, SSIS, monitoring setup, cloud observability.

About Virtusa: Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities, and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status, or any other basis covered by applicable law. All employment is decided on the basis of qualifications, merit, and business need.
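Tool evaluations of the kind this role describes are often summarized as a weighted scoring matrix. A minimal sketch follows; the criteria, weights, and ratings are invented purely for illustration, and a real POC would derive them from hands-on testing.

```python
def score_tools(weights: dict, ratings: dict) -> list:
    """Rank candidate observability tools by weighted criterion scores.

    weights maps criterion -> weight (should sum to ~1.0); ratings maps
    tool -> {criterion: score}. Returns (tool, score) pairs, best first.
    """
    results = [
        (tool, round(sum(weights[c] * r.get(c, 0) for c in weights), 2))
        for tool, r in ratings.items()
    ]
    return sorted(results, key=lambda t: t[1], reverse=True)
```

Writing the matrix down this way keeps the evaluation auditable: changing a weight re-ranks every candidate consistently instead of re-arguing each tool from scratch.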
Posted 2 weeks ago
10.0 years
1 - 5 Lacs
Chennai
On-site
Implementation (10+ Years)

Job Description:
We are seeking a strategic Observability Lead to spearhead AWAC's SRE tooling transformation. This role involves evaluating and recommending modern alternatives to Datadog and LogicMonitor, conducting gap analysis, and leading the implementation of selected tools across AWS, Azure, and Databricks environments. The ideal candidate will bring deep monitoring expertise and leadership in tool selection, integration, and rollout.

Key Responsibilities:
Lead analysis of current monitoring tools (Datadog, LogicMonitor).
Identify and evaluate SRE tool alternatives (e.g., Prometheus, Grafana, New Relic, Dynatrace).
Architect and implement chosen solutions across cloud and data platforms.
Collaborate with engineering and data teams to ensure seamless integration.

Key Skills: SRE tooling strategy, AWS, Azure, Databricks, SQL Server, SSIS, monitoring architecture, tool evaluation, implementation leadership.
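A common pattern when architecting alerting for a migration like this is multiwindow burn-rate alerting, popularized by the Google SRE Workbook: page only when both a short and a long window burn error budget faster than a threshold multiple of the sustainable rate. The sketch below is a simplified illustration; the 14.4 default is the workbook's conventional "2% of a 30-day budget in 1 hour" example, not a value from this role's tooling.

```python
def should_page(slo: float, short_err: float, long_err: float,
                burn_threshold: float = 14.4) -> bool:
    """Multiwindow burn-rate check for a ratio SLO.

    short_err and long_err are observed error ratios over, e.g., a 5-minute
    and a 1-hour window. Requiring both to exceed the threshold suppresses
    pages for brief blips while still catching sustained burns quickly.
    """
    budget_rate = 1 - slo  # error ratio that exactly consumes budget on time
    return (short_err > burn_threshold * budget_rate
            and long_err > burn_threshold * budget_rate)
```

Tools like Prometheus or Grafana express the same logic as recording rules over two windows ANDed together in the alert condition.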
Posted 2 weeks ago
0 years
3 - 5 Lacs
Chennai
On-site
Job Description:
We're looking for an Engineer to support AWAC's SRE tool migration initiative. You'll help configure and maintain new monitoring tools replacing Datadog and LogicMonitor, and ensure visibility across AWS, Azure, and Databricks environments. Ideal for someone with hands-on experience in observability and a passion for modern tooling.

Key Responsibilities:
Support setup and configuration of new SRE tools.
Assist in dashboard creation and alert tuning.
Collaborate with teams to ensure coverage across systems.

Key Skills: SRE tools, AWS, Azure, Databricks, SQL Server, SSIS, monitoring support, tool configuration.
Posted 2 weeks ago
0 years
4 - 4 Lacs
Chennai
On-site
Responsibilities:
Develop and maintain a metadata-driven generic ETL framework for automating ETL code.
Design, build, and optimize ETL/ELT pipelines using Databricks (PySpark/SQL) on AWS.
Ingest data from a variety of structured and unstructured sources (APIs, RDBMS, flat files, streaming).
Develop and maintain robust data pipelines for batch and streaming data using Delta Lake and Spark Structured Streaming.
Implement data quality checks, validations, and logging mechanisms.
Optimize pipeline performance, cost, and reliability.
Collaborate with data analysts, BI, and business teams to deliver fit-for-purpose datasets.
Support data modeling efforts (star and snowflake schemas, denormalized-table approaches) and assist with data warehousing initiatives.
Work with orchestration tools such as Databricks Workflows to schedule and monitor pipelines.
Follow best practices for version control, CI/CD, and collaborative development.

Skills:
Hands-on experience in ETL/data engineering roles.
Strong expertise in Databricks (PySpark, SQL, Delta Lake); Databricks Data Engineer certification preferred.
Experience with Spark optimization, partitioning, caching, and handling large-scale datasets.
Proficiency in SQL and scripting in Python or Scala.
Solid understanding of data lakehouse/medallion architectures and modern data platforms.
Experience working with cloud storage systems like AWS S3.
Familiarity with DevOps practices: Git, CI/CD, Terraform, etc.
Strong debugging, troubleshooting, and performance-tuning skills.
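A metadata-driven ETL framework, at its core, dispatches extract and transform steps from a declarative spec. The sketch below shows only that control flow, with plain Python callables and dicts standing in for PySpark readers and Delta writes; all spec keys are invented for the example.

```python
def run_pipeline(spec: dict, extractors: dict, transformers: dict) -> list:
    """Drive one ETL run entirely from a metadata spec.

    spec names the source type, the source, and an ordered list of
    transform step names; extractors and transformers are registries
    mapping those names to callables.
    """
    rows = extractors[spec["source_type"]](spec["source"])
    for step in spec.get("transforms", []):
        rows = [transformers[step](r) for r in rows]
    return rows
```

The payoff of this shape is that adding a new feed becomes a metadata change (a new spec row in a control table) rather than new pipeline code.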
Posted 2 weeks ago
0 years
5 - 9 Lacs
Bengaluru
On-site
Req ID: 330864

NTT DATA strives to hire exceptional, innovative, and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Senior DevOps Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Job Duties:
DevOps experience establishing and managing CI/CD pipelines to automate the build, test, and deployment processes.
Experience provisioning and managing cloud infrastructure resources using tools like Terraform.
Experience with Azure Databricks, Azure DevOps tools, Terraform / Azure Resource Manager, and containerization and orchestration with Docker and Kubernetes.
Version control experience: Git or Azure Repos.
Scripting automation: Azure CLI / PowerShell.

Must have: Proficiency in cloud technologies: Azure, Azure Databricks, ADF, CI/CD pipelines, Terraform, HashiCorp Vault, GitHub, Git.

Preferred: Containerization and orchestration with Docker and Kubernetes; IAM, RBAC, OAuth, change management, SSL certificates; knowledge of security best practices and compliance frameworks like GDPR or HIPAA.
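One small, testable slice of the CI/CD pipeline work described above is validating that a pipeline definition runs its stages in the expected order before it is accepted. The sketch below is a hypothetical policy check; the stage names are assumptions for illustration, not Azure DevOps keywords.

```python
def validate_stage_order(stages: list,
                         required=("build", "test", "deploy")) -> bool:
    """Check that all required stages are present and in the given order.

    Extra stages (lint, scan, etc.) may appear anywhere; only the relative
    order of the required stages matters.
    """
    positions = []
    for name in required:
        if name not in stages:
            return False
        positions.append(stages.index(name))
    return positions == sorted(positions)
```

A check like this can run as a pre-merge step against the parsed pipeline YAML, failing fast on definitions that would deploy untested builds.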
About NTT DATA NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
Posted 2 weeks ago
8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Spaulding Ridge is an advisory and IT implementation firm. We help global organizations get financial clarity into the complex daily sales and operational decisions that impact profitable revenue generation, efficient operational performance, and reliable financial management. At Spaulding Ridge, we believe all business is personal. Core to our values is our relationships with our clients, our business partners, our team, and the global community. Our employees dedicate their time to helping our clients transform their business, from strategy through implementation and business transformation. What You Will Do And Learn As a Snowflake Architect/Manager in Data Solutions, you’ll be responsible for designing, implementing, and testing proposed modern analytic solutions. Working closely with our client partners and architects, you’ll develop relationships with key technical resources while delivering tangible business outcomes. Manage the data engineering lifecycle including research, proof of concepts, architecture, design, development, test, deployment, and maintenance Collaborate with team members to design and implement technology that aligns with client business objectives Build proof of concepts for a modern analytics stack supporting a variety of Cloud-based Business Systems for potential clients Team management experience and the ability to manage, mentor and develop the talent of assigned junior resources Create actionable recommendations based on identified platform, structural and/or logic problems Communicate and demonstrate a clear understanding of client business needs, goals, and objectives Collaborate with other architects on solution designs and recommendations.
Qualifications: 8+ years’ experience developing industry-leading business intelligence and analytic solutions Must have thorough knowledge of data warehouse concepts and dimensional modelling Must have experience in writing advanced SQL Must have at least 5+ years of hands-on experience on DBT (Data Build Tool). Mandatory to have most recent hands-on experience on DBT. Must have experience working with DBT on one or more of the modern databases like Snowflake / Amazon Redshift / BigQuery / Databricks / etc. Hands-on experience with Snowflake would carry higher weightage Snowflake SnowPro Core certification would carry higher weightage Experience working in AWS, Azure, GCP or a similar cloud data platform would be an added advantage Hands-on experience on Azure would carry higher weightage Must have experience in setting up DBT projects Must have experience in understanding / creating / modifying & optimizing YML files within DBT Must have experience in implementing and managing data models using DBT, ensuring efficient and scalable data transformations Must have experience with various materialization techniques within DBT Must have experience in writing & executing DBT test cases Must have experience in setting up DBT environments Must have experience in setting up DBT Jobs Must have experience with writing DBT Jinja and Macros Must have experience in creating DBT Snapshots Must have experience in creating & managing incremental models using DBT Must have experience with DBT Docs Should have a good understanding of DBT Seeds Must have experience with DBT Deployment Must have experience architecting data pipelines using DBT, utilizing advanced DBT features Proficiency in version control systems and CI/CD Must have hands-on experience configuring DBT with one or more version control systems like Azure DevOps / GitHub / GitLab / etc.
Must have experience in PR approval workflows Participate in code reviews and best practices for SQL and DBT development Experience working with visualization tools such as Tableau, PowerBI, Looker and other similar analytic tools would be an added advantage 2+ years of Business Data Analyst experience 2+ years of experience writing business requirements, use cases and/or user stories for data warehouse or data mart initiatives. Understanding and experience of ETL/ELT is an added advantage 2+ years of consulting experience working on project-based delivery using the Software Development Life Cycle (SDLC) 2+ years of experience with relational databases (Postgres, MySQL, SQL Server, Oracle, Teradata etc.) 2+ years of experience creating functional test cases and supporting user acceptance testing 2+ years of experience in Agile/Kanban/DevOps delivery Outstanding analytical, communication, and interpersonal skills Ability to manage projects and teams against planned work Responsible for managing the day-to-day client relationship on projects Spaulding Ridge’s Commitment to an Inclusive Workplace When we engage the expertise, insights, and creativity of people from all walks of life, we become a better organization, we deliver superior services to clients, and we transform our communities and world for the better. At Spaulding Ridge, we believe our team should reflect the rich diversity of society and we take seriously the responsibility to cultivate a workplace where every bandmate feels accepted, respected, and valued for who they are. We do this by creating a culture of trust and belonging, through practices and policies that support inclusion, and through our employee led Employee Resource Groups (ERGs): CRE (Cultural Race and Ethnicity), Women Elevate, PROUD and Mental Wellness Alliance. The company is committed to offering Equal Employment Opportunity and to providing reasonable accommodation to applicants with physical and/or mental disabilities.
If you are interested in applying for employment with Spaulding Ridge and are in need of accommodation or special assistance to navigate our website or to complete your application, please send an e-mail with your request to our VP of Human Resources, Cara Halladay (challaday@spauldingridge.com). Requests for reasonable accommodation will be considered on a case-by-case basis. Qualified applicants will receive consideration for employment without regard to their age, race, religion, national origin, gender, sexual orientation, gender identity, protected veteran status or disability.
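The DBT Snapshots requirement in the qualifications above refers to slowly changing dimension (SCD) type-2 tracking, where attribute changes close the current row and append a new one. A minimal pure-Python sketch of that idea; dbt generates the equivalent SQL, and the `city` field and row shape here are illustrative:

```python
# SCD type-2 sketch: when a tracked attribute changes, close the current
# row (set valid_to) and append a new current row, preserving history.
from datetime import date
from typing import Dict, List


def snapshot(history: List[Dict], incoming: Dict, today: date) -> List[Dict]:
    current = next((r for r in history
                    if r["id"] == incoming["id"] and r["valid_to"] is None), None)
    if current is None:
        # first time this key is seen: open a new current row
        history.append({**incoming, "valid_from": today, "valid_to": None})
    elif current["city"] != incoming["city"]:      # tracked attribute changed
        current["valid_to"] = today                # close the old version
        history.append({**incoming, "valid_from": today, "valid_to": None})
    return history


rows: List[Dict] = []
snapshot(rows, {"id": 1, "city": "Pune"}, date(2025, 1, 1))
snapshot(rows, {"id": 1, "city": "Pune"}, date(2025, 2, 1))    # no change, no new row
snapshot(rows, {"id": 1, "city": "Mumbai"}, date(2025, 3, 1))  # change, new version
print(len(rows))  # 2 versions of id 1
```

A dbt snapshot block with `strategy='check'` applies the same close-and-append logic declaratively against a warehouse table.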
Posted 2 weeks ago
5.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
Dear Candidate, Greetings from TATA CONSULTANCY SERVICES LIMITED!!! TCS is hiring for Data Engineer. Please find the required details below. Role: Azure Data Engineer Experience Range: 5 to 8 years Location: Pan India Experience as a data engineer or as an ETL developer Excellent technical expertise with data modeling, data mining, and database design Excellent skills in Azure Databricks, Azure Data Factory, Azure Synapse Analytics Excellent analytical and problem-solving skills Good programming skills in SQL, Python, the Apache Spark framework, and PySpark (2+ years' experience) Good knowledge of data classification, data extraction, and data migration from different sources, and of enhancing data quality and reliability. Conduct complex data analysis and report on results.
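The data-quality and reliability duties above (rejecting incomplete records, deduplicating, normalizing) can be sketched without any Azure service. A minimal stdlib Python example; the `clean` rules and field names are illustrative, not an actual TCS pipeline:

```python
# Minimal data-cleansing sketch: drop rows missing required fields,
# deduplicate on a key, and normalize string casing; these are the kinds
# of quality rules a Databricks/ADF pipeline would apply at scale.
from typing import Dict, List


def clean(rows: List[Dict], key: str, required: List[str]) -> List[Dict]:
    seen = set()
    out = []
    for row in rows:
        if any(row.get(col) in (None, "") for col in required):
            continue                      # reject incomplete records
        k = row[key]
        if k in seen:
            continue                      # keep first occurrence only
        seen.add(k)
        out.append({c: v.strip().lower() if isinstance(v, str) else v
                    for c, v in row.items()})
    return out


raw = [
    {"id": 1, "email": " A@X.COM "},
    {"id": 1, "email": "a@x.com"},    # duplicate id, dropped
    {"id": 2, "email": None},         # missing required field, dropped
    {"id": 3, "email": "b@y.com"},
]
print(clean(raw, key="id", required=["email"]))
# two rows survive: ids 1 and 3
```

In PySpark the same rules would be `dropna`, `dropDuplicates`, and column expressions over a DataFrame rather than a Python loop.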
Posted 2 weeks ago
7.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Dreaming big is in our DNA. It’s who we are as a company. It’s our culture. It’s our heritage. And more than ever, it’s our future. A future where we’re always looking forward. Always serving up new ways to meet life’s moments. A future where we keep dreaming bigger. We look for people with passion, talent, and curiosity, and provide them with the teammates, resources and opportunities to unleash their full potential. The power we create together – when we combine your strengths with ours – is unstoppable. Are you ready to join a team that dreams as big as you do? AB InBev GCC was incorporated in 2014 as a strategic partner for Anheuser-Busch InBev. The center leverages the power of data and analytics to drive growth for critical business functions such as operations, finance, people, and technology. The teams are transforming Operations through Tech and Analytics. Do You Dream Big? We Need You. Job Description Job Title: Azure Data Engineer Location: Bengaluru Reporting to: Senior Manager Data Engineering Purpose of the role We are seeking a skilled and motivated Azure Data Engineer to join our dynamic team. The ideal candidate will have hands-on experience with Microsoft Azure cloud services, data engineering, and a strong background in designing and implementing scalable data solutions. Key tasks & accountabilities Design, develop, and maintain scalable data pipelines and workflows using Azure Data Factory, Azure Databricks, and other relevant tools. Implement and optimize data storage solutions in Azure, including Azure PostgreSQL Database, Azure Blob Storage. Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and implement solutions that align with business objectives. Ensure data quality, integrity, and security in all data-related processes and implementations. Work with both structured and unstructured data and implement data transformation and cleansing processes. 
Optimize and fine-tune performance of data solutions to meet both real-time and batch processing requirements. Troubleshoot and resolve issues related to data pipelines, ensuring minimal downtime and optimal performance. Stay current with industry trends and best practices, and proactively recommend improvements to existing data infrastructure. Qualifications, Experience, Skills Bachelor's degree in Computer Science, Information Technology, or a related field. Proven experience as a Data Engineer with a focus on Microsoft Azure technologies. Hands-on experience with Azure services such as Azure Data Factory, Azure Databricks, Azure SQL Database, Azure Data Lake Storage, and Azure Synapse Analytics. Strong proficiency in SQL and experience with data modeling and ETL processes. Familiarity with data integration and orchestration tools. Knowledge of data warehousing concepts and best practices. Experience with version control systems, preferably Git. Excellent problem-solving and communication skills. Level Of Educational Attainment Required B.Tech Previous Work Experience 7+ Years of Experience Technical Expertise Proven experience in Azure Databricks and ADLS architecture and implementation. Strong knowledge of medallion architecture and data lake design. Expertise in SQL, Python, and Spark for building and optimizing data pipelines. Familiarity with data integration tools and techniques, including Azure-native solutions. And above all of this, an undying love for beer! We dream big to create a future with more cheers
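The medallion architecture named in the qualifications above layers data as bronze (raw as ingested), silver (validated and typed), and gold (business-level aggregates). A hedged stdlib sketch of that flow; in practice each layer is typically a Delta table and the transforms run in Spark, so the plain lists and field names here are illustrative only:

```python
# Medallion sketch: bronze = raw, silver = cleansed/typed, gold = aggregate.
from collections import defaultdict
from typing import Dict, List


def to_silver(bronze: List[Dict]) -> List[Dict]:
    """Cleanse: keep rows with a parseable amount, cast types."""
    silver = []
    for row in bronze:
        try:
            silver.append({"region": row["region"].strip(),
                           "amount": float(row["amount"])})
        except (KeyError, ValueError, AttributeError):
            continue  # a real pipeline would quarantine malformed rows
    return silver


def to_gold(silver: List[Dict]) -> Dict[str, float]:
    """Aggregate: total amount per region."""
    totals: Dict[str, float] = defaultdict(float)
    for row in silver:
        totals[row["region"]] += row["amount"]
    return dict(totals)


bronze = [{"region": "south ", "amount": "10.5"},
          {"region": "south", "amount": "4.5"},
          {"region": "north", "amount": "oops"}]  # dropped in silver
print(to_gold(to_silver(bronze)))  # {'south': 15.0}
```

The design point is that each layer is materialized separately, so a bad transform can be rerun from the previous layer instead of re-ingesting the source.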
Posted 2 weeks ago
10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
P-375 At Databricks, we are passionate about enabling data teams to solve the world's toughest problems — from making the next mode of transportation a reality to accelerating the development of medical breakthroughs. We do this by building and running the world's best data and AI infrastructure platform so our customers can use deep data insights to improve their business. Founded by engineers — and customer obsessed — we leap at every opportunity to solve technical challenges, from designing next-gen UI/UX for interfacing with data to scaling our services and infrastructure across millions of virtual machines. Databricks Mosaic AI offers a unique data-centric approach to building enterprise-quality, Machine Learning and Generative AI solutions, enabling organizations to securely and cost-effectively own and host ML and Generative AI models, augmented or trained with their enterprise data. And we're only getting started in Bengaluru, India, and are currently in the process of setting up 10 new teams from scratch! As a Staff Software Engineer at Databricks India, you can get to work across: Backend DDS (Distributed Data Systems) Full Stack The Impact You'll Have Our Backend teams span many domains across our essential service platforms. For instance, you might work on challenges such as: Problems that span from product to infrastructure including: distributed systems, at-scale service architecture and monitoring, workflow orchestration, and developer experience. Deliver reliable and high performance services and client libraries for storing and accessing humongous amounts of data on cloud storage backends, e.g., AWS S3, Azure Blob Store. Build reliable, scalable services, e.g. Scala, Kubernetes, and data pipelines, e.g. Apache Spark™, Databricks, to power the pricing infrastructure that serves millions of cluster-hours per day and develop product features that empower customers to easily view and control platform usage.
Our DDS team spans across: Apache Spark™ Data Plane Storage Delta Lake Delta Pipelines Performance Engineering As a Full Stack software engineer, you will work closely with your team and product management to bring that delight through great user experience. What We Look For BS (or higher) in Computer Science, or a related field 10+ years of production level experience in one of: Python, Java, Scala, C++, or similar language. Experience developing large-scale distributed systems from scratch Experience working on a SaaS platform or with Service-Oriented Architectures. About Databricks Databricks is the data and AI company. More than 10,000 organizations worldwide — including Comcast, Condé Nast, Grammarly, and over 50% of the Fortune 500 — rely on the Databricks Data Intelligence Platform to unify and democratize data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe and was founded by the original creators of Lakehouse, Apache Spark™, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook. Benefits At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of our employees. For specific details on the benefits offered in your region, please visit https://www.mybenefitsnow.com/databricks. Our Commitment to Diversity and Inclusion At Databricks, we are committed to fostering a diverse and inclusive culture where everyone can excel. We take great care to ensure that our hiring practices are inclusive and meet equal employment opportunity standards. Individuals looking for employment at Databricks are considered without regard to age, color, disability, ethnicity, family or marital status, gender identity or expression, language, national origin, physical and mental ability, political affiliation, race, religion, sexual orientation, socio-economic status, veteran status, and other protected characteristics. 
Compliance If access to export-controlled technology or source code is required for performance of job duties, it is within Employer's discretion whether to apply for a U.S. government license for such positions, and Employer may decline to proceed with an applicant on this basis alone.
Posted 2 weeks ago
5.0 years
0 Lacs
Greater Bengaluru Area
Remote
About us: The global hiring revolution is shaping a future where talent can thrive everywhere, driving innovation and progress on a global scale. Multiplier is at the forefront of this change. By removing barriers and simplifying global hiring, we’re creating a level playing field where businesses and individuals – (like you) – can compete, grow, and succeed, regardless of geography. Multiplier empowers companies to hire, onboard, manage, and pay talent in 150+ countries, quickly and compliantly. Our mission is to build a world without limits, where ambitious businesses can look beyond borders to build their global dream teams. Our unified employment platform, complete with world-class EOR, AOR, and Global Payroll products, means it has never been easier to seize the global hiring opportunity. We’re backed by some of the best in the business, (Sequoia, DST, and Tiger Global), are led by industry-leading experts, scaling fast, and seeking brilliant like-minded enthusiasts to join our team. The future is borderless. Let’s build it together. A BIT ABOUT THE OPPORTUNITY What you'll do: Design and build, from scratch, the data architecture and the data platform necessary to support the requirements at Multiplier. Work closely with stakeholders and product managers to deliver all data product requirements for our external and internal customers. Understand internal data sets and sources to be able to build data lakes and warehouses to support continuous needs. Analyse and utilise external data sets and sources to be able to answer questions and derive insights based on the business requirements. What you'll bring: At least 5 years of experience as a Data Engineer or related field. Experience with data modelling, data warehousing, and building ETL pipelines preferably on the AWS stack.
Experience with big data tools such as Databricks, Redshift, Snowflake, or similar platforms Proficiency in open table formats like Apache Iceberg, Delta Lake, or Hudi Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases. Experience in working with data analytics tools such as Tableau or QuickSight. Experience with high-level scripting/programming languages: Python, JavaScript, Java etc. Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement. Strong analytic skills related to working with unstructured datasets. Management skills are a huge plus. What we’ll provide for you: Attractive ESOPs Remote employment with truly remote culture. Ability to contribute to this business at a high level. Working with a compassionate, energetic, inspired, ambitious, and diverse team. Opportunity to grow within a fast-growth business. Competitive benefits, compensation, and culture of recognition. Equipment you need to do your job Unlimited holiday policy. Feel free to apply even if you feel unsure about whether you meet every single requirement in this posting. As long as you're a quick learner, and are excited about changing the status quo for tech recruitment, we're happy to support you as you come up to speed with our tech stack.
Posted 2 weeks ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Role overview Experience: 10 plus years’ experience in software development, with at least the last few years in a leadership role with progressively increasing responsibilities. Extensive experience in the following areas: Proficient in C#, the .NET framework, Python Knowledgeable in Angular / React / Node Designing and building cloud-native solutions (Azure, AWS, Google Cloud Platform) Well-versed in LLM solutions, RAG, vectorization Containerization and orchestration technologies (Docker, Kubernetes) Experience in native and third-party Databricks integrations (Delta Live Tables, Lakeflow, Databricks Workflows / Apache Airflow, Unity Catalog) Experience designing and implementing a data security and governance platform adhering to compliance standards (HIPAA, SOC 2) preferred Specific Job Knowledge, Skill and Ability Demonstrated success in effectively communicating at all levels of an organization Deep understanding and knowledge of developing products using Microsoft technologies Ability to lead through influence rather than direct authority Demonstrated successful time management and organization skills Ability to manage and work with a culturally diverse population Ability to work well and productively, always projecting a positive outlook in a fast-paced, deadline-driven environment Ability to anticipate roadblocks, diagnose problems and generate effective solutions Knows how to organize a software development team to maximize quality and output Will promote and encourage opportunities for personal and professional growth in employees Understands how to use metrics to drive process improvements What would you do here? Duties and Responsibilities Function as a key member of the software engineering team, leading software engineering development initiatives for LogixHealth’s internally developed applications.
Provide decisive and effective technical leadership for all development efforts Develop solution blueprints for all new features and products, bringing in design patterns and the latest development guidelines Lead, mentor, and advise the software engineering team. Drive technical debt reduction, design and code reviews, and best-practice development Consult with the infra team on architecture and cost optimization for applications Lead modernization of the application and data platform, introducing automation and AI use across all platforms and development methodologies Lead and contribute to the creation of self-service platforms for software development, infrastructure, and data analytics. Take on complex development tasks, prototyping, and troubleshooting hands-on Collaborate with engineers, product, and business leaders to ensure platforms are integrated with other systems and technologies. Participate in the development process by identifying potential weak points. Lead solutions development using technical judgment, input from experts and the involvement of other systems development partners as appropriate Clearly and consistently communicate the product vision to the team. Guide the team to achieve this vision
Posted 2 weeks ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Hi {fullName}, There is an opportunity for AZURE DATA ENGINEER AT NOIDA, for which a WALKIN interview will be held on 12th July 25 between 9:00 AM and 12:30 PM. PLS SHARE the below details to mamidi.p@tcs.com with the subject line AZURE DATA ENGINEER if you are interested: Email id: Contact no: Total EXP: Preferred Location: CURRENT CTC: EXPECTED CTC: NOTICE PERIOD: CURRENT ORGANIZATION: HIGHEST QUALIFICATION THAT IS FULL TIME: HIGHEST QUALIFICATION UNIVERSITY: ANY GAP IN EDUCATION OR EMPLOYMENT: IF YES, HOW MANY YEARS AND REASON FOR GAP: ARE U AVAILABLE FOR WALKIN INTERVIEW at NOIDA ON 12TH JULY 25 (YES/NO): We will send you a mail by tomorrow night if you are shortlisted. 1 Role: Azure Data Engineer - Databricks 2 Required Technical Skill Set: Azure SQL, Azure SQL DW, Azure Data Lake Store, Azure Data Factory. Must have implementation and operations experience with OLTP, OLAP, and DW technologies such as Azure SQL, Azure SQL DW, Azure Data Lake Store, Azure Data Factory, and an understanding of Microsoft Azure PaaS features. Azure Cloud, Azure Databricks, and Data Factory knowledge are good to have; otherwise any cloud exposure. Ability to gather requirements from the client side and explain them to tech team members. Resolve conflicts in terms of bandwidth or design issues. Good understanding of data modeling, data analysis, data governance. Very good communication skills and client handling skills
Posted 2 weeks ago
130.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Dentsply Sirona is the world’s largest manufacturer of professional dental products and technologies, with a 130-year history of innovation and service to the dental industry and patients worldwide. Dentsply Sirona develops, manufactures, and markets a comprehensive solutions offering including dental and oral health products as well as other consumable medical devices under a strong portfolio of world class brands. Dentsply Sirona’s products provide innovative, high-quality and effective solutions to advance patient care and deliver better and safer dentistry. Dentsply Sirona’s global headquarters is located in Charlotte, North Carolina, USA. The company’s shares are listed in the United States on NASDAQ under the symbol XRAY. Bringing out the best in people As advanced as dentistry is today, we are dedicated to making it even better. Our people have a passion for innovation and are committed to applying it to improve dental care. We live and breathe high performance, working as one global team, bringing out the best in each other for the benefit of dental patients, and the professionals who serve them. If you want to grow and develop as a part of a team that is shaping an industry, then we’re looking for the best to join us. Working At Dentsply Sirona You Are Able To Develop faster - with our commitment to the best professional development. Perform better - as part of a high-performance, empowering culture. Shape an industry - with a market leader that continues to drive innovation. Make a difference -by helping improve oral health worldwide. Scope Role has global scope and includes managing and leading data flow and transformation development in the Data Engagement Platform (DEP). This role will lead work of DS employees as well as contractors. Key Responsibilities Develop and maintain high quality data warehouse solution. Maintain accurate and complete technical architectural documents. 
Collaborate with BI Developers and Business Analysts for successful development of BI reporting and analysis. Work with business groups and technical teams to develop and maintain the data warehouse platform for BI reporting. Develop a scalable and maintainable data layer for BI applications to meet business objectives. Work in a small, smart, agile team – designing, developing and owning the full solution for an assigned data area Develop standards, patterns, best practices for reuse and acceleration. Perform maintenance and troubleshooting activities in the Azure data platform. Analyze, plan and develop requirements and standards in reference to scheduled projects. Partake in the process of defining clear project deliverables. Coordinate the development of standards, patterns, best practices for reuse and acceleration. Typical Background Education: University Degree or equivalent in MIS or similar Years And Type Of Experience 5-10 years working with BI and data warehouse solutions. Key Required Skills, Knowledge And Capabilities Good understanding of business logic and understanding of their needs. Some experience with Databricks and dbt is desirable. Worked with Azure DevOps code repository, version control and task management. Strong proficiency with SQL and its variation among popular databases. Knowledge of best practices when dealing with relational databases Capable of troubleshooting common database issues You have knowledge of data design and analysis of BI systems and processes. Strong analytical and logical thinking Internationally and culturally aware Communicate well verbally and in writing in English Key Leadership Behaviors Dentsply Sirona managers are expected to successfully demonstrate behaviors aligned with the Competency model. See competencies below together with a Key Specific.
Behaviors For Success Teamwork – Defines success in terms of the whole team Customer Focus – Is dedicated to meeting the expectations and requirements of internal and external customers and seeking to make improvements with the customer in mind Strategic Thinking – Applies experience, knowledge, and perspective of business and external or global factors to create new perspectives and fresh thinking Talent Management – Actively seeks assignments that stretch her beyond comfort zone Integrity – Raises potential ethical concerns to the right party Problem Solving – Can analyze problems and put together a plan for resolution within her scope of responsibility Drive for Results – Can be counted on to reach goals successfully Accountability – Acts with a clear sense of ownership Innovation and Creativity – Brings creative ideas to work and acts to take advantage of opportunities to improve business Leading Change – Adapts to changing priorities and acts without having the total picture DentsplySirona is an Equal Opportunity/ Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, age, sexual orientation, disability, or protected Veteran status. We appreciate your interest in DentsplySirona. If you need assistance with completing the online application due to a disability, please send an accommodation request to careers@dentsplysirona.com. Please be sure to include “Accommodation Request” in the subject.
Posted 2 weeks ago
10.0 - 15.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Designation : Data Architect. Location : Pune. Experience : 10-15 years. Job Description Role & Responsibilities : The architect should have experience in architecting large scale analytics solutions using native services such as Azure Synapse, Data Lake, Data Factory, HDInsight, Databricks, Azure Cognitive Services, Azure ML, Azure Event Hub. Assist with creation of a robust, sustainable architecture that supports requirements and provides for expansion with secured access. Experience in building/running large data environment for BFSI clients. Work with customers, end users, technical architects, and application designers to define the data requirements and data structure for BI/Analytic solutions. Designs conceptual and logical models for the data lake, data warehouse, data mart, and semantic layer (data structure, storage, and integration). Lead the database analysis, design, and build effort. Communicates physical database designs to lead data architect/database administrator. Evolves data models to meet new and changing business requirements. Work with business analysts to identify and understand requirements and source data systems. Skills Required Big Data Technologies : Expert in big data technologies on Azure/GCP. ETL Platforms : Experience with ETL platforms like ADF, Glue, Ab Initio, Informatica, Talend, Airflow. Data Visualization : Experience in data visualization tools like Tableau, Power BI, etc. Data Engineering & Management : Experience in a data engineering, metadata management, database modeling and development role. Streaming Data Handling : Strong experience in handling streaming data with Kafka. Data API Understanding : Understanding of Data APIs, Web services. Data Security : Experience in Data security and Data Archiving/Backup, Encryption and define the standard processes for same. DataOps/MLOps : Experience in setting up DataOps and MLOps. 
Integration : Work with other architects to ensure that all components work together to meet objectives and performance goals as defined in the requirements. Data Science Coordination : Coordinate with the Data Science Teams to identify future data needs and requirements and creating pipelines for them. Soft Skills Soft skills such as communication, leading the team, taking ownership and accountability to successful engagement. Participate in quality management reviews. Managing customer expectation and business user interactions. Deliver key research (MVP, POC) with an efficient turn-around time to help make strong product decisions. Demonstrate key understanding and expertise on modern technologies, architecture, and design. Mentor the team to deliver modular, scalable, and high-performance code. Innovation : Be a change agent on key innovation and research to keep the product, team at the cutting edge of technical and product innovation. (ref:hirist.tech)
Posted 2 weeks ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Required Skills
- Proficiency in multiple programming languages, ideally Python.
- Proficiency in at least one cluster-computing framework (preferably Spark; alternatively Flink or Storm).
- Proficiency in at least one cloud data lakehouse platform (preferably AWS data lake services or Databricks; alternatively Hadoop), at least one relational data store (Postgres, Oracle, or similar), and at least one NoSQL data store (Cassandra, Dynamo, MongoDB, or similar).
- Proficiency in at least one scheduling/orchestration tool (preferably Airflow; alternatively AWS Step Functions or similar).
- Proficiency with data structures, data serialization formats (JSON, Avro, Protobuf, or similar), big-data storage formats (Parquet, Iceberg, or similar), data processing methodologies (batch, micro-batch, and stream), one or more data modelling techniques (Dimensional, Data Vault, Kimball, Inmon, etc.), Agile methodology (developing PI plans and roadmaps), TDD (or BDD), and CI/CD tools (Jenkins, Git).
- Strong organizational, problem-solving, and critical-thinking skills; strong documentation skills.

Preferred Skills
- Proficiency in IaC (preferably Terraform; alternatively AWS CloudFormation).

(ref:hirist.tech)
Posted 2 weeks ago
10.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Implementation (10+ Years)

Job Description
We are seeking a strategic Observability Lead to spearhead AWAC's SRE tooling transformation. This role involves evaluating and recommending modern alternatives to Datadog and LogicMonitor, conducting a gap analysis, and leading the implementation of the selected tools across AWS, Azure, and Databricks environments. The ideal candidate will bring deep monitoring expertise and leadership in tool selection, integration, and rollout.

Key Responsibilities
- Lead the analysis of the current monitoring tools (Datadog, LogicMonitor).
- Identify and evaluate SRE tooling alternatives (e.g., Prometheus, Grafana, New Relic, Dynatrace).
- Architect and implement the chosen solutions across cloud and data platforms.
- Collaborate with engineering and data teams to ensure seamless integration.

Key Skills: SRE Tooling Strategy, AWS, Azure, Databricks, SQL Server, SSIS, Monitoring Architecture, Tool Evaluation, Implementation Leadership
Posted 2 weeks ago