4.0 years
0 Lacs
Thane, Maharashtra
On-site
202503666 Thane, Maharashtra, India

Description
Experience: 4+ years of experience

Key Responsibilities
- Help define and improve actionable, decision-driving management information (MI)
- Ensure streamlining, consistency, and standardization of MI within the handled domain
- Build and operate flexible processes/reports that meet changing business needs
- Prepare detailed documentation of schemas
- Any other duties commensurate with position or level of responsibility

Desired Profile
- Prior experience in insurance companies/the insurance sector would be an added advantage
- Experience with Azure technologies (including SSAS, SQL Server, Azure Data Lake, Synapse)
- Hands-on experience with SQL and Power BI
- Excellent understanding of developing stored procedures, functions, views, and T-SQL programs
- Develop and maintain ETL (data extraction, transformation, and loading) mappings using ADF to extract data from multiple source systems
- Analyze existing SQL queries for performance improvements
- Excellent written and verbal communication skills
- Ability to create and design MI for the team
- Expected to handle multiple projects with stringent timelines
- Good interpersonal skills
- Actively influences strategy through creative and unique ideas
- Exposure to documentation activities

Key Competencies
- Technical Learning - Can learn new skills and knowledge as per business requirements
- Action Oriented - Enjoys working; is action oriented and full of energy for the things he/she sees as challenging; not fearful of acting with a minimum of planning; seizes more opportunities than others
- Decision Quality - Makes good decisions based upon a mixture of analysis, wisdom, experience, and judgment
- Detail Orientation - High attention to detail, especially in data quality, documentation, and reporting
- Communication & Collaboration - Strong interpersonal skills, effective in team discussions and stakeholder interactions

Qualifications
B.Com/BE/BTech/MCA from a reputed college/institute
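As an illustration of the stored-procedure and MI-reporting skills this posting asks for, below is a minimal, hedged Python sketch that calls a T-SQL stored procedure via pyodbc and shapes the result into a simple MI summary. The server, database, procedure name, parameter, and column names are hypothetical placeholders, not details from the posting.

```python
# Illustrative sketch: call a T-SQL stored procedure and shape the result for an MI pack.
# All connection details and object names below are hypothetical.
import pyodbc
import pandas as pd

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.database.windows.net;"   # hypothetical Azure SQL server
    "DATABASE=mi_reporting;"                   # hypothetical database
    "UID=report_user;PWD=<secret>;Encrypt=yes;"
)

# Execute a parameterised stored procedure (hypothetical name and signature).
df = pd.read_sql(
    "EXEC dbo.usp_ClaimsSummary @AsOfDate = ?", conn, params=["2024-03-31"]
)

# Basic shaping for the MI report: one row per product line with claim totals.
summary = df.groupby("product_line", as_index=False)["claim_amount"].sum()
print(summary.head())
conn.close()
```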
Posted 1 month ago
5.0 years
0 Lacs
India
On-site
Job Summary:
We are looking for an experienced Database & Data Engineer who can own the full lifecycle of our cloud data systems, from database optimization to building scalable data pipelines. This hybrid role demands deep expertise in SQL performance tuning, cloud-native ETL/ELT, and modern Azure data engineering using tools like Azure Data Factory, Databricks, and PySpark. Ideal candidates will be comfortable working across Medallion architecture, transforming raw data into high-quality assets ready for analytics and machine learning.

Key Responsibilities:

🔹 Database Engineering
- Implement and optimize indexing, partitioning, and sharding strategies to improve performance and scalability.
- Tune and refactor complex SQL queries, stored procedures, and triggers using execution plans and profiling tools.
- Perform database performance benchmarking, query profiling, and resource usage analysis.
- Address query bottlenecks, deadlocks, and concurrency issues using diagnostic tools and SQL optimization.
- Design and implement read/write splitting and horizontal/vertical sharding for distributed systems.
- Automate backup, restore, high availability, and disaster recovery using native Azure features.
- Maintain schema versioning and enable automated deployment via CI/CD pipelines and Git.

🔹 Data Engineering
- Build and orchestrate scalable data pipelines using Azure Data Factory (ADF), Databricks, and PySpark.
- Implement Medallion architecture with Bronze, Silver, and Gold layers in Azure Data Lake.
- Process and transform data using PySpark, Pandas, and NumPy.
- Create and manage data integrations from REST APIs, flat files, databases, and third-party systems.
- Develop and manage incremental loads, SCD Type 1 & 2, and advanced data transformation workflows.
- Leverage Azure services like Synapse, Azure SQL DB, Azure Blob Storage, and Azure Data Lake Gen2.
- Ensure data quality, consistency, and lineage across environments.

🔹 Collaboration & Governance
- Work with cross-functional teams including data science, BI, and business analysts.
- Maintain standards around data governance, privacy, and security compliance.
- Contribute to internal documentation and the team knowledge base using tools like JIRA, Confluence, and SharePoint.
- Participate in Agile workflows and help define sprint deliverables for data engineering tasks.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 5+ years of hands-on experience in data engineering and SQL performance optimization in cloud environments.
- Expertise in Azure Data Factory, Azure Data Lake, Azure SQL, Azure Synapse, and Databricks.
- Proficient in SQL, Python, PySpark, Pandas, and NumPy.
- Strong experience in query performance tuning, indexing, and partitioning.
- Familiar with PostgreSQL (PGSQL) and handling NoSQL databases like Cosmos DB or Elasticsearch.
- Experience with REST APIs, flat files, and real-time integrations.
- Working knowledge of version control (Git) and CI/CD practices in Azure DevOps or equivalent.
- Solid understanding of Medallion architecture, lakehouse concepts, and data reliability best practices.

Preferred Qualifications:
- Microsoft Certified: Azure Data Engineer Associate or equivalent.
- Familiarity with Docker, Kubernetes, or other containerization tools.
- Exposure to streaming platforms such as Kafka, Azure Event Hubs, or Azure Stream Analytics.
- Industry experience in supply chain, logistics, or finance is a plus.
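For the Medallion-layer and SCD Type 2 responsibilities above, a minimal PySpark + Delta Lake sketch of one common pattern follows: expire the current silver-layer record for each incoming key, then append the new version. All paths, the storage account, and the column names are assumptions, and a real implementation would usually add a change-detection hash so unchanged records are not rewritten.

```python
# Hedged sketch of an SCD Type 2 load from a bronze landing zone into a silver Delta table.
from pyspark.sql import SparkSession, functions as F
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()  # on Databricks, a session already exists

updates = (spark.read.format("parquet")
           .load("abfss://bronze@datalake.dfs.core.windows.net/customers/2024-03-31/")
           .withColumn("effective_from", F.current_date())
           .withColumn("effective_to", F.lit(None).cast("date"))
           .withColumn("is_current", F.lit(True)))

target = DeltaTable.forPath(
    spark, "abfss://silver@datalake.dfs.core.windows.net/customers/")

# Step 1: expire the currently active version of every customer present in the feed.
# A change-detection hash would normally restrict this to rows that actually changed.
(target.alias("t")
    .merge(updates.alias("s"),
           "t.customer_id = s.customer_id AND t.is_current = true")
    .whenMatchedUpdate(set={"is_current": "false",
                            "effective_to": "s.effective_from"})
    .execute())

# Step 2: append the incoming records as the new current versions.
(updates.write.format("delta").mode("append")
    .save("abfss://silver@datalake.dfs.core.windows.net/customers/"))
```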
Posted 1 month ago
4.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

United Health Group is a leading health care company serving more than 85 million people worldwide. The organization is ranked 5th among Fortune 500 companies. UHG serves its customers through two different platforms - United Health Care (UHC) and Optum. UHC is responsible for providing healthcare coverage and benefits services, while Optum provides information and technology enabled health services. India operations of UHG are aligned to Optum. The Optum Global Analytics Team, part of Optum, is involved in developing broad-based and targeted analytics solutions across different verticals for all lines of business.

Primary Responsibility
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications
- Bachelor's degree or 4-year university degree
- 5+ years of experience
- Development experience on Azure and Databricks
- Hands-on experience in developing ETL pipelines using Databricks and ADF (Azure Data Factory)
- Hands-on experience in Python, PySpark, and Spark SQL
- Hands-on experience migrating on-premises environments to Azure Cloud
- Azure cloud exposure (Azure services like Virtual Machines, Load Balancer, SQL Database, Azure DNS, Blob Storage, Azure AD, etc.)
- Good to have: hands-on experience with big data platforms (Hadoop, Hive) and SQL scripting
- Good to have: experience with Scala, Snowflake, and the healthcare domain
- Good to have: experience with CI/CD tools such as GitHub Actions
- Ability to develop innovative approaches to performance optimization and automation
- Proven excellent verbal communication and presentation skills

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
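Since this posting centres on ETL pipelines built with Databricks and ADF, a small hedged sketch of the typical hand-off is shown below: an ADF Notebook activity passes a run date into a Databricks notebook, which filters and aggregates that day's data with PySpark. The widget name, storage paths, table, and columns are assumptions, and `dbutils`/`spark` exist only inside a Databricks runtime.

```python
# Illustrative Databricks notebook cell driven by an ADF Notebook activity parameter.
from pyspark.sql import functions as F

dbutils.widgets.text("load_date", "")          # populated by ADF at run time
load_date = dbutils.widgets.get("load_date")

claims = (spark.read.format("delta")
          .load("abfss://raw@lake.dfs.core.windows.net/claims/")   # hypothetical path
          .filter(F.col("ingest_date") == load_date))

# Re-run safe daily aggregate: overwrite only this date's slice of the output table
# (assumes the target table contains/partitions by ingest_date).
(claims.groupBy("plan_id")
       .agg(F.sum("paid_amount").alias("paid_amount"))
       .withColumn("ingest_date", F.lit(load_date))
       .write.format("delta").mode("overwrite")
       .option("replaceWhere", f"ingest_date = '{load_date}'")
       .saveAsTable("analytics.daily_claims_summary"))
```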
Posted 1 month ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Description

Key Responsibilities
- Lead the design, development, and deployment of Oracle Fusion SaaS solutions, particularly in Supply Chain and Finance.
- Build and maintain integrations using Oracle Integration Cloud (OIC), REST/SOAP web services, and middleware tools.
- Customize and extend Fusion applications using BI Publisher, OTBI, FBDI, HDL, and ADF.
- Translate business requirements into technical specifications and detailed solution designs.
- Support the full development lifecycle including change management, documentation, testing, and deployment.
- Participate in formal design/code reviews and ensure adherence to coding standards.
- Collaborate with IT service providers to ensure quality, performance, and scalability of outsourced work.
- Provide Level 3 support for critical technical issues.
- Stay current with emerging Oracle technologies and contribute to continuous improvement initiatives.

Qualifications
- Bachelor's degree in Computer Science, Information Technology, Business, or a related field (or equivalent experience).
- Relevant certifications in Oracle Fusion or related technologies are a plus.
- Compliance with export controls or sanctions regulations may be required.

Core Competencies
- Customer Focus – Builds strong relationships and delivers customer-centric solutions.
- Global Perspective – Applies a broad, global lens to problem-solving.
- Manages Complexity – Navigates complex information to solve problems effectively.
- Manages Conflict – Resolves conflicts constructively and efficiently.
- Optimizes Work Processes – Continuously improves processes for efficiency and effectiveness.
- Values Differences – Embraces diverse perspectives and cultures.

Technical Competencies
- Solution Design & Configuration – Designs scalable, secure, and maintainable solutions.
- Solution Functional Fit Analysis – Evaluates how well components interact to meet business needs.
- Solution Modeling & Validation Testing – Creates models and tests solutions to ensure they meet requirements.
- Performance Tuning & Data Modeling – Optimizes application and database performance.

Skills & Technical Expertise
- Strong knowledge of Oracle SaaS architecture, data models, and PaaS extensions.
- Proficiency in Oracle Integration Cloud (OIC) and REST/SOAP APIs.
- Experience with Oracle tools: BI Publisher, OTBI, FBDI, HDL, ADF.
- Ability to analyze and revise existing systems for improvements.
- Familiarity with SDLC, version control, and automation tools.

Experience
- 5+ years of hands-on experience in Oracle Fusion SaaS development and technical implementation.
- Proven experience with Oracle Fusion Supply Chain and Finance modules.
- Intermediate level of relevant work experience (3–5 years minimum).

Job: Systems/Information Technology
Organization: Cummins Inc.
Role Category: Hybrid
Job Type: Exempt - Experienced
ReqID: 2415054
Relocation Package: No
Posted 1 month ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Description

Key Responsibilities
- Lead the design, development, and deployment of Oracle Fusion SaaS solutions, particularly in Supply Chain and Finance.
- Build and maintain integrations using Oracle Integration Cloud (OIC), REST/SOAP web services, and middleware tools.
- Customize and extend Fusion applications using BI Publisher, OTBI, FBDI, HDL, and ADF.
- Translate business requirements into technical specifications and detailed solution designs.
- Support the full development lifecycle including change management, documentation, testing, and deployment.
- Participate in formal design/code reviews and ensure adherence to coding standards.
- Collaborate with IT service providers to ensure quality, performance, and scalability of outsourced work.
- Provide Level 3 support for critical technical issues.
- Stay current with emerging Oracle technologies and contribute to continuous improvement initiatives.

Qualifications
- Bachelor's degree in Computer Science, Information Technology, Business, or a related field (or equivalent experience).
- Relevant certifications in Oracle Fusion or related technologies are a plus.
- Compliance with export controls or sanctions regulations may be required.

Core Competencies
- Customer Focus – Builds strong relationships and delivers customer-centric solutions.
- Global Perspective – Applies a broad, global lens to problem-solving.
- Manages Complexity – Navigates complex information to solve problems effectively.
- Manages Conflict – Resolves conflicts constructively and efficiently.
- Optimizes Work Processes – Continuously improves processes for efficiency and effectiveness.
- Values Differences – Embraces diverse perspectives and cultures.

Technical Competencies
- Solution Design & Configuration – Designs scalable, secure, and maintainable solutions.
- Solution Functional Fit Analysis – Evaluates how well components interact to meet business needs.
- Solution Modeling & Validation Testing – Creates models and tests solutions to ensure they meet requirements.
- Performance Tuning & Data Modeling – Optimizes application and database performance.

Experience
- 5+ years of hands-on experience in Oracle Fusion SaaS development and technical implementation.
- Proven experience with Oracle Fusion Supply Chain and Finance modules.
- Intermediate level of relevant work experience (3–5 years minimum).

Skills & Technical Expertise
- Strong knowledge of Oracle SaaS architecture, data models, and PaaS extensions.
- Proficiency in Oracle Integration Cloud (OIC) and REST/SOAP APIs.
- Experience with Oracle tools: BI Publisher, OTBI, FBDI, HDL, ADF.
- Ability to analyze and revise existing systems for improvements.
- Familiarity with SDLC, version control, and automation tools.

Job: Systems/Information Technology
Organization: Cummins Inc.
Role Category: Hybrid
Job Type: Exempt - Experienced
ReqID: 2415084
Relocation Package: No
Posted 1 month ago
4.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

About UHG
United Health Group is a leading health care company serving more than 85 million people worldwide. The organization is ranked 5th among Fortune 500 companies. UHG serves its customers through two different platforms - United Health Care (UHC) and Optum. UHC is responsible for providing healthcare coverage and benefits services, while Optum provides information and technology enabled health services. India operations of UHG are aligned to Optum. The Optum Global Analytics Team, part of Optum, is involved in developing broad-based and targeted analytics solutions across different verticals for all lines of business.

Primary Responsibilities
- Gather, analyze and document business requirements, while leveraging knowledge of claims, clinical and other healthcare systems
- Develop ETL jobs using Talend, Python, a cloud-based data warehouse, Jenkins, Kafka and an orchestration tool
- Write advanced SQL queries
- Create and interpret functional and technical specifications and design documents
- Understand the business and how various data elements and subject areas are utilized in order to develop and deliver reports to the business
- Be an SME on either the Claims, Member or Provider module
- Provide regular status updates to higher management
- Design, develop, and implement scalable and high-performing data models and solutions using Snowflake and Oracle
- Manage and optimize data replication and ingestion processes using Oracle and Snowflake
- Develop and maintain ETL pipelines using Azure Data Factory (ADF) and Databricks
- Optimize query performance and reduce latency by leveraging pre-aggregated tables and efficient data processing techniques
- Collaborate with cross-functional teams to understand data requirements and deliver high-quality data solutions
- Implement data security measures and ensure compliance with industry standards
- Automate data governance and security controls to maintain data integrity and compliance
- Develop and maintain comprehensive documentation for data architecture, data flows, ETL processes, and configurations
- Continuously optimize the performance of data pipelines and queries to improve efficiency and reduce costs
- Follow a basic, structured, standard approach to work
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications
- Bachelor's degree or 4-year university degree
- 5+ years of experience
- Experience in developing ETL jobs using Snowflake, ADF, Databricks and Python
- Experience in writing efficient and advanced SQL queries
- Experience in both producing and consuming data utilizing Kafka
- Experience working on large-scale cloud-based data warehouses - Snowflake, Databricks
- Good experience in building data pipelines using ADF
- Knowledge of Agile methodologies, roles, responsibilities and deliverables
- Proficiency in Python for data processing and automation
- Demonstrated ability to learn and adapt to new data technologies

Preferred Qualifications
- Certified in Azure Data Engineering (e.g., DP-203)
- Extensive experience with Azure cloud services (Azure Data Factory, Azure Databricks, Azure SQL Database, etc.)
- Solid understanding of CI/CD principles and tools (e.g., Jenkins, GitLab CI/CD)
- Knowledge of SQL and NoSQL databases
- Proficiency in Python for data processing and automation
- Proven excellent time management, communication, decision making, and presentation skills
- Proven good problem-solving skills
- Proven good communication skills

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.
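Because the posting above asks for experience both producing and consuming data with Kafka, here is a minimal, hedged sketch using the kafka-python client. The broker address, topic name, consumer group, and message shape are illustrative assumptions, not details from the posting.

```python
# Illustrative sketch: produce and consume JSON messages with kafka-python.
import json
from kafka import KafkaProducer, KafkaConsumer

producer = KafkaProducer(
    bootstrap_servers="broker:9092",                       # hypothetical broker
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("claims-events", {"claim_id": "C-1001", "status": "ADJUDICATED"})
producer.flush()

consumer = KafkaConsumer(
    "claims-events",
    bootstrap_servers="broker:9092",
    group_id="claims-loader",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
for message in consumer:
    print(message.value)   # hand off to the Snowflake/Databricks load step here
    break                  # demo only: stop after the first record
```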
Posted 1 month ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description

Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.

HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Consultant Specialist. In this role, you will:
- Design, develop, and optimize data pipelines using Azure Databricks, PySpark, and Prophecy.
- Implement and maintain ETL/ELT pipelines using Azure Data Factory (ADF) and Apache Airflow for orchestration.
- Develop and optimize complex SQL queries and Python-based data transformation logic.
- Work with version control systems (GitHub, Azure DevOps) to manage code and deployment processes.
- Automate deployment of data pipelines using CI/CD practices in Azure DevOps.
- Ensure data quality, security, and compliance with best practices.
- Monitor and troubleshoot performance issues in data pipelines.
- Collaborate with cross-functional teams to define data requirements and strategies.

Requirements

To be successful in this role, you should meet the following requirements:
- 5+ years of experience in data engineering, working with Azure Databricks, PySpark, and SQL.
- Hands-on experience with Prophecy for data pipeline development.
- Proficiency in Python for data processing and transformation.
- Experience with Apache Airflow for workflow orchestration.
- Strong expertise in Azure Data Factory (ADF) for building and managing ETL processes.
- Familiarity with GitHub and Azure DevOps for version control and CI/CD automation.
- Solid understanding of data modelling, warehousing, and performance optimization.
- Ability to work in an agile environment and manage multiple priorities effectively.
- Excellent problem-solving skills and attention to detail.
- Experience with Delta Lake and Lakehouse architecture.
- Hands-on experience with Terraform or Infrastructure as Code (IaC).
- Understanding of machine learning workflows in a data engineering context.

You'll achieve more when you join HSBC. www.hsbc.com/careers

HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.

Issued by – HSDI
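Since this role pairs Airflow orchestration with ADF and Databricks, here is a hedged, minimal Airflow DAG sketch showing how an extract and a transform step could be sequenced. The DAG id, schedule, and task bodies are assumptions, and the task functions are placeholders rather than real ADF or Databricks calls.

```python
# Illustrative Airflow 2.x DAG: sequence an ADF extract before a Databricks transform.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def trigger_adf_extract(**context):
    # Placeholder: in a real DAG this would start an ADF pipeline run via its REST API/SDK.
    print("triggering ADF pipeline pl_ingest_trades for", context["ds"])

def run_databricks_transform(**context):
    # Placeholder: in a real DAG this would submit a Databricks job (e.g. via the Jobs API).
    print("running silver-layer transform for", context["ds"])

with DAG(
    dag_id="daily_trades_pipeline",        # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 2 * * *",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="adf_extract",
                             python_callable=trigger_adf_extract)
    transform = PythonOperator(task_id="databricks_transform",
                               python_callable=run_databricks_transform)
    extract >> transform
```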
Posted 1 month ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Description

The role is based in Pune, India. Compensation to be paid in INR.
- Define and develop data sets, models, and cubes.
- Build simple to complex pipelines and data flows in ADF, including extracting data from the source system into the data warehouse staging area and ensuring data validation, data accuracy, data type conversion, and business rule application.
- Strong visualization skills in Power BI; expert knowledge in writing advanced DAX in Power BI, Power Query, Power Automate, M language, and R programming.
- Advanced knowledge of Azure SQL DB and Synapse Analytics, Azure Databricks, and Power BI.
- Should be able to analyse and understand complex data.
- Knowledge of Azure Data Lake Storage and services like Azure Analysis Services and SQL Database is required.
Posted 1 month ago
4.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Please find below the job details:

Role: Senior Associate - Tax Data Solution Consultants (Data Engineer)
Experience: 4+ years
Location: Gurgaon
Mode: Hybrid
Mandatory Skills: Alteryx, Azure Data Factory (ADF), PowerApps

Job Description:
- Own the end-to-end development of scalable, secure, and high-performance data pipelines using Azure Data Factory. This includes sourcing data from various systems (on-premises and cloud), transforming it efficiently, handling large volumes, and ensuring data integrity across pipelines.
- Write highly optimized, production-grade SQL queries for complex data transformations, analysis, and reporting. You will work with relational databases (e.g., Azure SQL, SQL Server) to design schemas, write stored procedures and subqueries, and develop solutions that scale and perform reliably.
- Build business-focused applications, workflows, and dashboards using Power BI, Power Apps, and Power Automate. This includes automating manual processes and connecting to diverse data sources using connectors and APIs.
- Understand and work with data from ERP systems like SAP, Oracle, etc.
- Serve as a trusted advisor to clients by clearly articulating solution approaches, gathering evolving requirements, and presenting results. You will lead discussions, facilitate workshops, and ensure alignment between technical outputs and business needs.
- Take full ownership of solution components, from idea to deployment. You are expected to drive initiatives independently, troubleshoot issues, and lead development efforts without heavy supervision. You will manage timelines, identify risks, and ensure high-quality delivery.
- Maintain documentation of data processes, system configurations, and any modifications made to automation workflows.
- Provide routine system checks and generate performance reports to ensure optimal platform functionality.

Required Qualifications & Skills:
- Bachelor's degree in information systems, engineering, or a related discipline.
- 4-7 years of hands-on experience with SQL; confident writing complex queries, joins, and transformations.
- Hands-on experience with data management platforms and automation tools, using no-code solutions like Azure Data Factory or Alteryx; able to build and manage end-to-end pipelines.
- Strong problem-solving skills with the ability to troubleshoot data issues effectively.
- Excellent communication and client liaison skills.
- Ability to work in a fast-paced environment and manage multiple tasks simultaneously.
- Attention to detail and a strong commitment to data accuracy and quality.

Preferred Experience:
- Certifications in Power Platform, Azure, SQL, or Alteryx.
- Knowledge of additional Azure services (e.g., Synapse, Logic Apps).
- Familiarity with third-party tax solutions and data integration processes is a plus.

What We Offer:
- A vibrant, flexible work culture focused on innovation, excellence, and support.
- Opportunity to build solutions from scratch and make a real impact.
- Opportunities for career development and professional training in advanced data and automation technologies.
- Competitive salary, benefits, and a supportive team environment.

Interested candidates can share your updated CV at shivani.sah@promaynov.com
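Because this role leans heavily on owning ADF pipelines end to end, a hedged sketch of triggering and monitoring a pipeline run from Python with the azure-mgmt-datafactory SDK follows. The subscription ID, resource group, factory, pipeline name, and parameters are placeholders invented for illustration.

```python
# Illustrative sketch: start an ADF pipeline run and poll it to completion.
import time
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

subscription_id = "00000000-0000-0000-0000-000000000000"   # hypothetical
resource_group = "rg-tax-data"                              # hypothetical
factory_name = "adf-tax-solutions"                          # hypothetical

adf = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)

run = adf.pipelines.create_run(
    resource_group, factory_name, "pl_load_erp_trial_balance",
    parameters={"period": "2024-03"},
)

# Poll until the run reaches a terminal state.
while True:
    status = adf.pipeline_runs.get(resource_group, factory_name, run.run_id).status
    if status not in ("Queued", "InProgress"):
        break
    time.sleep(30)
print("Pipeline finished with status:", status)
```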
Posted 1 month ago
8.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
This role is for one of Weekday's clients.
Salary range: Rs 2600000 - Rs 2800000 (i.e., INR 26-28 LPA)
Min Experience: 8 years
Location: Mumbai
Job Type: full-time

Requirements

About the Role
We are seeking a highly skilled and experienced Data Architect to lead the design, development, and optimization of our enterprise data infrastructure. This is a strategic role for an individual passionate about modern data platforms, cloud technologies, and data governance. The ideal candidate will bring deep expertise in Data Engineering, Azure Data Services, Databricks, Power BI, and ETL frameworks to drive scalable and secure data architecture for enterprise analytics and reporting.

As a Data Architect, you will collaborate with cross-functional teams including business analysts, data scientists, and application developers to ensure the delivery of high-quality, actionable data solutions. Your work will directly influence data-driven decision-making and operational efficiency across the organization.

Key Responsibilities
- Architect Scalable Data Solutions: Design and implement robust, secure, and scalable data architectures using Microsoft Azure, Databricks, and ADF for large-scale enterprise environments.
- Data Engineering Leadership: Provide technical leadership and mentoring to data engineering teams on ETL/ELT best practices, data pipeline development, and optimization.
- Cloud-Based Architecture: Build and optimize data lakes and data warehouses on Azure, leveraging Azure Data Lake, Synapse Analytics, and Azure SQL services.
- Databricks Expertise: Use Azure Databricks for distributed data processing, real-time analytics, and machine learning data pipelines.
- ETL Frameworks: Design and maintain ETL workflows using Azure Data Factory (ADF), ensuring efficient movement and transformation of data from multiple sources.
- Visualization & Reporting: Collaborate with business stakeholders to deliver intuitive and insightful dashboards and reports using Power BI.
- Data Governance & Quality: Enforce data quality standards, lineage, and governance across all data assets, ensuring compliance and accuracy.
- Collaboration & Integration: Work with application developers and DevOps teams to integrate data systems with other enterprise applications.
- Documentation & Standards: Maintain detailed architecture diagrams, data dictionaries, and standard operating procedures for all data systems.

Required Skills & Qualifications
- 8+ years of experience in data engineering, data architecture, or related fields.
- Proven experience designing and implementing cloud-based data solutions using Microsoft Azure.
- Hands-on expertise in Azure Databricks, ADF, Azure SQL, Data Lake Storage, and Power BI.
- Strong proficiency in ETL/ELT development, pipeline orchestration, and performance optimization.
- Solid understanding of data modeling, warehousing concepts (Kimball/Inmon), and big data technologies.
- Proficiency in scripting languages such as Python, SQL, and Spark.
- Experience in managing data security, compliance, and governance in large enterprises.
- Strong problem-solving skills and a collaborative mindset.

Preferred Qualifications
- Azure certifications such as Azure Data Engineer Associate or Azure Solutions Architect Expert.
- Experience with CI/CD pipelines for data workloads.
- Exposure to MDM (Master Data Management) and data catalog tools.
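For the data quality and governance duties listed above, a small, hedged PySpark sketch of a pre-promotion quality gate is shown below. The Delta path, column names, and thresholds are illustrative assumptions rather than anything specified by the posting.

```python
# Illustrative data-quality gate: basic null, duplicate, and freshness checks
# before a silver dataset is promoted downstream.
from datetime import date
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
orders = spark.read.format("delta").load(
    "abfss://silver@lake.dfs.core.windows.net/orders/")   # hypothetical path

null_ids = orders.filter(F.col("order_id").isNull()).count()
dup_ids = (orders.groupBy("order_id").count()
                 .filter("count > 1").count())
last_load = orders.agg(F.max("load_date").alias("last_load")).first()["last_load"]
days_stale = (date.today() - last_load).days if last_load else None

failures = {}
if null_ids > 0:
    failures["null_order_ids"] = null_ids
if dup_ids > 0:
    failures["duplicate_order_ids"] = dup_ids
if days_stale is None or days_stale > 1:          # assumed freshness SLA of 1 day
    failures["days_since_last_load"] = days_stale

if failures:
    raise ValueError(f"Data quality gate failed: {failures}")
```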
Posted 1 month ago
6.0 years
0 Lacs
Pune, Maharashtra, India
On-site
At Roche you can show up as yourself, embraced for the unique qualities you bring. Our culture encourages personal expression, open dialogue, and genuine connections, where you are valued, accepted and respected for who you are, allowing you to thrive both personally and professionally. This is how we aim to prevent, stop and cure diseases and ensure everyone has access to healthcare today and for generations to come. Join Roche, where every voice matters.

The Position

In Roche Informatics, we build on Roche's 125-year history as one of the world's largest biotech companies, globally recognized for providing transformative innovative solutions across major disease areas. We combine human capabilities with cutting-edge technological innovations to do now what our patients need next. Our commitment to our patients' needs motivates us to deliver technology that evolves the practice of medicine. Be part of our inclusive team at Roche Informatics, where we're driven by a shared passion for technological novelties and optimal IT solutions.

Position Overview
We are seeking an experienced ETL Architect to design, develop, and optimize data extraction, transformation, and loading (ETL) solutions and to work closely with multi-disciplinary and multi-cultural teams to build structured, high-quality data solutions. The person may lead technical squads. These solutions will be leveraged across Enterprise, Pharma and Diagnostics solutions to help our teams fulfill our mission: to do now what patients need next. This role requires deep expertise in Python, AWS Cloud, and ETL tools to build and maintain scalable data pipelines and architectures. The ETL Architect will work closely with cross-functional teams to ensure efficient data integration, storage, and accessibility for business intelligence and analytics.

Key Responsibilities
- ETL Design & Development: Architect and implement high-performance ETL pipelines using AWS cloud services, Snowflake, and ETL tools such as Talend, dbt, Informatica, ADF, etc.
- Data Architecture: Design and implement scalable, efficient, and cloud-native data architectures.
- Data Integration & Flow: Ensure seamless data integration across multiple source systems, leveraging AWS Glue, Snowflake, and other ETL tools.
- Performance Optimization: Monitor and tune ETL processes for performance, scalability, and cost-effectiveness.
- Governance & Security: Establish and enforce data quality, governance, and security standards for ETL processes.
- Collaboration: Work with data engineers, analysts, and business stakeholders to define data requirements and ensure effective solutions.
- Documentation & Best Practices: Maintain comprehensive documentation and promote best practices for ETL development and data transformation.
- Troubleshooting & Support: Diagnose and resolve performance issues, failures, and bottlenecks in ETL processes.

Required Qualifications
- Education: Bachelor's or Master's degree in Computer Science, Information Technology, Data Engineering, or a related field.
- Experience: 6+ years of experience in ETL development, with 3+ years in an ETL architecture role.
- Expertise in Snowflake or any MPP data warehouse (including Snowflake data modeling, optimization, and security best practices).
- Strong experience with AWS Cloud services, especially AWS Glue, AWS Lambda, S3, Redshift, and IAM, or Azure/GCP cloud services.
- Proficiency in ETL tools such as Informatica, Talend, Apache NiFi, SSIS, or DataStage.
- Strong SQL skills and experience with relational and NoSQL databases.
- Experience in API integrations.
- Proficiency in scripting languages (Python, Shell, PowerShell) for automation.
- Prior experience in the pharmaceutical, diagnostics, or healthcare domain is a plus.

Soft Skills
- Strong analytical and problem-solving abilities.
- Excellent communication and documentation skills.
- Ability to work collaboratively in a fast-paced, cloud-first environment.

Preferred Qualifications
- Certifications in AWS, Snowflake, or ETL tools.
- Experience in real-time data streaming, microservices-based architectures, and DevOps for data pipelines.
- Knowledge of data governance, compliance (GDPR, HIPAA), and security best practices.

Who we are
A healthier future drives us to innovate. Together, more than 100,000 employees across the globe are dedicated to advancing science, ensuring everyone has access to healthcare today and for generations to come. Our efforts result in more than 26 million people treated with our medicines and over 30 billion tests conducted using our Diagnostics products. We empower each other to explore new possibilities, foster creativity, and keep our ambitions high, so we can deliver life-changing healthcare solutions that make a global impact. Let's build a healthier future, together.

Roche is an Equal Opportunity Employer.
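Given the AWS-plus-Snowflake ETL focus of this role, below is a minimal, hedged sketch of a Python-scripted load step: files already staged in S3 are loaded into Snowflake with COPY INTO. The account, credentials, warehouse, stage, table, and path are hypothetical placeholders.

```python
# Illustrative sketch: load staged Parquet files from an external S3 stage into Snowflake.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.eu-central-1",   # hypothetical account locator
    user="etl_service",
    password="<secret>",              # normally injected from a secrets manager
    warehouse="ETL_WH",
    database="CLINICAL_DW",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    cur.execute("""
        COPY INTO staging.lab_results
        FROM @raw_s3_stage/lab_results/2024-03-31/
        FILE_FORMAT = (TYPE = PARQUET)
        MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
    """)
    print(cur.fetchall())             # per-file load status returned by COPY INTO
finally:
    conn.close()
```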
Posted 1 month ago
8.0 years
0 Lacs
Hyderābād
On-site
We are looking for a highly self-motivated individual with Azure Data Engineering (Synapse, ADF, T-SQL) expertise as a Technical Lead.

Work Experience:
- 8 to 12 years of experience in Azure Data Engineering.
- 7+ years of experience in ETL and Data Warehousing development.
- Experience with data modelling and ETL.
- Strong hands-on experience in ETL processes, preferably with Microsoft technologies.
- Knowledge and experience in Azure Synapse Analytics, Azure Synapse DWH, building pipelines on the Azure Synapse platform, T-SQL, and Azure Data Factory.
- Experience in data warehouse design and maintenance is a plus.
- Experience in agile development processes using Jira and Confluence.
- Understanding of the SDLC.
- Understanding of Agile methodologies.
- Communicate with the customer and produce the daily status report.
- Should have good oral and written communication.
- Should be proactive and adaptive.

Skills and languages:
- Proficiency in written and spoken English; working knowledge of another UN language would be an asset.

Expected Deliverables:
- The candidate should help IST in the data extraction process from the ERP to the data warehouse.
- Strong knowledge of T-SQL and writing stored procedures.
- Should help build data models in different areas of ERP-based data sets.
- The candidate should build pipelines to establish integration of data flow among different sources of IT applications.
Posted 1 month ago
0 years
3 - 6 Lacs
Gurgaon
On-site
Designation: Middleware Administrator (L1)
Location: Gurugram
Qualification: B.E. / B.Tech / BCA

Roles and Responsibilities
- Monitor application response times from the end-user perspective in real time and alert organizations when performance is unacceptable. By alerting the user to problems and intelligently segmenting response times, the tool should quickly expose problem sources and minimize the time necessary for resolution.
- It should allow specific application transactions to be captured and monitored separately. This allows administrators to select the most important operations within business-critical applications to be measured and tracked individually.
- It should use baseline-oriented thresholds to raise alerts when application response times deviate from acceptable levels. This allows IT administrators to respond quickly to problems and minimize the impact on service delivery.
- Shutdown and start-up of applications, generation of MIS reports, monitoring of application load, user account management, script execution, analysing system events, monitoring of error logs, etc.
- Monitoring of applications, including Oracle Forms 10g, Oracle SSO 10g, OID 10g, Oracle Portal 10g, Oracle Reports 10g, Internet Application Server (OAS) 10.1.2.2.0, Oracle Web Server (OWS) 10.1.2.2.0, Oracle WebCenter Portal 12.2.1.3, Oracle Access Manager 12.2.1.3, Oracle Internet Directory 12.2.1.3, Oracle WebLogic Server 12.2.1.3, Oracle HTTP Server 12.2.1.3, Oracle ADF 12.2.1.3 (Fusion Middleware), Oracle Forms 12.2.1.3, Oracle Reports 12.2.1.3, mobile apps, Windows IIS, portal, web cache, BizTalk applications, DNS applications, Tomcat, etc.

Job Type: Full-time
Work Location: In person
Posted 1 month ago
3.0 years
6 - 8 Lacs
Chennai
Remote
Job Title: Azure Data Engineer
Experience: 3+ years
Location: Remote

Job Description:
- 3+ years of experience as a Data Engineer with strong Azure expertise
- Proficiency in Azure Data Factory (ADF) and Azure Blob Storage
- Working knowledge of SQL and data modeling principles
- Experience working with REST APIs for data integration
- Hands-on experience with the Snowflake data warehouse
- Exposure to GitHub and Azure DevOps for CI/CD and version control
- Understanding of DevOps concepts as applied to data workflows
- Azure certification (e.g., DP-203) is highly desirable
- Strong problem-solving and communication skills

Speak with Employer:
Mobile Number: 96293 11599
Email ID: aswini.u@applogiq.org

Job Types: Full-time, Permanent
Benefits: Health insurance
Schedule: Day shift
Work Location: In person
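As this posting combines REST API integration with Azure Blob Storage landing zones, here is a hedged sketch of a lightweight ingestion step: pull a page of records from a REST API and land the raw JSON in Blob Storage for ADF or Snowflake to pick up. The endpoint, query parameter, container, blob path, and connection string are hypothetical.

```python
# Illustrative sketch: REST API extract landed as raw JSON in Azure Blob Storage.
import json
from datetime import date
import requests
from azure.storage.blob import BlobServiceClient

resp = requests.get(
    "https://api.example.com/v1/orders",          # hypothetical source API
    params={"updated_since": "2024-03-30"},
    timeout=30,
)
resp.raise_for_status()

blob_service = BlobServiceClient.from_connection_string("<storage-connection-string>")
blob = blob_service.get_blob_client(
    container="raw",
    blob=f"orders/{date.today():%Y/%m/%d}/orders.json",
)
blob.upload_blob(json.dumps(resp.json()), overwrite=True)
```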
Posted 1 month ago
3.0 years
12 Lacs
India
Remote
Dear Candidates,

Greetings from ADAPTON!

Role: Azure Data Engineer
Working Timing: 01.00 PM to 10.00 PM (Hybrid)
Working Days: 5 days of work
Salary: As per your current CTC
Work Location: Guindy, Chennai

Job Summary:
We are seeking a skilled Azure Data Engineer to design, implement, and optimize data solutions on Microsoft Azure. The ideal candidate will have strong expertise in Azure data services, data pipelines, and data integration while ensuring high availability, performance, and security.

Key Responsibilities:
- Design and develop data pipelines using Azure Data Factory (ADF) and Azure Databricks.
- Implement data storage solutions using Azure Data Lake, Azure SQL Database, and Synapse Analytics.
- Optimize and manage ETL/ELT processes to ensure data integrity and performance.
- Build and maintain data models, ensuring scalability and efficiency.
- Work with Azure Stream Analytics for real-time data processing.
- Implement and manage Azure Blob Storage, Cosmos DB, and other Azure data services.
- Develop and maintain Azure SQL, Synapse Analytics, and Power BI dashboards as needed.
- Ensure data security, compliance, and governance (GDPR, HIPAA, etc.).
- Collaborate with data scientists, analysts, and business teams to provide data solutions.
- Monitor and troubleshoot Azure data services for performance and reliability.

Required Skills & Qualifications:
- 3+ years of experience in Azure data engineering.
- Strong knowledge of Azure Data Factory, Azure Synapse, Azure Data Lake, and Databricks.
- Proficiency in SQL, Python, or Scala for data processing.
- Experience with ETL/ELT frameworks and data transformation techniques.
- Understanding of Azure DevOps, CI/CD, and Infrastructure as Code (Terraform, Bicep).
- Knowledge of big data technologies (Spark, Delta Lake, etc.).
- Experience with real-time and batch processing solutions.
- Strong problem-solving skills and ability to optimize query performance.

Perks & Benefits: Competitive salary | Flexible work environment (Hybrid options) | Career growth

Thanks and Regards,
Prathipa V
Senior HR
prathipahr03@gmail.com

Job Types: Full-time, Permanent
Pay: Up to ₹1,200,000.00 per year
Benefits: Flexible schedule, Health insurance, Work from home
Schedule: Day shift, Monday to Friday, Morning shift
Supplemental Pay: Yearly bonus
Education: Bachelor's (Required)
Experience: ADF: 3 years (Required); Databricks: 3 years (Required); Data Lake: 3 years (Required)
Work Location: In person
Posted 1 month ago
0 years
9 - 10 Lacs
Bengaluru
On-site
Location: Bengaluru, KA, IN
Company: ExxonMobil

About us
At ExxonMobil, our vision is to lead in energy innovations that advance modern living and a net-zero future. As one of the world's largest publicly traded energy and chemical companies, we are powered by a unique and diverse workforce fueled by the pride in what we do and what we stand for. The success of our Upstream, Product Solutions and Low Carbon Solutions businesses is the result of the talent, curiosity and drive of our people. They bring solutions every day to optimize our strategy in energy, chemicals, lubricants and lower-emissions technologies. We invite you to bring your ideas to ExxonMobil to help create sustainable solutions that improve quality of life and meet society's evolving needs. Learn more about our What and our Why and how we can work together.

ExxonMobil's affiliates in India
ExxonMobil's affiliates have offices in India in Bengaluru, Mumbai and the National Capital Region. ExxonMobil's affiliates in India supporting the Product Solutions business engage in the marketing, sales and distribution of performance as well as specialty products across chemicals and lubricants businesses. The India planning teams are also embedded with global business units for business planning and analytics. ExxonMobil's LNG affiliate in India supporting the upstream business provides consultant services for other ExxonMobil upstream affiliates and conducts LNG market-development activities. The Global Business Center - Technology Center provides a range of technical and business support services for ExxonMobil's operations around the globe. ExxonMobil strives to make a positive contribution to the communities where we operate and its affiliates support a range of education, health and community-building programs in India. Read more about our Corporate Responsibility Framework. To know more about ExxonMobil in India, visit ExxonMobil India and the Energy Factor India.

What role you will play in our team
Design, build, and maintain data systems, architectures, and pipelines to extract insights and drive business decisions. Collaborate with stakeholders to ensure data quality, integrity, and availability.

What you will do
- Support in developing and owning ETL pipelines within cloud data platforms
- Data extraction and transformation pipeline automation using Python/Airflow/Azure Data Factory/Qlik/Fivetran
- Delivery of a task monitoring and notification system for data pipeline status
- Supporting data cleansing, enrichment, and curation activities to enable ongoing business use cases
- Developing and delivering data pipelines through a CI/CD delivery methodology
- Developing monitoring around pipelines to ensure uptime of data flows
- Optimization and refinement of current queries against Snowflake
- Working with Snowflake, MSSQL, Postgres, Oracle, Azure SQL, and other relational databases
- Working with different cloud databases such as Azure SQL, Azure PostgreSQL, etc.
- Working with change-data-capture ETL software such as Qlik and Fivetran to populate Snowflake
- Identification and remediation of failed and long-running queries
- Development of large aggregate queries across a multitude of schemas

About You

Skills and Qualifications
- Experience with data processing/analytics and ETL data transformation.
- Proficient in ingesting data to/from Snowflake and Azure storage accounts.
- Proficiency in at least one of the following languages: Python, C#, C++, F#, Java.
- Proficiency in SQL and NoSQL databases.
- Knowledge of SQL query development and optimization.
- Demonstrated experience with Snowflake, Qlik Replicate, Fivetran, and Azure Data Explorer.
- Azure cloud experience (current/future) with ADX, ADF, and Databricks.
- Expertise with Airflow, Qlik, Fivetran, and Azure Data Factory.
- Management of Snowflake through dbt scripting.
- Solid understanding of data strategies, including data management, data curation, and data governance.
- Ability to quickly build relationships and credibility with business customers and agile teams.
- A passion for learning about and experimenting with new technologies.
- Confidence in creating and delivering technical presentations and training.
- Excellent organization and planning skills.

Preferred Qualifications/Experience
- Experience with data processing/analytics and ETL data transformation.
- Proficient in ingesting data to/from Snowflake and Azure storage accounts.
- Proficiency in at least one of the following languages: Python, C#, C++, F#, Java.
- Proficiency in SQL and NoSQL databases.
- Knowledge of SQL query development and optimization.
- Demonstrated experience with Snowflake, Qlik Replicate, Fivetran, and Azure Data Explorer.
- Azure cloud experience (current/future) with ADX, ADF, and Databricks.
- Expertise with Airflow, Qlik, Fivetran, and Azure Data Factory.
- Management of Snowflake through dbt scripting.
- Solid understanding of data strategies, including data management, data curation, and data governance.
- Ability to quickly build relationships and credibility with business customers and agile teams.
- A passion for learning about and experimenting with new technologies.
- Confidence in creating and delivering technical presentations and training.
- Excellent organization and planning skills.

Your benefits
An ExxonMobil career is one designed to last. Our commitment to you runs deep: our employees grow personally and professionally, with benefits built on our core categories of health, security, finance and life. We offer you:
- Competitive compensation
- Medical plans, maternity leave and benefits, life, accidental death and dismemberment benefits
- Retirement benefits
- Global networking & cross-functional opportunities
- Annual vacations & holidays
- Day care assistance program
- Training and development program
- Tuition assistance program
- Workplace flexibility policy
- Relocation program
- Transportation facility

Please note benefits may change from time to time without notice, subject to applicable laws. The benefits programs are based on the Company's eligibility guidelines.

Stay connected with us
Learn more about ExxonMobil in India, visit ExxonMobil India and Energy Factor India. Follow us on LinkedIn and Instagram. Like us on Facebook. Subscribe to our channel on YouTube.

EEO Statement
ExxonMobil is an Equal Opportunity Employer: All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, age, national origin or disability status.

Business solicitation and recruiting scams
ExxonMobil does not use recruiting or placement agencies that charge candidates an advance fee of any kind (e.g., placement fees, immigration processing fees, etc.). Follow the LINK to understand more about recruitment scams in the name of ExxonMobil.

Nothing herein is intended to override the corporate separateness of local entities. Working relationships discussed herein do not necessarily represent a reporting connection, but may reflect a functional guidance, stewardship, or service relationship. Exxon Mobil Corporation has numerous affiliates, many with names that include ExxonMobil, Exxon, Esso and Mobil. For convenience and simplicity, those terms and terms like corporation, company, our, we and its are sometimes used as abbreviated references to specific affiliates or affiliate groups. Abbreviated references describing global or regional operational organizations and global or regional business lines are also sometimes used for convenience and simplicity. Similarly, ExxonMobil has business relationships with thousands of customers, suppliers, governments, and others. For convenience and simplicity, words like venture, joint venture, partnership, co-venturer, and partner are used to indicate business relationships involving common activities and interests, and those words may not indicate precise legal relationships.

Job Segment: Sustainability, Cloud, Database, SQL, Embedded, Energy, Technology
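For the "identification and remediation of failed and long-running queries" duty in the posting above, here is a hedged sketch that queries Snowflake's INFORMATION_SCHEMA.QUERY_HISTORY table function for the last 24 hours. The account, user, warehouse, and the five-minute threshold are illustrative assumptions.

```python
# Illustrative sketch: surface failed and long-running Snowflake queries from the last day.
import snowflake.connector

conn = snowflake.connector.connect(
    account="ab12345.east-us-2.azure",   # hypothetical account locator
    user="monitoring_svc",
    password="<secret>",
    warehouse="MONITOR_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
cur = conn.cursor()
cur.execute("""
    SELECT query_id, user_name, execution_status,
           total_elapsed_time / 1000 AS elapsed_seconds
    FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY(
             END_TIME_RANGE_START => DATEADD('hour', -24, CURRENT_TIMESTAMP())))
    WHERE execution_status = 'FAILED'
       OR total_elapsed_time > 300000      -- longer than 5 minutes (milliseconds)
    ORDER BY total_elapsed_time DESC
""")
for row in cur.fetchall():
    print(row)
conn.close()
```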
Posted 1 month ago
0 years
5 - 8 Lacs
Bengaluru
On-site
Company Description
WNS (Holdings) Limited (NYSE: WNS) is a leading Business Process Management (BPM) company. We combine our deep industry knowledge with technology and analytics expertise to co-create innovative, digital-led transformational solutions with clients across 10 industries. We enable businesses in Travel, Insurance, Banking and Financial Services, Manufacturing, Retail and Consumer Packaged Goods, Shipping and Logistics, Healthcare, and Utilities to re-imagine their digital future and transform their outcomes with operational excellence. We deliver an entire spectrum of BPM services in finance and accounting, procurement, customer interaction services and human resources leveraging collaborative models that are tailored to address the unique business challenges of each client. We co-create and execute the future vision of 400+ clients with the help of our 44,000+ employees.

Job Description
- Experience in Microsoft SQL Server database development (T-SQL)
- Experience in building SSIS packages
- Good experience in creating LLDs
- Experience delivering solutions utilizing the entire Microsoft BI stack (SSAS, SSIS)
- Experience with SQL Server/T-SQL programming in the creation and optimization of stored procedures, triggers and user-defined functions
- Experience working in a data warehouse environment and a strong understanding of dimensional data modeling concepts
- Must be able to build Business Intelligence solutions in a collaborative, agile development environment
- Strong understanding of data ingestion, data processing, orchestration, parallelization, transformation and ETL fundamentals
- Sound knowledge of data analysis using any SQL tools
- Experience in ADF, Synapse and other Azure components
- Designs, develops, automates, and supports complex applications to extract, transform, and load data
- Should have knowledge of error handling and performance tuning for data pipelines

Skills: SQL, SSIS, ADF, T-SQL, ETL & DW, good communication

Qualifications
Graduate
Posted 1 month ago
0 years
0 Lacs
Bengaluru East, Karnataka, India
On-site
Primary skills: Technology->AWS->DevOps, Technology->Cloud Integration->Azure Data Factory (ADF), Technology->Cloud Platform->AWS Database, Technology->Cloud Platform->Azure DevOps->Azure Pipelines, Technology->DevOps->Continuous Integration - Mainframe

A day in the life of an Infoscion
- As part of the Infosys consulting team, your primary role would be to actively aid the consulting team in different phases of the project, including problem definition, effort estimation, diagnosis, solution generation, design and deployment.
- You will explore alternatives to the recommended solutions based on research that includes literature surveys, information available in public domains, vendor evaluation information, etc., and build POCs.
- You will create requirement specifications from the business needs, and define the to-be processes and detailed functional designs based on requirements.
- You will support configuring solution requirements on the products; understand any issues, diagnose the root cause of such issues, seek clarifications, and then identify and shortlist solution alternatives.
- You will also contribute to unit-level and organizational initiatives with an objective of providing high-quality, value-adding solutions to customers.

If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
- Ability to work with clients to identify business challenges and contribute to client deliverables by refining, analyzing, and structuring relevant data
- Awareness of the latest technologies and trends
- Logical thinking and problem-solving skills, along with an ability to collaborate
- Ability to assess current processes, identify improvement areas and suggest technology solutions
- Knowledge of one or two industry domains
Posted 1 month ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
DATA ENGINEER II
- 3+ years of experience in building data pipelines with Python/PySpark
- Professional experience with the Azure ETL stack (e.g., ADLS, ADF, ADB, Azure SQL/Synapse)
- 3+ years of experience with SQL
- Proficient understanding of code versioning tools such as Git and project management tools like Jira
Posted 1 month ago
12.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description:
We are seeking an experienced Azure Data Architect to lead data architecture design and implementation on the Azure cloud. The ideal candidate will have deep expertise in Databricks and Azure Functions, and hold a valid Azure certification.

Required Skills:
- 8-12 years of experience in data architecture and engineering
- Strong hands-on experience with Azure Databricks and Spark
- Proficiency in Azure Functions, Data Lake, Azure Synapse, and ADF
- Certified in Microsoft Azure (e.g., AZ-305, DP-203)
- Solid understanding of data modeling, governance, and performance tuning
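Because Azure Functions is a core skill for this role, below is a hedged sketch of a minimal HTTP-triggered Azure Function (Python v1 programming model) that could accept a refresh request and hand off to a downstream Databricks or ADF run. The request shape and the job-submission step are placeholders, not a real implementation.

```python
# Illustrative Azure Functions sketch (Python v1 model): HTTP trigger that accepts
# a refresh request. The downstream job submission is a placeholder.
import json
import logging
import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    logging.info("Refresh request received")
    try:
        body = req.get_json() if req.get_body() else {}
    except ValueError:
        body = {}
    dataset = body.get("dataset", "default")

    # Placeholder: a real function would call the Databricks Jobs API or the ADF
    # REST API here to start the downstream processing run.
    run_id = f"simulated-run-for-{dataset}"

    return func.HttpResponse(
        json.dumps({"status": "accepted", "run_id": run_id}),
        status_code=202,
        mimetype="application/json",
    )
```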
Posted 1 month ago
12.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Job Description

Candidate Specification: 12+ years of relevant experience as an Azure Fullstack Architect.

- Design and develop solutions, create POCs, deliver demos, and provide estimations.
- Guide and train teams on design implementation and best practices.
- Evaluate technologies, define strategies, and develop reusable solutions.
- Work with Azure data management and integration tools such as Azure Fabric, Data Lake, ADF, and Logic Apps.
- Manage development and deployment using Azure Functions, DevOps, IAM, Kubernetes (HMG deployment, HPA), and resource administration.
- Implement AI/ML solutions using Azure Machine Learning, OpenAI Service, and AI capabilities (vision, speech, language, search).
- Utilize Generative AI tools like Azure Copilot and AI agents to enhance automation and intelligence.

Role: SM/AD - Azure Fullstack Architect
Industry Type: IT/Computers - Software
Functional Area: ITES/BPO/Customer Service
Required Education: Any Graduate
Employment Type: Full Time, Permanent
Key Skills: Azure Fabric, Data Lake, ADF, Azure Fullstack Architect, ML

Other Information
Job Code: GO/JC/21361/2025
Recruiter Name: Ramya
Posted 1 month ago
12.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description
Candidate Specification - 12+ years of relevant experience as an Azure Fullstack Architect.
Job Description
Design and develop solutions, create POCs, deliver demos, and provide estimations.
Guide and train teams on design implementation and best practices.
Evaluate technologies, define strategies, and develop reusable solutions.
Work with Azure data management and integration tools such as Azure Fabric, Data Lake, ADF, and Logic Apps.
Manage development and deployment using Azure Functions, DevOps, IAM, Kubernetes (HMG deployment, HPA), and resource administration.
Implement AI/ML solutions using Azure Machine Learning, OpenAI Service, and AI capabilities (vision, speech, language, search).
Utilize Generative AI tools like Azure Copilot and AI agents to enhance automation and intelligence.
Skills Required
Role: SM/AD - Azure Fullstack Architect
Industry Type: IT/Computers - Software
Functional Area: ITES/BPO/Customer Service
Required Education: Any Graduates
Employment Type: Full Time, Permanent
Key Skills: AZURE FABRIC, DATA LAKE, ADF, AZURE FULLSTACK ARCHITECT, ML
Other Information
Job Code: GO/JC/21361/2025
Recruiter Name: Ramya
Posted 1 month ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Company Description
WNS (Holdings) Limited (NYSE: WNS) is a leading Business Process Management (BPM) company. We combine our deep industry knowledge with technology and analytics expertise to co-create innovative, digital-led transformational solutions with clients across 10 industries. We enable businesses in Travel, Insurance, Banking and Financial Services, Manufacturing, Retail and Consumer Packaged Goods, Shipping and Logistics, Healthcare, and Utilities to re-imagine their digital future and transform their outcomes with operational excellence. We deliver an entire spectrum of BPM services in finance and accounting, procurement, customer interaction services and human resources, leveraging collaborative models that are tailored to address the unique business challenges of each client. We co-create and execute the future vision of 400+ clients with the help of our 44,000+ employees.
Job Description
Experience in Microsoft SQL Server database development (T-SQL)
Experience in building SSIS packages
Good experience in creating LLDs
Experience delivering solutions utilizing the entire Microsoft BI stack (SSAS, SSIS)
Experience with SQL Server/T-SQL programming in the creation and optimization of stored procedures, triggers and user-defined functions
Experience working in a data warehouse environment and a strong understanding of dimensional data modeling concepts
Must be able to build Business Intelligence solutions in a collaborative, agile development environment
Strong understanding of data ingestion, data processing, orchestration, parallelization, transformation and ETL fundamentals
Sound knowledge of data analysis using any SQL tools
Experience in ADF, Synapse and other Azure components
Designs, develops, automates, and supports complex applications to extract, transform, and load data
Should have knowledge of error handling and performance tuning for data pipelines
Skills: SQL, SSIS, ADF, T-SQL, ETL & DW, Good Communication
Qualifications
Graduate
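To make the stored-procedure and error-handling expectations above more concrete, the sketch below calls a SQL Server stored procedure from a Python pipeline step with basic failure handling. The server, database, procedure name and parameter are hypothetical; in a real setup they would come from pipeline configuration.

```python
# Hedged sketch: invoking a SQL Server stored procedure from a pipeline step.
# Connection details and the procedure name are placeholders, not real values.
import pyodbc

conn_str = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=example-sql.database.windows.net;"
    "DATABASE=dw;"
    "Authentication=ActiveDirectoryMsi;"
)

try:
    with pyodbc.connect(conn_str, timeout=30) as conn:
        with conn.cursor() as cur:
            # Hypothetical procedure that loads one day of data into the warehouse
            cur.execute("EXEC dbo.usp_load_daily_sales @load_date = ?", "2024-01-01")
            conn.commit()
except pyodbc.Error as exc:
    # Surface the failure to the orchestrator (ADF/SSIS) rather than swallowing it,
    # so retries and alerting can be handled at the pipeline level.
    print(f"Load failed: {exc}")
    raise
```

In an SSIS or ADF pipeline the equivalent step would typically be a stored-procedure activity, with retries and alerting configured on the orchestration side rather than inside the script.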
Posted 1 month ago
12.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description
Candidate Specification - 12+ years of relevant experience as an Azure Fullstack Architect.
Job Description
Design and develop solutions, create POCs, deliver demos, and provide estimations.
Guide and train teams on design implementation and best practices.
Evaluate technologies, define strategies, and develop reusable solutions.
Work with Azure data management and integration tools such as Azure Fabric, Data Lake, ADF, and Logic Apps.
Manage development and deployment using Azure Functions, DevOps, IAM, Kubernetes (HMG deployment, HPA), and resource administration.
Implement AI/ML solutions using Azure Machine Learning, OpenAI Service, and AI capabilities (vision, speech, language, search).
Utilize Generative AI tools like Azure Copilot and AI agents to enhance automation and intelligence.
Skills Required
Role: SM/AD - Azure Fullstack Architect
Industry Type: IT/Computers - Software
Functional Area: ITES/BPO/Customer Service
Required Education: Any Graduates
Employment Type: Full Time, Permanent
Key Skills: AZURE FABRIC, DATA LAKE, ADF, AZURE FULLSTACK ARCHITECT, ML
Other Information
Job Code: GO/JC/21361/2025
Recruiter Name: Ramya
Posted 1 month ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About Neudesic
Passion for technology drives us, but it's innovation that defines us. From design to development and support to management, Neudesic offers decades of experience, proven frameworks, and a disciplined approach to quickly deliver reliable, quality solutions that help our customers go to market faster.
What sets us apart from the rest is an amazing collection of people who live and lead with our core values. We believe that everyone should be Passionate about what they do, Disciplined to the core, Innovative by nature, committed to a Team and conduct themselves with Integrity. If these attributes mean something to you - we'd like to hear from you.
We are currently looking for Azure Data Engineers to become a member of Neudesic's Data & AI team.
Must Have Skills:
Prior experience in ETL, data pipelines, and data flow techniques using Azure Data Services
Working experience in Python, Scala, PySpark, Azure Data Factory, Azure Data Lake Gen2, Databricks, Azure Synapse and file formats like JSON & Parquet
Experience in creating ADF pipelines to source and process data sets
Experience in creating Databricks notebooks to cleanse, transform and enrich data sets
Good understanding of SQL, databases, NoSQL DBs, data warehouses, Hadoop and various data storage options on the cloud
Development experience in orchestration of pipelines
Experience in deployment and monitoring techniques
Working experience with Azure DevOps CI/CD pipelines to deploy Azure resources
Experience in handling operations/integration with a source repository
Must have good knowledge of data warehouse concepts and data warehouse modelling
Good to Have Skills:
Familiarity with DevOps, Agile Scrum methodologies and CI/CD
Domain-driven development exposure
Analytical/problem-solving skills
Strong communication skills
Good experience with unit, integration and UAT support
Able to design and code reusable components and functions
Should be able to review design and code, and provide review comments with justification
Zeal to learn and adopt new tools/technologies
Power BI and Data Catalog experience
* Be aware of phishing scams involving fraudulent career recruiting and fictitious job postings; visit our Phishing Scams page to learn more.
Neudesic is an Equal Opportunity Employer
Neudesic provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by local laws.
Neudesic is an IBM subsidiary which has been acquired by IBM and will be integrated into the IBM organization. Neudesic will be the hiring entity. By proceeding with this application, you understand that Neudesic will share your personal information with other IBM companies involved in your recruitment process, wherever these are located. More information on how IBM protects your personal information, including the safeguards in case of cross-border data transfer, is available here: https://www.ibm.com/us-en/privacy?lnk=flg-priv-usen
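As a hedged sketch of the "cleanse, transform and enrich" Databricks notebook work this posting describes, the cell below reads raw JSON, drops incomplete rows, casts a column and joins a reference table before writing Parquet. The containers, columns and the reference table are assumptions made for illustration; `spark` is the session a Databricks notebook injects automatically.

```python
# Databricks-style notebook cell: cleanse raw JSON, enrich, write curated Parquet.
# All names are illustrative; `spark` is provided by the Databricks runtime.
from pyspark.sql import functions as F

orders = spark.read.json("abfss://raw@examplelake.dfs.core.windows.net/orders/")

enriched = (
    orders
    .dropna(subset=["order_id", "customer_id"])            # cleanse: drop incomplete rows
    .withColumn("order_amount", F.col("order_amount").cast("decimal(18,2)"))
    .join(
        spark.table("ref.customer_segments"),              # enrich with a reference table
        on="customer_id",
        how="left",
    )
)

enriched.write.mode("append").parquet(
    "abfss://curated@examplelake.dfs.core.windows.net/orders_enriched/"
)
```

A notebook like this would typically be invoked from an ADF pipeline activity and promoted between environments through Azure DevOps CI/CD, in line with the requirements above.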
Posted 1 month ago